A counterexample

Consider the queueing system in Fig. 9.2. Customers arrive according to a random process. If there is a free server, service is started immediately; otherwise, the customer joins the queue and waits. Service time is random as well, and whenever a server completes a service, the first customer in the queue (if any) starts her service.…
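As a rough illustration of this arrival-and-service mechanism (not the system of Fig. 9.2, which is not reproduced here), the following minimal sketch simulates the single-server special case with FIFO discipline; the exponential interarrival and service times and all rates are made-up assumptions chosen only to make the example runnable.

```python
import random

# Minimal sketch: one server, FIFO queue. Interarrival and service times are
# exponential with made-up rates, purely for illustration.
def simulate_queue(n_customers=10_000, arrival_rate=0.9, service_rate=1.0, seed=42):
    rng = random.Random(seed)
    clock = 0.0            # arrival time of the current customer
    server_free_at = 0.0   # time at which the server finishes its current job
    total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free_at)       # wait in queue if the server is busy
        total_wait += start - clock
        server_free_at = start + rng.expovariate(service_rate)  # service completion
    return total_wait / n_customers

print("average wait:", simulate_queue())
```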

CONFIDENCE INTERVALS

The sample mean is a point estimator for the expected value, in the sense that it results in an estimate that is a single number. Since this estimator is subject to some variance, it would be nice to have some measure of how much we can trust that single number. In other words, we would like to…
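A quick sketch of the idea, under the usual normal approximation: a 95% interval around the sample mean of the form "mean plus or minus quantile times S over the square root of n". The data and the choice of the normal quantile (rather than Student's t) are assumptions made only for illustration.

```python
import math
import statistics
from statistics import NormalDist

# Sketch: an approximate 95% confidence interval for the mean.
data = [2.3, 1.9, 2.8, 2.5, 3.1, 2.2, 2.6, 2.4]   # made-up sample
n = len(data)
xbar = statistics.mean(data)
s = statistics.stdev(data)               # sample standard deviation S
z = NormalDist().inv_cdf(0.975)          # ~1.96 for a 95% level
half_width = z * s / math.sqrt(n)
print(f"{xbar:.3f} +/- {half_width:.3f}")
```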

Sample Variance

The typical estimator of variance is the sample variance $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$. This formula can be understood as a sample counterpart of the definition of variance, $\mathrm{Var}(X) = E\bigl[(X - E[X])^2\bigr]$: it is basically an average squared deviation with respect to the sample mean. When doing calculations by hand, the following rearrangement can be useful: $S^2 = \frac{1}{n-1}\Bigl(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\Bigr)$. The sample standard deviation is just S, the square root of the sample variance. We…
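A small check of the two forms above, on a made-up data set; both agree with the library's own sample variance.

```python
import statistics

# Sketch: sample variance by the defining formula and by the rearranged form.
data = [4.0, 7.0, 5.5, 6.0, 3.5]
n = len(data)
xbar = sum(data) / n

s2_def = sum((x - xbar) ** 2 for x in data) / (n - 1)           # (1/(n-1)) sum (x_i - xbar)^2
s2_alt = (sum(x ** 2 for x in data) - n * xbar ** 2) / (n - 1)  # rearranged form
print(s2_def, s2_alt, statistics.variance(data))
```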

Sample Mean

The sample mean is a well-known concept from descriptive statistics: $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$. If data come from a legitimate random sample, the sample mean is a statistic. A natural use of the sample mean is to estimate the expected value of the underlying random variable, which is unknown in practice. It is important to understand what we are doing: We are using a…
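To see the sample mean acting as an estimator, here is a sketch in which the "unknown" expected value is in fact known (a uniform variable on [0, 10] has mean 5), so the estimate can be compared with the target as the sample size grows; the distribution and sample sizes are arbitrary choices.

```python
import random

# Sketch: the sample mean of i.i.d. draws estimating a known expected value (5).
rng = random.Random(0)
for n in (10, 100, 10_000):
    sample = [rng.uniform(0, 10) for _ in range(n)]
    print(n, sum(sample) / n)
```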

Introduction

We have modeled uncertainty using the tools of probability theory. Problems in probability theory may require a fair level of mathematical sophistication, and often students are led to believe that the involved calculations are the real difficulty. However, this is not the correct view; the real issue is that whatever we do in probability theory…

Markov processes

In Section 7.9 we introduced stochastic processes as sequences of random variables; assuming a discrete-time stochastic process, we have a sequence of the form $X_t$, $t = 1, 2, 3, \ldots$. We have also pointed out that we cannot characterize a stochastic process in terms of the marginal distributions of each variable $X_t$. In principle, we should assign the joint distribution of all the…
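For a concrete feel of the Markov idea, here is a sketch of a discrete-time chain on three states with a made-up transition matrix: the distribution of the next state depends only on the current state, not on the earlier history.

```python
import random

# Sketch: discrete-time Markov chain on states {0, 1, 2}.
# P[i][j] is the (made-up) probability of moving from state i to state j.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
]

def simulate(n_steps=20, start=0, seed=1):
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        state = rng.choices(range(3), weights=P[state])[0]  # depends only on current state
        path.append(state)
    return path

print(simulate())
```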

The memoryless property of the exponential distribution

We have introduced the exponential distribution in Section 7.6.3, where we also pointed out its link with the Poisson distribution and the Poisson process. The standard use of exponential variables to model the random time between events relies on the memoryless property, which we are now able to appreciate. Consider an exponential random variable X with parameter λ, and say that X models…
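The property states that $P(X > t + s \mid X > t) = P(X > s) = e^{-\lambda s}$. A rough empirical check of this identity, with arbitrary values of the rate and of t and s chosen only for illustration:

```python
import math
import random

# Sketch: checking memorylessness of the exponential distribution by simulation.
lam, t, s = 0.5, 2.0, 3.0
rng = random.Random(0)
draws = [rng.expovariate(lam) for _ in range(200_000)]

survived_t = [x for x in draws if x > t]
cond = sum(x > t + s for x in survived_t) / len(survived_t)  # estimate of P(X > t+s | X > t)
print(cond, math.exp(-lam * s))                              # both should be close to e^{-lam*s}
```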

Computing expectations by conditioning

In this section we take advantage of a fundamental theorem concerning iterated expectation. Before formalizing the idea, let us illustrate it by a simple example. Example 8.7 You are lost in an underground mine and stand in front of two tunnels. One of the two tunnels will lead you to the surface after a 5-hour walk; the…
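Without reproducing the book's Example 8.7, the mechanism it illustrates is the law of iterated expectations, $E[X] = E\bigl[E[X \mid Y]\bigr]$. The sketch below demonstrates it on a made-up two-scenario setup (the scenario probabilities and conditional means are arbitrary assumptions): the analytical iterated expectation is compared with a direct simulation.

```python
import random

# Sketch: law of iterated expectations E[X] = E[E[X|Y]] on a made-up example.
p_y = {0: 0.5, 1: 0.5}                  # scenario probabilities
mean_x_given_y = {0: 5.0, 1: 8.0}       # E[X | Y = y], chosen arbitrarily

# Analytical iterated expectation
e_x = sum(p_y[y] * mean_x_given_y[y] for y in p_y)

# Simulation of the same two-stage experiment
rng = random.Random(0)
draws = []
for _ in range(100_000):
    y = rng.choices([0, 1], weights=[p_y[0], p_y[1]])[0]
    draws.append(rng.gauss(mean_x_given_y[y], 1.0))  # any distribution with that conditional mean works
print(e_x, sum(draws) / len(draws))
```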

CONDITIONAL EXPECTATION

We are already familiar with the concept of conditional probability when events are involved. When dealing with random variables X and Y, we might wonder whether knowing something about Y, possibly even its realized value, can help us in predicting the value of X. To introduce the concepts in the simplest way, it is a good idea to work with…
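In that simplest (discrete) setting, the conditional expectation $E[X \mid Y = y]$ is just a weighted average of the values of X under the conditional PMF. A small sketch, with a made-up joint probability mass function:

```python
# Sketch: E[X | Y = y] for two discrete random variables with a made-up joint PMF.
joint = {
    (1, 0): 0.10, (2, 0): 0.20, (3, 0): 0.10,
    (1, 1): 0.25, (2, 1): 0.15, (3, 1): 0.20,
}

def cond_expectation(y):
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)              # marginal P(Y = y)
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y   # weighted average of x

for y in (0, 1):
    print(f"E[X | Y={y}] =", cond_expectation(y))
```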