The newsvendor problem with normal demand

We know from Section 7.4.4 that the optimal solution of a newsvendor problem with continuous demand is the solution q* of the equation F(q*) = m/(m + c_u), where F is the CDF of demand, i.e., the quantile of the demand distribution corresponding to probability m/(m + c_u), m being the unit profit margin and c_u the unit cost of unsold items. If we assume normal demand, with expected value μ and standard deviation σ, then the optimal order quantity (assuming that we want to maximize expected profit) is q* = μ + zσ, where z is the standard normal quantile corresponding to probability m/(m + c_u). Assume that…
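As a quick illustration, here is a minimal Python sketch of this formula; the margin m, the cost c_u, and the demand parameters below are made-up numbers, not taken from the text.

```python
from scipy.stats import norm

# Hypothetical data: demand ~ N(mu, sigma^2), unit margin m, unit cost of unsold items c_u
mu, sigma = 100.0, 20.0
m, c_u = 4.0, 1.0

beta = m / (m + c_u)          # critical ratio: target probability of covering demand
z = norm.ppf(beta)            # standard normal quantile at that probability
q_star = mu + z * sigma       # optimal order quantity under normal demand

print(f"critical ratio = {beta:.3f}, q* = {q_star:.1f}")
```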

MISCELLANEOUS APPLICATIONS

In this section we outline a few applications from logistics and finance. The three examples will definitely look repetitive, and possibly boring, but this is exactly the point: Quantitative concepts may be applied to quite different situations, and this is why they are so valuable. In particular, we explore here three cases in which quantiles…

Central limit theorem

As we noted, it is difficult to tell which distribution we obtain when summing a few i.i.d. variables. Surprisingly, we can tell something pretty general when we sum a large number of such variables. We can get a clue by looking at Fig. 7.22. We see the histogram obtained by sampling the sum of independent exponential random variables…
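A small simulation along these lines can be run as follows; the number of terms, the sample size, and the rate parameter are arbitrary illustration values, not those used for Fig. 7.22. The standardized sum of exponentials is compared with standard normal quantiles.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n_terms, n_samples, lam = 30, 100_000, 1.0   # arbitrary illustration values

# Each row: sum of n_terms i.i.d. exponential(lam) variables
sums = rng.exponential(scale=1/lam, size=(n_samples, n_terms)).sum(axis=1)

# Standardize: an exponential(lam) variable has mean 1/lam and standard deviation 1/lam
z = (sums - n_terms / lam) / (np.sqrt(n_terms) / lam)

# Compare a few empirical quantiles with standard normal quantiles
for p in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(p, round(np.quantile(z, p), 3), round(norm.ppf(p), 3))
```

The empirical quantiles get closer to the normal ones as the number of summed terms grows, which is the point of the central limit theorem.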

Distributions obtained from the normal

As we pointed out, if we sum i.i.d. random variables, we may end up with a completely different distribution, with the normal as a notable exception. However, there are ways to combine independent normal random variables that lead to new distributions that have remarkable applications, among other things, in inferential statistics. In fact, statistical tables…
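For instance (this is my own illustration, not necessarily the example the text goes on to develop), summing the squares of independent standard normals yields a chi-square variable, which we can check against scipy.stats.chi2:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
k, n_samples = 5, 200_000          # k degrees of freedom (arbitrary choice)

# Sum of squares of k independent standard normals
samples = (rng.standard_normal((n_samples, k)) ** 2).sum(axis=1)

# Compare empirical mean/variance with chi-square(k): mean k, variance 2k
print(samples.mean(), samples.var())   # roughly 5 and 10
print(chi2.mean(k), chi2.var(k))       # 5.0, 10.0
```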

The square-root rule

Consider a sequence of i.i.d. random variables observed over time, Xt, t = 1,…, T. Let μ and σ be the expected value and standard deviation of each Xt, respectively. Then, if we consider the sum over the T periods, Y = X1 + X2 + ⋯ + XT, we have E[Y] = Tμ and σY = σ√T. We see that the expected value scales linearly with time, whereas the standard deviation scales with the square root of time. Sometimes students and practitioners are…
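A quick simulation check of this scaling, with arbitrary values for μ, σ, and T:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, T, n_samples = 10.0, 2.0, 25, 100_000   # illustration values

# Y = X_1 + ... + X_T with i.i.d. normal terms (any i.i.d. distribution would do)
Y = rng.normal(mu, sigma, size=(n_samples, T)).sum(axis=1)

print(Y.mean(), T * mu)               # expected value scales linearly: ~250
print(Y.std(), sigma * np.sqrt(T))    # standard deviation scales with sqrt(T): ~10
```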

SUMS OF INDEPENDENT RANDOM VARIABLES

A recurring task in applications is summing random variables. If we have n random variables Xi, i = 1,…, n, we may build another random variable Y = X1 + X2 + ⋯ + Xn. What can we say about the distribution of Y? The answer depends on two important features of the terms in the sum. We will clarify what we mean by “independent random variables” formally but…
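To get a feel for why such features matter, here is a toy sketch of my own (not an example from the text) comparing the sum of two normals when the terms are independent versus perfectly dependent: the marginal distributions are identical, but the distribution of the sum is not.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

x1 = rng.normal(0.0, 1.0, n)
x2_indep = rng.normal(0.0, 1.0, n)   # independent of x1
x2_dep = x1                          # perfectly dependent on x1

print(np.var(x1 + x2_indep))   # ~2: variances add under independence
print(np.var(x1 + x2_dep))     # ~4: dependence changes the distribution of the sum
```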

Empirical distributions

Sometimes, no theoretical distribution seems to fit available data, and we resort to an empirical distribution. A standard way to build an empirical distribution is based on order statistics, i.e., sorted values from a sample. Assume that we have a sample of n values and order statistics X(i), i = 1,…, n, where X(i) ≤ X(i+1). The value X(1) is the smallest observation and X(n) is the largest one.…
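A minimal sketch of an empirical CDF built from order statistics; the plotting positions i/n used here are one common convention, and the text may adopt a different one.

```python
import numpy as np

def empirical_cdf(sample):
    """Return order statistics X_(i) and cumulative probabilities i/n."""
    x_sorted = np.sort(sample)          # order statistics X_(1) <= ... <= X_(n)
    n = len(x_sorted)
    probs = np.arange(1, n + 1) / n     # F_hat(X_(i)) = i/n
    return x_sorted, probs

# Toy data, made up for illustration
sample = np.array([3.2, 1.5, 4.8, 2.9, 3.7])
x, F = empirical_cdf(sample)
print(list(zip(x, F)))
```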

Normal distribution and its quantiles

The normal distribution is by far the most common, and misused, distribution in the theory of probability. It is also known as the Gaussian distribution, but the term “normal” illustrates its central role quite aptly. Its PDF has a seemingly awkward form, f(x) = 1/(σ√(2π)) · exp(−(x − μ)²/(2σ²)), depending on two parameters, μ and σ². Actually, we met such a…

Fig. 7.14 PDF of two normal distributions.
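A short Python check of the PDF and a few quantiles, with arbitrary parameter values, using scipy.stats.norm:

```python
from scipy.stats import norm

mu, sigma = 10.0, 2.0   # arbitrary parameters

# PDF at the mean: 1 / (sigma * sqrt(2*pi))
print(norm.pdf(mu, loc=mu, scale=sigma))

# Quantiles: norm.ppf inverts the CDF
for p in (0.5, 0.95, 0.99):
    print(p, norm.ppf(p, loc=mu, scale=sigma))
```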