In finite-order moving-average processes, only a finite number of past realizations of white noise influence the value of Y_t. This may be a limitation for processes in which all of the previous realizations have an effect, even though that effect may fade over time. This consideration led us from forecasting using simple moving averages to exponential… Continue reading Autoregressive Processes
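To see how every past shock can matter with a fading weight, here is a minimal sketch assuming a standard AR(1) recursion Y_t = φ·Y_{t−1} + ε_t (the function names are illustrative, not from the text): unrolling the recursion shows Y_t as a sum of all past shocks weighted by geometrically decaying powers of φ.

```python
import random

# Illustration (assumed AR(1) form, not code from the text):
# Y_t = phi * Y_{t-1} + eps_t unrolls into Y_t = sum_i phi^i * eps_{t-i},
# so every past shock contributes, with a geometrically fading weight.
def ar1_recursive(shocks, phi):
    y = 0.0
    for eps in shocks:
        y = phi * y + eps
    return y

def ar1_unrolled(shocks, phi):
    # The oldest shock gets the largest power of phi.
    n = len(shocks)
    return sum(phi ** (n - 1 - i) * eps for i, eps in enumerate(shocks))

random.seed(42)
shocks = [random.gauss(0.0, 1.0) for _ in range(50)]
a = ar1_recursive(shocks, 0.6)
b = ar1_unrolled(shocks, 0.6)
# a and b agree up to floating-point rounding
```

The two computations coincide, which is exactly the sense in which an autoregressive process depends on the entire past, unlike a finite-order moving average.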
Month: February 2023
Moving-average processes
A finite-order moving-average process of order q, denoted by MA(q), can be expressed as Y_t = μ + ε_t + θ_1 ε_{t−1} + ⋯ + θ_q ε_{t−q}, where the random variables ε_t are white noise, with E[ε_t] = 0 and Var(ε_t) = σ². These variables play the role of random shocks and drive the process. It is fairly easy to see that the process is weakly stationary. A first observation is that the expected value and variance are constant: E[Y_t] = μ and Var(Y_t) = σ²(1 + θ_1² + ⋯ + θ_q²). The calculation of… Continue reading Moving-average processes
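As a quick sanity check of these formulas, the following sketch simulates an MA(2) process with μ = 10, θ_1 = 0.5, θ_2 = −0.3, σ² = 1 and compares the sample mean and variance with the theoretical values (the simulator is an illustrative assumption, not code from the text):

```python
import random

# Minimal MA(q) simulator (standard textbook form):
# Y_t = mu + eps_t + theta_1*eps_{t-1} + ... + theta_q*eps_{t-q}
def simulate_ma(mu, thetas, n, seed=0):
    rng = random.Random(seed)
    q = len(thetas)
    eps = [rng.gauss(0.0, 1.0) for _ in range(n + q)]  # white noise, sigma = 1
    return [mu + eps[t] + sum(thetas[j] * eps[t - 1 - j] for j in range(q))
            for t in range(q, n + q)]

y = simulate_ma(mu=10.0, thetas=[0.5, -0.3], n=200_000)
mean = sum(y) / len(y)
var = sum((v - mean) ** 2 for v in y) / len(y)
# Theory: E[Y_t] = 10, Var(Y_t) = 1 * (1 + 0.5^2 + 0.3^2) = 1.34
```

Both sample statistics land close to the theoretical constants, illustrating the weak stationarity of the process.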
A GLANCE AT ADVANCED TIME SERIES MODELING
The class of exponential smoothing methods was born out of heuristic intuition, even though methodological frameworks were later developed to provide them with a somewhat more solid justification. Despite these efforts, exponential smoothing methods do suffer from at least a couple of drawbacks. It is also worth noting that simple linear regression models share some… Continue reading A GLANCE AT ADVANCED TIME SERIES MODELING
Smoothing with trend and multiplicative seasonality
The last exponential smoothing approach we consider puts everything together and copes with additive trend and multiplicative seasonality. The Holt–Winters method is based on the demand model of Eq. (11.13), which we repeat for convenience: Y_t = (B + T·t)·S_t + ε_t, where B is the level, T the trend, and S_t the seasonal factor. The overall scheme uses three smoothing coefficients and proceeds as follows. All of the remarks we have made about simpler versions of exponential smoothing apply here… Continue reading Smoothing with trend and multiplicative seasonality
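Since the excerpt omits the update equations, here is one step of the scheme as a sketch, assuming the standard Holt–Winters recursions with level B, trend T, seasonal factor S, and smoothing coefficients α, β, γ (function and variable names are my own, not from the text):

```python
# Assumed standard Holt-Winters recursions (additive trend,
# multiplicative seasonality); S_{t-s} is the factor of the same
# bucket one cycle ago:
#   B_t = alpha * (Y_t / S_{t-s}) + (1 - alpha) * (B_{t-1} + T_{t-1})
#   T_t = beta  * (B_t - B_{t-1}) + (1 - beta)  * T_{t-1}
#   S_t = gamma * (Y_t / B_t)     + (1 - gamma) * S_{t-s}
def holt_winters_step(y, level, trend, season_factor, alpha, beta, gamma):
    new_level = alpha * (y / season_factor) + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    new_season = gamma * (y / new_level) + (1 - gamma) * season_factor
    return new_level, new_trend, new_season

level, trend, season = holt_winters_step(
    y=120.0, level=100.0, trend=2.0, season_factor=1.2,
    alpha=0.2, beta=0.1, gamma=0.1)
# One-step-ahead forecast: (level + trend) scaled by the seasonal factor
# of the bucket being forecast (we reuse the updated factor here only
# for illustration).
forecast = (level + trend) * season
```

Note how the observation is first deseasonalized (divided by the old seasonal factor) before it updates the level, and the level in turn deseasonalizes the observation to refresh the factor.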
Smoothing with multiplicative seasonality
In this section we consider the case of pure seasonality. Forecasts are based on the demand model of Eq. (11.13), in which the trend parameter is set to zero: Y_t = B·S_t + ε_t, where the seasonal factors repeat with period s, the length of the seasonal cycle, i.e., a whole cycle consists of s time buckets. To get a grip on this model, imagine a yearly cycle consisting of 12 monthly… Continue reading Smoothing with multiplicative seasonality
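The corresponding smoother keeps one level estimate and one seasonal factor per time bucket in the cycle. A sketch of a single update, assuming the usual recursions with trend dropped (names are illustrative, not from the text):

```python
# Assumed recursions for pure multiplicative seasonality (trend = 0):
#   B_t = alpha * (Y_t / S_{t-s}) + (1 - alpha) * B_{t-1}
#   S_t = gamma * (Y_t / B_t)     + (1 - gamma) * S_{t-s}
def seasonal_step(y, level, season_factor, alpha, gamma):
    new_level = alpha * (y / season_factor) + (1 - alpha) * level
    new_season = gamma * (y / new_level) + (1 - gamma) * season_factor
    return new_level, new_season

# Example: a factor of 1.5 marks a bucket whose demand runs 50% above
# the deseasonalized level. If the observation is exactly consistent
# with the current estimates (y = 100 * 1.5), nothing changes.
level, factor = seasonal_step(y=150.0, level=100.0, season_factor=1.5,
                              alpha=0.3, gamma=0.1)
```

An observation that matches level times factor leaves both estimates at their fixed point, which is a handy consistency check on the recursions.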
Smoothing with trend
Demand may exhibit additive trend components that, in a static case, could be represented by the following demand model: Y_t = B + T·t + ε_t, where B is the level and T is the trend. Looking at the demand model, linear regression seems a natural candidate to estimate these two parameters. However, level and trend might change over time, suggesting the use of a dynamic… Continue reading Smoothing with trend
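The dynamic counterpart is Holt's trend-corrected smoothing. A sketch of one update, assuming the standard recursions for level B and trend T (illustrative code, not from the text):

```python
# Assumed Holt recursions:
#   B_t = alpha * Y_t + (1 - alpha) * (B_{t-1} + T_{t-1})
#   T_t = beta * (B_t - B_{t-1}) + (1 - beta) * T_{t-1}
def holt_step(y, level, trend, alpha, beta):
    new_level = alpha * y + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    return new_level, new_trend

# On a perfectly linear series Y_t = t, correct initial estimates
# (B = 0, T = 1) are a fixed point: the smoother tracks the true line.
level, trend = 0.0, 1.0
for t in range(1, 200):
    level, trend = holt_step(float(t), level, trend, alpha=0.3, beta=0.1)
forecast = level + trend  # forecast for the next time bucket
```

The forecast for bucket t + 1 is simply the current level plus one trend increment; longer horizons add one increment per bucket ahead.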
Stationary demand: initialization and choice of α
One obviously weird feature of Eq. (11.18) is that it involves an infinite sequence of observations. However, in real life we do not have an infinite number of observations; the sum must be truncated somewhere in the past, right before we started collecting information. The oldest term in the average, in practice, corresponds to the initialization of the algorithm. To… Continue reading Stationary demand: initialization and choice of α
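A small numerical sketch makes the point about initialization, assuming the standard recursion B_t = α·Y_t + (1 − α)·B_{t−1} behind the infinite weighted average (the code is illustrative, not from the text): the initial estimate B_0 enters the average with weight (1 − α)^t, so its influence dies out geometrically.

```python
# Run the smoother forward from a given initialization b0; the final
# estimate depends on b0 only through the factor (1 - alpha)^T.
def smooth(observations, alpha, b0):
    b = b0
    for y in observations:
        b = alpha * y + (1 - alpha) * b
    return b

obs = [10.0] * 60
alpha = 0.2
# Two very different initializations converge once enough data is seen:
# the gap is 100 * (1 - alpha)^60, a negligible residue of B_0.
gap = abs(smooth(obs, alpha, b0=0.0) - smooth(obs, alpha, b0=100.0))
```

So a crude initialization (say, an average of the first few observations) is harmless provided a reasonable warm-up period is available.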
Stationary demand: three views of a smoother
In this section, we deal with the case of stationary demand, as represented by Eq. (11.14). In simple exponential smoothing we estimate the level parameter B_t by a mix of new and old information: B_t = α·Y_t + (1 − α)·B_{t−1}, where α is a coefficient in the interval [0, 1]. In (11.16), the new information consists of the last observation of demand Y_t, and the old information consists of… Continue reading Stationary demand: three views of a smoother
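The update is a one-liner, and unrolling it exposes the "weighted average" view: observation Y_{t−i} receives weight α(1 − α)^i, so the weights decay geometrically and sum toward 1 (a minimal sketch; names are illustrative, not from the text):

```python
# Simple exponential smoothing update: a convex combination of the
# newest observation and the previous level estimate.
def ses_update(y_new, b_old, alpha):
    return alpha * y_new + (1 - alpha) * b_old

alpha = 0.25
# Weight on the i-th most recent observation after unrolling the
# recursion: alpha * (1 - alpha)^i, i = 0, 1, 2, ...
weights = [alpha * (1 - alpha) ** i for i in range(10)]
# The first 10 weights already sum to 1 - (1 - alpha)^10, i.e. most
# of the total mass sits on recent observations.
```

This is why the method is called "exponential" smoothing: the influence of an observation decays exponentially with its age.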
HEURISTIC EXPONENTIAL SMOOTHING
Exponential smoothing algorithms are a class of widely used forecasting methods that originated from heuristic insight. Originally, they lacked a proper statistical foundation, unlike the more sophisticated time series models that we outline in Section 11.6. More recently, attempts to justify exponential smoothing have been put forward, but the bottom line is that… Continue reading HEURISTIC EXPONENTIAL SMOOTHING
Choice of time window
In choosing the time window k, we have to consider a tradeoff between stability and promptness. If k is large, the method has a lot of inertia and is not significantly influenced by occasional variability; however, it will be slow to adapt to systematic changes. On the contrary, if k is small, the algorithm will be very prompt, but also very nervous and… Continue reading Choice of time window
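The tradeoff is easy to see numerically, assuming a plain k-term moving-average forecast F_t = (Y_{t−1} + ⋯ + Y_{t−k})/k (the scenario and names below are illustrative, not from the text):

```python
# Forecast the next bucket as the average of the last k observations.
def moving_average_forecast(history, k):
    window = history[-k:]
    return sum(window) / len(window)

# Demand jumps from 100 to 200; compare how quickly each window reacts
# three buckets after the shift.
history = [100.0] * 10 + [200.0] * 3
fast = moving_average_forecast(history, k=3)   # prompt but nervous
slow = moving_average_forecast(history, k=10)  # heavy inertia
# fast has fully adapted to the new level (200.0), while slow is still
# dragged down by old observations ((7*100 + 3*200)/10 = 130.0).
```

The short window has already locked onto the new level, while the long one lags far behind; under pure noise, the same short window would bounce around just as eagerly.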