This is the third in a series of articles I am writing on time series forecasting models. The first article was a real-world application of time series forecasting with US housing market data, with the purpose of demonstrating how forecasting is implemented and interpreted. In the second article, I outlined 5 simple steps to build a forecasting model, aimed at beginners who have never done forecasting before. In today’s article, I’m going to show how two different techniques can be applied to the same dataset and how to evaluate and compare their performance.

Among the many forecasting techniques out there, I’ve picked two models to evaluate: ARIMA and Holt-Winters exponential smoothing. Before going any further, I’ll briefly describe what they are and how they differ conceptually.

ARIMA (Autoregressive Integrated Moving Average): ARIMA is arguably the most popular and widely used statistical technique for forecasting. As the name suggests, ARIMA has 3 components: a) an _Autoregressive_ component that models the relationship between the series and its lagged values; b) a _Moving Average_ component that predicts future values as a function of lagged forecast errors; and c) an _Integrated_ component that makes the series stationary through differencing.

An ARIMA model, conventionally written as _ARIMA(p, d, q)_, takes the following parameters (a short fitting sketch follows the list):

  • p, the number of autoregressive lags;
  • d, the number of differences applied to make the series stationary; and
  • q, the size of the moving average window.
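To make this concrete, here is a minimal sketch of fitting an ARIMA model in Python with statsmodels. The file name, the series, and the (1, 1, 1) order are placeholder assumptions for illustration, not values from this article’s dataset.

```python
# Minimal ARIMA fitting sketch (illustrative; file name and order are assumptions).
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical univariate series with a datetime index; replace with your own data.
series = pd.read_csv("my_series.csv", index_col=0, parse_dates=True).iloc[:, 0]

# order=(p, d, q): p autoregressive lags, d differences, q moving-average terms.
model = ARIMA(series, order=(1, 1, 1))
fitted = model.fit()

# Forecast the next 12 periods and inspect the fitted coefficients.
forecast = fitted.forecast(steps=12)
print(fitted.summary())
print(forecast)
```

In practice, p, d, and q are usually chosen by inspecting ACF/PACF plots or by comparing information criteria such as AIC across candidate orders, rather than picked by hand.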

**Holt-Winters:** this is another suite of techniques that also builds forecasts from historical values. However, its key distinguishing feature is the so-called “exponential smoothing”.

If decomposed, a time series disaggregates into 3 components: trend, seasonality, and white noise (i.e., random variation). For forecasting purposes, we can predict the systematic components (trend and seasonality), but not the noise, which occurs in a random fashion. Exponential smoothing handles this kind of variability by smoothing out the white noise. A moving average can also smooth training data, but it does so by averaging past values and weighting them equally. In exponential smoothing, by contrast, past observations are weighted in an exponentially decreasing order: the most recent observations receive higher weights than older ones.
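Below is a hedged sketch of fitting a Holt-Winters model with statsmodels’ ExponentialSmoothing and comparing its holdout accuracy against an ARIMA fit, in the spirit of the comparison this article builds toward. The seasonal period of 12, the additive components, and the ARIMA order are illustrative assumptions, not the settings used for the article’s dataset.

```python
# Holt-Winters vs ARIMA on a holdout set (illustrative assumptions throughout).
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.arima.model import ARIMA

series = pd.read_csv("my_series.csv", index_col=0, parse_dates=True).iloc[:, 0]

# Hold out the last 12 observations for evaluation.
train, test = series[:-12], series[-12:]

# Additive trend and seasonality; use "mul" if seasonal swings grow with the level.
hw = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=12).fit()
hw_forecast = hw.forecast(len(test))

arima = ARIMA(train, order=(1, 1, 1)).fit()
arima_forecast = arima.forecast(steps=len(test))

def rmse(actual, predicted):
    """Root mean squared error between two aligned arrays."""
    return np.sqrt(np.mean((actual - predicted) ** 2))

print("Holt-Winters RMSE:", rmse(test.values, hw_forecast.values))
print("ARIMA RMSE:", rmse(test.values, arima_forecast.values))
```

Whether additive or multiplicative components fit better depends on whether the seasonal swings stay roughly constant or grow with the level of the series; the holdout error is one simple way to compare the two models on equal footing.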
