Time Series

Definitions

Time Series

Stationary

Types

Univariate

single RV

Multivariate

multiple random variables

Seasonality

Patterns in the RV values that repeat at a regular interval (period)

Trend

Increasing/decreasing slope in the time series

Autocorrelation
(serial corr)

Correlation of the series with a lagged copy of itself:
current RV values correlated with past values at some LAG
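A minimal sketch of lag-k autocorrelation in NumPy (the sine series and lags are made-up illustration data): a series that repeats every 20 steps correlates strongly with itself at lag 20 and anti-correlates at lag 10, half a period.

```python
import numpy as np

def autocorr(series, lag):
    """Sample autocorrelation of a 1-D series at the given lag."""
    s = np.asarray(series, dtype=float)
    s = s - s.mean()
    # Correlate the series with a copy of itself shifted by `lag` steps.
    return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

# A sine wave with period 20: high autocorrelation at a full period,
# strongly negative at half a period.
t = np.arange(200)
wave = np.sin(2 * np.pi * t / 20)
print(round(autocorr(wave, 20), 2))   # ≈ 0.9  (overlap shrinks with the lag)
print(round(autocorr(wave, 10), 2))   # ≈ -0.95
```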

Nonstationary

When statistical properties (mean, std) change over time
and there is no trend or seasonality

Affects training methodology!!

Modeling

Evaluation

Data selection strategies

Fixed partitioning

Data split into fixed chronological chunks for train/val/test,
e.g. 80% train + 10% val + 10% test

Roll-forward partitioning

Gradually add more data to the training set (e.g. week by week)
and validate on the period that immediately follows it
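Both strategies can be sketched with plain index arithmetic (the 100-point series, the 80/10/10 split, and the 10-point "week" are all made-up stand-ins):

```python
import numpy as np

series = np.arange(100)  # stand-in for 100 time steps of data

# Fixed partitioning: one chronological 80/10/10 split, never shuffled.
train, val, test = series[:80], series[80:90], series[90:]

# Roll-forward partitioning: grow the training window step by step
# (here 10 points per round, a stand-in for "week by week") and
# validate on the slice that immediately follows it.
for end in range(50, 90, 10):
    fold_train, fold_val = series[:end], series[end:end + 10]
    print(len(fold_train), len(fold_val))
```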

Features

Moving average

Models

naive forecasting

v(t) = v(t-1)

model average

v(t) = avg(v(t-X) .. v(t-1))
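Both baselines fit in a few lines of NumPy (the six-point series and the window X = 3 are arbitrary illustration choices; MAE is used here just to compare them):

```python
import numpy as np

series = np.array([3.0, 4.0, 5.0, 4.0, 6.0, 7.0])

# Naive forecast: v(t) = v(t-1), i.e. the series shifted by one step.
naive = series[:-1]                      # predictions for series[1:]
naive_mae = np.mean(np.abs(series[1:] - naive))

# Model average: v(t) = avg(v(t-X) .. v(t-1)), here with X = 3.
X = 3
avg = np.array([series[t - X:t].mean() for t in range(X, len(series))])
avg_mae = np.mean(np.abs(series[X:] - avg))

print(round(naive_mae, 3), round(avg_mae, 3))   # 1.2 1.222
```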

A time series can be tested for stationarity:
rolling statistics, the Augmented Dickey-Fuller (ADF) test, etc.
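A crude stand-in for the rolling-statistics check (the synthetic series, window size, and thresholds are made up; for a formal test use the Augmented Dickey-Fuller test, e.g. `statsmodels.tsa.stattools.adfuller`): if the mean of an early window differs a lot from the mean of a late window, the series is likely nonstationary.

```python
import numpy as np

def rolling_mean_drift(series, window):
    """Gap between the mean of the first and last `window` points.

    A large gap suggests the mean changes over time, i.e. the
    series is likely nonstationary.
    """
    s = np.asarray(series, dtype=float)
    return abs(s[-window:].mean() - s[:window].mean())

rng = np.random.default_rng(0)
noise = rng.normal(size=500)             # stationary: constant mean
trend = noise + np.linspace(0, 10, 500)  # nonstationary: drifting mean

print(rolling_mean_drift(trend, 100) > rolling_mean_drift(noise, 100))  # True
```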

Techniques to convert to stationary

Differencing

Transformation

Several orders of the basic operation:
difference = current observation - previous observation
(subtract the previous observation from the current one)

Taking the log, roots, etc. of the observations, depending on the
trend present
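Both techniques in NumPy (the five-point series is made up; note both operations are invertible, which matters for getting forecasts back to the original scale):

```python
import numpy as np

series = np.array([10.0, 12.0, 15.0, 19.0, 24.0])  # upward trend

# First-order differencing: current observation minus the previous one.
diff1 = np.diff(series)        # [2. 3. 4. 5.]
# Second order: difference the differences.
diff2 = np.diff(series, n=2)   # [1. 1. 1.]

# A log transformation damps a multiplicative/exponential trend;
# invert it with np.exp to return to the original scale.
logged = np.log(series)
restored = np.exp(logged)
print(np.allclose(restored, series))  # True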

Moving Average

Several data points (as many as the window width) are combined into one value

Weighted moving avg

Distant points are weighted less than more recent ones

Centered Moving Average

t-1, t, t+1

Trailing Moving Average

t-2, t-1, t
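A sketch of trailing and weighted moving averages with `np.convolve` (the series, the width-3 window, and the 1/2/3 weights are arbitrary illustration choices; a centered moving average uses the same values, just aligned to the middle of the window):

```python
import numpy as np

series = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
w = 3  # window width: three data points collapse into one value

# Trailing moving average at time t averages t-2, t-1, t.
trailing = np.convolve(series, np.ones(w) / w, mode="valid")
print(trailing)  # [2. 3. 4. 5.]

# Weighted moving average: recent points count more than distant ones.
weights = np.array([1.0, 2.0, 3.0])   # oldest -> newest (hypothetical weights)
weights = weights / weights.sum()
# np.convolve flips its kernel, so reverse the weights first.
weighted = np.convolve(series, weights[::-1], mode="valid")
print(weighted)  # last window (4,5,6): (4*1 + 5*2 + 6*3)/6 = 32/6 ≈ 5.33
```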

Forecasting


Steps


Step 1: Understand the time series characteristics such as trend, seasonality, etc.

Step 2: Analyze the series and identify the best method to make it stationary.

Step 3: Note down the transformation steps performed to make the time series stationary, and make sure the reverse transformation is possible to get back to the original scale.

Step 4: Based on the analysis, choose an appropriate model for time series forecasting.

Step 5: Assess the performance of the model with a simple metric such as the residual sum of squares (RSS). Make sure to use the whole data for prediction.

Step 6: The predictions are now in the transformed scale; apply the reverse transformation to get them back to the original scale.

Step 7: Finally, forecast future values and convert them back to the original scale.
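The steps above can be sketched end to end on a toy series (everything here is made up: the series grows by 10% per step, the "model" is a deliberately trivial naive forecast on the stationary scale, and RSS is computed in-sample):

```python
import numpy as np

# Step 1: a hypothetical series with a multiplicative trend.
series = np.array([100.0, 110.0, 121.0, 133.1, 146.41])

# Steps 2-3: log transform, then first-order differencing; both invertible.
logged = np.log(series)
diffed = np.diff(logged)

# Step 4: a deliberately simple model -- naive forecast on the
# stationary scale (each diff predicted by the previous diff).
pred_diff = diffed[:-1]

# Step 5: residual sum of squares on the transformed scale.
rss = np.sum((diffed[1:] - pred_diff) ** 2)

# Steps 6-7: reverse the transformations (undo differencing by adding
# back the previous log value, undo the log with np.exp).
pred_log = logged[1:-1] + pred_diff
pred = np.exp(pred_log)
print(np.round(pred, 2))  # predictions back on the original scale
```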

Models

Autoregression (AR)

Moving Average (MA)

Autoregressive Moving Average (ARMA)

Autoregressive Integrated Moving Average (ARIMA)

Seasonal Autoregressive Integrated Moving-Average (SARIMA)

Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)

Vector Autoregression (VAR)

Vector Autoregression Moving-Average (VARMA)

Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)

Simple Exponential Smoothing (SES)

Holt Winter’s Exponential Smoothing (HWES)

ARIMA stands for AutoRegressive Integrated Moving Average. It combines the AR and MA models, with differencing as the "integrated" part.

ARIMA has three parameters: 'p' for the order of the AutoRegressive (AR) part, 'd' for the order of differencing (the integrated part), and 'q' for the order of the Moving Average (MA) part.

Rather than using past values of the forecast variable in a regression, a moving average model uses a linear combination of past forecast errors.

To figure out the order of an MA model we use the ACF (autocorrelation function).
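As a minimal illustration of the AR part only (the data-generating rule y(t) = 0.6*y(t-1) + 5 and the ordinary-least-squares fit are illustrative assumptions; a real ARIMA fit would use a library such as statsmodels), an AR(p) model can be fit by regressing each value on its p predecessors:

```python
import numpy as np

def fit_ar(series, p):
    """Fit y(t) = a_1*y(t-1) + ... + a_p*y(t-p) + c by least squares.

    A minimal stand-in for a full ARIMA fit; returns (a_1 .. a_p, c).
    """
    y = np.asarray(series, dtype=float)
    # Each row holds the p previous values, plus an intercept column.
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    X = np.column_stack([X, np.ones(len(y) - p)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

# Data generated by y(t) = 0.6*y(t-1) + 5, so AR(1) should recover ~(0.6, 5).
y = [1.0]
for _ in range(50):
    y.append(0.6 * y[-1] + 5)
coef = fit_ar(y, 1)
print(np.round(coef, 3))  # ≈ [0.6 5.]
```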

Deep Learning

Notes

Mind the sequential nature of the data! Shuffle only after windowing the
series into (input, target) training examples, never the raw series, so the
temporal order inside each example is preserved.
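A sketch of that windowing step (the ten-point series and width 3 are made-up choices): each example keeps its window and target glued together, so shuffling the examples leaks no future information into the past.

```python
import numpy as np

def make_windows(series, width):
    """Slice a series into (input window, next-value target) pairs."""
    s = np.asarray(series, dtype=float)
    X = np.array([s[t:t + width] for t in range(len(s) - width)])
    y = s[width:]
    return X, y

series = np.arange(10.0)
X, y = make_windows(series, width=3)

# Shuffling these *pairs* is safe: each target stays attached to the
# window that precedes it in time.
rng = np.random.default_rng(0)
order = rng.permutation(len(X))
X, y = X[order], y[order]
print(X.shape, y.shape)  # (7, 3) (7,)
```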