Economics 430
Lecture 6: Characterizing Cycles
ARMA Models
Today’s Class
• Review of the MA(q) Process
• Review of the AR(p) Process
• Seasonal AR(p) Model: S-AR(p)
• Seasonal MA(q) Model: S-MA(q)
• Rational Distributed Lags
• Autoregressive Moving Average (ARMA) Models
• R Example
• Full model: Trend + Seasonal + Cycle(s)
• R Example
• Recursive Estimation Procedures
Moving Average (MA) Models
• Moving-average (MA) models are always weakly stationary because they are finite linear combinations of a white noise sequence for which the first two moments are time invariant.
• Moving average processes are useful in describing phenomena in which events produce an immediate effect that only lasts for short periods of time.
• The MA model is a simple extension of the white noise series.
Moving Average (MA) Models
The MA(1) Process 1 of 4
• The first-order moving average process, MA(1), is:
yt = εt + θεt-1, εt ~ WN(0, σ²)
• Unconditional mean and variance are: E(yt) = 0 and var(yt) = (1 + θ²)σ²
• Conditional mean and variance are: E(yt|Ωt-1) = θεt-1 and var(yt|Ωt-1) = σ²
• One-period memory of the MA(1) process.
Moving Average (MA) Models
The MA(1) Process 2 of 4
• The autocovariance function for the MA(1) process is:
γ(τ) = θσ² for τ = 1, and γ(τ) = 0 otherwise.
• The autocorrelation function for the MA(1) process is:
ρ(τ) = θ/(1 + θ²) for τ = 1, and ρ(τ) = 0 otherwise.
• Sharp cutoff beyond displacement 1.
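This cutoff is easy to verify by simulation. A minimal R sketch, using base R's arima.sim and acf (θ = 0.8 is an arbitrary choice):

```r
## Simulate an MA(1) and check the sharp ACF cutoff at displacement 1.
set.seed(430)                                   # arbitrary seed
y <- arima.sim(model = list(ma = 0.8), n = 500)
acf(y)    # one significant spike at lag 1, then ~0 (sharp cutoff)
pacf(y)   # decays gradually, consistent with the inverted AR form
```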
Moving Average (MA) Models
The MA(1) Process 3 of 4
• Invertible MA(1) process: if |θ| < 1, we can 'invert' the MA(1) process. The inverted series is referred to as an autoregressive representation.
• Example: autoregressive representation of the MA(1) process yt = εt + θεt-1.
• Solve for εt: εt = yt - θεt-1.
Moving Average (MA) Models
The MA(1) Process 4 of 4
• After backward substitution:
εt = yt - θyt-1 + θ²yt-2 - θ³yt-3 + ...
• In lag operator notation, the autoregressive representation is:
[1/(1 + θL)] yt = εt
• The lag operator polynomial (1 + θL) has one root, which is obtained by solving 1 + θL = 0 for L:
L = -1/θ
• Note: The inverse of the root will be less than 1 in absolute value if |θ| < 1.
Moving Average (MA) Models
The MA(q) Process
• Consider the finite-order moving average process of order q, MA(q):
yt = εt + θ1εt-1 + ... + θqεt-q = Θ(L)εt,
where Θ(L) = 1 + θ1L + ... + θqL^q and εt ~ WN(0, σ²).
• The higher-order terms in the MA(q) process can capture more complex dynamic patterns.
• The MA(q) process is invertible, i.e., has an autoregressive representation, provided the inverses of all of the roots of Θ(L) are inside the unit circle.
• The MA(q) approximates an infinite moving average with a finite-order moving average.
Autoregressive (AR) Models
• Autoregressive (AR) models are always invertible. However, to be stationary, the roots of Φ(L) = 0 in Φ(L)yt = εt must lie outside the unit circle.
• Autoregressive processes are useful in describing situations in which the present value of a time series depends on its preceding values plus a random shock.
• AR processes are stochastic difference equations. They are used for modeling discrete-time stochastic dynamic processes (among others).
Autoregressive (AR) Models
The AR(1) Process 1 of 3
• The first-order autoregressive process, AR(1), is:
yt = φyt-1 + εt, εt ~ WN(0, σ²)
• Unconditional mean and variance are: E(yt) = 0 and var(yt) = σ²/(1 - φ²)
• Conditional mean and variance are: E(yt|yt-1) = φyt-1 and var(yt|yt-1) = σ²
Autoregressive (AR) Models
The AR(1) Process 2 of 3
• Yule-Walker equation: γ(τ) = φγ(τ-1) (recursive relation).
• The autocovariance function for the AR(1) process is:
γ(τ) = φ^τ σ²/(1 - φ²), τ = 0, 1, 2, ...
• The autocorrelation function for the AR(1) process is:
ρ(τ) = φ^τ, τ = 0, 1, 2, ...
• The partial autocorrelation function for the AR(1) process is:
p(τ) = φ for τ = 1, and p(τ) = 0 for τ > 1.
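Both patterns, gradual ACF decay and a one-spike PACF, show up clearly in simulation. A minimal R sketch (φ = 0.8 is an arbitrary choice):

```r
## Simulate an AR(1) and check the ACF decay and PACF cutoff.
set.seed(430)
y <- arima.sim(model = list(ar = 0.8), n = 500)
acf(y)    # decays geometrically: rho(tau) = 0.8^tau
pacf(y)   # single spike at lag 1, zero thereafter
```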
Autoregressive (AR) Models
The AR(1) Process 3 of 3
• After backward substitution:
yt = εt + φεt-1 + φ²εt-2 + ...
• In lag operator notation (moving average representation):
yt = [1/(1 - φL)] εt
• The moving average representation for y is convergent if and only if |φ| < 1 (the covariance stationarity condition for the AR(1) process).
Autoregressive (AR) Models
The AR(p) Process
• The general pth-order autoregressive process, AR(p), is:
yt = φ1yt-1 + φ2yt-2 + ... + φpyt-p + εt
• In lag operator form:
Φ(L)yt = εt, where Φ(L) = 1 - φ1L - φ2L² - ... - φpL^p
• The AR(p) process is covariance stationary if and only if the inverses of all roots of Φ(L) are inside the unit circle (a necessary condition: Σφi < 1). In that case, the AR(p) has a convergent infinite moving average representation.
Seasonal Autoregressive Models: S-AR(p)
Deterministic Seasonal Cycle
[Figure: simulated deterministic seasonal cycle, quarterly data 2000-2010, values between -1 and 4, with the recurring Q2, Q3, and Q4 observations labeled; Y = 1*Q1 + 0.5*Q2 + 0.5*Q3 + 3*Q4 + e]
Seasonal Autoregressive Models: S-AR(p)
Stochastic Seasonal Cycle
[Figure: simulated stochastic seasonal cycle, periods 20-40, values between -3 and 3; Y = 0.8*Y(-4) + e]
We will examine these cases in this class.
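The series in the figure can be reproduced by writing the seasonal AR(1) as an ordinary AR(4) whose first three coefficients are zero. A minimal R sketch:

```r
## Simulate the stochastic seasonal cycle Y = 0.8*Y(-4) + e.
set.seed(430)
y <- arima.sim(model = list(ar = c(0, 0, 0, 0.8)), n = 200)
plot.ts(y)   # quarterly pattern that shifts over time (stochastic cycle)
```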
Seasonal Autoregressive Models: S-AR(p)
• S-AR model: Yt = c + φs Yt-s + εt, where s = frequency.
• Convention:
– quarterly: s = 4
– monthly: s = 12
– daily: s = 7, or s = 5 (weekdays only)
• Example: S-AR(1) with s = 4 (quarterly data): Yt = c + φ4 Yt-4 + εt
Seasonal Autoregressive Models: S-AR(p)
• Def: S-AR(p) model = seasonal AR model of order p, where p = order and s = frequency:
Yt = c + φ1s Yt-1s + φ2s Yt-2s + ... + φps Yt-ps + εt
• We can also express the S-AR(p) model in lag-operator form as:
(1 - φ1s L^s - φ2s L^2s - ... - φps L^ps) Yt = c + εt
Seasonal Autoregressive Models: S-AR(p)
• Example 1: Identify the correct S-AR(p) model for the process Yt = c + φ4 Yt-4 + εt.
We can express it as Yt = c + φ1×4 Yt-1×4 + εt, so p = 1, s = 4 → S-AR(1) with s = 4 (quarterly).
• Example 2: Identify the correct S-AR(p) model for the process Yt = c + φ12 Yt-12 + εt.
We can express it as Yt = c + φ1×12 Yt-1×12 + εt, so p = 1, s = 12 → S-AR(1) with s = 12 (monthly).
Seasonal Autoregressive Models: S-AR(p)
• Example 3: Identify the correct S-AR(p) model for the process Yt = c + φ4 Yt-4 + φ8 Yt-8 + εt.
We can express it as Yt = c + φ1×4 Yt-1×4 + φ2×4 Yt-2×4 + εt, so p = 2, s = 4 → S-AR(2) with s = 4 (quarterly).
• Example 4: Identify the correct S-AR(p) model for the process Yt = c + φ12 Yt-12 + φ24 Yt-24 + εt.
We can express it as Yt = c + φ1×12 Yt-1×12 + φ2×12 Yt-2×12 + εt, so p = 2, s = 12 → S-AR(2) with s = 12 (monthly).
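These models can be estimated with the seasonal part of base R's arima(). A minimal sketch on simulated quarterly data (the true φ4 = 0.8 is an arbitrary choice):

```r
## Fit an S-AR(1) with s = 4 via arima()'s seasonal specification.
set.seed(430)
y   <- arima.sim(model = list(ar = c(0, 0, 0, 0.8)), n = 200)
fit <- arima(y, order = c(0, 0, 0),
             seasonal = list(order = c(1, 0, 0), period = 4))
fit   # the 'sar1' coefficient estimates phi_4 (true value 0.8)
```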
Seasonal AR(1)
[Figure: simulated seasonal AR(1) with its ACF and PACF at lags 4, 8, 12, 16]
ACF: spikes at s, 2s, ..., then decays to zero.
PACF: one spike → AR(1); spike at lag 4 → s = 4 (quarterly).
Seasonal AR(2)
[Figure: simulated seasonal AR(2) with its ACF and PACF at lags 4, 8, 12, 16]
ACF: spikes at s, 2s, ..., then decays to zero.
PACF: two spikes → AR(2); spikes at lags 4 and 8 → s = 4 (quarterly).
Seasonal AR(1) Example: Monthly Clothing Sales
[Figure: Clothing Sales (millions $), monthly, 2003-2010, ranging from roughly 6,000 to 22,000]
Q: What model would you suggest? An S-AR(1) with s = 12.
Seasonal Moving Average Models: S-MA(q)
• Def: S-MA(q) model = seasonal MA model of order q, where q = order and s = frequency:
Yt = μ + θ1s εt-1s + θ2s εt-2s + ... + θqs εt-qs + εt
• We can also express the S-MA(q) model in lag-operator form as:
Yt = μ + (1 + θ1s L^s + θ2s L^2s + ... + θqs L^qs) εt
Seasonal Moving Average Models: S-MA(q)
• Example 1: Identify the correct S-MA(q) model for the process Yt = μ + θ4 εt-4 + εt.
We can express it as Yt = μ + θ1×4 εt-1×4 + εt, so q = 1, s = 4 → S-MA(1) with s = 4 (quarterly).
• Example 2: Identify the correct S-MA(q) model for the process Yt = μ + θ12 εt-12 + εt.
We can express it as Yt = μ + θ1×12 εt-1×12 + εt, so q = 1, s = 12 → S-MA(1) with s = 12 (monthly).
Seasonal Moving Average Models: S-MA(q)
• Example 3: Identify the correct S-MA(q) model for the process Yt = μ + θ4 εt-4 + θ8 εt-8 + εt.
We can express it as Yt = μ + θ1×4 εt-1×4 + θ2×4 εt-2×4 + εt, so q = 2, s = 4 → S-MA(2) with s = 4 (quarterly).
• Example 4: Identify the correct S-MA(q) model for the process Yt = μ + θ12 εt-12 + θ24 εt-24 + εt.
We can express it as Yt = μ + θ1×12 εt-1×12 + θ2×12 εt-2×12 + εt, so q = 2, s = 12 → S-MA(2) with s = 12 (monthly).
Seasonal MA(1)
[Figure: simulated seasonal MA(1), Y = 2 + 0.9*e(-4) + e, periods 20-40, with its ACF and PACF at lags 4, 8, 12, 16]
ACF: one spike → MA(1); spike at lag 4 → s = 4 (quarterly).
PACF: spikes at s, 2s, ..., then decays to zero.
Seasonal MA(2)
[Figure: simulated seasonal MA(2), Y = 2 - e(-4) + 0.25*e(-8) + e, periods 20-40, with its ACF and PACF at lags 4, 8, 12, 16]
ACF: two spikes → MA(2); spikes at lags 4 and 8 → s = 4 (quarterly).
PACF: spikes at s, 2s, ..., then decays to zero.
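Both figures can be reproduced by writing the seasonal MA terms as ordinary MA coefficients at lags 4 and 8. A minimal R sketch:

```r
## Simulate the two seasonal MA processes and check the ACF cutoffs.
set.seed(430)
y1 <- 2 + arima.sim(model = list(ma = c(0, 0, 0, 0.9)), n = 200)
y2 <- 2 + arima.sim(model = list(ma = c(0, 0, 0, -1, 0, 0, 0, 0.25)),
                    n = 200)
acf(y1)   # one spike at lag 4        -> S-MA(1), s = 4
acf(y2)   # spikes at lags 4 and 8    -> S-MA(2), s = 4
```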
Rational Distributed Lags
• Rational distributed lags:
B(L) = Θ(L)/Φ(L), where Θ(L) is a polynomial of degree q and Φ(L) is a polynomial of degree p. (In practice we use this ratio as an approximation to an arbitrary lag polynomial B(L).)
• Rational distributed lags produce models of cycles that economize on parameters.
• If p and q are small (e.g., 0, 1, or 2), then estimation of B(L) is easy.
Autoregressive Moving Average (ARMA) Models: ARMA(p,q)
• Recall from the Wold representation that yt = B(L)εt, and that B(L) can be approximated by the rational lag polynomial Θ(L)/Φ(L): the denominator Φ(L) gives the AR(p) part and the numerator Θ(L) gives the MA(q) part.
• ARMA models are often highly accurate and highly parsimonious.
• The ARMA model combines the ideas of AR and MA models into a compact form so that the number of parameters used is kept small, achieving parsimony in parameterization.
Autoregressive Moving Average (ARMA) Models: ARMA(p,q) 1 of 2
• The ARMA(1,1) process: Yt = φYt-1 + θεt-1 + εt
• In lag operator form: (1 - φL)Yt = (1 + θL)εt,
where |φ| < 1 for stationarity and |θ| < 1 for invertibility.
• Moving average representation (if stationary): Yt = [(1 + θL)/(1 - φL)] εt
• Autoregressive representation (if invertible): [(1 - φL)/(1 + θL)] Yt = εt
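A minimal R sketch that simulates a stationary, invertible ARMA(1,1) and re-estimates it (φ = 0.5 and θ = 0.4 are arbitrary values satisfying both conditions):

```r
## Simulate a stationary, invertible ARMA(1,1) and recover phi, theta.
set.seed(430)
y   <- arima.sim(model = list(ar = 0.5, ma = 0.4), n = 500)
fit <- arima(y, order = c(1, 0, 1))
fit   # 'ar1' estimates phi, 'ma1' estimates theta
```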
Autoregressive Moving Average (ARMA) Models: ARMA(p,q) 2 of 2
• The ARMA(p,q) process:
Yt = φ1Yt-1 + ... + φpYt-p + εt + θ1εt-1 + ... + θqεt-q
• In lag operator form: Φ(L)Yt = Θ(L)εt,
where Φ(L) = 1 - φ1L - ... - φpL^p and Θ(L) = 1 + θ1L + ... + θqL^q.
• Moving average representation (if stationary): Yt = [Θ(L)/Φ(L)] εt
• Autoregressive representation (if invertible): [Φ(L)/Θ(L)] Yt = εt
Mystery Process
[Figure: sample ACF and PACF of a mystery process, with candidate models AR(2), MA(2), and ARMA(2,2)]
Summary 1 of 3
• AR(p): The current value of Yt can be found from past values, plus a random shock εt.
– Yt is regressed on past values of Yt.
• MA(q): The current value of Yt can be found from past shocks, plus a new shock/error εt.
– The time series is regarded as a moving average (unevenly weighted, because of different coefficients) of a random shock series εt.
Summary 2 of 3
• For MA models, the ACF is useful in specifying the order because the ACF cuts off at lag q for an MA(q) series.
• For AR models, the PACF is useful in order determination because the PACF cuts off at lag p for an AR(p) process.
• For an ARMA(p,q) process, lower-order models are preferred for parsimony. For example, an ARMA(1,1) is better than an AR(3).
Summary 3 of 3
• Full model: Yt = T + S + C
– Trend: e.g., linear or quadratic in time
– Seasonal: seasonal dummies
– Cycle: ARMA(p,q) dynamics
Full Model 1 of 2
• The full model includes a trend, seasonal dummies, and cyclical dynamics:
Yt = Tt(β) + Σi γi Dit + εt, with Φ(L)εt = Θ(L)vt and vt ~ WN(0, σ²),
where Tt(β) is the trend, Σi γi Dit the seasonality, εt the cycles, and vt the innovation.
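One way to estimate such a model in a single pass is to hand the trend and seasonal dummies to base R's arima() through its xreg argument, letting the ARMA part capture the cycle. A minimal sketch on simulated quarterly data (all parameter values are arbitrary):

```r
## Trend + seasonal dummies + AR(1) cycle, estimated jointly.
set.seed(430)
n   <- 120
tt  <- 1:n
qtr <- factor(rep(1:4, length.out = n))        # quarter labels
y   <- 0.05 * tt + c(1, 0.5, 0.5, 3)[qtr] +    # trend + seasonal pattern
       arima.sim(model = list(ar = 0.7), n = n)

Xreg <- model.matrix(~ tt + qtr)[, -1]   # trend + 3 seasonal dummies
fit  <- arima(y, order = c(1, 0, 0), xreg = Xreg)
fit
```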
Full Model 2 of 2
• We can now construct the h-step-ahead point forecast at time T, yT+h,T:
– Project the r.h.s. variables on the information set ΩT.
– Make the point forecast operational by replacing unknown parameters with their estimates.
• From the point forecast we can form an interval forecast (e.g., yT+h,T ± 1.96σh) and a density forecast (e.g., N(yT+h,T, σh²)).
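A minimal R sketch of the point and interval pieces for a fitted ARMA cycle (simulated data; the 95% interval uses the point forecast ± 1.96 times the forecast standard error):

```r
## h-step-ahead point and 95% interval forecasts from a fitted ARMA.
set.seed(430)
y   <- arima.sim(model = list(ar = 0.7, ma = 0.3), n = 200)
fit <- arima(y, order = c(1, 0, 1))
fc  <- predict(fit, n.ahead = 12)        # point forecasts and std errors
cbind(point = fc$pred,
      lower = fc$pred - 1.96 * fc$se,    # interval forecast
      upper = fc$pred + 1.96 * fc$se)    # density: N(point, se^2)
```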
Liquor Sales Example 1 of 7
Quadratic Trend Model. [Figure: liquor sales with fitted quadratic trend and its residuals]
Strong residual seasonality (maybe even cycles).
Liquor Sales Example 2 of 7
[Figure: residual sample autocorrelations with Bartlett bands; suggested model: S-AR(1) with s = 12]
Correlations outside the Bartlett bands indicate the presence of cyclical dynamics.
Liquor Sales Example 3 of 7
Quadratic Trend Model + Seasonality. [Figure: residuals]
No seasonality remains, but there is strong serial correlation. Highly predictable!
Liquor Sales Example 4 of 7
[Figure: residual sample autocorrelations and residual sample partial autocorrelations]
PACF cutoff at lag 3 → AR(3) model.
Liquor Sales Example 5 of 7
[Figure: log(sales), actual and fitted, with residuals]
No structure left in the residuals!
Recursive Estimation Procedures
• Strategy:
1. Start with a small sample of data
2. Estimate a model
3. Add an observation
4. Re-estimate the model
5. Continue until all data are used
• Useful for forecasting, stability assessment, and model selection. A minimal R sketch of the strategy follows.
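The sketch recursively re-estimates an AR(1) coefficient (the initial sample size of 20 and the true φ = 0.8 are arbitrary choices):

```r
## Recursive estimation: expand the sample one observation at a time.
set.seed(430)
y <- arima.sim(model = list(ar = 0.8), n = 200)

k       <- 20                              # initial sample size
phi_hat <- rep(NA_real_, length(y))
for (t in k:length(y)) {
  fit        <- arima(y[1:t], order = c(1, 0, 0))
  phi_hat[t] <- coef(fit)["ar1"]           # recursive estimate at time t
}
plot(phi_hat, type = "l", ylab = "recursive AR(1) estimate")
abline(h = 0.8, lty = 2)                   # true value for reference
```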
Recursive Parameter Estimation and Recursive Residuals 1 of 2
• Recursive estimates provide information about parameter stability.
• Linear model: yt = β1x1t + ... + βkxkt + εt, εt ~ iid N(0, σ²).
• If the model contains k parameters, start with the first k observations and estimate the model; then re-estimate using the first k+1 observations, and so on. This yields recursive parameter estimates β̂i,t for t = k, ..., T and i = 1, ..., k.
Recursive Parameter Estimation and Recursive Residuals 2 of 2
• 1-step-ahead forecast: ŷt+1,t = x't+1 β̂t
• Recursive residuals: êt+1,t = yt+1 - x't+1 β̂t, with variance σ²rt,
where rt > 1 for all t. Note: rt depends on the data.
• Standardized recursive residuals: wt+1,t = êt+1,t/(σ√rt), where wt+1,t ~ N(0, 1) under parameter stability.
• Cumulative sum: CUSUMt = Σ wτ+1,τ (summing from τ = k), where t = k, ..., T-1.
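A minimal base-R sketch of these formulas on a simulated linear model (the regressor, coefficients, and sample size are arbitrary; σ is estimated from the scaled recursive residuals themselves):

```r
## Recursive residuals, standardization, and CUSUM for y = X*beta + e.
set.seed(430)
n <- 100
X <- cbind(1, rnorm(n))                  # intercept + one regressor
y <- X %*% c(1, 0.5) + rnorm(n)

k <- ncol(X)
e <- rep(NA_real_, n)                    # recursive residuals, scaled by sqrt(r_t)
for (t in (k + 1):n) {
  Xt   <- X[1:(t - 1), , drop = FALSE]
  bt   <- solve(crossprod(Xt), crossprod(Xt, y[1:(t - 1)]))  # OLS up to t-1
  rt   <- 1 + X[t, ] %*% solve(crossprod(Xt)) %*% X[t, ]     # r_t > 1
  e[t] <- (y[t] - X[t, ] %*% bt) / sqrt(rt)
}
w     <- e[!is.na(e)] / sd(e, na.rm = TRUE)  # standardized residuals
cusum <- cumsum(w)
plot(cusum, type = "l", main = "CUSUM of recursive residuals")
```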
Liquor Sales Example 6 of 7
[Figure: CUSUM of recursive residuals]
The cumulative sum is used to assess parameter stability; its variance grows as we include more observations. What happened?
Liquor Sales Example 7 of 7
Breaking Parameter Model
Model Selection Based on Simulated Forecasting Performance 1 of 2
• Cross validation: select among N forecasting models (a minimal R sketch follows the list).
1. Start with model 1.
2. Estimate it using all data observations except the first.
3. Use it to forecast the first observation.
4. Compute the squared forecast error.
5. Continue estimating the model with one observation deleted, and then using the estimated model to forecast the deleted observation, until each observation has been sequentially deleted.
6. Average the squared errors.
7. Repeat the procedure for model n = 1, ..., N.
• Select the model with the smallest average squared error.
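The sketch compares two hypothetical trend specifications on simulated data (the models and the data-generating process are illustrative only):

```r
## Leave-one-out cross validation over N = 2 candidate models.
set.seed(430)
d   <- data.frame(t = 1:80)
d$y <- 10 + 0.5 * d$t + rnorm(80, sd = 4)     # simulated trend + noise

loocv <- function(formula, data) {
  se <- numeric(nrow(data))
  for (i in seq_len(nrow(data))) {
    fit   <- lm(formula, data = data[-i, ])   # delete observation i
    se[i] <- (data$y[i] - predict(fit, data[i, ]))^2
  }
  mean(se)                                    # average squared error
}

c(linear    = loocv(y ~ t, d),
  quadratic = loocv(y ~ t + I(t^2), d))       # pick the smaller value
```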
Model Selection Based on Simulated Forecasting Performance 2 of 2
Recursive cross validation (a minimal R sketch follows):
• Let the initial estimation sample run from t = 1, ..., T*.
• Let the 'holdout sample' run from t = T*+1, ..., T.
• For each model:
– Estimate the model using observations t = 1, ..., T*.
– Use the model to forecast observation T*+1 and compute the associated squared error.
– Update the sample by one observation (add T*+1) and estimate the model using t = 1, ..., T*+1.
– Forecast observation T*+2 and compute the associated squared error.
– Repeat the previous steps until the sample is exhausted.
• Average the squared errors in predicting observations T*+1 through T.
• Select the model with the smallest average squared forecast error.
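The sketch compares two candidate AR orders using an expanding window (T* = 90 and the orders are arbitrary choices):

```r
## Recursive cross validation: estimate on 1..T*, forecast T*+1, expand.
set.seed(430)
n     <- 120
y     <- arima.sim(model = list(ar = 0.7), n = n)
Tstar <- 90                                # end of initial estimation sample

rcv <- function(order) {
  se <- numeric(0)
  for (Ts in Tstar:(n - 1)) {
    fit <- arima(y[1:Ts], order = order)
    fc  <- predict(fit, n.ahead = 1)$pred  # 1-step-ahead forecast
    se  <- c(se, (y[Ts + 1] - fc)^2)       # squared forecast error
  }
  mean(se)                                 # average over the holdout sample
}

c(ar1 = rcv(c(1, 0, 0)), ar2 = rcv(c(2, 0, 0)))  # smaller is better
```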