Economics 430
Lecture 7
Unit Roots, Stochastic Trends, ARIMA, Forecasting Models, and Smoothing
Today’s Class
• Stochastic Trends and Forecasting
– Random Walk without Drift
– Random Walk with Drift
– ARIMA
• Unit Roots: Estimation and Testing
– Least Squares Regression with Unit Roots
– Effects of Unit Roots on the ACF and PACF
– Unit Root Tests
• R Example
Stochastic Trends and Forecasting
• In economics we often encounter nonstationary series (a.k.a. unit-root nonstationary), e.g., interest rates, foreign exchange rates, and the price series of an asset of interest.
• Consider an ARMA(p,q) process where one of the p roots of its autoregressive lag-operator polynomial is 1 (a unit root): Φ(L)yt = Θ(L)εt with Φ(L) = Φ'(L)(1 − L), so that Δyt = (1 − L)yt is covariance stationary.
• A nonstationary series is integrated if its nonstationarity is undone by differencing.
Stochastic Trends and Forecasting – Random Walk
• If only one difference is required, the series is said to be integrated of order 1, I(1). In general, for d differences we have I(d), where the number of differences equals the number of unit roots.
• Random Walk: an AR(1) process with unit coefficient, yt = yt-1 + εt, εt ~ WN(0, σ2).
• A random walk is the cumulative sum of white-noise shocks.
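A random walk is easy to simulate by cumulatively summing white-noise shocks, and first-differencing recovers the shocks. A minimal sketch in Python (illustrative; the variable names are our own, not from the lecture):

```python
import random

random.seed(42)
T = 500
eps = [random.gauss(0, 1) for _ in range(T)]  # white noise, WN(0, 1)

# Random walk: y_t = y_{t-1} + eps_t, with y_0 = 0
y = [0.0]
for e in eps:
    y.append(y[-1] + e)

# First differencing undoes the unit root: dy_t = eps_t
dy = [y[t] - y[t - 1] for t in range(1, len(y))]
# dy matches eps up to floating-point rounding
```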
Stochastic Trends and Forecasting – Random Walk
• Random Walk with Drift: an AR(1) process with unit coefficient and intercept, yt = δ + yt-1 + εt, εt ~ WN(0, σ2).
• Stochastic Trend = random walk with (or without) drift.
Stochastic Trends and Forecasting – Random Walk
• Random walk: given yt = yt-1 + εt, εt ~ WN(0, σ2), and y(0) = y0, back-substitution gives yt = y0 + Σi=1..t εi, so E[yt] = y0 and Var(yt) = tσ2. The variance grows with t, so the process is nonstationary.
• Random walk with drift: given yt = δ + yt-1 + εt, εt ~ WN(0, σ2), and y(0) = y0, we get yt = y0 + δt + Σi=1..t εi, so E[yt] = y0 + δt and Var(yt) = tσ2.
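These moments can be checked by Monte Carlo: simulate many drifting random walks and compare the sample mean and variance of yT with y0 + δT and Tσ2. A sketch (the parameter values are our own):

```python
import random
import statistics

random.seed(0)
T, delta, sigma, y0 = 200, 0.5, 1.0, 10.0
reps = 2000

finals = []
for _ in range(reps):
    y = y0
    for _ in range(T):
        y += delta + random.gauss(0, sigma)  # y_t = delta + y_{t-1} + eps_t
    finals.append(y)

mean_T = statistics.fmean(finals)    # close to y0 + delta*T = 110
var_T = statistics.variance(finals)  # close to T*sigma^2 = 200
```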
Stochastic Trends and Forecasting – ARIMA(p,1,q)
• ARIMA: Autoregressive integrated moving average.
• The ARIMA(p,1,q) process is a stationary and invertible ARMA(p,q) process in first differences:
Φ(L)(1 − L)yt = Θ(L)εt,
where Φ(L) = 1 − φ1L − ... − φpL^p and Θ(L) = 1 + θ1L + ... + θqL^q.
Stochastic Trends and Forecasting – ARIMA(p,d,q)
• In general, the ARIMA(p,d,q) model is
Φ(L)(1 − L)^d yt = Θ(L)εt,
where Φ(L) = 1 − φ1L − ... − φpL^p and Θ(L) = 1 + θ1L + ... + θqL^q.
• The ARIMA(p,d,q) process is a stationary and invertible ARMA(p,q) after differencing d times.
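Differencing d times, i.e., applying (1 − L)^d, reduces an I(d) series to stationarity. The sketch below builds an I(2) series by cumulating white noise twice and recovers the original shocks with two differences (the helper name `diff` is our own):

```python
import random

def diff(series, d=1):
    """Apply the first-difference operator (1 - L) d times."""
    for _ in range(d):
        series = [series[t] - series[t - 1] for t in range(1, len(series))]
    return series

random.seed(1)
eps = [random.gauss(0, 1) for _ in range(300)]

rw = [0.0]                 # I(1): cumulative sum of white noise
for e in eps:
    rw.append(rw[-1] + e)
i2 = [0.0]                 # I(2): cumulative sum of the random walk
for v in rw:
    i2.append(i2[-1] + v)

recovered = diff(i2, d=2)  # two differences remove both unit roots
```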
Unit Roots: Estimation and Testing
LS Regression with Unit Roots
• We will consider Least Squares (LS) estimation in models with unit roots.
• Let y be a random walk, yt = yt-1 + εt.
• If we did not know that the autoregressive coefficient is 1, we could estimate it via, e.g., an AR(1): yt = φyt-1 + εt.
• Two implications are superconsistency and bias.
Unit Roots: Estimation and Testing
LS Regression with Unit Roots
• Superconsistency: in the unit-root case (φ = 1), as the sample size T grows, φ̂LS − 1 goes to zero very quickly (at rate 1/T), so the LS estimator of a unit root is superconsistent (good for forecasting).
• In the covariance-stationary case, |φ| < 1, φ̂LS − φ goes to zero at the slower rate 1/T^(1/2).
• The LS estimator is biased downward, i.e., E[φ̂] < φTrue. The bias is worst in the unit-root case.
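A small Monte Carlo illustrates both facts: the LS estimate of φ sits below the true value 1, and the gap shrinks roughly like 1/T (the function names and parameter choices are our own):

```python
import random

def ols_phi(y):
    """LS estimate of phi in y_t = phi * y_{t-1} + e_t (no intercept)."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def avg_phi_hat(T, reps=2000, seed=7):
    """Average LS estimate over many simulated random walks of length T."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        y = [0.0]
        for _ in range(T):
            y.append(y[-1] + rng.gauss(0, 1))
        total += ols_phi(y)
    return total / reps

small_T, large_T = avg_phi_hat(T=25), avg_phi_hat(T=200)
# Both averages lie below 1 (downward bias), and the gap at T=200
# is much smaller than at T=25, consistent with the ~1/T rate.
```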
Unit Roots: Estimation and Testing
Unit Root Tests
• A series suspected of having a unit root should be tested for its presence via, e.g., the t-statistic for a unit autoregressive coefficient (H0: φ = 1).
• In the unit-root case, this statistic does not follow the usual t distribution; it follows a Dickey-Fuller distribution.
• For the general, nonzero-mean case (under the alternative hypothesis), the process is a covariance-stationary AR(1) in deviations from the mean: yt = α + φyt-1 + εt, where α = μ(1 − φ).
• Note: the Dickey-Fuller statistic table is for (α, φ) = (0, 1).
Unit Roots: Estimation and Testing
Unit Root Tests
• The Dickey-Fuller statistic can be computed as the t-statistic for φ = 1 in the regression of yt on yt-1.
• We can extend the model to allow for a deterministic trend: yt = α + βTIMEt + φyt-1 + εt (for φ = 1 this is a random walk with drift), where α = a(1 − φ) + bφ and β = b(1 − φ).
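The statistic can be computed by hand: regress Δyt on a constant and yt-1 and take the t-ratio on yt-1 (testing γ = φ − 1 = 0). A sketch on simulated data; the 5% critical value of about −2.86 for the intercept case comes from standard Dickey-Fuller tables, and the function name is our own:

```python
import math
import random

def df_stat(y):
    """Dickey-Fuller t-statistic for H0: phi = 1 in y_t = alpha + phi*y_{t-1} + e_t.
    Computed as the t-ratio on y_{t-1} in the regression of dy_t on a
    constant and y_{t-1} (coefficient gamma = phi - 1)."""
    x = y[:-1]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    n = len(x)
    xbar, dbar = sum(x) / n, sum(dy) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    gamma = sum((xi - xbar) * (di - dbar) for xi, di in zip(x, dy)) / sxx
    alpha = dbar - gamma * xbar
    rss = sum((di - alpha - gamma * xi) ** 2 for xi, di in zip(x, dy))
    se = math.sqrt(rss / (n - 2) / sxx)
    return gamma / se

random.seed(3)
eps = [random.gauss(0, 1) for _ in range(500)]

rw = [0.0]                  # true unit root: phi = 1
for e in eps:
    rw.append(rw[-1] + e)
ar = [0.0]                  # stationary: phi = 0.5
for e in eps:
    ar.append(0.5 * ar[-1] + e)

stat_rw = df_stat(rw)   # typically above -2.86: cannot reject the unit root
stat_ar = df_stat(ar)   # far below -2.86: reject the unit root
```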
Unit Roots: Estimation and Testing
Unit Root Tests
• For the general AR(p) process:
yt = ρ1yt-1 + Σi=2..p ρiΔyt-i+1 + εt,
where ρ1 = Σj=1..p φj and ρi = −Σj=i..p φj, i = 2, ..., p.
A unit root corresponds to ρ1 = 1.
• For the nonzero-mean case:
yt = α + ρ1yt-1 + Σi=2..p ρiΔyt-i+1 + εt,
where α = μ(1 − ρ1).
Unit Roots: Estimation and Testing
Unit Root Tests
• For the general AR(p) process with a linear trend:
yt = k1 + k2TIMEt + ρ1yt-1 + Σi=2..p ρiΔyt-i+1 + εt,
where k1 and k2 are functions of the trend parameters a and b and of ρ1, ..., ρp (as in the AR(1) case), and under the null hypothesis, k2 = 0.
Exponential Smoothing
• Given a random-walk local level c0,t, where c0,t = c0,t-1 + ηt, ηt ~ WN(0, σ2η), we can consider the time series yt as the local level plus white noise:
yt = c0,t + εt, where εt is uncorrelated with ηt at all leads and lags.
• Strategy: convert the observed series {yt} into a smoothed series {ȳt} and forecasts ŷT+h,T.
• Note: c0,t is known as the local level.
Exponential Smoothing
• Algorithm:
1. Initialize at t = 1: ȳ1 = y1.
2. Update: ȳt = αyt + (1 − α)ȳt-1, t = 2, ..., T, with smoothing parameter 0 < α < 1.
3. Forecast: ŷT+h,T = ȳT.
• Result: the smoothed series is a one-sided moving average with exponentially declining weights,
ȳt = Σj wjyt-j, where wj = α(1 − α)^j.
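The three steps above can be sketched directly (the function name is our own):

```python
def exp_smooth(y, alpha):
    """Exponential smoothing: initialize ybar_1 = y_1, then
    ybar_t = alpha*y_t + (1 - alpha)*ybar_{t-1}."""
    ybar = [y[0]]
    for t in range(1, len(y)):
        ybar.append(alpha * y[t] + (1 - alpha) * ybar[-1])
    return ybar

smoothed = exp_smooth([10.0, 12.0, 11.0, 13.0, 12.5], alpha=0.5)
# smoothed[-1] is also the h-step-ahead forecast for every h
```

With α = 0.5 this example yields the smoothed values 10, 11, 11, 12, 12.25; the flat forecast ŷT+h,T = ȳT reflects the underlying local-level (random-walk) model.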
Holt-Winters Smoothing
• If, in addition to the slowly evolving local level c0,t, the series has a trend with a slowly evolving local slope,
yt = c0,t + c1,tTIMEt + εt, where c0,t = c0,t-1 + ηt and c1,t = c1,t-1 + νt,
then the optimal smoothing algorithm is Holt-Winters smoothing.
• When the data-generating process is close to the one for which Holt-Winters is optimal, the forecasts may be close to optimal.
Holt-Winters Smoothing
• Algorithm:
1. Initialize at t = 2: ȳ2 = y2, F2 = y2 − y1.
2. Update: ȳt = αyt + (1 − α)(ȳt-1 + Ft-1) and Ft = β(ȳt − ȳt-1) + (1 − β)Ft-1, t = 3, ..., T, with 0 < α, β < 1.
3. Forecast: ŷT+h,T = ȳT + hFT.
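A sketch of the level-and-slope recursion; on an exactly linear series the smoother tracks the line and the h-step forecast extrapolates it (the names are our own, assuming the initialization above):

```python
def holt(y, alpha, beta):
    """Holt-Winters smoothing without seasonality.
    Initialize at t=2: level = y_2, slope = y_2 - y_1; then update both."""
    level, slope = y[1], y[1] - y[0]
    for t in range(2, len(y)):
        prev = level
        level = alpha * y[t] + (1 - alpha) * (level + slope)
        slope = beta * (level - prev) + (1 - beta) * slope
    return level, slope

ys = [2.0 * t + 1.0 for t in range(1, 11)]   # exactly linear: 3, 5, ..., 21
level, slope = holt(ys, alpha=0.4, beta=0.3)
forecast_3 = level + 3 * slope               # h = 3 steps ahead: 21 + 3*2 = 27
```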
Holt-Winters Smoothing
• Algorithm (Including Seasonality, period s):
1. Initialize at t = s: set the level from the first season of data and form initial seasonal factors Gt from that season.
2. Update: ȳt = α(yt/Gt-s) + (1 − α)(ȳt-1 + Ft-1), Ft = β(ȳt − ȳt-1) + (1 − β)Ft-1, and Gt = γ(yt/ȳt) + (1 − γ)Gt-s, with 0 < α, β, γ < 1.
3. Forecast: ŷT+h,T = (ȳT + hFT)GT+h-s, for h = 1, ..., s.
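The multiplicative-seasonality recursion can be sketched as follows; the initialization convention used here (level = first-season mean, zero slope, factors = first-season ratios) is one common choice among several, and the function names are our own:

```python
def holt_winters_seasonal(y, s, alpha, beta, gamma):
    """Multiplicative Holt-Winters with seasonal period s.
    Initialization (one common convention): level = mean of the first
    season, slope = 0, seasonal factors = first-season values / their mean."""
    level = sum(y[:s]) / s
    slope = 0.0
    seas = [v / level for v in y[:s]]
    for t in range(s, len(y)):
        prev = level
        level = alpha * (y[t] / seas[t - s]) + (1 - alpha) * (level + slope)
        slope = beta * (level - prev) + (1 - beta) * slope
        seas.append(gamma * (y[t] / level) + (1 - gamma) * seas[t - s])
    return level, slope, seas

def hw_forecast(level, slope, seas, s, h):
    """h-step-ahead forecast (h <= s): trend extrapolation times the
    matching seasonal factor."""
    return (level + h * slope) * seas[len(seas) - s + h - 1]

# A purely seasonal series with no trend: the pattern is reproduced
ys = [100.0, 200.0] * 4
lvl, slp, seas = holt_winters_seasonal(ys, s=2, alpha=0.3, beta=0.1, gamma=0.2)
f1 = hw_forecast(lvl, slp, seas, s=2, h=1)   # close to 100
f2 = hw_forecast(lvl, slp, seas, s=2, h=2)   # close to 200
```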