Covariance Stationarity
General Theory
Australian National University
James Taylor
8.1
Covariance Stationarity
Focusing now on cyclical models without drift
We need to assume the underlying probabilistic structure does not change over time
If it did change, past data would tell us nothing about the future, and forecasting would be impossible
We use covariance stationarity (also known as weak, or second-order, stationarity)
Covariance Stationarity – Definition
Let $\{y_t\} = \{\ldots, y_{-1}, y_0, y_1, y_2, \ldots\}$ be a (doubly-infinite) sequence of random variables.
$\{y_t\}$ is covariance stationary if, for all $t$ and all $s$,
$$E(y_t) = \mu, \qquad \mathrm{Cov}(y_t, y_{t-s}) = \gamma(s)$$
that is, neither the mean nor the autocovariance depends on $t$.
The function $\gamma : \mathbb{N} \to \mathbb{R}$ is the autocovariance function.
Note $\gamma(0) = \mathrm{Var}(y_t)$.
Covariance Stationarity – Pros and Cons
Covariance stationarity is a strong restriction
Non-example: anything with a trend or seasonality
However, we can model the trend and seasonality separately, so that what is left over is covariance stationary
Often overlooked: these methods handle only the covariance stationary part; the trend/seasonality still needs to be dealt with separately
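As a concrete non-example (my illustration, not from the slides): a driftless random walk is not covariance stationary, because its variance grows with $t$:
$$y_t = y_{t-1} + u_t, \quad y_0 = 0 \;\Longrightarrow\; y_t = \sum_{i=1}^{t} u_i, \qquad \mathrm{Var}(y_t) = t\,\sigma_u^2$$
so there is no single value $\gamma(0)$ that works for every $t$.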
White Noise
White noise $\{u_t\}$ is a special kind of covariance stationary sequence where
$$E u_t = 0, \qquad \mathrm{Var}(u_t) = \sigma_u^2 < \infty, \qquad E u_t u_s = 0 \text{ for } t \ne s$$
Exercise: show this is indeed covariance stationary
The sequence is serially uncorrelated, and sometimes also serially independent
Examples: a sequence of iid $N(0, \sigma^2)$ variables; a sequence of iid $t(\nu, 0, \sigma^2)$ variables
More generally: any sequence of iid random variables with mean zero and finite variance
Solution: $E[u_t] = 0$ and $\mathrm{Var}(u_t) = \sigma_u^2$ by assumption, and neither depends on $t$. It remains to check $\mathrm{Cov}(u_t, u_{t-s}) = \gamma(s)$ for $s \ne 0$:
$$\mathrm{Cov}(u_t, u_{t-s}) = E[u_t u_{t-s}] - E[u_t]\,E[u_{t-s}] = E[u_t u_{t-s}] = 0$$
so $\gamma(s) = 0$ for all $s \ne 0$, independent of $t$.
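A quick numerical sanity check (my sketch, not from the slides), using the same xcov approach the deck uses later: the normalised sample autocovariance of simulated white noise should be 1 at lag 0 and near 0 at every other lag.

%White noise check (sketch)
N = 1000; u = randn(1,N);             % iid N(0,1), hence white noise
[cov_u,lags_u] = xcov(u,10,'coeff');  % sample autocovariance, lags -10..10
stem(lags_u,cov_u)                    % spike at lag 0, noise elsewhere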
Covariance Stationarity
AR and MA models
Australian National University
8.2
Using White Noise
Recall: white noise $\{u_t\}$ is a covariance stationary sequence where
$$E u_t = 0, \qquad \mathrm{Var}(u_t) = \sigma_u^2 < \infty, \qquad E u_t u_s = 0 \text{ for } t \ne s$$
Nice, but very limiting: white noise has no persistence
Using white noise, however, we can build interesting models
Example: a moving average of white noise – MA(1)
Example: an autoregressive process – AR(1)
Moving Average Process
Let $\{\varepsilon_t\}$ be an MA(1) process:
$$\varepsilon_t = u_t + \psi u_{t-1}$$
Then $\{\varepsilon_t\}$ is covariance stationary with autocovariance:
$$\gamma(0) = \mathrm{Var}(\varepsilon_t) = \sigma_u^2 + \psi^2 \sigma_u^2$$
$$\gamma(1) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-1}) = \psi \sigma_u^2$$
$$\gamma(s) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-s}) = 0, \qquad s = 2, 3, \ldots$$
Derivation: recall $E[u_t] = 0$, $\mathrm{Var}(u_t) = \sigma_u^2$ and $E[u_t u_s] = 0$ for $t \ne s$, so $\mathrm{Cov}(u_t, u_s) = E[u_t u_s]$.
Mean: $E[\varepsilon_t] = E[u_t] + \psi E[u_{t-1}] = 0$, so $\mathrm{Cov}(\varepsilon_t, \varepsilon_{t-s}) = E[\varepsilon_t \varepsilon_{t-s}]$.
$$\gamma(0) = \mathrm{Cov}(\varepsilon_t, \varepsilon_t) = \mathrm{Var}(\varepsilon_t) = E[(u_t + \psi u_{t-1})(u_t + \psi u_{t-1})]$$
$$= E[u_t^2 + 2\psi u_t u_{t-1} + \psi^2 u_{t-1}^2] = \sigma_u^2 + 0 + \psi^2 \sigma_u^2$$
$$\gamma(1) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-1}) = E[\varepsilon_t \varepsilon_{t-1}] = E[(u_t + \psi u_{t-1})(u_{t-1} + \psi u_{t-2})]$$
$$= E[u_t u_{t-1} + \psi u_t u_{t-2} + \psi u_{t-1}^2 + \psi^2 u_{t-1} u_{t-2}] = 0 + 0 + \psi \sigma_u^2 + 0 = \psi \sigma_u^2$$
$$\gamma(2) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-2}) = E[\varepsilon_t \varepsilon_{t-2}] = E[(u_t + \psi u_{t-1})(u_{t-2} + \psi u_{t-3})]$$
$$= E[u_t u_{t-2} + \psi u_t u_{t-3} + \psi u_{t-1} u_{t-2} + \psi^2 u_{t-1} u_{t-3}] = 0 + 0 + 0 + 0 = 0$$
The same argument gives $\gamma(s) = 0$ for every $s \ge 2$.
Autoregressive Process
Let $\{\varepsilon_t\}$ be an AR(1) process:
$$\varepsilon_t = \phi \varepsilon_{t-1} + u_t$$
with $|\phi| < 1$.
Then $\{\varepsilon_t\}$ is covariance stationary with autocovariance:
$$\gamma(0) = \mathrm{Var}(\varepsilon_t) = \frac{\sigma_u^2}{1 - \phi^2}$$
$$\gamma(s) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-s}) = \phi^s \gamma(0), \qquad s = 1, 2, \ldots$$
Derivation: recall $E[u_t] = 0$ and $E[u_t u_s] = 0$ for $t \ne s$, and take $\{\varepsilon_t\}$ to be covariance stationary.
Mean: let $E[\varepsilon_t] = \mu$. Then $\mu = \phi E[\varepsilon_{t-1}] + E[u_t] = \phi \mu$, and since $\phi \ne 1$ this forces $\mu = 0$.
$$\gamma(0) = \mathrm{Var}(\varepsilon_t) = E[\varepsilon_t \varepsilon_t] = E[(\phi \varepsilon_{t-1} + u_t)(\phi \varepsilon_{t-1} + u_t)]$$
$$= \phi^2 E[\varepsilon_{t-1}^2] + 2\phi E[\varepsilon_{t-1} u_t] + E[u_t^2] = \phi^2 \gamma(0) + 0 + \sigma_u^2$$
(the cross term vanishes because $u_t$ is uncorrelated with the past). Solving $\gamma(0) = \phi^2 \gamma(0) + \sigma_u^2$ gives
$$\gamma(0) = \frac{\sigma_u^2}{1 - \phi^2}$$
$$\gamma(1) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-1}) = E[\varepsilon_t \varepsilon_{t-1}] = E[(\phi \varepsilon_{t-1} + u_t)\varepsilon_{t-1}]$$
$$= \phi E[\varepsilon_{t-1}^2] + E[u_t \varepsilon_{t-1}] = \phi \gamma(0) + 0$$
$$\gamma(2) = \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-2}) = E[\varepsilon_t \varepsilon_{t-2}] = E[(\phi \varepsilon_{t-1} + u_t)\varepsilon_{t-2}]$$
$$= E[(\phi(\phi \varepsilon_{t-2} + u_{t-1}) + u_t)\varepsilon_{t-2}] = \phi^2 E[\varepsilon_{t-2}^2] + \phi E[u_{t-1} \varepsilon_{t-2}] + E[u_t \varepsilon_{t-2}]$$
$$= \phi^2 \gamma(0) + 0 + 0 = \phi^2 \gamma(0)$$
Iterating the same step gives $\gamma(s) = \phi^s \gamma(0)$ for all $s \ge 1$.
Finding $\gamma$ for MA(1): the xcov function
%MA(1) Process
N = 1000; u = randn(1,N);               % white noise input
psi = 0.7; e = zeros(1,N);
e(1) = u(1);                            % no u(0), so initialise with u(1)
for n = 2:N
    e(n) = u(n) + psi*u(n-1);           % MA(1) recursion
end
[cov_e,lags_e] = xcov(e,10,'coeff');    % sample autocovariance, lags -10..10, normalised
stem(lags_e,cov_e)
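As a quick check (my addition, not from the slides), we can overlay the theoretical values on the stem plot. With the 'coeff' normalisation xcov returns autocorrelations, and for MA(1) theory predicts $\rho(0) = 1$, $\rho(\pm 1) = \psi/(1 + \psi^2) \approx 0.47$ for $\psi = 0.7$, and $\rho(s) = 0$ otherwise:

%Theoretical MA(1) autocorrelation overlay (sketch)
psi = 0.7;                              % as in the script above
lags_t = -10:10;
rho_t = zeros(size(lags_t));
rho_t(lags_t == 0) = 1;
rho_t(abs(lags_t) == 1) = psi/(1 + psi^2);
hold on; stem(lags_t,rho_t,'r'); hold off   % theory in red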
Finding $\gamma$ for AR(1): the xcov function
%AR(1) Process
N = 1000; u = randn(1,N);               % white noise input
y = zeros(1,N); ph = 0.7;
y(1) = ph*u(1);                         % initialise the recursion
for n = 2:N
    y(n) = ph*y(n-1) + u(n);            % AR(1) recursion
end
[cov_y,lags_y] = xcov(y,10,'coeff');    % sample autocovariance, lags -10..10, normalised
stem(lags_y,cov_y)
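The same check for the AR(1) case (again my addition, not from the slides): with the 'coeff' normalisation the theoretical autocorrelation is $\rho(s) = \gamma(s)/\gamma(0) = \phi^{|s|}$, a geometric decay on both sides of lag 0:

%Theoretical AR(1) autocorrelation overlay (sketch)
ph = 0.7;                               % as in the script above
lags_t = -10:10;
rho_t = ph.^abs(lags_t);                % phi^|s|
hold on; stem(lags_t,rho_t,'r'); hold off   % theory in red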
Visualising $\gamma$
[Figure: autocovariance functions with $\phi = \psi = 0.7$. (a) MA(1) Process; (b) AR(1) Process. Both plots are symmetric about lag 0.]
Wold Representation Theorem
Australian National University
8.3
Wold Representation Theorem
Why should we care about moving average models?
Because of Wold’s Representation Theorem
Every covariance stationary process can be represented as a moving average process
The general MA(q) form: $\varepsilon_t = u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \cdots + \psi_q u_{t-q}$
The Lag Operator
The lag operator $L$ acts on time series:
$$L y_t = y_{t-1}$$
Similarly,
$$L^2 y_t = L L y_t = L y_{t-1} = y_{t-2}$$
Indeed, for any polynomial
$$B(L) = b_0 + b_1 L + b_2 L^2 + \cdots + b_m L^m$$
we have
$$B(L) y_t = b_0 y_t + b_1 y_{t-1} + b_2 y_{t-2} + \cdots + b_m y_{t-m} = \sum_{i=0}^{m} b_i y_{t-i}$$
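In MATLAB, applying a finite lag polynomial to a series is exactly what filter does with denominator 1. A minimal sketch with made-up coefficients (my example, not from the slides):

%Apply B(L) = 1 + 0.5L + 0.25L^2 to a series (sketch)
y = randn(1,100);
b = [1 0.5 0.25];                       % b0, b1, b2
By = filter(b,1,y);                     % By(t) = y(t) + 0.5*y(t-1) + 0.25*y(t-2)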
Lag Polynomials
The polynomial $B(L)$ transforms $y_t$ into a weighted sum of current and past values
The classical difference operator $\Delta$ is a lag polynomial:
$$\Delta y_t = y_t - y_{t-1} = (1 - L) y_t$$
Infinite-order lag polynomials (really formal sums):
$$B(L) = b_0 + b_1 L + b_2 L^2 + \cdots = \sum_{i=0}^{\infty} b_i L^i$$
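For instance (my sketch, not from the slides), differencing with the lag polynomial $1 - L$ turns a simulated random walk back into white noise:

%The difference operator as the lag polynomial 1 - L (sketch)
y = cumsum(randn(1,100));               % random walk: y(t) = y(t-1) + u(t)
dy = filter([1 -1],1,y);                % (1 - L)y(t) = y(t) - y(t-1)
% dy(2:end) equals diff(y) and recovers the white noise input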
Infinite-order Lag Polynomials
$$B(L) = b_0 + b_1 L + b_2 L^2 + \cdots = \sum_{i=0}^{\infty} b_i L^i$$
Infinite-order lag polynomials seem of little practical interest
They have infinitely many parameters, so cannot be estimated from finite data
But particular ones can be described by just a few parameters, e.g.
$$B(L) = 1 + bL + b^2 L^2 + \cdots = \sum_{i=0}^{\infty} b^i L^i$$
which has only one parameter, $b$
They are also very important theoretically, via Wold's Theorem
Wold's Representation Theorem
Theorem: Let $\{y_t\}$ be a zero-mean covariance stationary process. Then there is a representation
$$y_t = B(L) u_t = \sum_{i=0}^{\infty} b_i u_{t-i}$$
where $\{u_t\}$ is white noise, $b_0 = 1$ and $\sum_{i=0}^{\infty} b_i^2 < \infty$.
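A concrete instance (my worked example, not stated on the slide): the stationary AR(1) process already has this form, since
$$(1 - \phi L)\,y_t = u_t \;\Longrightarrow\; y_t = \frac{1}{1 - \phi L}\,u_t = \sum_{i=0}^{\infty} \phi^i u_{t-i}$$
so $b_i = \phi^i$, $b_0 = 1$, and $\sum_{i=0}^{\infty} b_i^2 = \frac{1}{1 - \phi^2} < \infty$ whenever $|\phi| < 1$.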
Rational Polynomials
Australian National University
8.3A
Rational Polynomials
The problem with Wold's Theorem is that it involves too many parameters
Sometimes we can reduce the number of parameters:
$$B(L) = 1 + bL + b^2 L^2 + \cdots = \sum_{i=0}^{\infty} b^i L^i = \frac{1}{1 - bL}$$
More generally, we have rational polynomials:
$$B(L) = \frac{\Psi(L)}{\Phi(L)} = \frac{1 + \sum_{i=1}^{q} \psi_i L^i}{1 + \sum_{j=1}^{p} \phi_j L^j}$$
which has only $p + q$ parameters.
If $B(L)$ isn't exactly rational, we can approximate it with a rational polynomial
Note: $1 + bL + b^2 L^2 + \cdots$ is a geometric series, $\sum_{i=0}^{\infty} x^i = \frac{1}{1-x}$ for $|x| < 1$; strictly speaking $\frac{1}{1 - bL}$ is a rational function, not a polynomial.
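We can verify the expansion numerically (my sketch, not from the slides): the coefficients of $1/(1 - bL)$ are the impulse response of the corresponding rational filter, and should match $b^i$:

%Expansion of 1/(1 - bL) via the impulse response (sketch)
b = 0.7;
delta = [1 zeros(1,10)];                % unit impulse
coeffs = filter(1,[1 -b],delta);        % coefficients of 1/(1 - bL)
disp(max(abs(coeffs - b.^(0:10))))      % ~0: matches the geometric series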