
Economics 430
Lecture 1 Time Series Characterization
1

Today’s Class 1 of 2
• Overview of the course structure and content
• Why Study Economic Forecasting?
• Resources
• Overview of Economic Forecasting
• Six Fundamental Considerations in Forecasting
– Decision Environment
– Forecast Object
– Forecast Statement
– Forecast Horizon
– Information Set
– Parsimony Principle and Shrinkage Principle
2

Today’s Class 2 of 2
• Introduction to Time Series Data
• Stochastic Process and Time Series
• Stationarity
– First Order Strongly Stationary
– First Order Weakly Stationary
– Second Order Weakly Stationary (Covariance Stationary)
• Transformations of Nonstationary Processes
• The Autocorrelation Functions
• R Example
3

Course Content
1. Modeling and Forecasting Trend
2. Modeling and Forecasting Seasonality
3. Modeling and Forecasting Cycles
4. Modeling and Forecasting with Regression Models
5. Modeling and Forecasting Volatility
6. Advanced Methods
4

Useful References

1. Books:
– Applied Econometric Time Series, 3rd Ed., W. Enders (Easy)
– A Companion to Economic Forecasting, M. P. Clements and D. F. Hendry (Medium)
– Time Series Analysis, J. D. Hamilton (Difficult)
– Time Series Applications to Finance with R and S-Plus, 2nd Ed., N.G. Chan (includes R code)
– Analysis of Financial Time Series, 3rd Ed., Ruey S. Tsay (includes R code), http://faculty.chicagobooth.edu/ruey.tsay/teaching/ (Business 41202)
– Time Series Analysis and Its Applications with R Examples, 3rd Ed., Robert H. Shumway and David S. Stoffer (includes R code)

2. Journals:
– Journal of Forecasting
– International Journal of Forecasting
– Journal of Business and Forecasting
– Journal of Business and Economic Statistics
– Review of Economics and Statistics
– Journal of Applied Econometrics

3. Online information:
– Resources for Economists (www.rfe.org)
– Economagic (http://www.economagic.com)
5

Overview of Economic Forecasting 1 of 2
• Q1: What is a forecast?
Ans: A forecast is a statement about the future.
– Note: Forecasting = the science and the art of predicting a future event with some degree of accuracy.
• Q2: How is forecasting done by economists?
Ans: In economics, methods of forecasting include:
– Guessing, “rules of thumb”, or “informal models”
– Expert judgment
– Extrapolation
– Leadingindicators
– Surveys
– Time series models
– Econometric systems
7

Overview of Economic Forecasting 2 of 2
• Q3: What are the main problems?
Ans: One of the main problems with forecasting in economics is that economies evolve over time and are subject to intermittent, and sometimes large, unanticipated shocks.
• Q4: What are some relevant forecast examples? Ans: Some of the relevant areas include:
– Operations planning and control
– Marketing
– Economics
– Financial asset management
– Financial risk management
– Demography
– Crisis management
8

Standard Notation
Description                      Technical name                           Notation
Object to analyze                Time series                              {yt}
Value at present time t          Known value of the series                yt
Future at time t+h               Random variable                          Yt+h
Value at future time t+h         Unknown value of the random variable     yt+h
Collection of information        Univariate information set               It = {y1, y2, ..., yt}
                                 Multivariate information set             It = {y1, ..., yt, x1, ..., xt}
Final objective                  Forecast (1-step ahead, h-step ahead)    ft,1 , ft,h
Uncertainty                      Forecast error                           et,h = yt+h − ft,h
9
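The notation above can be made concrete in a few lines of R. This is a minimal sketch on simulated data; the sample size, the "present" time t, the horizon h, and the naive no-change forecast are all illustrative assumptions, not anything prescribed on the slide.

# Minimal R sketch of the forecasting notation, on simulated data,
# using a naive "no-change" forecast purely for illustration.
set.seed(430)
y  <- cumsum(rnorm(100))      # a simulated time series {y_t}, t = 1, ..., 100
t0 <- 90                      # "present" time t
h  <- 5                       # forecast horizon
I_t   <- y[1:t0]              # univariate information set I_t = {y_1, ..., y_t}
f_t_h <- I_t[t0]              # naive h-step-ahead forecast f_{t,h} = y_t
e_t_h <- y[t0 + h] - f_t_h    # forecast error e_{t,h} = y_{t+h} - f_{t,h}
e_t_h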

Six Fundamental Considerations in Forecasting The Forecast Object 1 of 1
A forecast object can be an event outcome, an event timing, or a time series:
• Event Outcome Forecasts: Event is certain, outcome is unknown, timing known.
• Event Timing Forecasts: Event is certain, outcome is known, timing unknown.
• Time Series Forecasts: Project the future value of a time series. (most commonly encountered)
10

Six Fundamental Considerations in Forecasting The Forecast Statement 1 of 2
• What will the forecast result be?
• Point Forecast: a single number (e.g., μ). Most common; less information.
• Interval Forecast: a range of values (like a confidence interval, e.g., μ ± z·σ). More information.
• Density Forecast: the probability distribution of the future value of the series (e.g., N(μ, σ²)). An R sketch of all three forms follows this slide.
11
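One way to produce the three kinds of forecast statement is from a fitted time series model. The sketch below is hedged: the simulated data, the AR(1) specification, and the normal shape assumed for the density forecast are illustrative choices, not the course's prescribed method.

# Hedged R sketch: point, interval, and density forecasts from an AR(1)
# fitted to simulated data (model choice and normality are illustrative).
set.seed(430)
y   <- arima.sim(model = list(ar = 0.7), n = 200)
fit <- arima(y, order = c(1, 0, 0))
fc  <- predict(fit, n.ahead = 1)
point    <- fc$pred[1]                            # point forecast
interval <- point + c(-1, 1) * 1.645 * fc$se[1]   # 90% interval forecast
# density forecast: N(point, se^2), e.g. evaluate its pdf on a grid
grid <- seq(point - 4 * fc$se[1], point + 4 * fc$se[1], length.out = 101)
dens <- dnorm(grid, mean = point, sd = fc$se[1])
point; interval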

Six Fundamental Considerations in Forecasting The Forecast Statement 2 of 2
[Figure: density forecast of GDP growth. The point forecast is 1.3%, the 90% interval forecast runs from -2.3% to 4.3%, and 5% of the probability lies in each tail.]
12

Six Fundamental Considerations in Forecasting The Forecast Statement 2 of 2
[Figure: timeline diagram. The information set contains the observed values of yt up to time t; at the future time t+h the forecast can be stated as a point forecast ft,h, an interval forecast, or a density forecast (the conditional probability density function of Yt+h).]
13

Six Fundamental Considerations in Forecasting The Forecast Horizon 1 of 2
• Forecast Horizon: Number of periods between today and the end date of the forecast.
Example: Today: Year = T.
Forecast GDP for year T+5 → a forecast horizon of 5 steps.
• In general: h-step-ahead forecast (where h is the step length). The horizon is fixed (h is constant).
• h-step-ahead extrapolation forecast: includes all steps from 1-step ahead to h-steps ahead (see the R sketch after this slide).
14
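A short R sketch of the horizon idea, under the same illustrative assumptions as before (simulated data, AR(1) model): predict() returns the whole 1-step to h-step path, which is the extrapolation forecast, while keeping only the last element gives the fixed h-step-ahead point forecast.

# Hedged R sketch of the forecast horizon (h = 4 is illustrative).
set.seed(430)
y    <- arima.sim(model = list(ar = 0.7), n = 200)
fit  <- arima(y, order = c(1, 0, 0))
h    <- 4
path <- predict(fit, n.ahead = h)$pred   # 1-step- through 4-step-ahead forecasts
path                                     # 4-step-ahead extrapolation (path) forecast
path[h]                                  # the 4-step-ahead point forecast alone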

Six Fundamental Considerations in Forecasting The Forecast Horizon 2 of 2
[Figure: in-sample historical data for t = 1, ..., T and out-of-sample forecasts for T+1, ..., T+4, contrasting a single 4-step-ahead point forecast with a 4-step-ahead extrapolation point forecast that traces the whole path from T+1 to T+4.]
15

Six Fundamental Considerations in Forecasting The Information Set 1 of 1
• Your forecast is conditional on the quality and quantity of your data.
• Questions:
– What information is available?
– What else could be collected? Or made available?
– Quantitative or qualitative?
– Can the data be used more efficiently?
• Univariate Information Set: ΩT(univariate) = {yT, yT−1, ..., y1}
• Multivariate Information Set: ΩT(multivariate) = {xT, yT, xT−1, yT−1, ..., x1, y1}
16

Six Fundamental Considerations in Forecasting Models and Complexity, The Parsimony and Shrinkage Principles 1 of 2
• Your modeling strategy should be based on your forecast task.
• For out-of-sample forecasting in business, finance, and economics, simple (parsimonious) models are better.
• Parsimony Principle: All other things being equal, simpler models are better than complex ones.
• Shrinkage Principle: Imposing restrictions on forecasting models often improves forecast performance.
KISS Principle: Keep It Sophisticatedly Simple
17
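The parsimony principle can be illustrated, though not proven, by comparing a small and a large autoregression on the same data with an information criterion. The data-generating process, sample size, and use of AIC below are illustrative assumptions.

# Hedged R sketch of the parsimony principle: on data generated by an AR(1),
# a heavily parameterized AR(8) fits the sample a bit more closely but is
# usually penalized by AIC, so the simpler model tends to be preferred.
set.seed(430)
y     <- arima.sim(model = list(ar = 0.6), n = 150)
fit_1 <- arima(y, order = c(1, 0, 0))   # parsimonious model
fit_8 <- arima(y, order = c(8, 0, 0))   # complex model
AIC(fit_1); AIC(fit_8)                  # lower AIC -> preferred model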

Six Fundamental Considerations in Forecasting Models and Complexity, The Parsimony and Shrinkage Principles 2 of 2
1. Decision Environment and Loss Function
2. Forecast Object
3. Forecast Statement
4. Forecast Horizon
5. Information Set
6. Methods and Complexity
18

Example: Trends
Annual Hours Worked in the OECD Countries 1979-2006
[Figure: annual hours worked (roughly 1400-2200 hours), 1979-2006, for France, Japan, Spain, West Germany, and the United States, illustrating trending behavior.]
Source: http://www.oecd.org
19

Example: Cycles
Unemployed Persons (Seasonally Adjusted), Monthly Data 1988-2008
[Figure: unemployed persons, in millions, ranging from about 5 to 11 million over 1988-2008, illustrating cyclical behavior.]
Source: St. Louis Federal Reserve Bank. FRED
20

Example: Seasonality
Revenue Passenger Enplanements. Monthly Data 2000-2008
[Figure: monthly revenue passenger enplanements, roughly 30,000-70,000 thousand, 2000-2008. Peaks occur regularly around July.]
Source: Bureau of Transportation Statistics
21

Example: Combined
New Home Sales in the United States. Quarterly Data 2004-2008
[Figure: quarterly new home sales, roughly 50-400 thousand, 2004-2008, combining trend, seasonal, and cyclical behavior.]
Source: National Association of Realtors
22

Example: Combined Hypothetical Time Series
Tt = value of the trend component in period t:      Tt = 1 + 0.1 t
St = value of the seasonal component in period t:   St = 1.6 sin(πt/6)
It = value of the irregular component in period t:  It = 0.7 It−1 + εt
εt = a pure random disturbance in period t
(These components are simulated in the R sketch below.)
23
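A minimal R sketch of these hypothetical components. The sample length, the error variance, and combining the components by simple addition are assumptions made only for the illustration.

# Hedged R sketch: simulating the hypothetical trend, seasonal, and
# irregular components defined on the slide above.
set.seed(430)
n   <- 120
tt  <- 1:n
T_t <- 1 + 0.1 * tt                 # trend component
S_t <- 1.6 * sin(pi * tt / 6)       # seasonal component (period 12)
I_t <- as.numeric(stats::filter(rnorm(n), filter = 0.7, method = "recursive"))
                                    # irregular component: I_t = 0.7*I_{t-1} + e_t
y   <- T_t + S_t + I_t              # one way to combine the components
plot.ts(cbind(T_t, S_t, I_t, y), main = "Hypothetical components")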

Model Building Strategy
Q: How do we find an appropriate model for a time series?
A: There are three steps (sketched in R below):
1. Model specification (or identification)
• Select the types of plausible models given the data (time plot).
2. Model fitting
• Follow the parsimony principle.
3. Model diagnostics
• Assess the quality of the model.
24
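A skeletal R version of the three steps, assuming simulated data and an AR(1) candidate model purely for illustration.

# Hedged R sketch of the three-step model building strategy.
set.seed(430)
y <- arima.sim(model = list(ar = 0.5), n = 200)
plot.ts(y)                          # 1. specification: inspect the time plot
fit <- arima(y, order = c(1, 0, 0)) # 2. fitting: a parsimonious candidate
tsdiag(fit)                         # 3. diagnostics: residuals, ACF, Ljung-Box p-values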

Introduction to Time Series Data 1 of 3
• Q1: What is a stochastic process and what is a time series?
• Q2: What is the interpretation of a time average or any other time moment?
• Q3: What are the new tools of analysis?
25

Introduction to Time Series Data 2 of 3
• Given the ‘Time Series Sample’:
• What ‘economic mechanism’ generated this
sample, and under what conditions?
• Examples:
– ^DJI Stock Price: St
– ^DJI Returns: (St − St−1)/St−1
– Property Crimes
26

Introduction to Time Series Data 3 of 3
[Figure: two panels of monthly Dow Jones data, 1988-2006. Left: the index level, which shows upward and downward trends. Right: monthly returns to the DJ index, which show mean reversion.]
Q: Is the sample mean (the sample time mean) an appropriate estimator of the population mean?
27

Stochastic Process and Time Series 1 of 2
• Def: Stochastic Process
{Yt} = {Y1, Y2, …YT} = Collection of random variables.
[Diagram: the random variables Y1, Y2, ..., YT of the process {Yt}, plotted against time t = 1, 2, ..., T, each with its own mean μ1, μ2, ..., μT.]
Each R.V. can have a different μ, σ, and pdf.
28

Stochastic Process and Time Series 2 of 2
• Def: Time Series
{yt; t=1,2,…} = {y1, y2, …yT} = Sample realization of a
stochastic process.
[Diagram: one realized time series path traced through the random variables Y1, Y2, ..., YT of the process {Yt} over t = 1, 2, ..., T.]
29
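The distinction between a stochastic process and a time series can be visualized by simulating several independent realizations of the same process; each plotted path is one "time series". The AR(1) process and the number of paths below are illustrative assumptions.

# Hedged R sketch: five sample realizations of one stochastic process.
set.seed(430)
paths <- replicate(5, as.numeric(arima.sim(model = list(ar = 0.8), n = 100)))
matplot(paths, type = "l", lty = 1,
        xlab = "t", ylab = "y_t",
        main = "Five sample realizations of one stochastic process")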

Stationarity 1 of 5
[Figure: two panels, 1990-2002. Left: the index level (CLOSE), whose mean drifts over time (a nonstationary series). Right: monthly returns (RETURN), which fluctuate around a stable mean (a stationary series).]
30

Stationarity 2 of 5
• Def: First Order Strongly Stationary
= All R.V.s have the same pdf (all moments are the same).
[Diagram: Y1, Y2, ..., YT over t = 1, 2, ..., T, each with the same probability distribution.]
31

Stationarity 3 of 5
• Def: First Order Weakly Stationary = All R.Vs have the same means.

[Diagram: Y1, Y2, ..., YT over t = 1, 2, ..., T with equal means μ1 = μ2 = ... = μT, while other features of their distributions may differ.]
32

Stationarity 4 of 5
• Def: Second Order Weakly Stationary = Covariance Stationary
i. Constant means: E(Yt) = μ for all t
ii. Constant, finite variances: Var(Yt) = σ² < ∞ for all t
iii. Time-independent covariances: Cov(Yt, Yt−k) = γ(k) for all t
The strength of the linear association between the two R.V.s depends only on how many periods (k) apart they are.
33
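A rough empirical check of the three conditions, assuming a simulated covariance stationary AR(1) series and an arbitrary half-sample split (both illustrative choices): the subsample means, variances, and lag-1 autocorrelations should be similar.

# Hedged R sketch: comparing sample moments across two halves of a
# simulated covariance stationary series.
set.seed(430)
y  <- as.numeric(arima.sim(model = list(ar = 0.5), n = 400))
y1 <- y[1:200]; y2 <- y[201:400]
c(mean(y1), mean(y2))                         # (i) roughly equal means
c(var(y1), var(y2))                           # (ii) roughly equal variances
c(acf(y1, lag.max = 1, plot = FALSE)$acf[2],
  acf(y2, lag.max = 1, plot = FALSE)$acf[2])  # (iii) similar lag-1 dependence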

Stationarity 5 of 5
• “The basic idea of stationarity is that the probability laws that govern the behavior of the process do not change over time. In a sense, the process is in statistical equilibrium.” –Kung-Sik Chan
34

Transformations of Nonstationary Processes
• Q: Can we find a transformation of {yt} such that the resulting process is:
– (1) first order weakly stationary?
– Yes: take the first difference of the data.
– (2) second order weakly stationary?
– Yes: take the log of {yt} and then take the first difference of this transformed series.
– Interpretation: economic returns or growth rates (see the R sketch below).
35
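A minimal R sketch of the two transformations, applied to a simulated price-like series (a random walk with drift, an illustrative assumption); with real data, y would simply be the observed series.

# Hedged R sketch: first difference and first difference of the log.
set.seed(430)
y   <- 100 * exp(cumsum(rnorm(300, mean = 0.001, sd = 0.01)))  # nonstationary level
dy  <- diff(y)         # (1) first difference
dly <- diff(log(y))    # (2) first difference of the log: growth rate / return
par(mfrow = c(3, 1))
plot.ts(y); plot.ts(dy); plot.ts(dly)
par(mfrow = c(1, 1))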

The Lag Operator 1 of 3
• Lag Operator (L): a linear operator L such that for any value of yt: L yt = yt−1.
• In general, for m periods: L^m yt = yt−m.
• Polynomial Lag Operator: B(L) = b0 + b1 L + ... + bm L^m
• First Difference Operator (Δ): Δyt = (1 − L) yt = yt − yt−1
36

The Lag Operator 2 of 3
• Distributed Lag: consider a second-order lag operator such as (1 + 0.5L + 0.8L^2), so that (see the R sketch below):
(1 + 0.5L + 0.8L^2) yt = yt + 0.5 yt−1 + 0.8 yt−2
• Infinite-Order Lag Operator: B(L) = b0 + b1 L + b2 L^2 + ... (the sum runs over all i ≥ 0)
• Example: an infinite distributed lag of current and past shocks:
yt = B(L) εt = b0 εt + b1 εt−1 + b2 εt−2 + ...
37
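A hedged R sketch of the lag-operator algebra: diff() implements (1 − L)yt, and stats::filter() applies the distributed lag (1 + 0.5L + 0.8L^2) to an illustrative simulated series.

# Hedged R sketch of lag-operator calculations in base R.
set.seed(430)
y  <- as.numeric(arima.sim(model = list(ar = 0.4), n = 50))
dy <- diff(y)                                             # (1 - L) y_t
dl <- as.numeric(stats::filter(y, filter = c(1, 0.5, 0.8),
                 method = "convolution", sides = 1))      # y_t + 0.5 y_{t-1} + 0.8 y_{t-2}
head(cbind(y, c(NA, dy), dl))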

The Lag Operator 3 of 3
[Figure: four panels of monthly Dow Jones data, 1988-2006. Top left: the index level. Top right: the first difference of the index. Bottom left: the log of the index. Bottom right: the first difference of the log index (returns), which looks covariance stationary.]
38

The Autocorrelation Functions 1 of 9
• Autocovariance Function (γ(t,k)): Autocovariance at displacement k is the covariance between yt and yt-k .
γ(t,k) = Cov(yt, yt−k) = E[(yt − μ)(yt−k − μ)]
– Note: γ(t,k) = γ(k) for all t (if the covariance structure of the series is stable over time).
• Properties of the autocovariance function:
– (i) γ(k) = γ(−k) for all k (symmetric: depends only on the displacement)
– (ii) γ(0) = Cov(yt, yt) = Var(yt)
– (iii) max |γ(k)| ≤ γ(0): if γ(0) < ∞, then γ(k) < ∞ for all k
39

The Autocorrelation Functions 2 of 9
• Autocorrelation Function (ρk = Corr(Yt, Yt−k)): ACF = normalized autocovariance. It has no units, so it is easier to interpret.
• Autocorrelation coefficients of a covariance stationary process:
ρk = γk / γ0 , k = 0, 1, 2, ...
40

The Autocorrelation Functions 3 of 9
Example: Annual Hours Worked per Person Employed in Germany
[Figure: three panels, 1980-2006. Top: annual hours worked in Germany (roughly 1400-1800 hours). Middle: the percent annual change. Bottom: the autocorrelogram (ACF) of the percent change for displacements k = 1, ..., 10.]
Autocorrelation function:
k     1    2    3    4    5    6    7    8    9    10
ρ̂k  .22  .29 -.10  .16 -.01  .19 -.06 -.04  .09  .20
41

Percentage Change in Working Hours in Germany: Calculation of the Autocorrelation Coefficients
Year    Yt       Yt−1     Yt−3
1978   -1.0604
1979   -0.6699  -1.0604
1980   -1.1018  -0.6699
1981   -1.2413  -1.1018  -1.0604
1982   -0.6497  -1.2413  -0.6699
1983   -0.7536  -0.6497  -1.1018
1984   -0.6826  -0.7536  -1.2413
1985   -1.3733  -0.6826  -0.6497
1986   -1.1438  -1.3733  -0.7536
1987   -1.3533  -1.1438  -0.6826
1988   -0.3196  -1.3533  -1.3733
1989   -1.4574  -0.3196  -1.1438
1990   -1.4536  -1.4574  -1.3533
1991   -2.0234  -1.4536  -0.3196
1992   -0.5904  -2.0234  -1.4574
1993   -1.4550  -0.5904  -1.4536
1994    0.0264  -1.4550  -2.0234
1995   -0.7087   0.0264  -0.5904
1996   -0.9752  -0.7087  -1.4550
1997   -0.5249  -0.9752   0.0264
1998   -0.2026  -0.5249  -0.7087
1999   -0.7057  -0.2026  -0.9752
2000   -1.2126  -0.7057  -0.5249
2001   -0.8375  -1.2126  -0.2026
2002   -0.7745  -0.8375  -0.7057
2003   -0.4141  -0.7745  -1.2126
2004    0.2950  -0.4141  -0.8375
2005   -0.2879   0.2950  -0.7745
2006   -0.0492  -0.2879  -0.4141
Mean: μ̂ = -0.8026
Variance: γ̂0 = 0.2905
γ̂k (k = 1, 3): 0.0651, -0.0282
ρ̂k (k = 1, 3): 0.2240, -0.0970
42

The Autocorrelation Functions 5 of 9
Example: Annual Hours Worked per Person Employed in Germany
[Figure: the autocorrelogram (ACF) for displacements k = 1, ..., 10 and a scatterplot of Yt against Yt−1.]
Interpretation of ρ̂1 (k = 1): observations that are 1 year apart move in the same direction. When Germans increase their working hours, one year later we expect to see an increase.
Interpretation: since the autocorrelations are small, there is little dependence across time in how working hours have changed.
43

The Autocorrelation Functions 6 of 9
• Partial Autocorrelation Function (p(k)): PACF. Provides information about the autocorrelation between Yt and Yt+k after removing the effect of all the observations in between.
• Sample Partial Autocorrelation: given the fitted regression of yt on its first k lags, the sample partial autocorrelation at displacement k is the estimated coefficient on yt−k.
– Note: see page 46 of the textbook.
44

The Autocorrelation Functions 7 of 9
Statistical tests for ρ(k) and p(k)
Assume we would like to test, e.g., the null hypothesis H0: ρk = 0 against the alternative H1: ρk ≠ 0. If the sample is large enough, then ρ̂k is approximately N(0, 1/T) under H0, so H0 is rejected at the 5% level when |ρ̂k| > 1.96/√T (and similarly for the sample partial autocorrelations).
45

The Autocorrelation Functions 8 of 9
• Box-Pierce Q-Statistic: QBP = T Σ ρ̂j², with the sum running over j = 1, ..., k
• Ljung-Box Q-Statistic: QLB = T(T+2) Σ ρ̂j²/(T−j), with the sum running over j = 1, ..., k
These are statistical tests of whether any of a group of autocorrelations of a time series are different from zero. Both tests are really testing the joint null hypothesis:
H0: ρ1 = ρ2 = ... = ρk = 0
(An R sketch of the ACF, PACF, and Ljung-Box test follows the next slide.)
46

The Autocorrelation Functions 9 of 9
• Example: Annual Working Hours per Employee in the US vs. Germany.
[Figure: annual working hours per employee, 1980-2006, for the US (roughly 1800-1860 hours) and for Germany (roughly 1400-1800 hours), together with the ACF and PACF of the US data.]
US data, ACF & PACF: there is a statistically significant time dependence.
47
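Finally, the R example promised for today's class, kept minimal and hedged: simulated data stand in for the German working-hours series, and the lag choices are illustrative. Base R's acf(), pacf(), and Box.test() implement the sample ACF, the sample PACF, and the Ljung-Box Q test described above.

# Hedged R sketch: sample ACF, sample PACF, an approximate significance
# cutoff for individual autocorrelations, and the Ljung-Box joint test.
set.seed(430)
y <- as.numeric(arima.sim(model = list(ar = 0.3), n = 29))  # short annual-style sample
n_obs <- length(y)
acf(y, lag.max = 10)        # sample autocorrelations with approx. 95% bands
pacf(y, lag.max = 10)       # sample partial autocorrelations
1.96 / sqrt(n_obs)          # rough cutoff for an individually significant rho_hat_k
Box.test(y, lag = 10, type = "Ljung-Box")  # joint test H0: rho_1 = ... = rho_10 = 0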