
ETW3420 Principles of Forecasting and Applications



Topic 8 Exercises

Question 1

This exercise uses the data set huron, which gives the level of Lake Huron from 1875 to 1972. This was the data set we used in one of the tutorial questions last week.
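Before starting, load the required packages. The setup below is a minimal sketch; it assumes the huron series is made available by loading the fpp2 package (which also attaches the forecast and ggplot2 packages used throughout these exercises).

#Load packages used in these exercises (assumption: fpp2 attaches forecast,
#ggplot2 and the companion data packages containing huron)
library(fpp2)

#A quick look at the series before modelling
autoplot(huron) +
  xlab("Year") + ylab("Lake Huron level")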

(a) Fit a piecewise linear trend model to the Lake Huron data with a knot at 1920 and an ARMA error structure. Report the estimated model.

#Trend for the whole time period
trend <- time(huron)
#Trend after the knot at 1920
trend2 <- pmax(trend - 1920, 0)
#Fit piecewise linear trend model with ARMA errors
fit <- auto.arima(huron, xreg = cbind(trend, trend2))

(b) Forecast the level for the next 30 years.

#Create values for the regressors trend and trend2 for the next 30 years
trend.fc <- max(time(huron)) + seq(30)
trend2.fc <- trend.fc - 1920
#Produce forecasts
fc <- forecast(fit, xreg = cbind(trend.fc, trend2.fc))
#Plot forecasts together with the fitted piecewise linear trend
autoplot(fc) +
  autolayer(huron - residuals(fit, type = 'regression'), series = "Fitted trend")

• Note that residuals(fit, type = 'regression') refers to the regression errors ηt, not the ARIMA (innovation) errors εt.
• By subtracting the regression errors from the data huron, what is left is the estimated piecewise linear trend.

Question 2

Using monthly data from January 1995 to December 2015, we will produce forecasts for Malaysia's tourist arrivals using an exponential smoothing model, an ARIMA model, and an STL model. We will then produce an ensemble forecast by combining the aforementioned forecasts.

Download and import the data set "tourist.csv". Plot the data.

y <- read.csv("tourist.csv")
y <- ts(y[,-1], frequency = 12, start = c(1995,1))
autoplot(y)

(a) Divide the data into a training set and a test set, with the training set running from January 1995 to December 2011.

train <- window(y, end = c(2011,12))
test <- window(y, start = c(2012,1))

(b) Produce forecasts for the test set using the automated functions ets(), auto.arima() and stlf(). Combine these three sets of point forecasts to form the combination forecast.

h <- length(test)
ETS <- forecast(ets(train), h = h)
ARIMA <- forecast(auto.arima(train, lambda = 0), h = h)
STLF <- stlf(train, lambda = 0, h = h)
Combination <- (ETS[["mean"]] + ARIMA[["mean"]] + STLF[["mean"]])/3

(c) Plot the tourist arrivals, along with the forecasts from the four models/methods.

autoplot(y) +
  autolayer(ETS, series = "ETS", PI = F) +
  autolayer(ARIMA, series = "ARIMA", PI = F) +
  autolayer(STLF, series = "STLF", PI = F) +
  autolayer(Combination, series = "Combination")

(d) Which of the four models/methods has the best forecasting performance?

accuracy(ETS, test)
accuracy(ARIMA, test)
accuracy(STLF, test)
accuracy(Combination, test)

Question 3

This question continues to use the Malaysian tourist data. We will now produce forecasts using the bagging procedure and compare their forecast performance with that of the four models/methods above.

(a) For illustration purposes, bootstrap the training data set to generate 10 additional time series. Plot the original training data set together with the bootstrapped series.

set.seed(123456)
bootseries <- bld.mbb.bootstrap(train, 10) %>% as.data.frame %>%
  ts(start = c(1995,1), frequency = 12)

autoplot(train) +
  autolayer(bootseries, colour = T) +
  autolayer(train, colour = F) +
  ylab("Bootstrapped series") +
  guides(colour = 'none')
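As an optional check (not required by the question), the sketch below inspects what bld.mbb.bootstrap() returns; it assumes, as documented in the forecast package, that the first series in the returned list is the original data.

#Optional sanity check: column 1 of bootseries should equal the training data,
#and there should be 10 series of the same length as train
all.equal(as.numeric(bootseries[, 1]), as.numeric(train))
dim(bootseries)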

(b) Using the baggedModel() function with fn = "ets", produce bagged ETS forecasts for the test set period.

set.seed(123456)

bagging.ETS <- train %>%
  baggedModel(bootstrapped_series = bld.mbb.bootstrap(train, 10), fn = "ets") %>%
  forecast(h = h)
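For reference only, the forecast package also provides the wrapper baggedETS(), which should produce the same kind of bagged ETS model; the sketch below is an assumed-equivalent call (the object name bagging.ETS2 is just for illustration).

set.seed(123456)
bagging.ETS2 <- train %>%
  baggedETS(bootstrapped_series = bld.mbb.bootstrap(train, 10)) %>%
  forecast(h = h)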

(c) Plot the tourist arrivals, along with the forecasts from the bagged ETS model.

autoplot(y) +
  autolayer(bagging.ETS, series = "Bagging ETS", PI = F)

(d) Assess the bagged ETS forecast accuracy.

accuracy(bagging.ETS, test)
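To make the comparison across all five approaches easier, the sketch below (not part of the required answer) collects the test-set RMSE and MAPE into a single table; it assumes the objects ETS, ARIMA, STLF, Combination and bagging.ETS from Questions 2 and 3 are still in the workspace.

#Gather test-set accuracy measures for the five approaches into one table
rbind(
  ETS         = accuracy(ETS, test)["Test set", c("RMSE", "MAPE")],
  ARIMA       = accuracy(ARIMA, test)["Test set", c("RMSE", "MAPE")],
  STLF        = accuracy(STLF, test)["Test set", c("RMSE", "MAPE")],
  Combination = accuracy(Combination, test)["Test set", c("RMSE", "MAPE")],
  Bagged.ETS  = accuracy(bagging.ETS, test)["Test set", c("RMSE", "MAPE")]
)

The approach with the smallest test-set errors would be judged the best performer on this data set.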

