
Predictive Analytics

Final exam sample question solutions
(Semester 2, 2022)


You might try to refine and improve the given sample solutions.
The number and level of questions in the real final exam may differ from those
in the sample questions.

Question 1 ()

(1) Trend: reflects the long-run growth or decline in the time series.

Cycle: slow rises and falls that do not follow a regularly repeating pattern
and have no fixed period.

Seasonal: rises and falls that are in a regular repeating pattern, on a seasonal
basis such as months of the year or days of the week.

Irregular fluctuations: assumed to be unexplainable random variations or
’unusual’ events.
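As an illustration of these components, a classical decomposition can be computed with statsmodels. The sketch below uses a hypothetical simulated monthly series (the data and the name y are illustrative only, not from the exam).

```python
# Minimal sketch: classical additive decomposition of a monthly series.
# The series `y` is hypothetical; only the decomposition call matters here.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120))

result = seasonal_decompose(y, model="additive", period=12)
print(result.trend.dropna().head())   # long-run growth/decline
print(result.seasonal.head(12))       # regular repeating pattern
print(result.resid.dropna().head())   # irregular fluctuations
```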

(2) The PACF measures the linear dependence between two variables after removing
the effect of the other variable(s) that affect both of them. For example, the
partial autocorrelation of order 2 measures the effect (linear dependence) of
Yt−2 on Yt after removing the effect of Yt−1 on both Yt and Yt−2. The ACF, in
contrast, measures the linear dependence (autocorrelation) directly and does not
remove the effect of other variable(s) that affect both variables.
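The difference can also be seen numerically by computing the sample ACF and PACF with statsmodels. This is a minimal sketch on a hypothetical simulated AR(2)-type series; the series and its coefficients are illustrative only.

```python
# Minimal sketch: sample ACF vs PACF of a simulated second-order autoregression.
# The series `y` is hypothetical; only the acf/pacf calls matter here.
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(1)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] + 0.2 * y[t - 2] + rng.normal()

print("ACF  (lags 1-3):", np.round(acf(y, nlags=3)[1:], 3))
print("PACF (lags 1-3):", np.round(pacf(y, nlags=3)[1:], 3))
# The PACF at lag 2 reflects the dependence of Y_t on Y_{t-2} after removing
# the effect of Y_{t-1}; the ACF at lag 2 does not remove that effect.
```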

(3) The MA(q) process is defined as:

Yt = εt + θ1εt−1 + θ2εt−2 + · · ·+ θqεt−q

We define the characteristic equation as:

1 + θ1z + θ2z^2 + · · · + θqz^q = 0

The MA(q) process is invertible if all the q roots of the characteristic equation
satisfy |z| > 1.
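This condition can be checked numerically by finding the roots of the characteristic polynomial. Below is a minimal sketch for a hypothetical MA(2) with coefficients chosen only for illustration.

```python
# Minimal sketch: check MA(q) invertibility via the roots of
# 1 + theta_1 z + ... + theta_q z^q = 0. The coefficients are hypothetical.
import numpy as np

theta = [0.5, 0.3]               # theta_1, theta_2 of an illustrative MA(2)
coeffs = [1.0] + list(theta)     # polynomial coefficients in increasing powers of z
roots = np.roots(coeffs[::-1])   # np.roots expects decreasing powers
print("roots:", roots)
print("invertible:", np.all(np.abs(roots) > 1))
```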

(4) Note: as mentioned in the question: “You only need to present
your final solution.” Therefore, you only need to type your answers as
highlighted in red:

The ARIMA(1, 1, 2)(0, 1, 0)7 model expression in the backshift operator form is:

(1 − φ1B)(1 − B)(1 − B^7)Yt = c + (1 + θ1B + θ2B^2)εt

Reasoning:

The term (1 − φ1B) corresponds to the AR(1) component.

The term (1 − B) corresponds to the 1st-order difference of the time series.

The term (1 − B^7) corresponds to the 1st-order seasonal difference of the time
series, since the seasonal period is 7 in the given model.

The term (1 + θ1B + θ2B^2) corresponds to the MA(2) component.

Lastly, there are no seasonal AR or MA components, thus the model
ARIMA(1, 1, 2)(0, 1, 0)7 in the backshift operator form can be written as:

(1 − φ1B)(1 − B)(1 − B^7)Yt = c + (1 + θ1B + θ2B^2)εt
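The same specification can be written down in statsmodels, where order = (p, d, q) and seasonal_order = (P, D, Q, m). The sketch below assumes a hypothetical daily series y with weekly seasonality (m = 7); the data are simulated for illustration only, and trend="c" plays the role of the constant c.

```python
# Minimal sketch: specifying an ARIMA(1,1,2)(0,1,0)_7 model with a constant.
# The series `y` is hypothetical; only the model specification matters here.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=200)) + 5 * np.tile(np.arange(7), 200 // 7 + 1)[:200]

model = SARIMAX(y, order=(1, 1, 2), seasonal_order=(0, 1, 0, 7), trend="c")
fit = model.fit(disp=False)
print(fit.summary())
```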

Question 2 ()

The condition that implies the given MA(1) process is invertible: |θ1| < 1.

Reason: the given MA(1) is Yt = εt + θ1εt−1. The characteristic equation is
1 + θ1z = 0 and its only root is z∗ = −1/θ1. |z∗| > 1 implies the MA(1) is
invertible. This means |θ1| < 1.

The condition that implies the given AR(1) process is stationary: |φ1| < 1.

Reason: the given AR(1) is Yt = φ1Yt−1 + εt. The characteristic equation is
1 − φ1z = 0 and its only root is z∗ = 1/φ1. |z∗| > 1 implies the AR(1) is
stationary. This means |φ1| < 1.
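Both conditions can be verified with a quick numerical check; θ1 and φ1 below are hypothetical illustration values.

```python
# Minimal sketch: the single characteristic roots of an MA(1) and an AR(1).
# theta1 and phi1 are hypothetical values chosen for illustration.
theta1 = 0.6
phi1 = 0.9

ma_root = -1.0 / theta1   # root of 1 + theta1*z = 0
ar_root = 1.0 / phi1      # root of 1 - phi1*z = 0

print("MA(1) invertible:", abs(ma_root) > 1)   # True exactly when |theta1| < 1
print("AR(1) stationary:", abs(ar_root) > 1)   # True exactly when |phi1| < 1
```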
Question 3 ()

Note: as mentioned in the question: “You only need to present your final
solution.” Therefore, you only need to type your answers as highlighted in red:

Cov(Yt, Yt−1) = Cov(c + φ1Yt−1 + φ2Yt−2 + εt, Yt−1)
             = φ1Var(Yt−1) + φ2Cov(Yt−2, Yt−1)

Under the stationarity condition we have Var(Yt−1) = Var(Yt) and
Cov(Yt−2, Yt−1) = Cov(Yt, Yt−1), so Cov(Yt, Yt−1) = φ1Var(Yt) + φ2Cov(Yt, Yt−1),
which gives Cov(Yt, Yt−1) = φ1Var(Yt)/(1 − φ2).

Question 4 ()

Note: as mentioned in the question: “You only need to present your final
solution.” Therefore, you only need to type your answers as highlighted in red:

The values of p, d, q, P, D, Q, m of the given model are:

p = 1, d = 0, q = 1, P = 1, D = 1, Q = 0, m = 12

Reasoning:

The term (1 − φ1B) means the AR component has lag 1, thus p = 1.

The term (1 − Φ1B^12) means the seasonal AR component has lag 1, thus P = 1.

The term (1 − B^12) is equal to (1 − B^12)^1. This means the order of the
seasonal difference of the time series is 1, thus D = 1. Here, we can also see
that the seasonal period is m = 12.

The term (1 + θ1B) means the MA component has lag 1, thus q = 1.

Lastly, the model does not apply any other differencing to the original time
series and has no seasonal MA component, thus d = 0 and Q = 0.

Question 5 ()

Solution:

(1) The number of layers is 3. The number of hidden layers is 1. The number of
inputs is 3. The number of hidden neurons is 2. The number of output neurons
is 1.

z1 = σ(0.1 ∗ x1 + 0.3 ∗ x2 + 0.5 ∗ x3 + 1)
   = σ(0.1 ∗ 1 + 0.3 ∗ 2.5 + 0.5 ∗ 0.7 + 1)
   = 0.9002 ≈ 0.900

η = β0 + β1z1 + β2z2 = 0 + 0.7σ(s1) + 0.8σ(s2)
  = 0.7σ(0.1 ∗ 1 + 0.3 ∗ 2.5 + 0.5 ∗ 0.7 + 1) + 0.8σ(0.2 ∗ 1 + 0.4 ∗ 2.5 + 0.6 ∗ 0.7 + 2)
  = 0.7 ∗ 0.9002 + 0.8 ∗ 0.9739 (only do the decimal rounding at the last step of the calculation)
  = 1.4093 ≈ 1.409

Question 6 ()

The input and output pairs of training data are:

Input: Y1, Y5, Y6, Y7, Y8; Output: Y9
Input: Y2, Y6, Y7, Y8, Y9; Output: Y10
Input: Y3, Y7, Y8, Y9, Y10; Output: Y11
Input: Y4, Y8, Y9, Y10, Y11; Output: Y12

Question 7 ()

Note: as mentioned in the question: “You only need to present your final
solution.” Therefore, you only need to type your answers as follows. The
solution details can be found in the lecture slides and recording.

The required specifications that map the basic RNN to the SES model are:

xt = yt−1, ht = lt−1, W = (1 − α),

END OF EXAMINATION.