1. Census data was collected on the 50 states and Washington, D.C. We are interested in determining whether average lifespan (y) is related to the ratio of males to females in percent (x1), birth rate per 1,000 people (x2), divorce rate per 1,000 people (x3), number of hospital beds per 100,000 people (x4), percentage of population 25 years or older having completed 16 years of school (x5) and per capita income (x6). The data is shown in Table 3.
Answer question 1.1 based on the R code and output below.
Set 1: Code and output
> null <- lm(y~1, data=df)   # null model
> full <- lm(y~., data=df)   # full model
> step(null, scope=list(lower=null, upper=full), direction="forward", k=2)
Start:  AIC=43.2
y ~ 1

       Df Sum of Sq    RSS    AIC
+ x2    1   11.0479 103.35 40.021
+ x4    1    7.9440 106.45 41.530
+ x1    1    4.5632 109.83 43.124
<none>              114.40 43.200
+ x3    1    3.3073 111.09 43.704
+ x6    1    1.5249 112.87 44.516
+ x5    1    0.4279 113.97 45.009

Step:  AIC=40.02
y ~ x2

       Df Sum of Sq     RSS    AIC
+ x1    1   17.9252  85.424 32.306
+ x4    1   15.5924  87.757 33.680
<none>              103.349 40.021
+ x5    1    2.2680 101.081 40.889
+ x3    1    1.1739 102.175 41.438
+ x6    1    0.4647 102.885 41.791

Step:  AIC=32.31
y ~ x2 + x1

       Df Sum of Sq    RSS    AIC
+ x4    1    7.4985 77.926 29.620
+ x3    1    4.6730 80.751 31.437
<none>              85.424 32.306
+ x5    1    1.2969 84.127 33.526
+ x6    1    0.1768 85.247 34.200

Step:  AIC=29.62
y ~ x2 + x1 + x4

       Df Sum of Sq    RSS    AIC
+ x3    1    9.4142 68.511 25.054
+ x5    1    5.0313 72.894 28.216
<none>              77.926 29.620
+ x6    1    0.2646 77.661 31.447

Step:  AIC=25.05
y ~ x2 + x1 + x4 + x3

       Df Sum of Sq    RSS    AIC
+ x5    1    6.8556 61.656 21.677
<none>              68.511 25.054
+ x6    1    1.4207 67.091 25.985

Step:  AIC=21.68
y ~ x2 + x1 + x4 + x3 + x5

       Df Sum of Sq    RSS    AIC
<none>              61.656 21.677
+ x6    1   0.85279 60.803 22.966

Call:
lm(formula = y ~ x2 + x1 + x4 + x3 + x5, data = df)

Coefficients:
(Intercept)           x2           x1           x4           x3           x5
  70.158910    -0.468656     0.115416    -0.003465    -0.207270     0.175312

1.1. Which stepwise-type procedure of variable selection is adopted? Which evaluating criterion is used to evaluate the variable selection procedure? What is the selected subset regression model? [4 marks]
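The AIC column printed by step() can be reproduced from the RSS column: R's extractAIC() uses n*log(RSS/n) + 2*edf, where edf counts the intercept along with the p slope terms. A quick check in Python (RSS values transcribed from the output above; n = 51 observations):

```python
import math

def step_aic(rss, p, n=51):
    """AIC as used by R's step()/extractAIC():
    n*log(RSS/n) + 2*(p + 1), with p the number of slope terms
    (the +1 counts the intercept)."""
    return n * math.log(rss / n) + 2 * (p + 1)

# Null model y ~ 1: RSS = 114.40, printed AIC = 43.200
print(round(step_aic(114.40, 0), 2))
# Selected model y ~ x2 + x1 + x4 + x3 + x5: RSS = 61.656, printed AIC = 21.677
print(round(step_aic(61.656, 5), 2))
```

Because k=2 in the step() call, a variable enters only if it reduces n*log(RSS/n) by more than 2, which is exactly why x6 is never added.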

Answer question 1.2 based on the R code and output below.

Set 2: Code and output
> null <- lm(y~1, data=df)   # null model
> full <- lm(y~., data=df)   # full model
> step(full, data=df, direction="backward", k=2)
Start:  AIC=22.97
y ~ x1 + x2 + x3 + x4 + x5 + x6

       Df Sum of Sq    RSS    AIC
- x6    1    0.8528 61.656 21.677
<none>              60.803 22.966
- x5    1    6.2877 67.091 25.985
- x3    1    9.7600 70.563 28.559
- x1    1    9.8502 70.653 28.624
- x4    1   16.0591 76.862 32.920
- x2    1   26.7569 87.560 39.565

Step:  AIC=21.68
y ~ x1 + x2 + x3 + x4 + x5

       Df Sum of Sq    RSS    AIC
<none>              61.656 21.677
- x5    1    6.8556 68.511 25.054
- x1    1    8.9979 70.654 26.624
- x3    1   11.2386 72.894 28.216
- x4    1   17.7757 79.431 32.596
- x2    1   30.0115 91.667 39.903

Call:
lm(formula = y ~ x1 + x2 + x3 + x4 + x5, data = df)

Coefficients:
(Intercept)           x1           x2           x3           x4           x5
  70.158910     0.115416    -0.468656    -0.207270    -0.003465     0.175312

1.2. Which stepwise-type procedure of variable selection is adopted? Which evaluating criterion is used to evaluate the variable selection procedure? What is the selected subset regression model? [4 marks]

Answer questions 1.3 and 1.4 based on the R code and output below.

Set 3: Code and output
> all <- regsubsets(y~., data=df, nbest=10)
> all.summary <- summary(all)
> all.summary$bic
 [1]  2.68397628  4.19314141  5.78761114  6.36746854  7.17924335  7.67251926
 [7] -3.09897871 -1.72493202  3.19314670  5.48413945  6.03317934  6.11525236
[13]  6.38595699  6.41683754  7.33514349  7.56036287 -3.85272504 -2.70016722
[19] -2.03623596 -1.79105692  0.05263121  0.71122774  0.72718292  2.72021576
[25]  5.25394919  5.56869268 -6.48740378 -4.91712887 -3.32482511 -1.30830766
[31] -0.09439284  0.95435744  1.05519867  1.76749956  1.86758210  6.02758138
[37] -7.93266845 -3.62423716 -1.05081966 -0.98560503  3.31009259  9.95591094
[43] -4.71117342
> which.min(all.summary$bic)
[1] 37
> coef(all, 37)
 (Intercept)           x1           x2           x3           x4           x5
70.158910425  0.115416006 -0.468655749 -0.207270195 -0.003465495  0.175312113

1.3. Determine which of the explanatory variables should be included in the regression model by using the all possible regressions approach and the BIC criterion. [4 marks]

1.4. Describe the difference between AIC and BIC as criteria for evaluating subset regression models. [2 marks]
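The bic vector returned by summary(regsubsets(...)) is reported relative to the intercept-only model. Under that convention (an assumption here, checked numerically against the printed vector), entry 37 can be reproduced from RSS values that appear in the step() output; the sketch below also shows why BIC favours smaller models than AIC:

```python
import math

n, rss_null = 51, 114.40   # RSS of y ~ 1, taken from the step() output

def subsets_bic(rss, p):
    # BIC relative to the null model: n*log(RSS/RSS_null) + (p+1)*log(n).
    # This exact form is an assumption, verified against the printed vector.
    return n * math.log(rss / rss_null) + (p + 1) * math.log(n)

# Model 37 is y ~ x1+x2+x3+x4+x5 with RSS = 61.656; printed bic[37] = -7.93266845
print(round(subsets_bic(61.656, 5), 2))

# Per-parameter penalties: 2 for AIC versus log(n) for BIC; log(51) > 2,
# so BIC penalizes extra variables more heavily than AIC here.
print(round(math.log(n), 3))
```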

Here is the correlation matrix of the aforementioned data set together with its inverse. Answer questions 1.5 and 1.6 based on the R code and output below.

Set 4: Code and output
> (R <- cor(X))
             x1          x2          x3            x4           x5            x6
x1  1.000000000  0.12875571  0.23620929 -0.0419489957 -0.005199729 -0.1550438171
x2  0.128755708  1.00000000  0.38945363 -0.2433065008  0.054913239 -0.1918439419
x3  0.236209288  0.38945363  1.00000000 -0.1989572556 -0.242796447 -0.0901082186
x4 -0.041948996 -0.24330650 -0.19895726  1.0000000000  0.376494507 -0.0007031934
x5 -0.005199729  0.05491324 -0.24279645  0.3764945073  1.000000000 -0.1862064803
x6 -0.155043817 -0.19184394 -0.09010822 -0.0007031934 -0.186206480  1.0000000000
> solve(R)
             x1          x2           x3           x4          x5          x6
x1  1.080711219 -0.01566189 -0.242901257  0.003993423 -0.02841921  0.13737652
x2 -0.015661892  1.32360877 -0.502898377  0.331061061 -0.29116945  0.15219793
x3 -0.242901257 -0.50289838  1.346240379 -0.001750119  0.36410931  0.05496726
x4  0.003993423  0.33106106 -0.001750119  1.269263723 -0.50177407 -0.02856753
x5 -0.028419215 -0.29116945  0.364109308 -0.501774073  1.33424479  0.22063610
x6  0.137376520  0.15219793  0.054967259 -0.028567532  0.22063610  1.09651442

1.5. Determine the state of multicollinearity. [2 marks]

1.6. Determine the variance inflation factors and interpret your results. [4 marks]
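For question 1.6, note that with standardized predictors the variance inflation factors are exactly the diagonal elements of the inverse correlation matrix, VIF_j = 1/(1 - R_j^2). A numerical check in Python/numpy, with the entries transcribed (rounded) from the output above:

```python
import numpy as np

# Correlation matrix of x1..x6, as printed by cor(X) above (rounded entries)
R = np.array([
    [ 1.000000,  0.128756,  0.236209, -0.041949, -0.005200, -0.155044],
    [ 0.128756,  1.000000,  0.389454, -0.243307,  0.054913, -0.191844],
    [ 0.236209,  0.389454,  1.000000, -0.198957, -0.242796, -0.090108],
    [-0.041949, -0.243307, -0.198957,  1.000000,  0.376495, -0.000703],
    [-0.005200,  0.054913, -0.242796,  0.376495,  1.000000, -0.186206],
    [-0.155044, -0.191844, -0.090108, -0.000703, -0.186206,  1.000000],
])

# VIF_j is the j-th diagonal element of R^{-1},
# so it can be read directly off the solve(R) output
vif = np.diag(np.linalg.inv(R))
print(np.round(vif, 4))
```

All VIFs are close to 1 and far below the usual cutoffs of 5 or 10, consistent with only mild multicollinearity.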

2. Table 4 is a data set where the response variable yi is a binary outcome; thus we use a logistic regression to fit the data. The data set describes Alzheimer's Disease (AD): y takes the value one to indicate AD status, and zero for a healthy subject. Four factors are reported for each subject, where x1 is the gender (x1 = 0 for male and 1 for female), x2 is the age, x3 is the exercise volume, and x4 is the level of omega-3 taken.
Here are the code and output.
> mod <- glm(y~x1+x2+x3+x4, family = binomial)
> mod

Call:  glm(formula = y ~ x1 + x2 + x3 + x4, family = binomial)

Coefficients:
(Intercept)           x1           x2           x3           x4
   -6.20215     -1.21769      0.07502      1.03053      0.35734

Degrees of Freedom: 29 Total (i.e. Null);  25 Residual
Null Deviance:      39.43
Residual Deviance: 33.18        AIC: 43.18

> mod.age <- glm(y~x2, family = binomial)
> mod.age

Call:  glm(formula = y ~ x2, family = binomial)

Coefficients:
(Intercept)           x2
   -4.81263      0.06335

Degrees of Freedom: 29 Total (i.e. Null);  28 Residual
Null Deviance:      39.43
Residual Deviance: 37.6         AIC: 41.6
> summary(mod.age)
Call:
glm(formula = y ~ x2, family = binomial)
Deviance Residuals:
    Min       1Q   Median       3Q      Max
-1.4204  -0.9597  -0.7414   1.1593   1.8365
Coefficients:
            Estimate Std. Error z value Pr(>|z|)
(Intercept) -4.81263    3.29827  -1.459    0.145
x2           0.06335    0.04828   1.312    0.189
(Dispersion parameter for binomial family taken to be 1)
    Null deviance: 39.429  on 29  degrees of freedom
Residual deviance: 37.602  on 28  degrees of freedom
AIC: 41.602
> residuals(mod.age, type="deviance")
         1          2          3          4          5          6
-0.9226183  1.6463951 -0.8018902 -0.9660532 -0.7119222  1.1606590
         7          8          9         10         11         12
-0.7259699 -0.7676618 -1.4204390  1.1163407 -1.0058724 -1.0888297
        13         14         15         16         17         18
 1.5998183  1.2924501 -0.8211637 -0.9614134  1.2389406  1.4269942
        19         20         21         22         23         24
-0.7567660  1.1553136  1.8364912 -0.8058069 -1.1542125  1.4074488
        25         26         27         28         29         30
-0.6924079 -0.9631213 -0.9545991 -1.1861493 -0.7073516  1.0789951

2.1. When all four factors are considered as explanatory variables for the development of AD, present the fitted logistic regression model. [2 marks]

2.2. When one looks only at the effect of age on the status of AD, present the fitted logistic regression model. [2 marks]

2.3. Based on Question 2.2, calculate and interpret the odds ratio. [4 marks]

The deviance is defined as

    D = 2 Σ_i { y_i log[ y_i / (n_i π̂_i) ] + (n_i − y_i) log[ (n_i − y_i) / (n_i (1 − π̂_i)) ] }.

2.4. What is a good rule of thumb for checking the model adequacy based on the deviance? [3 marks]

2.5. When the model is adequate and the sample size is large, state the distribution of the deviance. [2 marks]

2.6. Interpret whether the fit is good based on the deviance residuals. [4 marks]

2.7. Is the sign of the aforementioned deviance residual the same as that of the corresponding ordinary residual? Why? [3 marks]
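For 2.3 and 2.4, the odds ratio per unit increase of a predictor is exp(beta), and a common rule of thumb compares the residual deviance with its degrees of freedom. A sketch using the numbers read off the mod.age output above:

```python
import math

# Age-only fit from the output above: logit(pi) = -4.81263 + 0.06335 * x2
beta_age = 0.06335
or_age = math.exp(beta_age)
print(round(or_age, 4))   # odds of AD multiply by roughly 1.07 per extra year

# Rule of thumb: residual deviance / df close to 1 suggests no gross lack of
# fit; for large samples the deviance is approximately chi-squared on its df.
deviance, df = 37.602, 28
print(round(deviance / df, 3))
```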

3. Assuming the MLR model for the data set in Table 1, we apply ordinary least squares to fit the model. Table 2 presents the ordinary residual ei and four scaled residuals: the standardized residual di, the studentized residual ri, the PRESS residual e(i), and the R-student ti, for i = 1, ..., 28.
The hat matrix H (28×28) of the data and model is also given below.
Set 1: Code and output
> mod <- lm(y~x1+x2+x3+x4, data=data)
> summary(mod)
Call:
lm(formula = y ~ x1 + x2 + x3 + x4, data = data)
Residuals:
     Min       1Q   Median       3Q      Max
-10.9854  -2.3803  -0.3131   1.8522  13.4620
Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)   0.2003     6.1874   0.032 0.974454
x1            1.4395     0.9358   1.538 0.137624
x2           -4.4718     1.0289  -4.346 0.000237 ***
x3            3.6805     2.9447   1.250 0.223917
x4            0.9434     0.6705   1.407 0.172779
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 4.554 on 23 degrees of freedom
Multiple R-squared:  0.4595,    Adjusted R-squared:  0.3655
F-statistic: 4.888 on 4 and 23 DF,  p-value: 0.005314

> (round(H <- X %*% solve(t(X) %*% X) %*% t(X), 2))
       [,1]  [,2]  [,3]  [,4]  [,5]  [,6]  [,7]  [,8]  [,9]
 [1,]  0.39 -0.16 -0.11  0.07  0.11  0.01  0.12  0.05  0.03
 [2,] -0.16  0.17  0.08 -0.02  0.02  0.02 -0.06  0.05 -0.01
 [3,] -0.11  0.08  0.20  0.07  0.02  0.04  0.11 -0.05  0.11
 [4,]  0.07 -0.02  0.07  0.23 -0.04  0.10  0.14 -0.04 -0.01
 [5,]  0.11  0.02  0.02 -0.04  0.19 -0.10  0.05  0.01 -0.01
 [6,]  0.01  0.02  0.04  0.10 -0.10  0.15  0.04  0.07  0.09
 [7,]  0.12 -0.06  0.11  0.14  0.05  0.04  0.20 -0.06  0.09
 [8,]  0.05  0.05 -0.05 -0.04  0.01  0.07 -0.06  0.13  0.04
 [9,]  0.03 -0.01  0.11 -0.01 -0.01  0.09  0.09  0.04  0.24
[10,] -0.01  0.08  0.07  0.09  0.12 -0.06  0.06 -0.04 -0.11
[11,] -0.06  0.08  0.05 -0.01 -0.04  0.10 -0.02  0.09  0.11
[12,]  0.04  0.10 -0.13 -0.05  0.02  0.05 -0.15  0.17 -0.09
[13,] -0.20  0.19  0.12 -0.03  0.06 -0.01 -0.05  0.02 -0.01
[14,]  0.18 -0.03 -0.01  0.12  0.15 -0.06  0.11 -0.04 -0.11
[15,] -0.01  0.05  0.08  0.16  0.02  0.04  0.09 -0.03 -0.05
[16,] -0.01  0.06  0.06  0.13  0.04  0.02  0.05 -0.02 -0.08
[17,]  0.16 -0.05 -0.04  0.07 -0.02  0.10  0.05  0.08  0.07
[18,]  0.10 -0.02  0.07 -0.05  0.09  0.01  0.09  0.03  0.18
[19,]  0.18 -0.01 -0.06 -0.07  0.14 -0.03  0.00  0.08  0.03
[20,]  0.08  0.02  0.05 -0.12  0.17 -0.06  0.04  0.04  0.14
[21,]  0.02  0.09 -0.03 -0.03  0.08 -0.01 -0.06  0.08 -0.06
[22,]  0.03  0.02  0.01  0.10 -0.08  0.13  0.02  0.07  0.05
[23,]  0.06  0.03 -0.02  0.06 -0.02  0.09  0.00  0.08  0.02
[24,]  0.05  0.03 -0.04  0.07 -0.06  0.12 -0.02  0.10  0.01
[25,] -0.02  0.01  0.15  0.13 -0.04  0.10  0.14 -0.03  0.14
[26,] -0.04  0.06  0.08  0.03 -0.01  0.07  0.03  0.04  0.09
[27,] -0.06  0.11  0.09 -0.08  0.11 -0.03  0.00  0.03  0.07
[28,]  0.00  0.07  0.02  0.00  0.02  0.05 -0.02  0.07  0.03
      [,10] [,11] [,12] [,13] [,14] [,15] [,16] [,17] [,18]
 [1,] -0.01 -0.06  0.04 -0.20  0.18 -0.01 -0.01  0.16  0.10
 [2,]  0.08  0.08  0.10  0.19 -0.03  0.05  0.06 -0.05 -0.02
 [3,]  0.07  0.05 -0.13  0.12 -0.01  0.08  0.06 -0.04  0.07
 [4,]  0.09 -0.01 -0.05 -0.03  0.12  0.16  0.13  0.07 -0.05
 [5,]  0.12 -0.04  0.02  0.06  0.15  0.02  0.04 -0.02  0.09
 [6,] -0.06  0.10  0.05 -0.01 -0.06  0.04  0.02  0.10  0.01
 [7,]  0.06 -0.02 -0.15 -0.05  0.11  0.09  0.05  0.05  0.09
 [8,] -0.04  0.09  0.17  0.02 -0.04 -0.03 -0.02  0.08  0.03
 [9,] -0.11  0.11 -0.09 -0.01 -0.11 -0.05 -0.08  0.07  0.18
[10,]  0.21 -0.05  0.02  0.12  0.19  0.14  0.15 -0.06 -0.05
[11,] -0.05  0.12  0.07  0.07 -0.10 -0.01 -0.02  0.05  0.05
[12,]  0.02  0.07  0.31  0.06  0.01 -0.01  0.03  0.07 -0.07
[13,]  0.12  0.07  0.06  0.23 -0.01  0.06  0.08 -0.09 -0.01
[14,]  0.19 -0.10  0.01 -0.01  0.27  0.12  0.13  0.02 -0.02
[15,]  0.14 -0.01 -0.01  0.06  0.12  0.15  0.13  0.00 -0.05
[16,]  0.15 -0.02  0.03  0.08  0.13  0.13  0.13  0.00 -0.06
[17,] -0.06  0.05  0.07 -0.09  0.02  0.00  0.00  0.14  0.04
[18,] -0.05  0.05 -0.07 -0.01 -0.02 -0.05 -0.06  0.04  0.19
[19,]  0.02  0.01  0.11 -0.02  0.09 -0.04 -0.02  0.06  0.09
[20,]  0.01  0.03 -0.03  0.05  0.02 -0.07 -0.06 -0.01  0.19
[21,]  0.09  0.03  0.17  0.09  0.07  0.03  0.06  0.01 -0.02
[22,] -0.03  0.08  0.07 -0.01 -0.03  0.05  0.03  0.10 -0.01
[23,]  0.00  0.06  0.11  0.00  0.01  0.03  0.03  0.08 -0.01
[24,] -0.02  0.07  0.15 -0.01 -0.01  0.03  0.03  0.10 -0.03
[25,]  0.00  0.06 -0.13  0.02 -0.02  0.07  0.04  0.04  0.08
[26,]  0.00  0.08  0.01  0.06 -0.04  0.03  0.02  0.03  0.05
[27,]  0.06  0.06  0.01  0.14 -0.01 -0.01  0.00 -0.05  0.10
[28,]  0.02  0.06  0.10  0.06 -0.01  0.02  0.03  0.03  0.02
      [,19] [,20] [,21] [,22] [,23] [,24] [,25] [,26] [,27]
 [1,]  0.18  0.08  0.02  0.03  0.06  0.05 -0.02 -0.04 -0.06
 [2,] -0.01  0.02  0.09  0.02  0.03  0.03  0.01  0.06  0.11
 [3,] -0.06  0.05 -0.03  0.01 -0.02 -0.04  0.15  0.08  0.09
 [4,] -0.07 -0.12 -0.03  0.10  0.06  0.07  0.13  0.03 -0.08
 [5,]  0.14  0.17  0.08 -0.08 -0.02 -0.06 -0.04 -0.01  0.11
 [6,] -0.03 -0.06 -0.01  0.13  0.09  0.12  0.10  0.07 -0.03
 [7,]  0.00  0.04 -0.06  0.02  0.00 -0.02  0.14  0.03  0.00
 [8,]  0.08  0.04  0.08  0.07  0.08  0.10 -0.03  0.04  0.03
 [9,]  0.03  0.14 -0.06  0.05  0.02  0.01  0.14  0.09  0.07
[10,]  0.02  0.01  0.09 -0.03  0.00 -0.02  0.00  0.00  0.06
[11,]  0.01  0.03  0.03  0.08  0.06  0.07  0.06  0.08  0.06
[12,]  0.11 -0.03  0.17  0.07  0.11  0.15 -0.13  0.01  0.01
[13,] -0.02  0.05  0.09 -0.01  0.00 -0.01  0.02  0.06  0.14
[14,]  0.09  0.02  0.07 -0.03  0.01 -0.01 -0.02 -0.04 -0.01
[15,] -0.04 -0.07  0.03  0.05  0.03  0.03  0.07  0.03 -0.01
[16,] -0.02 -0.06  0.06  0.03  0.03  0.03  0.04  0.02  0.00
[17,]  0.06 -0.01  0.01  0.10  0.08  0.10  0.04  0.03 -0.05
[18,]  0.09  0.19 -0.02 -0.01 -0.01 -0.03  0.08  0.05  0.10
[19,]  0.17  0.15  0.09 -0.02  0.03  0.02 -0.06 -0.01  0.06
[20,]  0.15  0.26  0.04 -0.07 -0.03 -0.07  0.01  0.03  0.16
[21,]  0.09  0.04  0.13  0.01  0.05  0.05 -0.07  0.01  0.06
[22,] -0.02 -0.07  0.01  0.12  0.09  0.12  0.07  0.06 -0.04
[23,]  0.03 -0.03  0.05  0.09  0.08  0.10  0.02  0.03 -0.02
[24,]  0.02 -0.07  0.05  0.12  0.10  0.14  0.01  0.04 -0.04
[25,] -0.06  0.01 -0.07  0.07  0.02  0.01  0.18  0.08  0.02
[26,] -0.01  0.03  0.01  0.06  0.03  0.04  0.08  0.07  0.05
[27,]  0.06  0.16  0.06 -0.04 -0.02 -0.04  0.02  0.05  0.16
[28,]  0.04  0.03  0.06  0.05  0.05  0.06  0.01  0.04  0.05
      [,28]
 [1,]  0.00
 [2,]  0.07
 [3,]  0.02
 [4,]  0.00
 [5,]  0.02
 [6,]  0.05
 [7,] -0.02
 [8,]  0.07
 [9,]  0.03
[10,]  0.02
[11,]  0.06
[12,]  0.10
[13,]  0.06
[14,] -0.01
[15,]  0.02
[16,]  0.03
[17,]  0.03
[18,]  0.02
[19,]  0.04
[20,]  0.03
[21,]  0.06
[22,]  0.05
[23,]  0.05
[24,]  0.06
[25,]  0.01
[26,]  0.04
[27,]  0.05
[28,]  0.06

3.1. Let X (28×5) be the design matrix in the aforementioned MLR model. Present the formula of H in terms of X. [1 mark]

3.2. Let Y and Ŷ be the vectors of response and LS fitted values, respectively. Present the association of Y and Ŷ through the hat matrix. [1 mark]

3.3. Find the greatest diagonal element in the hat matrix and give an interpretation based on its magnitude. [2 marks]

3.4. Which is more precise, the standardized residual di or the studentized residual ri, and why? Which observation in the table can be used as evidence to support your answer? [4 marks]

3.5. Determine the variances of the second and the twenty-first studentized residuals, i.e. Var(r2) and Var(r21), respectively. [4 marks]

3.6. For the four scaled residuals, please classify them into two categories and justify your classification. [2 marks]

3.7. Determine the influential points or outliers among all observations based on Table 2 and the hat matrix. [4 marks]

3.8. What is the difference between the PRESS residual and the PRESS statistic? [2 marks]

Attachments:

Table 1: Table for Q2
 i      y    x1    x2    x3    x4
 1  12.32  1.74 -0.36  0.21  6.44
 2  23.45  0.49 -4.62  0.55  3.29
 3  23.78 -0.64 -3.93  0.83  6.34
 4  22.44  1.92 -2.37  0.96  6.08
 5  16.59 -0.27 -2.24  0.51  4.33
 6  29.40  1.74 -3.51  0.37  6.11
 7  22.62  0.22 -2.00  0.79  7.47
 8  19.94  1.51 -3.38  0.06  4.32
 9  19.68 -0.49 -3.48  0.16  7.95
10  19.95  0.68 -2.75  0.99  3.40
11  24.18  0.76 -4.16  0.22  5.33
12  22.67  2.62 -3.44  0.07  2.03
13  40.10 -0.12 -4.72  0.67  3.22
14  14.16  1.23 -1.27  0.88  4.28
15  20.43  1.32 -2.91  0.96  4.71
16  20.06  1.37 -2.93  0.90  4.05
17  17.55  1.88 -2.36  0.22  6.11
18  23.55 -0.77 -2.79  0.20  7.13
19  15.31  0.67 -2.14  0.12  4.57
20  19.08 -1.26 -2.86  0.16  5.82
21  16.19  1.29 -3.23  0.38  2.85
22  29.74  1.97 -3.32  0.40  5.55
23  24.82  1.87 -3.05  0.35  4.83
24  25.51  2.38 -3.24  0.30  4.61
25  26.19  0.19 -3.34  0.71  7.41
26  13.66  0.49 -3.76  0.44  5.62
27  18.51 -0.80 -3.86  0.40  4.64
28  22.71  1.04 -3.53  0.35  4.44

Table 2: Diagnostics Table for Q3
 i  (1) ordinary  (2) standardized  (3) studentized  (4) PRESS  (5) R-student
 1       1.16          0.25              0.32            1.89        0.07
 2      -3.24         -0.71             -0.78           -3.91       -0.17
 3      -2.11         -0.46             -0.52           -2.62       -0.11
 4      -0.39         -0.09             -0.10           -0.51       -0.02
 5       0.80          0.18              0.20            1.00        0.04
 6       3.87          0.85              0.92            4.55        0.20
 7       3.21          0.70              0.78            3.99        0.17
 8      -1.84         -0.40             -0.43           -2.11       -0.09
 9      -3.46         -0.76             -0.87           -4.58       -0.19
10      -0.38         -0.08             -0.09           -0.48       -0.02
11      -1.56         -0.34             -0.36           -1.77       -0.08
12       1.14          0.25              0.30            1.64        0.06
13      13.46          2.96              3.36           17.43        1.34
14      -0.77         -0.17             -0.20           -1.05       -0.04
15      -2.66         -0.58             -0.63           -3.12       -0.14
16      -2.34         -0.51             -0.55           -2.71       -0.12
17      -2.49         -0.55             -0.59           -2.88       -0.13
18       4.52          0.99              1.10            5.59        0.24
19      -0.18         -0.04             -0.04           -0.22       -0.01
20       1.82          0.40              0.47            2.47        0.10
21      -4.39         -0.96             -1.03           -5.05       -0.23
22       5.15          1.13              1.21            5.87        0.27
23       2.44          0.54              0.56            2.65        0.12
24       1.94          0.43              0.46            2.25        0.10
25       1.17          0.26              0.28            1.43        0.06
26     -10.99         -2.41             -2.50          -11.79       -0.71
27      -3.65         -0.80             -0.87           -4.33       -0.19
28      -0.25         -0.05             -0.06           -0.26       -0.01

Table 3: Table for Q4
 i      x1     x2     x3       x4     x5    x6      y
 1  119.10  24.80   5.60   603.30  14.10  4638  69.31
 2   93.30  19.40   4.40   840.90   7.80  2892  69.05
 3   94.10  18.50   4.80   569.60   6.70  2791  70.66
 4   96.80  21.20   7.20   536.00  12.60  3614  70.55
 5   96.80  18.20   5.70   649.50  13.40  4423  71.71
 6   97.50  18.80   4.70   717.70  14.90  3838  72.06
 7   94.20  16.70   1.90   791.60  13.70  4871  72.48
 8   86.80  20.10   3.00  1859.40  17.80  4644  65.71
 9   95.20  19.20   3.20   926.80  13.10  4468  70.06
10   93.20  16.90   5.50   668.20  10.30  3698  70.66
11   94.60  21.10   4.10   705.40   9.20  3300  68.54
12  108.10  21.30   3.40   794.30  14.00  4599  73.60
13   94.60  17.10   2.50   773.90   9.10  3643  72.56
14   99.70  20.30   5.10   541.50  10.00  3243  71.87
15   94.20  18.50   3.30   871.00  10.30  4446  70.14
16   95.10  19.10   2.90   736.10   8.30  3709  70.88
17   96.20  17.00   3.90   854.60  11.40  3725  72.58
18   96.30  18.70   3.30   661.90   7.20  3076  70.10
19   94.70  20.40   1.40   724.00   9.00  3023  68.76
20   91.60  16.60   1.90  1103.80  12.60  4276  71.83
21   95.50  17.50   2.40   841.30  13.90  4267  70.22
22   94.80  17.90   3.90   919.50   8.40  3250  70.93
23   96.10  19.40   3.40   754.70   9.40  4041  70.63
24   96.00  18.00   2.20   905.40  11.10  3819  72.96
25   93.20  17.30   3.80   801.60   9.00  3654  70.69
26   94.00  22.10   3.70   763.10   8.10  2547  68.09
27   99.90  18.20   4.40   668.70  11.00  3395  70.56
28   95.90  19.30   2.70   658.80   8.50  3200  69.21
29  101.80  17.60   1.60   959.90   8.40  3077  72.79
30   95.40  17.30   2.50   866.10   9.60  3657  72.60
31   95.70  17.90   3.30   878.20  10.90  3720  71.23
32   93.70  16.80   1.50   713.10  11.80  4684  70.93
33   97.20  21.70   4.30   560.90  12.70  3045  70.32
34  102.80  19.60  18.70   560.70  10.80  4583  69.03
35   91.50  17.40   1.40  1056.20  11.90  4605  70.55
36   94.10  18.70   3.70   751.00   9.30  3949  70.82
37   94.90  17.50   6.60   664.60  10.00  3341  71.42
38   95.90  16.80   4.60   607.10  11.80  3677  72.13
39   92.40  16.30   1.90   948.90   8.70  3879  70.43
40   96.20  16.50   1.80   960.50   9.40  3878  71.90
41   96.50  20.10   2.20   739.90   9.00  2951  67.96
42   98.40  17.60   2.00   984.70   8.60  3108  72.08
43   93.70  18.40   4.20   831.60   7.90  3079  70.11
44   95.90  20.60   4.60   674.00  10.90  3507  70.90
45   97.60  25.50   3.70   470.50  14.00  3169  72.90
46   97.70  18.60   2.60   835.80  12.30  3677  70.08
47   95.60  18.80   2.30  1026.10  11.50  3447  71.64
48   98.70  17.80   5.20   556.40  12.70  3997  71.72
49   96.30  17.60   2.00   814.70   9.80  3712  72.48
50   93.90  17.80   3.20   950.40   6.80  3038  69.48
51  100.70  19.60   5.40   925.90  11.80  3672  70.29

Table 4: Table for Q5
 i  y  x1     x2    x3    x4
 1  0   0  65.96  0.86  1.03
 2  1   1  59.28  0.04  0.89
 3  0   1  60.66  0.32  0.75
 4  0   1  67.76  0.01  1.48
 5  0   1  56.34  0.24  1.50
 6  1   1  76.59  0.71  0.19
 7  0   0  57.04  0.31  1.42
 8  0   0  59.06  0.51  0.89
 9  0   1  84.73  0.05  0.99
10  1   1  78.26  0.56  1.84
11  0   1  69.37  0.12  2.56
12  0   1  72.62  0.89  3.65
13  1   0  60.91  0.01  1.23
14  1   1  71.76  0.78  3.61
15  0   0  61.54  0.09  0.84
16  0   1  67.57  0.52  2.66
17  1   0  73.70  0.38  4.55
18  1   0  66.97  0.07  1.97
19  0   0  58.54  0.32  1.33
20  1   0  76.79  0.67  1.99
21  1   0  52.58  0.93  3.78
22  0   0  60.84  0.47  0.86
23  0   0  75.10  0.14  3.37
24  1   0  67.66  0.54  3.33
25  0   0  55.35  0.20  2.34
26  0   1  67.64  0.90  2.01
27  0   1  67.29  0.39  1.54
28  0   1  76.29  0.31  1.63
29  0   1  56.11  0.16  2.65
30  1   0  79.69  0.90  4.07