
University of California, Los Angeles Department of Statistics
Statistics 100B Instructor: Nicolas Christou
Multivariate normal distribution
One of the most important distributions in statistical inference is the multivariate normal distribution. The probability density function of the multivariate normal distribution, its moment generating function, and its properties are discussed here.
Probability density function
We say that a random vector Y = (Y1, Y2, . . . , Yn)′ with mean vector μ and variance-covariance matrix Σ follows the multivariate normal distribution if its probability density function is given by
f(Y) = (2π)^{−n/2} |Σ|^{−1/2} e^{−(1/2)(Y−μ)′Σ^{−1}(Y−μ)},   (1)
and we write Y ∼ Nn(μ, Σ). If Y = (Y1, Y2)′ then we have a bivariate normal distribution
and its probability density function can be expressed as
f(y1, y2) = [1 / (2πσ1σ2√(1 − ρ²))] exp{ −[1 / (2(1 − ρ²))] [ ((y1 − μ1)/σ1)² + ((y2 − μ2)/σ2)² − 2ρ((y1 − μ1)/σ1)((y2 − μ2)/σ2) ] }.

Here, we have Σ = ( σ1² σ12 ; σ21 σ2² ). The previous expression can be obtained by finding the inverse of Σ and substituting it into (1). Here is the bivariate normal pdf:

[Figure: surface plot of the bivariate normal density f(x, y) over the (x, y) plane.]
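As a quick numerical sanity check, the sketch below (Python with numpy/scipy; the values of μ1, μ2, σ1, σ2, ρ are illustrative, not from the handout) evaluates the formula above directly and compares it with scipy's built-in multivariate normal density, using the standard relation σ12 = ρσ1σ2:

import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical parameter values chosen only for illustration.
mu1, mu2 = 0.0, 1.0
sigma1, sigma2, rho = 2.0, 1.5, 0.6
Sigma = np.array([[sigma1**2,             rho * sigma1 * sigma2],
                  [rho * sigma1 * sigma2, sigma2**2]])

def bivariate_pdf(y1, y2):
    # Bivariate normal density written directly from the formula above.
    z = ((y1 - mu1) / sigma1) ** 2 + ((y2 - mu2) / sigma2) ** 2 \
        - 2 * rho * ((y1 - mu1) / sigma1) * ((y2 - mu2) / sigma2)
    const = 1.0 / (2 * np.pi * sigma1 * sigma2 * np.sqrt(1 - rho**2))
    return const * np.exp(-z / (2 * (1 - rho**2)))

y = np.array([0.5, 2.0])
print(bivariate_pdf(*y))                                        # formula above
print(multivariate_normal(mean=[mu1, mu2], cov=Sigma).pdf(y))   # library; same value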

Moment generating function
A useful tool in statistical theory is the moment generating function. The joint moment generating function is defined as MY(t) = E[e^{t′Y}] = E[e^{t1Y1 + t2Y2 + · · · + tnYn}], where Y = (Y1, Y2, . . . , Yn)′ and t = (t1, t2, . . . , tn)′. We will find the joint moment generating function of Y ∼ Nn(μ, Σ).
• Suppose Z ∼ Nn(0, I). Since Z1, Z2, . . . , Zn are independent, the joint moment generating function of Z is MZ(t) = e^{(1/2)t′t}.
• If Y ∼ Nn(μ, Σ), show that Z = Σ^{−1/2}(Y − μ) follows Nn(0, I).
• Aside note: What is Σ^{−1/2}?
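One standard answer, assuming Σ is positive definite: take the spectral decomposition Σ = PDP′ and set Σ^{−1/2} = PD^{−1/2}P′, the symmetric inverse square root. A minimal numpy sketch with an illustrative 2 × 2 matrix:

import numpy as np

# Illustrative positive definite Sigma (not from the handout).
Sigma = np.array([[4.0, 1.0],
                  [1.0, 2.0]])

eigvals, P = np.linalg.eigh(Sigma)             # Sigma = P diag(eigvals) P'
Sigma_inv_half = P @ np.diag(eigvals**-0.5) @ P.T

# Check: Sigma^{-1/2} Sigma Sigma^{-1/2} should be the identity.
print(np.round(Sigma_inv_half @ Sigma @ Sigma_inv_half, 10))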

• To find the joint moment generating function of Y ∼ Nn(μ, Σ) we use the transformation Y = Σ^{1/2}Z + μ to get MY(t) = e^{t′μ + (1/2)t′Σt}.
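A quick Monte Carlo check of this closed form (illustrative μ, Σ, and t, not from the handout): estimate E[e^{t′Y}] from simulated draws and compare it with e^{t′μ + (1/2)t′Σt}:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative mu, Sigma, and evaluation point t.
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
t = np.array([0.3, 0.2])

# Closed form: M_Y(t) = exp(t'mu + t'Sigma t / 2).
closed_form = np.exp(t @ mu + 0.5 * t @ Sigma @ t)

# Monte Carlo estimate of E[exp(t'Y)].
Y = rng.multivariate_normal(mu, Sigma, size=200_000)
estimate = np.exp(Y @ t).mean()

print(closed_form, estimate)   # the two agree up to simulation error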

Theorem 1
Let Y ∼ Nn(μ,Σ), and let A be an m×n matrix of rank m and c be an m×1 vector. Then AY + c ∼ Nm(Aμ + c, AΣA′).
Proof
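The proof is left for lecture; as an empirical sanity check, the following sketch simulates AY + c for illustrative μ, Σ, A, and c (hypothetical values, not from the handout) and compares the sample mean and covariance with Aμ + c and AΣA′:

import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: n = 3, m = 2.
mu = np.array([1.0, 0.0, 2.0])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
A = np.array([[1.0, 1.0,  0.0],
              [0.0, 1.0, -1.0]])   # m x n, rank m = 2
c = np.array([1.0, -1.0])

W = rng.multivariate_normal(mu, Sigma, size=200_000) @ A.T + c

print(W.mean(axis=0))              # close to A mu + c
print(A @ mu + c)
print(np.cov(W, rowvar=False))     # close to A Sigma A'
print(A @ Sigma @ A.T)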

Theorem 2
Let Y ∼ Nn(μ, Σ). Sub-vectors of Y follow the multivariate normal distribution, and linear combinations of Y1, Y2, . . . , Yn follow the univariate normal distribution.
Proof
Suppose Y, μ, and Σ are partitioned as follows:

Y = ( Q1 ; Q2 ),   μ = ( μ1 ; μ2 ),   Σ = ( Σ11 Σ12 ; Σ21 Σ22 ),

where Q1 is p × 1. We will show that Q1 ∼ Np(μ1, Σ11) and Q2 ∼ Nn−p(μ2, Σ22). The result follows directly by using the previous theorem with A = (Ip, 0). For a linear combination of Y1, Y2, . . . , Yn, i.e. a1Y1 + a2Y2 + . . . + anYn = a′Y, the matrix A of Theorem 1 is a vector, and therefore a′Y ∼ N(a′μ, a′Σa).
Example

Let Y = (Y1, Y2, Y3, Y4, Y5)′, μ = (μ1, μ2, μ3, μ4, μ5)′, and

Σ = ( σ1²  σ12  σ13  σ14  σ15 ;
      σ21  σ2²  σ23  σ24  σ25 ;
      σ31  σ32  σ3²  σ34  σ35 ;
      σ41  σ42  σ43  σ4²  σ45 ;
      σ51  σ52  σ53  σ54  σ5² ).

Then, if Q1 = ( Y1 ; Y2 ), it follows that Q1 ∼ N2( ( μ1 ; μ2 ), ( σ1² σ12 ; σ21 σ2² ) ).
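In code, reading off the distribution of a sub-vector amounts to selecting the matching entries of μ and the matching block of Σ, exactly as the theorem with A = (Ip, 0) says. A minimal numpy sketch with illustrative numbers (not from the handout):

import numpy as np

# Illustrative 5-dimensional parameters.
mu = np.array([1.0, 2.0, 0.0, -1.0, 3.0])
Sigma = np.diag([2.0, 1.5, 1.0, 2.5, 1.2])
Sigma[0, 1] = Sigma[1, 0] = 0.8   # sigma_12 = sigma_21

# Q1 = (Y1, Y2)': select the first two entries of mu and the
# leading 2x2 block of Sigma.
idx = [0, 1]
print(mu[idx])                    # (mu_1, mu_2)
print(Sigma[np.ix_(idx, idx)])    # [[sigma_1^2, sigma_12], [sigma_21, sigma_2^2]]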

Statistical independence
Suppose Y, μ, Σ are partitioned as in Theorem 2. We say that Q1, Q2 are statistically independent if and only if Σ12 = 0. We can show this using the joint moment generating function of Y. Recall that the exponent of the joint moment generating function of the multivariate normal distribution is t′μ + (1/2)t′Σt, which after partitioning t conformably (according to the partitioning of Y, μ, Σ) can be expressed as

t1′μ1 + t2′μ2 + (1/2)t1′Σ11t1 + (1/2)t2′Σ22t2 + t1′Σ12t2.

When Σ12 = 0, the joint moment generating function can be expressed as the product of the two marginal moment generating functions of Q1 and Q2, i.e. MY(t) = MQ1(t1)MQ2(t2); therefore, Q1 and Q2 are independent.
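The factorization is easy to see numerically: with an illustrative block-diagonal Σ (so Σ12 = 0; all values hypothetical), the exponent splits and the joint MGF equals the product of the marginal MGFs at any t:

import numpy as np

# Block-diagonal Sigma, partitioned 2 + 1.
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.0],
                  [0.0, 0.0, 3.0]])
t = np.array([0.2, -0.1, 0.4])
t1, t2 = t[:2], t[2:]
mu1, mu2 = mu[:2], mu[2:]
S11, S22 = Sigma[:2, :2], Sigma[2:, 2:]

joint = np.exp(t @ mu + 0.5 * t @ Sigma @ t)                # M_Y(t)
product = np.exp(t1 @ mu1 + 0.5 * t1 @ S11 @ t1) * \
          np.exp(t2 @ mu2 + 0.5 * t2 @ S22 @ t2)            # M_Q1(t1) M_Q2(t2)
print(joint, product)   # identical since Sigma_12 = 0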
Theorem 3
Using Theorem 1 and the statement about statistical independence above, we prove the following theorem. Suppose Y ∼ Nn(μ, Σ) and define the following two vectors: W1 = AY and W2 = BY. Then W1 and W2 are independent if and only if cov(W1, W2) = AΣB′ = 0.
Proof
We stack the two vectors as follows: W = ( W1 ; W2 ) = ( A ; B ) Y = LY. Therefore, using Theorem 1 we find that W ∼ N(Lμ, LΣL′), or

W ∼ N( ( A ; B ) μ, ( AΣA′ AΣB′ ; BΣA′ BΣB′ ) ),

and we conclude that W1 and W2 are independent if and only if AΣB′ = 0.
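A concrete special case as a sketch (hypothetical choice, not from the handout): with Σ = I, A = (1, 1), and B = (1, −1) we get AΣB′ = 0, so W1 = Y1 + Y2 and W2 = Y1 − Y2 should be independent; the simulated cross-covariance is near zero:

import numpy as np

rng = np.random.default_rng(2)

Sigma = np.eye(2)
A = np.array([[1.0,  1.0]])
B = np.array([[1.0, -1.0]])
print(A @ Sigma @ B.T)          # [[0.]], so Theorem 3 applies

Y = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
W1, W2 = Y @ A.T, Y @ B.T
print(np.cov(W1.ravel(), W2.ravel())[0, 1])   # near 0, as predicted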
Conditional probability density functions for multivariate normal
Consider the bivariate normal distribution (see page 1). From Theorem 1 it follows that Y1 ∼ N(μ1, σ1²). This is also called the marginal probability distribution of Y1. We want to find the conditional distribution of Y2 given Y1.
From the conditional probability law, fY2|Y1(y2|y1) = fY1,Y2(y1, y2) / fY1(y1), and after substituting the bivariate density and the marginal density it can be shown that the conditional probability density function of Y2 given Y1 is given by
fY2|Y1(y2|y1) = [1 / √(2πσ2²(1 − ρ²))] exp{ −(1/2) [y2 − μ2 − ρ(σ2/σ1)(y1 − μ1)]² / (σ2²(1 − ρ²)) }.

We recognize that this is a normal probability density function with mean μY2|Y1 = μ2 + ρ(σ2/σ1)(Y1 − μ1) and variance σ²Y2|Y1 = σ2²(1 − ρ²).
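A crude simulation check of these formulas (illustrative parameter values, not from the handout): keep only draws whose Y1 lands in a narrow band around an observed value and compare the sample mean and variance of Y2 within the band with μY2|Y1 and σ2²(1 − ρ²):

import numpy as np

rng = np.random.default_rng(3)

mu1, mu2, sigma1, sigma2, rho = 0.0, 1.0, 2.0, 1.5, 0.6
Sigma = np.array([[sigma1**2,             rho * sigma1 * sigma2],
                  [rho * sigma1 * sigma2, sigma2**2]])

y1_obs = 1.0
cond_mean = mu2 + rho * (sigma2 / sigma1) * (y1_obs - mu1)
cond_var = sigma2**2 * (1 - rho**2)

# Keep draws whose Y1 falls in a narrow band around y1_obs.
Y = rng.multivariate_normal([mu1, mu2], Sigma, size=2_000_000)
band = Y[np.abs(Y[:, 0] - y1_obs) < 0.01, 1]
print(cond_mean, band.mean())   # close
print(cond_var, band.var())     # close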

In general:
Suppose that Y, μ, and Σ are partitioned as follows: Y = ( Q1 ; Q2 ), μ = ( μ1 ; μ2 ), Σ = ( Σ11 Σ12 ; Σ21 Σ22 ), and Y ∼ N(μ, Σ). It can be shown that the conditional distribution of Q1 given Q2 is also multivariate normal, Q1|Q2 ∼ N(μ1|2, Σ1|2), where μ1|2 = μ1 + Σ12Σ22^{−1}(Q2 − μ2) and Σ1|2 = Σ11 − Σ12Σ22^{−1}Σ21.
Proof:
Let U = Q1 − Σ12Σ22^{−1}Q2 and V = Q2.
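Numerically, the conditional mean and covariance translate directly into matrix operations. A minimal sketch, with an illustrative 3-dimensional Y (values hypothetical) partitioned as Q1 = Y1 and Q2 = (Y2, Y3)′:

import numpy as np

mu = np.array([1.0, 0.0, 2.0])
Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
mu1, mu2 = mu[:1], mu[1:]
S11, S12 = Sigma[:1, :1], Sigma[:1, 1:]
S21, S22 = Sigma[1:, :1], Sigma[1:, 1:]

q2_obs = np.array([0.5, 1.0])   # hypothetical observed value of Q2
S22_inv = np.linalg.inv(S22)

mu_cond = mu1 + S12 @ S22_inv @ (q2_obs - mu2)     # mu_{1|2}
Sigma_cond = S11 - S12 @ S22_inv @ S21             # Sigma_{1|2}
print(mu_cond, Sigma_cond)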

Example 1
Suppose the prices (in $), Y1, Y2, Y3, Y4, of objects A, B, C, and D are jointly normally distributed as Y ∼ N4(μ, Σ), where

Y = ( Y1 ; Y2 ; Y3 ; Y4 ),   μ = ( 1 ; 3 ; 6 ; 4 ),   and   Σ = ( 3 2 3 3 ; 2 5 5 4 ; 3 5 9 5 ; 3 4 5 6 ).
Answer the following questions:
a. Suppose a person wants to buy three of product A, four of product B, and one of product C. Find the probability that the person will spend more than $30.
b. Find the moment generating function of Y1.
c. Find the joint moment generating function of (Y1,Y3).
d. Find the correlation coefficient between Y3 and Y4.
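A sketch of how part (a) can be set up numerically: the cost is the linear combination a′Y with a = (3, 4, 1, 0)′, so by Theorem 2 it is univariate normal with mean a′μ and variance a′Σa (the numbers below follow from the μ and Σ given above):

import numpy as np
from scipy.stats import norm

mu = np.array([1.0, 3.0, 6.0, 4.0])
Sigma = np.array([[3.0, 2.0, 3.0, 3.0],
                  [2.0, 5.0, 5.0, 4.0],
                  [3.0, 5.0, 9.0, 5.0],
                  [3.0, 4.0, 5.0, 6.0]])

# Part (a): W = 3Y1 + 4Y2 + Y3 = a'Y, so W ~ N(a'mu, a'Sigma a).
a = np.array([3.0, 4.0, 1.0, 0.0])
mean_W = a @ mu            # 21.0
var_W = a @ Sigma @ a      # 222.0
print(1 - norm.cdf(30, loc=mean_W, scale=np.sqrt(var_W)))   # about 0.273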
Example 2
 Y1   2   2 1 1 
SupposeY∼N3(μ,Σ),whereY= Y2 ,μ= 1 ,andΣ= 1 3 0 . Findthejoint Y3 2 101
distributionofQ1 =Y1+Y2+Y3 andQ2 =Y1−Y2. Example 3
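A minimal numpy sketch for this example: stacking Q1 and Q2 as W = AY and applying Theorem 1 gives the joint distribution directly (note the off-diagonal entry of AΣA′, which settles whether Q1 and Q2 are independent):

import numpy as np

mu = np.array([2.0, 1.0, 2.0])
Sigma = np.array([[2.0, 1.0, 1.0],
                  [1.0, 3.0, 0.0],
                  [1.0, 0.0, 1.0]])

# Q1 = Y1 + Y2 + Y3 and Q2 = Y1 - Y2 stack into W = AY, so by
# Theorem 1, W ~ N_2(A mu, A Sigma A').
A = np.array([[1.0,  1.0, 1.0],
              [1.0, -1.0, 0.0]])
print(A @ mu)            # mean vector of (Q1, Q2)'
print(A @ Sigma @ A.T)   # covariance matrix of (Q1, Q2)'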
Example 3
Answer the following questions:
a. Let X ∼ Nn(μ1, Σ), where 1 = (1, 1, . . . , 1)′ and Σ is the variance-covariance matrix of X. Let Σ = (1 − ρ)I + ρJ, with ρ > −1/(n − 1), where I is the n × n identity matrix and J is the n × n matrix of ones. Therefore, when ρ = 0 we have X ∼ Nn(μ1, I), and in this case we showed in class that X̄ and Σ(Xi − X̄)² are independent. Are they independent when ρ ≠ 0?
b. Suppose ε ∼ N3(0, σ²I3) and that Y0 ∼ N(0, σ²), independently of the εi's. Therefore the vector (Y0, ε1, ε2, ε3)′ is multivariate normal. Define Yi = ρYi−1 + εi for i = 1, 2, 3. Express Y1, Y2, Y3 in terms of ρ, Y0, and the εi's.
c. Refer to part (b). Find the covariance matrix of Y = (Y1, Y2, Y3)′.
d. What is the distribution of Y?
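For parts (b) through (d), unrolling the recursion shows that Y is a linear function of the normal vector (Y0, ε1, ε2, ε3)′, hence multivariate normal. A simulation sketch with illustrative ρ and σ (hypothetical values) that estimates the covariance matrix asked for in part (c):

import numpy as np

rng = np.random.default_rng(4)

rho, sigma, n_draws = 0.5, 1.0, 500_000

Y0 = rng.normal(0.0, sigma, size=n_draws)
eps = rng.normal(0.0, sigma, size=(n_draws, 3))

# Unrolling Y_i = rho * Y_{i-1} + eps_i gives
#   Y1 = rho Y0 + eps1
#   Y2 = rho^2 Y0 + rho eps1 + eps2
#   Y3 = rho^3 Y0 + rho^2 eps1 + rho eps2 + eps3,
# a linear map of (Y0, eps1, eps2, eps3)'.
Y1 = rho * Y0 + eps[:, 0]
Y2 = rho * Y1 + eps[:, 1]
Y3 = rho * Y2 + eps[:, 2]
print(np.cov(np.column_stack([Y1, Y2, Y3]), rowvar=False))   # estimate of cov(Y)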