
6CCS3AIN, Tutorial 05 Answers (Version 1.0)
1. The exact numbers are calculated in PCA_tutorial_example.py. We will use approximations here.
Our data matrix is
$$X = \begin{pmatrix} 4 & 2 & 3 \\ 6 & 1 & 3 \\ 4 & 2 & 5 \\ 7 & 8 & 3 \end{pmatrix}$$
(a) Step 1: Compute the mean row vector
$$\bar{x}^T = \begin{pmatrix} 5.25 & 3.25 & 3.5 \end{pmatrix}$$
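This step can be checked in NumPy (a minimal sketch, not necessarily the code in PCA_tutorial_example.py; variable names are illustrative):

```python
import numpy as np

# Data matrix X: n = 4 datapoints (rows), d = 3 features (columns)
X = np.array([[4, 2, 3],
              [6, 1, 3],
              [4, 2, 5],
              [7, 8, 3]], dtype=float)

# Step 1: mean row vector (average over the rows, i.e. axis=0)
x_bar = X.mean(axis=0)   # array([5.25, 3.25, 3.5])
```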
1 1

1
(c) Step 3: Subtract mean (obtain mean centred data)
5.25 3.25 3.5
 5.25 3.25 3.5    5.25 3.25 3.5
X ̄=.·x ̄T=5.25 3.25 3.5
 −1.25 −1.25 −0.5 
0.75 −2.25 −0.5
 −1.25 −1.25 1.5  1.75 4.75 −0.5
The dimensions are n × d
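Steps 2 and 3 in NumPy (a sketch continuing the snippet above; in practice X - x_bar would also work via broadcasting, without building the mean row matrix explicitly):

```python
# Step 2: mean row matrix, the outer product of a ones column with the mean row vector
ones = np.ones((X.shape[0], 1))        # n x 1 column vector of ones
X_bar = ones @ x_bar.reshape(1, -1)    # n x d; every row equals the mean row vector

# Step 3: mean-centred data
B = X - X_bar
# [[-1.25 -1.25 -0.5 ]
#  [ 0.75 -2.25 -0.5 ]
#  [-1.25 -1.25  1.5 ]
#  [ 1.75  4.75 -0.5 ]]
```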
(d) Step 4: Compute the covariance matrix of rows of B
1.68 2.44 −0.63 C=n1BTB= 2.43 7.69 −0.63,
−0.63 −0.63 0.75 where n = 4 The dimensions are (d × n) × (n × d) = d × d
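Step 4 in NumPy (continuing the sketch; note that np.cov defaults to the n − 1 denominator, so bias=True is needed to match the 1/n definition used here):

```python
# Step 4: covariance matrix of the rows of B (1/n convention)
n = B.shape[0]
C = (B.T @ B) / n
# Equivalent: C = np.cov(X, rowvar=False, bias=True)
# Exact values:
# [[ 1.6875  2.4375 -0.625 ]
#  [ 2.4375  7.6875 -0.625 ]
#  [-0.625  -0.625   0.75  ]]
```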
(e) Step 5: Compute the k largest eigenvectors v_1, v_2, …, v_k (how to compute these is not covered in this module; use Python or WolframAlpha).
Each eigenvector has dimensions d × 1.
$$v_1 \approx \begin{pmatrix} -0.34 \\ -0.94 \\ 0.1 \end{pmatrix}, \qquad v_2 \approx \begin{pmatrix} -0.71 \\ 0.32 \\ 0.63 \end{pmatrix}$$
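Step 5 in NumPy (a sketch; np.linalg.eigh is the appropriate routine since C is symmetric, and it returns eigenvalues in ascending order, so we sort them descending):

```python
# Step 5: eigendecomposition of the symmetric matrix C
eigvals, eigvecs = np.linalg.eigh(C)   # columns of eigvecs are eigenvectors
order = np.argsort(eigvals)[::-1]      # indices by descending eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

v1 = eigvecs[:, 0]   # approx. (-0.34, -0.94, 0.1), possibly with flipped sign
v2 = eigvecs[:, 1]   # approx. (-0.71, 0.32, 0.63), possibly with flipped sign
```

Eigenvectors are only defined up to sign, so NumPy may return the negatives of the vectors above; this does not change the subspace found by PCA.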
(f) Step 6: Compute the matrix W of the k largest eigenvectors
$$W = \begin{pmatrix} -0.34 & -0.71 \\ -0.94 & 0.32 \\ 0.1 & 0.63 \end{pmatrix}$$
Dimensions of W are (d × k).
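Step 6 is then just a matter of keeping the first k columns (continuing the sketch above):

```python
# Step 6: d x k projection matrix whose columns are v1, ..., vk
k = 2
W = eigvecs[:, :k]
```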
(g) Step 7: Multiply each datapoint x_i for i ∈ {1, 2, …, n} with W^T
We have, e.g.,
$$W^T = \begin{pmatrix} -0.34 & -0.94 & 0.1 \\ -0.71 & 0.32 & 0.63 \end{pmatrix}, \qquad x_1 = \begin{pmatrix} 4 \\ 2 \\ 3 \end{pmatrix}$$
So, with $y_i = W^T \cdot x_i$:
$$y_1 \approx \begin{pmatrix} -2.92 \\ -0.3 \end{pmatrix}, \quad y_2 \approx \begin{pmatrix} -2.66 \\ -2.03 \end{pmatrix}, \quad y_3 \approx \begin{pmatrix} -2.72 \\ 0.97 \end{pmatrix}, \quad y_4 \approx \begin{pmatrix} -9.55 \\ -0.48 \end{pmatrix}$$
Dimensions of y_i are (k × d) × (d × 1) = k × 1.
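Step 7 in NumPy. Following the worked numbers above, the original (uncentred) datapoints are the ones projected; doing all n at once is a single matrix product:

```python
# Step 7: project every datapoint; row i of Y is the transpose of W^T x_i
Y = X @ W
# approx. [[-2.92, -0.3 ],
#          [-2.66, -2.03],
#          [-2.72,  0.97],
#          [-9.55, -0.48]]   (up to the sign ambiguity noted above)

y1 = W.T @ X[0]   # a single point, e.g. x1; shape (k,)
```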