
https://xkcd.com/1775/
https://xkcd.com/419/

Principal Component Analysis


● Brief recap: eigenvalues and eigenvectors
● PCA with the maximum variance formulation
● PCA with the minimum error formulation
● Applications
● Extensions of PCA (kernel PCA, probabilistic PCA; see the assignment)

Eigenvectors of a real, symmetric matrix (Appendix C)
A: an M-by-M real, symmetric matrix, with eigenvector equation A u_i = \lambda_i u_i, where u_i is an M-by-1 vector.
● The eigenvalues of A are real.
● The eigenvectors can be chosen to be orthonormal, i.e. U^T U = I, so U is a rotation.
● The determinant of A is the product of its eigenvalues.
● A is positive semi-definite if all eigenvalues are >= 0.
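As a quick numerical check (my own illustration, not from the slides), a minimal NumPy sketch of these properties for a random symmetric matrix:

import numpy as np

# Build a random real symmetric (and positive semi-definite) matrix A = B B^T.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T

# eigh is specialised for symmetric matrices: eigenvalues come back real, in ascending order.
eigvals, U = np.linalg.eigh(A)

print(eigvals)                                          # all real and >= 0 (PSD)
print(np.allclose(U.T @ U, np.eye(4)))                  # eigenvectors are orthonormal
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))   # det(A) = product of eigenvalues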

Motivation: continuous latent variable models (cf. discrete latent variables)
● A dataset often has only a small number of degrees of freedom:
  ○ due to the underlying physical process (e.g. oil flow)
  ○ due to a few inherent transformations (e.g. written digits)
  → the data lives in a low-dimensional subspace (manifold)
● Some dimensions can be redundant w.r.t. each other
  ○ Fewer dimensions may be easier to explain
  ○ The transformation from all to fewer dimensions may yield new insights
● Reducing the number of dimensions could help downstream modelling, e.g. clustering, or unsupervised pre-training before subsequent supervised training (e.g. regression)
● Sometimes we can only visualise / plot in low dimensions
  https://projector.tensorflow.org/

Principal component analysis
● Dimensionality reduction by a (linear) transformation.
● The simplest continuous latent variable model: both observations and latent variables are Gaussian.
● Probabilistic and kernelised versions are available; connects to factor analysis, independent component analysis, ...

Principal component analysis – historical note
[Hotelling 1933] PCA can be defined as the orthogonal projection of the data onto a lower-dimensional linear space, known as the principal subspace, such that the variance of the projected data is maximized.
[Pearson 1901] Equivalently, it can be defined as the linear projection that minimizes the average projection cost, defined as the mean squared distance between the data points and their projections.
The "go-to" baseline for understanding high-dimensional data.

Maximum variance formulation
Dataset {x_n}, n = 1, 2, ..., N, where each x_n is a D-dimensional vector.
Goal: find the "best" projection direction u_1 (normalised so that u_1^T u_1 = 1), such that the variance of the projected data u_1^T x_n is maximized.
Solution: the projected variance is u_1^T S u_1, where S is the data covariance matrix; it is maximized when u_1 is the eigenvector of S corresponding to the largest eigenvalue, and the maximum variance equals that eigenvalue \lambda_1.
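A short derivation of this solution (the standard Lagrange-multiplier argument, written out here in LaTeX for completeness):

% Maximise the projected variance subject to the unit-norm constraint,
% with Lagrange multiplier \lambda_1:
\max_{u_1}\; u_1^\top S u_1 + \lambda_1 \left( 1 - u_1^\top u_1 \right)
% Setting the derivative w.r.t. u_1 to zero gives an eigenvalue problem:
S u_1 = \lambda_1 u_1
\quad\Rightarrow\quad
u_1^\top S u_1 = \lambda_1 ,
% so the variance is maximised by choosing the eigenvector with the largest eigenvalue.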

How to implement it?
● Evaluate the mean x̄ of the data set, remove it, and compute the covariance matrix S.
● What about subsequent dimensions? Take the eigenvectors corresponding to the next-largest eigenvalues, in order.
● Find the M eigenvectors of S corresponding to the M largest eigenvalues (rather than all eigenvalue/eigenvector pairs).
Computational complexity: computing S is O(N D^2); a full eigendecomposition of S is O(D^3); finding only the leading M eigenvectors can be done in roughly O(M D^2).
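A minimal NumPy sketch of this recipe (illustrative only; the function name, variable names and the synthetic data are my own):

import numpy as np

def pca(X, M):
    """Project the N x D data matrix X onto its M leading principal components."""
    x_bar = X.mean(axis=0)                  # mean of the data set
    Xc = X - x_bar                          # remove the mean
    S = (Xc.T @ Xc) / X.shape[0]            # D x D covariance matrix, O(N D^2)
    eigvals, eigvecs = np.linalg.eigh(S)    # symmetric eigendecomposition, O(D^3)
    order = np.argsort(eigvals)[::-1][:M]   # indices of the M largest eigenvalues
    U = eigvecs[:, order]                   # D x M matrix of principal directions
    Z = Xc @ U                              # N x M projected coordinates
    return U, Z, eigvals[order]

# Example: 500 points in 5-D that mostly vary along 2 directions.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2)) @ rng.standard_normal((2, 5)) \
    + 0.05 * rng.standard_normal((500, 5))
U, Z, lam = pca(X, M=2)
print(lam)   # two dominant eigenvalues; the remaining three are near zero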

Orthonormal basis and projecting data
A complete set of orthonormal basis vectors {u_i}, i = 1, ..., D, satisfies u_i^T u_j = \delta_{ij}. Every data point x_n can then be uniquely expressed as
x_n = \sum_{i=1}^{D} (x_n^T u_i) u_i .

Minimum error formulation
Approximate x_n using up to M exact coordinates: M data-dependent coefficients z_{ni}, and D - M data-independent ones b_i,
\tilde{x}_n = \sum_{i=1}^{M} z_{ni} u_i + \sum_{i=M+1}^{D} b_i u_i .
Squared loss: J = \frac{1}{N} \sum_{n=1}^{N} \| x_n - \tilde{x}_n \|^2 .
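Minimising J gives the standard result (stated here for completeness, following the minimum-error derivation in Bishop's PRML):

z_{ni} = x_n^\top u_i , \qquad b_i = \bar{x}^\top u_i ,
\qquad
J = \sum_{i=M+1}^{D} u_i^\top S u_i = \sum_{i=M+1}^{D} \lambda_i ,
% so J is minimised by discarding the directions with the smallest eigenvalues;
% the principal subspace is again spanned by the top-M eigenvectors of S.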

Principal Component Analysis
● Brief recap: eigenvalues and eigenvectors
● PCA with the maximum variance formulation
● PCA with the minimum error formulation
● Applications
● Extensions of PCA

Applications of PCA

PCA for image compression: original vs. compressed reconstructions at different rates (bits per pixel).
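A hedged sketch of this idea (my own illustration, not the slide's code): compress images stored as rows of X by keeping only the top-M principal components, reusing the pca() helper from the sketch above.

import numpy as np

def compress_reconstruct(X, M):
    """Keep only M principal components of the rows of X (N x D) and reconstruct."""
    x_bar = X.mean(axis=0)
    U, Z, _ = pca(X, M)               # pca() as defined in the earlier sketch
    X_hat = x_bar + Z @ U.T           # reconstruction: x~_n = x_bar + sum_i z_ni u_i
    mse = np.mean((X - X_hat) ** 2)   # average squared reconstruction error
    return X_hat, mse

Fewer components means fewer stored coefficients per image (higher compression), at the cost of a larger reconstruction error.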

PCA for data transformation (whitening)
Rotate the data into the eigenvector basis and rescale each coordinate by 1/\sqrt{\lambda_i}: the transformed data has zero mean and identity covariance, i.e. an isotropic, Gaussian-like spread.
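A minimal whitening sketch (assuming whitening is what the slide's figure illustrated; the names are mine, and pca() is the helper from above):

import numpy as np

def whiten(X, eps=1e-8):
    """Rotate into the PCA basis and rescale so the result has identity covariance."""
    U, Z, lam = pca(X, M=X.shape[1])      # keep all D components
    Y = Z / np.sqrt(lam + eps)            # y_ni = z_ni / sqrt(lambda_i)
    return Y

# After whitening, np.cov(Y.T) is approximately the identity matrix.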

PCA is not designed to separate classes

PCA for high-dimensional data, D > N (e.g. gene-expression data)
Ordinarily: eigendecomposition of the D-by-D covariance matrix S costs O(D^3), and at least D - N + 1 of its eigenvalues are zero, since rank(S) <= min{D, N}.
Trick: instead decompose the N-by-N matrix (1/N) X X^T, where X is the N-by-D centred data matrix. It has the same non-zero eigenvalues! What about the eigenvectors?

If (1/N) X X^T v_i = \lambda_i v_i, then u_i \propto X^T v_i are eigenvectors of S with the same eigenvalues \lambda_i.
Normalise to ||u_i|| = 1, i.e. u_i = X^T v_i / \sqrt{N \lambda_i}, and choose the v_i orthonormal so that this is true for the recovered eigenvectors u_i as well.
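A sketch of this trick in NumPy (illustrative; the function name, variable names and the synthetic D > N data are assumptions of mine):

import numpy as np

def pca_high_dim(X, M):
    """PCA when D > N, via the N x N Gram matrix instead of the D x D covariance."""
    N, D = X.shape
    Xc = X - X.mean(axis=0)
    G = (Xc @ Xc.T) / N                    # N x N matrix (1/N) X X^T, O(N^2 D)
    lam, V = np.linalg.eigh(G)             # same non-zero eigenvalues as S
    order = np.argsort(lam)[::-1][:M]
    lam, V = lam[order], V[:, order]
    U = Xc.T @ V / np.sqrt(N * lam)        # recover u_i = X^T v_i / sqrt(N * lambda_i)
    return U, lam

# Example: N = 20 samples in D = 1000 dimensions.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 1000))
U, lam = pca_high_dim(X, M=5)
print(np.allclose(U.T @ U, np.eye(5)))     # recovered directions are orthonormal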

Extensions of PCA

Principal Component Analysis
● Brief recap: eigenvalues and eigenvectors
● PCA with the maximum variance formulation
● PCA with the minimum error formulation
● Applications
● Extensions of PCA

PCA vs Fourier transforms (bonus, not in the book)
● Both are linear transforms onto an orthonormal basis; the Fourier transform is complex-valued, and the cosine transform is its real-valued counterpart.
● The Fourier basis is fixed in advance and ordered from low frequency to high frequency; it is not adapted to the data.
● The PCA basis is learned from the data.
