Principal Components Analysis
Factor Analysis

Principal Components Analysis
A technique for reorienting the data (scatter plot) so that the first few dimensions (axes) account for as much of the available information (variance) as possible
If there is substantial redundancy (correlation) present in the data set, it may be possible to account for most of the information with a relatively small number of dimensions
Each principal component is a linear combination (i.e., weighted sum) of the original variables
The dimensions/components derived by PCA are orthogonal: each component is uncorrelated with all of the others
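
A minimal NumPy sketch of this idea: centre the data, eigendecompose the covariance matrix, and form the component scores as weighted sums of the original variables (the data set and variable names below are simulated purely for illustration).

import numpy as np

# Simulated data: 100 cases, 3 variables, with some built-in redundancy
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X[:, 2] = 0.8 * X[:, 0] + 0.2 * rng.normal(size=100)

Xc = X - X.mean(axis=0)                  # centre each variable
cov = np.cov(Xc, rowvar=False)           # covariance matrix of the variables
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order

order = np.argsort(eigvals)[::-1]        # re-order so component 1 has the largest variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                    # component scores: weighted sums of the variables
print(eigvals / eigvals.sum())           # proportion of variance accounted for by each component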

Principal Components Analysis
Illustration 1

Projection of points on a line
(i.e., dropping a perpendicular from each data point onto the line)

Source of image: http://georgemdallas.wordpress.com/2013/10/30/principal-component-analysis-4-dummies-eigenvectors-eigenvalues-and-dimension-reduction/


The data points projected onto this line have a larger variance (the square of the standard deviation)

Source of image: http://georgemdallas.wordpress.com/2013/10/30/principal-component-analysis-4-dummies-eigenvectors-eigenvalues-and-dimension-reduction/
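
A small sketch of this projection step: each 2-D point is projected onto a unit direction vector, and the variance of the projections depends on the direction chosen. The data are simulated, and the direction (1, 0.6) is just an illustrative guess at the main spread.

import numpy as np

# Simulated 2-D scatter with correlated x and y
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.6 * x + 0.3 * rng.normal(size=200)
pts = np.column_stack([x, y])
pts -= pts.mean(axis=0)

def projected_variance(direction):
    u = direction / np.linalg.norm(direction)  # unit vector along the line
    proj = pts @ u                             # signed position of each projected point on the line
    return proj.var()                          # variance (squared std dev.) of the projections

print(projected_variance(np.array([1.0, 0.0])))  # variance of projections onto the x-axis
print(projected_variance(np.array([1.0, 0.6])))  # larger: this line follows the main spread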

Principal Components Analysis
Illustration 2

Principal Components Analysis
Principal component loadings are the correlations between the principal components and the original variables
For a particular original variable, the square of the correlation (r squared) gives the proportion of its variance accounted for by each principal component
Because the principal components are uncorrelated, you can add up the r-squared values to obtain the proportion of variance accounted for by a set of components
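
A sketch of how loadings and the r-squared values could be computed, assuming the variables are first standardised; the simulated data and variable layout are illustrative only.

import numpy as np

# Simulated data with one redundant variable
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
X[:, 1] = 0.7 * X[:, 0] + 0.3 * rng.normal(size=200)

Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise (z-scores)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = Z @ eigvecs                            # principal component scores

# Loading = correlation between a variable (row) and a component (column)
loadings = np.array([[np.corrcoef(Z[:, j], scores[:, k])[0, 1]
                      for k in range(Z.shape[1])]
                     for j in range(Z.shape[1])])

# Summing r-squared across all components accounts for all of each variable's variance
print((loadings ** 2).sum(axis=1))              # approximately 1.0 for every variable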

Principal Components Analysis
Eigenvalue
The eigenvalue of a principal component tells you the variance of that component's scores
If the original variables are standardised (converted to z-scores), the sum of the variances of the principal components equals the number of variables
How many components should we retain? Kaiser (1960) recommended retaining only principal components with eigenvalues greater than 1
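
A short sketch checking these two facts on simulated data: with standardised variables the eigenvalues of the correlation matrix sum to the number of variables, and Kaiser's rule keeps only those above 1.

import numpy as np

# Simulated data: 5 variables, two of them strongly correlated
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 5))
X[:, 4] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=300)

corr = np.corrcoef(X, rowvar=False)          # correlation matrix = covariance of the z-scores
eigvals = np.linalg.eigvalsh(corr)[::-1]     # eigenvalues, largest first

print(eigvals.sum())                         # equals the number of variables (5)
print(eigvals[eigvals > 1.0])                # components retained under Kaiser's rule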

Principal Components Analysis
To make it easier to interpret the components, it is sometimes desirable to re-orient or rotate the retained components so that the “loadings matrix” takes on a simpler structure
Simple structure means
any single variable is highly correlated with only one (or a few) of the components, and any single component is highly correlated with only a few variables
any pair of components (columns in the loadings matrix) should have different patterns of loadings; otherwise one could not distinguish the two components
The idea is to “rotate” the components so that the projections of each variable onto the components/factors are either large (i.e., near 1 in absolute value) or small (near zero)

Principal Components Analysis
Rotation (cont)
2 kinds of rotations:
Orthogonal rotation preserves the perpendicularity of the components — the components are at right angles to each other and are thus uncorrelated. 2 widely used methods:
Kaiser’s Varimax rotation (which tries to achieve a simple structure by focusing on the columns of the factor loadings matrix) and
Quartimax rotation (which focuses on the rows).
Oblique rotation allows for correlation between the rotated components
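
A sketch of one common formulation of Kaiser's varimax criterion for orthogonal rotation; the function name, convergence settings, and the example loadings matrix are illustrative assumptions, not a definitive implementation.

import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    # Rotate a (variables x components) loadings matrix toward simple structure
    # while keeping the rotation orthogonal (components stay uncorrelated).
    p, k = loadings.shape
    R = np.eye(k)                                  # accumulated rotation matrix
    d_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        grad = loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt                                 # nearest orthogonal matrix to the gradient
        d_new = s.sum()
        if d_new < d_old * (1 + tol):              # stop when the criterion stops improving
            break
        d_old = d_new
    return loadings @ R

# Illustrative unrotated loadings: every variable loads moderately on both components
A = np.array([[0.7, 0.5],
              [0.8, 0.4],
              [0.4, 0.7],
              [0.3, 0.8]])
print(np.round(varimax(A), 2))                     # loadings pushed toward 0 or +/-1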

Factor Analysis
Factor analysis is related to principal components analysis, but identifies the underlying sources of variance common to two or more variables (called common factors)
Assumes that the observed variance in each variable is attributable to
a small number of common factors (i.e. unobservable characteristics common to two or more variables) and
a single specific factor (unrelated to any other underlying factor in the model).
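
A hedged sketch of this common-factor model using scikit-learn's FactorAnalysis (assumed available); the number of factors, loadings, and noise level below are made up for illustration.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate 6 observed variables driven by 2 common factors plus specific noise
rng = np.random.default_rng(4)
factors = rng.normal(size=(300, 2))                 # unobserved common factors
W = rng.normal(size=(2, 6))                         # true loadings
X = factors @ W + 0.5 * rng.normal(size=(300, 6))   # observed = common part + specific part

fa = FactorAnalysis(n_components=2).fit(X)
print(fa.components_)        # estimated loadings of each common factor on the variables
print(fa.noise_variance_)    # variance attributed to each variable's specific factor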

Factor Analysis
Eigenvalues & rotation
The eigenvalues look different from those of a principal components analysis
Not all of the eigenvalues are positive
They no longer sum to the number of variables in the analysis
Reason: the eigenvalues now reflect the proportion of the common variance underlying the variables
The same methods of component rotation can be applied to the common factors
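
One way to see why, sketched below under the assumption that communalities are estimated by squared multiple correlations (a common choice in principal-axis factoring, not specified in these slides): the diagonal of the correlation matrix is replaced by the communality estimates, so the eigenvalues sum to the estimated common variance and need not all be positive.

import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 5))
X[:, 1] = 0.7 * X[:, 0] + 0.3 * rng.normal(size=300)

R = np.corrcoef(X, rowvar=False)
smc = 1.0 - 1.0 / np.diag(np.linalg.inv(R))  # squared multiple correlations as communality estimates
R_reduced = R.copy()
np.fill_diagonal(R_reduced, smc)             # "reduced" correlation matrix used in common factor analysis

eigvals = np.linalg.eigvalsh(R_reduced)[::-1]
print(eigvals)                               # some eigenvalues may be negative
print(eigvals.sum())                         # sums to the estimated common variance, not to 5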
