
Spectral Theory

Contents
1 Preliminary
    Complex Numbers
    Complex Arrays
2 Eigenvalue Decomposition
3 Singular Value Decomposition
    Singular Value Decomposition: Overview
    Understanding SVD
4 Properties of SVD
    Properties of SVD
    Unitary Diagonalization and SVD
    Reduction of Dimensions

Preliminary

Complex Numbers

Complex Numbers
In what follows, we assume all scalars, vectors, and matrices may be complex.
Notation.
• R: the set of all real numbers
• C: the set of all complex numbers, i.e.,

      {z = x + iy | x, y ∈ R},   where i = √(−1).

Complex Numbers in MATLAB
Let z = x + iy ∈ C.

  MATLAB      Name                  Notation
  real(z)     real part of z        Re z
  imag(z)     imaginary part of z   Im z
  conj(z)     conjugate of z        z̄
  abs(z)      modulus of z          |z|
  angle(z)    argument of z         arg(z)
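A quick check of these functions at the MATLAB prompt (the value z = 3 + 4i is an arbitrary sample):

  z = 3 + 4i;     % sample complex number
  real(z)         % 3
  imag(z)         % 4
  conj(z)         % 3 - 4i
  abs(z)          % 5
  angle(z)        % atan2(4, 3), about 0.9273 radians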

Euler’s Formula
• Recall that the Maclaurin series for e^t is

      e^t = 1 + t + t^2/2 + ⋯ + t^n/n! + ⋯ = \sum_{n=0}^{\infty} t^n/n!,   −∞ < t < ∞.

• Replacing t by it and separating real and imaginary parts (using the cyclic
  behavior of powers of i), we obtain

      e^{it} = \sum_{k=0}^{\infty} \frac{(-1)^k t^{2k}}{(2k)!} + i \sum_{k=0}^{\infty} \frac{(-1)^k t^{2k+1}}{(2k+1)!},

  where the first sum is cos(t) and the second is sin(t).
• The result is Euler's formula:

      e^{it} = cos(t) + i sin(t).
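A quick numerical sanity check of Euler's formula in MATLAB (the value of t is arbitrary):

  t = 0.7;
  lhs = exp(1i*t);             % complex exponential e^{it}
  rhs = cos(t) + 1i*sin(t);    % right-hand side of Euler's formula
  abs(lhs - rhs)               % difference is at roundoff level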

Polar Representation and Complex Exponential
• Polar representation: a complex number z = x + iy ∈ C can be written as

      z = r e^{iθ},   where r = |z| and tan θ = y/x.

• Complex exponentiation:

      e^z = e^{x+iy} = e^x e^{iy} = e^x (cos y + i sin y).
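In MATLAB, abs and angle recover r and θ, and r*exp(1i*theta) rebuilds z; a minimal sketch with an arbitrary sample value:

  z = -1 + 2i;                   % arbitrary sample
  r = abs(z);  theta = angle(z);
  r*exp(1i*theta)                % reproduces z up to roundoff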

Complex Arrays

Complex Vectors
Denote by C^n = C^{n×1} the space of all column vectors of n complex elements.
• The hermitian or conjugate transpose of u ∈ C^n is denoted by u^*:  u^* ∈ C^{1×n}.
• The inner product of u, v ∈ C^n is defined by

      u^* v = \sum_{k=1}^{n} \overline{u_k} v_k.

• The 2-norm for complex vectors is defined in terms of this inner product:

      ‖u‖_2 = √(u^* u).
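In MATLAB, the apostrophe operator ' is the conjugate transpose, so u'*v computes this inner product; a minimal sketch with random test vectors:

  n = 4;
  u = randn(n,1) + 1i*randn(n,1);
  v = randn(n,1) + 1i*randn(n,1);
  u'*v                            % inner product u^* v
  norm(u) - sqrt(real(u'*u))      % ~ 0: the 2-norm comes from the inner product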

Complex Matrices
Denote by C^{m×n} the space of all complex matrices with m rows and n columns.
• The hermitian or conjugate transpose of A ∈ C^{m×n} is denoted by A^*:

      A^* = (Ā)^T = \overline{A^T} ∈ C^{n×m}.

• A unitary matrix is the complex analogue of an orthogonal matrix. If U ∈ C^{n×n} is unitary, then

      U^* U = U U^* = I

  and

      ‖Uz‖_2 = ‖z‖_2   for any z ∈ C^n.
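A minimal sketch of these unitary-matrix properties: qr applied to a random complex matrix yields a unitary factor (all data random, for illustration only):

  n = 5;
  [U,~] = qr(randn(n) + 1i*randn(n));   % U is unitary
  z = randn(n,1) + 1i*randn(n,1);
  norm(U'*U - eye(n))                   % ~ machine precision
  norm(U*z) - norm(z)                   % ~ machine precision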

Complex Matrices: Some Analogies
                   Real                              Complex
  Norm             ‖v‖_2 = √(v^T v)                  ‖u‖_2 = √(u^* u)
  Symmetry         S^T = S (symmetric matrix)        S^* = S (hermitian matrix)
  Orthonormality   Q^T Q = I (orthogonal matrix)     U^* U = I (unitary matrix)
  Householder      H = I − 2 v v^T / (v^T v)         H = I − 2 u u^* / (u^* u)

Eigenvalue Decomposition

Eigenvalue Decomposition
Eigenvalue Problem
Find a scalar eigenvalue λ and an associated nonzero eigenvector v satisfying Av = λv.
• The spectrum of A is the set of all eigenvalues; the spectral radius is max_j |λ_j|.
• The problem is equivalent to finding a nonzero v with

      (A − λI)v = 0,

  i.e., λ is an eigenvalue exactly when A − λI is singular.
• An eigenvalue of A is a root of the characteristic polynomial

      p(λ) = det(A − λI).

Eigenvalue Decomposition (cont’)
Let A ∈ C^{n×n} and suppose that A v_k = λ_k v_k for k = 1, ..., n.
• Then, with V = [v_1  v_2  ⋯  v_n] and D = diag(λ_1, ..., λ_n),

      AV = VD.

• If V is nonsingular, we can further write

      A = V D V^{-1},

  which is called an eigenvalue decomposition (EVD) of A. If v is an eigenvector of A, then so is cv for any c ≠ 0; thus an EVD is not unique.

Eigenvalue Decomposition (cont’)
If A has an EVD, we say that A is diagonalizable; otherwise it is nondiagonalizable.

Theorem 1 (Diagonalizability)
If A ∈ C^{n×n} has n distinct eigenvalues, then A is diagonalizable.

Notes.
• Let A, B ∈ C^{n×n}. We say that B is similar to A if there exists a nonsingular matrix X such that

      B = X A X^{-1}.

• So diagonalizability is similarity to a diagonal matrix.
• Similar matrices share the same eigenvalues.

Calculating EVD in MATLAB
• E = eig(A)
  produces a column vector E containing the eigenvalues of A.
• [V, D] = eig(A)
  produces V and D in an EVD of A, A = V D V^{-1}.
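A quick check that the factors returned by eig reproduce A (random test matrix; V is nonsingular with probability 1 here):

  n = 4;
  A = randn(n);
  [V, D] = eig(A);
  norm(A*V - V*D)      % ~ roundoff level
  norm(A - V*D/V)      % ~ roundoff level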

Understanding EVD: Change of Basis
Let X ∈ C^{n×n} be a nonsingular matrix.
• The columns x_1, x_2, ..., x_n of X form a basis of C^n.
• Any z ∈ C^n is uniquely written as

      z = Xu = u_1 x_1 + u_2 x_2 + ⋯ + u_n x_n.

• The scalars u_1, ..., u_n are called the coordinates of z with respect to the columns of X.
• The vector u = X^{-1} z is the representation of z with respect to the basis consisting of the columns of X.

Upshot
Left-multiplication by X^{-1} performs a change of basis into the coordinates associated with the columns of X.
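In MATLAB, the coordinates u are computed with the backslash operator rather than an explicit inverse; a minimal sketch with a random basis:

  n = 3;
  X = randn(n);          % columns form a basis (nonsingular with probability 1)
  z = randn(n,1);
  u = X \ z;             % coordinates of z with respect to the columns of X
  norm(X*u - z)          % ~ roundoff level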

Understanding EVD: Change of Basis (cont’)
Suppose A ∈ C^{n×n} has an EVD A = V D V^{-1}. Then, for any z ∈ C^n, y = Az can be written as

      V^{-1} y = D (V^{-1} z).

Interpretation
The matrix A is a diagonal transformation in the coordinates with respect to the V-basis.

What Is EVD Good For?
Suppose A ∈ C^{n×n} has an EVD A = V D V^{-1}.
• Economical computation of powers A^k:

      A^k = (V D V^{-1})^k = V D^k V^{-1},

  and D^k is cheap to form since D is diagonal (see the sketch below).
• Analyzing convergence of iterates (x_1, x_2, ...) constructed by

      x_{j+1} = A x_j,   j = 1, 2, ...
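A minimal sketch of the power computation via the EVD (random test matrix; assumes V is well conditioned):

  n = 4;  k = 10;
  A = randn(n);
  [V, D] = eig(A);
  Ak = V * D.^k / V;     % A^k via the EVD; D.^k is an elementwise (diagonal) power
  norm(Ak - A^k)         % small relative to norm(A^k)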

Conditioning of Eigenvalues
Theorem 2 (Bauer-Fike)
Let A ∈ C^{n×n} be diagonalizable, A = V D V^{-1}, with eigenvalues λ_1, ..., λ_n. If μ is an eigenvalue of A + δA for a complex matrix δA, then

      min_{1 ≤ j ≤ n} |μ − λ_j| ≤ κ_2(V) ‖δA‖_2.

Singular Value Decomposition

Singular Value Decomposition: Overview

Singular Value Decomposition
Theorem 3 (SVD)
Let A ∈ C^{m×n}. Then A can be written as

      A = U Σ V^*,                                   (SVD)

where U ∈ C^{m×m} and V ∈ C^{n×n} are unitary and Σ ∈ R^{m×n} is diagonal. If A is real, then so are U and V.

• The columns of U are called the left singular vectors of A;
• The columns of V are called the right singular vectors of A;
• The diagonal entries of Σ, written as σ_1, σ_2, ..., σ_r, for r = min{m, n}, are called the singular values of A; they are nonnegative numbers ordered as

      σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_r ≥ 0.

Singular Value Decomposition (cont’)

Thick vs Thin SVD
Suppose that m > n and observe that the last m − n columns of U multiply only zero rows of Σ. Dropping them gives the thin SVD

      A = Û Σ̂ V^*,

where Û ∈ C^{m×n} consists of the first n columns of U and Σ̂ ∈ R^{n×n} is the upper n × n block of Σ. The full factorization A = U Σ V^* is the thick SVD.

SVD in MATLAB
• Thick SVD: [U,S,V] = svd(A);
• Thin SVD:  [U,S,V] = svd(A, 0);
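A quick look at the factor sizes for a tall matrix (random data with m > n):

  A = randn(6, 3);
  [U1, S1, V1] = svd(A);      % thick: U1 is 6x6, S1 is 6x3
  [U2, S2, V2] = svd(A, 0);   % thin:  U2 is 6x3, S2 is 3x3
  norm(A - U1*S1*V1')         % ~ roundoff level
  norm(A - U2*S2*V2')         % ~ roundoff level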

Understanding SVD

Geometric Perspective
Write A = U Σ V^* as AV = U Σ:

      A v_k = σ_k u_k,   k = 1, ..., r = min{m, n}.

The image of the unit sphere under any m × n matrix is a hyperellipse.

Algebraic Perspective
Alternatively, note that y = Az ∈ C^m for any z ∈ C^n can be written as

      U^* y = Σ (V^* z).

Any matrix A ∈ C^{m×n} can be viewed as a diagonal transformation from C^n (source space) to C^m (target space) with respect to suitably chosen orthonormal bases for both spaces.

SVD vs. EVD
Recall that a diagonalizable A = V D V^{-1} ∈ C^{n×n} satisfies

      y = Az   ⟹   V^{-1} y = D (V^{-1} z).

This allowed us to view any diagonalizable square matrix A ∈ C^{n×n} as a diagonal transformation from C^n to itself¹ with respect to the basis formed by a set of eigenvectors of A.

Differences.
• Basis: SVD uses two orthonormal bases (left and right singular vectors); EVD uses one, usually non-orthogonal, basis (eigenvectors).
• Universality: all matrices have an SVD; not all matrices have an EVD.
• Utility: SVD is useful in problems involving the behavior of A or A^+; EVD is relevant to problems involving A^k.

¹ The source and the target spaces of the transformation coincide.

Properties of SVD

Properties of SVD

SVD and the 2-Norm
Theorem 4
Let A ∈ C^{m×n} have an SVD A = U Σ V^*. Then
1. ‖A‖_2 = σ_1 and ‖A‖_F = √(σ_1² + σ_2² + ⋯ + σ_r²).
2. The rank of A is the number of nonzero singular values.
3. Let r = min{m, n}. Then

      κ_2(A) = ‖A‖_2 ‖A^+‖_2 = σ_1 / σ_r.
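These identities are easy to check numerically (random test matrix, full rank with probability 1):

  A = randn(5, 3);
  s = svd(A);
  norm(A) - s(1)              % ~ 0: 2-norm equals the largest singular value
  norm(A, 'fro') - norm(s)    % ~ 0: Frobenius norm from the singular values
  cond(A) - s(1)/s(end)       % ~ 0: 2-norm condition number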

Connection to EVD
Let A = U Σ V^* ∈ C^{m×n} and B = A^* A. Observe that
• B ∈ C^{n×n} is a hermitian matrix², i.e., B^* = B.
• B has an EVD:

      B = A^* A = V Σ^* U^* U Σ V^* = V (Σ^T Σ) V^*,

  where Σ^T Σ is diagonal and V is unitary.
• The squares of the singular values of A are eigenvalues of B.
• An EVD of B = A^* A reveals the singular values and a set of right singular vectors of A.

² This is the C-extension of real symmetric matrices.
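A numerical illustration of this connection (random test matrix; the symmetrization only guards against roundoff):

  A = randn(5, 3);
  s = svd(A);                         % singular values, in descending order
  B = A'*A;  B = (B + B')/2;          % B = A^* A, symmetrized
  lam = sort(eig(B), 'descend');      % eigenvalues of B
  norm(lam - s.^2)                    % ~ roundoff level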

Connection to EVD (cont’)
Theorem 5
The nonzero singular values of A ∈ C^{m×n} are the square roots of the nonzero eigenvalues of A^* A or A A^*.

Unitary Diagonalization and SVD

Unitary Diagonalization of Hermitian Matrices
The previous discussion is relevant to hermitian matrices constructed in a specific manner. For a generic hermitian matrix, we have the following result.

Theorem 6 (Spectral Decomposition)
Let A ∈ C^{n×n} be hermitian. Then A has a unitary diagonalization

      A = V D V^{-1} = V D V^*,

where V ∈ C^{n×n} is unitary and D ∈ R^{n×n} is diagonal.

In words, a hermitian matrix (or real symmetric matrix) has a complete set of orthonormal eigenvectors and all its eigenvalues are real.
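A quick numerical check of the theorem (random hermitian test matrix):

  n = 4;
  A = randn(n) + 1i*randn(n);
  A = (A + A')/2;             % make A hermitian
  [V, D] = eig(A);
  norm(imag(diag(D)))         % eigenvalues are real
  norm(V'*V - eye(n))         % eigenvectors are orthonormal (V is unitary)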

Notes on Unitary Diagonalization and Normal Matrices
• A unitarily diagonalizable matrix A = V D V^{-1}, with V unitary and D ∈ C^{n×n} diagonal, is called a normal matrix³. All hermitian matrices are normal.
• Let A = V D V^{-1} ∈ C^{n×n} be normal. Since κ_2(V) = 1 (why?), Bauer-Fike implies that the eigenvalues of A can be changed by no more than ‖δA‖_2 under a perturbation A + δA.

³ Usual definition: A ∈ C^{n×n} is normal if A A^* = A^* A.

Unitary Diagonalization and SVD
Theorem 7
Let A ∈ C^{n×n} be hermitian. Then the singular values of A are the absolute values of the eigenvalues of A.

Precisely, if A = V D V^{-1} is a unitary diagonalization of A, then

      A = (V sign(D)) |D| V^*

is an SVD, where

      sign(D) = diag(sign(d_1), ..., sign(d_n)),   |D| = diag(|d_1|, ..., |d_n|).
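A numerical check of Theorem 7 (random hermitian test matrix):

  n = 4;
  A = randn(n) + 1i*randn(n);
  A = (A + A')/2;                               % hermitian
  norm(svd(A) - sort(abs(eig(A)), 'descend'))   % ~ roundoff level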

When Do Unitary EVD and SVD Coincide?
Theorem 8
If A = A^*, then the following statements are equivalent:
1. Any unitary EVD of A is also an SVD of A.
2. The eigenvalues of A are positive numbers.
3. x^* A x > 0 for all nonzero x ∈ C^n.                      (HPD)

• The equivalence of 1 and 2 is immediate from Theorem 7.
• The property in 3 is called hermitian positive definiteness (HPD), cf. symmetric positive definiteness.
• The equivalence of 2 and 3 can be shown conveniently using the Rayleigh quotient; see the next slide.

Note: Rayleigh Quotient
Let A ∈ R^{n×n} be fixed. The Rayleigh quotient is the map R_A : R^n → R given by

      R_A(x) = (x^T A x) / (x^T x).

• R_A maps an eigenvector of A to its associated eigenvalue, i.e., if Av = λv, then R_A(v) = λ.
• If A = A^T, then ∇R_A(v) = 0 for an eigenvector v, and so

      R_A(v + εz) = R_A(v) + 0 + O(ε²) = λ + O(ε²)   as ε → 0.

  Thus the Rayleigh quotient approximates an eigenvalue to second order (quadratically) in the perturbation of the eigenvector.
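A small experiment illustrating the quadratic accuracy (random symmetric test matrix; the perturbation direction z is arbitrary):

  n = 5;
  A = randn(n);  A = (A + A')/2;        % symmetric
  [V, D] = eig(A);
  v = V(:,1);  lam = D(1,1);            % an eigenpair of A
  z = randn(n,1);  z = z/norm(z);       % unit perturbation direction
  for ep = [1e-1 1e-2 1e-3]
      x = v + ep*z;
      disp(abs(x'*A*x/(x'*x) - lam))    % decreases roughly like ep^2
  end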

Reduction of Dimensions

Low-Rank Approximations
Let A ∈ C^{m×n} with m ≥ n. Its thin SVD A = Û Σ̂ V^* can be written as

      A = [u_1  u_2  ⋯  u_n] diag(σ_1, ..., σ_n) [v_1  v_2  ⋯  v_n]^*
        = σ_1 u_1 v_1^* + σ_2 u_2 v_2^* + ⋯ + σ_n u_n v_n^*
        = \sum_{j=1}^{r} σ_j u_j v_j^*,

where r is the rank of A (so σ_j = 0 for j > r).
• Each outer product u_j v_j^* is a rank-1 matrix.
• Since σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_r > 0, important contributions to A come from terms with small j.

Low-Rank Approximations (cont’)
For 1 ≤ k ≤ r, define

      A_k = \sum_{j=1}^{k} σ_j u_j v_j^* = U_k Σ_k V_k^*,

where
• U_k is the first k columns of U;
• V_k is the first k columns of V;
• Σ_k is the upper-left k × k submatrix of Σ.

This is a rank-k approximation of A.
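A minimal sketch of forming A_k from the SVD factors (random matrix, k chosen arbitrarily):

  A = randn(8, 5);  k = 2;
  [U, S, V] = svd(A);
  Ak = U(:,1:k) * S(1:k,1:k) * V(:,1:k)';   % rank-k approximation A_k = U_k S_k V_k'
  rank(Ak)                                   % k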

Best Rank-k Approximation
Theorem 9 (Eckart-Young)
Let A ∈ C^{m×n}. Suppose A has rank r and let A = U Σ V^* be an SVD. Then
• ‖A − A_k‖_2 = σ_{k+1}, for k = 1, ..., r − 1.
• For any matrix B with rank(B) ≤ k, ‖A − B‖_2 ≥ σ_{k+1}.
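A numerical check of the first statement (random matrix; k chosen arbitrarily):

  A = randn(8, 5);  k = 2;
  [U, S, V] = svd(A);
  Ak = U(:,1:k) * S(1:k,1:k) * V(:,1:k)';
  s = diag(S);
  norm(A - Ak) - s(k+1)     % ~ roundoff: the 2-norm error equals sigma_{k+1}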