Lecture IV
Real symmetric matrices and canonical forms
Fall 2020 Prof.Jiang@ECE NYU 159
Symmetric Matrices
Recall that a symmetric matrix A = (a_ij) satisfies a_ij = a_ji, 1 <= i, j <= n.
It is a real symmetric matrix if, additionally, all the a_ij's are real.
Notation:
  A = A^T,  A in R^{n×n}.
Fact 1 about Symmetric Matrices
The eigenvalues of a real symmetric matrix are always real.
Proof of Fact 1
By contradiction, assume that a real symmetric A has a non-real eigenvalue λ with eigenvector x ≠ 0:
  A x = λ x.
Taking complex conjugates (A is real) gives A x̄ = λ̄ x̄, or x̄^T A = λ̄ x̄^T, because A is symmetric. This further implies that
  x̄^T A x = λ x̄^T x  and  x̄^T A x = λ̄ x̄^T x.
Subtracting, 0 = (λ − λ̄) x̄^T x. Since x̄^T x > 0 for x ≠ 0, we get λ = λ̄, a contradiction.
Fact 2 about Symmetric Matrices
For any real symmetric matrix, its eigenvectors associated with distinct eigenvalues are orthogonal.
Remarks:
• Two vectors x, y in R^n are orthogonal if x^T y = 0.
• Nonzero orthogonal vectors are linearly independent.
Proof of Fact 2
For a real symmetric A, consider a pair of eigenvectors x, y associated with distinct eigenvalues λ, μ, i.e.,
  A x = λ x  and  A y = μ y.
This further implies that
  y^T A x = λ y^T x  and  x^T A y = μ x^T y.
Since A is symmetric, y^T A x = (y^T A x)^T = x^T A y, so 0 = (λ − μ) x^T y. Because λ ≠ μ, x^T y = 0, as wished.
Canonical Form – First Pass
Consider a real symmetric matrix A in R^{n×n}, with distinct (real, by Fact 1) eigenvalues {λ_i}_{i=1}^n. Then, there is an orthogonal matrix O, i.e., O^T O = I, such that
  O^T A O = diag(λ_i) = [λ_1 0 … 0; 0 λ_2 … 0; …; 0 0 … λ_n].
Constructive Proof
For each eigenvalue λ_i, take an eigenvector x^i which has unit norm, i.e., (x^i)^T x^i = 1. Define a matrix O as:
  O = [x^1, …, x^n] in R^{n×n}.
Then O^T in R^{n×n} has rows (x^1)^T, …, (x^n)^T.
Constructive Proof (cont’d)
It is directly checked using Fact 2 that O^T O = I, i.e., O is an orthogonal matrix.
In addition, O^T A O = diag(λ_i).
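A minimal numerical sketch of this constructive proof (our own illustration, not from the lecture): for a real symmetric matrix, numpy.linalg.eigh returns real eigenvalues and a matrix O of unit-norm eigenvectors, so that O^T A O = diag(λ_i).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # real symmetric, distinct eigenvalues (example of our choosing)
eigvals, O = np.linalg.eigh(A)   # columns of O are unit-norm eigenvectors

# O is orthogonal: O^T O = I
assert np.allclose(O.T @ O, np.eye(2))
# O^T A O is the diagonal matrix of eigenvalues
assert np.allclose(O.T @ A @ O, np.diag(eigvals))
```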
Exercise
Compute the eigenvalues λ_1, λ_2 of
  A = [1 2; 2 3]
and find a transformation matrix O s.t. O^T A O = [λ_1 0; 0 λ_2].
What if A is not necessarily symmetric?
Answer:
Yes! As long as the eigenvalues are mutually distinct, there is a nonsingular matrix P such that
  P^{-1} A P = diag(λ_i), denoted A ~ diag(λ_i).
However, this P may not be orthogonal.
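A hedged sketch of the non-symmetric case (matrix of our choosing): with distinct eigenvalues, numpy.linalg.eig gives a nonsingular P with P^{-1} A P diagonal, but this P need not be orthogonal.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])          # not symmetric; distinct eigenvalues 1 and 3
eigvals, P = np.linalg.eig(A)       # columns of P are eigenvectors

# P diagonalizes A ...
assert np.allclose(np.linalg.inv(P) @ A @ P, np.diag(eigvals))
# ... but P is NOT orthogonal here
assert not np.allclose(P.T @ P, np.eye(2))
```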
Remark: A non-symmetric matrix may not be diagonalizable.
Show that the following matrix
  A = [0 1; 0 0]
is not diagonalizable.
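A numerical sketch of why this A fails to be diagonalizable (our own check): its only eigenvalue is 0, but the null space N(A − 0·I) is one-dimensional, so A has only one linearly independent eigenvector instead of the two required.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals, [0.0, 0.0])   # repeated eigenvalue 0

# dim N(A - 0*I) = n - rank(A) = 2 - 1 = 1 < 2 eigenvectors
assert np.linalg.matrix_rank(A) == 1
```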
Comment
Two similar matrices have the same eigenvalues. So, if A ~ diag(λ_i), i.e., P^{-1} A P = diag(λ_i), then the eigenvalues of A are simply {λ_i}_{i=1}^n.
However, the converse is not true.
Exercise
Two matrices having the same eigenvalues may not be similar. Show that the following matrix
  A = [1 0; 1 1]
is not diagonalizable. In other words, it is not similar to
  I = [1 0; 0 1].
Question (Necessity and Sufficiency):
When is a matrix similar to a diagonal matrix?
Necessary and Sufficient Condition for the Canonical Diagonal Form
1. An n × n matrix A is similar to a diagonal matrix iff A has n linearly independent eigenvectors.
2. When A has n distinct eigenvalues, it is similar to a diagonal matrix.
Proof
First, note that Statement 2 follows from Statement 1 and a result proved previously (eigenvectors associated with distinct eigenvalues are linearly independent).
Assume A is similar to a diagonal matrix Λ = diag(λ_i). Then there exists a nonsingular P s.t. P^{-1} A P = Λ.
Let P = [p^1 p^2 … p^n], with the p^i linearly independent. Then
  A P = P Λ  ⇒  A p^i = λ_i p^i,  i = 1, 2, …, n,
implying that p^i is an eigenvector for the eigenvalue λ_i.
Proof (cont'd)
Conversely, assume that A has n linearly independent eigenvectors p^i, i.e., A p^i = λ_i p^i.
Then P = [p^1 p^2 … p^n] is nonsingular and satisfies (by direct computation) P^{-1} A P = Λ.
Comment
From the proof of Part 1, it follows that the following is an equivalent condition for diagonalizability of A:
  dim N(A − λ_1 I) + … + dim N(A − λ_k I) = n,
where λ_1, …, λ_k are the distinct eigenvalues of A, k <= n.
An Example
Bring the matrix
  A = [0 1; -1 0]
into a diagonal form.
The eigenvalues of A are λ_1 = j, λ_2 = -j. As can be directly checked, the associated independent eigenvectors are:
  c^1 = [1; j]  and  c^2 = [1; -j].
Then P = [c^1 c^2], implying that P^{-1} A P = [j 0; 0 -j].
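A sketch of this worked example in code (our own verification): A is real, but its eigenvalues are ±j, so the diagonalizing matrix P is necessarily complex.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
P = np.array([[1, 1],
              [1j, -1j]])        # columns c^1 = (1, j)^T, c^2 = (1, -j)^T

D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag([1j, -1j]))
```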
Diagonalizable Matrix
A matrix is said to be "diagonalizable" if it is similar to a diagonal matrix.
Are the following statements true or false?
(1) Two diagonalizable matrices always commute.
(2) The block-diagonal matrix B = blockdiag(B_1, …, B_m), with B_i in R^{n_i × n_i}, is diagonalizable if and only if each B_i is diagonalizable.
Let's stop for a short review…
• Review of the results on nontrivial solutions to homogeneous equations: A x = 0, A in R^{m×n}, x in R^n.
• How about inhomogeneous systems?
A Quiz
Any set of vectors x^i in R^n, with 1 <= i <= N, is always linearly dependent if N > n.
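A quick numerical illustration of the quiz (our own, with randomly generated vectors): a matrix formed from N > n vectors in R^n has rank at most n < N, so its columns are linearly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))       # N = 5 vectors in R^3 (n = 3)

# rank cannot exceed n = 3, so the 5 columns must be linearly dependent
assert np.linalg.matrix_rank(X) <= 3
```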
Real and Symmetric Matrices
• The eigenvalues are always real.
• Eigenvectors associated with distinct eigenvalues are always orthogonal.
• Any matrix with no repeated eigenvalues is diagonalizable.
• How to transform a real and symmetric matrix into a diagonal form?
A General Result for General Symmetric Matrices
For any real and symmetric matrix A in R^{n×n}, there always exists an orthogonal matrix, say O, with O^T O = I, such that
  O^T A O = [λ_1 0 … 0; 0 λ_2 … 0; …; 0 0 … λ_n].
Special case: A Trivial Example
  A = [a 0 … 0; 0 a … 0; …; 0 0 … a] = a I_n.
Here O can be taken to be the identity matrix; clearly, the identity matrix is an orthogonal matrix.
Before proving this general and fundamental result, let us introduce some useful tools.
The Gram-Schmidt Orthogonalization Process
Question:
How to generate a set of mutually orthogonal vectors {y^i}_{i=1}^N successively, from a set of N real linearly independent n-dimensional vectors {x^i}_{i=1}^N?
Let us start with a set of real-valued vectors {x^i}_{i=1}^N. Here is the systematic procedure.
First,
  y^1 := x^1
  y^2 := x^2 + a_{11} x^1,
where a_{11} is a scalar to be determined so that the inner product <y^1, y^2> = (y^1)^T y^2 = 0:
  <x^1, x^2> + a_{11} <x^1, x^1> = 0
  ⇒ a_{11} := -<x^1, x^2> / <x^1, x^1>,
with D_1 := <x^1, x^1> ≠ 0.
Next, construct y^3 as:
  y^3 := x^3 + a_{21} x^1 + a_{22} x^2,
where a_{21}, a_{22} are scalars to be determined s.t.
  <y^3, y^1> = 0, <y^3, y^2> = 0,  or equivalently  <y^3, x^1> = 0, <y^3, x^2> = 0:
  <x^3, x^1> + a_{21} <x^1, x^1> + a_{22} <x^2, x^1> = 0
  <x^3, x^2> + a_{21} <x^1, x^2> + a_{22} <x^2, x^2> = 0,
which has a (unique) solution (a_{21}, a_{22}) if
  D_2 := det [ <x^1, x^1>  <x^1, x^2> ; <x^2, x^1>  <x^2, x^2> ] ≠ 0.
By contradiction, assume that
  D_2 := det [ <x^1, x^1>  <x^1, x^2> ; <x^2, x^1>  <x^2, x^2> ] = 0.
Then, there are two scalars r_1, s_1, not both 0, such that
  r_1 <x^1, x^1> + s_1 <x^1, x^2> = 0
  r_1 <x^2, x^1> + s_1 <x^2, x^2> = 0
  ⇒ <x^1, r_1 x^1 + s_1 x^2> = 0,  <x^2, r_1 x^1 + s_1 x^2> = 0
  ⇒ <r_1 x^1 + s_1 x^2, r_1 x^1 + s_1 x^2> = 0
  ⇒ r_1 x^1 + s_1 x^2 = 0.
This contradicts x^1, x^2 being linearly independent. Thus,
  D_2 ≠ 0.
So, we have obtained three mutually orthogonal vectors:
  y^1 := x^1
  y^2 := x^2 + a_{11} x^1
  y^3 := x^3 + a_{21} x^1 + a_{22} x^2.
Continuing this process, we can find the other mutually orthogonal vectors:
  y^i := x^i + Σ_{k=1}^{i-1} a_{(i-1)k} x^k,
with the scalars a_{(i-1)k} chosen to achieve the mutual orthogonality condition:
  <y^i, y^j> = 0,  i ≠ j,
or equivalently, <y^i, x^j> = 0, 1 <= j <= i-1.
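The successive construction above can be sketched in code (a minimal illustration; the function name gram_schmidt and the test vectors are our own choices, not from the lecture):

```python
import numpy as np

def gram_schmidt(X):
    """Given linearly independent columns x^1,...,x^N of X, return a matrix
    whose columns y^1,...,y^N are mutually orthogonal."""
    Y = np.array(X, dtype=float)
    for i in range(1, Y.shape[1]):
        for j in range(i):
            # choose the coefficient so that <y^j, y^i> = 0
            Y[:, i] -= (Y[:, j] @ Y[:, i]) / (Y[:, j] @ Y[:, j]) * Y[:, j]
    return Y

# three linearly independent vectors in R^3
X = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
Y = gram_schmidt(X)
G = Y.T @ Y
assert np.allclose(G - np.diag(np.diag(G)), 0.0)   # off-diagonal inner products vanish
```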
Orthonormal Vectors
They are defined as follows:
  u^i := y^i / ||y^i||,  i = 1, 2, …, N.
It is easy to show that, if N = n,
  O = [u^1, u^2, …, u^N]
is an orthogonal matrix.
An Example
Consider the linearly independent vectors:
  x^1 = [1; 0; 1],  x^2 = [0; 1; 1].
By means of the Gram-Schmidt process, find a set of orthonormal vectors u^1, u^2.
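A hedged check for this example (our own): NumPy's QR factorization performs an equivalent orthonormalization, so the columns of Q agree with u^1, u^2 up to sign.

```python
import numpy as np

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])       # columns x^1 = (1,0,1)^T, x^2 = (0,1,1)^T
Q, R = np.linalg.qr(X)

assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
# Hand computation gives u^1 = (1,0,1)/sqrt(2); Q matches it up to sign
assert np.allclose(np.abs(Q[:, 0]), np.array([1.0, 0.0, 1.0]) / np.sqrt(2))
```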
Exercise
Show that if v^1, …, v^k is a set of k linearly independent vectors in R^n, then there exists an invertible upper triangular matrix T in R^{k×k} such that the matrix U = VT has orthonormal columns, where V = [v^1 … v^k].
Comment
During the Gram-Schmidt process, we proved that the determinants D_k, called Gramians, are nonzero. Indeed, we can prove that
  D_k = det( <x^i, x^j> )_{i,j=1}^k > 0,  1 <= k <= N,
for any set of linearly independent vectors {x^i}_{i=1}^N.
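A numerical illustration of this comment (our own example): for linearly independent vectors, every leading Gramian determinant D_k is positive.

```python
import numpy as np

X = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])    # three linearly independent columns
G = X.T @ X                         # Gram matrix of inner products <x^i, x^j>

for k in range(1, 4):
    Dk = np.linalg.det(G[:k, :k])  # leading k-by-k Gramian
    assert Dk > 0
```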
Indeed, each Gramian D_k = det( <x^i, x^j> )_{i,j=1}^k is a leading principal minor associated with a positive-definite quadratic form:
  Q(u) = Σ_{i,j=1}^k <x^i, x^j> u_i u_j = < Σ_{i=1}^k u_i x^i, Σ_{j=1}^k u_j x^j >.
Q is positive definite in u = (u_1, …, u_k) in R^k:
  Q(u) >= 0, where equality holds only when u = 0.
An Interesting Result
For any positive-definite quadratic form
  Q = Σ_{i,j=1}^N a_{ij} u_i u_j,
the associated determinant
  D = det(a_{ij})
is always positive.
Proof
First, we prove that D ≠ 0. By contradiction, assume otherwise; then there is a nontrivial solution u to
  Σ_{j=1}^N a_{ij} u_j = 0,  i = 1, 2, …, N.
Then, it follows that
  Q = Σ_{i=1}^N u_i ( Σ_{j=1}^N a_{ij} u_j ) = 0,
a contradiction.
Second, we prove that D > 0. For t in [0, 1], consider a family of quadratic forms defined as
  P(t) := t Q + (1 - t) Σ_{i=1}^N u_i^2.
Clearly, P(t) > 0 for all nontrivial u. Then, based on the above analysis, the associated determinants are nonzero. At t = 0, the determinant is det I = 1 > 0. So, by continuity, at t = 1 the determinant is D, which cannot be negative. Hence D > 0.
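A numerical sketch of the continuity argument above (our own example): for a positive-definite Gram matrix G, the determinant of t·G + (1−t)·I never vanishes on [0, 1]; it starts at det I = 1 > 0 and so stays positive all the way to det G.

```python
import numpy as np

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
G = X.T @ X                        # positive definite (columns independent)

for t in np.linspace(0.0, 1.0, 11):
    d = np.linalg.det(t * G + (1 - t) * np.eye(2))
    assert d > 0                   # never hits zero, hence never goes negative
```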
General 2×2 Symmetric Matrices
We begin with the two-dimensional case:
  A = [a_11 a_12; a_21 a_22],  with rows (a^1)^T, (a^2)^T,
which is symmetric, i.e., a_12 = a_21.
Consider a pair of an eigenvalue λ_1 and associated (normalized) eigenvector x^1 = [x_11; x_12], i.e.,
  A x^1 = λ_1 x^1  ⇔  <a^1, x^1> = λ_1 x_11, <a^2, x^1> = λ_1 x_12.
General Symmetric Matrices (Cont'd)
Using the Gram-Schmidt process, take a 2×2 orthogonal matrix O_2 = [y^1 y^2], with y^1 := x^1 the given normalized eigenvector.
It will be shown that
  O_2^T A O_2 = [λ_1 0; 0 λ_2].
General Symmetric Matrices (Cont'd)
First, show that
  O_2^T A O_2 = O_2^T [A y^1  A y^2] = [ (y^1)^T ; (y^2)^T ] [ λ_1 y^1  A y^2 ] = [ λ_1  b_12 ; 0  b_22 ],
where A y^2 = [ <a^1, y^2> ; <a^2, y^2> ].
Then b_12 = 0, using the symmetry of O_2^T A O_2; and b_22 = λ_2, because the eigenvalues are unchanged under the orthogonal transformation O_2. Hence
  O_2^T A O_2 = [λ_1 0; 0 λ_2].
Exercise 1
Try to reduce the real symmetric matrix
  A = [1 k; k 1]
to a diagonal form.
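A hedged numerical check for this exercise, with k = 2 chosen as an example (the eigenvalues 1 ± k and eigenvectors (1, ±1)/sqrt(2) come from a hand computation, which the code merely verifies):

```python
import numpy as np

k = 2.0
A = np.array([[1.0, k],
              [k, 1.0]])
O = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)   # normalized eigenvectors as columns

assert np.allclose(O.T @ O, np.eye(2))            # O is orthogonal
assert np.allclose(O.T @ A @ O, np.diag([1 + k, 1 - k]))
```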
Exercise 2
Define the real bilinear form
  Q(x, y) = y^T A x = Σ_{i,j=1}^n a_{ij} y_i x_j,  x, y in R^n,
which reduces to the inner product when A = I. Prove that Q is symmetric, i.e., Q(x, y) = Q(y, x), if and only if A is symmetric.
See the text (Horn & Johnson, 2nd edition, 2013; page 226).
Homework #4
1. Does the singular matrix
  A = [1 1; 1 1]
have two independent eigenvectors? Justify your answer.
2. Show that A and A^T have the same eigenvalues.
3. Show by direct calculation for 2×2 matrices A and B that AB and BA have the same characteristic equation.
4. Can you give two matrices that are reducible to the following canonical diagonal matrix?
  A = [2 0; 0 1]