
Lecture IV

Key Issues:

Real symmetric matrices and canonical forms


Symmetric Matrices

Recall that a symmetric matrix A = [a_ij] satisfies

    a_ij = a_ji,  1 ≤ i, j ≤ n.

It is a real symmetric matrix if, additionally, all a_ij's are real.

Notation:  A = A^T,  A ∈ ℝ^{n×n}.
Fact 1 about Symmetric Matrices
The eigenvalues of a real symmetric matrix are
always real.



Proof of Fact 1
By contradiction, assume that a real symmetric A has a complex (non-real) eigenvalue, say λ, with eigenvector x ≠ 0. Taking complex conjugates in

    Ax = λx   gives   A x̄ = λ̄ x̄,   or   x̄^T A = λ̄ x̄^T,

because A is real and symmetric. This further implies that

    x̄^T A x = λ x̄^T x   and   x̄^T A x = λ̄ x̄^T x

    ⟹  0 = (λ − λ̄) x̄^T x  ⟹  λ − λ̄ = 0,

a contradiction (note that x̄^T x > 0 for x ≠ 0).
Fact 2 about Symmetric Matrices
For any real symmetric matrix, its eigenvectors
associated with distinct eigenvalues are
orthogonal.

Remark: Orthogonal vectors are linearly independent.


Proof of Fact 2
For a real symmetric A, consider a pair of eigenvectors (x, y) associated with distinct eigenvalues λ, μ, respectively, i.e.,

    Ax = λx   and   Ay = μy.

This further implies that

    y^T A x = λ y^T x   and   x^T A y = μ x^T y.

A symmetric  ⟹  y^T A x − x^T A y = 0 = (λ − μ) x^T y
             ⟹  x^T y = 0, as wished.
Canonical Form – First Pass
nn
Consider a real symmetric matrix A   ,
with distinct (real, by Fact 1) eigenvalues i i 1 .
n

Then, there is an orthogonal matrix O, i.e., O O  I , T

such that
 1  0 
 
O AO  diag  i        .
T

0   
 n
Fall 2018 Prof.Jiang@ECE NYU 163
Constructive Proof
For each eigenvalue λ_i, take an eigenvector x^i which has unit norm, i.e., ‖x^i‖ = √((x^i)^T x^i) = 1.

Define a matrix O as:

    O = [ x^1  ⋯  x^n ] ∈ ℝ^{n×n}.

Then,

    O^T = [ (x^1)^T ]
          [    ⋮    ]  ∈ ℝ^{n×n}.
          [ (x^n)^T ]
Constructive Proof (cont’d)

It is directly checked using Fact 2 that O^T O = I, i.e., O is an orthogonal matrix.

In addition, O^T A O = diag(λ_i).


Exercise

Compute the eigenvalues λ_1, λ_2 of

    A = [ 1  2 ]
        [ 2  3 ]

and find a transformation matrix O s.t.

    O^T A O = [ λ_1   0  ]
              [  0   λ_2 ].
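A quick numerical check of this exercise (an added sketch, not from the slides): numpy.linalg.eigh is the eigensolver for symmetric matrices and already returns unit-norm, mutually orthogonal eigenvectors, so its output can serve directly as O.

import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 3.0]])

# eigh: real eigenvalues and orthonormal eigenvectors (columns of O)
# for a symmetric matrix.
eigvals, O = np.linalg.eigh(A)

print(eigvals)                      # approximately 2 - sqrt(5) and 2 + sqrt(5)
print(np.round(O.T @ O, 10))        # identity, so O is orthogonal
print(np.round(O.T @ A @ O, 10))    # diag(lambda_1, lambda_2)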
What if A is not necessarily symmetric?

In general, there may be a nonsingular, not necessarily orthogonal, matrix P such that

    P^{-1} A P = diag(λ_i).

As said previously, in this case, A is said to be similar to diag(λ_i), denoted A ~ diag(λ_i), and diag(λ_i) is the canonical diagonal form of A.
Comment
When A is similar to a diagonal matrix diag(λ_i), i.e., P^{-1} A P = diag(λ_i), the eigenvalues of A are simply {λ_i}_{i=1}^n.

Indeed,

    det(λI − P^{-1} A P) = det(P^{-1} (λI − A) P)
                         = det(P^{-1}) det(λI − A) det(P) = det(λI − A),

noting that det(P^{-1}) det(P) = det(P^{-1} P) = 1.
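A small numerical illustration of this comment (added; the matrices are arbitrary random examples): for any nonsingular P, the similar matrix P^{-1} A P has the same characteristic polynomial and hence the same eigenvalues as A.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))       # generically nonsingular

B = np.linalg.inv(P) @ A @ P          # a matrix similar to A

# Same eigenvalues, possibly listed in a different order.
print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.linalg.eigvals(B)))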
Exercise 1
Can the following matrix

    A = [ 1  0 ]
        [ 1  1 ]

be transformed into the canonical diagonal form

    [ 1  0 ]
    [ 0  1 ] ?

Two matrices having the same eigenvalues may not be similar.
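A brief solution sketch (added, not on the original slide): A has the single eigenvalue λ = 1 with algebraic multiplicity 2, but A − I has rank 1, so dim N(A − I) = 1 and A possesses only one linearly independent eigenvector. By the necessary and sufficient condition a few slides below, A is therefore not diagonalizable, even though it has the same eigenvalues as the identity matrix.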
Exercise 2
Can the following matrix

    A = [ 0  1 ]
        [ 0  0 ]

be transformed into a canonical diagonal form?

Remark: A non-symmetric matrix may not be diagonalizable.
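A brief sketch (added): the only eigenvalue of A is 0, and N(A) = span{(1, 0)^T} is one-dimensional, so A has a single independent eigenvector and cannot be diagonalized. Equivalently, if A were similar to a diagonal matrix, that matrix would have to be the zero matrix, forcing A = 0, which is false.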
Question (Necessity and Sufficiency):

When is a matrix similar to a diagonal matrix?


Necessary and Sufficient Condition
for the Canonical Diagonal Form

• An n × n matrix A is similar to a diagonal matrix iff A has n linearly independent eigenvectors.

• When A has n distinct eigenvalues, it is similar to a diagonal matrix.


Proof
First, note that Statement 2 follows from Statement 1 and a result proved previously.

Assume A is similar to a diagonal matrix Λ = diag(λ_i). Then, there exists a nonsingular P s.t. P^{-1} A P = Λ.

Let P = [ p^1  p^2  ⋯  p^n ], with {p^i} linearly independent. Then

    AP = PΛ  ⟹  A p^i = λ_i p^i,  i = 1, 2, ..., n,

implying that p^i is an eigenvector for eigenvalue λ_i.


Proof (cont’d)

Conversely, assume that A has n linearly independent eigenvectors {p^i}_{i=1}^n, i.e., A p^i = λ_i p^i.

Then P = [ p^1  p^2  ⋯  p^n ] is nonsingular and satisfies (by direct computation)

    P^{-1} A P = Λ.
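A minimal numerical sketch of this converse construction (added; the test matrix is an arbitrary non-symmetric matrix with distinct eigenvalues): numpy.linalg.eig returns the eigenvalues together with a matrix P whose columns are eigenvectors, so P^{-1} A P comes out diagonal.

import numpy as np

# Non-symmetric, but with distinct eigenvalues 2 and 3, hence diagonalizable.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

lams, P = np.linalg.eig(A)            # columns of P are eigenvectors p^i
Lambda = np.linalg.inv(P) @ A @ P

print(lams)                           # the eigenvalues 2 and 3 (order not guaranteed)
print(np.round(Lambda, 10))           # diagonal, with the same eigenvalues on the diagonal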


Comment
From the proof of Part 1, it follows that the following is an equivalent condition for diagonalization of A:

    dim N(A − λ_1 I) + ⋯ + dim N(A − λ_k I) = n,

where λ_1, ..., λ_k are the distinct eigenvalues of A, k ≤ n.


An Example
 0 1
Bring the matrix A   
1 0 
into a diagonal form.

Fall 2018 Prof.Jiang@ECE NYU 176


The eigenvalues of A are λ_1 = −j, λ_2 = j.

As can be directly checked, the associated independent eigenvectors are:

    c^1 = [ 1 ]   and   c^2 = [  1 ]
          [ j ]               [ −j ].

Then, P = [ c^1  c^2 ], implying that

    P^{-1} A P = [ −j  0 ]
                 [  0  j ].
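A quick numpy check (added; it assumes the sign reconstruction of A above): even though A is real, the diagonalizing matrix P is necessarily complex here.

import numpy as np

# Assumed 90-degree rotation matrix from the example above.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lams, P = np.linalg.eig(A)            # complex eigenvalues -1j and +1j
Lambda = np.linalg.inv(P) @ A @ P

print(lams)
print(np.round(Lambda, 10))           # diagonal, with -1j and +1j on the diagonal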
Diagonalizable Matrix
A matrix is said to be "diagonalizable" if it is similar to a diagonal matrix.

Are the following statements true or false?

(1) Two diagonalizable matrices always commute.

(2) The block-diagonal matrix

    B = block diag(B_i),  B_i ∈ ℝ^{n_i × n_i},

is diagonalizable if and only if each B_i is diagonalizable.
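For statement (1), a small counterexample sketch (added, not part of the slides): both matrices below are diagonalizable, one being diagonal and the other symmetric, yet they do not commute.

import numpy as np

A = np.diag([1.0, 2.0])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(A @ B)    # [[0, 1], [2, 0]]
print(B @ A)    # [[0, 2], [1, 0]]  -> different, so statement (1) is false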
Let’s stop for a short review…

• Review of the results on nontrivial solutions to homogeneous equations:

    Ax = 0,  A ∈ ℝ^{m×n},  x ∈ ℝ^n.

• How about inhomogeneous systems?


A Quiz?

• Any set of vectors x^i ∈ ℝ^n, with 1 ≤ i ≤ N, is always linearly dependent if N > n.


Real and Symmetric Matrices
• The eigenvalues are always real.

• Eigenvectors associated with distinct eigenvalues are always orthogonal.

• Any matrix with no repeated eigenvalues is diagonalizable.

• How to transform a real and symmetric matrix into a diagonal form?
A General Result for General
Symmetric Matrices

For any real and symmetric matrix A ∈ ℝ^{n×n}, there always exists an orthogonal matrix, say O, with O^T O = I, such that

    O^T A O = [ λ_1        0  ]
              [      ⋱        ]
              [  0        λ_n ].
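A numerical illustration of this general result (added; the matrix is an arbitrary symmetric example with a repeated eigenvalue, to emphasize that distinct eigenvalues are not required here):

import numpy as np

# Symmetric, with eigenvalues 2, 2 (repeated) and 4.
A = np.array([[3.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 3.0]])

lams, O = np.linalg.eigh(A)

print(lams)                         # [2., 2., 4.]
print(np.round(O.T @ O, 10))        # identity: O is orthogonal
print(np.round(O.T @ A @ O, 10))    # diag(2, 2, 4)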


Special case: A Trivial Example

    A = [ a_1        0  ]
        [      ⋱        ]
        [  0        a_n ]

Here A is already diagonal, so one can take O = I; clearly, the identity matrix is an orthogonal matrix.
Before proving this general and fundamental
result, let us introduce some useful tools.



The Gram-Schmidt Orthogonalization
Process
Question:

How to generate a set of mutually orthogonal vectors {y^i}_{i=1}^N successively, from a set of N real, linearly independent, n-dimensional vectors {x^i}_{i=1}^N?


Let us start with a set of real-valued vectors {x^i}_{i=1}^N. Here is the systematic procedure.

First,

    y^1 := x^1
    y^2 := x^2 − a_11 x^1,

where a_11 is a scalar to be determined so that the inner product ⟨y^1, y^2⟩ = (y^1)^T y^2 = 0:

    ⟨x^1, x^2⟩ − a_11 ⟨x^1, x^1⟩ = 0.
    ⟨x^1, x^2⟩ − a_11 ⟨x^1, x^1⟩ = 0  ⟹  a_11 := ⟨x^1, x^2⟩ / ⟨x^1, x^1⟩,

with D_1 := ⟨x^1, x^1⟩ > 0.

Next, construct y^3 as:

    y^3 := x^3 − a_21 x^1 − a_22 x^2,

where a_21, a_22 are scalars to be determined s.t.

    ⟨y^3, y^1⟩ = 0,  ⟨y^3, y^2⟩ = 0
    ⟺  ⟨y^3, x^1⟩ = 0,  ⟨y^3, x^2⟩ = 0.
    ⟨y^3, x^1⟩ = 0,  ⟨y^3, x^2⟩ = 0

    ⟺  ⟨x^3, x^1⟩ − a_21 ⟨x^1, x^1⟩ − a_22 ⟨x^2, x^1⟩ = 0
        ⟨x^3, x^2⟩ − a_21 ⟨x^1, x^2⟩ − a_22 ⟨x^2, x^2⟩ = 0,

which has a (unique) solution (a_21, a_22) if

    D_2 := det [ ⟨x^1, x^1⟩   ⟨x^1, x^2⟩ ]  ≠ 0.
               [ ⟨x^2, x^1⟩   ⟨x^2, x^2⟩ ]
By contradiction, assume that

    D_2 := det [ ⟨x^1, x^1⟩   ⟨x^1, x^2⟩ ] = 0.
               [ ⟨x^2, x^1⟩   ⟨x^2, x^2⟩ ]

Then, there are two scalars r_1, s_1, not both 0, such that

    r_1 ⟨x^1, x^1⟩ + s_1 ⟨x^1, x^2⟩ = 0
    r_1 ⟨x^2, x^1⟩ + s_1 ⟨x^2, x^2⟩ = 0

    ⟹  ⟨x^1, r_1 x^1 + s_1 x^2⟩ = 0,  ⟨x^2, r_1 x^1 + s_1 x^2⟩ = 0.
    ⟨x^1, r_1 x^1 + s_1 x^2⟩ = 0,  ⟨x^2, r_1 x^1 + s_1 x^2⟩ = 0

    ⟹  ⟨r_1 x^1 + s_1 x^2, r_1 x^1 + s_1 x^2⟩ = 0
    ⟹  r_1 x^1 + s_1 x^2 = 0.

Contradiction with x^1, x^2 being linearly independent. Thus,

    D_2 := det [ ⟨x^1, x^1⟩   ⟨x^1, x^2⟩ ]  ≠ 0.
               [ ⟨x^2, x^1⟩   ⟨x^2, x^2⟩ ]
So, we have obtained three mutually orthogonal vectors:

    y^1 := x^1
    y^2 := x^2 − a_11 x^1
    y^3 := x^3 − a_21 x^1 − a_22 x^2.


Continuing this process, we can find the other mutually orthogonal vectors:

    y^i := x^i − Σ_{k=1}^{i−1} a_{(i−1)k} x^k,

with the scalars a_{(i−1)k} chosen to achieve the mutual orthogonality condition:

    ⟨y^i, y^j⟩ = 0,  i ≠ j,

or equivalently,  ⟨y^i, x^j⟩ = 0,  1 ≤ j ≤ i − 1.
Orthonormal Vectors
They are defined as follows:

    u^i := y^i / ‖y^i‖,  i = 1, 2, ..., N.

It is easy to show that, if n = N,

    O = [ u^1  u^2  ⋯  u^N ]

is an orthogonal matrix.
An Example

Consider the linearly independent vectors:

    x^1 = [ 1 ]        x^2 = [ 0 ]
          [ 0 ],              [ 1 ]
          [ 1 ]               [ 1 ].

By means of the Gram-Schmidt process, find a set of orthonormal vectors u^1, u^2.
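A minimal Python sketch of the Gram-Schmidt procedure described above (added; it uses the equivalent form that subtracts projections onto the already-built y^k, and is applied to this example):

import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt: the columns of X are linearly independent;
    returns a matrix whose columns are the orthonormal vectors u^i."""
    n, N = X.shape
    Y = np.zeros((n, N))
    for i in range(N):
        y = X[:, i].copy()
        for k in range(i):
            # subtract the component of x^i along the already-built y^k
            y -= (Y[:, k] @ X[:, i]) / (Y[:, k] @ Y[:, k]) * Y[:, k]
        Y[:, i] = y
    return Y / np.linalg.norm(Y, axis=0)   # normalize each column

# The example from the slide:
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
U = gram_schmidt(X)
print(U)           # u^1 = (1, 0, 1)/sqrt(2),  u^2 = (-1, 2, 1)/sqrt(6)
print(U.T @ U)     # 2x2 identity: the columns are orthonormal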
Exercise
Show that if {v_1, ..., v_k} is a set of k linearly independent vectors in ℝ^n and V := [ v_1  ⋯  v_k ] ∈ ℝ^{n×k}, then there exists an invertible upper triangular matrix T ∈ ℝ^{k×k} such that the matrix U = VT has orthonormal columns.


Comment
During the Gram-Schmidt process, we proved that the determinants D_k, called Gramians, are nonzero. Indeed, we can prove that

    D_k = det[ ⟨x^i, x^j⟩ ] > 0,  1 ≤ k ≤ N,

for any set of linearly independent vectors {x^i}_{i=1}^k.


Leading principal minors
Indeed, each Gramian D_k = det[ ⟨x^i, x^j⟩ ] is associated with a positive-definite quadratic form:

    Q(u) = ⟨ Σ_{i=1}^k u_i x^i , Σ_{j=1}^k u_j x^j ⟩ = Σ_{i,j=1}^k ⟨x^i, x^j⟩ u_i u_j.

Q is positive definite in u = (u_1, ..., u_k) ∈ ℝ^k:

    Q(u) ≥ 0, where equality holds only when u = 0.
An Interesting Result
For any positive-definite quadratic form

    Q = Σ_{i,j=1}^N a_ij u_i u_j,

the associated determinant

    D = det[ a_ij ]

is always positive.
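A quick numerical illustration (added; the vectors are random, hence generically independent): the Gram matrix G with entries G_ij = ⟨x^i, x^j⟩ defines such a positive-definite form, and its determinant is indeed positive.

import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))    # three (generically independent) vectors in R^5

G = X.T @ X                        # Gram matrix: G[i, j] = <x^i, x^j>
print(np.linalg.det(G))            # strictly positive
print(np.linalg.eigvalsh(G))       # all eigenvalues positive as well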
Proof
• First, we prove that D ≠ 0. By contradiction, assume D = 0; then there is a nontrivial solution u to

    Σ_{j=1}^N a_ij u_j = 0,  i = 1, 2, ..., N.

Then, it follows that

    Q = Σ_{i=1}^N u_i ( Σ_{j=1}^N a_ij u_j ) = 0

for this nontrivial u, contradicting positive definiteness.
• Second, we prove that D > 0. For t ∈ [0, 1], consider a family of quadratic forms defined as

    P(t) = t Q + (1 − t) Σ_{i=1}^N u_i²   (as a quadratic form in u).

Clearly, P(t) is positive definite for every t. Then, based on the above analysis, the associated determinants are nonzero for all t ∈ [0, 1].

At t = 0, the determinant is det I = 1 > 0. So, by continuity, the determinant at t = 1, which is D, cannot be negative (it can never cross 0). Hence D > 0.
General 2x2 Symmetric Matrices
We begin with the two-dimensional case:

    A = [ a_11  a_12 ] = [ a^1 ]
        [ a_21  a_22 ]   [ a^2 ],

which is symmetric, i.e., a_12 = a_21 (here a^1, a^2 denote the rows of A).

Consider an eigenvalue λ_1 and an associated (normalized) eigenvector x^1 := (x_11, x_12)^T, i.e.,

    A x^1 = λ_1 x^1  ⟺  ⟨a^1, x^1⟩ = λ_1 x_11,  ⟨a^2, x^1⟩ = λ_1 x_12.


General Symmetric Matrices (Cont’d)

Using the Gram-Schmidt process, take a 2 × 2 orthogonal matrix O_2 = [ y^1  y^2 ], with y^1 := x^1 the given normalized eigenvector.

It will be shown that

    O_2^T A O_2 = [ λ_1   0  ]
                  [  0   λ_2 ].


General Symmetric Matrices (Cont’d)
First, show that

    O_2^T A O_2 = O_2^T [ λ_1 y_11   ⟨a^1, y^2⟩ ] = [ λ_1  b_12 ]
                        [ λ_1 y_12   ⟨a^2, y^2⟩ ]   [  0   b_22 ].

Then, b_12 = 0 using symmetry:

    (O_2^T A O_2)^T = O_2^T A^T O_2 = O_2^T A O_2,

so the matrix on the right is itself symmetric; and b_22 = λ_2 because the eigenvalues are unchanged under the orthogonal transformation O_2.
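A small numerical sketch of this two-step argument (added; the symmetric test matrix is arbitrary): build O_2 from one normalized eigenvector plus a unit vector orthogonal to it, and check that O_2^T A O_2 comes out diagonal.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])           # an arbitrary 2x2 symmetric matrix

lams, vecs = np.linalg.eigh(A)
x1 = vecs[:, 0]                      # a normalized eigenvector for lambda_1

# In dimension 2, completing x1 to an orthonormal basis just means
# rotating it by 90 degrees.
y2 = np.array([-x1[1], x1[0]])
O2 = np.column_stack([x1, y2])

print(np.round(O2.T @ O2, 10))       # identity
print(np.round(O2.T @ A @ O2, 10))   # diag(lambda_1, lambda_2)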
Exercise 1
Try to reduce the real symmetric matrix

    A = [ 1  k ]
        [ k  1 ]

to a diagonal form.
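A solution sketch (added): for any real k, the eigenvalues are 1 + k and 1 − k, with orthonormal eigenvectors (1, 1)^T/√2 and (1, −1)^T/√2, so one may take

    O = (1/√2) [ 1   1 ],   giving   O^T A O = [ 1+k    0  ].
               [ 1  −1 ]                       [  0    1−k ]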


Exercise 2
Define the real bilinear form

    Q(x, y) = y^T A x = Σ_{i,j=1}^n a_ij y_i x_j,   x, y ∈ ℝ^n,

which reduces to the inner product when A = I.

Prove that Q is symmetric, i.e., Q(x, y) = Q(y, x), if and only if A is symmetric.

See the text (Horn & Johnson, 2nd edition, 2013, page 226).
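A proof sketch (added): since y^T A x is a scalar, y^T A x = (y^T A x)^T = x^T A^T y, so

    Q(x, y) − Q(y, x) = x^T (A^T − A) y   for all x, y ∈ ℝ^n.

If A = A^T, the right-hand side vanishes and Q is symmetric. Conversely, if Q is symmetric, choosing x = e_i and y = e_j (standard basis vectors) gives (A^T − A)_{ij} = a_ji − a_ij = 0 for all i, j, i.e., A = A^T.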
Homework #4
1. Does the singular matrix

    A = [ 1  1 ]
        [ 1  1 ]

have two independent eigenvectors?

2. Show that A and A^T have the same eigenvalues.


Homework #4
3. Show by direct calculation, for 2 × 2 matrices A and B, that AB and BA have the same characteristic equation.

4. Can you give two matrices that are reducible to the following canonical diagonal matrix?

    [ 2  0 ]
    [ 0  1 ]

Justify your answer.
