Outline
- Reminder about GCR
  - Residual minimizing solution
  - Krylov Subspace
  - Polynomial Connection
- Preconditioners
  - Diagonal Preconditioners
  - Approximate LU preconditioners
Generalized Conjugate Residual Algorithm
(With Normalization)
    r^0 = b - M x^0
    For j = 0 to k-1
        p_j = r^j                                   (residual is next search direction)
        For i = 0 to j-1                            (orthogonalize search direction)
            p_j = p_j - [ (M p_j)^T (M p_i) ] p_i
        p_j = p_j / || M p_j ||                     (normalize)
        x^{j+1} = x^j + [ (r^j)^T (M p_j) ] p_j     (update solution)
        r^{j+1} = r^j - [ (r^j)^T (M p_j) ] M p_j   (update residual)
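The normalized GCR iteration above can be sketched in NumPy as follows. The function name `gcr` and its parameters are our own; this is a dense, unoptimized sketch of the slide's algorithm, not the lecture's implementation:

```python
import numpy as np

def gcr(M, b, tol=1e-10, max_iter=None):
    """Generalized Conjugate Residual with normalized search directions.

    Each residual seeds the next search direction, which is then
    orthogonalized so that the vectors M p_i stay orthonormal.
    """
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - M @ x
    P, MP = [], []                       # kept search directions and their M-images
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        p, Mp = r.copy(), M @ r
        for pi, Mpi in zip(P, MP):       # orthogonalize: make (M p)^T (M p_i) = 0
            beta = Mp @ Mpi
            p -= beta * pi
            Mp -= beta * Mpi
        nrm = np.linalg.norm(Mp)         # normalize so ||M p|| = 1
        p, Mp = p / nrm, Mp / nrm
        alpha = r @ Mp                   # step length minimizing ||r - alpha M p||_2
        x += alpha * p
        r -= alpha * Mp
        P.append(p)
        MP.append(Mp)
    return x
```

In exact arithmetic the loop terminates with a zero residual after at most n passes, matching the convergence observation later in the slides.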
Generalized Conjugate Residual Algorithm
Algorithm Steps by Picture (With Normalization)

[Figure: residuals r^0, r^1, r^2, r^3, ..., r^k paired with search directions p_0, p_1, p_2, p_3, ..., p_k; step 1) orthogonalize the M r^i 's]
Generalized Conjugate Residual Algorithm

First search direction (x^0 = 0):

    r^0 = b - M x^0 = b,    p_0 = r^0 / || M r^0 ||

Residual minimizing solution:

    x^1 = [ (r^0)^T (M p_0) ] p_0

Second search direction:

    r^1 = b - M x^1 = r^0 - α_1 M r^0

    p_1 = ( r^1 - β_{1,0} p_0 ) / || M ( r^1 - β_{1,0} p_0 ) ||
Generalized Conjugate Residual Algorithm

Residual minimizing solution:

    x^2 = x^1 + [ (r^1)^T (M p_1) ] p_1

    r^2 = b - M x^2 = r^0 - α_{2,1} M r^0 - α_{2,0} M^2 r^0

Third search direction:

    p_2 = ( r^2 - β_{2,0} p_0 - β_{2,1} p_1 ) / || M ( r^2 - β_{2,0} p_0 - β_{2,1} p_1 ) ||
Generalized Conjugate Residual Algorithm

Orthogonalize and normalize the search direction:

    p_k = r^k - Σ_{j=0}^{k-1} [ (M r^k)^T (M p_j) ] p_j

    p_k = p_k / || M p_k ||

Update the solution and residual:

    α_k = (r^k)^T (M p_k)
    x^{k+1} = x^k + α_k p_k       (update solution)
    r^{k+1} = r^k - α_k M p_k     (update residual)
Krylov Methods
Residual Minimization: Polynomial View

1) x^{k+1} ∈ span{ r^0, M r^0, ..., M^k r^0 } minimizes || r^{k+1} ||_2^2
2) therefore r^{k+1} = Ψ_{k+1}(M) r^0, where Ψ_{k+1} is a (k+1)-th order polynomial with Ψ_{k+1}(0) = 1
Krylov Methods
Nodal Formulation: No-leak Example (Insulated Bar and Matrix)

[Figure: incoming heat applied to a bar; T(0) = near end temperature, T(1) = far end temperature; discretization into m nodes]

Nodal equation form:

    [  2   -1                ]
    [ -1    2   -1           ]
    [        .   .   .       ]
    [            -1    2     ]

SMA-HPC 2003 MIT
Krylov Methods
Nodal Formulation: Leaky Example (Conducting Bar and Matrix)

[Figure: bar that leaks heat along its length; T(0) = near end temperature, T(1) = far end temperature; discretization into m nodes]

Nodal equation form:

    [  2.01   -1                   ]
    [ -1       2.01   -1           ]
    [           .   .   .          ]
    [               -1      2.01   ]
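A quick numerical check of why the two bars behave so differently: the leak shifts the diagonal from 2 to 2.01, which bounds the smallest eigenvalue away from zero and shrinks the ratio λ_max/λ_min. A NumPy sketch (the size m is chosen arbitrarily):

```python
import numpy as np

def bar_matrix(m, diag):
    """Tridiagonal nodal matrix: `diag` on the diagonal, -1 off-diagonal."""
    return (np.diag(np.full(m, diag))
            + np.diag(np.full(m - 1, -1.0), 1)
            + np.diag(np.full(m - 1, -1.0), -1))

m = 100                          # arbitrary discretization size
insulated = bar_matrix(m, 2.0)   # no-leak bar
leaky = bar_matrix(m, 2.01)      # bar that leaks heat to the surroundings

for name, A in (("insulated", insulated), ("leaky", leaky)):
    lam = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
    print(f"{name}: lam_min={lam[0]:.5f} lam_max={lam[-1]:.3f} "
          f"ratio={lam[-1] / lam[0]:.0f}")
```

The much smaller eigenvalue ratio for the leaky bar is what makes its residual curve in the plots below fall so much faster.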
[Plot: GCR residual (log scale, 10^-1 down to 10^-4) versus iteration (0 to 60) for the insulating and leaky bars]

[Plot: GCR residual (log scale, 10^-1 down to 10^-5) versus iteration (10 to 50) for the insulating and leaky bars]
Krylov Methods
Residual Minimization: Optimality of the Polynomial

Therefore, any polynomial which satisfies the constraints can be used to get an upper bound on || r^{k+1} || / || r^0 ||.
Induced Norms
Matrix Magnification

Question: Suppose y = Mx. How much larger is || y || than || x ||?
Vector Norm Review
Induced Norms

L2 (Euclidean) norm:  || x ||_2 = sqrt( Σ_{i=1}^n |x_i|^2 )    [unit ball || x ||_2 < 1 pictured]
L1 norm:              || x ||_1 = Σ_{i=1}^n |x_i|              [unit ball || x ||_1 < 1 pictured]
L∞ norm:              || x ||_∞ = max_i |x_i|                  [unit ball || x ||_∞ < 1 pictured]
Induced Matrix Norms
Standard Induced l-norms

Definition:

    || M ||_l = max_{|| x ||_l = 1} || M x ||_l

Examples:

    || M ||_1 = max_j Σ_{i=1}^n | M_ij |     (max column sum)
    || M ||_∞ = max_i Σ_{j=1}^n | M_ij |     (max row sum)
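The max-column-sum and max-row-sum formulas can be checked against NumPy's built-in induced norms; a small sketch with an arbitrary matrix:

```python
import numpy as np

M = np.array([[1.0, -2.0],
              [3.0,  4.0]])

col_sum = np.abs(M).sum(axis=0).max()   # max absolute column sum
row_sum = np.abs(M).sum(axis=1).max()   # max absolute row sum

print(col_sum, np.linalg.norm(M, 1))       # induced 1-norm
print(row_sum, np.linalg.norm(M, np.inf))  # induced infinity-norm
```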
Induced Matrix Norms
Standard Induced l-norms (continued)

    || M ||_1 = max_j Σ_{i=1}^n | M_ij |

Why? Let x = [0 ... 0 1 0 ... 0]^T, with the 1 in the position of the maximizing column; then || x ||_1 = 1 and || M x ||_1 equals that column's absolute sum.

    || M ||_∞ = max_i Σ_{j=1}^n | M_ij |

Why? Let x = [±1 ±1 ... ±1]^T, with the signs chosen to match the maximizing row; then || x ||_∞ = 1 and the corresponding entry of M x equals that row's absolute sum.
As the algebra on the slide shows, the relative change in the solution x is bounded by an A-dependent factor times the relative change in A. The factor

    || A^{-1} || || A ||

was historically referred to as the condition number of A, but that definition has been abandoned because it makes the condition number norm-dependent. Instead, the condition number of A is defined as the ratio of the extreme singular values of A:

    cond(A) = σ_max(A) / σ_min(A)

Singular values are outside the scope of this course; consider consulting Trefethen & Bau.
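A small NumPy sketch contrasting the norm-dependent factor ||A^{-1}|| ||A|| with the singular-value definition (the matrix here is arbitrary):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.01]])

# Norm-dependent factor ||A^{-1}|| ||A||, in two different induced norms:
cond_1 = np.linalg.norm(A, 1) * np.linalg.norm(np.linalg.inv(A), 1)
cond_inf = np.linalg.norm(A, np.inf) * np.linalg.norm(np.linalg.inv(A), np.inf)

# Singular-value definition, which is what np.linalg.cond returns by default:
sigma = np.linalg.svd(A, compute_uv=False)
cond_sv = sigma.max() / sigma.min()

print(cond_1, cond_inf, cond_sv)
```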
Useful Eigenproperties
Spectral Mapping Theorem

Given a polynomial

    f(x) = a_0 + a_1 x + ... + a_p x^p

applied to a matrix M,

    f(M) = a_0 I + a_1 M + ... + a_p M^p

Then

    spectrum( f(M) ) = f( spectrum(M) )
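The theorem is easy to verify numerically; a sketch with an arbitrary symmetric M and the polynomial f(x) = 2 - 3x + x^2:

```python
import numpy as np

# Arbitrary symmetric matrix and polynomial f(x) = 2 - 3x + x^2.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def f_scalar(x):
    return 2.0 - 3.0 * x + x ** 2

# Note the constant term multiplies the identity in the matrix version.
f_M = 2.0 * np.eye(2) - 3.0 * M + M @ M

lhs = np.sort(np.linalg.eigvalsh(f_M))          # spectrum(f(M))
rhs = np.sort(f_scalar(np.linalg.eigvalsh(M)))  # f(spectrum(M))
print(lhs, rhs)
```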
Krylov Methods
Convergence Analysis: Norm of Matrix Polynomials

Let U = [ u_1 ... u_N ] be the matrix of eigenvectors of M. Then

    Ψ_k(M) = U diag( Ψ_k(λ_1), ..., Ψ_k(λ_N) ) U^{-1}

and cond(U) = || U || || U^{-1} || is the condition number of M's eigenspace.
Krylov Methods
Convergence Analysis: Norm of Matrix Polynomials (continued)

    || diag( Ψ_k(λ_1), ..., Ψ_k(λ_N) ) ||_2 = max_{|| x ||_2 = 1} || diag( Ψ_k(λ_i) ) x ||_2 = max_i | Ψ_k(λ_i) |

Therefore

    || Ψ_k(M) || ≤ cond(U) max_i | Ψ_k(λ_i) |
Krylov Methods
Convergence Analysis: Important Observations

A polynomial of order n with a zero at every eigenvalue gives Ψ_n(M) = 0, and therefore r^n = 0: the method converges in at most n iterations.
Krylov Methods
Residual Polynomial

If M = M^T then:

1) M has orthonormal eigenvectors, so the eigenvector matrix U = [ u_1 ... u_N ] satisfies

    cond(U) = || U || || U^{-1} || = 1

2) and therefore

    || Ψ_k(M) || = max_i | Ψ_k(λ_i) |
[Plot: residual polynomial values on the spectrum; * = eigenvalues of M, one curve a 5th-order polynomial, the other an 8th-order polynomial]
Krylov Methods
Chebychev Polynomials

Choose the residual polynomial from scaled and shifted Chebychev polynomials:

    Ψ_k(x) = C_k( 1 + 2 (λ_min - x) / (λ_max - λ_min) ) / C_k( 1 + 2 λ_min / (λ_max - λ_min) )
Krylov Methods
Chebychev Bounds

    max_{x ∈ [λ_min, λ_max]} | Ψ_k(x) | ≤ 1 / C_k( 1 + 2 λ_min / (λ_max - λ_min) )
                                        ≤ 2 ( ( sqrt(λ_max/λ_min) - 1 ) / ( sqrt(λ_max/λ_min) + 1 ) )^k
Krylov Methods
Chebychev Result

    || r^k || / || r^0 || ≤ 2 ( ( sqrt(λ_max/λ_min) - 1 ) / ( sqrt(λ_max/λ_min) + 1 ) )^k
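The result can be turned into a quick convergence estimate. The sketch below (our own helper name `chebyshev_bound`, spectra chosen to resemble the leaky and insulating bars) evaluates the bound for a well-conditioned and an ill-conditioned spectrum:

```python
import math

def chebyshev_bound(lam_max, lam_min, k):
    """Chebychev upper bound on ||r^k|| / ||r^0|| for a spectrum in [lam_min, lam_max]."""
    root_kappa = math.sqrt(lam_max / lam_min)
    return 2.0 * ((root_kappa - 1.0) / (root_kappa + 1.0)) ** k

# Leaky-bar-like spectrum (lam_min bounded away from 0) vs. insulating-bar-like:
for lam_min in (0.01, 0.0001):
    print(lam_min, [chebyshev_bound(4.0, lam_min, k) for k in (10, 50)])
```

The smaller the eigenvalue ratio, the faster the guaranteed decay, which is why preconditioning (next) pays off.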
Preconditioning Krylov Methods
Diagonal Example

[Figure: diagonal example matrices, with entries such as 1, ..., 1, 2 on the diagonal and zeros elsewhere]
Preconditioning Krylov Methods
Diagonal Preconditioners

Let M = D + M_nd (diagonal plus non-diagonal part).

Apply GCR to

    D^{-1} M x = ( I + D^{-1} M_nd ) x = D^{-1} b

- The inverse of a diagonal matrix is cheap to compute.
- It usually improves convergence.
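A sketch of diagonal (Jacobi) preconditioning, with hypothetical bar parameters in the spirit of the heat-conducting-bar example below: one segment is made far more conductive than the rest, so the diagonal of M varies widely, which is exactly when D^{-1} scaling helps:

```python
import numpy as np

# Heat-flow bar in which one interior segment is far more conductive than
# the rest (hypothetical parameters): segment conductances c_i, one set to 100.
m = 50
c = np.ones(m + 1)
c[m // 2] = 100.0
M = np.zeros((m, m))
for i in range(m):
    M[i, i] = c[i] + c[i + 1]          # nodal equation: diagonal = sum of conductances
    if i > 0:
        M[i, i - 1] = -c[i]
    if i < m - 1:
        M[i, i + 1] = -c[i + 1]
b = np.ones(m)

# Diagonal (Jacobi) preconditioning: GCR would be run on D^{-1} M x = D^{-1} b.
d = np.diag(M)                         # inverting a diagonal is just elementwise division
Mp, bp = M / d[:, None], b / d

print("cond(M)      =", np.linalg.cond(M))
print("cond(D^-1 M) =", np.linalg.cond(Mp))
```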
Heat Conducting Bar Example

[Figure: bar discretized at x_1, x_2, ..., x_i, x_{i+1}, ..., x_n, with one small Δx: one segment roughly 100 times more conductive than the rest]

Discretized system (entries shown schematically):

    [  2    -1                              ] [ u_1 ]   [ f(x_1) ]
    [ -1     2     -1                       ] [  .  ]   [   .    ]
    [       -1    1+100   -100              ] [  .  ] = [   .    ]
    [             -100    1+100   -1        ] [  .  ]   [   .    ]
    [                      -1      2    ... ] [  .  ]   [   .    ]
    [                             ...    2  ] [ u_n ]   [ f(x_n) ]

    λ_max / λ_min > 100
[Plot: || r^k || / || r^0 || versus iteration for GCR on the heat conducting bar]
Heat Conducting Bar Example
Preconditioned Matrix Eigenvalues

[Plot: eigenvalues of the preconditioned matrix]

A residual minimizing Krylov-subspace algorithm can eliminate outlying eigenvalues by placing polynomial zeros directly on them.
GCR cost comparison (grid with m points per dimension):

    1-D:  dense GE O(m^3),   sparse GE O(m),     GCR O(m^2)
    2-D:  dense GE O(m^6),   sparse GE O(m^3),   GCR O(m^3)
    3-D:  dense GE O(m^9),   sparse GE O(m^6),   GCR O(m^4)
Preconditioning Krylov Methods
Approximate LU Preconditioners

Let M ≈ LU, an approximate factorization with lower and upper triangular factors L and U.

Applying GCR to

    ( LU )^{-1} M x = ( LU )^{-1} b

is practical because computing ( (LU)^{-1} M ) x is equivalent to solving LU y = M x, i.e. one forward and one backward triangular solve per matrix-vector product.
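The "apply via triangular solves" idea can be sketched as follows. The helper names `lu_nopivot`, `forward_sub`, `backward_sub`, and `apply_preconditioned` are our own, and for clarity the factorization here is exact (an approximate LU would simply drop small entries from the factors, and the two triangular solves would be applied the same way):

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU factorization without pivoting (fine for this SPD example)."""
    n = len(A)
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, :] -= L[i, k] * U[k, :]
    return L, U

def forward_sub(L, b):
    """Solve L y = b with L lower triangular."""
    y = np.zeros_like(b, dtype=float)
    for i in range(len(b)):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backward_sub(U, y):
    """Solve U x = y with U upper triangular."""
    x = np.zeros_like(y, dtype=float)
    for i in reversed(range(len(y))):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

def apply_preconditioned(L, U, M, v):
    """Compute ((LU)^{-1} M) v by solving L U y = M v: one forward, one backward solve."""
    return backward_sub(U, forward_sub(L, M @ v))

# Tridiagonal "leaky bar" matrix
m = 6
M = (np.diag(np.full(m, 2.01))
     + np.diag(np.full(m - 1, -1.0), 1)
     + np.diag(np.full(m - 1, -1.0), -1))
L, U = lu_nopivot(M)

v = np.ones(m)
z = apply_preconditioned(L, U, M, v)   # exact factors: (LU)^{-1} M = I, so z == v
print(z)
```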
Summary
- Reminder about GCR
  - Residual minimizing solution
  - Krylov Subspace
  - Polynomial Connection
- Preconditioners
  - Diagonal Preconditioners
  - Approximate LU preconditioners