Lecture 12 Flashcards
LU factorization: block-format decomposition
Block partition:
A = [ a11  a12 ]
    [ a21  A22 ]
where a11 is a scalar, a12 is (1, n-1), a21 is (n-1, 1), and A22 is (n-1, n-1).
LU decomposition:
[ 1    0   ] . [ u11  u12 ]   =   [ u11       u12               ]
[ l21  L22 ]   [ 0    U22 ]       [ u11.l21   l21.u12 + L22.U22 ]
so
u11 = a11
u12 = a12
l21 = a21/u11
L22.U22 = A22 - l21.u12, which is another LU factorization, but with a smaller dimension.
Do it again on the smaller block until L22.U22 reduces to (1,1) (see the sketch below).
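A minimal sketch of this recursion in NumPy (my own code, not from the lecture; no pivoting, so it breaks down if a zero pivot u11 appears):

import numpy as np

def block_lu(A):
    # Block-format LU without pivoting: returns L (unit lower triangular)
    # and U (upper triangular) with A = L @ U.
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = np.zeros((n, n))
    S = A.copy()                            # current trailing block
    for k in range(n):
        u11 = S[0, 0]                       # pivot; fails if u11 == 0
        u12 = S[0, 1:]
        l21 = S[1:, 0] / u11
        U[k, k], U[k, k+1:] = u11, u12      # one row of U per step
        L[k+1:, k] = l21                    # one column of L per step
        S = S[1:, 1:] - np.outer(l21, u12)  # smaller problem: A22 - l21.u12
    return L, U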
Example LU factorization using block format:
2 8 4 1
1 2 3 3
1 2 6 2
1 3 4 2
L starts as the identity (1s on the diagonal, 0s above it); U starts with 0s below the diagonal. Each step fills one row of U and one column of L.
1st step: U row 1 = [2, 8, 4, 1], L column 1 = [1, 0.5, 0.5, 0.5]^T, remaining block A22 - l21.u12 = [[-2, 1, 2.5], [-2, 4, 1.5], [-1, 2, 1.5]]
2nd step: U row 2 = [0, -2, 1, 2.5], L column 2 = [0, 1, 1, 0.5]^T, remaining block = [[3, -1], [1.5, 0.25]]
3rd step: U row 3 = [0, 0, 3, -1], L column 3 = [0, 0, 1, 0.5]^T, remaining block = 0.75, which becomes the last diagonal entry: U row 4 = [0, 0, 0, 0.75]
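To sanity-check the example, assemble L and U from the columns and rows above and verify that L.U reproduces A (my own check, not part of the lecture):

import numpy as np

A = np.array([[2, 8, 4, 1],
              [1, 2, 3, 3],
              [1, 2, 6, 2],
              [1, 3, 4, 2]], dtype=float)
L = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 1.0, 0.0, 0.0],
              [0.5, 1.0, 1.0, 0.0],
              [0.5, 0.5, 0.5, 1.0]])
U = np.array([[2.0,  8.0, 4.0,  1.0],
              [0.0, -2.0, 1.0,  2.5],
              [0.0,  0.0, 3.0, -1.0],
              [0.0,  0.0, 0.0,  0.75]])
print(np.allclose(L @ U, A))   # True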
LU factorization: complexity and properties
O(n^3)
The algorithm may fail, even if A is invertible
A (100,100) matrix takes about 2 seconds to LU factorize; how long does a (1000,1000) matrix take?
2000 seconds: the cost scales as n^3, so 10x the dimension means 10^3 = 1000x the time, and 2 s x 1000 = 2000 s.
What’s wrong with the block-format LU decomposition? How to deal with that?
Division by zero, when the pivot u11 = 0. Fix: swap rows (find the largest entry in the pivot column by absolute value and swap its row to the top).
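A tiny illustration of the row swap (my own toy example): pick the row with the largest absolute value in the pivot column and move it to the top before dividing.

import numpy as np

S = np.array([[0.0, 2.0],
              [4.0, 1.0]])           # zero pivot at S[0, 0]
p = np.argmax(np.abs(S[:, 0]))       # row with the largest entry in column 0
S[[0, p]] = S[[p, 0]]                # swap it to the top; now S[0, 0] == 4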
LU factorization with partial pivoting
A = PLU
- The PLU decomposition always exists for any square matrix A, but it is not unique
- More robust than plain LU, at approximately the same cost: O(n^3) overall, with only O(n^2) extra work for the pivot searches (see the sketch below)
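A sketch of the whole factorization with partial pivoting (my own code, building on the row-swap idea above); it returns P, L, U with A = P @ L @ U, the same convention as the example and the SciPy call below:

import numpy as np

def plu(A):
    # LU with partial pivoting: A = P @ L @ U.
    A = np.array(A, dtype=float)
    n = A.shape[0]
    U, L, perm = A.copy(), np.eye(n), np.arange(n)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(U[k:, k]))   # largest pivot in column k
        if p != k:
            U[[k, p]] = U[[p, k]]             # swap rows of U,
            perm[[k, p]] = perm[[p, k]]       # of the row permutation,
            L[[k, p], :k] = L[[p, k], :k]     # and of the finished part of L
        l = U[k+1:, k] / U[k, k]              # can still fail if the whole column is zero (A singular)
        L[k+1:, k] = l
        U[k+1:, k:] -= np.outer(l, U[k, k:])  # eliminate below the pivot
    P = np.eye(n)[:, perm]                    # L @ U == A[perm, :], so A == P @ L @ U
    return P, L, U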
LU factorization with partial pivoting:
2 8 4 1
1 2 3 3
1 2 3 2
1 3 4 2
L =
1  0  0  0
.5 1  0  0
.5 .5 1  0
.5 1  0  1
U =
2  8  4   1
0 -2  1   2.5
0  0  1.5 .25
0  0  0  -1
P =
1 0 0 0
0 1 0 0
0 0 0 1
0 0 1 0
A = PLU
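A quick NumPy check (mine, not from the lecture) that these factors reproduce A:

import numpy as np

A = np.array([[2, 8, 4, 1],
              [1, 2, 3, 3],
              [1, 2, 3, 2],
              [1, 3, 4, 2]], dtype=float)
P = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 1.0, 0.0, 0.0],
              [0.5, 0.5, 1.0, 0.0],
              [0.5, 1.0, 0.0, 1.0]])
U = np.array([[2.0,  8.0, 4.0,  1.0],
              [0.0, -2.0, 1.0,  2.5],
              [0.0,  0.0, 1.5,  0.25],
              [0.0,  0.0, 0.0, -1.0]])
print(np.allclose(P @ L @ U, A))   # True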
Gaussian elimination?
LU factorization performs exactly that: U is the reduced matrix produced by the elimination, and L stores the row multipliers used along the way.
Outer product of u and v
u v^T (a column vector times a row vector, giving a matrix); in NumPy: np.outer(u, v)
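For example (my own snippet): with 1-D NumPy arrays, np.outer (or broadcasting) gives the (n, m) outer product directly.

import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5])
print(np.outer(u, v))              # [[4 5], [8 10], [12 15]]
print(u[:, None] * v[None, :])     # same result via broadcasting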
Solving a linear system using the PLU decomposition
A = PLU, so PLUx = b.
- Solve Ly = P^T b for y (forward substitution).
- Solve Ux = y for x (back substitution).
- The algorithm may fail; in particular, if A is singular, U will have zeros on its diagonal.
- O(n^3) overall: the factorization dominates, each triangular solve is only O(n^2).
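A sketch of the two triangular solves with SciPy (my own code; assumes P, L, U come from scipy.linalg.lu, so A = P @ L @ U):

import numpy as np
from scipy.linalg import lu, solve_triangular

A = np.array([[2., 8., 4., 1.],
              [1., 2., 3., 3.],
              [1., 2., 3., 2.],
              [1., 3., 4., 2.]])
b = np.array([1., 2., 3., 4.])

P, L, U = lu(A)                                  # A = P @ L @ U
y = solve_triangular(L, P.T @ b, lower=True)     # forward substitution: Ly = P^T b
x = solve_triangular(U, y, lower=False)          # back substitution:    Ux = y
print(np.allclose(A @ x, b))                     # True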
Python PLU
from scipy.linalg import lu
P, L, U = lu(A)   # returns factors with A = P @ L @ U
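Usage note (mine): the returned factors satisfy A = P @ L @ U, which can be checked with np.allclose(A, P @ L @ U); for solving systems, scipy.linalg.lu_factor and scipy.linalg.lu_solve provide a more compact interface.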