Midterm 2 Flashcards
invertible matrix
an nxn matrix A for which there is an nxn matrix A^-1 with AA^-1 = A^-1A = I
inverse of matrix
A^-1 where AA^-1 = I
singular
not invertible
determinant of a 2x2 matrix
ad - bc (for A = [a b; c d])
elementary matrix (E)
matrix obtained by performing ONE elementary row operation on an IDENTITY matrix
ALL elementary matrices are invertible because all row operations are reversible
– the inverse of an elementary matrix is another elementary matrix, the one that undoes the row operation and turns E back into I
Row Equivalent Matrices
matrices that can be transformed into one another through a sequence of elementary row operations
nonsingular vs singular
invertible vs not invertible
inverse of a 2x2 matrix
original matrix: A = [ a  b ]
                     [ c  d ]
A^-1 = (1/(ad - bc)) * [  d  -b ]
                       [ -c   a ]
if the determinant is 0, then A is NOT invertible because we can't make an inverse (we'd be dividing by 0)!!
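A quick numpy check of the formula (the entries of A here are arbitrary, just for illustration):

import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
a, b, c, d = A.ravel()
det = a * d - b * c                      # ad - bc
A_inv = (1.0 / det) * np.array([[ d, -b],
                                [-c,  a]])
print(A @ A_inv)                         # prints the 2x2 identity (up to rounding)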
Ax = b can be rewritten using inverses…
IF AND ONLY IF A is invertible
Ax = b
A^-1Ax = A^-1b
x = A^-1b
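A small sketch of this in numpy (the 3x3 system below is made up just to illustrate; np.linalg.solve uses the row-reduction approach and is usually preferred over forming A^-1):

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x_via_inverse = np.linalg.inv(A) @ b     # x = A^-1 b
x_via_solve   = np.linalg.solve(A, b)    # solve Ax = b directly
print(np.allclose(x_via_inverse, x_via_solve))   # True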
the row reduction method is usually easier when it comes to finding the inverse of bigger matrices!
For every b in Rn, x = A^-1b is the unique solution of Ax = b
- invertible matrices have NO free variables
- so the solution has to be unique
the product of nxn invertible matrices IS
INVERTIBLE
inverse of product is…
the product of the inverses in reverse order
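A quick numerical check of (AB)^-1 = B^-1 A^-1, using arbitrary invertible 2x2 matrices:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))   # True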
an elementary row operation performed on an mxn matrix A - the resulting matrix can be written as EA - what about multiple elementary row operations?
Ek…E2E1A
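For example (the matrix A and the row operation here are arbitrary), building E by doing one replacement on the identity and then multiplying on the left performs that operation on A:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
E = np.eye(3)
E[2, 0] = -5.0           # replacement R3 -> R3 - 5*R1, applied to the identity
print(E @ A)             # same as doing that row operation directly on A
print(np.linalg.inv(E))  # the inverse is the elementary matrix with +5 instead of -5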
method to find the inverse
row reduce A to the identity matrix while performing the same row operations on the identity matrix at the same time
- [A | I] => [I | A^-1]
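A bare-bones sketch of this algorithm in numpy (the function name is mine; it assumes a square matrix and uses partial pivoting, i.e. row interchanges, to keep the arithmetic stable):

import numpy as np

def inverse_by_row_reduction(A):
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # the augmented matrix [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular (no pivot in this column)")
        M[[col, pivot]] = M[[pivot, col]]          # interchange
        M[col] /= M[col, col]                      # scale the pivot row to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]         # replacement: clear the column
    return M[:, n:]                                # right half is now A^-1

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(inverse_by_row_reduction(A))                 # matches np.linalg.inv(A)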
matrix is invertible if and only if
A is ROW equivalent to the identity matrix == A has a pivot in every row and column == the transformation x |-> Ax is onto and one-to-one (remember these are square matrices)
linear transformations
a mapping between two vector spaces (Rn to Rm) that preserves vector addition and scalar multiplication: T(u + v) = T(u) + T(v) and T(cu) = cT(u)
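A tiny numpy check of those two properties for T(x) = Ax (the matrix, vectors, and scalar are arbitrary):

import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
u, v, c = np.array([1.0, 2.0]), np.array([3.0, -1.0]), 4.0
print(np.allclose(A @ (u + v), A @ u + A @ v))   # T(u + v) = T(u) + T(v)
print(np.allclose(A @ (c * u), c * (A @ u)))     # T(cu) = cT(u)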
invertible linear transformations
T: Rn -> Rn (so its standard matrix is a square nxn matrix)
if there is another linear transformation S: Rn -> Rn
WHERE
S(T(x)) = x for all x in Rn
T(S(x)) = x for all x in Rn
equivalent to saying that
A^-1Ax = Ix
Invertible Matrix Theorem (18) given A is a square nxn matrix, then the following statements are ALL equivalent
a) A is an invertible matrix
b) the columns of A form a linearly independent set
c) the columns of A span all of Rn
d) the transformation T: Rn -> Rn defined by T(x) = Ax OR the linear transformation x |-> Ax is one-to-one
e) the transformation T: Rn -> Rn defined by T(x) = Ax is onto
OR the linear transformation x |-> Ax maps Rn onto Rn
f) A has n pivot positions
g) A is row equivalent to the n x n identity matrix
h) Ax = 0 only has the trivial solution
i) the equation Ax = b has at least one solution for each b in Rn (in fact exactly one solution for each b, since the columns of A are linearly independent)
j) There is an nxn matrix C such that CA = I
k) there is an nxn matrix D such that AD = I
l) A^T is an invertible matrix
m) columns of A form a basis of Rn
n) Col A = Rn
o) dim Col A = n
p) rank A = n
q) Nul A = {0}
r) dim Nul A = 0
s) the number 0 is not an eigenvalue of A
t) the determinant of A is not 0
Let A and B be square matrices: if AB = I, then…
A and B are both invertible
B = A^-1 & A = B^-1
how many inverses can a matrix have
ONE - inverses of matrices are unique
How to determine if a linear transformation is invertible?
Let a matrix A represent the linear transformation
- if A is invertible, then the linear transformation is invertible!!
reflection through the y-axis is invertible, but a projection is NOT (it collapses vectors, so it isn't one-to-one)
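For instance, comparing the standard matrices (using projection onto the x-axis as the example projection):

import numpy as np

reflect = np.array([[-1.0, 0.0],
                    [ 0.0, 1.0]])   # reflection through the y-axis
project = np.array([[ 1.0, 0.0],
                    [ 0.0, 0.0]])   # projection onto the x-axis
print(np.linalg.det(reflect))       # -1.0: nonzero, so it is invertible
print(np.linalg.det(project))       #  0.0: singular, so it is NOT invertible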
Adding 2 partitioned matrices A and B
A and B must be the same size, partitioned in the exact same way
add block by block
scaling partitioned matrices
scale block by block
multiplying 2 partitioned matrices A and B
the column partition of A must match the row partition of B
OR: the way the columns of A are split into blocks = the way the rows of B are split into blocks
- multiply like regular matrices
so, counting blocks, a 2x3 partitioned matrix times a 3x1 partitioned matrix gives a 2x1 partitioned matrix
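A quick numpy check that block multiplication agrees with ordinary multiplication (the block sizes below are an arbitrary choice; the column partition of A, 2 + 1, matches the row partition of B):

import numpy as np

A11, A12 = np.random.rand(2, 2), np.random.rand(2, 1)
A21, A22 = np.random.rand(1, 2), np.random.rand(1, 1)
B11, B12 = np.random.rand(2, 3), np.random.rand(2, 1)
B21, B22 = np.random.rand(1, 3), np.random.rand(1, 1)

A = np.block([[A11, A12], [A21, A22]])
B = np.block([[B11, B12], [B21, B22]])

# multiply "like regular matrices", block by block
C11 = A11 @ B11 + A12 @ B21
C12 = A11 @ B12 + A12 @ B22
C21 = A21 @ B11 + A22 @ B21
C22 = A21 @ B12 + A22 @ B22
print(np.allclose(np.block([[C11, C12], [C21, C22]]), A @ B))   # True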
Inverses of Partitioned Matrices
[ A  B ][ X  Y ]   [ In  0  ]
[ 0  C ][ Z  W ] = [ 0   In ]
AX + BZ = In
AY + BW = 0
CZ = 0
CW = In
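Solving those four equations (assuming the blocks A and C are themselves invertible) gives Z = 0, X = A^-1, W = C^-1, and Y = -A^-1 B C^-1. A quick numerical check with arbitrary 2x2 blocks:

import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 0.0], [3.0, 1.0]])
C = np.array([[1.0, 1.0], [0.0, 2.0]])
Z = np.zeros((2, 2))

M = np.block([[A, B], [Z, C]])
X = np.linalg.inv(A)
W = np.linalg.inv(C)
Y = -np.linalg.inv(A) @ B @ np.linalg.inv(C)
M_inv = np.block([[X, Y], [Z, W]])
print(np.allclose(M @ M_inv, np.eye(4)))   # True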
Factorization of a matrix
expressing a matrix as the product of two or more matrices
row interchanges
swapping rows when row reducing
lower triangular matrix
entries above the main diagonal are all 0s
upper triangular matrix
entries below the main diagonal are all 0s
Algorithm for LU Factorization
- Reduce A to echelon form U by a sequence of row replacement operations, if possible
- Place entries in L such that the same sequence of row operations reduces L to I
Why do we use LU Factorization
more efficient to solve a sequence of equations with the same coefficient matrix by LU factorization rather than row reducing the equations every single time
Let A be an mxn matrix that can be row reduced to echelon form WITHOUT row exchanges (i.e., row interchanges) - what are L and U?
L is an mxm lower triangular matrix with 1s on the main diagonal
U is an mxn matrix that is an echelon form of A
Rewriting Ax = b using A = LU
Ax = b -> L(Ux) = b
set y = Ux
then solve Ly = b for y, and Ux = y for x
How do we get U?
Row reduce A to echelon form using only row replacements that add a multiple of one row to another BELOW it
Condition for LU Factorization
A can be row reduced to echelon form using only row replacements (no row exchanges)
How do we get L?
take the row replacement operations you did on A when getting the echelon form (find the elementary matrices that transform A into U)
and reverse the signs and input them in their respective spots on the mxm identity matrix
– replacing the 0s of identity matrix with the row replacement “coefficients”
– basically: after finding all the elementary row matrices, take their inverses
Ep…E1A = U
A = (Ep…E1)^-1U = LU
L = (Ep…E1)^-1
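A small sketch of this in numpy (assumes a square matrix whose pivots never land on 0, so no row exchanges are needed; the function name is mine):

import numpy as np

def lu_no_exchanges(A):
    A = A.astype(float)
    m = A.shape[0]
    L, U = np.eye(m), A.copy()
    for col in range(m - 1):
        for row in range(col + 1, m):
            mult = U[row, col] / U[col, col]   # assumes the pivot is nonzero
            U[row] -= mult * U[col]            # replacement: R_row -> R_row - mult*R_col
            L[row, col] = mult                 # the reversed-sign coefficient goes into L
    return L, U

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])
L, U = lu_no_exchanges(A)
print(np.allclose(L @ U, A))   # True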
Using LU Decomposition
- Forward solve for y in Ly = b
– modify rows below using the rows above
- Backward solve for x in Ux = y
– modify rows above using the rows below
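A minimal forward/backward substitution sketch (the L, U, b values below are arbitrary; the helper name is mine):

import numpy as np

def solve_with_lu(L, U, b):
    m = L.shape[0]
    y = np.zeros(m)
    for i in range(m):                 # forward solve Ly = b, top to bottom
        y[i] = b[i] - L[i, :i] @ y[:i]
    x = np.zeros(m)
    for i in range(m - 1, -1, -1):     # backward solve Ux = y, bottom to top
        x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

L = np.array([[1.0, 0.0], [3.0, 1.0]])
U = np.array([[2.0, 1.0], [0.0, 4.0]])
b = np.array([3.0, 13.0])
print(solve_with_lu(L, U, b))          # same as np.linalg.solve(L @ U, b)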
Subset of Rn
any collection of vectors that are in Rn
Subspace of Rn
A subset H of Rn that has 3 properties:
- the zero vector is in H
- u + v is in H for all u, v in H (closed under addition)
- cu is in H for every u in H and every scalar c (closed under scalar multiplication)
a SUBSPACE can be written as the Span{} of some set of vectors - the spanning vectors don't have to be linearly independent, but you can always choose a linearly independent spanning set (that's a basis)
Column Space of a Matrix A (mxn)
Col A
the subspace of Rm spanned by the columns {a1…an}
– the pivot columns of A form a basis for Col A
Null Space of a Matrix A (mxn)
Nul A
the subspace of Rn consisting of all vectors x that solve Ax = 0
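A small sympy example (the matrix is arbitrary) showing the pivot columns as a basis for Col A and a basis for Nul A:

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

rref, pivot_cols = A.rref()
print(pivot_cols)                       # indices of the pivot columns
print([A.col(j) for j in pivot_cols])   # those columns of A form a basis for Col A
print(A.nullspace())                    # basis vectors for Nul A (solutions of Ax = 0)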
Basis for a Subspace H of Rn
A linearly independent set in H that spans H
– DOES NOT contain the zero vector (because it is linearly independent), unlike the subspace H itself, which always contains 0