Midterm 2 Flashcards
Conditions for a subspace W of a vector space V
- W is nonempty (0 ∈ W)
- u + v ∈ W for all u, v ∈ W (closed under addition)
- ku ∈ W for all scalars k and all u ∈ W (closed under scalar multiplication)
Are polynomials with real coefficients of degree at most n (Pn) a vector space?
Yes
BASIS
For a vector space V, the vectors are called a basis of V if:
- Linearly independent
- Span V
What is the dimension of a vector space?
the number of elements in a basis
Given a vector space V and a collection of vectors spanning V, if the vectors are linearly DEPENDENT, one of the vectors can be removed and ___
the set would still span V
What do you know about any two bases of V?
they always have the same number of elements
(so the dimension of a vector space is well defined)
The vectors v1…vn are a BASIS of V iff ___
every vector in V can be written in a unique way as a linear combination of v1…vn
If a vector space has dimension n and the vectors u1, u2,…, un span V
Then they are a basis
Uniqueness of basis representation
If S = {v1,…,vn} is a basis for a vector space V, then every vector v in V can be expressed in exactly one way in the form v = x1v1 + … + xnvn
Coordinates of a vector relative to a basis S = {v1,…,vn}
the scalars x1,…,xn with v = x1v1 + … + xnvn, written [v]S = (x1,…,xn)
Change of coordinates
[v]C = P B->C [v]B, where P B->C and its inverse P C->B = P-1 are the change of coordinate matrices
Let V be an n-dimensional vector space, and let {v1,…,vn} be any basis.
If a set of vectors in V has more than n vectors: it is linearly dependent
If a set of vectors in V has fewer than n vectors: it doesn’t span V
Procedure for computing the change of coordinate matrix P B->C
- Form matrix [C|B]
- RREF
- Resulting matrix [I|P B->C]
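A minimal sympy sketch of this procedure, using two made-up bases B and C of R² (not from the course material):

```python
# Sketch: compute P B->C by row reducing [C | B]; the bases below are made up.
import sympy as sp

B = sp.Matrix([[1, 1],
               [0, 1]])   # columns are the vectors of basis B
C = sp.Matrix([[1, 0],
               [1, 1]])   # columns are the vectors of basis C

augmented = C.row_join(B)      # form the matrix [C | B]
rref, _ = augmented.rref()     # row reduce; the result has the form [I | P B->C]
P_B_to_C = rref[:, 2:]         # the right block is the change of coordinate matrix

# Check: if v has B-coordinates v_B, then its C-coordinates are P_B_to_C * v_B
v_B = sp.Matrix([2, 3])
v = B * v_B                              # the vector itself, in standard coordinates
assert C * (P_B_to_C * v_B) == v         # rebuilding v from its C-coordinates
print(P_B_to_C)                          # Matrix([[1, 1], [-1, 0]])
```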
Nullspace
set of vectors x that satisfy Ax = 0
Column space
span of the columns of A
(set of vectors b such that Ax = b has at least one solution)
Rowspace
span of the rows of A
Do row operations affect the null space, row space, or column space?
Can change the column space (but not its dimension)
Don’t change: nullspace & rowspace
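A small sympy sketch of the column-space caveat, on a made-up matrix:

```python
# Sketch (made-up matrix): row operations can change the column space itself,
# but never its dimension, and they leave the row space and null space unchanged.
import sympy as sp

A = sp.Matrix([[1, 2],
               [1, 2]])          # column space = span{(1, 1)}
R = A.rref()[0]                  # R = [[1, 2], [0, 0]]; column space = span{(1, 0)}

print(A.columnspace(), R.columnspace())   # different subspaces, both 1-dimensional
print(A.rowspace(), R.rowspace())         # same span: row space unchanged
print(A.nullspace(), R.nullspace())       # same vectors: null space unchanged
```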
Do row operations change the span of the set of rows?
No
Rank
common dimension of row space & column space
number of leading 1s (pivots) in a row echelon form (REF) of A
rank(A) + dim(Null(A)) =
rank(A) + dim(Null(A)) = n, where n is the number of columns of A
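A quick sympy check of this identity on a made-up 3x4 matrix (so n = 4):

```python
# Sketch: verify rank(A) + dim(Null(A)) = n on a made-up 3x4 matrix.
import sympy as sp

A = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 3],
               [1, 2, 1, 4]])    # third row = first row + second row

rank = A.rank()                  # 2
nullity = len(A.nullspace())     # dimension of Null(A) = 2
print(rank + nullity == A.cols)  # True (n = 4 columns)
```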
Let A be an (mxn) matrix:
- If B is an invertible (mxm) matrix, then rank(BA) =
- If C is an invertible (nxn) matrix, then rank(AC) =
Let A be an (mxn) matrix:
- If B is an invertible (mxm) matrix, then rank(BA) = rank(A)
- If C is an invertible (nxn) matrix, then rank(AC) = rank(A)
Determinant
scalar associated to a square matrix
If rank(A) < n, det =
If rank(A) = n, det =
If rank(A) < n, det = 0
If rank(A) = n, det ≠ 0
How to compute the determinant of a (2x2) matrix?
ad - bc, for A with first row (a, b) and second row (c, d)
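A one-line check of the formula on a made-up matrix, compared against numpy:

```python
# Sketch: ad - bc on a made-up 2x2 matrix, checked against numpy's determinant.
import numpy as np

a, b, c, d = 3.0, 1.0, 4.0, 2.0
A = np.array([[a, b],
              [c, d]])
print(a * d - b * c)             # 2.0
print(np.linalg.det(A))          # 2.0 (up to floating point error)
```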
How to compute the determinant of a (3x3) matrix?
Arrow (Sarrus) technique: copy the first two columns to the right of A, add the products along the three downward diagonals, and subtract the products along the three upward diagonals
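A sketch of the arrow/Sarrus sums on a made-up 3x3 matrix, checked against numpy:

```python
# Sketch: arrow (Sarrus) technique on a made-up 3x3 matrix.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [5.0, 6.0, 0.0]])

down = A[0,0]*A[1,1]*A[2,2] + A[0,1]*A[1,2]*A[2,0] + A[0,2]*A[1,0]*A[2,1]
up   = A[0,2]*A[1,1]*A[2,0] + A[0,0]*A[1,2]*A[2,1] + A[0,1]*A[1,0]*A[2,2]
print(down - up)                 # 1.0
print(np.linalg.det(A))          # 1.0 (up to floating point error)
```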
Minor (Mij) of entry aij
the determinant of the submatrix that remains after the ith row and the jth column are deleted from A
Cofactor of entry aij
Cij = (-1)^(i+j) Mij
Cofactor expansion
multiply the entries of any single row/column by their cofactors and add the resulting products; the sum equals the determinant (e.g. det(A) = ai1Ci1 + … + ainCin along row i)
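A short sketch of cofactor expansion along the first row, written from scratch for a made-up example matrix:

```python
# Sketch: recursive cofactor expansion along row 0 for a square matrix
# given as a list of lists.
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in A[1:]]   # delete row 0 and column j
        cofactor = (-1) ** j * det(minor)                # C0j = (-1)^(0+j) * M0j
        total += A[0][j] * cofactor                      # entry times its cofactor
    return total

print(det([[1, 2, 3],
           [0, 1, 4],
           [5, 6, 0]]))          # 1
```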
Determinant properties:
If B is obtained from A by multiplying all the elements of a single row/column by a fixed scalar k
det(B) = k det(A)
Determinant properties:
If two rows/columns are identical
det = 0
Determinant properties:
If a row/column is a sum of two vectors -> split it up into 2 separate determinants
|a+b  c  d| = |a  c  d| + |b  c  d|  (shown for columns; the same holds for rows)
Determinant properties:
If we add to a row a multiple of another row (e.g. R1 -> R1 + kR2)
det stays same
If two rows/columns of a matrix are linearly dependent
det = 0
Relationship between determinants and transposes
det(A) = det(A^T)
A matrix is invertible iff
det not equal 0
Permutation
A reordering of the elements
det(AB) =
det(AB) = det(A)det(B)
How can you compute the inverse of a matrix if you know its determinant?
A^-1 = (1/detA) * adj(A), where adj(A) is the adjugate of A (the transpose of the cofactor matrix)
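A numpy sketch of the adjugate formula on a made-up 3x3 matrix, compared with numpy's built-in inverse:

```python
# Sketch: A^-1 = (1/det(A)) * adj(A), with adj(A) built from cofactors.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [5.0, 6.0, 0.0]])
n = A.shape[0]

cof = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

adj = cof.T                                  # adjugate = transpose of the cofactor matrix
A_inv = adj / np.linalg.det(A)

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```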
Eigenvector
Ax = λx, x ≠ 0
Compute eigenvectors for λ: the nonzero vectors in Null(A - λI)
Eigenspace
for an eigenvalue λ: Null(A - λI), i.e. all eigenvectors associated with λ together with the zero vector
1 ≤ dim eigenspace ≤ multiplicity of λ in the characteristic polynomial
Characteristic polynomial
det(A - λI)
λ: roots of the polynomial
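A sympy sketch tying the last few cards together on a made-up 2x2 matrix: eigenvalues as roots of det(A - λI), eigenvectors from Null(A - λI):

```python
# Sketch: eigenvalues from the characteristic polynomial, eigenvectors from Null(A - λI).
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 1],
               [2, 3]])

char_poly = (A - lam * sp.eye(2)).det()      # det(A - λI)
eigenvalues = sp.solve(char_poly, lam)       # roots of the polynomial: [2, 5]

for ev in eigenvalues:
    eigvecs = (A - ev * sp.eye(2)).nullspace()   # basis of the eigenspace
    print(ev, eigvecs)
```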
Diagonalizable matrix
if there exists an invertible matrix P and a diagonal matrix D such that: A = PDP-1
The scalar λ is an eigenvalue of the (n x n) matrix A iff
det(A - λI) = 0
Given a matrix A and the matrix obtained from A after a change of basis, B = P-1AP
det(A - λI) = det(B - λI)
The set of eigenvectors corresponding to a fixed eigenvalue λ, together with the zero vector, forms a subspace H of Rn
Eigenvectors corresponding to different eigenvalues are __
linearly independent
Let A be an (nxn) matrix, then A is diagonalizable iff both conditions are satisfied:
- the characteristic polynomial has n roots, counted with multiplicity
- for each eigenvalue λ, dim(eigenspace of λ) = multiplicity of λ in the characteristic polynomial
Complex number
a + bi
Complex conjugate of a + bi
a - bi
Modulus |z| of a+bi
|z| = √(a² + b²)
(distance from the point (a, b) to the origin)
Complex number is real iff
it equals its conjugate
Properties of conjugates
conjugate of the sum is the sum of conjugates
conjugate of the product is the product of conjugates
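A two-line check of these properties on made-up complex numbers:

```python
# Sketch: conjugate of a sum/product equals the sum/product of conjugates.
z = 3 + 2j
w = 1 - 4j

print((z + w).conjugate() == z.conjugate() + w.conjugate())   # True
print((z * w).conjugate() == z.conjugate() * w.conjugate())   # True
```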
If you know a polynomial/matrix has REAL entries and λ = a+bi is a root/eigenvalue
Then λ = a - bi must also be a root/eigenvalue
If A is an (nxn) TRIANGULAR matrix, then the eigenvalues of A
are the entries on the main diagonal of A.
B is similar to A (for square matrices)
if there is an invertible matrix P such that B = P-1AP
Powers of diagonalizable matrices
A^K = P D^K P-1
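A numpy sketch of A = PDP-1 and A^K = P D^K P-1 on a made-up 2x2 matrix with distinct eigenvalues (so it is diagonalizable):

```python
# Sketch: diagonalize a made-up matrix and compute a power via A^k = P D^k P^-1.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, P = np.linalg.eig(A)            # columns of P are eigenvectors
D = np.diag(eigenvalues)

print(np.allclose(A, P @ D @ np.linalg.inv(P)))              # True: A = P D P^-1

k = 5
A_k = P @ np.diag(eigenvalues ** k) @ np.linalg.inv(P)       # P D^k P^-1
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))        # True: equals A^5
```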