Final Exam Flashcards
Do elementary row operations change the solution set?
No
When does a linear system have 0 solutions?
When a row of the RREF reads [0 0 … 0 | 1], i.e.
0x + 0y + … = 1 (impossible, so the system is inconsistent)
When does a linear system have 1 solution?
# leading 1s = # variables (and no [0 0 … 0 | 1] row)
When does a linear system have infinitely many solutions?
# leading 1s < # variables (free variables remain)
(parameterize the free variables)
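The three cases above can be checked with a small numpy sketch (comparing rank(A) with rank of the augmented matrix; the example systems are made up):

```python
import numpy as np

def classify_solutions(A, b):
    """Classify Ax = b as having 0, 1, or infinitely many solutions
    by comparing rank(A), rank([A | b]), and the number of variables."""
    A = np.asarray(A, dtype=float)
    aug = np.column_stack([A, b])
    r_A = np.linalg.matrix_rank(A)
    r_aug = np.linalg.matrix_rank(aug)
    n = A.shape[1]                       # number of variables
    if r_A < r_aug:
        return "0 solutions"             # RREF has a [0 ... 0 | 1] row
    if r_A == n:
        return "1 solution"              # leading 1s = # variables
    return "infinitely many"             # leading 1s < # variables

print(classify_solutions([[1, 0], [0, 1]], [1, 2]))    # 1 solution
print(classify_solutions([[1, 1], [1, 1]], [1, 2]))    # 0 solutions
print(classify_solutions([[1, 1], [2, 2]], [3, 6]))    # infinitely many
```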
Symmetric matrix
A = A^T (equal to its own transpose)
Span
set of all linear combinations
Linearly dependent
if there exist scalars x1, …, xn (not all equal to 0) such that:
x1v1 + x2v2 + … + xnvn = 0
Linearly independent
if the only scalars satisfying
x1v1 + x2v2 + … + xnvn = 0
are x1 = x2 = … = xn = 0
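Independence can be tested by putting the vectors into a matrix as columns and checking the rank (a numpy sketch with made-up vectors):

```python
import numpy as np

def is_independent(*vectors):
    """Vectors are linearly independent iff the matrix having them
    as columns has rank equal to the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

print(is_independent([1, 0, 0], [0, 1, 0]))    # True
print(is_independent([1, 2, 3], [2, 4, 6]))    # False: scalar multiples
```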
Homogeneous linear system
Ax = 0
Sends to zero vector
Non-homogeneous linear system
Ax = b
(translations of Ax = 0)
Homogeneous system:
- If vector (v) is a solution to Ax = 0, then scalar (k) multiplied by v (kv) is what?
Also a solution to Ax = 0
Homogeneous system:
- If vectors (v1, v2) are solutions to Ax = 0, then:
v1 + v2 is what?
v1 + v2 is also a solution to Ax = 0
Homogeneous system:
- Any linear combination of homogeneous solutions (Ax = 0) is what?
Also a solution
Systems:
- If vector (a) is a solution to Ax = 0
- If vector (c) is a solution to Ax = b
Then vector a+c is a solution to what?
homogeneous solution + non-homogeneous solution = non-homogeneous solution
a + c is a solution to Ax = b
Systems:
- If vectors (a, c) are solutions to Ax = b
Then vector a-c is a solution to what?
Vector a - c is a solution to Ax = 0 (homogeneous)
non-homogeneous - non-homogeneous = homogeneous
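Both solution-structure cards can be verified numerically (the matrix and solutions below are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1, so Ax = 0 has nonzero solutions
h = np.array([2.0, -1.0])                # homogeneous: A @ h = 0
b = np.array([5.0, 10.0])
p = np.array([1.0, 2.0])                 # particular: A @ p = b

# homogeneous + particular is again a solution of Ax = b
print(np.allclose(A @ (h + p), b))       # True
# difference of two solutions of Ax = b is homogeneous
q = p + 3 * h                            # another solution of Ax = b
print(np.allclose(A @ (q - p), 0))       # True
```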
Linear transformation
- T(v+w) = T(v)+T(w)
- T(kv) = kT(v)
Invertible matrix
- det ≠ 0
- Equation Ax = b has a unique solution
- Matrix has full rank
Can non-square matrices have inverses?
Yes, but only a one-sided inverse (left or right, not both)
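For a tall matrix with full column rank, one standard left inverse is (A^T A)^-1 A^T (a numpy sketch with a made-up matrix):

```python
import numpy as np

# A 3x2 matrix with full column rank has a LEFT inverse but no right inverse.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
L = np.linalg.inv(A.T @ A) @ A.T         # left inverse: L @ A = I
print(np.allclose(L @ A, np.eye(2)))     # True
print(np.allclose(A @ L, np.eye(3)))     # False: A @ L is only a projection
```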
Subspace
- 0 in V (non-empty)
- u + v in V (closed under addition)
- ku in V (closed under scalar multiplication)
Basis
- Linearly independent
- Spans the vector space
How can you determine the dimension of a vector space?
# of elements in a basis
(this number is UNIQUE: every basis has the same size)
If a vector space has dimension n, then n linearly independent vectors must be a ___
basis
If a vector space V has dimension n, and a collection of n vectors spans V, then ___
vectors are a basis
Coordinates of a vector (relative to a basis)
- v = arbitrary vector in V
- Basis: {v1,v2…vn}
a1v1 + a2v2 + … + anvn = v
The collection of scalars [a1 a2 … an]
Find the matrix that represents the transformation (T) from basis A to basis B
- Compute the image of each basis vector of A: T(a1), T(a2), …, T(an)
- Rewrite each image in terms of basis B: T(a1) = x1b1 + … + xnbn
- Form the matrix using the coefficients (x1 … xn) as columns
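The steps above can be sketched in numpy; finding the B-coordinates of T(a_i) means solving B_mat @ x = T(a_i). The map T and the bases below are made-up examples:

```python
import numpy as np

def matrix_of_T(T, basis_A, basis_B):
    """Columns are the B-coordinates of T applied to each vector of
    basis A, found by solving B_mat @ x = T(a_i)."""
    B_mat = np.column_stack(basis_B)
    cols = [np.linalg.solve(B_mat, T @ a) for a in basis_A]
    return np.column_stack(cols)

T = np.array([[2.0, 0.0], [0.0, 3.0]])                   # scale axes by 2 and 3
basis_A = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # standard basis
basis_B = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
M = matrix_of_T(T, basis_A, basis_B)
print(M)
```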
Characteristic polynomial
det(A – λI)
How to find eigenvectors given λ?
null(A – λI)
Fundamental formula relating rank & dim(Null)
rank(A) + dim(null(A)) = n
Rank
# of linearly independent rows (or columns) of the matrix
# leading 1s in RREF
If a matrix is (mxn), what is its maximum possible rank?
min(m, n), whichever of m and n is smaller
Nullspace
Solution to Ax=0
Column space
span of the columns of A
How to determine the column space of a matrix?
- RREF A → find the columns with leading 1s
- Col(A) = span(columns with leading 1s in ORIGINAL matrix A)
- Basis for Col(A) = {columns of original matrix A which correspond to leading 1s in RREF A}
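The recipe above can be sketched with sympy's rref, which returns the pivot (leading-1) column indices directly (the example matrix is made up):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])
R, pivots = A.rref()                 # RREF and the pivot column indices
print(pivots)                        # columns with leading 1s in RREF
# basis for Col(A): the ORIGINAL columns at the pivot positions
basis = [A[:, j] for j in pivots]
print([list(v) for v in basis])      # original pivot columns of A
```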
Row space
span of the rows of A
How to determine the row space of a matrix?
- RREF A
- Row(A) = span of the (nonzero) rows of RREF(A)
Do row operations preserve the row space?
Yes!
Do row operations preserve the column space?
No!
But they do preserve dim(Col(A)) (the rank)
Determinant of a triangular matrix
product of the entries on the main diagonal
If rank < n, then determinant = ?
det = 0
What is the minor? (Mij)
determinant of the submatrix that remains after the ith row & jth column are deleted
What is the cofactor? (Cij)
Cij = (-1)^(i+j) * Mij
Determinant properties:
- Multiply all elements of 1 row/column by k
k * det
Determinant properties:
If a row/column is a sum of two vectors
det can be written as the sum of the two determinants, each built from one of the two vectors, with the remaining rows/columns as in the original
Determinant properties:
If 2 rows/columns are identical
det = 0
Determinant properties:
Add to a row/column a multiple of another row/column
det stays SAME
Determinant properties:
If 2 rows/columns are linearly dependent
det = 0
det(AB) =
det(AB) = det(A) * det(B)
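A quick numerical check of the product rule (random made-up matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
# det(AB) = det(A) * det(B), up to floating-point error
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))   # True
```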
Eigenvector (v)
Av = λv
v ≠ 0
Does changing the basis of a linear transformation change its characteristic polynomial?
No
The set of eigenvectors corresponding to a single eigenvalue together with the zero vector form a __
subspace of Rn
Eigenvectors corresponding to different eigenvalues are what?
Linearly independent
Diagonalisable if
- dim(eigenspace) = multiplicity of eigenvalue for each λ
- able to perform a change of coordinate that brings matrix to diagonal form D = P^-1 AP
Every eigenvalue must have at least how many eigenvectors?
1
If the characteristic polynomial has n DIFFERENT eigenvalues (each with multiplicity 1), then?
the matrix is diagonalizable
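Diagonalization D = P^-1 A P can be carried out with numpy's eigendecomposition (the 2x2 matrix below is a made-up example with distinct eigenvalues):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])     # eigenvalues 5 and 2 (distinct)
eigvals, P = np.linalg.eig(A)              # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P               # change of coordinates: D = P^-1 A P
print(np.allclose(D, np.diag(eigvals)))    # True: A is diagonalizable
```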
Complex #
z = a + bi
i^2 =
-1
Complex conjugate
z(bar) = a - bi
Modulus: |z|
distance from the point (a, b) to the origin: |z| = sqrt(a^2 + b^2)
Given an (nxn) matrix with REAL entries, if a+bi is an eigenvalue with eigenvector (v), then what is another eigenvalue/eigenvector?
a-bi is also an eigenvalue, with eigenvector v(bar) (the entry-wise complex conjugate of v)
Orthogonal
dot product = 0
Orthonormal
dot product = 0
length = 1
How to normalize a vector?
divide by its length: v / ||v||
How to compute the length of a vector? ||v||
||v|| = sqrt(v · v) = sqrt(v1^2 + … + vn^2)
Orthogonal complement
set of vectors that are orthogonal to every vector in subspace H
A vector can be written as the sum of its orthogonal projection and what?
its component orthogonal to the subspace: v = v(hat) + (v - v(hat))
Orthogonal projection
v(hat), the closest vector to v in the subspace
Gram-Schmidt Orthogonalisation Process
transform a basis → orthogonal basis
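A minimal Gram-Schmidt sketch: subtract from each vector its projections onto the previously accepted vectors (made-up input vectors; normalize each output vector afterwards if an orthonormal basis is needed):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a basis into an orthogonal basis."""
    ortho = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        for u in ortho:
            v = v - (v @ u) / (u @ u) * u    # remove the component along u
        ortho.append(v)
    return ortho

u1, u2 = gram_schmidt([[1.0, 1.0], [1.0, 0.0]])
print(u1 @ u2)     # 0.0 (orthogonal)
```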
What does a least squares approximation do?
Solves the equation Ax = b as closely as possible
least squares = the vector y such that ||b - Ay|| is as small as possible
2 methods for computing least squares approx
- Find the projection b(hat) of b on the space spanned by the columns of A, then solve the equation Ax=b(hat).
- Solve A^T Ax= A^T b
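The normal-equations method can be checked against numpy's built-in least squares solver (the system below is a made-up inconsistent example):

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])              # b is NOT in Col(A)

# Method 2: solve the normal equations A^T A x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
# Built-in least squares for comparison
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))      # True
```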
Line of best fit to n data points
Find the values of a, b in the linear equation
y = a + bx that minimize the sum of squared vertical distances to the n data points
Orthonormal matrix
columns are orthonormal: P^T P = I, so P^-1 = P^T
An orthonormal change of coordinate transforms a symmetric matrix into what?
Into another symmetric matrix
A square symmetric matrix with real entries has __ eigenvalues
real eigenvalues
If v1 and v2 are eigenvectors of a symmetric matrix A with different eigenvalues, then
v1 and v2 are orthogonal
An nxn symmetric matrix with real entries is always diagonalizable over the real numbers in an ____ basis.
orthonormal basis
Given a symmetric matrix (A=A^T), we find an orthonormal matrix P and a diagonal matrix D both with real entries such that:
A = P D P^T (equivalently, D = P^T A P)
What is the first step to diagonalization of symmetric matrices?
Show that all eigenvalues are real
Positive definite
symmetric matrix whose eigenvalues are all positive
Are all positive definite matrices invertible?
Yes
det > 0 (the determinant is the product of the eigenvalues, all positive)
Given any invertible matrix A, ___ is positive definite
A^T A
Cholesky decomposition
A = L L^T, where L is lower triangular with positive diagonal entries
For any nxn symmetric matrix A, the following conditions are equivalent (and each implies that the matrix is positive definite):
- all eigenvalues of A are positive
- x^T A x > 0 for every x ≠ 0
- A has a Cholesky decomposition A = L L^T
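Both tests can be run in numpy; `np.linalg.cholesky` succeeds exactly when the matrix is positive definite (the matrix below is a made-up example):

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])     # symmetric
# Eigenvalue test for positive definiteness
print(np.all(np.linalg.eigvalsh(A) > 0))   # True
# Cholesky decomposition A = L L^T (L lower triangular, positive diagonal)
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))             # True
```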
Matrix representation of a quadratic function
f(x) = x^T A x
Given a quadratic function with f(x) = x^T A x where A is symmetric. The origin is a local min if?
A is positive definite
Given a quadratic function with f(x) = x^T A x where A is symmetric. The origin is a local max if?
–A is positive definite
Given a quadratic function f(x) = x^T A x restricted to the unit sphere ||x|| = 1, what is the max/min value of f?
Max = largest eigenvalue of A
Min = smallest eigenvalue of A
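This can be checked with numpy's symmetric eigensolver; the extreme values of x^T A x on the unit sphere are attained at unit eigenvectors (the matrix is a made-up example with eigenvalues 1 and 3):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])     # symmetric
vals, vecs = np.linalg.eigh(A)             # eigenvalues sorted ascending
print(vals[0], vals[-1])                   # min and max of x^T A x on ||x|| = 1
x = vecs[:, -1]                            # unit eigenvector of the largest eigenvalue
print(np.isclose(x @ A @ x, vals[-1]))     # True: f attains the max there
```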