Final exam Flashcards
Statements equivalent to A is invertible
A^T is invertible, rows are linearly independent, rows span R^n, columns are linearly independent, columns span R^n, A is row equivalent to the identity matrix, A can be written as a product of elementary matrices, determinant can't be 0, 0 can't be an eigenvalue, Ax=b has exactly one solution for every b, the null space is {0}
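A quick NumPy sketch of several of these equivalences on one hypothetical 3x3 example matrix (any square matrix with nonzero determinant would do):

```python
import numpy as np

# Hypothetical example matrix -- chosen only to illustrate the equivalences.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
n = A.shape[0]

det = np.linalg.det(A)           # nonzero determinant
rank = np.linalg.matrix_rank(A)  # full rank <=> independent rows and columns
eigvals = np.linalg.eigvals(A)   # 0 is not an eigenvalue

print(det != 0, rank == n, np.all(np.abs(eigvals) > 1e-10))

# Ax = b has exactly one solution for every b:
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))
```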
Subspace definition
A subset U of R^n that satisfies two closure conditions: if x, y are elements of U, then x+y is an element of U
If x is an element of U and alpha is a scalar, then alpha(x) is an element of U
Column space
Span of the columns of A
Equivalently, the set of all vectors of the form Ax
Row space
Span of the rows of A
Equals col(A^T)
Null space
Set of all vectors x such that Ax=0
Col(AB) is in or equal to the col(A)
This works because taking a vector y in col(AB) means ABx=y for some x; setting z=Bx gives Az=y, so y is also an element of col(A)
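The containment can be checked numerically on small example matrices (chosen here just for illustration): appending the columns of AB to A never raises the rank, since every column of AB is already a combination of A's columns.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

AB = A @ B
augmented = np.hstack([A, AB])  # columns of A followed by columns of AB
print(np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(A))  # True
```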
Row(AB) is in or equal to row(A)
Same idea, just use transposes
Rank theorem
Rank of the matrix + nullity = number of columns
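A sketch of the rank theorem using the SVD: the singular values above a tolerance count the rank, and the remaining rows of V^T give an actual basis of the null space, so the two dimensions can be checked independently. The matrix here is a hypothetical rank-1 example.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, 3 columns, so nullity should be 2

U, s, Vt = np.linalg.svd(A)
tol = 1e-10
rank = int(np.sum(s > tol))
null_basis = Vt[rank:]            # remaining rows of V^T span null(A)
nullity = null_basis.shape[0]

print(rank + nullity == A.shape[1])       # True
print(np.allclose(A @ null_basis.T, 0))   # the basis really solves Ax = 0
```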
Linear transformation definition
T(x+y)= T(x)+T(y) and T(ax)=aT(x)
Matrices and linear transformations
For every linear transformation T there is a matrix of transformation A such that T(x)=Ax
Coordinate vector
Coefficients of the linear combination of basis vectors that gives the vector
Eigenvalues
Lambda is an eigenvalue of a square matrix A if Ax=(lambda)x for some nonzero vector x
Geometric multiplicity
Dimension of the eigenspace
Trace
Sum of the diagonal entries of a matrix
Eigenvalues with transposes
A and A^T have the same eigenvalues
Properties of determinants
If there's a row/column of all 0s, det=0; if triangular, det = product of diagonal entries; if two rows/columns are the same, det=0; if one row/column is a multiple of another, det=0; if a is a scalar, det(aA)=a^n det(A)
Determinants of elementary matrices
Swap two rows: det = -1
Multiply a row by a scalar a: det = a
Add a multiple of one row to another: det = 1
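A quick NumPy check of the three determinants, building each elementary matrix by applying its row operation to the identity (the scalars 5 and 7 are arbitrary):

```python
import numpy as np

I = np.eye(3)

swap = I[[1, 0, 2]]      # swap rows 0 and 1

scale = I.copy()
scale[1, 1] = 5.0        # multiply row 1 by 5

add = I.copy()
add[2, 0] = 7.0          # add 7 times row 0 to row 2

print(np.linalg.det(swap))   # -1.0
print(np.linalg.det(scale))  # 5.0
print(np.linalg.det(add))    # 1.0
```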
Algebraic multiplicity
Number of times the eigenvalue's factor appears in the characteristic polynomial
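A sketch contrasting the two multiplicities on the classic defective example [[2,1],[0,2]], whose characteristic polynomial is (lambda-2)^2: the algebraic multiplicity of lambda=2 is 2, but the eigenspace is only one-dimensional.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
n = A.shape[0]

# Algebraic multiplicity: how often lam appears as a root.
algebraic = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))

# Geometric multiplicity: dimension of null(A - lam*I).
geometric = n - np.linalg.matrix_rank(A - lam * np.eye(n))

print(algebraic, geometric)  # 2 1
```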
If you have distinct eigenvalues w/corresponding eigenvectors…
They are linearly independent
Similar matrices
There exists an invertible matrix S such that B=S^-1AS, or equivalently A=SBS^-1
Properties of similar matrices
det(A)=det(B), A is invertible iff B is, A & B have the same rank, the same characteristic polynomial, and the same eigenvalues
Diagonalizable
Similar to a diagonal matrix (whose diagonal entries are the eigenvalues); happens iff A has n linearly independent eigenvectors
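A minimal NumPy sketch on a hypothetical symmetric matrix (symmetric matrices are always diagonalizable): eig returns the eigenvalues for D and the eigenvector matrix S, and A = S D S^-1.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors
D = np.diag(eigvals)

print(np.allclose(A, S @ D @ np.linalg.inv(S)))  # True: A = S D S^-1
print(np.linalg.matrix_rank(S) == A.shape[0])    # n independent eigenvectors
```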
Matrix diagonalizable iff
Geometric and algebraic multiplicities are equal for every eigenvalue
Adjoint
Transpose of the matrix of cofactors (found by cofactor expansion)
If there's an orthonormal basis u_1,...,u_n you can do
v = (v·u_1)u_1 + ... + (v·u_n)u_n, i.e. dot the vector with each basis vector to find its scalar coefficient
Properties of orthogonal matrices
Q^TQ=QQ^T=I, Q^-1=Q^T, magnitude of Qx equals magnitude of x, det(Q)=+-1
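A quick numerical check of all four properties, getting an orthogonal Q from the QR factorization of a random matrix (the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # Q is orthogonal

x = rng.standard_normal(4)
print(np.allclose(Q.T @ Q, np.eye(4)))                       # Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))                    # Q^-1 = Q^T
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
print(np.isclose(abs(np.linalg.det(Q)), 1.0))                # det = +-1
```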
Properties of Orthogonal subspaces
If U and V are orthogonal then the intersection of U and V is {0}, dim(U)+dim(U^perp)=n, (U^perp)^perp=U, U intersect U^perp is {0}, (col(A))^perp=null(A^T), (row(A))^perp=null(A)
Orthogonal projection of v onto subspace
The projection p is an element of U, and v-p is an element of U^perp
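A sketch using the projection-matrix formula P = A(A^TA)^-1A^T onto U = col(A), valid when the columns of A are linearly independent; A and v here are hypothetical examples with U the xy-plane in R^3.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # columns span the xy-plane
v = np.array([3.0, 4.0, 5.0])

P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection matrix onto col(A)
p = P @ v

print(p)                               # [3. 4. 0.] -- the part of v in U
print(np.allclose(A.T @ (v - p), 0.0)) # v - p is orthogonal to U
```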
Distance between 2 vectors
Magnitude of u-v
Triangle inequality
Magnitude of u+v is less than/equal to magnitude of u plus magnitude of v
Cauchy–Schwarz inequality
Absolute value of u dot v is less than/equal to magnitude of v times magnitude of u
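Both inequalities can be spot-checked numerically on random vectors (the seed and dimension are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# Cauchy-Schwarz: |u . v| <= ||u|| ||v||
print(abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v))
# Triangle inequality: ||u + v|| <= ||u|| + ||v||
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))
```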
Properties of dot product/vectors
v+w=w+v, (v+w)+u=v+(w+u), v+0=v, u+(-u)=0, u dot v = v dot u, u dot v + u dot w = u dot (v+w), u dot u = mag(u)^2, mag(u)=mag(-u), 1u=u, a(u+v)=au+av, (a+b)u=au+bu, (ab)u=a(bu)
Orthogonal projections (just vectors)
P in direction of u
v-p orthogonal to u
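The vector-onto-vector case is the formula p = (v·u / u·u)u; a minimal check on hypothetical vectors in R^2:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])

p = (v @ u) / (u @ u) * u   # p = (v.u / u.u) u, the component of v along u
print(p)                    # [1. 2.]
print(np.isclose((v - p) @ u, 0.0))  # v - p is orthogonal to u
```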
How many solutions can systems of equations have?
0 (inconsistent), 1 (consistent), or infinitely many (consistent)
Matrix addition
Entrywise; behaves the same as regular addition
Matrix multiplication
AB does not necessarily equal BA, AB=AC does not imply B=C, AB=0 does not mean one of the matrices is the 0 matrix
Properties of matrices
A(B+C)=AB+AC, (B+C)A=BA+CA, B(aA)=(aB)A, (A+B)^T=A^T+B^T, (A^T)^T=A, (aA)^T=aA^T, (AB)^T=B^TA^T
Inverses
Exists if AB=BA=I; only square matrices can be invertible
Two square matrices A and B are row equivalent iff…
A is a product of elementary matrices times B
Vector space
There is an addition and scalar multiplication that satisfies: v+w=w+v, (v+w)+u=v+(w+u), 0 vector in V such that v+0=v, for each u in V there's -u in V such that u+(-u)=0, a(u+w)=au+aw, (a+b)u=au+bu, a(bu)=(ab)u, 1u=u
Examples of vector spaces
F(R)- space of all real valued fctns whose domain is R, P-space of all polys in one variable, P_n-subspace of P that contains all polys of degree at most n, C(R)-subspace of F(R) that contains cont fctns, C[a,b]- set of real valued fctns on the closed interval [a,b], R^mxn-set of mxn matrices with real entries
Dimension of vector spaces
Infinite-dimensional if V has no finite basis
Linear transformation for vector space
For all v_1, v_2 in V: T(v_1+v_2)=T(v_1)+T(v_2) and T(av_1)=aT(v_1)
Matrix of transformation for vector space
T maps V to W; in coordinates, [v]_E (basis E of V) maps to [T(v)]_F (basis F of W) via the matrix of transformation M
[T(v)]_F=M[v]_E
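A sketch for T(x)=Ax from R^2 to R^2 with example bases E and F (chosen here just for illustration, stored as columns): then M = F^-1 A E, and [T(v)]_F = M[v]_E holds.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns are the input basis vectors
F = np.array([[2.0, 0.0],
              [0.0, 1.0]])   # columns are the output basis vectors

M = np.linalg.inv(F) @ A @ E  # matrix of T relative to bases E and F

v = np.array([1.0, 2.0])          # a vector in standard coordinates
v_E = np.linalg.solve(E, v)       # coordinates of v in basis E
Tv_F = np.linalg.solve(F, A @ v)  # coordinates of T(v) in basis F

print(np.allclose(Tv_F, M @ v_E))  # True: [T(v)]_F = M [v]_E
```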
Inner product
<u,v> is a scalar that satisfies: <u,u> greater than/equal to 0 and <u,u>=0 iff u=0, <u,v>=<v,u>, <au+bv,w>=a<u,w>+b<v,w>
<A,B>=
Tr(B^TA)
Magnitude of matrices
sqrt(tr(A^TA))
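A quick check on hypothetical matrices that the trace formulas match NumPy's built-in Frobenius norm:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

inner = np.trace(B.T @ A)          # <A, B> = tr(B^T A)
norm = np.sqrt(np.trace(A.T @ A))  # ||A|| = sqrt(tr(A^T A))

print(inner)                                # 5.0
print(np.isclose(norm, np.linalg.norm(A)))  # matches the Frobenius norm
```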
Orthogonal functions
<f,g>=0