Exam 3 Flashcards
5.1 Eigenvector Definition
a nonzero vector x such that Ax = λx for some scalar λ
5.1 eigenvalue
λ is an eigenvalue of A if there is a nontrivial solution x of Ax = λx
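A quick illustrative aside (not a card): checking Ax = λx numerically with NumPy; the 2x2 matrix A here is made up.

```python
import numpy as np

# Hypothetical matrix with eigenvalues 5 and 2
A = np.array([[4., 1.],
              [2., 3.]])
eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors

lam = eigvals[0]
x = eigvecs[:, 0]
print(np.allclose(A @ x, lam * x))   # True: Ax = lambda*x
```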
5.1 Theorem 1
eigenvalues of a triangular matrix are the entries on its main diagonal
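Illustrative check of Theorem 1 (the triangular matrix T is made up):

```python
import numpy as np

T = np.array([[3., 5., 1.],
              [0., 7., 2.],
              [0., 0., 4.]])
# Eigenvalues of the triangular matrix match its diagonal entries
print(np.allclose(np.sort(np.linalg.eigvals(T)), np.sort(np.diag(T))))  # True
```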
5.1 Theorem 2
If v1, …, vr are eigenvectors that correspond to distinct eigenvalues of a matrix, then the set {v1, …, vr} is linearly independent
5.2 Invertible Matrix Theorem: A is invertible if and only if __ is __ an eigenvalue of A
A is invertible if and only if the number 0 is not an eigenvalue of A
What does det AB equal
(det A)(det B)
What does det A^T equal
det A
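A numeric sanity check of both determinant facts (random matrices, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # det AB = (det A)(det B)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                       # det A^T = det A
```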
Does row replacement change the determinant
No, row replacement does not change the determinant. (A row interchange is what changes the sign of the determinant.)
How does row scaling affect the determinant
scaling a row by a scalar k multiplies the determinant by k
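Sketch of both row-operation facts above (the matrix and the scalars are arbitrary choices):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

R = A.copy()
R[1] = R[1] + 5 * R[0]   # row replacement: R2 <- R2 + 5*R1
print(np.isclose(np.linalg.det(R), np.linalg.det(A)))       # True: det unchanged

S = A.copy()
S[2] = 4 * S[2]          # row scaling: R3 <- 4*R3
print(np.isclose(np.linalg.det(S), 4 * np.linalg.det(A)))   # True: det scaled by 4
```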
multiplicity
the number of times an eigenvalue occurs as a root of the characteristic polynomial (its algebraic multiplicity)
Similarity
A is similar to B if there is an invertible matrix P such that P^-1 A P = B (equivalently, A = PBP^-1)
5.3 A matrix is diagonalizable if
A is similar to a diagonal matrix
5.3 diagonalization theorem
An nxn matrix A is diagonalizable
if and only if A has n linearly independent eigenvectors, i.e., enough to form a basis of R^n.
5.3 A = PDP^-1, with D diagonal, if and only if what
the columns of P are n linearly independent eigenvectors of A; the diagonal entries of D are then the corresponding eigenvalues
5.3 Theorem 6
An nxn matrix with n distinct eigenvalues is diagonalizable
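A minimal diagonalization sketch, assuming a made-up 2x2 matrix with distinct eigenvalues (3 and 5), so Theorem 6 applies:

```python
import numpy as np

A = np.array([[ 7., 2.],
              [-4., 1.]])
eigvals, P = np.linalg.eig(A)   # columns of P: linearly independent eigenvectors
D = np.diag(eigvals)

print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = PDP^-1
```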
5.4 Theorem 8: Diagonal Matrix Representation
Suppose A = PDP^-1, where D is a diagonal nxn matrix. If B is the basis for R^n formed from the columns of P, then D is the B-matrix for the transformation x -> Ax
6.1 Inner Product / Dot Product
u*v = u^T v = u1v1 + u2v2 + … + unvn
6.1 length/norm of a vector
||v|| = sqrt(v*v) = sqrt(v1^2 + … + vn^2)
6.1 ||cv|| = ?
|c| ||v||
6.1 unit vector
vector whose length is 1
6.1 How to find the distance between u and v
dist(u,v) = ||u-v||
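One illustrative NumPy pass over the 6.1 definitions above (u and v are made-up vectors):

```python
import numpy as np

u = np.array([3., 4.])
v = np.array([1., 2.])

print(u @ v)                                  # dot product: 3*1 + 4*2 = 11
print(np.linalg.norm(u))                      # ||u|| = sqrt(3^2 + 4^2) = 5
print(np.linalg.norm(u / np.linalg.norm(u)))  # normalizing gives a unit vector: 1.0
print(np.linalg.norm(u - v))                  # dist(u, v) = ||u - v||
```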
6.1 Orthogonal
two vectors u and v are orthogonal if u*v = 0
6.1 A vector x is orthogonal to a subspace W if and only if what?
x is orthogonal to every vector in a set that spans W, i.e.,
x*v = 0 for each vector v in that spanning set
6.1 Theorem 3
The orthogonal complement of the row space of A is the null space of A. The orthogonal complement of the column space of A is the null space of A^T
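Illustrative check of Theorem 3 (A is made up; the null-space vector comes from the SVD):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])   # rank 2, so Nul A is one-dimensional
_, _, Vt = np.linalg.svd(A)
z = Vt[-1]                     # basis vector for Nul A
print(np.allclose(A @ z, 0))   # True: z is orthogonal to every row of A
```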
6.2 Theorem 4
If S is an orthogonal set of nonzero vectors, then S is linearly independent and a basis for the subspace spanned by S
6.2 Orthogonal Basis
A basis for W that is also an orthogonal set
6.2 Theorem 5: For each y in W, the constants in the linear combination y = c1u1 + … + cpup are given by ______
cj = (y*uj)/(uj*uj)
where j = 1, …, p
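Sketch of Theorem 5 with a hypothetical orthogonal basis {u1, u2} of R^2: the weights cj reproduce y exactly.

```python
import numpy as np

u1 = np.array([2., 1.])
u2 = np.array([-1., 2.])   # u1*u2 = 0, so {u1, u2} is an orthogonal basis
y = np.array([5., 3.])

c1 = (y @ u1) / (u1 @ u1)  # cj = (y*uj)/(uj*uj)
c2 = (y @ u2) / (u2 @ u2)
print(np.allclose(c1 * u1 + c2 * u2, y))  # True: y = c1*u1 + c2*u2
```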
6.2 Theorem 6: An mxn matrix U has orthonormal columns if and only if ____
U^T U = I (the identity matrix)
6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then ||Ux|| = ?
||x||
6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then (Ux)*(Uy) = ?
x*y
6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then
(Ux)*(Uy) = 0 if and only if ____
x*y= 0
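One sketch covering Theorems 6 and 7 (U gets orthonormal columns via QR; the sizes and vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((4, 2)))  # 4x2 U with orthonormal columns
x = rng.standard_normal(2)
y = rng.standard_normal(2)

print(np.allclose(U.T @ U, np.eye(2)))                       # U^T U = I
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # ||Ux|| = ||x||
print(np.isclose((U @ x) @ (U @ y), x @ y))                  # (Ux)*(Uy) = x*y
```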
6.3 The Orthogonal Decomposition Theorem
Let W be a subspace of R^n. Then each y in R^n can be written in the form y = yhat + z,
where yhat is in W and z is perpendicular to W. If {u1, …, up} is an orthogonal basis of W, then:
yhat = ((y*u1)/(u1*u1))u1 + … + ((y*up)/(up*up))up
and z = y - yhat
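Minimal sketch of the decomposition y = yhat + z, assuming a made-up plane W = Span{u1, u2} in R^3 with an orthogonal basis:

```python
import numpy as np

u1 = np.array([1.,  1., 0.])
u2 = np.array([1., -1., 0.])   # u1*u2 = 0
y = np.array([2., 3., 7.])

yhat = (y @ u1)/(u1 @ u1) * u1 + (y @ u2)/(u2 @ u2) * u2
z = y - yhat                   # here yhat = (2, 3, 0) and z = (0, 0, 7)
print(np.isclose(z @ u1, 0), np.isclose(z @ u2, 0))  # True True: z is perpendicular to W
```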
6.3 If y is in W = Span{u1…up}, then ?
proj_W y = y (the projection of y onto W is y itself)
6.3 The Best Approximation Theorem
Let W be a subspace of R^n, let y be a vector in R^n, and let yhat be the orthogonal projection of y onto W.
then yhat is the closest point in W to y.
||y-yhat|| < ||y-v||
for all v in W distinct from yhat
6.3 Theorem 10: If {u1, …, up} is an orthonormal basis for a subspace W of R^n, then what?
proj_W y = (y*u1)u1 + … + (y*up)up
6.3 Theorem 10: If U = [u1 … up], then what?
proj_W y = UU^T y for all y in R^n
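Sketch of both parts of Theorem 10 (the orthonormal basis {u1, u2} and the vector y are made up):

```python
import numpy as np

u1 = np.array([1., 1., 0.]) / np.sqrt(2)
u2 = np.array([0., 0., 1.])             # orthonormal basis for a plane W
U = np.column_stack([u1, u2])           # U = [u1 u2]
y = np.array([4., 2., 5.])

proj = (y @ u1) * u1 + (y @ u2) * u2    # (y*u1)u1 + (y*u2)u2
print(np.allclose(U @ U.T @ y, proj))   # True: proj_W y = UU^T y
```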
4.1 subspace: 3 properties
contains the zero vector
is closed under addition
is closed under scalar multiplication