finals: sections 5.9 - 7.4 Flashcards
what is a probability vector?
a vector with nonnegative entries that add up to one
what is a stochastic matrix?
a square matrix whose columns are probability vectors
what is a markov chain?
a sequence of probability vectors x0, x1, x2, … together with a stochastic matrix P such that
x1 = Px0, x2 = Px1, …; in general, xk+1 = Pxk for k = 0, 1, 2, …
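A minimal numpy sketch of the recursion xk+1 = Pxk; the 2-state matrix P and starting vector x0 below are made-up examples, not from the cards.

```python
import numpy as np

# Hypothetical stochastic matrix: each column is a probability vector.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
x = np.array([0.7, 0.3])   # x0, a probability vector

# Generate the Markov chain x1, x2, ... via x_{k+1} = P x_k.
for k in range(5):
    x = P @ x
    print(k + 1, x)        # each iterate is again a probability vector
```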
what is a steady state vector?
a probability vector q for a stochastic matrix P such that Pq = q
if P is a stochastic matrix, then ______________
1 is an eigenvalue of P
when is a stochastic matrix regular?
if there exists a positive integer k such that P^k has strictly positive entries
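A sketch tying the last few cards together: for the hypothetical regular stochastic matrix below, 1 is an eigenvalue, and the corresponding eigenvector rescaled to sum to 1 is the steady-state vector q.

```python
import numpy as np

# Hypothetical regular stochastic matrix (its entries are already strictly positive).
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

vals, vecs = np.linalg.eig(P)
i = np.argmin(np.abs(vals - 1))   # 1 is always an eigenvalue of a stochastic matrix
q = np.real(vecs[:, i])
q = q / q.sum()                   # rescale so the entries sum to 1

print(q)        # approximately [0.8333, 0.1667]
print(P @ q)    # equals q (up to rounding), i.e. Pq = q
```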
what is the dot product (inner product) of two vectors?
u^T * v
what is a unit vector?
a vector of length 1
why do we care about orthogonal bases?
the weights of y = c1u1 + … + cpup can be computed easily: ci = (y · ui)/(ui · ui) (see the sketch below)
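A numeric check of the weight formula; the orthogonal basis u1, u2 of R^2 and the vector y are assumptions for illustration.

```python
import numpy as np

# Hypothetical orthogonal basis (u1 · u2 = 0) and a target vector y.
u1 = np.array([3.0, 1.0])
u2 = np.array([-1.0, 3.0])
y  = np.array([6.0, 8.0])

# Weights relative to an orthogonal basis: c_i = (y · u_i) / (u_i · u_i)
c1 = (y @ u1) / (u1 @ u1)
c2 = (y @ u2) / (u2 @ u2)

print(c1, c2)                          # 2.6 1.8
print(np.allclose(c1*u1 + c2*u2, y))   # True: the weights rebuild y
```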
why are matrices with orthonormal columns important in computer algorithms?
multiplying by them preserves lengths and angles (||Ux|| = ||x|| and (Ux) · (Uy) = x · y), which keeps errors from growing in matrix computations
an mxn matrix U has orthonormal columns if and only if ________
U^T * U = I
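A quick numpy check of this criterion on a made-up 3x2 matrix U with orthonormal columns.

```python
import numpy as np

# Hypothetical U: each column has length 1 and the columns are orthogonal.
U = np.array([[1/np.sqrt(2),  1/np.sqrt(3)],
              [1/np.sqrt(2), -1/np.sqrt(3)],
              [0.0,           1/np.sqrt(3)]])

# Orthonormal columns  <=>  U^T U = I (the 2x2 identity here).
print(np.allclose(U.T @ U, np.eye(2)))   # True
```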
what is an orthogonal matrix?
a square invertible matrix U such that U^-1 = U^T; equivalently, a square matrix with orthonormal columns (saying U is an orthogonal matrix already implies its columns are orthonormal)
what are the properties of the dot product?
1) u · v = v · u
2) (u + v) · w = u · w + v · w
3) (cu) · v = c(u · v) = u · (cv)
4) u · u >= 0, and u · u = 0 if and only if u = 0
what is the definition of the length (or norm) of a vector?
||v|| = √(v · v) = √(v1^2 + v2^2 + … + vn^2)
and
||v||^2 = v · v
for u and v in R^n, the distance between u and v, written as dist(u, v) is the length of the vector u - v. That is, ______________
dist(u, v) = ||u - v||
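A small numeric illustration of the norm and distance formulas; the vectors are made up.

```python
import numpy as np

u = np.array([7.0, 1.0])
v = np.array([3.0, 4.0])

print(np.sqrt(v @ v))          # ||v|| = sqrt(3^2 + 4^2) = 5.0
print(np.linalg.norm(u - v))   # dist(u, v) = ||u - v|| = 5.0
```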
when are two vectors orthogonal to each other?
when their dot product equals zero
two vectors are orthogonal to each other if and only if (use Pythagorean theorem) __________
||u + v||^2 = ||u||^2 + ||v||^2
that is, the square of the length of u + v equals the square of the length of u plus the square of the length of v
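A quick check of the Pythagorean criterion on a made-up orthogonal pair.

```python
import numpy as np

# Hypothetical orthogonal vectors (u · v = 0).
u = np.array([2.0, 1.0])
v = np.array([-1.0, 2.0])

lhs = (u + v) @ (u + v)   # ||u + v||^2
rhs = u @ u + v @ v       # ||u||^2 + ||v||^2
print(u @ v, lhs, rhs)    # 0.0 10.0 10.0
```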
if S = {u1, …, up} is an orthogonal set of nonzero vectors in R^n, then S is ____________
linearly independent and a basis for the subspace spanned by S
what is an orthogonal basis for W?
it is a set of vectors that are orthogonal and are also a basis for W
what is an orthonormal set?
an orthogonal set in which every vector is a unit vector
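A sketch of how an orthogonal set becomes orthonormal: divide each vector by its length (the vectors are made-up examples).

```python
import numpy as np

# Hypothetical orthogonal set; normalizing each vector yields an orthonormal set.
vectors = [np.array([3.0, 1.0]), np.array([-1.0, 3.0])]
orthonormal = [v / np.linalg.norm(v) for v in vectors]

for w in orthonormal:
    print(w, np.linalg.norm(w))   # each length is 1 (up to rounding)
```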
what is a symmetric matrix?
a matrix A such that A^T = A
if A is symmetric, then any two eigenvectors from different eigenspaces are ______________
orthogonal
an nxn matrix A is orthogonally diagonalizable if and only if ____________
A is symmetric
what is orthogonally diagonalizable?
A = PDP^-1 with D diagonal and P orthogonal; since P is orthogonal, P^-1 = P^T, so
A = PDP^T
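A minimal sketch of orthogonal diagonalization for a made-up symmetric matrix, using numpy's eigendecomposition for symmetric matrices.

```python
import numpy as np

# Hypothetical symmetric matrix A (A^T = A).
A = np.array([[6.0, 2.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eigh(A)   # columns of P are orthonormal eigenvectors
D = np.diag(eigvals)

print(np.allclose(P.T, np.linalg.inv(P)))   # True: P is orthogonal, P^-1 = P^T
print(np.allclose(P @ D @ P.T, A))          # True: A = P D P^T
```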
when is the length ||Ax|| maximized over unit vectors x?
when x = v1, the unit eigenvector of A^T A corresponding to the largest eigenvalue λ1; the maximum value of ||Ax|| is √λ1 = σ1
what is the definition of the singular value decomposition?
let A be an mxn matrix with rank r; then there exists an mxn matrix Σ whose diagonal entries are the first r singular values of A, σ1 >= σ2 >= … >= σr > 0 (these form the diagonal of the rxr block D of Σ), and there exist an mxm orthogonal matrix U and an nxn orthogonal matrix V such that A = UΣV^T
in the SVD, are U and V unique?
no
what are the singular values of a matrix A?
the square roots of the eigenvalues of A^T * A
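A sketch checking both of these SVD cards with numpy; the matrix A is a small example chosen for illustration.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

U, s, Vt = np.linalg.svd(A)                   # s holds the singular values, largest first
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]   # eigenvalues of A^T A, largest first

print(np.allclose(s, np.sqrt(eigvals[:len(s)])))   # True: sigma_i = sqrt(lambda_i)

# Rebuild A = U Sigma V^T, where Sigma is m x n with s on its diagonal.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)
print(np.allclose(U @ Sigma @ Vt, A))              # True
```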
which matrix factorizations always exist for any matrix?
SVD ONLY