Linear Algebra Flashcards
Define convex
A function is convex if the line segment joining any two points on its graph lies on or above the graph
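A tiny numeric sketch of the idea, using f(x) = x^2 as an assumed example of a convex function:

```python
# Midpoint check: for a convex f, the chord midpoint lies at or above f's value there.
f = lambda x: x ** 2          # illustrative convex function
x1, x2 = -1.0, 3.0
assert f((x1 + x2) / 2) <= (f(x1) + f(x2)) / 2   # 1.0 <= 5.0
```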
Singular matrix
The matrix has no inverse; equivalently, its determinant is 0
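A minimal NumPy illustration (the matrix values are arbitrary, chosen so the rows are linearly dependent):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # second row = 2 x first row, so det(A) = 0
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("singular: inverse does not exist")
```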
Multi-variate Gaussian
f(x) = 1/sqrt[(2 pi)^d |Sigma|] exp(-(x - mu).T Sigma^-1 (x - mu) / 2)
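A sketch comparing the formula above against SciPy (SciPy assumed available; the mean and covariance values are illustrative):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])
x = np.array([0.5, 0.5])
d = len(mu)

# Density written exactly as on the card.
manual = np.exp(-0.5 * (x - mu) @ np.linalg.inv(Sigma) @ (x - mu)) \
         / np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
print(np.isclose(manual, multivariate_normal(mean=mu, cov=Sigma).pdf(x)))  # True
```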
LU Decomposition
A = L U (or P A = L U with row pivoting), where L is lower triangular and U is upper triangular
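A quick check with SciPy (assumed available; the matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
P, L, U = lu(A)                      # SciPy factors A = P L U (P permutes rows)
print(np.allclose(A, P @ L @ U))     # True
```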
QR Decomposition
A = Q R, where Q is orthogonal and R is upper triangular
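A NumPy sketch (random tall matrix, values irrelevant):

```python
import numpy as np

A = np.random.default_rng(0).normal(size=(4, 3))
Q, R = np.linalg.qr(A)
print(np.allclose(Q.T @ Q, np.eye(3)))   # Q has orthonormal columns
print(np.allclose(A, Q @ R))             # A = Q R
```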
Singular Value Decomposition
A [m x n] = U [m x m] S [m x n] V.T [n x n]
U, V are orthogonal (unitary in the complex case); S is diagonal with the singular values
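A NumPy sketch reconstructing A from the factors on the card (random matrix, values irrelevant):

```python
import numpy as np

A = np.random.default_rng(1).normal(size=(3, 2))   # m=3, n=2
U, s, Vt = np.linalg.svd(A)          # U is 3x3, Vt is 2x2 (full_matrices=True)
S = np.zeros((3, 2))
S[:2, :2] = np.diag(s)               # embed singular values into the m x n shape
print(np.allclose(A, U @ S @ Vt))    # True
```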
Eigendecomposition
A = Q E Q^-1 for a square matrix A
Columns of Q = eigenvectors
Diagonal entries of E = eigenvalues
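A NumPy check of A = Q E Q^-1 (the 2x2 matrix is an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, Q = np.linalg.eig(A)        # columns of Q are the eigenvectors
E = np.diag(eigvals)
print(np.allclose(A, Q @ E @ np.linalg.inv(Q)))   # A = Q E Q^-1
```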
Eigenvalues and eigenvectors of a symmetric matrix
Eigenvalues: real
Eigenvectors: orthogonal
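A NumPy sketch of both properties (arbitrary symmetric matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # symmetric
w, V = np.linalg.eigh(A)               # eigh is the symmetric/Hermitian solver
print(np.all(np.isreal(w)))            # True: real eigenvalues
print(np.allclose(V.T @ V, np.eye(2))) # True: orthonormal eigenvectors
```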
Unitary Matrix
Conjugate transpose = inverse
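A small complex example (the matrix is an assumed illustrative choice):

```python
import numpy as np

U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)           # a unitary matrix
print(np.allclose(U.conj().T @ U, np.eye(2)))  # conjugate transpose = inverse
```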
Positive definite vs positive semi-definite matrices
Symmetric Matrix ‘A’ is positive definite if
z.T A z > 0 for every non-zero vector z
Positive semi-definite: z.T A z >= 0 for every z
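A NumPy sketch of the quadratic-form condition, plus Cholesky as a practical check (matrix values are illustrative):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])          # symmetric, eigenvalues 1 and 3
z = np.random.default_rng(2).normal(size=2)
print(z @ A @ z > 0)                 # True for any non-zero z
np.linalg.cholesky(A)                # succeeds only for positive definite A
```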
How to know if a matrix is invertible
Determinant != 0; equivalently, no eigenvalue is zero (all singular values are positive). Note: a negative eigenvalue is fine; "lowest eigenvalue positive" tests positive definiteness, not invertibility.
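A NumPy counterexample-driven check: this matrix has a negative eigenvalue yet is invertible (values are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])           # one eigenvalue is negative, yet A is invertible
print(not np.isclose(np.linalg.det(A), 0.0))               # True
print(np.all(np.linalg.svd(A, compute_uv=False) > 1e-12))  # all singular values > 0
```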
SVD and Rank of matrix
Rank of matrix = # of non-zero singular values
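A NumPy sketch on a rank-1 matrix (values chosen so one column is a multiple of the other):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])           # rank 1: second column = 2 x first
s = np.linalg.svd(A, compute_uv=False)
print(np.sum(s > 1e-10))             # 1 non-zero singular value
print(np.linalg.matrix_rank(A))      # 1, the same answer
```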
When does Ax = b have a unique solution? (hint: ranks)
When rank(A) = rank([A|b]) = n,
where A is m x n and b is m x 1
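A NumPy sketch of the rank test (the system is constructed to be consistent, so the ranks agree):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                   # m=3, n=2, full column rank
b = A @ np.array([1.0, 1.0])                 # consistent right-hand side
rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A == rank_Ab == A.shape[1])       # True: unique solution
```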
Dot product vs Cross product
Dot product yields a scalar: A.B = ||A|| ||B|| cos alpha
Cross product yields another vector, perpendicular to both A and B, with magnitude ||A x B|| = ||A|| ||B|| sin alpha
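A NumPy sketch with perpendicular vectors, so cos(alpha) = 0 and sin(alpha) = 1:

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0])
print(np.dot(a, b))          # 0.0 (scalar); cos(90 deg) = 0
c = np.cross(a, b)
print(c)                     # [0. 0. 2.], perpendicular to both a and b
print(np.linalg.norm(c))     # 2.0 = ||a|| ||b|| sin(90 deg)
```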
Inverse of a 3x3 matrix
A^-1 = (1/|A|) adj(A)
adjugate adj(A) = transpose of the cofactor matrix
cofactor of a_ij = (-1)^(i+j) det(submatrix of A with row i and column j deleted)
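A NumPy sketch of the cofactor/adjugate construction, checked against np.linalg.inv (the matrix is an arbitrary invertible choice):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

# Cofactor C[i, j] = (-1)^(i+j) * det of A with row i and column j deleted.
C = np.zeros_like(A)
for i in range(3):
    for j in range(3):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

A_inv = C.T / np.linalg.det(A)               # adjugate = C.T
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```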
How to diagonalize a matrix?
To diagonalize matrix 'A', find its eigenvectors and let P be the matrix with the eigenvectors as columns. If A has n linearly independent eigenvectors (n = # of rows/cols), it can be diagonalized as:
D = P^{-1} A P
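A NumPy sketch of the procedure (arbitrary diagonalizable matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, P = np.linalg.eig(A)                # eigenvectors become the columns of P
D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(eigvals)))      # D is diagonal with the eigenvalues
```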
Measures (L0, L1, L2, L-inf)
L0 = # of non-zero elements (not a true norm)
L1 = sum of absolute values (Manhattan distance)
L2 = Euclidean distance to the origin
L-inf = max absolute value
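A NumPy sketch of all four measures on one vector (values chosen so each result is easy to verify by hand):

```python
import numpy as np

x = np.array([3.0, 0.0, -4.0])
print(np.count_nonzero(x))           # L0: 2
print(np.linalg.norm(x, 1))          # L1: 3 + 0 + 4 = 7
print(np.linalg.norm(x, 2))          # L2: sqrt(9 + 16) = 5
print(np.linalg.norm(x, np.inf))     # L-inf: 4
```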
How to test if a matrix is positive definite?
Determinant test
Pivot test
Eigenvalue test (all eigenvalues positive)
Determinant test:
All leading principal minors > 0
i.e.,
every square sub-matrix anchored at the top-left corner (1x1, 2x2, ..., n x n) has a positive determinant
Pivot test:
Reduce the matrix to upper triangular form by Gaussian elimination without row exchanges; if all pivots (the diagonal elements) are > 0 -> positive definite
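A NumPy sketch running all three tests on one positive definite matrix (an assumed illustrative tridiagonal example):

```python
import numpy as np

A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])

# Determinant test: all leading principal minors positive.
print(all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, 4)))   # True

# Pivot test: eliminate without row exchanges, then inspect the diagonal.
U = A.copy()
for k in range(len(A) - 1):
    U[k + 1:] -= np.outer(U[k + 1:, k] / U[k, k], U[k])
print(np.all(np.diag(U) > 0))                                   # True

# Eigenvalue test.
print(np.all(np.linalg.eigvalsh(A) > 0))                        # True
```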
Properties of eigenvalues and eigenvectors for: symmetric matrices
Symmetric matrix: A.T = A
Real eigenvalues
Orthogonal eigenvectors
Properties of eigenvalues and eigenvectors for: positive definite matrices
Eigenvalues: real and > 0
Eigenvectors: orthogonal
Properties of eigenvalues and eigenvectors for: orthogonal matrices
Orthogonal matrix: Q Q.T = I
|eigenvalue| = 1 (eigenvalues may be complex, but always have unit modulus)
Orthogonal eigenvectors
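A NumPy sketch using a rotation matrix, which is orthogonal (the angle is an arbitrary choice):

```python
import numpy as np

theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])        # rotation matrices are orthogonal
print(np.allclose(Q @ Q.T, np.eye(2)))                 # Q Q.T = I
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0))  # |lambda| = 1
```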
Why is an orthogonal matrix computationally preferable?
Its inverse is simply its transpose, so no costly inversion is needed; multiplying by it also preserves norms, which is numerically stable
Determinant and eigenvalues
Determinant = product of the eigenvalues
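A NumPy check on a random matrix (complex eigenvalues come in conjugate pairs, so their product is real):

```python
import numpy as np

A = np.random.default_rng(3).normal(size=(4, 4))
eigvals = np.linalg.eigvals(A)                               # complex in general
print(np.isclose(np.linalg.det(A), np.prod(eigvals).real))   # True
```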
Interpretation of a determinant
The (signed) volume scaling factor of the linear transformation; a negative sign means orientation is flipped
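A minimal NumPy illustration of the volume interpretation (diagonal matrix chosen so the scaling is obvious):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])           # stretch x by 2, y by 3
# The unit square (area 1) maps to a 2 x 3 rectangle: area scaled by |det A|.
print(abs(np.linalg.det(A)))         # 6.0
```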