Linear Transformation And Linear System Flashcards
Affine transformation
An affine transform is the composition of a linear transform and a translation
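A minimal NumPy sketch of this idea, with a made-up rotation and translation:

```python
import numpy as np

# A hypothetical 2-D affine transform: rotate by 90 degrees, then translate.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # linear part (rotation)
b = np.array([3.0, 1.0])      # translation part

def affine(x):
    # f(x) = Ax + b: a linear transform followed by a translation
    return A @ x + b

x = np.array([1.0, 2.0])
y = affine(x)   # rotate (1, 2) -> (-2, 1), then translate -> (1, 2)

# An affine map with b != 0 is not linear: it does not fix the origin
assert not np.allclose(affine(np.zeros(2)), np.zeros(2))
```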
Affine Transform - Formal Definition
A map f: R^n → R^m of the form f(x) = Ax + c, where A is an m × n matrix and c ∈ R^m is a translation vector; f is linear exactly when c = 0
Householder reflection matrix
H = I − 2 v v^T / (v^T v) for a nonzero vector v; H reflects vectors across the hyperplane orthogonal to v, and is both symmetric and orthogonal
Anisotropic
When the scaling factors across different dimensions are different, the scaling is said to be anisotropic
All linear transformations defined by matrix multiplication can be expressed as a sequence of
Rotations/reflections, together with a single anisotropic scaling
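This decomposition is exactly what the singular value decomposition A = U Σ V^T provides; a NumPy check with a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# SVD factors A into rotations/reflections (U, Vt) and one anisotropic scaling (Sigma)
U, s, Vt = np.linalg.svd(A)
Sigma = np.diag(s)

# U and Vt are orthogonal matrices (rotations/reflections)
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# Sigma scales each axis by a (generally different) factor, and the product recovers A
assert np.allclose(U @ Sigma @ Vt, A)
```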
Dimensionality of a vector space
The number of members in every possible basis set of a vector space V is always the same. This value is referred to as the dimensionality of the vector space.
Matrix Invertibility and Linear Independence
An n × n square matrix A has linearly independent columns/rows if and only if it is invertible
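A quick NumPy illustration with hypothetical matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # linearly independent columns -> invertible
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # second column = 2 * first -> singular

assert np.linalg.matrix_rank(A) == 2
A_inv = np.linalg.inv(A)      # succeeds

assert np.linalg.matrix_rank(B) == 1
# Inverting B raises LinAlgError because its columns are dependent
try:
    np.linalg.inv(B)
    invertible = True
except np.linalg.LinAlgError:
    invertible = False
```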
The normal equation
A^T A x = A^T b, obtained by requiring the residual b − Ax of the least-squares problem Ax ≈ b to be orthogonal to the column space of A
Left-inverse of the matrix A in the normal equation
L = (A^T A)^(−1) A^T, which satisfies L A = I when the columns of A are linearly independent
Right-inverse of the matrix A in the normal equation
R = A^T (A A^T)^(−1), which satisfies A R = I when the rows of A are linearly independent
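A NumPy sketch of the left-inverse (A^T A)^(−1) A^T and the right-inverse A^T (A A^T)^(−1), using random matrices that are almost surely of full rank:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))     # tall: full column rank (almost surely)

# Left-inverse from the normal equation: L = (A^T A)^(-1) A^T, so L A = I
L = np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(L @ A, np.eye(3))

B = rng.standard_normal((3, 5))     # wide: full row rank (almost surely)

# Right-inverse: R = B^T (B B^T)^(-1), so B R = I
R = B.T @ np.linalg.inv(B @ B.T)
assert np.allclose(B @ R, np.eye(3))
```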
Converting coordinates from one system to another
- A is the original coordinates system
- B is the target coordinates system
- xa and xb are the coordinate representations of the same vector in the two systems
The normal equation expressed as a change of coordinates system
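A sketch of converting a coordinate vector between two hypothetical bases, solving for the target coordinates with the normal equation:

```python
import numpy as np

# Columns of A and B are two made-up basis sets for R^2
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # original basis (as columns)
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # target basis (as columns)

xa = np.array([3.0, 2.0])    # coordinates of a vector in basis A
v = A @ xa                   # the same vector in standard coordinates

# Solve B xb = v via the normal equation: xb = (B^T B)^(-1) B^T v
xb = np.linalg.inv(B.T @ B) @ B.T @ v

# Both coordinate vectors describe the same point
assert np.allclose(B @ xb, v)
```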
Disjoint Vector Spaces
Two vector spaces U ⊆ R^n and W ⊆ R^n are disjoint if and only if the two spaces do not contain any vector in common other than the zero vector
Orthogonal Vector Spaces
Two vector spaces U ⊆ R^n and W ⊆ R^n are orthogonal if and only if for any pair of vectors u ∈ U and w ∈ W, the dot product of the two vectors is 0
Rectangular matrices are said to be of full rank when either the rows or the columns are linearly independent
The former is referred to as full row rank, whereas the latter is referred to as full column rank
Left Null Space
The left null space of an n × d matrix A is the subspace of R^n containing all column vectors x ∈ R^n such that A^T x = 0. The left null space of A is the orthogonal complementary subspace of the column space of A
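A small NumPy example with a made-up 3 × 2 matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # 3 x 2: column space is a plane in R^3

# x = (1, 1, -1) satisfies A^T x = 0, so it lies in the left null space
x = np.array([1.0, 1.0, -1.0])
assert np.allclose(A.T @ x, 0)

# The left null space is orthogonal to the column space: x . (A c) = 0 for any c
c = np.array([2.0, -3.0])
assert abs(x @ (A @ c)) < 1e-12
```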
The four fundamental subspaces of linear algebra
The column space, the row space, the null space, and the left null space of a matrix
Linear Independence and Gram Matrix
The matrix A^T A is said to be the Gram matrix of the column space of an n × d matrix A. The columns of the matrix A are linearly independent if and only if A^T A is invertible
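A NumPy check with hypothetical matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 3.0]])   # linearly independent columns

G = A.T @ A                  # 2 x 2 Gram matrix
assert np.linalg.matrix_rank(G) == 2   # full rank <=> invertible

B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])   # dependent columns (col2 = 2 * col1)
G_b = B.T @ B
assert np.linalg.matrix_rank(G_b) == 1  # singular Gram matrix
```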
Orthogonality of Basis Vectors for the discrete cosine transform
The dot product of any pair of basis vectors bp and bq of the discrete cosine transform with p ≠ q is 0
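A numerical check, using the DCT-II style basis vectors b_p[j] = cos(π p (j + 1/2)/n) (the exact convention assumed here):

```python
import numpy as np

# DCT-II basis vectors for length n: b_p[j] = cos(pi * p * (j + 0.5) / n)
n = 8
j = np.arange(n)
basis = np.array([np.cos(np.pi * p * (j + 0.5) / n) for p in range(n)])

# Gram matrix of the basis: off-diagonal entries (p != q) are all zero
dots = basis @ basis.T
off_diag = dots - np.diag(np.diag(dots))
assert np.allclose(off_diag, 0)
```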
Regularized left and right inverse
Left: (A^T A + λI)^(−1) A^T; right: A^T (A A^T + λI)^(−1), where the regularization parameter λ > 0 guarantees invertibility even when A is not of full rank
Orthogonal projection of a point onto a line
Projection of a point onto a line
the Subspace goes in the Subscript
Moore-Penrose pseudoinverse
Given the singular value decomposition A = U Σ V^T, the pseudoinverse is A^+ = V Σ^+ U^T, where Σ^+ replaces each nonzero entry σ of Σ with 1/σ and transposes the result
Moore-Penrose pseudoinverse relation to other inverses
- The conventional inverse, the left-inverse, and the right-inverse are special cases of the Moore-Penrose pseudoinverse.
- When the matrix A is invertible, all four inverses are the same.
- When only the columns of A are linearly independent, the Moore-Penrose pseudoinverse is the left-inverse.
- When only the rows of A are linearly independent, the Moore-Penrose pseudoinverse is the right-inverse.
- When neither the rows nor the columns of A are linearly independent, the Moore-Penrose pseudoinverse still provides a generalized inverse, which none of the special cases above covers.
- Like the left- and right-inverses, the Moore-Penrose pseudoinverse respects both the best-fit and the conciseness criteria.
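These relationships can be checked with NumPy's np.linalg.pinv; the random matrices below are almost surely of full rank:

```python
import numpy as np

rng = np.random.default_rng(2)

# Square invertible: pseudoinverse coincides with the ordinary inverse
A = rng.standard_normal((3, 3))
assert np.allclose(np.linalg.pinv(A), np.linalg.inv(A))

# Tall, full column rank: pseudoinverse equals the left-inverse (A^T A)^(-1) A^T
B = rng.standard_normal((5, 3))
left_inv = np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(np.linalg.pinv(B), left_inv)

# Wide, full row rank: pseudoinverse equals the right-inverse C^T (C C^T)^(-1)
C = rng.standard_normal((3, 5))
right_inv = C.T @ np.linalg.inv(C @ C.T)
assert np.allclose(np.linalg.pinv(C), right_inv)
```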
Orthonormal matrix relation to the identity matrix
A square matrix Q with orthonormal columns satisfies Q^T Q = Q Q^T = I; a rectangular matrix with orthonormal columns satisfies only Q^T Q = I
Projection matrix in the normal equation
P = A (A^T A)^(−1) A^T, which maps any vector b to its orthogonal projection onto the column space of A
Computing the projection matrix via QR decomposition
If A = QR, where Q has orthonormal columns, the projection matrix P = A (A^T A)^(−1) A^T simplifies to P = Q Q^T
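A NumPy comparison of the two routes to the projection matrix (a random matrix of full column rank is assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))   # full column rank (almost surely)

# Projection matrix from the normal equation: P = A (A^T A)^(-1) A^T
P_normal = A @ np.linalg.inv(A.T @ A) @ A.T

# Via QR: A = QR with orthonormal Q, so P = Q Q^T (numerically more stable)
Q, R = np.linalg.qr(A)            # reduced QR: Q is 5 x 3
P_qr = Q @ Q.T

assert np.allclose(P_normal, P_qr)
assert np.allclose(P_qr @ P_qr, P_qr)   # projection matrices are idempotent
```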
Condition Number
Let A be a d × d invertible matrix, and let ||Ax||/||x|| be the scaling ratio of a nonzero vector x. The condition number of A is the ratio of the largest scaling ratio of A (over all d-dimensional vectors) to the smallest scaling ratio (over all d-dimensional vectors).
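Since the scaling ratio is maximized and minimized along the singular vectors, the condition number equals the ratio of the largest to the smallest singular value; a sketch with a hypothetical diagonal matrix:

```python
import numpy as np

A = np.diag([10.0, 1.0, 0.1])   # scales the axes by 10, 1, and 0.1

# Largest scaling ratio / smallest scaling ratio = sigma_max / sigma_min
s = np.linalg.svd(A, compute_uv=False)   # singular values only
cond = s.max() / s.min()

assert np.isclose(cond, 100.0)
assert np.isclose(np.linalg.cond(A), cond)   # NumPy's built-in agrees
```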
Inner Products: Restricted Definition
A mapping from x, y ∈ R^n to 〈x, y〉 ∈ R is an inner product if and only if 〈x, y〉 is always equal to the dot product between Ax and Ay for some n × n non-singular matrix A. The inner product can also be expressed using the Gram matrix S = A^T A as 〈x, y〉 = x^T S y
Inner product: cosines and distances
When the linear transformation A is a rotreflection matrix, the matrix S is the identity matrix, and the inner product specializes to the normal dot product
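A NumPy sketch with a made-up non-singular A and a rotation matrix:

```python
import numpy as np

# Hypothetical non-singular A defining <x, y> = (Ax) . (Ay) = x^T S y with S = A^T A
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
S = A.T @ A

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
assert np.isclose((A @ x) @ (A @ y), x @ S @ y)

# When A is a rotation, S = I and the inner product reduces to the dot product
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(R.T @ R, np.eye(2))
assert np.isclose((R @ x) @ (R @ y), x @ y)
```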
Inner-Product: General Definition
The real value 〈u, v〉 is an inner product between u and v if it satisfies the following axioms for all u, v, w and scalars a, b:
- Symmetry: 〈u, v〉 = 〈v, u〉
- Linearity: 〈au + bw, v〉 = a〈u, v〉 + b〈w, v〉
- Positive definiteness: 〈u, u〉 ≥ 0, with 〈u, u〉 = 0 if and only if u = 0
Complex-valued inner product
Conjugate Transpose of Vector and Matrix
The conjugate transpose v∗ of a complex vector v is obtained by transposing the vector and replacing each entry with its complex conjugate. The conjugate transpose V∗ of a complex matrix V is obtained by transposing the matrix and replacing each entry with its complex conjugate
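In NumPy, the conjugate transpose is .conj().T; a small example with made-up complex entries:

```python
import numpy as np

v = np.array([1 + 2j, 3 - 1j])
V = np.array([[1 + 1j, 2 + 0j],
              [0 + 3j, 3 - 2j]])

# Conjugate transpose: transpose, then conjugate each entry
v_star = v.conj()       # a 1-D array has no distinct transpose
V_star = V.conj().T

assert V_star[0, 1] == -3j   # was V[1, 0] = 3j
assert V_star[1, 0] == 2     # was V[0, 1] = 2

# v* v gives the squared norm of v, which is always real and non-negative
assert np.isclose(v_star @ v, 15)
```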
Orthogonality in C^n
Two complex vectors u, v ∈ C^n are orthogonal when u* v = 0, with the conjugate transpose replacing the ordinary transpose in the dot product
Orthogonal Matrix with Complex Entries
A complex matrix V with orthonormal columns and rows is unitary: V* V = V V* = I
The Discrete Fourier Transform
Used for finding an orthonormal basis for time-series in the complex domain
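A sketch that builds the normalized DFT matrix and verifies orthonormality (the sign and normalization conventions here are assumptions chosen to match numpy.fft):

```python
import numpy as np

# Normalized DFT matrix: entry (r, c) is exp(-2*pi*i*r*c/n) / sqrt(n)
n = 4
rows, cols = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * rows * cols / n) / np.sqrt(n)

# The columns form an orthonormal basis of C^n: F* F = I
assert np.allclose(F.conj().T @ F, np.eye(n))

# NumPy's FFT computes the same transform up to the 1/sqrt(n) normalization
x = np.array([1.0, 2.0, 3.0, 4.0])
assert np.allclose(np.sqrt(n) * (F @ x), np.fft.fft(x))
```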