Final Definitions Flashcards
Linearly Dependent Set
A set of vectors is called linearly dependent if at least one of the vectors is a linear combination of the other vectors; equivalently, c1v1 + c2v2 + ... + cpvp = 0 has a solution where not all the weights ci are zero. Otherwise the set of vectors is called linearly independent.
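Example: in R^2, the set {(1, 0), (2, 0)} is linearly dependent, since (2, 0) = 2(1, 0); the set {(1, 0), (0, 1)} is linearly independent.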
Linear Transformation
A function T: R^n → R^m
that satisfies the following properties for all vectors x, y and all scalars a:
T(x + y) = T(x) + T(y)
T(ax) = aT(x)
T is a linear transformation if and only if T(x) = Ax for some m×n matrix A (the standard matrix of T).
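Example: T(x1, x2) = (x1 + 2x2, 3x2) is linear, and T(x) = Ax for the matrix A with rows [1 2] and [0 3]; e.g. T(1, 1) = (3, 3) = A(1, 1).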
Matrix Inverse
An n×n matrix A is called invertible if there exists an n×n matrix B such that AB = BA = I, where I is the n×n identity matrix. B is called the inverse of A and is denoted A^-1.
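Example: for A with rows [2 1] and [1 1], the matrix B with rows [1 -1] and [-1 2] satisfies AB = BA = I, so A^-1 = B.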
Basis
A set of linearly independent vectors in R^n that spans R^n.
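Example: the standard basis {(1, 0), (0, 1)} of R^2; {(1, 1), (1, -1)} is another basis of R^2.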
Subspace
A subset of a vector space that is itself a vector space: it contains the zero vector and is closed under vector addition and scalar multiplication.
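Example: the plane {(x, y, 0)} is a subspace of R^3 (it contains 0 and is closed under addition and scaling); a line that misses the origin is not a subspace.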
Basis for subspace
A set of linearly independent vectors in S that spans the subspace S.
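Example: {(1, 0, 0), (0, 1, 0)} is a basis for the subspace S = {(x, y, 0)} of R^3.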
Dimension
The number of vectors in any basis of H (every basis of H has the same number of vectors).
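Example: dim R^n = n; the plane {(x, y, 0)} in R^3 has dimension 2.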
Null space
The set of all solutions to the homogeneous equation Ax = 0, denoted Nul A.
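Example: for the 1×2 matrix A = [1 2], Ax = 0 means x1 + 2x2 = 0, so the null space is all multiples of (-2, 1).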
Rank
The number of pivot columns of the matrix A.
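Example: the matrix with rows [1 2 3] and [2 4 6] has rank 1; row reduction leaves a single pivot, since row 2 = 2(row 1).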
Column Space
The set of all linear combinations of the columns of A.
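Example: for the matrix with rows [1 2 3] and [2 4 6], every column is a multiple of (1, 2), so the column space is the line spanned by (1, 2) in R^2.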
Eigenvectors/eigenvalues
An eigenvalue λ of A is a scalar such that Av = λv for some non-zero vector v; such a v is called an eigenvector of A corresponding to λ.
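Example: for A with rows [2 1] and [0 3], A(1, 0) = (2, 0) = 2(1, 0) and A(1, 1) = (3, 3) = 3(1, 1), so the eigenvalues are 2 and 3 with eigenvectors (1, 0) and (1, 1).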
Diagonalizable matrix
Matrix A is diagonalizable if there exist a diagonal matrix D and an invertible matrix P such that A = PDP^-1.
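Example: the matrix A with rows [2 1] and [0 3] above is diagonalizable: take D = diag(2, 3) and P with columns (1, 0) and (1, 1) (the eigenvectors); then A = PDP^-1.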
Similar matrices
Matrix A is similar to matrix M if there exists an invertible matrix P such that A = PMP^-1.
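Example: every diagonalizable matrix A = PDP^-1 is similar to the diagonal matrix D; similar matrices have the same eigenvalues.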
Orthogonality
Vectors u and v are orthogonal to each other if their dot product is zero: u·v = 0.
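Example: u = (1, 2) and v = (2, -1) are orthogonal, since u·v = 1(2) + 2(-1) = 0.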
Orthogonal Complement
The set of all vectors in R^p that are orthogonal to every vector of the subspace W, denoted W⊥.
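Example: in R^3, if W is the xy-plane {(x, y, 0)}, then W⊥ is the z-axis {(0, 0, z)}.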