New Deck Flashcards
Basis (for a subspace)
A basis for a subspace W is a set of vectors v_1, …, v_k in W such that:
- v_1, …, v_k are linearly independent; and
- v_1, …, v_k span W
Characteristic Polynomial of a matrix:
the characteristic polynomial of an n by n matrix A is the polynomial in t given by the formula det(A - t*I)
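A quick numerical check (a Python sketch; the 2 by 2 matrix here is an illustrative example, not from the deck): the eigenvalues of A are exactly the roots of det(A - t*I).

```python
def char_poly_2x2(A, t):
    """Evaluate det(A - t*I) for a 2x2 matrix A at the scalar t."""
    (a, b), (c, d) = A
    return (a - t) * (d - t) - b * c

A = [[2, 1], [1, 2]]  # characteristic polynomial: t^2 - 4t + 3 = (t - 1)(t - 3)
print(char_poly_2x2(A, 1))  # 0 -> t = 1 is an eigenvalue
print(char_poly_2x2(A, 3))  # 0 -> t = 3 is an eigenvalue
```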
Column Space of a matrix:
the column space of a matrix is the subspace spanned by the columns of the matrix, considered as vectors
(see also: row space of a matrix)
Defective matrix
a matrix A is defective if A has an eigenvalue whose geometric multiplicity is less than its algebraic multiplicity
Diagonalizable matrix
A matrix is diagonalizable if it is similar to a diagonal matrix, i.e., if there is a nonsingular matrix S such that S^{-1}*A*S is diagonal
Dimension of a subspace
The dimension of a subspace W is the number of vectors in any basis of W (if W is the subspace {0}, we say that its dimension is 0)
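A concrete check (Python sketch; the matrix and eigenvector columns are chosen for illustration): building S from eigenvectors of A makes S^{-1}*A*S come out diagonal.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [1, 2]]               # eigenvalues 3 and 1
S = [[1, 1], [1, -1]]              # columns: eigenvectors [1, 1] and [1, -1]
S_inv = [[0.5, 0.5], [0.5, -0.5]]  # inverse of S
D = matmul(S_inv, matmul(A, S))
print(D)  # [[3.0, 0.0], [0.0, 1.0]] -- diagonal, entries are the eigenvalues
```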
Row echelon form of a matrix
A matrix is in row echelon form if
- all rows that consist entirely of zeros are grouped together at the bottom of the matrix; and
- the first (counting left to right) nonzero entry in each nonzero row appears in a column to the right of the first nonzero entry in the preceding row (if there is a preceding row)
Reduced row echelon form of a matrix:
A matrix is in reduced row echelon form if
- the matrix is in row echelon form;
- the first nonzero entry in each nonzero row is the number 1; and
- the first nonzero entry in each nonzero row is the only nonzero entry in its column
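The reduction itself can be sketched in a few lines of Python (exact arithmetic via Fraction; a teaching sketch, not production code):

```python
from fractions import Fraction

def rref(M):
    """Row-reduce M (a list of row lists) to reduced row echelon form."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]         # interchange rows
        pivot = M[pivot_row][col]
        M[pivot_row] = [x / pivot for x in M[pivot_row]]  # make the pivot 1
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:         # clear the column
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return M

print(rref([[1, 2, 3], [2, 4, 6]]))  # second row is a multiple -> zero row
```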
Eigenspace of a matrix
The eigenspace associated with the eigenvalue c of a matrix A is the null space of A-c*I
Eigenvalue of a matrix:
An eigenvalue of an n by n matrix A is a scalar c such that Ax = cx holds for some nonzero vector x
Eigenvector of a matrix:
An eigenvector of an n by n matrix A is a nonzero vector x such that Ax = cx holds for some scalar c
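A minimal check of Ax = cx in Python (the matrix and vector are illustrative):

```python
def matvec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]
x = [1, 1]
print(matvec(A, x))  # [3, 3] = 3 * x, so x is an eigenvector with eigenvalue 3
```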
equivalent linear systems
Two systems of linear equations in n unknowns are equivalent if they have the same set of solutions
homogeneous linear system
A system of linear equations A*x=b is homogeneous if b=0
inconsistent linear system
A system of linear equations is inconsistent if it has no solutions
inverse of a matrix
the matrix B is an inverse for the matrix A if AB = BA = I, the identity matrix
Least squares solution of a linear system
A least-squares solution to a system of linear equations Ax = b is a vector x that minimizes the length of the vector Ax - b
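One standard way to compute it is via the normal equations A^T A x = A^T b (a pure-Python sketch for a small overdetermined system; the data is made up for illustration):

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matvec(M, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in M]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Overdetermined system: three equations, two unknowns
A = [[1, 0], [1, 1], [1, 2]]
b = [6, 0, 0]

At = transpose(A)
AtA = matmul(At, A)   # [[3, 3], [3, 5]]
Atb = matvec(At, b)   # [6, 0]

# Solve the 2x2 normal equations by Cramer's rule
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x1 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
x2 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
print([x1, x2])  # [5.0, -3.0] minimizes the length of Ax - b
```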
Linear Combination of vectors
vector v is a linear combination of the vectors v_1, …, v_k if there exist scalars a_1, …, a_k such that v = a_1*v_1 + … + a_k*v_k
Linearly Dependent vectors
vectors v_1, …, v_k are linearly dependent if the equation a_1*v_1 + … + a_k*v_k = 0 has a solution where not all the scalars a_1, …, a_k are zero
Linearly Independent vectors
vectors v_1, …, v_k are linearly independent if the only solution to the equation a_1*v_1 + … + a_k*v_k = 0 is a_1 = … = a_k = 0
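For two vectors in the plane, independence is equivalent to a nonzero 2x2 determinant (a Python sketch; the vectors are illustrative):

```python
def independent_2d(v, w):
    """Two vectors in R^2 are linearly independent iff det([v w]) != 0."""
    return v[0] * w[1] - v[1] * w[0] != 0

print(independent_2d([1, 2], [3, 4]))  # True  (det = -2)
print(independent_2d([1, 2], [2, 4]))  # False (one is a multiple of the other)
```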
Linear Transformation
a linear transformation from V to W is a function T from V to W such that:
- T(u+v) = T(u) + T(v) for all vectors u and v in V; and
- T(av) = aT(v) for all vectors v in V and all scalars a
algebraic multiplicity of an eigenvalue:
The algebraic multiplicity of an eigenvalue c of a matrix A is the number of times the factor (t-c) occurs in the characteristic polynomial of A
Geometric multiplicity of an eigenvalue
The geometric multiplicity of an eigenvalue c of a matrix A is the dimension of the eigenspace of c.
Symmetric matrix
a matrix is symmetric if it equals its transpose
Subspace
A subset W of n-space is a subspace if
- the zero vector is in W
- x+y is in W whenever x and y are in W; and
- a*x is in W whenever x is in W and a is any scalar
Span of a set of vectors
the span of the vectors v_1, …, v_k is the subset V consisting of all linear combinations of v_1, …, v_k. One can also say that the subspace V is spanned by the vectors v_1, …, v_k and that these vectors span V
Singular matrix
An n by n matrix A is singular if the equation A*x = 0 (where x is an n-tuple) has a nonzero solution for x
Row space of a matrix
the row space of a matrix is the subspace spanned by the rows of the matrix considered as vectors
Row equivalent matrices
Two matrices are row equivalent if one can be obtained from the other by a sequence of elementary row operations.
The elementary row operations performed on a matrix are:
* interchange two rows;
* multiply a row by a nonzero scalar;
* add a constant multiple of one row to another
Similar Matrices
Matrices A and B are similar if there is a square nonsingular matrix S such that S^{-1}*A*S = B
Nonsingular matrix
an n by n matrix A is nonsingular if the only solution to the equation A*x = 0 is x = 0
Null Space of a matrix
The null space of an m by n matrix A is the set of all n-tuples x such that A*x = 0
Null Space of a linear transformation
The null space of a linear transformation T is the set of vectors v in its domain such that T(v) = 0
Nullity of a matrix
The nullity of a matrix is the dimension of its null space
Nullity of a linear transformation
The nullity of a linear transformation is the dimension of its null space
Orthogonal set of vectors
A set of n-tuples is orthogonal if the dot product of any two distinct vectors in the set is 0
Orthogonal matrix
A matrix A is orthogonal if A is invertible and its inverse equals its transpose: A^{-1} = A^T
Orthogonal linear transformation
A linear transformation T from V to W is orthogonal if T(v) has the same length as v for all vectors v in V
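A rotation matrix is the standard example of an orthogonal matrix: its transpose times itself gives the identity (Python sketch; the angle is arbitrary):

```python
import math

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

t = math.pi / 3
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]   # rotation by angle t

QtQ = matmul(transpose(Q), Q)
# QtQ equals the identity up to floating-point error, so Q^T = Q^{-1}
assert all(abs(QtQ[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
```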
Orthonormal set of vectors
A set of n-tuples is orthonormal if it is orthogonal and each vector has length 1
Range of a matrix
The range of an m by n matrix A is the set of all m-tuples A*x, where x is any n-tuple
Range of a linear transformation
The range of a linear transformation T is the set of all vectors T(v), where v is any vector in its domain
Rank of a matrix
The rank of a matrix is the number of nonzero rows in any row equivalent matrix that is in row echelon form
Rank of a linear transformation
the rank of a linear transformation (and hence of any matrix regarded as a linear transformation) is the dimension of its range. A theorem tells us that the two definitions of rank of a matrix are equivalent
Sheldon Axler Notes:
Vectors are defined first
Linear independence, span, basis, and dimension are defined in the chapter that presents the basic theory of finite-dimensional vector spaces
Linear maps are introduced next, leading to the Fundamental Theorem of Linear Maps: if T is a linear map on V, then dim V - dim null T = dim range T. Quotient spaces and duality are topics with high abstraction
The theory of polynomials is needed to understand linear operators (this chapter uses no linear algebra).
Studying linear operators by restricting them to small subspaces leads to eigenvectors. On complex vector spaces, eigenvalues always exist. Each linear operator on a complex vector space has an upper triangular matrix with respect to some basis.
Inner product spaces are defined in this chapter and their basic properties are developed, along with orthonormal bases and the Gram–Schmidt procedure. Orthogonal projections can be used to solve certain minimization problems
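The Gram–Schmidt procedure mentioned above, sketched in Python (the input vectors are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Turn a linearly independent list of vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:                # subtract the projection onto each
            c = dot(w, q)              # previously built orthonormal vector
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = dot(w, w) ** 0.5
        basis.append([wi / norm for wi in w])
    return basis

print(gram_schmidt([[3, 0], [1, 2]]))  # [[1.0, 0.0], [0.0, 1.0]]
```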
The spectral theorem, which characterizes the linear operators for which there is an orthonormal basis consisting of eigenvectors, is the highlight.
notes
Invertible Matrix
How do you know if A^{-1} exists: no nonzero solution to Ax = 0
columns of A are linearly independent
rows of A are linearly independent
det A != 0
unique solution to Ax = b for every b
all eigenvalues are nonzero: means it is invertible
nonsingular means invertible
if Ax = 0, the only solution is x = 0 when A is invertible, and thus A^{-1} A = I
How do you know if you can invert A: if A is invertible, A must be nonsingular (and A^{-1} is also nonsingular)
det(A) != 0 is the same as saying Ax = 0 only for x = 0
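For 2x2 matrices the invertibility checks are easy to see concretely (Python sketch; the matrix is illustrative): the inverse exists exactly when the determinant is nonzero.

```python
def inverse_2x2(A):
    """Invert a 2x2 matrix via the adjugate formula; fails when det = 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: no inverse")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[4, 7], [2, 6]]        # det = 10, so A is invertible
A_inv = inverse_2x2(A)
print(A_inv)                # [[0.6, -0.7], [-0.2, 0.4]]
```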
Solutions to system
Consistent: iff the rightmost column of the augmented matrix is not a pivot column
Inconsistent: the rightmost column of the augmented matrix is a pivot column, i.e. some row reduces to [0 … 0 | b] with b != 0
Reduced Row Echelon Form (RREF)
Free variable: a variable whose column contains no pivot
unique solution : no free variable
infinitely many solutions: at least 1 free variable
Linear combination
y = c_1*x_1 + … + c_n*x_n
set
{[ ] , [ ] , [ ] }
Span
some linear combination of set
identity matrix
1 0 0
0 1 0
0 0 1