Linear Algebra Flashcards
Solution
A list of values (s_1, s_2,…,s_n) that satisfy every equation of a linear system
Consistent
The solution set is nonempty
Inconsistent
The solution set is empty
Two linear systems are equivalent if
they share the same solution set
What are the options for the number of solutions a system can have?
1) Exactly one
2) Infinitely many
3) None
What does no solution look like in R2?
Two parallel lines
Two matrices are row equivalent if
There’s a sequence of elementary row operations that transforms one matrix into the other
Each matrix is row equivalent to how many matrices in reduced row echelon form?
Exactly one matrix
What is the rank of a matrix?
The number of pivot columns
What is the nullity of a matrix?
The number of free variable columns
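As a numerical sketch of the two cards above (NumPy is not part of the deck, just an assumption for illustration), the rank is the number of pivot columns and the nullity is the number of remaining (free variable) columns:

```python
import numpy as np

# A 3x4 matrix whose third row is the sum of the first two, so rank is 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)   # number of pivot columns
nullity = A.shape[1] - rank       # free variable columns = n - rank

print(rank, nullity)
```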
If a consistent system has more variables than equations, then…
There is at least one free variable
If there is a free variable then there are _______ solutions.
Infinitely many
Vectors u, v in Rn are equal if
u_1=v_1,…,u_n=v_n
Spanning set
The set of all linear combinations of a given set of vectors
Homogeneous
A matrix equation of the form Ax=0 (that is, Ax=b with b=0)
Trivial solution
In the matrix equation Ax=b when x=0
The system Ax=0 has a nontrivial solution if and only if
it has at least one free variable
Linear independence
No vector in the set is a linear combination of the others, so each vector adds another dimension to the span
Linear dependence
When at least one vector in the set is a linear combination of the others
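A quick way to test these two cards numerically (a sketch assuming NumPy, which the deck doesn't name): vectors placed as columns are independent exactly when the rank equals the number of vectors, i.e. no free variable appears.

```python
import numpy as np

def is_independent(cols):
    """Columns are linearly independent iff rank equals the number of columns."""
    return np.linalg.matrix_rank(cols) == cols.shape[1]

# Two vectors in R^3 with no dependence between them.
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0]])

# Second column is 2 times the first, so the set is dependent.
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0],
                      [3.0, 6.0]])

print(is_independent(independent), is_independent(dependent))
```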
Zero Matrix
An mxn matrix with all zero entries
Square Matrix
An nxn matrix
Main Diagonal
In a square matrix, it’s the values a_11, a_22,…,a_nn
Diagonal Matrix
A square matrix whose entries off the main diagonal are all zero
Is the identity matrix a diagonal matrix?
Yes
Does AB=BA hold for all matrices A,B?
No, matrix multiplication is not commutative in general
What sizes of two matrices must match for a product to be possible?
The number of columns of the 1st matrix must equal the number of rows of the 2nd matrix
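Illustrating the size rule (a NumPy sketch, my assumption, not from the deck): a 2x3 times a 3x4 works and gives a 2x4, while reversing the order fails because the inner sizes no longer match.

```python
import numpy as np

A = np.ones((2, 3))   # 2x3
B = np.ones((3, 4))   # 3x4: A's 3 columns match B's 3 rows

C = A @ B             # defined; the product is 2x4
print(C.shape)

try:
    B @ A             # B has 4 columns but A has 2 rows: undefined
except ValueError as e:
    print("product undefined:", e)
```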
What is the transpose of A?
The nxm matrix A^T whose rows are the columns of A: the 1st column of A becomes the 1st row, and so on
Symmetric
A is an nxn matrix and A=A^T
Hermitian
A is an nxn complex matrix and A=A^*, where A^* is the conjugate transpose of A
Quadratic form
A is a symmetric matrix and Q is a function defined by Q(x)=x^TAx
Positive definite
x^TAx>0 for all nonzero x
Positive semidefinite
x^TAx>=0 for all x
Negative definite
x^TAx<0 for all nonzero x
Negative semidefinite
x^TAx<=0 for all x
Indefinite
x^TAx takes both positive and negative values
Nilpotent
A is an nxn matrix such that A^k=0 (the zero matrix) for some positive integer k
Idempotent
An nxn matrix A where A^2=A
Skew-symmetric
An nxn matrix A such that A^T=-A
Trace of A/Tr(A)
The sum of the main diagonal entries of A
A is invertible if
there exists an nxn matrix A^(-1) such that AA^(-1)=A^(-1)A=I_n
If A is an nxn matrix then
its inverse is unique
A is orthogonal
A is an invertible nxn matrix and A^T=A^(-1) and therefore AA^T=A^TA=I_n
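A small check of the orthogonality card (NumPy sketch, my assumption): a rotation matrix is orthogonal, so its transpose equals its inverse and Q^TQ=QQ^T=I.

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrices are orthogonal

print(np.allclose(Q.T @ Q, np.eye(2)))           # Q^T Q = I
print(np.allclose(Q.T, np.linalg.inv(Q)))        # Q^T = Q^(-1)
```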
Elementary matrix
A square matrix obtained by forming ONE elementary row operation on I_n
If A is invertible then Ax=b must have how many solution(s)?
One(unique)
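The unique-solution card in code (assuming NumPy for illustration): when A is invertible, solving Ax=b returns the one solution, which satisfies the original system exactly.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # det = 5 != 0, so A is invertible
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)     # the unique solution of Ax = b
print(x)
```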
Determinant of a 2x2 matrix A
a_11a_22-a_12a_21
Submatrix of an nxn matrix A
Matrix obtained by removing the ith row and jth column
(i,j)-cofactor of A
C_(ij)=(-1)^(i+j)det(A_(ij))
How to find the determinant of matrices?
For a 2x2 matrix, take the product of the main diagonal minus the product of the anti-diagonal (ad-bc). For larger square matrices, use cofactor expansion along any row or column
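The cofactor expansion above, written out as a teaching sketch (NumPy assumed; in practice a library determinant routine is far faster than this recursion):

```python
import numpy as np

def det_cofactor(A):
    """Determinant by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    if n == 2:
        return A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    total = 0.0
    for j in range(n):
        # Submatrix A_1j: delete row 1 and column j+1.
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        # Cofactor C_1j = (-1)^(1+j) det(A_1j), expanded along row 1.
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(det_cofactor(A))
```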
An nxn matrix A is invertible if and only if
det(A)!=0
Cramer’s Rule
A is an invertible nxn matrix and b is a vector in Rn. Then the unique solution x of Ax=b is given by x_i=(det(A_i(b)))/(det(A)), where A_i(b) is A with its ith column replaced by b
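Cramer's rule as code (a NumPy sketch of the formula on the card; direct solving is preferable numerically):

```python
import numpy as np

def cramer(A, b):
    """Solve Ax = b by Cramer's rule: x_i = det(A_i(b)) / det(A)."""
    d = np.linalg.det(A)
    x = np.empty(A.shape[1])
    for i in range(A.shape[1]):
        Ai = A.copy()
        Ai[:, i] = b                     # A_i(b): replace the ith column with b
        x[i] = np.linalg.det(Ai) / d
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer(A, b)
print(x)
```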
Vector space
A nonempty set with vector addition and scalar multiplication satisfying the ten vector space axioms
Vectors
Elements of a vector space
Zero vector space
The set V that only contains 0
Subspaces
A subset H of a vector space V that is itself a vector space under the operations of V
Properties of a subspace:
The zero vector is in H, it’s closed under addition (for u,v in H, we have (u+v) in H), and closed under scalar multiplication (for u in H we have cu in H)
Zero subspace
The set H consisting of only the zero vector in V, H={0}
Let v1,…,vk be in a vector space V. Then H=span{v1,…,vk} is
a subspace of V, and {v1,…,vk} is a spanning set for H
Null space of a mxn matrix A
Set of all solutions to the homogeneous vector equation Ax=0: Nul(A)={x in Rn: Ax=0}
Set of all vectors x in Rn mapped to 0 in Rm by a linear transformation
Column space of an mxn matrix A
The set of all linear combinations of the columns of A. Col(A)=span{a1,…,an} where A=[a1 … an]
The column space of an mxn matrix A is a subspace of
Rm (rows)
The null space of an mxn matrix A is a subspace of
Rn (columns)
Linear transformation T:V -> W where V,W are vector spaces and
each x in V maps to a unique vector T(x) in W such that
1) T(u+v) = T(u) + T(v) for all u,v in V
2) T(cu) = cT(u) for all u in V and all scalars c
Kernel (null space) of T
Let T:V->W be a linear transformation
The set of u in V such that T(u)=0 where 0 in W:
Ker(T)={u in V: T(u) = 0}
Range of T
Let T:V->W be a linear transformation
The set of all b in W such that T(u)=b where u in V:
Rng(T)={b in W: T(u)=b for some u in V}
Let S={v1,…,vk} be a set of vectors in a vector space V.
S={v1,…,vk} is a linearly independent set if
c1v1+…+ckvk=0 holds only when c1=…=ck=0
Let S={v1,…,vk} be a set of vectors in a vector space V.
S={v1,…,vk} is a linearly dependent set if
c1v1+…+ckvk=0 holds for some constants c1,…,ck not all zero
Let H be a subspace of a vector space V. An indexed set of vectors B={b1,…,bk} in V is a basis for H if
1) B is a linearly independent set
2) H=span{b1,…,bk} (B spans H)
A basis for the Col(A) is formed from
the pivot columns of A
Let B={b1,…,bn} be a basis for a vector space V and let x in V. Then the B-coordinates of x are the constants c1,…,cn such that
x=c1b1+…+cnbn
Let B={b1,…,bn} be a basis for a vector space V and let x in V. Assume c1,…,cn are the B-coordinates of x. Then the B-coordinate vector of x is
[x]_B = [c1 c2 … cn]^T (the column vector whose entries are the B-coordinates)
Let B={b1,…,bn} be a basis and [x]_B be the B-coordinate vector of x. For x=P_B[x]_B, the change of coordinates matrix from B to the standard basis in Rn is
P_B=[b1 … bn]
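Finding B-coordinates in practice (a NumPy sketch, assumed for illustration): build P_B from the basis vectors as columns, then solve P_B[x]_B = x.

```python
import numpy as np

# A basis B = {b1, b2} for R^2, assembled as the columns of P_B.
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
P_B = np.column_stack([b1, b2])   # change-of-coordinates matrix

x = np.array([3.0, 1.0])
x_B = np.linalg.solve(P_B, x)     # B-coordinates: solve P_B [x]_B = x

# Reconstruct x as c1*b1 + c2*b2 to confirm the coordinates.
print(x_B, P_B @ x_B)
```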
A vector space V is isomorphic to another vector space W if and only if
every vector space calculation in V can be accurately reproduced in W and vice versa
Let B be a basis for a vector space V. The set {u1,…,un} in V is linearly independent if and only if
the set {[u1]_B,…,[un]_B} is linearly independent in Rn
Suppose B={b1,…,bn} is a basis for a vector space V. Then any set S in V with more than n vectors must be
linearly dependent
If a vector space V has a basis with n vectors, then every other basis for V must have
n vectors
Dimension of a vector space V
the number of vectors in the basis for V
V is spanned by a finite set
V is finite-dimensional
V is not spanned by any finite set
V is infinite-dimensional
The dimension of the zero vector space is
0
Let H be a subspace of a finite dimensional vector space V. Then
any linearly independent set in H can be expanded if necessary to a basis for H
Let A be an mxn matrix. The row space of A is
the set of all linear combinations of the rows of A.
If A has rows r1,…,rm, then Row(A)=span{r1,…,rm}
If A and B are row equivalent, then
Row(A)=Row(B)
If B is a row echelon form of A, then
the nonzero rows of B form a basis for Row(A) and Row(B)
Let A be an mxn matrix. Then
dimCol(A)=dimRow(A)
dimCol(A)=
=number of pivots of A
dimRow(B)=
=number of nonzero rows of B
dimNul(A)=
=number of free variable columns of A
Let A be an mxn matrix. The rank of A is
the dimension of the column space of A:
Rank(A)=dimCol(A)=dimRow(A)
Let A be an mxn matrix. The nullity of A is
the dimension of the null space of A:
Nullity(A)=dimNul(A)
Let A be an mxn matrix, then Rank(A)=
=Rank(A^T)
Rank-Nullity Theorem
Let A be an mxn matrix, then
Rank(A)+Nullity(A)=n
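Verifying the Rank-Nullity Theorem on a concrete matrix (NumPy assumed, as before): rank plus nullity must come out to the number of columns n.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # 2x3; the second row is twice the first

n = A.shape[1]
rank = np.linalg.matrix_rank(A)   # dim Col(A)
nullity = n - rank                # dim Nul(A)

print(rank, nullity, n)           # rank + nullity = n
```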
If W is a subspace of Rn, then the orthogonal complement of W is
the set of all vectors in Rn that are orthogonal to every vector in W.
Wperp={v in Rn: v·w=0 for all w in W}
Let W be a subspace of Rn, then
1) Wperp is a subspace of Rn
2) The only vector in common to W and Wperp is 0
3) The orthogonal complement of Wperp is W
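Computing an orthogonal complement numerically (a sketch assuming NumPy): Wperp is the null space of the matrix whose rows span W, which can be read off the SVD as the right singular vectors with zero singular value.

```python
import numpy as np

# W = span{w1} in R^3; Wperp is the null space of the matrix with row w1.
w1 = np.array([1.0, 1.0, 0.0])
M = w1[np.newaxis, :]

_, s, Vt = np.linalg.svd(M)   # rows of Vt are orthonormal in R^3
rank = int(np.sum(s > 1e-12))
Wperp = Vt[rank:]             # remaining rows form a basis for Wperp

print(Wperp)
```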
Let T:V->W be a linear transformation. The rank of T is
the dimension of the range of T:
Rank(T)=dimRng(T)
Let T:V->W be a linear transformation. The nullity of T is
the dimension of the kernel of T:
Nullity(T)=dimKer(T)
Rank-Nullity Theorem for Linear Transformations
Let T:V->W be a linear transformation. Then
Rank(T)+Nullity(T)=dimV
A linear transformation T:V->W is one-to-one if
T maps distinct vectors in V to distinct vectors in W
If Rng(T)=W, then T is called
onto
A linear transformation T:V->W is one-to-one if and only if
Ker(T)={0}
Let dimV=dimW=n. Then a linear transformation T:V->W is one-to-one if and only if
it is onto