Linear Algebra Flashcards
When is a matrix in upper echelon form?
All zero rows are below the non-zero rows. The first non-zero entry of each non-zero row is 1. The first non-zero entry of each non-zero row is strictly to the right of the one in the row above.
What are the properties of a vector space? (5)
i) a(U+V) = aU + aV ii) (a+b)V = aV + bV iii) (ab)V = a(bV) iv) 1V = V v) (V, +) is an abelian group (associative, commutative addition with a zero vector and additive inverses)
What is a linear combination of v(1), v(2), … , v(n) ?
a vector of the form v = a(1)v(1) + a(2)v(2) + … + a(n)v(n) for scalars a(1),…,a(n) in K
When are vectors linearly independent?
When the only linear combination of the vectors equal to zero is the trivial one (all coefficients zero)
Lemma; The vectors v(1),…,v(n) are linearly dependent if and only if either v(1)=0 or, for some r, the vector v(r) is a linear combination of v(1),…,v(r-1)
If v(1)=0 then taking a(1) = 1 and a(i) = 0 for all i not equal to 1 gives a non-trivial linear combination equal to zero, so the vectors are linearly dependent. If v(r) is a linear combination v(r) = a(1)v(1)+ … + a(r-1)v(r-1), then a(1)v(1)+ … + a(r-1)v(r-1) - v(r) = 0 with a non-zero coefficient on v(r), so the vectors are linearly dependent. Conversely, if they are linearly dependent then a(1)v(1)+ … + a(n)v(n)=0 with the coefficients not all zero. Let r be maximal with a(r) not zero. If r=1 then a(1)v(1)=0 and so v(1)=0. Otherwise v(r) = -a(1)v(1)/a(r) - … - a(r-1)v(r-1)/a(r), so v(r) is a linear combination of v(1),…,v(r-1).
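A quick numerical sketch of the independence test (using numpy; not part of the original notes): vectors are linearly independent exactly when the matrix with them as rows has rank equal to the number of vectors.

```python
import numpy as np

# Illustrative example: v1, v2 are independent; adding v3 = v1 + v2
# makes the set dependent.
v1, v2, v3 = [1, 0, 0], [0, 1, 0], [1, 1, 0]

A = np.array([v1, v2])
independent = np.linalg.matrix_rank(A) == A.shape[0]

B = np.array([v1, v2, v3])
dependent = np.linalg.matrix_rank(B) < B.shape[0]   # v3 = v1 + v2
```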
When does a subset S of vectors span a vector space V?
When every vector in V is a linear combination of the vectors in S
When is a subset S of V a basis of V?
When they span V and the vectors in S are linearly independent
Lemma; The vectors v(1),…..,v(n) form a basis of V if and only if every vector in V can be written uniquely as a linear combination of v(1),…..,v(n)
Suppose v(1),…,v(n) form a basis of V. Then each vector v in V can be written as v = a(1)v(1)+ … + a(n)v(n). Suppose also that v = b(1)v(1) + … + b(n)v(n). Then v-v = (a(1)-b(1))v(1) + … + (a(n)-b(n))v(n) = 0, which by linear independence implies a(i) = b(i) for all i. So each vector can be written uniquely. Conversely, suppose each vector in V can be written uniquely as such a linear combination. Then v(1),…,v(n) certainly span V. Moreover, if a(1)v(1)+…+a(n)v(n)=0, then since 0 = 0v(1)+…+0v(n) and the expression is unique, all a(i) = 0; so the vectors are linearly independent and therefore form a basis.
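A numerical sketch of the uniqueness statement (numpy, not from the notes): if the basis vectors are the columns of an invertible matrix P, the unique coefficients of v are the solution of Pa = v.

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns: basis (1,0) and (1,1) of R^2
v = np.array([3.0, 2.0])
a = np.linalg.solve(P, v)         # unique coefficients: v = 1*(1,0) + 2*(1,1)
```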
What is the dimension of V?
The number of vectors in a basis of V (this number is the same for every basis)
Lemma; Suppose that vectors v(1),….,v(n),w span V and that w is a linear combination of v(1),….,v(n). Then v(1),….,v(n) span V
Since v(1),…,v(n),w span V, every vector v can be written as a linear combination v = a(1)v(1)+…+a(n)v(n)+a(w)w. Since w is a linear combination w = b(1)v(1)+…+b(n)v(n), this can be substituted in to get v = (a(1)+a(w)b(1))v(1)+…+(a(n)+a(w)b(n))v(n), so the vectors v(1),…,v(n) span V.
Theorem; Suppose that the vectors v(1),….v(r) span the vector space V. Then there is a subsequence of v(1),…,v(r) that forms a basis of V
We sift the vectors v(1),…,v(r). The vectors that are sifted out are linear combinations of the remaining vectors, so the remaining vectors still span V. The remaining vectors are also linearly independent, and hence form a basis of V.
Theorem; Let V be a vector space over K which has a finite spanning set, and suppose that the vectors v(1),….,v(r) are linearly independent. Then we can extend the sequence to a basis v(1),….,v(n) where n >= r
Suppose that w(1),…,w(q) is a spanning set. We sift the sequence v(1),…,v(r),w(1),…,w(q). Since w(1),…,w(q) spans V, the whole sequence spans V. Furthermore, since v(1),…,v(r) are linearly independent, none of them is sifted out, so the resulting basis contains v(1),…,v(r).
What is the row reduced form of the matrix whose rows are vectors forming a basis?
Identity
Proposition; Suppose that vectors v(1),….,v(n) span V and that vectors w(1),…..w(m) are linearly independent. Then m<=n
Since v(1),…,v(n) span V, the vectors w(1),v(1),…,v(n) are linearly dependent, so when we sift at least one v(j) is deleted. We then add w(2) to the front and sift again, and keep repeating. None of the w(i) are ever deleted since they are linearly independent. In total we add m vectors, and a v(j) is deleted each time, so we must have m<=n.
Can n-1 vectors span a vector space of dimension n?
No
Can n+1 vectors span a vector space of dimension n?
Yes
Can n-1 vectors be linearly independent in a vector space of dimension n?
Yes
Can n+1 vectors be linearly independent in a vector space of dimension n?
No
What are the 3 conditions for a subset W of a vector space V to be a subspace?
i) W is non-empty ii) Closed under addition iii) Closed under scalar multiplication
Proposition; If W(1) and W(2) are subspaces then W(1)nW(2) is also a subspace
Take u,v in W(1)nW(2). Then u+v is in W(1) and u+v is in W(2), so u+v is in the intersection. Similarly au is in the intersection. Since 0 lies in both subspaces the intersection is non-empty, so it is a subspace.
Define the subset W(1) + W(2)
The set of all vectors w(1) + w(2), where w(1) is in W(1) and w(2) is in W(2)
Proposition; W(1) + W(2) is a subspace where W(1) and W(2) are subspaces, in fact it’s the smallest subspace containing both
Take u,v in W(1) + W(2), with u = u(1)+u(2) and v = v(1)+v(2). Then u+v = (u(1)+v(1)) + (u(2)+v(2)) is in W(1) + W(2), and au = au(1)+au(2) is in W(1) + W(2), so it is a subspace. Any subspace containing both W(1) and W(2) must contain every sum w(1)+w(2), and hence contains W(1) + W(2), so it is the smallest such subspace.
What is the equation for dim(W(1)+W(2))
dim(W(1)) + dim(W(2)) - dim(W(1)nW(2))
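A small numerical check of the dimension formula (numpy, not from the notes), in R^3 with W1 = span{e1, e2} and W2 = span{e2, e3}: the intersection is span{e2}, so the formula predicts dim(W1+W2) = 2 + 2 - 1 = 3.

```python
import numpy as np

W1 = np.array([[1, 0, 0], [0, 1, 0]])   # rows span W1
W2 = np.array([[0, 1, 0], [0, 0, 1]])   # rows span W2

# The sum W1 + W2 is spanned by all four rows together.
dim_sum = np.linalg.matrix_rank(np.vstack([W1, W2]))
```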
What are the two properties of a linear transformation T from U to V
i) T(u(1)+u(2)) = T(u(1)) + T(u(2)) ii) T(au) = aT(u)
Proposition; Let U,V be vector spaces, let S be a basis of U and let f:S -> V be a function assigning to each vector in S an element of V. Then there is a unique linear map T from U to V such that for every s in S T(s) = f(s)
Since S is a basis of U, each element u in U is uniquely determined as u = a(1)s(1) + … + a(n)s(n). So T(u) = T(a(1)s(1)+…+a(n)s(n)) = a(1)f(s(1)) + … + a(n)f(s(n)). So T, if it exists, is uniquely determined. Conversely, defining T by this formula gives a well-defined linear map (well-defined because the coefficients a(i) are unique), so T exists.
Let T(1), T(2) be linear maps with n x n matrices A,B. What is the matrix of the transformation i) T(1) + T(2) ii) the composition T(1)T(2)
i) A+B ii) AB
What is the image of T, that maps U to V
The set of vectors v in V such that v = T(u) for some u in U
What is the kernel of T, that maps U to V
The set of vectors u in U such that T(u)=0
Proposition; Let T map U to V, then i) Im(T) is a subspace of V ii) Ker(T) is a subspace of U
i) Take v(1),v(2) in Im(T), say v(1) = T(u(1)) and v(2) = T(u(2)). Then v(1) + v(2) = T(u(1)) + T(u(2)) = T(u(1)+u(2)) is in Im(T), and av(1) = aT(u(1)) = T(au(1)) is in Im(T). ii) Take u(1),u(2) in Ker(T). Then T(u(1)+u(2)) = T(u(1)) + T(u(2)) = 0 + 0 = 0 and T(au(1)) = aT(u(1)) = 0, so u(1)+u(2) and au(1) are in Ker(T).
What is the rank(T) and the nullity(T)
The rank is the dimension of the image; the nullity is the dimension of the kernel
State the Rank-Nullity theorem
Rank(T) + Nullity(T) = dim(U), where T maps U to V
Prove the Rank Nullity Theorem
Let nullity(T) = s with kernel basis e(1),…,e(s). Extend this to a basis e(1),…,e(s),f(1),…,f(r) of U, so dim(U) = s+r. Since the e(i) lie in the kernel, the vectors T(f(1)),…,T(f(r)) span Im(T): for any u = b(1)e(1)+…+b(s)e(s)+a(1)f(1)+…+a(r)f(r) we have T(u) = a(1)T(f(1))+…+a(r)T(f(r)). Now suppose a(1)T(f(1))+…+a(r)T(f(r)) = 0 for some scalars. Then a(1)f(1) + … + a(r)f(r) is in Ker(T), so a(1)f(1) + … + a(r)f(r) = b(1)e(1) + … + b(s)e(s) for some scalars b(i). Since e(1),…,e(s),f(1),…,f(r) form a basis they are linearly independent, so all the scalars are zero; hence T(f(1)),…,T(f(r)) are linearly independent. Therefore rank(T) = r and rank(T) + nullity(T) = r + s = dim(U).
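A numerical sketch of rank-nullity (numpy, not from the notes): for the map x -> Ax on R^4, the rank plus the nullity (counted via near-zero singular values) should equal dim(U) = 4.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # a multiple of the first row
              [0.0, 1.0, 0.0, 1.0]])

rank = np.linalg.matrix_rank(A)                       # dim Im(T)
s = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - int(np.count_nonzero(s > 1e-10))   # dim Ker(T)
```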
Corollary; Let T be a map from U to V and suppose that dim(U)=dim(V)=n. Then the following properties are equivalent i) T is surjective ii) rank(T) = n iii) Nullity(T) = 0 iv) T is injective v) T is bijective
Surjectivity means Im(T) = V, so rank(T) = dim(V) = n, i.e. i) implies ii); conversely, since Im(T) is a subspace of V of dimension n, ii) implies i). ii) and iii) are equivalent by the rank-nullity theorem. If nullity(T) = 0 then Ker(T) = {0}; suppose T(u(1)) = T(u(2)), then T(u(1)-u(2)) = 0, so u(1)-u(2) = 0 and T is injective; conversely an injective map has kernel {0}, so iii) and iv) are equivalent. Finally, surjective and injective together are exactly bijective.
What is the row space of A, a matrix, and what is the row rank
The subspace spanned by the rows of A. The row rank is the dimension of the row space
What is the column rank equal to?
Rank(T)
What is the rank of a matrix A equal to
the number of non-zero rows of the upper echelon form matrix
What is the complete set of solutions to the equation Ax=b
x(0) + nullspace(A), where x(0) is any one particular solution of Ax = b
Proposition; Let A be a matrix of a linear map T. A linear map T is invertible if and only if its matrix A is invertible. Then T inverse and A inverse are unique
Under the bijection between matrices and linear maps, multiplication of matrices corresponds to composition of linear maps. Therefore invertible matrices correspond to invertible linear maps. Since the inverse of a bijection is unique, T inverse (and hence A inverse) is unique.
Lemma; If A and B are invertible nxn matrices, then AB is invertible, and (AB)^-1 = B^-1A^-1
(AB)(B^-1 A^-1) = A(BB^-1)A^-1 = AA^-1 = I, and similarly (B^-1 A^-1)(AB) = I
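A quick numerical check of (AB)^-1 = B^-1 A^-1 (numpy, not from the notes), on two arbitrarily chosen invertible 2x2 matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 5.0]])   # det = -1, invertible
B = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 1, invertible

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
```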
What is the row reduced form of an invertible n x n matrix
Identity
When is a permutation even?
When it is a composition of an even number of transpositions. In this case it has sign +1
How do elementary row operations affect the determinant of a matrix?
i) det(Identity) = 1 ii) if B is the result of interchanging two rows of A, det(B) = -det(A) iii) if B is the result of adding a multiple of one row of A to another row, det(B) = det(A) iv) if B is the result of multiplying a row of A by a scalar k, det(B) = kdet(A) v) if A has two equal rows then det(A) = 0
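A numerical sketch of rules ii)-iv) (numpy, not from the notes), on a 2x2 matrix with det(A) = 2:

```python
import numpy as np

A = np.array([[2.0, 1.0], [4.0, 3.0]])
d = np.linalg.det(A)                         # 2*3 - 1*4 = 2

swapped = A[[1, 0]]                          # interchange the two rows
added = A.copy(); added[1] += 5 * added[0]   # add 5 * row 0 to row 1
scaled = A.copy(); scaled[0] *= 3.0          # multiply row 0 by k = 3
```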
When is a matrix in upper triangular form
When all of its entries below the main diagonal are zero
What is the determinant of an upper triangular matrix
The product of the main diagonal
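A one-matrix numerical check (numpy, not from the notes): the determinant of an upper triangular matrix equals the product of its diagonal entries.

```python
import numpy as np

U = np.array([[2.0, 7.0, 1.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 4.0]])
d = np.linalg.det(U)    # product of the diagonal: 2 * 3 * 4 = 24
```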
What is the (i,j)th minor of A
The (n-1)x(n-1) matrix obtained from the n x n matrix A by removing the ith row and jth column
What is the (i,j)th cofactor of A
(-1)^(i+j) multiplied by the determinant of the (i,j)th minor
What is the determinant of A?
det(A) is the sum over all permutations s of {1,…,n} of sign(s) a(1,s(1)) a(2,s(2)) … a(n,s(n)); equivalently, expanding along the first row, det(A) = a(1,1)c(1,1) + … + a(1,n)c(1,n), where c(1,j) is the (1,j)th cofactor
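An illustrative sketch of the cofactor expansion (plain Python, not from the notes): a recursive determinant expanding along the first row. It is correct but exponentially slow, so it is only for illustration.

```python
def det_cofactor(A):
    """Determinant of a square matrix (list of lists) by cofactor expansion."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for j in range(n):
        # (0, j) minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det_cofactor(minor)
    return total

d = det_cofactor([[1.0, 2.0, 3.0],
                  [0.0, 4.0, 5.0],
                  [1.0, 0.0, 6.0]])
```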
Theorem; Let A be an nxn matrix. Then det(A^T) = det(A)
The term of det(A) for a permutation s is sign(s) a(1,s(1))…a(n,s(n)). Reordering the factors, this equals sign(s) a(s^-1(1),1)…a(s^-1(n),n), which is the term of det(A^T) for the permutation s^-1, and sign(s^-1) = sign(s). As s runs over all permutations so does s^-1, so the two sums are equal.
Theorem; For an nxn matrix A, det(A) = 0 if and only if A is singular
A can be reduced to row reduced form using elementary row operations. These operations do not affect the rank of A, so they do not affect whether or not A is singular; they multiply the determinant by a non-zero scalar, so they also do not affect whether or not det(A) = 0. The rank of A is the number of non-zero rows of the row reduced form, so if A is singular (rank(A) < n) the row reduced form has a zero row, and hence determinant 0. Conversely, if A is non-singular then the row reduced form is the identity, in which case det(A) = 1, not 0.
What is the adjugate matrix adj(A) of an n x n matrix?
the n x n matrix where the (i,j)th entry is the cofactor c(j,i). It is the transpose of the matrix of cofactors
What is the product of A adj(A)
det(A) multiplied by the Identity matrix
Write the inverse of A, provided det(A) does not equal zero
1/det(A) multiplied by adj(A)
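A numerical sketch of both identities (numpy, not from the notes): build adj(A) as the transpose of the matrix of cofactors, then check A adj(A) = det(A) I and A^-1 = adj(A)/det(A).

```python
import numpy as np

def adjugate(A):
    """Adjugate of a square matrix: transpose of the matrix of cofactors."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1.0, 2.0], [3.0, 5.0]])   # det = -1
adjA = adjugate(A)
```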
When are two n x n matrices over K similar
If there exists an n x n invertible matrix P such that B=P^(-1)AP
When is a matrix diagonal
if the (i,j)th entry is 0 for all i not equal to j
When is a matrix diagonalisable
When it is similar to a diagonal matrix
Define Eigenvectors and Eigenvalues
Let T map V to V be a linear map, where V is a vector space over K. Suppose that for some non-zero vector v in V and some scalar k in K, we have T(v)=kv. Then v is called an eigenvector of T, and k an eigenvalue of T corresponding to v
Theorem; Let A be an n x n matrix. Then k is an eigenvalue of A if and only if det(A-kI)=0
k is an eigenvalue of A if and only if Av = kv for some non-zero v, i.e. (A-kI)v = 0 has a non-zero solution. This happens if and only if A-kI is singular, which holds if and only if det(A-kI) = 0.
What is the characteristic equation of the n x n matrix A
det(A - xIn) = 0
Theorem; Similar matrices have the same characteristic equation and hence the same eigenvalues
Suppose B = P^(-1)AP. Then B - xI = P^(-1)(A - xI)P, so det(B - xI) = det(P^(-1))det(A - xI)det(P) = det(A - xI), since det(P^(-1))det(P) = det(I) = 1.
What are the eigenvalues of a matrix in Upper Triangular form?
The diagonal entries
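A quick numerical check (numpy, not from the notes): the eigenvalues of an upper triangular matrix are exactly its diagonal entries.

```python
import numpy as np

U = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 5.0]])
eigs = np.sort(np.linalg.eigvals(U).real)   # should be the diagonal: 2, 3, 5
```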
What are the 3 elementary matrices
The matrices obtained from the identity matrix by a single elementary row operation: i) interchanging two rows ii) multiplying a row by a non-zero scalar iii) adding a multiple of one row to another row. Multiplying B on the left by an elementary matrix performs the corresponding row operation on B.
When does the homogenous system of equations Ax=0 have a non-zero solution
When A is singular
When does that equation system Ax=b have a unique solution
When A is non-singular
Lemma; If E is an nxn elementary matrix, and B is any nxn matrix, then det(EB)=det(E)det(B)
Multiplying B on the left by E performs the corresponding row operation on B. That operation multiplies the determinant by -1 (interchanging rows), by k (multiplying a row by k), or by 1 (adding a multiple of one row to another), and in each case this factor is exactly det(E), so det(EB) = det(E)det(B).
Theorem; For any two nxn matrices A and B we have det(AB) = det(A)det(B)
If A is singular then so is AB, and both sides are 0. Otherwise A is a product of elementary matrices, A = E(1)…E(k), and repeated application of the previous lemma gives det(AB) = det(E(1))…det(E(k))det(B) = det(A)det(B).
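A numerical spot-check of the multiplicativity of the determinant (numpy, not from the notes), on two random 4x4 matrices with a fixed seed:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
```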
Theorem; Let k1,…,kr be distinct eigenvalues of T: V to V, and v1,…,vr be corresponding eigenvectors. Then v1,…,vr are linearly independent
Induction on r. For r = 1, v1 is non-zero (eigenvectors are non-zero by definition), so it is linearly independent. Assume v1,…,v(r-1) are linearly independent and suppose a1v1 + … + arvr = 0. Applying T gives a1k1v1 + … + arkrvr = 0; subtracting kr times the first equation gives a1(k1-kr)v1 + … + a(r-1)(k(r-1)-kr)v(r-1) = 0. By the inductive hypothesis each ai(ki-kr) = 0, and since the eigenvalues are distinct, a1 = … = a(r-1) = 0. Then arvr = 0 with vr non-zero forces ar = 0, so v1,…,vr are linearly independent.
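A numerical sketch of the theorem (numpy, not from the notes): a matrix with distinct eigenvalues 2 and 5, whose eigenvector matrix should have full rank, i.e. the eigenvectors are linearly independent.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 5.0]])          # upper triangular: eigenvalues 2 and 5
vals, vecs = np.linalg.eig(A)       # columns of vecs are eigenvectors
full_rank = np.linalg.matrix_rank(vecs) == vecs.shape[0]
```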