Big Bad Proofs Flashcards
An invertible n x n matrix can be reduced to In by EROs (Thm 6)
Show that the only solution of Ax=0 is x=0 [x = Ix = (A^(-1)A)x = A^(-1)(Ax) = A^(-1)0 = 0]
Therefore x=0 is the only solution of Ex=0, where E is the RRE of A (EROs preserve the solution set)
There are no free variables, since every xi must equal 0
Since there are no free variables, each column must contain the leading entry of a row. Since each leading entry is to the right of the previous one, E=In
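A minimal worked version of the bracketed step, using only the notation above:

```latex
% Multiply Ax = 0 on the left by A^{-1} (A is invertible by hypothesis).
\[
  x = I_n x = (A^{-1}A)x = A^{-1}(Ax) = A^{-1}\mathbf{0} = \mathbf{0}.
\]
```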
Subspace test (Thm 12)
U is a subspace iff
0_V in U and
λu1 + u2 in U for all u1, u2 in U and all λ in F
Left -> Right
U is a subspace, so closure under addition and scalar multiplication gives both conditions directly
Right -> Left
Choose u1, u2 and λ appropriately to show U is closed under addition and scalar multiplication (sketch below)
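A sketch of the standard choices in the Right -> Left direction (the specific picks of λ and u2 are my reconstruction of the intended proof):

```latex
% Closure under addition: take lambda = 1 in the test condition.
\[
  u_1 + u_2 = 1\cdot u_1 + u_2 \in U.
\]
% Closure under scalar multiplication: take u_2 = 0_V (in U by the first condition).
\[
  \lambda u_1 = \lambda u_1 + 0_V \in U.
\]
```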
Suppose V has a finite spanning set S. Then S contains a linearly independent spanning set (basis) (Prop 18)
Take a maximal linearly independent subset of S and show it is a spanning set. Assume for contradiction it is not a spanning set and throw an extra element in to get a contradiction (sketched below)
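One way to write out the contradiction, where S' names the maximal linearly independent subset (notation introduced here):

```latex
% If Span(S') is not all of V = Span(S), some s in S lies outside Span(S').
% Appending it keeps linear independence:
\[
  s \notin \mathrm{Span}(S') \implies S' \cup \{s\} \text{ is linearly independent},
\]
% contradicting maximality of S'. Hence Span(S') = Span(S) = V.
```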
Steinitz Exchange Lemma (Thm 19)
Take X a subset of V. Suppose that u ∈ Span(X) but u ∉ Span(X\{v}) for some v ∈ X. Then let Y = (X\{v}) ∪ {u}. Span(Y)=Span(X)
Write u as a linear combination of elements of X; the coefficient of v must be nonzero (otherwise u ∈ Span(X\{v})), so rearrange to express v in terms of u and the remaining elements (letting v=vn).
Then consider an arbitrary element w ∈ Span(Y). Write out the linear combination, substitute the expression for u, and show w is also in Span(X).
Same for w ∈ Span(X) -> substitute the expression for v to show w ∈ Span(Y)
Span(X)=Span(Y) by double inclusion
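The rearrangement step in full (coefficients αi ∈ F are illustrative names; αn ≠ 0 precisely because u ∉ Span(X\{v})):

```latex
\[
  u = \sum_{i=1}^{n} \alpha_i v_i \ \text{ with } \alpha_n \neq 0
  \;\implies\;
  v_n = \frac{1}{\alpha_n}\Big( u - \sum_{i=1}^{n-1} \alpha_i v_i \Big) \in \mathrm{Span}(Y).
\]
```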
For S, T finite subsets of V: if S is linearly independent and T spans V, then |S| ≤ |T|. (Thm 20)
“Linearly independent sets are at most as big as spanning sets”
Use the Steinitz Exchange Lemma to swap out elements of T and replace them with elements of S; each new list still spans V. If you ran out of elements of T before S, you would have a list Tj composed entirely of elements of S that spans V. If there were an element of S remaining, say u(j+1), then u(j+1) would be a linear combination of Tj, which is false since S is linearly independent (one exchange step is sketched below)
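One exchange step, spelled out under the assumption (mine, not the source's) that S = {u1,…,uk} and Tj is the list after j swaps:

```latex
% u_{j+1} lies in V = Span(T_j); in any such expression some t in T_j \ S
% has a nonzero coefficient, else u_{j+1} would be a linear combination of
% other elements of S, contradicting linear independence of S. Rearranging
% for t shows the swap preserves the span:
\[
  T_{j+1} = (T_j \setminus \{t\}) \cup \{u_{j+1}\}, \qquad
  \mathrm{Span}(T_{j+1}) = \mathrm{Span}(T_j) = V.
\]
```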
Dimension Formula
Let U, W be subspaces of finite-dimensional vector space V over F. Then dim(U+W)+dim(U∩W)=dim(U)+dim(W) (Thm 25)
Consider a basis for U∩W.
Extend it to a basis for U and extend it to a basis for W.
Then show the combination of these bases is a basis for U+W: show it spans U+W and show it's linearly independent. The result then follows by counting (see below)
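The final count, with illustrative names: a basis e1,…,ek of U∩W extended by u1,…,up to a basis of U and by w1,…,wq to a basis of W:

```latex
% The combined list e_1,...,e_k, u_1,...,u_p, w_1,...,w_q is a basis of U+W, so
\[
  \dim(U+W) + \dim(U\cap W) = (k+p+q) + k = (k+p) + (k+q) = \dim U + \dim W.
\]
```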
Rank-Nullity Theorem
Let V, W be vector spaces with V finite-dimensional.
Let T:V→W be linear. Then dimV=rank(T)+null(T) (Thm 36)
Take a basis, v1,…,vn, for kerT.
Extend it to a basis v1,…,vn,v’1,…,v’r for V.
Let wi=T(v’i)
CLAIM: S={w1,…,wr} is a basis for ImT.
S∪{T(v1),…,T(vn)} spans ImT by Proposition 35, but v1,…,vn∈kerT, so T(v1),…,T(vn) are all 0_W and do not contribute; hence S spans ImT
Show S is linearly independent: take a linear combination of elements of S equal to 0_W. Rewrite each wi as T(v'i) and use linearity to get an element of kerT.
This can then be written as a linear combination of v1,…,vn. But v1,…,vn,v'1,…,v'r are linearly independent, so all scalars = 0, so w1,…,wr are linearly independent.
Result Follows
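The independence computation written out (scalars λi are illustrative names):

```latex
\[
  \sum_{i=1}^{r} \lambda_i w_i = 0_W
  \;\implies\;
  T\Big(\sum_{i=1}^{r} \lambda_i v'_i\Big) = 0_W
  \;\implies\;
  \sum_{i=1}^{r} \lambda_i v'_i \in \ker T = \mathrm{Span}(v_1,\dots,v_n),
\]
% and linear independence of v_1,...,v_n, v'_1,...,v'_r forces every
% scalar to be 0. Then dim V = n + r = null(T) + rank(T).
```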
Change of Basis Theorem
Let V, W be finite-dimensional vector spaces over F.
Let T:V→W be linear.
Let v1,…,vn and v’1,…,v’n be bases for V.
Let w1,…,wm and w’1,…,w’m be bases for W.
Let A=(aij)∈Mmxn(F) be the matrix for T wrt. v1,…,vn and w1,…,wm.
Let B=(bij)∈Mmxn(F) be the matrix for T wrt. v’1,…,v’n and w’1,…,w’m.
Take pij, qij∈F such that v'i = sum of (pji vj) from j=1 to n and w'i = sum of (qji wj) from j=1 to m.
Let P=(pij)∈Mnxn(F) and Q=(qij)∈Mmxm(F).
Then B=Q^(-1)AP.
(Thm 45)
First note that Q is invertible (it is a change of basis matrix) and write Q^(-1)=(rij),
so that wi = sum of (rji w'j) from j=1 to m
The rest follows quite nicely. You're trying to show what B equals, so start with T(v'i), as that's what B represents, and expand via P, A, and Q^(-1) (computation below).
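The computation itself, in the notation above (each equality is just a basis expansion):

```latex
\[
  T(v'_i) = T\Big(\sum_{j} p_{ji} v_j\Big)
          = \sum_{j} p_{ji} \sum_{k} a_{kj} w_k
          = \sum_{k} (AP)_{ki}\, w_k
          = \sum_{l} \Big(\sum_{k} r_{lk}(AP)_{ki}\Big) w'_l
          = \sum_{l} (Q^{-1}AP)_{li}\, w'_l,
\]
% and comparing with the definition T(v'_i) = sum_l b_{li} w'_l
% gives B = Q^{-1}AP.
```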
Take A∈Mmxn(F), let r=colrank(A).
Then there are invertible matrices P∈Mnxn(F) and Q∈Mmxm(F) such that Q^(-1)AP has the block form
( Ir 0rxs )
( 0txr 0txs )
where s=n-r and t=m-r.
(Prop 47)
Consider LA: Fcol^n→Fcol^m (which has matrix A wrt. the standard bases), and find suitable bases for the domain and codomain with respect to which the matrix for LA has the required form.
Find the rank and nullity of LA (considering rank=colrank(A)). Take a basis of kerLA and extend it to a basis of the domain.
You can then find a basis for the image and extend that to a basis for the codomain
Then consider each basis vector of the domain under the transformation and you get a matrix of the required form
The Change of Basis Theorem then shows that there are change of basis matrices P and Q that give the result (sketch below)
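A sketch of how the basis images line up, with illustrative names: kernel basis k1,…,ks extended by u1,…,ur to a basis of the domain, and yi = LA(ui) extended by z1,…,zt to a basis of the codomain:

```latex
% Order the domain basis as u_1,...,u_r, k_1,...,k_s (kernel vectors last):
\[
  L_A(u_i) = y_i \;(1 \le i \le r), \qquad L_A(k_j) = 0 \;(1 \le j \le s),
\]
% so with respect to these bases the matrix of L_A is
\[
  \begin{pmatrix} I_r & 0_{r\times s} \\ 0_{t\times r} & 0_{t\times s} \end{pmatrix}.
\]
```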
Let A be an m x n matrix. Then colrank(A)=rowrank(A) (Thm 49)
State that there exist Q, P such that B=Q^(-1)AP, where B is in the block form. rowrank(Q^(-1)AP)=rowrank(A) and colrank(Q^(-1)AP)=colrank(A), since multiplying by invertible matrices preserves row and column rank. But rowrank(B)=colrank(B)=r. So rowrank(A)=r=colrank(A)
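The whole argument as one chain of equalities:

```latex
\[
  \mathrm{rowrank}(A) = \mathrm{rowrank}(Q^{-1}AP) = \mathrm{rowrank}(B)
  = r = \mathrm{colrank}(B) = \mathrm{colrank}(Q^{-1}AP) = \mathrm{colrank}(A).
\]
```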
Cauchy-Schwarz Inequality
Let V be a real inner product space. Take v1,v2∈V. Then
|⟨v1,v2⟩| ≤ ‖v1‖‖v2‖, with equality iff v1,v2 are linearly dependent. (Thm 56)
If v1=0 or v2=0, the inequality is clear; so assume v1,v2≠0.
For t∈R, consider ‖tv1+v2‖^2 = ⟨tv1+v2, tv1+v2⟩ = t^2‖v1‖^2 + 2t⟨v1,v2⟩ + ‖v2‖^2. This gives a quadratic in t. Since it is always nonnegative (positive definiteness), the discriminant is always 0 or less. This gives the result (computation below).
If equality holds, the discriminant is 0, so there is a repeated root. So there is an α∈R so that ‖αv1+v2‖=0. By positive-definiteness, αv1+v2=0, so v1,v2 are linearly dependent
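The discriminant step in full:

```latex
\[
  0 \le \| t v_1 + v_2 \|^2
  = t^2\|v_1\|^2 + 2t\langle v_1, v_2\rangle + \|v_2\|^2
  \quad \text{for all } t \in \mathbb{R},
\]
% so the discriminant of this quadratic in t is at most 0:
\[
  4\langle v_1, v_2\rangle^2 - 4\|v_1\|^2\|v_2\|^2 \le 0
  \;\iff\;
  |\langle v_1, v_2\rangle| \le \|v_1\|\,\|v_2\|.
\]
```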