Sketch Proofs Flashcards
Matrix algebra proofs (Prop 1&2)
E.g. A + (B+C) = (A+B) + C, or A(BC) = (AB)C
Just use the (i,j) entry of the matrices
For the multiplication ones, it may be easier to work from both sides and meet in the middle
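For instance, the entry computation for associativity of multiplication (a minimal sketch; matrix sizes assumed compatible):
\[
(A(BC))_{ij} = \sum_k A_{ik}(BC)_{kj} = \sum_k \sum_l A_{ik} B_{kl} C_{lj} = \sum_l (AB)_{il} C_{lj} = ((AB)C)_{ij}.
\]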
A I_n = A = I_n A (Lemma 3)
Just use (i,j) entry
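One side of the computation, for example (using (I_n)_{kj} = \delta_{kj}):
\[
(A I_n)_{ij} = \sum_k A_{ik} (I_n)_{kj} = A_{ij},
\]
and similarly for I_n A.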
Show if A is invertible, A has a unique inverse (Lemma 4)
Assume B, C are both inverses of A. Consider BAC, bracketed both ways, to show B = C
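The bracketing trick, written out as a one-line sketch:
\[
B = B I_n = B(AC) = (BA)C = I_n C = C.
\]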
Show (AB)^{-1} = B^{-1} A^{-1} if A, B invertible (Prop 5)
Literally just show (AB)(B^{-1} A^{-1}) = I_n
And the other way round
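Written out, one direction is:
\[
(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = A I_n A^{-1} = I_n.
\]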
An invertible n×n matrix can be reduced to I_n by EROs (Thm 6)
Show that the only solution of Ax = 0 is x = 0 [x = I_n x = (A^{-1}A)x = A^{-1}(Ax) = A^{-1}·0 = 0]
Therefore x = 0 is the only solution of Ex = 0, where E is the RRE form of A
There are no free variables, since all x_i must equal 0
Since there are no free variables, each column must contain the leading entry of some row. Since each leading entry is to the right of the one above, E = I_n
Show that applying an ERO is the same as left-multiplying by the corresponding elementary matrix (Lemma 7)
Show manually for all 3 types
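For example, for the ERO r_1 -> r_1 + λr_2 on a matrix A with rows r_1, r_2 (a 2-row illustration, not in the original card):
\[
E = \begin{pmatrix} 1 & \lambda \\ 0 & 1 \end{pmatrix}, \qquad EA = \begin{pmatrix} r_1 + \lambda r_2 \\ r_2 \end{pmatrix}.
\]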
If X_1, X_2, …, X_k are EROs that take A to I_n, and B is the result of applying these EROs to I_n, then B is the inverse of A (Thm 8)
Consider the corresponding product of elementary matrices, then manipulate it
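Sketch of the manipulation, with E_1, …, E_k the elementary matrices for X_1, …, X_k:
\[
E_k \cdots E_1 A = I_n, \qquad B = E_k \cdots E_1 I_n = E_k \cdots E_1,
\]
so BA = I_n; since B is a product of invertible elementary matrices it is invertible, giving A = B^{-1} and hence AB = I_n.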
In a vector space there exists a unique additive identity 0_V (Lemma 9)
Assume 0 and 0' are both additive identities and show they are equal: 0 = 0 + 0' = 0'
In a vector space each vector has a unique additive inverse (Lemma 10)
Assume u and w are both inverses of v and show they are equal: consider w + v + u, since (w + v) + u = u while w + (v + u) = w
Properties of vector spaces (Prop 11)
λ·0_V = 0_V
0·v = 0_V
(-λ)v = λ(-v) = -(λv)
If λv = 0_V then λ = 0 or v = 0_V
Use additive inverses
Try to prove these, they’re harder than you think
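For example, for 0·v = 0_V (a sketch using additive inverses):
\[
0v = (0 + 0)v = 0v + 0v, \quad\text{so adding } -(0v) \text{ to both sides gives } 0_V = 0v.
\]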
Subspace test (Thm 12)
U is a subspace iff
0_V in U, and
λu_1 + u_2 in U for all u_1, u_2 in U and all scalars λ
Left -> Right
Use the axioms to show both conditions always hold for a subspace
Right -> Left
Choose λ, u_1, u_2 appropriately to show U is closed: λ = 1 gives closure under addition, u_2 = 0_V gives closure under scalar multiplication
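A worked example of the test (my example, not from the card): U = {(x, y) in R² : x + y = 0} is a subspace of R², since (0,0) is in U and
\[
\lambda(x_1, y_1) + (x_2, y_2) = (\lambda x_1 + x_2,\ \lambda y_1 + y_2), \quad (\lambda x_1 + x_2) + (\lambda y_1 + y_2) = \lambda(x_1 + y_1) + (x_2 + y_2) = 0.
\]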
If U is a subspace of V then U is a vector space (Prop 13)
State U has legitimate operations because it’s closed under addition and scalar multiplication
Then check the axioms: show U has an additive identity and additive inverses (e.g. -u = (-1)u by Prop 11), and state that all other axioms are inherited from V
Take U, W subspaces of V. Then U + W is a subspace of V, and U ∩ W is a subspace of V (Prop 14)
Subspace test
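For U + W, say: 0_V = 0_V + 0_V is in U + W, and
\[
\lambda(u_1 + w_1) + (u_2 + w_2) = (\lambda u_1 + u_2) + (\lambda w_1 + w_2) \in U + W.
\]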
The span of any number of elements in V is a subspace of V (Lemma 15)
Use the subspace test
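The two checks, written out for Span(v_1, …, v_m):
\[
0_V = 0v_1 + \cdots + 0v_m, \qquad \lambda \sum_i a_i v_i + \sum_i b_i v_i = \sum_i (\lambda a_i + b_i) v_i.
\]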
Let v_1, …, v_m be linearly independent. Let v_{m+1} not be in Span(v_1, …, v_m)
Then v_1, …, v_m, v_{m+1} are linearly independent (Lemma 16)
You can throw in something outside the span and the set will still be linearly independent
Take a linear combination of them all equal to 0_V. Assume a_{m+1} is not 0 and derive a contradiction (see below)
Then show all the a_i = 0, so the set is linearly independent
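The contradiction step: if a_1 v_1 + ⋯ + a_{m+1} v_{m+1} = 0_V with a_{m+1} ≠ 0, then
\[
v_{m+1} = -\tfrac{1}{a_{m+1}} (a_1 v_1 + \cdots + a_m v_m) \in \mathrm{Span}(v_1, \dots, v_m).
\]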
S (a subset of V) is a basis for V iff every vector in V has a unique expression as a linear combination of elements of S (Prop 17)
->: take two linear combinations giving the same vector and show they are the same. (<-: existence of the expressions gives spanning; uniqueness of the expression of 0_V gives linear independence)
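Written out for the -> direction:
\[
\sum_i a_i s_i = \sum_i b_i s_i \;\Rightarrow\; \sum_i (a_i - b_i) s_i = 0_V \;\Rightarrow\; a_i = b_i \text{ for all } i,
\]
the last step by linear independence of S.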
Suppose V has a finite spanning set S. Then S contains a linearly independent spanning set (basis) (Prop 18)
Take a largest linearly independent subset of S and show it is a spanning set: assume for contradiction it is not, and throw in an extra element (Lemma 16) to contradict maximality
Steinitz Exchange Lemma (Thm 19)
Take X a subset of V. Suppose that u is in Span(X) but u is not in Span(X\{v}) for some v in X. Then let Y = (X\{v}) ∪ {u}. Span(Y) = Span(X)
Write u as a linear combination of elements of X and rearrange to express v in terms of u and the rest (letting v = v_n), as in the sketch below
Then consider arbitrary element w in Span(Y). Write out linear combination and show w also in Span(X).
Same for w in Span(X) -> show in Span(Y)
Span(X)=Span(Y) by double inclusion
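The rearrangement step, writing u as a combination of finitely many elements v_1, …, v_n of X with v = v_n: note a_n ≠ 0 (otherwise u would be in Span(X\{v_n})), so
\[
u = \sum_{i=1}^n a_i v_i \;\Rightarrow\; v_n = \tfrac{1}{a_n} \Big( u - \sum_{i=1}^{n-1} a_i v_i \Big),
\]
which lets you translate combinations over X into combinations over Y and back.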
For S, T finite subsets of V: if S is linearly independent and T spans V, then the size of S is at most the size of T (Thm 20)
“Linearly independent sets are at most as big as spanning sets”
Use the Steinitz Exchange Lemma to swap out elements of T one at a time, replacing them with elements of S; the new list still spans V at each step. If you run out of elements of T before S, you have a list T_j composed entirely of elements of S that spans V. If there are elements of S remaining, say u_{j+1}, then u_{j+1} is a linear combination of elements of T_j, which is false since S is linearly independent
If V is finite-dimensional and S, T are bases of V, then S and T are finite and the same size (Cor 21)
Since V is finite-dimensional, there exists a basis B of some finite size n
Then use the fact that linearly independent sets are at most as big as spanning sets 2000 times.
Remember B, S, and T are all spanning and linearly independent. Just use the rule in every possible way
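The inequalities you end up with, for instance for S:
\[
|S| \le |B| = n \ (S \text{ lin. indep., } B \text{ spans}), \qquad n = |B| \le |S| \ (B \text{ lin. indep., } S \text{ spans}),
\]
so |S| = n, and likewise |T| = n.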
Let A be a matrix and B be a matrix obtained from A by a finite sequence of EROs. Then rowsp(A)=rowsp(B) and rowrank(A)=rowrank(B). (Lemma 22)
Check for each type of ERO that the spans are the same
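E.g. for r_i -> r_i + λr_j (i ≠ j): every row of B lies in rowsp(A), and conversely
\[
r_i = (r_i + \lambda r_j) - \lambda r_j \in \mathrm{rowsp}(B),
\]
so the two row spaces contain each other. Equal row spaces then give equal row ranks.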
Let U be a subspace of finite-dimensional V. Then U is finite-dimensional and dim(U)<=dim(V). If dim(U)=dim(V), then U=V. (Prop 23)
Choose S to be a largest linearly independent set contained in U. Since it is linearly independent, the size of S is <= n (Thm 20). Show by contradiction that S spans U, so S is a basis for U
Then show that if dim(U) = dim(V), adding any vector of V to S makes it linearly dependent (otherwise Lemma 16 would give dim(V) + 1 linearly independent vectors, contradicting Thm 20), so S spans V and U = V