Sketch Proofs Flashcards

1
Q

Matrix algebra proofs (Prop 1&2)

E.g. A+(B+C)=(A+B)+C
or A(BC)=(AB)C
A

Just use the (i,j) entry of the matrices

For the multiplication ones it may be easier to work on both sides and meet in the middle
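For instance, associativity of multiplication at the (i,j) entry is a one-line summation swap (a sketch of the standard computation):

\[
\bigl(A(BC)\bigr)_{ij}=\sum_{k}a_{ik}(BC)_{kj}=\sum_{k}\sum_{l}a_{ik}b_{kl}c_{lj}=\sum_{l}(AB)_{il}c_{lj}=\bigl((AB)C\bigr)_{ij}.
\]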

2
Q

AIn=A=InA (Lemma 3)

A

Just use (i,j) entry

3
Q

Show if A is invertible, A has a unique inverse (Lemma 4)

A

Assume B, C are both inverses. Consider BAC bracketed both ways to show B=C
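Written out, the bracketing trick is one line (a sketch, using AB = BA = I and AC = CA = I):

\[
B = BI_n = B(AC) = (BA)C = I_nC = C.
\]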

4
Q

Show (AB)^-1=(B^-1)(A^-1) if A,B invertible (Prop 5)

A

Literally just show (AB)(B^-1)(A^-1)=I

And the other way round
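One direction written out (a sketch; the other bracketing is symmetric):

\[
(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AI_nA^{-1} = AA^{-1} = I_n.
\]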

5
Q

An invertible nxn matrix can be reduced to In by EROs (Thm 6)

A

Show that the only solution of Ax=0 is x=0 [x = Ix = (A^-1 A)x = A^-1 (Ax) = A^-1 0 = 0]
Therefore x=0 is the only solution of Ex=0, where E is the RRE form of A

There are no free variables, since all xi must equal 0
Since there are no free variables, each column must contain the leading entry of a row. Since each leading entry is to the right of the last one, E=In

6
Q

Show applying an ERO is the same as left multiplying by the elementary matrix (Lemma 7)

A

Show manually for all 3 types

7
Q

Let X1,X2,…,Xn be EROs that take A to In. If B is the result of applying these EROs to In, then B is the inverse of A (Thm 8)

A

Consider the sequence of multiplications by the corresponding elementary matrices, then manipulate it
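A sketch of that manipulation, writing Ei for the elementary matrix of the ERO Xi (Lemma 7 turns each ERO into a left multiplication):

\[
E_n\cdots E_1A = I_n, \qquad B = E_n\cdots E_1I_n = E_n\cdots E_1,
\]

so BA = In; and since each Ei is invertible, A = E1^(-1)⋯En^(-1), giving AB = In as well.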

8
Q

In a vector space there exists a unique additive identity 0v (Lemma 9)

A

Assume there are two identities and show they are equal: if 0v and 0v' are both additive identities, then 0v = 0v + 0v' = 0v'

9
Q

In a vector space there exists unique additive inverses (Lemma 10)

A

Assume v has two inverses u and w, and show they are equal: consider w+v+u bracketed both ways

10
Q

Properties of vector spaces (Prop 11)

λ·0v = 0v
0·v = 0v
(−λ)v = λ(−v)
If λv = 0v then λ = 0 or v = 0v

A

Use additive inverses

Try to prove these, they’re harder than you think
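For example, the first property falls out of additive inverses like this (a sketch; the others go similarly):

\[
\lambda 0_V = \lambda(0_V + 0_V) = \lambda 0_V + \lambda 0_V,
\]

and adding −(λ·0v) to both sides leaves λ·0v = 0v.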

11
Q

Subspace test (Thm 12)

U is a subspace iff
0v in U and
λu1+u2 in U for all u1, u2 in U and all λ in F

A

Left->Right
Use the axioms to show both conditions always hold

Right -> Left
Choose λ, u1, u2 suitably (λ=1 gives closure under addition; u2=0v gives closure under scalar multiplication)

12
Q

If U is a subspace of V then U is a vector space (Prop 13)

A

State that U has legitimate operations because it's closed under addition and scalar multiplication

Then check the axioms: show it has an additive identity and additive inverses, and state that all other axioms are inherited from V

13
Q

Take U,W subspaces of V

Then U+W is a subspace of V and
U intersect W is a subspace of V

(Prop 14)

A

Subspace test

14
Q

The span of any number of elements in V is a subspace of V (Lemma 15)

A

Use the subspace test

15
Q

Let v1,…,vm be linearly independent. Let vm+1 not be in span(v1,…,vm)
Then v1,…,vm,vm+1 are linearly independent (Lemma 16)

You can throw in something not in the span and the set will still be linearly independent

A

Take a linear combination of them all equal to 0v. Assume a(m+1) is not 0 and derive a contradiction
Then show all the a's=0, so the set is linearly independent
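The contradiction step, written out as a sketch: if

\[
a_1v_1+\cdots+a_mv_m+a_{m+1}v_{m+1}=0_V
\]

with a(m+1) ≠ 0, then

\[
v_{m+1} = -\frac{a_1}{a_{m+1}}v_1-\cdots-\frac{a_m}{a_{m+1}}v_m \in \mathrm{Span}(v_1,\dots,v_m),
\]

a contradiction. So a(m+1)=0, and linear independence of v1,…,vm forces the rest to be 0.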

16
Q

S (a subset of V) is a basis for V iff every vector in V has a unique expression as a linear combination of elements of S (Prop 17)

A

-> Take 2 linear combinations equal to the same vector and show they are the same (independence of S gives uniqueness)
<- Existence of the expressions gives spanning; uniqueness of the expression of 0v gives linear independence

17
Q

Suppose V has a finite spanning set S. Then S contains a linearly independent spanning set (basis) (Prop 18)

A

Take the largest subset of S that is linearly independent and show it is a spanning set. Assume for contradiction it is not a spanning set and throw an extra element in to get contradiction

18
Q

Steinitz Exchange Lemma (Thm 19)

Take X a subset of V. Suppose that u is in Span(X) but that u is not in Span(X\{v}) for some v in X.
Then let Y=(X\{v})U{u}. Then Span(Y)=Span(X)
A

Write u as a linear combination of elements of X and rearrange to express v in terms of u and the others (letting v=vn).
Then consider arbitrary element w in Span(Y). Write out linear combination and show w also in Span(X).
Same for w in Span(X) -> show in Span(Y)
Span(X)=Span(Y) by double inclusion
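The rearrangement step, as a sketch (labelling v = vn among the finitely many vectors involved): if u = α1v1+⋯+αnvn, then αn ≠ 0 (otherwise u would lie in Span(X\{v})), so

\[
v_n = \frac{1}{\alpha_n}\Bigl(u-\sum_{i<n}\alpha_i v_i\Bigr)\in\mathrm{Span}(Y),
\]

which is what makes both inclusions go through.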

19
Q

For S, T finite subsets of V: if S is linearly independent and T spans V, then the size of S is at most the size of T. (Thm 20)

“Linearly independent sets are at most as big as spanning sets”

A

Use the Steinitz Exchange Lemma to swap out elements of T and replace them with elements of S. This new list still spans V. If you run out of elements of T before S, you have a list Tj composed entirely of elements of S that spans V. If there are elements of S remaining, say u(j+1), then u(j+1) is a linear combination of Tj, which contradicts the linear independence of S

20
Q

If V is finite-dimensional and S, T are bases of V, then S and T are finite and the same size (Cor 21)

A

Since V is finite dimensional there exists a basis B of size n.

Then use the fact that linearly independent sets are at most as big as spanning sets 2000 times.
Remember B, S, and T are all spanning and linearly independent. Just use the rule in every possible way

21
Q

Let A be a matrix and B be a matrix obtained from A by a finite sequence of EROs. Then rowsp(A)=rowsp(B) and rowrank(A)=rowrank(B). (Lemma 22)

A

Check for each type of ERO that the spans are the same

22
Q

Let U be a subspace of finite-dimensional V. Then U is finite-dimensional and dim(U)<=dim(V). If dim(U)=dim(V), then U=V. (Prop 23)

A

Choose S to be a largest linearly independent set contained in U. Since it is linearly independent, the size of S is <=n. Show by contradiction that S spans U, and as such S is a basis for U

Then show that if dim(U)=dim(V), adding any vector of V to S makes S linearly dependent, so S spans V.

23
Q

Show that, in an n-dimensional vector space V, if U is a spanning set for V and U has size n, then U is a basis for V.

A

U is a spanning set for V and so U contains a basis for V. A basis for V must have size n so U is this basis.

24
Q

Show that, in an n-dimensional vector space V, if U is a linearly independent subset of V and U has size n, U is a basis for V

A

All linearly independent subsets have size at most n (as there is a spanning set of size n - the basis). So adding any vector v to U gives a linearly dependent set.

So v can be written as a linear combination of U. So U spans V, and hence U is a basis

25
Q

Let U be a subspace of a finite-dimensional vector space V.
Then every basis of U can be extended to a basis of V.

That is, if u1,…,um is a basis for U, then there is a basis for V of the form u1,…,um,u(m+1),…,un
for some u(m+1),…,un ∈V. (Thm 24)
A

Take bases for U and V respectively. Start with the list S0 consisting of the basis of U. For each element vi of the basis of V, add it to the list if vi∉Span(Si).
All Si are linearly independent, as adding something not in the span maintains linear independence (Lemma 16).
Proceed inductively.
vi ∈ Span(Si) for each i (either vi was added or it was already in the span), and hence vi ∈ Span(Sn) ∀i.
So Span(Sn)=V.
So we have constructed a basis for V out of a basis for U

26
Q

Dimension Formula

Let U, W be subspaces of a finite-dimensional vector space V over F. Then dim(U+W)+dim(U∩W)=dim(U)+dim(W) (Thm 25)

A

Consider a basis for U∩W.
Extend it to a basis for U and extend it to a basis for W.
Then show the combination of these bases is a basis for U+W: show it spans U+W and show it's linearly independent. Then the result follows by counting dimensions

27
Q

Let U, W be subspaces of V. Then the following are equivalent (alternate definitions of direct sum) (Prop 26)

i) V is the direct sum of U and W
ii) every v in V has a unique expression v=u+w with u in U, w in W
iii) dim(V)=dim(U)+dim(W) and V=U+W
iv) dim(V)=dim(U)+dim(W) and U∩W ={0v}
v) if u1,…,um is a basis for U and w1,…,wn is a basis for W, then u1,…,um,w1,…,wn is a basis for V

A

(i) <=> (ii) assume 2 expressions and show they are equal
(i) <=> (iii) and (i) <=> (iv) by dimension formula
(ii) <=> (v) quite simple via basis manipulation

28
Q

Let T:V->W be linear. Then T(0v)=0w (Prop 27)

A

Let z=T(0v)∈W. Consider z+z and show z=0w
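Written out (a sketch):

\[
z = T(0_V) = T(0_V+0_V) = T(0_V)+T(0_V) = z+z,
\]

so adding −z to both sides gives z = 0w.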

29
Q

Let T:V->W. Then the following are equivalent:
(i) T is linear
(ii) T(av1+bv2)=aT(v1)+bT(v2)
(iii) For any n>=1, T(a1v1+…+anvn)=a1T(v1)+…+anT(vn)
(Prop 28)

A

(i) =>(ii) directly from the definition
(ii) =>(i) show it satisfies the definition of linearity
(ii) =>(iii) by induction
(iii) =>(i) show it satisfies the definition of linearity

30
Q

Does the set of linear transformations form a vector space?
Let S, T be linear maps such that S,T:V→W.
Define S+T:V→W by (S+T)(v)=S(v)+T(v) and
Define λS:V→W by (λS)(v)=λS(v)
(Prop 29)

A

Show the vector space axioms hold

31
Q

Let S:U->V and T:V->W be linear.
Then T∘S:U->W is linear.
(Prop 30)

A

Just do the linear map test: Show (T∘S)(au+bv)=a(T∘S)(u)+b(T∘S)(v)

32
Q

Let T:V->W be linear. Then T is invertible iff T is bijective (Prop 31)

A

Assume T has an inverse S. ST=idv and TS=idw. Show T is 1-1 and onto. To show 1-1 assume T(v)=T(w) and show v=w. To show onto note that TS(w)=w for all w so w is in the image of T

Assume T is bijective. Then T has an inverse S:W->V. Show S is linear. Take w1,w2 such that v1=S(w1),v2=S(w2). This implies T(v1)=w1, T(v2)=w2. Then show S(aw1+bw2)=aS(w1)+bS(w2)

33
Q

Let S:U->V and T:V->W be invertible linear maps. Then TS:U->W is invertible and (TS)^-1=S^-1 T^-1
(Prop 32)

A

Just show (TS)(S^-1 T^-1)=idw and (S^-1 T^-1)(TS)=idu

34
Q

T(v1)=T(v2) iff v1-v2∈KerT

Lemma 33

A

Quite simple. Just rearrange: T(v1)=T(v2) iff T(v1-v2)=0w

35
Q

T:V->W. T is injective iff kerT={0} (Cor 34)

A

=> Assume kerT≠{0} for contradiction. Then there is a nonzero v such that T(v)=0w=T(0v), so T is not injective.

<= Same as all injective proofs. Assume T(v1)=T(v2) show v1=v2

36
Q

i) kerT is a subspace of V and ImT is a subspace of W
ii) If A is a spanning set for V, then T(A) is a spanning set for ImT
iii) If V is finite-dimensional, then kerT and ImT are finite-dimensional
(Prop 35)

A

i) Use the subspace test
ii) Take a spanning set A. For any w in ImT there is v with T(v)=w; write v in terms of A and expand using linearity
iii) kerT<=V so is finite-dimensional. ImT is finite-dimensional from ii)

37
Q

Rank-Nullity Theorem
Let V, W be vector spaces with V finite-dimensional.
Let T:V→W be linear. Then dimV=rank(T)+null(T) (Thm 36)

A

Take a basis, v1,…,vn, for kerT.
Extend it to a basis v1,…,vn,v’1,…,v’r for V.
Let wi=T(v’i)

CLAIM: S={w1,…,wr} is a basis for ImT.

S∪{T(v1),…,T(vn)} spans ImT from Proposition 35, but v1,…,vn∈kerT so T(v1),…,T(vn) do not contribute, and so S spans ImT

Show S is linearly independent. Take a linear combination of S equal to 0w. Rewrite the wi as T(v’i) and use linearity to get an element of kerT.
This can then be written as a linear combination of v1,…,vn. But v1,…,vn,v’1,…,v’r are linearly independent so all scalars=0 so w1,…,wr are linearly independent.

The result follows: dimV = n + r = null(T) + rank(T)
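The independence step, written out as a sketch: if β1w1+⋯+βrwr = 0w, then

\[
T\Bigl(\sum_i\beta_i v_i'\Bigr)=\sum_i\beta_i T(v_i')=0_W,
\]

so Σβiv’i ∈ kerT = Span(v1,…,vn); writing it as Σγjvj and rearranging gives a linear relation on the basis v1,…,vn,v’1,…,v’r, forcing every βi (and γj) to be 0.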

38
Q

Let T:V→V be linear. The following are equivalent: (Cor 37)

(i) T is invertible
(ii) rankT=dimV
(iii) nullT=0

A

(i) =>(ii). T invertible means T bijective, hence surjective. So ImT=V, giving (ii)
(ii) =>(iii) rank-nullity
(iii) =>(i). Show T injective (kerT={0v}) and T surjective (rank-nullity). T bijective => T invertible

39
Q

Let T:V→V be linear.

Then any one-sided inverse of T is a two-sided inverse, and so is unique. (Cor 38)

A

Say T has a right inverse S, so TS=idv. Since idv is surjective, T is surjective. So rankT=dimV. So T has an inverse, say S’. Show S=S’

Say T has a left inverse S, so ST=idv. Since idv is injective, T is injective. So nullT=0. So T has an inverse, say S’. Show S=S’

40
Q

Let V and W be vector spaces, with V finite-dimensional.
Let T:V→W be linear. Let U≤V. Then
dimU-nullT≤dimT(U)≤dimU.

In particular, if T is injective then dimT(U)=dimU (Lemma 39)

A

Take S:U→W as the restriction of T to U. S is linear and kerS≤kerT so nullS≤nullT.

By rank-nullity, dim(ImS)=rankS=dimU-nullS.

Combine these with the fact that T(U)=Im(S) to get the needed inequalities.

If T injective then nullT=0. So from proved result: dimU≤dimT(U)≤dimU

41
Q

Let V be an n-dimensional vector space over F, let Bv be an ordered basis for V.
Let W be an m-dimensional vector space over F, let Bw be an ordered basis for W.
Then
(i) The matrix of 0:V→W is 0mxn
(ii) The matrix of idv:V→V is In
(iii) If S:V→W, T:V→W are linear and α,β∈F, then M(αS+βT)=αM(S)+βM(T)

Moreover, let T:V→W be linear, with matrix A wrt.Bv and Bw
Take v∈V with co-ordinates x=(x1,…,xn ) with respect to Bv. Then Ax is the coordinate vector of T(v) wrt. Bw

(Prop 40)

A
Consider the given bases and the images T(vi), and construct the matrix like normal.
Then for (iii) simply deconstruct the matrix and show the equality.

Consider T(v)=T(sum of xjvj). Use linearity and write in terms of the entries of A. Then show that the coordinates of T(v) are the entries of Ax
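The last computation, written out as a sketch with A = (aij):

\[
T(v)=T\Bigl(\sum_j x_jv_j\Bigr)=\sum_j x_j\sum_i a_{ij}w_i=\sum_i\Bigl(\sum_j a_{ij}x_j\Bigr)w_i=\sum_i (Ax)_i\,w_i,
\]

so the coordinate vector of T(v) wrt. Bw is exactly Ax.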

42
Q

Let U, V, W be finite-dimensional vector spaces over F, with ordered bases Bu,Bv,Bw respectively.
Say |Bu|=m, |Bv|=n, |Bw|=p.
Let S:U→V,T:V→W be linear. Let A be the matrix of S wrt. Bu and Bv. Let B be the matrix of T wrt. Bv and Bw.
Then the matrix of T∘S wrt. Bu and Bw is BA. (Prop 41)

A

Quite intuitive.
Set up what S(ui ) and T(vi) are using summation notation.
Use the ideas of matrix multiplication and show that your final result is the (i,j) entry of BA
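The summation computation, as a sketch:

\[
(T\circ S)(u_j)=T\Bigl(\sum_i a_{ij}v_i\Bigr)=\sum_i a_{ij}\sum_k b_{ki}w_k=\sum_k\Bigl(\sum_i b_{ki}a_{ij}\Bigr)w_k=\sum_k (BA)_{kj}\,w_k,
\]

which says the matrix of T∘S is BA.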

43
Q

Take A∈Mmxn(F)
Take B∈Mnxp(F)
Take C∈Mpxq(F).
Then A(BC)=(AB)C (Using linear maps). (Cor 42)

A

Consider A, B and C as left multiplication maps (which we know are linear).
Simple but just a bit nasty with notation.
Make sure you define all the maps with correct domain and codomain.

44
Q

Let V be a finite-dimensional vector space.
Let T:V→V be an invertible linear transformation.
Let v1,…,vn be a basis of V. Let A be the matrix of T with respect to this basis. Then A is invertible and A^(-1) is the matrix of T^(-1) with respect to this basis.
(Cor 43)

A

Quite simple. T^(-1) is linear and so has a matrix B. T^(-1)∘T=idv=T∘T^(-1).
The matrices for these compositions must also be equal. This gives the definition of the inverse

45
Q

Let A be an n x n matrix. Any one-sided inverse of A is a two-sided inverse (Cor 44) (in terms of linear maps)

A

Let A represent the linear map T:V→V.
If A has a left inverse B such that BA=I, then T has a left inverse T’:V→V represented by the matrix B, with T’T=idv. But if T’ is a left inverse, T’ must be the unique two-sided inverse T^(-1) (proved earlier), so TT’=idv.
The matrix for TT’ is AB, so AB=I=BA and B is a two-sided inverse for A.
The same proof works for a right inverse

46
Q

Change of Basis Theorem
Let V, W be finite-dimensional vector spaces over F.
Let T:V→W be linear.
Let v1,…,vn and v’1,…,v’n be bases for V.
Let w1,…,wm and w’1,…,w’m be bases for W.
Let A=(aij)∈Mmxn(F) be the matrix for T wrt. v1,…,vn and w1,…,wm.
Let B=(bij)∈Mmxn(F) be the matrix for T wrt. v’1,…,v’n and w’1,…,w’m.
Take pij, qij∈F such that v’i=sum of (pjivj) from j=1 to n and w’i=sum of (qjiwj) from j=1 to m.
Let P=(pij)∈Mnxn(F) and Q=(qij)∈Mmxm(F).
Then B=Q^(-1)AP.
(Thm 45)

A

First show Q is invertible and set Q^(-1)=(rij),
so that wi=sum of (rjiw’j) from j=1 to m

The rest follows quite nicely. You’re trying to show what B equals so start with T(v’i) as that’s what B represents.
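The chain of substitutions, as a sketch:

\[
T(v_i')=T\Bigl(\sum_j p_{ji}v_j\Bigr)=\sum_j p_{ji}\sum_k a_{kj}w_k=\sum_k (AP)_{ki}\,w_k=\sum_k (AP)_{ki}\sum_l r_{lk}w_l'=\sum_l (Q^{-1}AP)_{li}\,w_l',
\]

and reading off coordinates wrt. w’1,…,w’m gives B = Q^(-1)AP.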

47
Q

Take A∈Mmxn(F), let r=colrank(A).
Then there are invertible matrices P∈Mnxn(F) and Q∈Mmxm(F) such that Q^(-1)AP has the block form
( Ir 0rxs )
( 0txr 0txs )

where s=n-r and t=m-r.
(Prop 47)

A

Consider LA: Fcol^n→Fcol^m (which has matrix A wrt. standard bases), and find suitable bases for the domain and codomain with respect to which the matrix for LA has the required form.

Find the rank and nullity of LA (considering rank=colrank(A)). Take a basis of kerLA and extend it to a basis of the domain.

You can then find a basis for the image and extend that to a basis for the codomain

Then consider each basis vector of the domain under the transformation and you get a matrix of the required form

The change of basis theorem then shows that there are change of basis matrices P and Q that give the result

48
Q
Let A be an m x n matrix.
Then colrank(A)=rowrank(A) (Thm 49)
A
State that there exists Q, P such that B=Q^(-1)AP where B is in the block form.
rowrank(Q^(-1)AP)=rowrank(A), 
colrank(Q^(-1)AP)=colrank(A).
But rowrank(B)=colrank(B)=r. 
So rowrank(A)=r=colrank(A)
49
Q

Let A be an m x n matrix
Let x be the n x 1 column vector of variables x1,…,xn.
Let S be the solution space of the system Ax=0 of m homogeneous linear equations in x1,…,xn, that is,
S={v∈ n-colvectors :Av=0}. Then dimS=n-colrankA
(Prop 50)

A

Note that S is the kernel of LA. So consider LA: n-colvectors → m-colvectors

Considering the standard basis e1,…,en of n-colvectors, Im(LA) is spanned by Ae1,…,Aen. Since Aei is the ith column of A,
Im(LA) = colspace(A) ⇒ rank(LA) = colrank(A)

Then by rank-nullity n = null(LA) + rank(LA) = dim(S) + colrank(A)

50
Q

Let V be a finite-dimensional vector space over F.
Let ⟨·,·⟩ be a bilinear form on V.
Let v1,…,vn be a basis for V. Let A∈Mnxn(F) be the associated Gram matrix.
For u,v∈V, let x=(x1,…,xn)∈F^n and y=(y1,…,yn)∈F^n be the unique coordinate vectors such that
u=x1v1+…+xnvn and
v=y1v1+…+ynvn.
Then ⟨u,v⟩ = xAy^T
(Prop 51)

A

Just write ⟨u,v⟩ in terms of the basis. Then use bilinearity and the entries of the Gram matrix and you get the answer eventually
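Written out using bilinearity (a sketch):

\[
\langle u,v\rangle=\Bigl\langle\sum_i x_iv_i,\ \sum_j y_jv_j\Bigr\rangle=\sum_i\sum_j x_iy_j\langle v_i,v_j\rangle=\sum_i\sum_j x_iA_{ij}y_j=xAy^T.
\]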

51
Q

Let V be a finite-dimensional real inner product space.
Take u∈V\{0}.
Define u^⊥ := {v∈V : ⟨u,v⟩=0}. Then u^⊥ is a subspace of V, dim(u^⊥)=dim(V)-1, and V=Span(u)⨁u^⊥. (Prop 52)

A

Consider f:V→R given by f(v)=⟨u,v⟩.
f is linear since ⟨·,·⟩ is bilinear.
Since f(u)=⟨u,u⟩>0, f is not the zero map, and Im(f)≤R, so rank(f)=1. Then rank-nullity gives dim(u^⊥)=dim(V)-1.

Then show span(u)∩u^⊥={0v}, and since dim(span(u))+dim(u^⊥) = dim(V),
V=span(u)⨁u^⊥.

52
Q

Let {v1,…,vn} be an orthonormal set in an inner product space V.
Then v1,…,vn are linearly independent. (Lemma 53)

So if V is n-dimensional v1,…,vn are a basis

A

As in all linear independence proofs, consider 0v=α1v1+⋯+αnvn and show αi=0 ∀i

Consider 0=⟨0v,vi⟩ ∀i. Plug in the above and get the result.
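The computation, as a sketch:

\[
0=\langle 0_V,v_i\rangle=\Bigl\langle\sum_j\alpha_jv_j,\ v_i\Bigr\rangle=\sum_j\alpha_j\langle v_j,v_i\rangle=\alpha_i,
\]

using orthonormality ⟨vj,vi⟩ = δji.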

53
Q

Let V be an n-dimensional real inner product space.

Then there is an orthonormal basis v1,…,vn of V. (Theorem 54)

A

Proof by induction on n.
n=0 trivial.
n=1: take v∈V\{0}. Let v1=v/‖v‖. Show {v1} is an orthonormal basis.

For the inductive step, take v∈V\{0} and define v1 in the same way. Consider U=v1^⊥, which is (n-1)-dimensional. U is an inner product space as you can restrict ⟨·,·⟩ to U×U

So by induction there is an orthonormal basis v2,…,vn for U. Show that this together with v1 is an orthonormal basis for V

54
Q

Take X∈Mnxn(R). Consider R^n equipped with the usual inner product ⟨x,y⟩=x⋅y.
The following are equivalent:
(i) XX^T=In
(ii) X^TX=In
(iii) The rows of X form an orthonormal basis of R^n
(iv) The columns of X form an orthonormal basis of Rcol^n
(v) For all x,y∈R^n,we have xX⋅yX=x⋅y

(Lemma 55)

Note (v) says that the right multiplication map of an orthogonal matrix preserves the dot product (and as such preserves lengths and angles). It is therefore an isometry, as we already know.

A

(i) ⇔(ii) shown earlier with linear maps
(i) ⇔(iii) Say the rows of X are x1,…,xn. The (i,j) entry of XX^T is xi⋅xj, so XX^T=In iff xi⋅xj=δij, i.e. iff the rows are orthonormal (and n orthonormal vectors form a basis of R^n)
(ii) ⇔(iv) Similar to above
(i) ⇒(v) Consider xX⋅yX=(xX)(yX)^T=xXX^Ty^T=xy^T=x⋅y
(v) ⇒(iii) Let e1,…,en be the standard basis of R^n. Then eiX is the ith row of X. We have eiX⋅ejX=ei⋅ej=δij. So e1X,…,enX is an orthonormal set and therefore an orthonormal basis.

55
Q

Cauchy-Schwarz Inequality –
Let V be a real inner product space. Take v1,v2∈V. Then
|⟨v1,v2⟩| ≤ ‖v1‖‖v2‖, with equality iff v1,v2 are linearly dependent. (Thm 56)

A

If v1 or v2=0, the inequality is clear, so assume v1,v2≠0.

For t∈R, consider ‖tv1+v2‖^2=⟨tv1+v2,tv1+v2⟩. This gives a quadratic in t. Since it is always nonnegative (positive definiteness), the discriminant is always 0 or less. This gives the result.

If equality holds, the discriminant is 0, so there is a repeated real root: there is an α∈R so that ‖αv1+v2‖=0. By positive-definiteness, αv1+v2=0, so v1,v2 are linearly dependent
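The quadratic written out (a sketch): for t∈R,

\[
0\le\|tv_1+v_2\|^2=t^2\|v_1\|^2+2t\langle v_1,v_2\rangle+\|v_2\|^2,
\]

so the discriminant gives 4⟨v1,v2⟩^2 − 4‖v1‖^2‖v2‖^2 ≤ 0, i.e. |⟨v1,v2⟩| ≤ ‖v1‖‖v2‖.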