MA106 - Linear Algebra Theorems Flashcards
Every matrix can be brought to row reduced form by elementary row transformations.
We need an algorithm where:
- The algorithm terminates after finitely many steps
- After termination, the resulting matrix is in Row Reduced Form
Call (i,j) the Pivot Position and aij the Pivot Entry. Begin with (i,j) = (1,1)
- Step 1: If akj = 0 for all k ≥ i, move the Pivot to (i, j+1) and repeat Step 1; terminate if j = n
- If aij = 0 but akj ≠ 0 for some k > i, apply (R2) to swap ri and rk
- Now aij ≠ 0. If aij ≠ 1, apply (R3) to multiply ri by 1/aij
- Now aij = 1. If akj ≠ 0 for any k ≠ i, use (R1) to subtract akj·ri from rk
- Now akj = 0 ∀ k ≠ i. Terminate if i = m or j = n; otherwise move the Pivot to (i+1, j+1) and go back to Step 1
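A minimal Python sketch of this algorithm (the function name, Fraction entries and list-of-lists representation are my own choices, not from the notes):

```python
from fractions import Fraction

def row_reduce(A):
    """Bring the m x n matrix A (a list of lists of Fractions)
    to row reduced form by elementary row operations."""
    m, n = len(A), len(A[0])
    i = j = 0  # Pivot Position (0-indexed here; the notes start at (1,1))
    while i < m and j < n:
        # Step 1: if column j is zero from row i down, move the Pivot right
        if all(A[k][j] == 0 for k in range(i, m)):
            j += 1
            continue
        # (R2): Pivot Entry is zero, so swap in a lower row with a nonzero entry
        if A[i][j] == 0:
            k = next(k for k in range(i + 1, m) if A[k][j] != 0)
            A[i], A[k] = A[k], A[i]
        # (R3): multiply row i by 1/aij so the Pivot Entry becomes 1
        p = A[i][j]
        A[i] = [x / p for x in A[i]]
        # (R1): subtract akj * ri from every other row rk to clear column j
        for k in range(m):
            if k != i and A[k][j] != 0:
                c = A[k][j]
                A[k] = [a - c * b for a, b in zip(A[k], A[i])]
        # move the Pivot to (i+1, j+1)
        i, j = i + 1, j + 1
    return A
```

For example, row_reduce([[Fraction(2), Fraction(4)], [Fraction(1), Fraction(3)]]) returns the 2×2 identity, as expected.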
By applying elementary row and column operations, a matrix can be brought into the block form with an identity matrix in the top left and zeros everywhere else.
Use elementary row operations to row reduce A; then ai,c(i) = 1, where c(i) denotes the column containing the pivot of row i.
For every aij ≠ 0 with j ≠ c(i), use (C1) to replace cj with cj − aij·cc(i).
∀ i starting from i = 1, exchange ci and cc(i), putting all the zero columns at the right-hand side.
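In symbols, the final form is the m×n block matrix
( Is 0 )
( 0  0 )
where Is is the s×s identity and s is the number of nonzero rows after row reduction (the label s is mine; the notes just describe the shape).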
The vectors v1, v2, …, vn ∈ V are linearly dependent iff either v1 = 0 or, for some r, the vector vr is a linear combination of v1, …, vr−1.
If v1 = 0, set α1 = 1 and αi = 0 ∀ i > 1. Then the linear combination α1v1 + ··· + αnvn equals 0 with not all coefficients zero.
If vr is a linear combination,
vr = α1v1 + ··· + αr−1vr−1, so
α1v1 + ··· + αr−1vr−1 − 1vr = 0,
a linear combination equal to 0 in which the coefficient of vr is −1 ≠ 0.
Conversely, suppose that v1, v2, …, vn ∈ V are linearly dependent, say
α1v1 + α2v2 + ··· + αnvn = 0
with not all αi zero. Let r be maximal with αr ≠ 0; then
α1v1 + α2v2 + ··· + αrvr = 0.
If r = 1, then α1v1 = 0, which is only possible if v1 = 0.
Otherwise
vr = −(α1/αr)v1 − ··· − (αr−1/αr)vr−1.
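A quick worked instance in R² (my own example): take v1 = (1, 2) and v2 = (2, 4). Then v2 = 2v1, so
2v1 − v2 = 0,
a dependence relation; conversely, from 2v1 − v2 = 0 with r = 2 we recover vr = v2 = −(2/(−1))v1 = 2v1.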
The vectors v1, …, vn form a basis of V if and only if every v ∈ V can be written uniquely as v = α1v1 + ··· + αnvn; that is, the coefficients α1, …, αn are uniquely determined by the vector v.
Suppose first that v1, …, vn form a basis and that v = α1v1 + ··· + αnvn = β1v1 + ··· + βnvn for scalars αi, βi ∈ K.
Then 0 = v − v = (α1 − β1)v1 + ··· + (αn − βn)vn, and linear independence of the vi gives αi = βi for all i.
Conversely, suppose every v ∈ V has a unique such expression; in particular the vi span V. If
α1v1 + α2v2 + ··· + αnvn = 0 = 0v1 + 0v2 + ··· + 0vn,
uniqueness implies α1 = ··· = αn = 0, hence the vi are linearly independent, hence form a basis.
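For example (mine): with the basis v1 = (1, 0), v2 = (1, 1) of R², the vector (3, 1) is written uniquely as
(3, 1) = 2v1 + 1v2,
and no other pair of coefficients works.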
Suppose that the vectors v1,v2,…,vn,w span V and that w is a linear combination of v1,…,vn. Then v1,…,vn span V.
Any v ∈ V is a linear combination of v1, …, vn, w, and w is itself a linear combination of the vi.
Substitute the expression for w into the first linear combination (see the computation below); this writes v in terms of v1, …, vn alone, so they span V.
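Concretely, writing it out: if v = β1v1 + ··· + βnvn + γw and w = α1v1 + ··· + αnvn, then
v = (β1 + γα1)v1 + ··· + (βn + γαn)vn.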
Suppose that the vectors v1, …, vr span the vector space V. Then there is a subsequence of v1, …, vr which forms a basis of V.
Sift v1, …, vr: working left to right, delete each vector that is 0 or a linear combination of the preceding ones. The remaining vectors still span V.
None of the remaining vectors is 0 or a linear combination of its predecessors,
hence, by the previous theorem, they are linearly independent.
Spanning and linearly independent, they form a basis.
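A small sifting example in R² (my own): sift the spanning sequence (0,0), (1,0), (2,0), (0,1). The vector (0,0) is deleted (it is zero), (1,0) is kept, (2,0) = 2·(1,0) is deleted, and (0,1) is kept; the survivors (1,0), (0,1) form a basis of R².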
Let V be a vector space over K which has a finite spanning set, and suppose that the vectors v1, …, vr are linearly independent in V. Then we can extend the sequence to a basis v1, …, vn of V, where n ≥ r.
Suppose w1, …, wq is a spanning set of V.
Sift the combined sequence v1, …, vr, w1, …, wq, which spans V. This results in a basis of V.
Since v1, …, vr are linearly independent, no vi is removed during sifting; hence the basis contains them.
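E.g. (my example): extend v1 = (1, 1) in R² using the spanning set e1, e2. Sifting (1,1), e1, e2 keeps (1,1) and e1, but deletes e2 = (1,1) − e1; the resulting basis (1,1), e1 contains v1.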
What is the Exchange Lemma?
Suppose that vectors v1, …, vn span V and that vectors w1, …, wm ∈ V are linearly independent. Then m ≤ n.
Insert the wi one at a time at the front of the spanning sequence v1, …, vn, sifting after each insertion.
Since the vj span V, the sequence is linearly dependent after each insertion, so sifting deletes at least one vector; and since the wi are linearly independent, the deleted vector is always some vj, never a wi.
In total m vectors wi are inserted, with at least one vj removed each time. Hence m ≤ n.
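Illustration in R² (my own example): e1, e2 span and w1 = (1, 2), w2 = (3, 4) are linearly independent. Sifting w1, e1, e2 deletes e2 = (w1 − e1)/2; sifting w2, w1, e1 then deletes e1 = w2 − 2w1. Two insertions, two deletions of ej's: m = 2 ≤ n = 2.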
Let V be a vector space of dimension n over K. Then any n vectors which span V form a basis of V, and no n−1 vectors can span V.
Sift the n spanning vectors: after sifting, the remaining vectors form a basis.
Every basis of V has exactly n = dim(V) vectors, so nothing is deleted and the original n vectors form a basis; and n−1 spanning vectors would sift to a basis of fewer than n vectors, a contradiction.
Let V be a vector space of dimension n over K. Then any n linearly independent vectors form a basis of V and no n+1 vectors can be linearly independent.
Any linearly independent set is contained in a basis, and the extended set must have exactly n = dim(V) vectors. So n linearly independent vectors extend to a basis of n vectors, i.e. they already form a basis; and n+1 linearly independent vectors would extend to a basis of more than n vectors, a contradiction.
If W1 and W2 are subspaces of V then so is W1 ∩ W2.
Show it is closed under addition and scalar multiplication. (Hint: start with two elements of W1 ∩ W2.)
If W1, W2 are subspaces of V then so is W1 + W2. In fact, it is the smallest subspace that contains both W1 and W2.
Show it is closed under addition and scalar multiplication. (Hint: start with two elements of W1 + W2.)
Any subspace of V containing W1 and W2 must contain W1 + W2; hence it is the smallest.
Let V be a finite-dimensional vector space, and let W1, W2 be subspaces of V. Then
dim(W1 + W2) = dim(W1) + dim(W2) − dim(W1 ∩ W2).
Note that W1 ∩ W2 is a subspace of both W1 and W2, which in turn are subspaces of W1 + W2. Choose a basis of W1 ∩ W2 and extend it separately to a basis of W1 and to a basis of W2. Show that the union of the two extended bases spans W1 + W2 and is linearly independent; counting its vectors gives the formula.
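A sanity check in R³ (my own example): let W1 = span{e1, e2} and W2 = span{e2, e3}. Then W1 ∩ W2 = span{e2} and W1 + W2 = R³, and indeed
dim(W1 + W2) = 2 + 2 − 1 = 3.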
Let T:U→V be a linear map. Then
(i) T(0U)=0V;
(ii) T(−u) = −T(u) for all u ∈ U.
Use the definition of a linear map.
(i) T(0U) = T(0U + 0U) = T(0U) + T(0U); subtracting T(0U) from both sides gives T(0U) = 0V.
(ii) Take α = −1 in T(αu) = αT(u): T(−u) = T((−1)u) = −T(u).
Let U,V be vector spaces over K, let S be a basis of U and let f:S→V be any function assigning to each vector in S an arbitrary element of V . Then there is a unique linear map T:U→V such that for every s∈S we have T(s)=f(s).
Let u ∈ U and write u = α1s1 + ··· + αnsn (uniquely, since S is a basis).
If T exists, then by linearity
T(u) = T(α1s1 + ··· + αnsn) = α1f(s1) + ··· + αnf(sn),
so T is uniquely determined. Conversely, defining T by this formula gives a linear map with T(s) = f(s) for all s ∈ S, so T exists.
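For example (my own): with the standard basis e1, e2 of R² and f(e1) = (1, 3), f(e2) = (2, 4), the unique linear map is
T(x, y) = x·(1, 3) + y·(2, 4) = (x + 2y, 3x + 4y).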
Let U,V be vector spaces over K of dimensions n,m, respectively. Then, for a given choice of bases of U and V , there is a one-one correspondence between the set HomK(U,V) of linear maps U → V and the set Km,n of m×n matrices over K.
Any linear map T : U → V determines an m × n matrix A over K, whose jth column holds the coordinates of T(ej) with respect to the chosen basis f1, …, fm of V.
Conversely, let A = (aij) be an m × n matrix over K. Then, by the previous theorem, there is exactly one linear map T : U → V with T(ej) = a1j f1 + ··· + amj fm. Hence it is a one-to-one correspondence.
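For instance (continuing my example above), for T : R² → R², T(x, y) = (x + 2y, 3x + 4y) with standard bases, T(e1) = (1, 3) and T(e2) = (2, 4), so A has columns (1, 3) and (2, 4).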
Let T : U → V be a linear map. Let the matrix A = (aij ) represent T with respect to chosen bases of U and V , and let u and v be the column vectors of coordinates of two vectors u ∈ U and v ∈ V , again with respect to the same bases. Then T(u)=v if and only if Au=v.
Write u = u1e1 + ··· + unen. Then
T(u) = Σj uj T(ej) = Σj uj (Σi aij fi) = Σi (Σj aij uj) fi,
and Σj aij uj is exactly the ith entry of the column vector Au; comparing coefficients of fi shows T(u) = v if and only if Au = v.
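A quick numerical check of the correspondence, using my example map above (all names here are mine):

```python
# T(x, y) = (x + 2y, 3x + 4y), whose matrix w.r.t. the standard bases is A
A = [[1, 2],
     [3, 4]]

def T(x, y):
    return (x + 2 * y, 3 * x + 4 * y)

u = (5, 7)
# the ith entry of Au is sum_j aij * uj
Au = tuple(sum(A[i][j] * u[j] for j in range(2)) for i in range(2))
assert Au == T(*u) == (19, 43)   # T(u) = v holds exactly when Au = v
```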
- Let T1, T2 : U → V be linear maps with m × n matrices A, B respectively. Then the matrix of T1 + T2 is A + B.
- Let T : U → V be a linear map with m × n matrix A and let λ ∈ K be a scalar. Then the matrix of λT is λA.
Use the definitions: (T1 + T2)(ej) = T1(ej) + T2(ej), whose ith coordinate is aij + bij; and (λT)(ej) = λT(ej), whose ith coordinate is λaij.
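For instance (mine): doubling the map T(x, y) = (x + 2y, 3x + 4y) gives (2T)(x, y) = (2x + 4y, 6x + 8y), whose matrix is 2A, each entry doubled.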