4) Linear Maps And Vector Subspaces Flashcards
Matrix multiplication
For an m × n matrix A = (a_ij),
L_A(r_1,…,r_n) = ( Σ_{j=1}^{n} a_1j r_j, …, Σ_{j=1}^{n} a_mj r_j )
WHEN IS A MAP
A LINEAR MAP
A map L: R^p -> R^q is linear if
For all vectors r and r’ ∈ R^p and t∈R:
1) L(r + r’) = L(r) + L(r’)
2) L(tr) = tL(r)
Every linear map is given by a matrix; the correspondence is bijective
q × p
Proposition: linear maps and matrices
If L: R^p -> R^q is a LINEAR map
THEN L = L_A for a unique q × p matrix A
The images of the standard basis vectors -> the columns of the matrix
Proof:
Any vector in R^p can be expressed in terms of the standard basis, so by linearity L(r) = x_1 L(e_1) + … + x_p L(e_p). Each L(e_i) lies in R^q and can be written in the standard basis for R^q; regrouping the terms shows this is exactly matrix multiplication
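A quick numerical sketch of the proposition: build the matrix column by column from the images of the standard basis vectors and check it reproduces the map. The map L here is an arbitrary illustrative choice, not one from the notes.

```python
import numpy as np

# An illustrative linear map L: R^3 -> R^2 (hypothetical choice)
def L(r):
    x, y, z = r
    return np.array([2*x + y - z, x + 3*z], dtype=float)

# Build the 2x3 matrix column by column: column i is L(e_i)
A = np.column_stack([L(e) for e in np.eye(3)])

# Multiplying by A reproduces L on any vector
r = np.array([1.0, -2.0, 4.0])
assert np.allclose(A @ r, L(r))
```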
Proposition of composition of linear map
If A, B are matrices and the product AB is defined, then L_AB = L_A ∘ L_B
Matrix multiplication corresponds to composition
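The correspondence can be checked numerically: applying L_B then L_A agrees with multiplying by the product AB. The matrices below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # L_A : R^3 -> R^4
B = rng.standard_normal((3, 2))   # L_B : R^2 -> R^3

r = rng.standard_normal(2)
# (AB) r equals A (B r): composition is matrix multiplication
assert np.allclose((A @ B) @ r, A @ (B @ r))
```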
Directional derivative and linear maps
And dot products?
Directional derivative of F at (u,v) in the direction (a,b)
For x = F(u,v) a smooth real-valued function, D(F)(u,v) = [∂F/∂u  ∂F/∂v], a 1×2 row matrix
So the derivative is a linear map, represented by this matrix
The directional derivative = (a,b) • gradient =
a (∂F/∂u) + b (∂F/∂v)
Directional derivative: dot product of the direction vector with grad F
(a,b) is generally a unit vector
E.g. (cos θ, sin θ)
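A sketch of the dot-product formula, assuming the illustrative choice F(u,v) = u² + uv (not a function from the notes), checked against a difference quotient:

```python
import numpy as np

# Illustrative smooth function (hypothetical choice)
def F(u, v):
    return u**2 + u*v

def grad_F(u, v):
    return np.array([2*u + v, u])   # (dF/du, dF/dv)

theta = np.pi / 3
d = np.array([np.cos(theta), np.sin(theta)])   # unit direction (a, b)
p = np.array([1.0, 2.0])

# Directional derivative as the dot product (a, b) . grad F
dd = d @ grad_F(*p)

# Numerical check via a central difference quotient
h = 1e-6
dd_num = (F(*(p + h*d)) - F(*(p - h*d))) / (2*h)
assert abs(dd - dd_num) < 1e-6
```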
Def of grad
Gradient of F:
Nabla F = grad F
= ( ∂F/ ∂u_1,…, ∂F/∂u_p) ∈ R^p
Where x= F(u_1,…, u_p) is a smooth function of p variables
Definition: vector subspace
Let V be a subset of R^p. Then V is a vector subspace of R^p if:
1) (0 vector) ∈V (always through 0)
2) if vectors r,r’∈V then vector r+r’ ∈V
3) if vector r ∈V and t ∈R, vector tr∈V
Where subspaces arise
The trivial subsets {0 vector} and R^p itself are subspaces of R^p
R^2: straight lines through the origin
ax + by = 0 where a^2 + b^2 not equal to 0,
i.e. V = {t r : t∈R} for a nonzero vector r
R^3: straight lines and planes through the origin
Planes can be written ax + by + cz = 0
where a^2 + b^2 + c^2 not equal to 0
As linear combinations of two linearly independent vectors:
V = {sr + tr′ : s,t∈R}
Independent, i.e. sr + tr′ = vector(0) only for (s,t) = (0,0)
In R^3 lines through the origin can be written as {tr : t∈R} for a nonzero vector r, or described by two linear equations ax + by + cz = 0 and a′x + b′y + c′z = 0 whose planes don't coincide; their intersection is the line
Given an equation for a plane: finding the subspace it represents
E.g. find two linearly independent vectors in the plane by inspection (their cross product is not the zero vector)
Write V as their linear combinations
For a line:
V = {tr : t∈R}
Subspaces given by spanning vectors
For a subspace V of R^p with r_1,…,r_n ∈ V
known such that V = the set of linear combinations of those n vectors
If the 0 vector is in {r_1,…,r_n} then the set is linearly dependent!!!
Two non-zero vectors r, r′ in R^p are linearly dependent if and only if each is a scalar multiple of the other
Unique way of writing a vector in a subspace with vectors r_1,…,r_n
When linearly independent
Proposition: let r_1,…,r_n ∈ R^p be linearly independent. Suppose that t_1,…,t_n and t′_1,…,t′_n are scalars such that
t_1r_1 + … + t_nr_n = t′_1r_1 + … + t′_nr_n
Then t_1 = t′_1, …, t_n = t′_n
Proof:
Rearranging gives (t_1 − t′_1)r_1 + … + (t_n − t′_n)r_n = 0; by linear independence each t_i − t′_i = 0, i.e. the scalars are equal
Definition: spanning set for V
For V a subspace of R^p, a subset {r_1,…,r_n} of elements of V spans V (is a spanning set) if every element vector(v) in V can be written as a linear combination
vector(v) = t_1r_1 + … + t_nr_n for t_1,…,t_n in R
r_1,…,r_n form a basis for V if they span V and are linearly independent
Theorem: for V a vector subspace
Spanning set , basis
Theorem for dims
Theorem: Let V be a vector subspace of R^p:
1) given linearly independent vectors r_1,…,r_k in V there are FINITELY MANY further vectors r_(k+1),…,r_n such that r_1,…,r_n form a basis for V
2) given a spanning set r_1,…,r_m there is a finite subset that is a basis for V
3) every basis for V has the same number of elements; this number is the dimension dim V
dim R^p is p
dim {0 vector} is 0 and its basis is the empty set
Standard basis spans R^p and is linearly independent
FINDING A BASIS
R^2: for (a,b) not equal to (0,0), the vector (a,b) is linearly independent of (−b,a)
For example: (a,b), (−b,a) is a basis
R^3: given 2 vectors r, r′, check they are linearly independent via the cross product; then r, r′, r × r′ is a basis
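The R^3 recipe can be sketched numerically; the two starting vectors below are arbitrary illustrative choices.

```python
import numpy as np

r = np.array([1.0, 2.0, 0.0])    # illustrative vectors
r2 = np.array([0.0, 1.0, 1.0])

# Linearly independent iff the cross product is nonzero
c = np.cross(r, r2)
assert np.linalg.norm(c) > 0

# Then r, r2, r x r2 form a basis: the 3x3 determinant is nonzero
M = np.column_stack([r, r2, c])
assert abs(np.linalg.det(M)) > 1e-12
```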
Finding a basis for a line
Let au + bv = 0 with a^2 + b^2 not equal to 0 be a line through the origin in R^2
V ={(u,v) : au+bv=0}
Clearly (-b,a) in V
For any (u,v) in V
(u,v)• (a,b) =0
(u,v) is perpendicular to (a,b), so it is a scalar multiple of (−b,a), i.e. (u,v) = t(−b,a) for some t in R
Hence (−b,a) spans V and {(−b,a)} is a basis
So the line au + bv = 0 is the line in direction (−b,a)
Finding a basis for a plane
For a plane au+bv+cw=0 where a^2 + b^2+ c^2 not equal to 0.
Find a basis for V={(u,v,w) :au+bv+cw =0 } subset of R^3
Find two vectors in the plane by inspection
They are linearly independent iff their cross product is not the zero vector
(For three vectors: linearly independent iff the determinant of the 3 vectors is nonzero)
Given a basis we can recover the plane equation: the cross product of the two basis vectors is a normal
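A sketch of both directions, assuming the illustrative plane u + 2v − w = 0 (so (a,b,c) = (1,2,−1); the in-plane vectors were found by inspection):

```python
import numpy as np

# Plane u + 2v - w = 0: normal (a, b, c) = (1, 2, -1) -- illustrative choice
n = np.array([1.0, 2.0, -1.0])

# Two vectors found in the plane by inspection
r1 = np.array([2.0, -1.0, 0.0])
r2 = np.array([1.0, 0.0, 1.0])
assert n @ r1 == 0 and n @ r2 == 0

# Independent: cross product is nonzero ...
c = np.cross(r1, r2)
assert np.linalg.norm(c) > 0

# ... and it recovers a normal to the plane (parallel to (a, b, c))
assert np.allclose(np.cross(c, n), 0)
```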
Proposition: for V and V′ subspaces of R^p
Let V and V′ be subspaces of R^p:
If V is a subset of V′
then dim V is less than or equal to dim V′
If in addition dim V = dim V′ then V = V′
Proof: take a basis b_1,…,b_n for V; its vectors lie in V′ and are linearly independent, so by the theorem above they can be extended to a basis b_1,…,b_n,b_(n+1),…,b_(n′) of V′. Hence dim V′ ≥ dim V. If the dimensions are equal, b_1,…,b_n is already a basis for V′, so it spans V′ and V = V′
Definition: for set of linearly independent vectors
Let r_1,…,r_n be vectors in R^p. The set {r_1,…,r_n} is linearly independent if the only scalars t_1,…,t_n such that t_1r_1 + … + t_nr_n = vector(0)
Are t_1=0,…,t_n=0
If one of the vectors is vector(0) then we know it’s linearly dependent
Definition of the span of vectors
Let r_1,..,r_k be vectors in R^p. Then the span of r_1,…,r_k is the set of all linear combinations of the r_1,..,r_k. That is span { r_1,..,r_k} = {t_1r_1 +…+ t_kr_k | t_1,…,t_k in R}
THE SPAN IS ALWAYS A SUBSPACE
span{r_1,…,r_k} is a subspace of R^p of dimension less than or equal to k
since if the vectors are not linearly independent, the same set can be spanned by fewer of them
Proposition: linearly independent vectors and showing a way to obtain intrinsic equation for subspace of R^p
Let r_1,…,r_(p−1) be linearly independent vectors in R^p. Write V for the subspace spanned by r_1,r_2,…,r_(p−1). Then V is defined by the equation:
det [ row 1: r_1 | row 2: r_2 | … | row p−1: r_(p−1) | row p: (x_1, x_2, …, x_p) ] = 0   VECTORS AS ROWS
I.e. the determinant is 0 iff the rows are linearly dependent; since the first p−1 rows are independent, this happens only if the final row (x_1,…,x_p) is
in the span of the first p−1 rows.
This gives the intrinsic equation of a plane through 0 by expanding the determinant.
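A numerical sketch in R^3 (p = 3, so two spanning rows plus the variable row; the spanning vectors are illustrative):

```python
import numpy as np

# Two independent vectors spanning a plane V in R^3 (illustrative choice)
r1 = np.array([1.0, 0.0, 2.0])
r2 = np.array([0.0, 1.0, -1.0])

def eqn(x):
    # Determinant with r1, r2 as the first rows and x as the last row
    return np.linalg.det(np.array([r1, r2, x]))

# The determinant vanishes exactly when x lies in span{r1, r2}
assert abs(eqn(3.0*r1 - 2.0*r2)) < 1e-9      # in the plane
assert abs(eqn(np.array([0.0, 0.0, 1.0]))) > 1e-9   # not in the plane
```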
Proposition:
Image of a linear map
L:R^p -> R^q be a linear map. Then the image of L, defined by im(L) = { L(vector(r)) | vector r in R^p}
Is a subspace of R^q.
Proof:
We use the subspace criterion. 0_q = L(0_p) is in im(L). For r′_1 = L(r_1) and r′_2 = L(r_2) in im(L),
we have r′_1 + r′_2 = L(r_1 + r_2) in im(L), and finally for a scalar t, tr′_1 = tL(r_1) = L(tr_1) in im(L)
Proposition: nullspace or kernel of a linear map
Let L: R^p -> R^q be a linear map. Then the nullspace or KERNEL of L defined by Ker(L)
= {vector(r) in R^p| L(vector(r)) = vector(0)}
Is a SUBSPACE of R^p
A subspace of R^p, NOT R^q
Proof: subspace criterion: the vector 0_p is in the nullspace; for two vectors in Ker(L) their sum is also: L(r_1 + r_2) = L(r_1) + L(r_2) = 0_q + 0_q = 0_q, and similarly for a scalar multiple of a vector in the kernel.
Definition: of a linear map rank and null
Let L:R^p -> R^q be a linear map
Rank(L) is the dimension of the image of L
Null(L) is the dimension of the nullspace or kernel
Note: null(L) is an integer, while ker(L), the nullspace itself, is a vector subspace
The rank nullity theorem
Let L: R^p -> R^q be a linear map. Then rank(L) + null(L) =p
Proof:
A basis b_1,…,b_n for Ker(L) can be extended to a basis b_1,…,b_p of R^p…
We claim and show that L(b_(n+1)),…,L(b_p), the images of the added vectors, form a basis for im(L), by checking linear independence and spanning. Thus rank(L) + null(L) = (p − n) + n = p.
Corollary of the rank nullity theorem
For a linear map L:R^p -> R^q
Rank(L) is LESS THAN OR EQUAL TO min(p,q)
E.g. if a linear map has rank 0 then its image is the zero subspace of R^q and its basis is the empty set. So L(r) = 0 for all vectors r in R^p and L is the zero linear map.
Sets of solutions/ solution sets for NULLSPACE
NULLSPACE is the set of solutions to the system of linear equations
in the vector of unknowns
vector(x) = (x_1,…,x_p):
a_11 x_1 + a_12 x_2 + … + a_1p x_p = 0
a_21 x_1 + a_22 x_2 + … + a_2p x_p = 0
…
a_q1 x_1 + a_q2 x_2 + … + a_qp x_p = 0
Row reduction turns these q equations in p unknowns into k pivot equations, where k is less than or equal to min(p, q).
With k pivots there are p − k free variables and p − k basic solutions.
The nullity is the number of basic solutions; every solution can be expressed uniquely as a linear combination of them.
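One way to sketch this numerically is to extract a nullspace basis from the SVD (the right-singular vectors beyond the rank); the coefficient matrix below, with its second equation twice the first, is an illustrative choice.

```python
import numpy as np

# Homogeneous system: rows are equations, columns the unknowns (illustrative)
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second equation is twice the first

# Nullspace basis from the SVD: right-singular vectors past the rank
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]            # p - rank rows, one per basic solution

assert null_basis.shape[0] == A.shape[1] - rank   # nullity = p - rank
assert np.allclose(A @ null_basis.T, 0)           # each one solves Ax = 0
```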
Sets of solutions/ solution sets for The IMAGE
The image of map L:R^p -> R^q is the set of (y_1,…,y_q) such that
a_11 x_1 + a_12 x_2 + … + a_1p x_p = y_1
a_21 x_1 + a_22 x_2 + … + a_2p x_p = y_2
…
a_q1 x_1 + a_q2 x_2 + … + a_qp x_p = y_q
Has a solution (x_1,..,x_p) in R^p
The rank is the dimension of the SUBSPACE OF R^q SPANNED BY THE COLUMNS OF A (the column space)
No row reduction needed: just find linearly independent columns
The image is the set of all linear combinations of the columns of A
Caution: row reduction preserves the nullspace but NOT the column span
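A membership test sketch: y is in the image iff Ax = y has a solution, i.e. the least-squares residual is zero. The matrix, whose column span satisfies y_3 = y_1 + y_2, is an illustrative choice.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # L_A : R^2 -> R^3, illustrative choice

def in_image(y, tol=1e-9):
    # y is in im(L_A) iff A x = y is solvable: zero least-squares residual
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(A @ x - y) < tol

assert in_image(np.array([2.0, 3.0, 5.0]))       # satisfies y_3 = y_1 + y_2
assert not in_image(np.array([1.0, 1.0, 0.0]))   # violates it
```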
Writing a general vector in its basis form
A general vector x in R^p can be written as x_1 e_1 +…+ x_p e_p where e_i are standard basis
Useful for matrix multiplication
Proposition: a linear map with nullity zero and L(r_1) = L(r_2)
Proposition: let L: R^p -> R^q be a linear map with nullity zero and suppose that r_1, r_2 in R^p have L(r_1) = L(r_2). Then r_1 = r_2.
IF THE NULLITY OF L IS 0 THEN L IS INJECTIVE
Proof: by linearity L(r_1) − L(r_2) = L(r_1 − r_2) = 0, so r_1 − r_2 is in ker(L); as the nullity is 0 this forces r_1 − r_2 = 0, i.e. r_1 = r_2.
E.g. if A has nullity zero, a solution to the equations, when one exists, is unique
Example given a map: find nullity and rank
Check that it’s a linear map.
Find the rank by linearly independent vectors and dimension of column space.
Find the nullity from reduction.
Check by RNT.
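The steps above can be sketched for a concrete matrix map (matrix maps are automatically linear, covering the first step); the matrix, whose third row is the sum of the first two, is an illustrative choice.

```python
import numpy as np

# Illustrative map L(x) = A x
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 2.0]])   # row 3 = row 1 + row 2

p = A.shape[1]
rank = np.linalg.matrix_rank(A)   # dimension of the column space
nullity = p - rank                # number of free variables after reduction

assert rank == 2 and nullity == 1
assert rank + nullity == p        # rank-nullity check
```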
Finding inverse matrices
•row operations
Form the augmented matrix [A | I] and perform row operations until the left block is the identity
•minors and cofactors
Minor M_ij: the determinant of the matrix obtained by deleting row i and column j
Cofactor C_ij: the minor multiplied by (−1)^(i+j)
Then take the matrix of cofactors: TAKE ITS TRANSPOSE (the adjugate) AND DIVIDE BY THE DETERMINANT
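The cofactor method can be sketched directly and checked against the built-in inverse; the matrix is an illustrative invertible choice.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # illustrative invertible matrix

n = A.shape[0]
cof = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        # Minor M_ij: delete row i and column j, take the determinant
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

# Inverse = transpose of the cofactor matrix (adjugate) over det(A)
A_inv = cof.T / np.linalg.det(A)
assert np.allclose(A_inv, np.linalg.inv(A))
```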
Example: the derivative matrix of polar coordinates — finding the rank and nullity
For F(r, θ) = (r cos θ, r sin θ) the derivative matrix is
[ cos θ  −r sin θ ]
[ sin θ   r cos θ ]
with determinant r cos²θ + r sin²θ = r.
•If r ≠ 0: row reduce, treating the cases cos θ ≠ 0 and sin θ ≠ 0 separately (multiplying rows through by cos θ or sin θ and dividing afterwards); either way we obtain the identity matrix. Hence the nullity is 0 and the rank is 2.
•If r = 0: the second column is zero, so the rank is 1 and the nullity is 1.
Note F(0, θ) = (0, 0) for all θ.
The nullity is zero and the rank is two at all points except the points (0, θ), at which the nullity is one and the rank is one.
Example 4.37
Smooth map.
Example 4.38
Look
Example 4.4p
Look
Lemma: for two linear maps such that their composition is the 0 map then we have..
Let L: R^p -> R^q and L′: R^q -> R^r be linear maps such that L′ ∘ L is the zero map R^p -> R^r.
Then im(L) is a subset of ker(L′)
Proof: any element of im(L) has the form L(r); by assumption L′(L(r)) = 0, so L(r) is in ker(L′)
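A small numerical sketch with matrix maps, using illustrative matrices chosen so that the composition is zero:

```python
import numpy as np

# Illustrative matrices with A2 @ A1 = 0, so L' o L is the zero map
A1 = np.array([[1.0], [1.0]])    # L : R^1 -> R^2
A2 = np.array([[1.0, -1.0]])     # L': R^2 -> R^1
assert np.allclose(A2 @ A1, 0)

# Every image vector L(r) = A1 r then lies in ker(L'): A2 sends it to 0
for t in (-2.0, 0.5, 3.0):
    image_vec = A1 @ np.array([t])
    assert np.allclose(A2 @ image_vec, 0)
```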
Last example chapter 4
Look