4) Linear Maps And Vector Subspaces Flashcards

1
Q

Matrix multiplication

A

For an m × n matrix A = (a_ij), the map L_A : R^n -> R^m is given by
L_A(r_1,…,r_n) = ( Σ_{j=1}^{n} a_{1j} r_j , … , Σ_{j=1}^{n} a_{mj} r_j )

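A minimal numerical sketch (assuming numpy, with an arbitrary example matrix and vector): each component of L_A(r) is the sum Σ_j a_ij r_j, which is exactly what matrix multiplication computes.

```python
import numpy as np

# Example 2x3 matrix A = (a_ij) and a vector r in R^3 (values chosen arbitrarily).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
r = np.array([7.0, 8.0, 9.0])

# The i-th component of L_A(r) is the sum over j of a_ij * r_j.
componentwise = np.array([sum(A[i, j] * r[j] for j in range(A.shape[1]))
                          for i in range(A.shape[0])])

assert np.allclose(A @ r, componentwise)   # matrix multiplication gives the same result
```
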
2
Q

WHEN IS A MAP A LINEAR MAP?

A

A map L: R^p -> R^q is linear if
For all vectors r and r’ ∈ R^p and t∈R:

1) L(r + r’) = L(r) + L(r’)
2) L(tr) = tL(r)

Every linear map R^p -> R^q is given by a unique q × p matrix; the correspondence between linear maps and q × p matrices is bijective.

3
Q

Proposition: linear map and matrix

A

If L: R^p -> R^q is a LINEAR map,

THEN L = L_A for a unique q × p matrix A.

The columns of A are the images L(e_1),…,L(e_p) of the standard basis vectors.
Proof:
Express any vector of R^p in terms of the standard basis; by linearity L(x) = x_1 L(e_1) + … + x_p L(e_p). Each L(e_i) lies in R^q and can be written in the standard basis of R^q; regrouping the terms shows that this is exactly matrix multiplication by A.

4
Q

Proposition: composition of linear maps

A

If A and B are matrices and the product AB is defined, then L_{AB} = L_A ∘ L_B.

Matrix multiplication corresponds to composition of linear maps.

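A quick numerical check of the proposition (numpy, with arbitrary example matrices): multiplying by AB agrees with applying L_B first and then L_A.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # L_A : R^3 -> R^2
B = rng.standard_normal((3, 4))   # L_B : R^4 -> R^3
r = rng.standard_normal(4)

# L_{AB}(r) should agree with applying L_B first and then L_A.
assert np.allclose((A @ B) @ r, A @ (B @ r))
```
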
5
Q

Directional derivative and linear maps

And dot products?

A

Directional derivative of F at (u,v) in the direction (a,b):
for x = F(u,v) a smooth real-valued function, D(F)(u,v) = ( ∂F/∂u , ∂F/∂v ), a row matrix,

so it is a linear map represented by this matrix. Applied to (a,b) it gives the dot product with the gradient:
D(F)(u,v)(a,b) = (a,b) • ∇F = a ∂F/∂u + b ∂F/∂v

The directional derivative is the dot product of the direction vector with grad F.

The direction is generally taken to be a unit vector,
e.g. (cos θ, sin θ).

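A sympy sketch of this fact, assuming an example function F(u,v) = u^2·v chosen purely for illustration: the directional derivative in the unit direction (cos θ, sin θ) is the dot product of that direction with the gradient.

```python
import sympy as sp

u, v, theta = sp.symbols('u v theta')
F = u**2 * v                                           # example smooth function (an assumption)

grad_F = sp.Matrix([sp.diff(F, u), sp.diff(F, v)])     # gradient (dF/du, dF/dv)
direction = sp.Matrix([sp.cos(theta), sp.sin(theta)])  # unit direction vector

# Directional derivative = (a, b) . grad F = a*dF/du + b*dF/dv
directional = direction.dot(grad_F)
print(sp.expand(directional))    # 2*u*v*cos(theta) + u**2*sin(theta)
```
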
6
Q

Def of grad

A

The gradient of F,
∇F = grad F = ( ∂F/∂u_1 ,…, ∂F/∂u_p ) ∈ R^p,

where x = F(u_1,…,u_p) is a smooth function of p variables.

7
Q

Definition: vector subspace

A

Let V be a subset of R^p. Then V is a vector subspace of R^p if:
1) the zero vector 0 ∈ V (every subspace passes through the origin)

2) if r, r’ ∈ V then r + r’ ∈ V
3) if r ∈ V and t ∈ R then tr ∈ V

8
Q

How subspaces arise (examples)

A

The trivial subsets {0} and R^p itself are subspaces of R^p.

R^2: straight lines through the origin,
ax + by = 0 where a^2 + b^2 ≠ 0,
equivalently V = {tr : t ∈ R} for a non-zero vector r.

R^3: straight lines and planes through the origin.
Planes can be written ax + by + cz = 0
where a^2 + b^2 + c^2 ≠ 0,
or as the set of linear combinations of two linearly independent vectors r, r’:
V = {sr + tr’ : s, t ∈ R}
(independent means sr + tr’ = 0 only for (s,t) = (0,0)).

In R^3 lines through the origin can be written as {tr : t ∈ R} for a non-zero vector r, or described by two linear equations ax + by + cz = 0 and a’x + b’y + c’z = 0 whose planes don’t coincide; their intersection is the line.

9
Q

Given an equation for a plane: finding the subspace it represents

A

E.g. find two linearly independent vectors in the plane by inspection (their cross product is not the zero vector), then write V as the set of their linear combinations:
V = {sr + tr’ : s, t ∈ R}
For a line:
V = {tr : t ∈ R}

10
Q

Spanning vectors of a subspace and the zero vector

A

Let V be a subspace of R^p with r_1,…,r_n ∈ V such that every element of V is a linear combination of these n vectors.

If the zero vector is one of r_1,…,r_n then the set {r_1,…,r_n} is linearly dependent.

Two non-zero vectors r, r’ in R^p are linearly dependent if and only if each is a scalar multiple of the other.

11
Q

Unique way of writing a vector in a subspace with vectors r_1,…,r_n
When linearly independent

A

Proposition: let r_1,…,r_n ∈ R^p be linearly independent. If t_1,…,t_n and t’_1,…,t’_n are scalars such that
t_1 r_1 + … + t_n r_n = t’_1 r_1 + … + t’_n r_n
then t_1 = t’_1,…, t_n = t’_n.

Proof:
Subtracting gives (t_1 - t’_1) r_1 + … + (t_n - t’_n) r_n = 0; by linear independence each coefficient t_i - t’_i = 0, i.e. the scalars are equal.

12
Q

Definition: spanning set for V

A

Let V be a subspace of R^p. A subset {r_1,…,r_n} of elements of V spans V (is a spanning set) if every element v of V can be written as a linear combination
v = t_1 r_1 + … + t_n r_n for some t_1,…,t_n ∈ R.

r_1,…,r_n form a basis for V if they span V and are linearly independent.

13
Q

Theorem: for V a vector subspace
Spanning sets, bases, and dimension

A

Theorem: Let V be a vector subspace of R^p:
1) given linearly independent vectors r_1,…,r_k in V, there are finitely many further vectors r_(k+1),…,r_n such that r_1,…,r_n form a basis for V

2) given a finite spanning set r_1,…,r_m for V, some subset of it is a basis for V
3) every basis for V has the same number of elements; this number is the dimension dim V

dim R^p = p.
dim {0} = 0, and its basis is the empty set.

The standard basis spans R^p and is linearly independent.

14
Q

FINDING A BASIS

A

R^2: a non-zero vector (a,b) is on its own linearly independent, and (a,b), (-b,a) together form a basis.

R^3: given two vectors r, r’, check that they are linearly independent (cross product non-zero); then r, r’, r × r’ is a basis.

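A numerical sketch of the R^3 recipe (numpy; the two starting vectors are arbitrary examples): if r and r' are linearly independent, then r, r', r × r' form a basis, which a non-zero determinant confirms.

```python
import numpy as np

r = np.array([1.0, 2.0, 0.0])        # two example vectors (an assumption)
r_prime = np.array([0.0, 1.0, 1.0])

cross = np.cross(r, r_prime)
assert np.any(cross != 0)            # non-zero cross product: r, r' are linearly independent

basis = np.array([r, r_prime, cross])
assert not np.isclose(np.linalg.det(basis), 0)   # non-zero determinant: r, r', r x r' is a basis of R^3
```
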
15
Q

Finding a basis for a line

A

Let au + bv = 0, with a^2 + b^2 ≠ 0, be a line through the origin in R^2.

V = {(u,v) : au + bv = 0}
Clearly (-b,a) ∈ V.
For any (u,v) ∈ V,
(u,v) • (a,b) = 0,
so (u,v) is perpendicular to (a,b) and hence a scalar multiple of (-b,a), i.e. (u,v) = t(-b,a) for some t ∈ R.
Thus (-b,a) spans V and is a basis:
the line au + bv = 0 is the line through the origin in the direction (-b,a).

16
Q

Finding a basis for a plane

A

For a plane au + bv + cw = 0 where a^2 + b^2 + c^2 ≠ 0,
find a basis for V = {(u,v,w) : au + bv + cw = 0}, a subset of R^3.

Find two vectors in the plane (by inspection);
they are linearly independent if their cross product is not the zero vector.

(More generally, three vectors in R^3 are linearly independent if the determinant of the matrix they form is non-zero.)

Conversely, given a basis {r, r’} of the plane, the cross product r × r’ is a normal vector, which gives the plane’s equation.

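A sketch of this procedure for a concrete plane (numpy, assuming the example equation u + 2v + 3w = 0): pick two vectors in the plane by inspection, check their cross product is non-zero, and note that the cross product of the two basis vectors is parallel to the normal.

```python
import numpy as np

normal = np.array([1.0, 2.0, 3.0])      # plane u + 2v + 3w = 0 (example coefficients)

# Two vectors found "by inspection": each satisfies a*u + b*v + c*w = 0.
r1 = np.array([2.0, -1.0, 0.0])
r2 = np.array([3.0, 0.0, -1.0])
assert np.isclose(normal @ r1, 0) and np.isclose(normal @ r2, 0)

cross = np.cross(r1, r2)
assert np.any(cross != 0)               # non-zero cross product: linearly independent, so a basis
assert np.allclose(np.cross(cross, normal), 0)   # the cross product is parallel to the normal
```
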
17
Q

Proposition: for V and V’ subspaces of R^p

A

Let V and V’ be subspaces of R^p:

If V is a subset of V’

Then dim V is less than or equal to dim V’

If dim V = dim V’ then V = V’

Proof: take a basis b_1,…,b_n for V. Since V is a subset of V’, these vectors lie in V’ and are linearly independent, so by the theorem above they can be extended to a basis b_1,…,b_n, b_(n+1),…,b_(n’) of V’. Hence dim V’ ≥ dim V. If the dimensions are equal, no extension is needed, so b_1,…,b_n is already a basis for V’; it spans V’, and therefore V = V’.

18
Q

Definition: linearly independent set of vectors

A

Let r_1,…,r_n be vectors in R^p. The set {r_1,…,r_n} is linearly independent if the only scalars t_1,…,t_n such that
t_1 r_1 + … + t_n r_n = 0
are t_1 = 0,…, t_n = 0.

If one of the vectors is the zero vector then the set is automatically linearly dependent.

19
Q

Definition of the span of vectors

A

Let r_1,..,r_k be vectors in R^p. Then the span of r_1,…,r_k is the set of all linear combinations of the r_1,..,r_k. That is span { r_1,..,r_k} = {t_1r_1 +…+ t_kr_k | t_1,…,t_k in R}

THE SPAN IS ALWAYS A SUBSPACE

It is a subspace of R^p with dimension less than or equal to k,
since if the vectors are not linearly independent, the same span can be obtained from fewer of them.

20
Q

Proposition: p-1 linearly independent vectors and the intrinsic equation of the subspace of R^p they span

A

Let r_1,…,r_(p-1) be linearly independent vectors in R^p. Write V for the subspace spanned by r_1, r_2,…, r_(p-1). Then V is defined by the equation

det of the p × p matrix
[ r_1 ]
[ r_2 ]
[ … ]
[ r_(p-1) ]
[ x_1 x_2 … x_p ]
= 0
VECTORS AS ROWS

I.e. the determinant is 0 iff the rows are linearly dependent; since the first p-1 rows are independent, this happens exactly when the final row (x_1,…,x_p)
lies in the span of the first p-1 rows.

This gives the intrinsic equation of a plane through 0 by computing the determinant.

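A sympy sketch of the determinant recipe in R^3 (the two spanning vectors are arbitrary examples): putting r_1, r_2 and a symbolic (x_1, x_2, x_3) as the rows and setting the determinant to zero gives the intrinsic equation of the plane they span.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
r1 = [1, 2, 0]      # example linearly independent vectors (an assumption)
r2 = [0, 1, 1]

M = sp.Matrix([r1, r2, [x1, x2, x3]])    # vectors as rows, unknown point as the final row
plane_equation = sp.expand(M.det())
print(sp.Eq(plane_equation, 0))          # Eq(2*x1 - x2 + x3, 0)
```
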
21
Q

Proposition: image of a linear map

A

Let L: R^p -> R^q be a linear map. Then the image of L, defined by im(L) = {L(r) | r ∈ R^p},
is a subspace of R^q.

Proof:
By the subspace criterion: 0_q = L(0_p) is in im(L). For r’_1, r’_2 ∈ im(L), say r’_1 = L(r_1) and r’_2 = L(r_2),
we have r’_1 + r’_2 = L(r_1 + r_2) ∈ im(L), and finally for a scalar t, tr’_1 = tL(r_1) = L(tr_1) ∈ im(L).

22
Q

Proposition: nullspace or kernel of a linear map

A

Let L: R^p -> R^q be a linear map. Then the nullspace or KERNEL of L defined by Ker(L)
= {vector(r) in R^p| L(vector(r)) = vector(0)}

is a SUBSPACE of R^p
(of R^p, not R^q).

Proof: subspace criterion. The zero vector 0_p is in the nullspace since L(0_p) = 0_q. For two vectors in Ker(L) their sum is also in Ker(L): L(r_1 + r_2) = L(r_1) + L(r_2) = 0_q + 0_q = 0_q, and similarly for a scalar multiple of a vector in the kernel.

23
Q

Definition: rank and nullity of a linear map

A

Let L:R^p -> R^q be a linear map

Rank(L) is the dimension of the image of L
Null(L) is the dimension of the nullspace or kernel

Null(L) is an integer and ker(L) the nullspace is a vector subspace

24
Q

The rank nullity theorem

A

Let L: R^p -> R^q be a linear map. Then rank(L) + null(L) =p

Proof:
A basis b_1,…,b_n for Ker(L), where n = null(L), can be extended to a basis b_1,…,b_p of R^p.
We then claim and show that L(b_(n+1)),…,L(b_p), the images of the added vectors, form a basis for im(L), by checking that they are linearly independent and that they span im(L). Hence rank(L) = p - n, and the two dimensions add up to p.

25
Q

Corollary of the rank nullity theorem

A

For a linear map L:R^p -> R^q

Rank(L) is LESS THAN OR EQUAL TO min(p,q)

E.g. if a linear map has rank 0 then its image is the zero subspace of R^q, whose basis is the empty set. So L(r) = 0 for all vectors r in R^p, and L is the zero linear map.

26
Q

Sets of solutions/ solution sets for NULLSPACE

A

The NULLSPACE is the set of solutions x = (x_1,…,x_p) to the homogeneous system of linear equations

a_11 x_1 + a_12 x_2 + … + a_1p x_p = 0
a_21 x_1 + a_22 x_2 + … + a_2p x_p = 0
…
a_q1 x_1 + a_q2 x_2 + … + a_qp x_p = 0

Perform row reduction on these q equations in p unknowns. If k is the number of pivots (k ≤ min(p,q)), there are p - k free variables and hence p - k basic solutions.

The nullity is the number of basic solutions; every solution can be expressed uniquely as a linear combination of them.

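A sympy sketch (the matrix is an arbitrary example) of reading off the basic solutions: rref gives the pivots, and nullspace() returns one basic solution per free variable, so nullity = p minus the number of pivots.

```python
import sympy as sp

# Example 3x4 homogeneous system A x = 0 (coefficients chosen for illustration).
A = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 2],
               [1, 2, 1, 3]])

rref, pivot_columns = A.rref()
basic_solutions = A.nullspace()           # one basic solution per free variable

p = A.cols
assert len(basic_solutions) == p - len(pivot_columns)   # nullity = p - number of pivots
```
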
27
Q

Sets of solutions/ solution sets for The IMAGE

A

The image of the map L: R^p -> R^q is the set of (y_1,…,y_q) such that

a_11 x_1 + a_12 x_2 + … + a_1p x_p = y_1
a_21 x_1 + a_22 x_2 + … + a_2p x_p = y_2
…
a_q1 x_1 + a_q2 x_2 + … + a_qp x_p = y_q

has a solution (x_1,…,x_p) in R^p.

The rank is the dimension of the SUBSPACE OF R^q SPANNED BY THE COLUMNS OF A.

No row reduction is needed here: just find the linearly independent columns.

The image is the set of vectors in R^q that can be written as a linear combination of the columns.

Note: row reduction preserves the nullspace but not the column span.

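Continuing as a sympy sketch (the same example matrix as above, still an assumption): the image is spanned by the linearly independent columns, and the rank is how many of them there are.

```python
import sympy as sp

# Same example matrix as in the nullspace sketch (an assumption, not from the notes).
A = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 2],
               [1, 2, 1, 3]])

image_basis = A.columnspace()    # linearly independent columns of A span the image
rank = A.rank()

assert len(image_basis) == rank
assert rank + len(A.nullspace()) == A.cols   # rank-nullity check: rank + nullity = p
```
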
28
Q

Writing a general vector in its basis form

A

A general vector x in R^p can be written as x_1 e_1 + … + x_p e_p, where the e_i are the standard basis vectors.

Useful for matrix multiplication

29
Q

Proposition: linear map with nullity zero with L(r_1) =L(r_2)

A

Proposition: let L: R^p -> R^q be a linear map with nullity zero, and suppose that r_1, r_2 ∈ R^p have L(r_1) = L(r_2). Then r_1 = r_2.

IF THE NULLITY OF L IS 0 THEN L IS INJECTIVE.
Proof: by linearity L(r_1 - r_2) = L(r_1) - L(r_2) = 0, so r_1 - r_2 ∈ ker(L); as the nullity is 0, r_1 - r_2 = 0 and thus r_1 = r_2.

E.g. if the matrix A has nullity zero, any solution of the corresponding system of equations is unique.

30
Q

Example given a map: find nullity and rank

A

Check that it’s a linear map.

Find the rank by linearly independent vectors and dimension of column space.

Find the nullity from reduction.

Check using the rank-nullity theorem: rank + nullity = p.

31
Q

Finding inverse matrices

A

•Row operations:
form the augmented matrix (A | I) and perform row operations until the left-hand block becomes the identity.
•Minors and cofactors:
Minor M_ij: the determinant of the matrix obtained by deleting the i-th row and j-th column.
Cofactor C_ij: the minor multiplied by (-1)^(i+j).
Then form the matrix of cofactors: TAKE ITS TRANSPOSE (the adjugate) AND DIVIDE BY THE DETERMINANT.

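A minimal numpy sketch of the cofactor method described above (the matrix is an arbitrary invertible example): build the matrix of cofactors, transpose it, and divide by the determinant.

```python
import numpy as np

def inverse_by_cofactors(A):
    """Invert a square matrix via minors, cofactors, the adjugate and the determinant."""
    n = A.shape[0]
    cofactors = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)   # delete row i, column j
            cofactors[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return cofactors.T / np.linalg.det(A)    # adjugate (transposed cofactors) over determinant

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])              # example invertible matrix (an assumption)
assert np.allclose(inverse_by_cofactors(A), np.linalg.inv(A))
```
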
32
Q

Example: the derivative matrix of the polar coordinate map, finding its rank and nullity

A

To find the nullity we can perform row reduction on the derivative matrix, treating the cases separately:
•If cos θ ≠ 0 and sin θ ≠ 0: multiply through the rows by them, row reduce, then divide; we obtain the identity matrix (for r ≠ 0), hence the nullity is 0.
•If one of them is 0, consider each case separately (cos θ ≠ 0 or sin θ ≠ 0); again we reach the identity matrix for r ≠ 0.
•If r = 0 the rank is 1 and the nullity is 1.

Hence, noting that F(0, θ) = (0,0) for all θ:
the nullity is zero and the rank is two at all points except the points (0, θ), at which the nullity is one and the rank is 1.

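A sympy sketch of this computation, assuming the map in question is the usual polar coordinate map F(r, θ) = (r cos θ, r sin θ): the Jacobian has determinant r, so the rank is 2 and the nullity 0 wherever r ≠ 0, while at r = 0 the rank drops to 1.

```python
import sympy as sp

r, theta = sp.symbols('r theta')
F = sp.Matrix([r * sp.cos(theta), r * sp.sin(theta)])   # assumed polar coordinate map

J = F.jacobian([r, theta])       # rows: (cos(theta), -r*sin(theta)), (sin(theta), r*cos(theta))
print(sp.simplify(J.det()))      # r: so rank 2 and nullity 0 wherever r != 0

# At a point with r = 0 (any theta, e.g. theta = pi/3) the rank drops to 1, nullity 1.
print(J.subs({r: 0, theta: sp.pi / 3}).rank())   # 1
```
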
33
Q

Example 4.37

A

Smooth map.

34
Q

Example 4.38

A

Look

35
Q

Example 4.4p

A

Look

36
Q

Lemma: for two linear maps such that their composition is the 0 map then we have..

A

Let L: R^p -> R^q and L’: R^q -> R^r be linear maps such that L’ ∘ L is the zero map R^p -> R^r.

Then im(L) is a subset of ker(L’).

Proof: any element of im(L) has the form L(r) for some r ∈ R^p; by assumption L’(L(r)) = 0, so L(r) ∈ ker(L’).

37
Q

Last example chapter 4

A

Look