Direct Sums and Inner Products Flashcards
Sums of Vector Spaces
Intersection
-suppose that U and W are subspaces of a vector space V; the intersection of U and W is:
U ∩ W = { v ∈ V | v∈U AND v∈W}
Sum of Vector Spaces
Union
-suppose that U and W are subspaces of a vector space V; the union of U and W is:
U ∪ W = { v ∈ V | v∈U OR v∈W (or both) }
Intersection, Union and Subspaces
- intersections of subspaces are always subspaces
- unions of subspaces are not necessarily subspaces
Sum
Definition
-let U and W be subspaces of a vector space V, the sum of U and W is the set:
U+W = { u+w | u∈U , w∈W }
Sum and Subspace
-suppose U and W are subspaces of the vector space V
-the sum of U and W is also a subspace of V
-to prove,
i) |0∈U , |0∈W -> |0 + |0 = |0 ∈ U+W
ii) check that for all x,y∈U+W , x+y∈U+W
x = u + w and y = u’ + w’ where u,u’∈U and w,w’∈W
-> x+y = u + w + u’ + w’ = (u+u’) + (w+w’)
since U is a subspace u+u’∈U and since W is a subspace w+w’∈W so x+y∈U+W
iii) check that for all a∈F and x∈U+W , ax∈U+W
x = u+w where u∈U and w∈W
ax = a(u+w) = au + aw
since U is a subspace au∈U and since W is a subspace aw∈W so ax∈U+W
Sum - Span Lemma
-let U, W be subspaces of V, suppose
U = span{u1, … ,un} and W = span{w1, … ,wm} , THEN:
U+W = span{u1, … , un} + span{w1, … , wm}
= span {u1, … , un ; w1, … , wm}
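-a minimal numpy sketch of this lemma (the spanning vectors below are assumptions chosen only for illustration): stacking all spanning vectors as rows of one matrix, the rank of that matrix is dim(U+W):
```python
import numpy as np

# The sum-span lemma: U + W is spanned by the combined list of spanning
# vectors. Example subspaces of R^3 (chosen only for illustration).
u1, u2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])  # span U
w1 = np.array([1.0, 1.0, 1.0])                                  # spans W

# Stack all spanning vectors as rows: the row space is U + W, so
# dim(U + W) is the rank of the stacked matrix.
combined = np.vstack([u1, u2, w1])
print(np.linalg.matrix_rank(combined))   # 3, so here U + W = R^3
```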
Sum - Linear (In)Dependence Lemma
Description
-suppose that {u1, … ,un} are linearly INdependent, THEN:
w ∈ span{u1, … ,un} <=> {w ; u1, … , un} is linearly DEpendent
Direct Sum
Definition
- suppose U & W are subspaces of a vector space V
- the sum U+W is called a direct sum (denoted U ⊕ W) if:
a) U + W = V
b) U ∩ W = { |0 }
Equivalent Characterisation of Direct Sums
-let U and W be subspaces of the vector space V,
a) V = direct sum of U and W
<=>
b) every |v ∈ V can be written in a unique way as |v = |u + |w with |u∈U and |w∈W
- (a) means that V=U+W and U ∩ W = { |0 }
- in (b), unique means that |v=|u+|w and |v=|u’ + |w’ (with |u,|u’∈U and |w,|w’∈W) implies |u=|u’ and |w=|w’
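-a small numerical illustration of (b) (the subspaces and vector below are assumptions): when V = U ⊕ W, the coefficients of |v in the combined basis come from one linear system, and invertibility of that system is exactly the uniqueness claim:
```python
import numpy as np

# Hypothetical direct sum in R^2: U = span{(1,0)}, W = span{(1,1)},
# so U ∩ W = {0} and every v splits uniquely as v = u + w.
u1 = np.array([1.0, 0.0])
w1 = np.array([1.0, 1.0])
v = np.array([3.0, 2.0])

# Solve v = a*u1 + b*w1; the column matrix is invertible iff the sum is direct.
a, b = np.linalg.solve(np.column_stack([u1, w1]), v)
u, w = a * u1, b * w1
print(u, w)      # [1. 0.] [2. 2.]
print(u + w)     # [3. 2.], recovering v uniquely
```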
Direct Sum - Basis Proposition
-let U and W be subspaces of V, suppose {u1, … ,un} is a basis of U and {w1, … ,wm} is a basis of W
(a) the sum of U and W is direct
<=>
(b) {u1, … , un ; w1, … , wm} is linearly independent
-in such a case, a basis of the direct sum of U and W is {u1, … , un ; w1, … , wm}
Direct Sum - Dimension Proposition
-let U and W be subspaces of V, then:
dim(U+W) = dim(U) + dim(W) - dim(U ∩ W)
-in particular:
--if V = direct sum of U and W, then dim V = dim U + dim W
--if dim U + dim W > dim V , then U ∩ W ≠ { |0 } , hence the sum U+W cannot be direct
How to Find a Basis for the Sum of Two Subspaces
1) you can find the dimension of the sum by using:
dimZ + dimW - dim(Z∩W) = dim(Z+W)
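-in coordinates this is a rank computation; a minimal sketch (the spanning sets below are assumptions):
```python
import numpy as np

# dim Z + dim W - dim(Z ∩ W) = dim(Z + W), with each dimension computed
# as a matrix rank. Example subspaces of R^3 (assumptions for illustration).
Z = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # rows span Z
W = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])   # rows span W

dim_Z = np.linalg.matrix_rank(Z)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(np.vstack([Z, W]))  # dim(Z + W)
print(dim_sum)                    # 3
print(dim_Z + dim_W - dim_sum)    # dim(Z ∩ W) = 1, here span{(0,1,0)}
```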
Standard Inner Product in ℝ^n
Definition
-let |v = (v1, v2, …, vn) and |w = (w1, w2, …, wn) be in ℝ^n, then:
⟨v,w⟩ = v1w1 + v2w2 + … + vnwn ∈ ℝ
Standard Inner Product in C^n
Definition
-let |v = (v1, v2, …, vn) and |w = (w1, w2, …, wn) be in C^n, then:
⟨v,w⟩ = v1w1* + v2w2* + … + vnwn* ∈ ℂ
-where w1* is the complex conjugate of w1
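-a short numpy check of this convention (the example vectors are assumptions); note that numpy's built-in np.vdot conjugates its FIRST argument, so the card's ⟨v,w⟩ corresponds to np.vdot(w, v):
```python
import numpy as np

v = np.array([1 + 2j, 3j])
w = np.array([2 - 1j, 1 + 1j])

# <v, w> = sum_i v_i * conj(w_i)  -- conjugate on the second argument
ip = np.sum(v * np.conj(w))
print(ip)                      # (3+8j): complex in general for v != w
print(np.vdot(w, v))           # same value; np.vdot conjugates its first argument
print(np.sum(v * np.conj(v)))  # <v, v> = (14+0j): real and non-negative
```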
Properties of the Standard Inner Product in C^n
i) conjugate symmetry: ∀ v,w ∈ C^n
⟨v,w⟩=⟨w,v⟩*
-in particular ⟨v,v⟩ = ⟨v,v⟩* , so ⟨v,v⟩ ∈ ℝ
ii) linearity in the first argument: ∀ v,v’,w∈C^n , ∀ a∈C
⟨v+v’ , w⟩= ⟨v,w⟩ + ⟨v’,w⟩
⟨av,w⟩ = a⟨v,w⟩
iii) positive definiteness:
⟨v,v⟩ ≥ 0 , ∀ v∈C^n
& ⟨v,v⟩ = 0 <=> |v = |0
Properties of an Abstract Inner Product
i) conjugate symmetry: ⟨v,w⟩=⟨w,v⟩* , ∀ v,w ∈ V
ii) linearity in the first argument: ∀ v,v’,w∈V , ∀ a∈F
⟨v+v’ , w⟩= ⟨v,w⟩ + ⟨v’,w⟩
⟨av,w⟩ = a⟨v,w⟩
iii) positive definiteness:
⟨v,v⟩ ≥ 0 , ∀ v∈V
⟨v,v⟩ = 0 <=> |v = |0
Standard Inner Product
Antilinearity in the Second Argument
-an inner product is antilinear in the second argument:
⟨v , w+w’⟩= ⟨v,w⟩ + ⟨v,w’⟩
⟨v,aw⟩ = a*⟨v,w⟩
-over R this just means linearity in the second argument since conjugation (*) in R does nothing
-over C this does have an effect
Standard Inner Product Over R
Linearity
⟨av , bw⟩= ab ⟨v,w⟩
Standard Inner Product Over C
Linearity
⟨av , bw⟩= a b* ⟨v,w⟩
b* = complex conjugate of b
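-a quick numerical confirmation of the combined rule (vectors and scalars below are arbitrary assumptions):
```python
import numpy as np

def ip(v, w):
    # conjugate-second convention, matching the cards above
    return np.sum(v * np.conj(w))

v = np.array([1 + 1j, 2 - 1j])
w = np.array([3j, 1 + 2j])
a, b = 2 + 1j, 1 - 3j

lhs = ip(a * v, b * w)
rhs = a * np.conj(b) * ip(v, w)
print(np.isclose(lhs, rhs))   # True: <av, bw> = a b* <v, w>
```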
Inner Product Space
Definition
a vector space (over R or C) equipped with an inner product ⟨_ , _⟩
Norm/Length
Definition
-in an inner product space V, the norm or length of |v is:
|| v || = √ ⟨v , v⟩
- || v || is real since by positive definiteness ⟨v , v⟩≥ 0
-the only vector of norm 0 is |0
Examples of Abstract Inner Products
a) usual dot products in R^n and C^n
b) for λ1,…,λn > 0 consider R^n with the following non-standard inner product: given |v=(v1,…,vn)∈R^n and |w=(w1,…,wn)∈R^n :
⟨v , w⟩ = Σ λi vi wi , summing from i=1 to n
c) an inner product in R^2 given by:
⟨(x,y) , (x’,y’)⟩ = 2xx’ + 2yy’ - xy’ - x’y
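-both non-standard examples can be implemented directly; in (c) the inner product can be written as ⟨p,q⟩ = pᵀAq for a symmetric positive definite matrix A (a sketch; the test vectors are assumptions):
```python
import numpy as np

# (b) weighted inner product on R^n with weights lambda_i > 0
def weighted_ip(v, w, lam):
    return np.sum(lam * v * w)

# (c) <(x,y),(x',y')> = 2xx' + 2yy' - xy' - x'y, i.e. p^T A q with:
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

def ip_c(p, q):
    return p @ A @ q

print(weighted_ip(np.array([1.0, 2.0]), np.array([3.0, 4.0]),
                  np.array([0.5, 2.0])))                  # 17.5
print(ip_c(np.array([1.0, 1.0]), np.array([1.0, 1.0])))  # 2.0 > 0
print(np.all(np.linalg.eigvalsh(A) > 0))  # True: A is positive definite
```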
Orthogonal Projection
-the orthogonal projection of |w on |v ≠ |0 is:
proj(|w) = ⟨w,v⟩/⟨v,v⟩ * |v
-the remainder |w’ = |w - proj(|w) is orthogonal to |v , as the lemma below shows
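-a minimal sketch in R^3 (the vectors are assumptions), verifying that the remainder is orthogonal to |v:
```python
import numpy as np

def proj(w, v):
    # orthogonal projection of w onto v (v must be non-zero)
    return (np.dot(w, v) / np.dot(v, v)) * v

v = np.array([1.0, 1.0, 0.0])
w = np.array([2.0, 0.0, 1.0])

w_par = proj(w, v)     # component of w along v
w_perp = w - w_par     # the remainder w', orthogonal to v by the lemma below
print(w_par, w_perp)                         # [1. 1. 0.] [ 1. -1.  1.]
print(np.isclose(np.dot(v, w_perp), 0.0))    # True
```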
Lemma - Orthogonal Projection
-if |v ≠ |0 and |w are vectors in an inner product space V, then:
|v ⟂ |w’ , where |w’ = |w - ⟨w,v⟩/⟨v,v⟩ * |v
Proof:
-to show that |v and |w’ are orthogonal, show that their inner product is zero
⟨v,w’⟩ = ⟨v, w - ⟨w,v⟩/⟨v,v⟩v⟩
-by additivity in the second argument:
⟨v,w’⟩ = ⟨v,w⟩ - ⟨v, ⟨w,v⟩/⟨v,v⟩v⟩
-by antilinearity in the second argument, the scalar comes out conjugated:
⟨v,w’⟩ = ⟨v,w⟩ - (⟨w,v⟩/⟨v,v⟩)* ⟨v,v⟩
-since ⟨v,v⟩ is a real number, ⟨v,v⟩* = ⟨v,v⟩, i.e. complex conjugation has no effect:
⟨v,w’⟩ = ⟨v,w⟩ - ⟨w,v⟩*
-and by conjugate symmetry, ⟨v,w⟩ = ⟨w,v⟩* so:
⟨v,w’⟩ = 0 therefore |v ⟂|w’
Theorem - Cauchy-Schwarz Inequality
-for |v, |w in an inner product space V:
| ⟨v,w⟩ | ≤ ||v|| ||w||
Proof:
-consider two cases:
i) |v=|0 which gives L.H.S.=R.H.S.=0
ii) |v ≠ |0 , consider the projection of |w onto |v
proj(|w) = ⟨w,v⟩/⟨v,v⟩ * v = ⟨w,v⟩/||v||² * v
|w’ = |w - proj(|w)
-also we know that 0 ≤ ⟨w’,w’⟩ by positive definiteness
-substitute |w’ = |w - proj(|w) and expand, using the complex number identity z z* = |z|² :
0 ≤ ⟨w’,w’⟩ = ||w||² - |⟨w,v⟩|² / ||v||²
-rearranging gives |⟨w,v⟩|² ≤ ||v||² ||w||² , which is the Cauchy-Schwarz inequality squared; take the square root of both sides
Properties of the Norm
Theorem
-for |v∈V and c∈F (where F can be real or complex):
|| cv || = |c| ||v||
Proof:
||cv|| = √⟨cv,cv⟩
-using linearity of the first argument:
= √( c⟨v,cv⟩ )
-using antilinearity of the second argument:
= √( c c* ⟨v,v⟩ )
-using the identity zz* = |z|² :
= √( |c|² ⟨v,v⟩ )
-take |c| out of the square root:
= |c| √⟨v,v⟩
-by the definition of the norm:
= |c| ||v||
Properties of the Norm
Theorem - Triangle Inequality
-for |v, |w in an inner product space:
|| v+w || ≤ ||v|| + ||w||
Proof:
-start with the definition of the norm:
|| v+w || = √⟨v+w , v+w⟩
-expand using linearity and antilinearity:
= √( ⟨v,v⟩ + ⟨v,w⟩ + ⟨w,v⟩ + ⟨w,w⟩ )
-reverse the definition of the norm and use ⟨v,w⟩ + ⟨w,v⟩ = ⟨w,v⟩* + ⟨w,v⟩ = 2Re⟨w,v⟩ :
= √( ||v||² + ||w||² + 2Re⟨w,v⟩ )
-using Re⟨w,v⟩ ≤ |⟨w,v⟩| and the Cauchy-Schwarz inequality:
≤ √( ||v||² + ||w||² + 2 ||w|| ||v|| )
-factorise:
= √( (||v|| + ||w||)² )
-simplify:
= ||v|| + ||w||
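-both inequalities are easy to sanity-check numerically; a sketch on random complex vectors (sizes and seed are arbitrary assumptions):
```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
w = rng.standard_normal(5) + 1j * rng.standard_normal(5)

ip = np.sum(v * np.conj(w))                       # <v, w>
norm = lambda x: np.sqrt(np.sum(x * np.conj(x)).real)

print(abs(ip) <= norm(v) * norm(w))               # Cauchy-Schwarz
print(norm(v + w) <= norm(v) + norm(w))           # triangle inequality
```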
Orthogonal Set
Definition
-let V be an inner product space (over R or C)
-the set of vectors {v1, … , vn} is said to be orthogonal if:
⟨vi , vj⟩ = 0 for i≠j
Orthonormal Set
Definition
-let V be an inner product space (over R or C)
-the set of vectors {v1, … , vn} is said to be orthonormal if:
⟨vi , vj⟩ = 0 for i≠j
AND
⟨vi , vj⟩ = 1 for i=j
Orthogonal / Orthonormal Basis
-a basis {v1, .. , vn} of V is orthogonal (or orthonormal) if the set {v1, … , vn} is orthogonal (or orthonormal)
Normalising
Definition
-given an orthogonal set {v1, … , vn} of non-zero vectors, we can obtain an orthonormal set by normalising:
{v1 / ||v1|| , … , vn / ||vn|| }
Orthogonal Vectors and Linear Independence Theorem
Statement
-let v1, v2, … , vn be non-zero orthogonal vectors, then the set:
{v1, v2, … , vn} is linearly independent
i.e. Σ λi vi = |0 forces λ1 = … = λn = 0
Orthogonal Vectors and Linear Independence Theorem
Proof
-suppose Σ λi vi = 0 , taking the sum between i=1 and i=n, and where 0 is the zero vector
-take inner product of both sides with vector vj:
⟨0 , vj⟩ = ⟨Σ λi vi , vj⟩
0 = ⟨Σ λi vi , vj⟩
-by linearity in the first argument
0 = Σ λi ⟨vi , vj⟩
-expanding out the sum, every term where i≠j, ⟨vi , vj⟩ = 0 since the vectors are orthogonal
-this leaves only one non-zero term i.e. when i=j :
0 = λj ⟨vj , vj⟩
-since |vj ≠ |0 , positive definiteness gives ⟨vj , vj⟩ > 0 , so divide through on both sides:
0 / ⟨vj , vj⟩ = λj
λj = 0
Gram-Schmidt Orthogonalisation Method
-starting with a vector space V
-find a set of vectors that form a basis, {w1, … , wn}
-we will transform this set into a set of orthogonal vectors {v1, … , vn}
-apply the following formula to each wk in turn, for k=1, … , n (so v1 = w1):
vk = wk - Σ ⟨wk , vi⟩/⟨vi , vi⟩ * vi
-where the sum is between i=1 and i=k-1
-the set of vectors {v1, … , vn} will be orthogonal
-to get to an orthonormal basis, divide each element in the orthogonal basis by its norm
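-a minimal implementation of the method over R^n (assuming the input rows are linearly independent):
```python
import numpy as np

def gram_schmidt(W):
    # Transform the rows of W into an orthogonal set, following the
    # formula above: v_k = w_k - sum_i <w_k, v_i>/<v_i, v_i> v_i.
    V = []
    for w in W:
        v = w.astype(float)
        for vi in V:
            v = v - (np.dot(w, vi) / np.dot(vi, vi)) * vi
        V.append(v)
    return np.array(V)

W = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
V = gram_schmidt(W)
print(np.round(V @ V.T, 10))   # diagonal matrix: the v_k are orthogonal
Q = V / np.linalg.norm(V, axis=1, keepdims=True)   # normalise -> orthonormal
print(np.round(Q @ Q.T, 10))   # identity matrix
```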
Orthogonal Complement / Orthogonal Subspace
Definition
-if U is a subspace of a finite dimensional inner product space V, the orthogonal subspace to U / the orthogonal complement of U is:
U⟂ = {v ∈ V | ⟨v , u⟩ = 0 for all u∈U}
Orthogonal Subspace Lemma
a) U⟂ is a subspace of V
b) if span{u1, … , un} = U, then:
U⟂ = {v ∈ V | ⟨v , ui⟩ = 0 for each i=1, … , n}
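-by part (b), U⟂ is exactly the null space of the matrix whose rows are the spanning vectors ui; a sketch via the SVD (the subspace below is an assumption chosen for illustration):
```python
import numpy as np

U = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])   # rows span U inside R^3

# Null space of U via SVD: the right-singular vectors beyond the rank.
_, s, Vt = np.linalg.svd(U)
rank = int(np.sum(s > 1e-12))
basis_U_perp = Vt[rank:]          # remaining rows of Vt span U⟂
print(basis_U_perp)                       # ~ span{(1, 0, -1)}, up to sign/scale
print(np.round(U @ basis_U_perp.T, 10))   # all zeros: orthogonal to every u_i
```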
Properties of the Orthogonalised Basis
-> all new vectors v1, v2, … , vn are different from |0
-> {v1, v2, … , vn} is an orthogonal set and therefore linearly independent
-> {v1, … , vn} has the same span as {w1, … , wn}
-> { v1/||v1|| , … , vn/||vn|| } is orthonormal and has the same span as {w1, … , wn}
Orthogonal and Orthonormal Bases Theorem
- any finite dimensional inner product space has
a) an orthogonal basis
b) an orthonormal basis
Orthogonal Decomposition Theorem
-if U is a subspace of a finite dimensional inner product space V, then V = direct sum of U and U⟂
-in particular:
dim(U⟂) + dim(U) = dim(V)
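-a closing sketch of the decomposition (the subspace U and the vector are assumptions): project |v onto an orthonormal basis of U, and the remainder lands in U⟂:
```python
import numpy as np

Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # orthonormal rows spanning U inside R^3
v = np.array([1.0, 2.0, 3.0])

u = Q.T @ (Q @ v)    # component of v in U (sum of projections onto the basis)
u_perp = v - u       # component in U⟂
print(u, u_perp)                            # [1. 2. 0.] [0. 0. 3.]
print(np.isclose(np.dot(u, u_perp), 0.0))   # True: V = U ⊕ U⟂
```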