Direct Sums and Inner Products Flashcards

1
Q

Sums of Vector Spaces

Intersection

A

-suppose that U and W are subspaces of a vector space V, the intersection of U and W is:
U ∩ W = { v ∈ V | v∈U AND v∈W}

2
Q

Sum of Vector Spaces

Union

A

-suppose that U and W are subspaces of a vector space V, the union of U and W is:
U ∪ W = { v ∈ V | v∈U OR v∈W (or both) }

3
Q

Intersection, Union and Subspaces

A
  • intersections of subspaces are always subspaces
  • unions of subspaces are not necessarily subspaces (e.g. in R², the union of the x-axis and the y-axis is not closed under addition)

4
Q

Sum

Definition

A

-let U and W be subspaces of a vector space V, the sum of U and W is the set:
U+W = { u+w | u∈U , w∈W }

5
Q

Sum and Subspace

A

-suppose U and W are subspaces of the vector space V
-the sum of U and W is also a subspace of V
-to prove this, check the subspace conditions:
i) |0∈U , |0∈W -> |0 + |0 = |0 ∈ U+W
ii) check that for all x,y∈U+W , x+y∈U+W
x = u + w and y = u’ + w’ where u,u’∈U and w,w’∈W
-> x+y = u + w + u’ + w’ = (u+u’) + (w+w’)
since U is a subspace u+u’∈U and since W is a subspace w+w’∈W so x+y∈U+W
iii) check that for all a∈F and x∈U+W , ax∈U+W
x = u+w where u∈U and w∈W
ax = a(u+w) = au + aw
since U is a subspace au∈U and since W is a subspace aw∈W so ax∈U+W

6
Q

Sum - Span Lemma

A

-let U, W be subspaces of V, suppose
U = span{u1, … ,un} and W = span{w1, … ,wm} , THEN:
U+W = span{u1, … , un} + span{w1, … , wm}
= span {u1, … , un ; w1, … , wm}

7
Q

Sum - Linear (In)Dependence Lemma

Description

A

-suppose that {u1, … ,un} are linearly INdependent, THEN:

w ∈ span{u1, … ,un} <=> {w ; u1, … , un} is linearly DEpendent

8
Q

Direct Sum

Definition

A
  • suppose U & W are subspaces of a vector space V
  • the sum U+W is called a direct sum (denoted U ⊕ W) if:
    a) U + W = V
    b) U ∩ W = { |0 }
9
Q

Equivalent Characterisation of Direct Sums

A

-let U and W be subspaces of the vector space V,
a) V = direct sum of U and W
<=>
b) every |v ∈ V can be written in a unique way as |v = |u + |w with |u∈U and |w∈W
- (a) means that V=U+W and U ∩ W = { |0 }
- in (b), unique means that |v=|u+|w and |v=|u’ + |w’ (with u,u’∈U and w,w’∈W) implies |u=|u’ and |w=|w’

10
Q

Direct Sum - Basis Proposition

A

-let U and W be subspaces of V, suppose {u1, … ,un} is a basis of U and {w1, … ,wm} is a basis of W
(a) the sum of U and W is direct
<=>
(b) {u1, … , un ; w1, … , wm} is linearly independent

-in such a case, a basis of the direct sum of U and W is {u1, … , un ; w1, … , wm}

11
Q

Direct Sum - Dimension Proposition

A
-let U and W be subspaces of V, then:
dim (U+W) = dim(U) + dim(W) - dim(U ∩ W)
-in particular:
--if V = direct sum of U and W, then:
dim V = dim U + dim W
--if (dimU + dimW)  >  dimV , then U ∩ W ≠ { |0 } , hence the sum U+W cannot be direct
12
Q

How to Find a Basis for the Sum of Two Subspaces

A

1) you can find the dimension of the sum (i.e. the number of basis vectors needed) by using:
dimZ + dimW - dim(Z∩W) = dim(Z+W)
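
-as a quick numerical sketch of this dimension count (numpy assumed; the spanning vectors below are made up for illustration): the dimension of a span is the rank of the matrix whose rows are the spanning vectors, and Z+W is spanned by the rows of both matrices stacked together

import numpy as np

# hypothetical spanning sets for two subspaces Z, W of R^4 (illustration only)
Z = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])
W = np.array([[0., 1., 0., 0.],
              [0., 0., 1., 0.]])

dim_Z = np.linalg.matrix_rank(Z)                     # dim Z
dim_W = np.linalg.matrix_rank(W)                     # dim W
dim_sum = np.linalg.matrix_rank(np.vstack([Z, W]))   # dim(Z+W) = rank of the stacked rows

# dim(Z ∩ W) then follows from the formula above
dim_intersection = dim_Z + dim_W - dim_sum
print(dim_Z, dim_W, dim_sum, dim_intersection)       # 2 2 3 1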

13
Q

Standard Inner Product in ℝ^n

Definition

A

-let |v = (v1, v2, …, vn) and |w = (w1, w2, …, wn) be in ℝ^n, then:
⟨v,w⟩ = v1w1 + v2w2 + … + vnwn ∈ ℝ

14
Q

Standard Inner Product in C^n

Definition

A

-let |v = (v1, v2, …, vn) and |w = (w1, w2, …, wn) be in C^n, then:
⟨v,w⟩ = v1w1* + v2w2* + … + vnwn* ∈ C

-where w1* is the complex conjugate of w1
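
-a minimal numpy sketch of this definition (the example vectors are arbitrary); the conjugate sits on the second argument here, and numpy's vdot conjugates its first argument, so it is called with w first below

import numpy as np

v = np.array([1+2j, 3-1j])
w = np.array([2+0j, 1+1j])

# <v,w> = sum_i v_i * conj(w_i)   (conjugate on the SECOND argument, as defined above)
ip = np.sum(v * np.conj(w))

# np.vdot conjugates its FIRST argument, so np.vdot(w, v) matches this convention
assert np.isclose(ip, np.vdot(w, v))

# <v,v> is real and non-negative (positive definiteness)
print(ip, np.sum(v * np.conj(v)).real)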

15
Q

Properties of the Standard Inner Product in C^n

A

i) conjugate symmetry: ∀ v,w ∈ C^n
⟨v,w⟩=⟨w,v⟩*
in particular ⟨v,v⟩ = ⟨v,v⟩* , so ⟨v,v⟩ ∈ R even though v ∈ C^n

ii) linearity in the first argument: ∀ v,v’,w∈C^n , ∀ a∈C
⟨v+v’ , w⟩= ⟨v,w⟩ + ⟨v’,w⟩
⟨av,w⟩ = a⟨v,w⟩

iii) positive definiteness:
⟨v,v⟩ ≥ 0 , ∀ v∈C^n
& ⟨v,v⟩ = 0 <=> |v = |0

16
Q

Properties of the Standard Inner Product

A

i) conjugate symmetry: ⟨v,w⟩=⟨w,v⟩* , ∀ v,w ∈ V

ii) linearity in the first argument: ∀ v,v’,w∈V , ∀ a∈F (where F is R or C)
⟨v+v’ , w⟩= ⟨v,w⟩ + ⟨v’,w⟩
⟨av,w⟩ = a⟨v,w⟩

iii) positive definiteness:
⟨v,v⟩ ≥ 0 , ∀ v∈V
⟨v,v⟩ = 0 <=> |v = |0

17
Q

Standard Inner Product

Linearity in the Second Argument

A

-an inner product is antilinear in the second argument:
⟨v , w+w’⟩ = ⟨v,w⟩ + ⟨v,w’⟩
⟨v , aw⟩ = a* ⟨v,w⟩
-over R this is just linearity in the second argument, since conjugation in R does nothing
-over C the conjugation does have an effect

18
Q

Standard Inner Product Over R

Linearity

A

⟨av , bw⟩= ab ⟨v,w⟩

19
Q

Standard Inner Product Over C

Linearity

A

⟨av , bw⟩= a b* ⟨v,w⟩

b* = complex conjugate of b

20
Q

Inner Product Space

Definition

A

a vector space (over R or C) equipped with an inner product ⟨_ , _⟩

21
Q

Norm/Length

Definition

A

-in an inner product space V, the norm or length of a vector |v is:
|| v || = √ ⟨v , v⟩
- || v || is real since by positive definiteness ⟨v , v⟩≥ 0
-the only vector of norm 0 is |0

22
Q

Examples of Abstract Inner Products

A

a) usual dot products in R^n and C^n
b) for λ1,…,λn > 0 consider R^n with the following non-standard inner product: given |v=(v1,…,vn)∈R^n and |w=(w1,…,wn)∈R^n ,
⟨v , w⟩ = Σ λi vi wi (sum from i=1 to n)
c) an inner product in R^2 given by:
⟨(x,y) , (x’,y’)⟩ = 2xx’ + 2yy’ - xy’ - x’y

23
Q

Orthogonal Projection

A

-the orthogonal projection of |w on |v is |w’, where:
|w’ = proj(|w) = (⟨w,v⟩/⟨v,v⟩) |v

-also consider |w - |w’ , which is orthogonal to |v
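
-a minimal numpy sketch of this formula (real vectors for simplicity; the vectors are arbitrary): compute the projection and check that what is left over is orthogonal to |v

import numpy as np

v = np.array([1., 2., 2.])
w = np.array([3., 0., 1.])

# proj(w) = (<w,v> / <v,v>) v
proj_w = (np.dot(w, v) / np.dot(v, v)) * v

# the residual w - proj(w) is orthogonal to v
residual = w - proj_w
print(proj_w, np.dot(residual, v))   # second value is ~0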

24
Q

Lemma - Orthogonal Projection

A

-if |v ≠ |0 and |w are vectors in an inner product space V, then:
|v ⟂ |w’ , where |w’ = |w - (⟨w,v⟩/⟨v,v⟩) |v
Proof:
-to show that |v and |w’ are orthogonal, show that their inner product is zero
⟨v,w’⟩ = ⟨v, w - ⟨w,v⟩/⟨v,v⟩v⟩
-by additivity in the second argument:
⟨v,w’⟩ = ⟨v,w⟩ - ⟨v, (⟨w,v⟩/⟨v,v⟩) v⟩
-by antilinearity in the second argument (the scalar comes out conjugated):
⟨v,w’⟩ = ⟨v,w⟩ - (⟨w,v⟩/⟨v,v⟩)* ⟨v,v⟩
-since ⟨v,v⟩ is a real number, ⟨v,v⟩* = ⟨v,v⟩, i.e. complex conjugation has no effect:
⟨v,w’⟩ = ⟨v,w⟩ - ⟨w,v⟩*
-and by conjugate symmetry, ⟨v,w⟩ = ⟨w,v⟩* so:
⟨v,w’⟩ = 0 therefore |v ⟂|w’

25
Q

Theorem - Cauchy-Schwarz Inequality

A

-for |v, |w in an inner product space V:
| ⟨v,w⟩ | ≤ ||v|| ||w||
Proof:
-consider two cases:
i) |v = |0 , which gives L.H.S. = R.H.S. = 0
ii) |v ≠ |0 : consider the projection of |w onto |v
proj(|w) = (⟨w,v⟩/⟨v,v⟩) |v = (⟨w,v⟩/||v||²) |v
|w’ = |w - proj(|w)
-we also know that 0 ≤ ⟨w’,w’⟩ by positive definiteness
-substitute |w’ = |w - proj(|w) into ⟨w’,w’⟩ and expand using (anti)linearity
-work through, remembering the complex number identity z z* = |z|²
-you should end up with the Cauchy-Schwarz inequality squared; then take the square root
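
-as a numerical sanity check (a sketch with random complex vectors; numpy assumed), using the conjugate-second-argument inner product from the earlier cards

import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
w = rng.standard_normal(5) + 1j * rng.standard_normal(5)

inner = np.sum(v * np.conj(w))                  # <v,w> with the conjugate on the second slot
norm_v = np.sqrt(np.sum(v * np.conj(v)).real)
norm_w = np.sqrt(np.sum(w * np.conj(w)).real)

# |<v,w>| <= ||v|| ||w||
print(abs(inner) <= norm_v * norm_w)            # True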

26
Q

Properties of the Norm

Theorem

A
-for |v∈V and c∈F (where F can be real or complex) :
|| cv || = |c| ||v||
Proof:
||cv|| = √⟨cv,cv⟩
-using linearity of the first argument:
=√( c⟨v,cv⟩ )
-using antilinearity of the second argument:
=√( cc* ⟨v,v⟩)
-using the identity zz* = |z|² :
=√( |c|² ⟨v,v⟩)
-take |c|² out of the square root:
=|c| √⟨v,v⟩
-recognise the definition of the norm:
=|c| ||v||
27
Q

Properties of the Norm

Theorem - Triangle Inequality

A
-for |v, |w in an inner product space:
|| v+w || ≤ ||v|| + ||w||
Proof:
-start with the definition of the norm:
|| v+w || = √⟨v+w , v+w⟩
-expand using linearity and antilinearity:
=√( ⟨v,w⟩+⟨v,v⟩+⟨w,v⟩+⟨w,w⟩ )
-reverse definition of norm and use ⟨w,v⟩+⟨v,w⟩ = ⟨w,v⟩+⟨w,v⟩* = 2Re(⟨w,v⟩)
=√( ||v||² + ||w||² + 2Re⟨w,v⟩ )
-using the Cauchy-Schwarz inequality:
≤√( ||v||² + ||w||² + 2 ||w|| ||v|| )
-factorise:
=√ (||v|| + ||w||)²
-simplify:
= ||v|| + ||w||
28
Q

Orthogonal Set

Definition

A

-let V be an inner product space (over R or C)
-the set of vectors {v1, … , vn} is said to be orthogonal if:
⟨vi , vj⟩ = 0 for i≠j

29
Q

Orthonormal Set

Definition

A

-let V be an inner product space (over R or C)
-the set of vectors {v1, … , vn} is said to be orthonormal if:
⟨vi , vj⟩ = 0 for i≠j
AND
⟨vi , vj⟩ = 1 for i=j

30
Q

Orthogonal / Orthonormal Basis

A

-a basis {v1, .. , vn} of V is orthogonal (or orthonormal) if the set {v1, … , vn} is orthogonal (or orthonormal)

31
Q

Normalising

Definition

A

-given an orthogonal set {v1, … , vn} of non-zero vectors, we can obtain an orthonormal set by normalising:
{v1 / ||v1|| , … , vn / ||vn|| }

32
Q

Orthogonal Vectors and Linear Independence Theorem

Statement

A

-let v1, v2, … , vn be non-zero orthogonal vectors, then the set:
{v1, v2, … , vn} is linearly independent
i.e. Σ λi vi = |0 implies every λi = 0

33
Q

Orthogonal Vectors and Linear Independence Theorem

Proof

A

-suppose Σ λi vi = 0 , where the sum runs from i=1 to i=n and 0 is the zero vector
-take inner product of both sides with vector vj:
⟨0 , vj⟩ = ⟨Σ λi vi , vj⟩
0 = ⟨Σ λi vi , vj⟩
-by linearity in the first argument
0 = Σ λi ⟨vi , vj⟩
-expanding out the sum, every term where i≠j, ⟨vi , vj⟩ = 0 since the vectors are orthogonal
-this leaves only one non-zero term i.e. when i=j :
0 = λj ⟨vj , vj⟩
-we know that the inner product is non-zero so divide through on both sides:
0 / ⟨vj , vj⟩ = λj
λj = 0

34
Q

Gram-Schmidt Orthogonalisation Method

A

-starting with a vector space V
-find a set of vectors that form a basis, {w1, … , wn}
-we will transform this set into a set of orthogonal vectors {v1, … , vn}
-apply the following formula to each w vector in turn (starting with v1 = w1):
vk = wk - Σ (⟨wk , vi⟩/⟨vi , vi⟩) vi
-where the sum is between i=1 and i=k-1
-the set of vectors {v1, … , vn} will be orthogonal
-to get to an orthonormal basis, divide each element in the orthogonal basis by its norm
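
-a sketch of the procedure in numpy (real vectors assumed for simplicity; the starting basis is made up for illustration)

import numpy as np

def gram_schmidt(ws):
    """Orthogonalise a list of linearly independent vectors (classical Gram-Schmidt)."""
    vs = []
    for w in ws:
        # subtract the projection of w onto every previously built v_i
        v = w - sum((np.dot(w, vi) / np.dot(vi, vi)) * vi for vi in vs)
        vs.append(v)
    return vs

ws = [np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
vs = gram_schmidt(ws)

# normalise to get an orthonormal basis
es = [v / np.linalg.norm(v) for v in vs]
print(np.round([[np.dot(a, b) for b in es] for a in es], 6))   # ~identity matrix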

35
Q

Orthogonal Complement / Orthogonal Subspace

Definition

A

-if U is a subspace of finite dimensional inner product space V, the orthogonal subspace to U / the orthogonal complement of U is:
U⟂ = { v ∈ V | ⟨v , u⟩ = 0 for all u∈U }

36
Q

Orthogonal Subspace Lemma

A

a) U⟂ is a subspace of V
b) if span{u1, … , un} = U, then:
U⟂ = { v ∈ V | ⟨v , ui⟩ = 0 for each i=1, … , n }
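
-part (b) suggests a computational sketch (real vectors assumed, scipy's null_space used; the spanning vectors are arbitrary): U⟂ is the null space of the matrix whose rows are the spanning vectors of U

import numpy as np
from scipy.linalg import null_space

# U = span of the rows of A (illustrative subspace of R^4)
A = np.array([[1., 0., 1., 0.],
              [0., 1., 0., 1.]])

# v is in U-perp  <=>  <v, u_i> = 0 for every spanning vector u_i  <=>  A v = 0
U_perp = null_space(A)           # columns form an orthonormal basis of U-perp

print(U_perp.shape[1])           # dim(U-perp) = 4 - 2 = 2
print(np.round(A @ U_perp, 6))   # ~zero matrix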

37
Q

Properties of the Orthogonalised Basis

A
  • all new vectors v1, v2, … , vn are different from |0
  • {v1, v2, … , vn} is an orthogonal set and therefore linearly independent
  • {v1, … , vn} has the same span as {w1, … , wn}
  • { v1/||v1|| , … , vn/||vn|| } is orthonormal and has the same span as {w1, … , wn}
38
Q

Orthogonal and Orthonormal Bases Theorem

A
  • any finite dimensional inner product space has
    a) an orthogonal basis
    b) an orthonormal basis
39
Q

Orthogonal Decomposition Theorem

A

-if U is a subspace of a finite dimensional inner product space V, then V = direct sum of U and U⟂
-in particular:
dim(U⟂) + dim(U) = dim(V)
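
-a closing numpy sketch of the decomposition (real vectors; the subspace and vector are made up for illustration): project |v onto an orthonormal basis of U to get its U-component, the remainder lies in U⟂, and the dimensions add up

import numpy as np

# orthonormal basis of a subspace U of R^3 (illustration only)
e1 = np.array([1., 0., 0.])
e2 = np.array([0., 1., 0.])

v = np.array([2., -1., 3.])

# component of v in U: sum of the projections onto the orthonormal basis vectors
u = np.dot(v, e1) * e1 + np.dot(v, e2) * e2
u_perp = v - u                                   # lies in U-perp

print(u, u_perp)                                 # [ 2. -1.  0.] [0. 0. 3.]
print(np.dot(u_perp, e1), np.dot(u_perp, e2))    # both 0.0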