Chapter 3 Flashcards
Suppose U and V are subspaces of W. Then which of U⋃V and U∩V is a subspace?
U⋃V is not a subspace in general (only when one of U, V contains the other)
U∩V is always a subspace
Let U and V be subspaces of some vector space W. In terms of a set, what is U + V?
U + V = {𝐮 + 𝐯: 𝐮 ∈ U, 𝐯 ∈ V}
Is the sum of 2 subspaces a subspace?
Yes
What does the sum of 2 subspaces contain?
U⋃V
Suppose U and V are subspaces of some vector space. Then what is a direct sum?
U + V is called a direct sum, denoted U ⊕ V, if U ∩ V={𝟎}.
If U, V ⊆ W with U = span({𝐮₁, …, 𝐮n}) and V = span({𝐯₁, …, 𝐯m}), then what is a spanning set for U + V?
U + V = span({𝐮₁, …, 𝐮n, 𝐯₁, …, 𝐯m})
If V = U ⊕ W, what does this mean about how every 𝐯 ∈ V can be written?
Every 𝐯 ∈ V can be written in a unique way as 𝐯 = 𝐮 + 𝐰 where 𝐮 ∈ U and 𝐰 ∈ W
If U and W are subspaces of a finite-dimensional vector space, then what relation holds between dim(U ∩ W) and dim(U + W)?
dim(U + W) + dim(U ∩ W) = dim U + dim W
(In particular, if V = U ⊕ W, then dim V = dim U + dim W.)
Let V be a vector space, and F = ℝ or ℂ. What is an inner product?
A map V × V → F, (𝐮, 𝐯) ↦ 〈𝐮, 𝐯〉
What does the bracket satisfy in the definition of the inner product?
(𝐮, 𝐯) →〈𝐮, 𝐯〉
1) conjugate symmetry: 〈𝐮, 𝐯〉= 〈𝐯, 𝐮〉(with a conjugation line over the top of the right-hand side) for all 𝐮, 𝐯 ∈ V
2) linearity: 〈𝐮 + 𝐮’, 𝐯〉= 〈𝐮, 𝐯〉+ 〈𝐮’, 𝐯〉and 〈𝛂𝐮, 𝐯〉= 𝛂〈𝐮, 𝐯〉
3) positive definiteness: 〈𝐯, 𝐯〉≥ 0 and 〈𝐯, 𝐯〉= 0 ⇔ 𝐯 = 𝟎
What is the standard inner product on ℝ^n and ℂ^n?
〈𝐮, 𝐯〉= u₁v̅₁ + … + unv̅n (on ℝ^n the conjugates do nothing, giving the usual dot product)
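As a quick numerical sketch (NumPy assumed; the vectors are an arbitrary illustrative choice), the standard inner product 〈𝐮, 𝐯〉= Σ uᵢv̅ᵢ can be checked directly:

```python
import numpy as np

def std_inner(u, v):
    # <u, v> = u1*conj(v1) + ... + un*conj(vn); on R^n this is the dot product.
    return np.sum(np.asarray(u) * np.conj(np.asarray(v)))

real_case = std_inner([1.0, 2.0], [3.0, 4.0])   # 1*3 + 2*4 = 11
w = np.array([1 + 1j, 2 - 1j])
self_inner = std_inner(w, w)                    # |1+1j|^2 + |2-1j|^2 = 7, real
```

Note that NumPy's `np.vdot(a, b)` conjugates its *first* argument, so `std_inner(u, v)` agrees with `np.vdot(v, u)`.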
What is an inner product space?
A vector space over ℝ or ℂ with an inner product
In an inner product, what is the norm?
||𝐯|| = √〈𝐯, 𝐯〉
What is another word for the norm in an inner product?
The length
Why is the norm in an inner product space real?
Because 〈𝐯, 𝐯〉is real and ≥ 0 by positive definiteness, so its square root is real and ≥ 0
In terms of inner product, when are 𝐯 and 𝐰 orthogonal?
If 〈𝐯, 𝐰〉= 0
For the set of vectors {𝐯₁, …, 𝐯n}, when is it orthogonal?
This set is orthogonal if 〈𝐯i, 𝐯j〉= 0 for all i ≠ j
For the set of vectors {𝐯₁, …, 𝐯n}, when is it orthonormal?
If it is orthogonal and ||𝐯i|| = 1 for all i
When given an orthogonal set {𝐯₁, …, 𝐯n}, how do you make an orthonormal one?
{𝐯₁ / ||𝐯₁||, …, 𝐯n / ||𝐯n||}
What is meant by normalising the set {𝐯₁, …, 𝐯n}?
Dividing each vector in the set by its own norm
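A small sketch of normalising an orthogonal set (NumPy assumed; the vectors are an arbitrary illustrative choice):

```python
import numpy as np

def normalise(vectors):
    # Divide each vector by its own norm (assumes no zero vectors).
    return [v / np.linalg.norm(v) for v in vectors]

orth = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]  # orthogonal, not unit length
orthonorm = normalise(orth)
# Each result has norm 1, and orthogonality is preserved by scaling.
```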
What things do you need to show if you want to prove something is an inner product?
1) conjugate symmetry
2) linearity
3) positive definite
What is the Gram matrix of a set of vectors {𝐯₁, …, 𝐯n}?
The n by n matrix A with elements A_ij = 〈𝐯i, 𝐯j〉
(Try writing the matrix out explicitly.)
In terms of the Gram matrix, when are the vectors {𝐯₁, …, 𝐯n} linearly independent?
When the Gram matrix is non-singular
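A sketch of building a Gram matrix under the standard inner product and using it to test independence (NumPy assumed; the vectors are an arbitrary illustrative choice):

```python
import numpy as np

def gram_matrix(vectors):
    # A_ij = <v_i, v_j>; with the vectors as rows of V, this is V V*.
    V = np.array(vectors, dtype=complex)
    return V @ V.conj().T

vs = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
G = gram_matrix(vs)
independent = abs(np.linalg.det(G)) > 1e-10  # non-singular <=> independent
# An orthonormal set has the identity as its Gram matrix.
G_id = gram_matrix([[1, 0], [0, 1]])
```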
When is a matrix A singular?
If A𝐮 = 𝟎 for some 𝐮 ≠ 𝟎
(equivalent to det A = 0)
What is the Gram matrix of an orthogonal set?
The Gram matrix is diagonal
What is the Gram matrix of an orthonormal set?
The Gram matrix is the identity matrix
If 𝐯 ≠ 𝟎 and 𝐰 are vectors, then what is 𝐯 orthogonal to?
𝐰 - (⟨𝐰, 𝐯⟩/ ⟨𝐯, 𝐯⟩) 𝐯
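This fact (the basis of the Gram-Schmidt process below) is easy to verify numerically; a sketch with arbitrary vectors, NumPy assumed:

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
w = np.array([3.0, 1.0, 0.0])
# Remove from w its component along v; the residual is orthogonal to v.
r = w - (np.dot(w, v) / np.dot(v, v)) * v
```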
What is the Cauchy-Schwarz inequality?
Let 𝐮, 𝐯 belong to an inner product space. Then
|⟨𝐮, 𝐯⟩| ≤ ||𝐮|| ||𝐯||
What is the triangle inequality?
For 𝐮, 𝐯 in an inner product space,
||𝐮 + 𝐯|| ≤ ||𝐮|| + ||𝐯||
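Both inequalities can be spot-checked on random vectors (a numerical sketch, NumPy assumed; the small tolerance guards against rounding):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=5)
v = rng.normal(size=5)
norm = np.linalg.norm
cauchy_schwarz = abs(np.dot(u, v)) <= norm(u) * norm(v) + 1e-12
triangle = norm(u + v) <= norm(u) + norm(v) + 1e-12
```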
What is the Gram-Schmidt process?
Given a linearly independent set {𝐰₁, …, 𝐰n}, define a new set of vectors as
𝐯₁ = 𝐰₁
𝐯₂ = 𝐰₂ - (⟨𝐰₂, 𝐯₁⟩ / ⟨𝐯₁, 𝐯₁⟩) 𝐯₁
𝐯₃ = 𝐰₃ - (⟨𝐰₃, 𝐯₁⟩ / ⟨𝐯₁, 𝐯₁⟩) 𝐯₁ - (⟨𝐰₃, 𝐯₂⟩ / ⟨𝐯₂, 𝐯₂⟩) 𝐯₂
etc.
What is true of the set {𝐯₁, …, 𝐯n} that is formed in the Gram-Schmidt process? How is an orthonormal set made from this?
{𝐯₁, …, 𝐯n} is an orthogonal set which is linearly independent and has the same span as {𝐰₁, …, 𝐰n}.
For an orthonormal set, normalise the 𝐯s.
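The process above can be sketched in code (a minimal NumPy version; it subtracts each projection in turn, which matches the formulas because the earlier 𝐯s are mutually orthogonal):

```python
import numpy as np

def gram_schmidt(ws):
    # Orthogonalise a linearly independent list of vectors.
    vs = []
    for w in ws:
        v = np.array(w, dtype=float)
        for q in vs:
            v -= (np.dot(v, q) / np.dot(q, q)) * q  # subtract projection onto q
        vs.append(v)
    return vs

vs = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
# Pairwise inner products are (numerically) zero; normalise the results
# to obtain an orthonormal set.
```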
Does every finite-dimensional inner product space have an orthonormal basis?
Yes
What is an upper triangular matrix?
A matrix R whose entries below the main diagonal are all zero
i.e. R_ij = 0 for i > j
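A quick check in code (NumPy's `triu` keeps exactly the entries with i ≤ j, zeroing the rest):

```python
import numpy as np

R = np.array([[1, 2, 3],
              [0, 4, 5],
              [0, 0, 6]])
# Upper triangular: R_ij = 0 whenever i > j.
is_upper = np.array_equal(R, np.triu(R))
```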
If U is a subspace of V, what is the orthogonal subspace to U?
U⟂ = {𝐯 ∈ V: ⟨𝐯, 𝐮⟩ = 0 ∀𝐮 ∈ U}
What is orthogonal decomposition?
If U is a finite-dimensional subspace of V, then V = U ⊕ U⟂
Why is U⟂ important?
By orthogonal decomposition, a space always splits as V = U ⊕ U⟂ for any finite-dimensional subspace U, so every vector decomposes into a part in U and a part orthogonal to U
What is the best approximation theorem?
Let the vector space V = U ⊕ U⟂, and suppose that U has an orthogonal basis {𝐮₁, …, 𝐮n}. For a given 𝐯 ∈ V, the element
𝐮 = (⟨𝐯, 𝐮₁⟩ / ⟨𝐮₁, 𝐮₁⟩) 𝐮₁ + … + (⟨𝐯, 𝐮n⟩ / ⟨𝐮n, 𝐮n⟩) 𝐮n
is the unique element of U minimising ||𝐯 - 𝐮||
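A numerical sketch (NumPy assumed; an arbitrary example in R³): project 𝐯 onto a plane U with an orthogonal basis, and check the residual lies in U⟂:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])   # orthogonal basis of U (a plane in R^3)
u2 = np.array([1.0, -1.0, 0.0])
v = np.array([2.0, 3.0, 4.0])
# u = sum_i (<v, u_i>/<u_i, u_i>) u_i -- the best approximation to v from U.
u = sum((np.dot(v, ui) / np.dot(ui, ui)) * ui for ui in (u1, u2))
# v - u is orthogonal to U, which is why u minimises ||v - u||.
```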
How would you show U ⊕ V = R³?
1) show U+V = R³ using span
2) show U ∩ V = {𝟎}
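For a concrete (hypothetical, not from the notes) example: when dim U + dim V = 3, stacking spanning vectors into a matrix and checking it is non-singular establishes both steps at once:

```python
import numpy as np

# U = span{u1, u2}, V = span{w1} -- an illustrative choice of subspaces.
u1, u2, w1 = [1, 0, 0], [0, 1, 0], [1, 1, 1]
M = np.array([u1, u2, w1], dtype=float)
# det(M) != 0 => the three vectors form a basis of R^3,
# so U + V = R^3 and U ∩ V = {0}, i.e. U ⊕ V = R^3.
is_direct_sum = abs(np.linalg.det(M)) > 1e-10
```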