Lin Alg 2 Flashcards

1
Q

the “dot product” on Rⁿ (column vectors)

:= [ ]

A

vᵗw
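In code this is the ordinary vector dot product; a minimal numpy sketch (the vectors are hypothetical):

```python
import numpy as np

# Hypothetical vectors in R^3; the dot product is v^t w.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 0.5])

print(v @ w)  # 1*4 + 2*(-1) + 3*0.5 = 3.5
```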

2
Q

There is also a related “dot product” on Cⁿ

:= [ ]

A

v̄ᵗw (conjugate transpose of v, times w)
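A quick numpy illustration with hypothetical vectors; note that np.vdot conjugates its first argument, matching the convention v̄ᵗw:

```python
import numpy as np

v = np.array([1 + 1j, 2.0])
w = np.array([3.0, -1j])

manual = np.conj(v) @ w    # v̄^t w written out
builtin = np.vdot(v, w)    # np.vdot conjugates its first argument

print(manual, builtin)     # both give (3-5j)
```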

3
Q

Let V be a vector space over a field F. A bilinear form on V is a map
F : [ ]
such that for all u, v, w ∈ V, λ ∈ F: [ ]

A
A bilinear form on V is a map
F : V × V → F
such that for all u, v, w ∈ V, λ ∈ F:
(i) F(u + v, w) = F(u, w) + F(v, w)
(ii) F(u, v + w) = F(u, v) + F(u, w)
(iii) F(λv, w) = λF(v, w) = F(v, λw).
4
Q

We say a bilinear form F is symmetric if [ ]
We say a bilinear form F is non-degenerate if [ ]
We say a bilinear form F is positive definite if [ ]

A

We say,
F is symmetric if: F(v, w) = F(w, v) for all v, w ∈ V .
F is non-degenerate if: F(v, w) = 0 for all v ∈ V implies w = 0.
When the field is R we'll say F is positive definite if for all v =/= 0 in V: F(v, v) > 0.
Note that a positive definite form is always non-degenerate (since F(v, v) cannot
be 0 for v =/= 0).
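As a concrete instance, any symmetric matrix A with positive eigenvalues defines a symmetric positive definite bilinear form F(v, w) = vᵗAw on Rⁿ; a small numpy sketch with a hypothetical A:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric; eigenvalues 1 and 3, both > 0

def F(v, w):
    """Bilinear form F(v, w) = v^t A w."""
    return v @ A @ w

v = np.array([0.3, -0.7])
w = np.array([1.0, 2.0])

print(np.isclose(F(v, w), F(w, v)))   # symmetric, since A = A^t
print(np.linalg.eigvalsh(A))          # positive eigenvalues => F(v, v) > 0 for v != 0
print(F(v, v) > 0)                    # True
```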

5
Q

A real vector space V endowed with a bilinear, symmetric positive definite form
F(·, ·) is called an [ ]

A

inner product space

6
Q

Let V be a vector space over C. A sesquilinear form on V is a map F: [ ]

such that for all u, v, w ∈ V, λ ∈ C : [ ]

A
A sesquilinear form on V is a map
F : V × V → C
such that for all u, v, w ∈ V, λ ∈ C :
(i) F(u + v, w) = F(u, w) + F(v, w)
(ii) F(u, v + w) = F(u, v) + F(u, w)
(iii) F(λv, w) = ¯λF(v, w) and F(v, λw) = λF(v, w)
(conjugate linear in the first slot, linear in the second).
7
Q

We say a sesquilinear form F is conjugate symmetric if [ ], and, if so, F(v, v)…

A

F(v, w) = ¯F(w, v) for all v, w ∈ V (the bar denotes complex conjugation),

and, if so, F(v, v) ∈ R as F(v, v) = ¯F(v, v).

8
Q

A complex vector space V with a sesquilinear, conjugate symmetric, positive definite form F = < ·, · >
is called…

A

A complex vector space V with a sesquilinear, conjugate symmetric, positive definite form F = < ·, · >
is called a (complex) inner product space

9
Q

Given a real or complex inner product space, we say {w1, · · · , wn} are mutually orthogonal
if…
and are orthonormal if…

A

< wi, wj > = 0 for all i =/= j,

they are orthonormal if they are mutually orthogonal and < wi, wi > = 1 for each i.

10
Q

Let V be an inner product space over K (equal R or C) and {w1, · · · , wn} ⊂ V
be orthogonal with wi =/= 0 for all i. Then w1, · · · , wn are…

A

linearly independent

11
Q

Let V be an inner product space over K (equal R or C) and {w1, · · · , wn} ⊂ V
be orthogonal with wi =/= 0 for all i. Then w1, · · · , wn are linearly independent.

Prove it.

A

bottom of pg 37

(Sketch: suppose a1w1 + · · · + anwn = 0. Taking the inner product with wj gives
0 = < wj, a1w1 + · · · + anwn > = aj< wj, wj >, and < wj, wj > =/= 0 since wj =/= 0, so aj = 0 for each j.)

12
Q

Perform the Gram-Schmidt orthonormalisation process on B = {v1, · · · , vn}, a basis of the inner product space V over K = R, C.

A

Top of pg 38
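The course's statement of the process is on pg 38. As a sketch, the standard real-case algorithm (hypothetical input basis) subtracts from each vₖ its projections onto the already-built orthonormal vectors and then normalises:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise linearly independent real vectors.

    w_k = v_k - sum_j < e_j, v_k > e_j, then e_k = w_k / ||w_k||.
    """
    basis = []
    for v in vectors:
        w = v - sum((e @ v) * e for e in basis)  # remove components along e_1..e_{k-1}
        basis.append(w / np.linalg.norm(w))
    return basis

# Hypothetical basis of R^2.
B = [np.array([3.0, 1.0]), np.array([2.0, 2.0])]
e1, e2 = gram_schmidt(B)
print(abs(e1 @ e2) < 1e-12, abs(e1 @ e1 - 1) < 1e-12)  # True True
```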

13
Q

Every finite dimensional inner product space V over K = R, C has an orthonormal [ ]
Prove it.

A

basis

done by Gram-Schmidt orthonormalisation process

14
Q

Let V be an inner product space over K = R, C. Then for all v ∈ V ,
< v, · > : V → K
w → < v, w >
is a…

A

linear functional, as < , > is linear in the second co-ordinate

15
Q

The map defined by v → < v, · > is a natural…

every complex vector space V is in particular a real vector space, and if it is finite dimensional then…

A

natural injective R-linear map φ : V → V′,
which is an isomorphism when V is finite dimensional.

2 dim𝒸 V = dimᵣ V.

16
Q

The map defined by v → < v, · > is a natural injective R-linear map φ : V → V′,
which is an isomorphism when V is finite dimensional.
Prove it.

A

Note φ : v → < v, · >, so we must first show φ(v + λw) = φ(v) + λφ(w) for all v, w ∈
V, λ ∈ R, i.e.

< v + λw, ·> = < v, · > + λ< w, · >.

And this is true. So φ is R-linear. (Note it is conjugate linear for λ ∈ C.) As < ·, · > is non-degenerate, < v, · > is not the zero functional unless v = 0 (since < v, w > = ¯< w, v >). Hence, φ is injective.

If V is finite dimensional, then
dimᵣ V = dimᵣ V′, and hence Im φ = V′.
Thus, φ is surjective and hence
an R-linear isomorphism.

17
Q

Let U ⊆ V be a subspace of an inner product space V . The orthogonal complement is defined as…

A

U⊥ := {v ∈ V | < u, v > = 0 for all u ∈ U}

18
Q

We have that U⊥ is a subspace of…

A

V

19
Q

We have that U⊥ is a subspace of V

Prove it.

A

First 0 ∈ U⊥. Now let v, w ∈ U⊥ and λ ∈ K. Then, for all u ∈ U,
< u, v + λw > = < u, v > + λ< u, w > = 0 + 0 = 0.

20
Q
  1. U ∩ U⊥ = …
  2. U ⊕ U⊥ = … (and so dim U⊥ = …)
  3. (U + W)⊥ = …
  4. (U ∩ W)⊥ ⊇ … (with equality if …)
  5. U ⊆ … (with equality if …)
A
  1. U ∩ U⊥ = {0}
  2. U ⊕ U⊥ = V if V is finite dimensional (and so dim U⊥ = dim V − dim U)
  3. (U + W)⊥ = U⊥ ∩ W⊥
  4. (U ∩ W)⊥ ⊇ U⊥ + W⊥ (with equality if dim V < ∞)
  5. U ⊆ (U⊥)⊥ (with equality if V is finite dimensional)
21
Q

Prove

  1. U ∩ U⊥ = {0}
  2. U ⊕ U⊥ = V if V is finite dimensional (and so dim U⊥ = dim V − dim U)
  3. (U + W)⊥ = U⊥ ∩ W⊥
  4. (U ∩ W)⊥ ⊇ U⊥ + W⊥ (with equality if dim V < ∞)
  5. U ⊆ (U⊥)⊥ (with equality if V is finite dimensional)
A

pg 40

  1. and 4. are exercises
22
Q

Let V be finite dimensional. Then, under the R-linear isomorphism φ : V → V′
given by v → < v, · >, the space U⊥ maps…

A

Let V be finite dimensional. Then, under the R-linear isomorphism φ : V → V′
given by v → < v, · >, the space U⊥ maps isomorphically to U⁰
(considered as R vector spaces).

23
Q

Let V be finite dimensional. Then, under the R-linear isomorphism φ : V → V′
given by v → < v, · >, the space U⊥ maps isomorphically to U⁰
(considered as R vector spaces).

Prove it.

A

Let v ∈ U⊥. Then for all u ∈ U,
< v, u > = ¯< u, v > = 0, and hence < v, · > ∈ U⁰.
Hence Im(φ|ᵤ⊥ ) ⊆ U⁰.

We also have that dim U⊥ = dim V − dim U = dim U⁰, and so φ(U⊥) = U⁰.

(Note that φ has trivial kernel so we need only check equality of dimensions.)

24
Q

Given a linear map T : V → V , a linear map T∗: V → V is its adjoint if…

A

Given a linear map T : V → V , a linear map T∗: V → V is its adjoint if for all v, w ∈ V ,
< v, T(w) > = < T∗(v), w >.

25
Q

If T∗ exists it is...

A

unique

26
Q

Prove that if T∗ exists it is unique.

A

Let T̃ be another map satisfying (∗). Then for all v, w ∈ V,
< T∗(v) − T̃(v), w > = < T∗(v), w > − < T̃(v), w > = < v, T(w) > − < v, T(w) > = 0.
But < , > is non-degenerate, hence T∗(v) − T̃(v) = 0 for all v ∈ V, and so T∗ = T̃.
27
Q

Let T : V → V be linear where V is finite dimensional. Then the adjoint...

A

exists and is linear.

28
Q

Let T : V → V be linear where V is finite dimensional. Then the adjoint exists and is linear.
Prove it.

A

Bottom of pg 42
29
Q

Let T : V → V be linear and let B = {e1, · · · , en} be an orthonormal basis for V. Then ᵦ[T∗]ᵦ = ...

A

ᵦ[T∗]ᵦ = ¯(ᵦ[T]ᵦ)ᵗ.

30
Q

Let T : V → V be linear and let B = {e1, · · · , en} be an orthonormal basis for V. Then ᵦ[T∗]ᵦ = ¯(ᵦ[T]ᵦ)ᵗ.
Prove it.

A

Let A = ᵦ[T]ᵦ. Then aᵢⱼ = < eᵢ, T(eⱼ) >. Let B = ᵦ[T∗]ᵦ. Then
bᵢⱼ = < eᵢ, T∗(eⱼ) > = ¯< T∗(eⱼ), eᵢ > = ¯< eⱼ, T(eᵢ) > = ¯aⱼᵢ,
and hence B = ¯Aᵗ.
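A numerical sanity check of this with a hypothetical matrix and vectors (np.vdot conjugates its first argument, matching < v, w > = v̄ᵗw):

```python
import numpy as np

A = np.array([[1 + 2j, 0.5],
              [3j, -1.0]])
A_star = A.conj().T   # matrix of the adjoint in an orthonormal basis

v = np.array([1.0, 1j])
w = np.array([2.0, -1.0])

# Defining property of the adjoint: < v, A w > = < A* v, w >.
lhs = np.vdot(v, A @ w)
rhs = np.vdot(A_star @ v, w)
print(np.isclose(lhs, rhs))   # True
```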
31
Q

Let S, T : V → V be linear, V finite dimensional and λ ∈ K. Then:
(1) (S + T)∗ = ...
(2) (λT)∗ = ...
(3) (ST)∗ = ...
(4) (T∗)∗ = ...
(5) If mT is the minimal polynomial of T then mT∗ = ...
Prove it.

A

(1) (S + T)∗ = S∗ + T∗
(2) (λT)∗ = ¯λT∗
(3) (ST)∗ = T∗S∗
(4) (T∗)∗ = T
(5) If mT is the minimal polynomial of T then mT∗ = ¯mT.
The proof is an exercise.

32
Q

A linear map T : V → V is self-adjoint if...

A

T = T∗
33
Q

If λ is an eigenvalue of a self-adjoint linear operator then...

A

λ ∈ R

34
Q

If λ is an eigenvalue of a self-adjoint linear operator then λ ∈ R.
Prove it.

A

Assume w =/= 0 and T(w) = λw for some λ ∈ C. Then
λ< w, w > = < w, λw > = < w, T(w) > = < T∗(w), w > = < T(w), w > = < λw, w > = ¯λ< w, w >.
Hence, as < w, w > =/= 0, λ = ¯λ and so λ ∈ R.
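This can be checked numerically: the eigenvalues of a hypothetical Hermitian matrix come out real.

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)   # self-adjoint: A = Ā^t

eigvals = np.linalg.eigvals(A)
print(np.allclose(eigvals.imag, 0))        # True: eigenvalues are real
print(np.round(np.sort(eigvals.real), 6))  # [1. 4.]
```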
35
Q

If T is [ ] and U ⊆ V is T-invariant, then so is U⊥.

A

If T is self-adjoint and U ⊆ V is T-invariant, then so is U⊥.

36
Q

If T is self-adjoint and U ⊆ V is T-invariant, then so is U⊥.
Prove it.

A

Let w ∈ U⊥. Then for all u ∈ U,
< u, T(w) > = < T∗(u), w > = < T(u), w > = 0,
as T(u) ∈ U and w ∈ U⊥. Hence, T(w) ∈ U⊥.
37
Q

If T : V → V is self-adjoint and V is finite dimensional, then there exists an [ ] for T.

A

If T : V → V is self-adjoint and V is finite dimensional, then there exists an orthonormal basis of eigenvectors for T.

38
Q

If T : V → V is self-adjoint and V is finite dimensional, then there exists an orthonormal basis of eigenvectors for T.
Prove it.

A

Thm 8.21 pg 44
39
Q

What does it mean for a matrix A to be orthogonal/unitary?

A

A¯Aᵗ = ¯AᵗA = I, i.e. A⁻¹ = ¯Aᵗ; A is called orthogonal if K = R and unitary if K = C.
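A small numpy check on a hypothetical unitary matrix (a real rotation times a unit-modulus phase):

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta), np.cos(theta)]])
A = np.exp(0.3j) * R   # unitary: a real rotation times a phase of modulus 1

print(np.allclose(A.conj().T @ A, np.eye(2)))      # Ā^t A = I: True
print(np.allclose(np.linalg.inv(A), A.conj().T))   # A^{-1} = Ā^t: True
```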
40
Q

Let V be a finite dimensional inner product space and T : V → V be a linear transformation. If T∗ = T⁻¹ then T is called...

A

orthogonal when K = R; unitary when K = C.

41
Q

The following are equivalent:
(1) T∗ = T⁻¹;
(2) T ...
(3) T ...

A

(1) T∗ = T⁻¹;
(2) T preserves inner products: < v, w > = < Tv, Tw > for all v, w ∈ V;
(3) T preserves lengths: ||v|| = ||Tv|| for all v ∈ V.
42
Q

The following are equivalent:
(1) T∗ = T⁻¹;
(2) T preserves inner products: < v, w > = < Tv, Tw > for all v, w ∈ V;
(3) T preserves lengths: ||v|| = ||Tv|| for all v ∈ V.
Prove it.

A

top of pg 45

43
Q

The length function determines the inner product: Given two inner products < , >₁ and < , >₂ ...

A

< v, v >₁ = < v, v >₂ ∀ v ∈ V ⇔ < v, w >₁ = < v, w >₂ ∀ v, w ∈ V.
44
Q

The length function determines the inner product: Given two inner products < , >₁ and < , >₂,
< v, v >₁ = < v, v >₂ ∀ v ∈ V ⇔ < v, w >₁ = < v, w >₂ ∀ v, w ∈ V.
Prove it.

A

The implication ⇐ is trivial. For the implication ⇒ note that
< v + w, v + w > = < v, v > + < v, w > + ¯< v, w > + < w, w >
and (for K = C)
< v + iw, v + iw > = < v, v > + i< v, w > − i¯< v, w > + < w, w >.
Hence,
Re < v, w > = 1/2(||v + w||² − ||v||² − ||w||²)
Im < v, w > = −1/2(||v + iw||² − ||v||² − ||w||²).
Thus the inner product is given in terms of the length function.
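The two polarisation formulas can be verified numerically with hypothetical vectors (np.vdot conjugates its first argument, matching the convention here):

```python
import numpy as np

v = np.array([1 + 2j, -1.0])
w = np.array([0.5j, 3.0])

def norm2(x):
    return np.vdot(x, x).real   # ||x||^2

ip = np.vdot(v, w)                                     # < v, w >
re = 0.5 * (norm2(v + w) - norm2(v) - norm2(w))        # Re < v, w >
im = -0.5 * (norm2(v + 1j * w) - norm2(v) - norm2(w))  # Im < v, w >
print(np.isclose(ip, re + 1j * im))                    # True
```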
45
Q

Note that inner product spaces are metric spaces with d(v, w) = ||v − w|| and orthogonal/unitary linear transformations are isometries, so we have another equivalence:
(4) d(v, w) = ... = ... = ... for all v, w ∈ V
(together with (1) T∗ = T⁻¹; (2) T preserves inner products: < v, w > = < Tv, Tw > for all v, w ∈ V; (3) T preserves lengths: ||v|| = ||Tv|| for all v ∈ V).

A

(4) d(v, w) = ||v − w|| = ||Tv − Tw|| = d(Tv, Tw) for all v, w ∈ V.

46
Q

What is:
1. the orthogonal group
2. the special orthogonal group
3. the unitary group
4. the special unitary group

A

O(n) = {A ∈ Mn×n(R) | AᵗA = Id}, the orthogonal group
SO(n) = {A ∈ O(n) | det A = 1}, the special orthogonal group
U(n) = {A ∈ Mn×n(C) | ¯AᵗA = Id}, the unitary group
SU(n) = {A ∈ U(n) | det A = 1}, the special unitary group.
47
Q

If λ is an eigenvalue of an orthogonal/unitary linear transformation T : V → V, then |λ| = ...

A

1

48
Q

If λ is an eigenvalue of an orthogonal/unitary linear transformation T : V → V, then |λ| = 1.
Prove it.

A

Let v =/= 0 be a λ-eigenvector. Then < v, v > = < Tv, Tv > by Theorem 8.23, which equals
< λv, λv > = λ¯λ< v, v >,
and so 1 = λ¯λ, that is |λ| = 1.
49
Q

If A is an orthogonal/unitary n × n-matrix then |det A| = ...

A

|det A| = 1

50
Q

If A is an orthogonal/unitary n × n-matrix then |det A| = 1.
Prove it.

A

Working over C we know that det A is the product of all eigenvalues (with repetitions). Hence,
|det A| = |λ1λ2 · · · λn| = |λ1||λ2| · · · |λn| = 1.
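A numerical check, building a hypothetical unitary A = QDQ* from a unitary Q and unit-modulus eigenvalues:

```python
import numpy as np

# Q is unitary (QR of an invertible complex matrix); D has eigenvalues on the unit circle.
Q, _ = np.linalg.qr(np.array([[1 + 1j, 2.0],
                              [0.5j, 1.0]]))
D = np.diag(np.exp(1j * np.array([0.4, -1.1])))
A = Q @ D @ Q.conj().T

print(np.allclose(A.conj().T @ A, np.eye(2)))    # A is unitary: True
print(np.isclose(abs(np.linalg.det(A)), 1.0))    # |det A| = 1: True
```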
51
Q

Assume that V is finite dimensional and T : V → V with T∗T = Id. Then if U is T-invariant so is...

A

U⊥

52
Q

Assume that V is finite dimensional and T : V → V with T∗T = Id. Then if U is T-invariant so is U⊥.
Prove it.

A

Let w ∈ U⊥. Then for all u ∈ U,
< u, Tw > = < T∗u, w > = < T⁻¹u, w >.
As U is invariant under T it must be invariant under T⁻¹ (= T∗). (This follows since writing mT = xᵐ + aₘ₋₁xᵐ⁻¹ + · · · + a1x + a0 we see that T(Tᵐ⁻¹ + aₘ₋₁Tᵐ⁻² + · · · + a1) = −a0I, and since a0 =/= 0 we get that T⁻¹ is a polynomial in T.) Hence, T⁻¹u ∈ U and < T⁻¹u, w > = 0 for all u ∈ U. Thus < u, Tw > = 0 for all u ∈ U, so Tw ∈ U⊥.
53
Q

Assume V is finite dimensional and T : V → V is unitary. Then there exists an ... of eigenvectors.

A

orthonormal basis

54
Q

Assume V is finite dimensional and T : V → V is unitary. Then there exists an orthonormal basis of eigenvectors.
Prove it.

A

As K = C is algebraically closed, there exist λ and v =/= 0 ∈ V such that Tv = λv. Then U = < v > is T-invariant and so is its complement U⊥. Therefore the restriction T|ᵤ⊥ is a map of U⊥ to itself which satisfies the hypothesis of the theorem. Working by induction on the dimension n := dim(V) and noting that dim U⊥ = n − 1, we may assume that there exists an orthonormal basis {e2, · · · , en} of U⊥. Put e1 = v/||v||. Then {e1, e2, · · · , en} is an orthonormal basis of eigenvectors for V.
55
Q

Let A ∈ U(n). Then there exists P ∈ U(n) such that P⁻¹AP ...

A

is diagonal

56
Q

Let T : V → V be orthogonal and V be a finite dimensional real vector space. Then there exists an orthonormal basis B such that ᵦ[T]ᵦ = ... Prove it.

A

bottom of pg 47 (block matrix)