Linear Algebra Flashcards

1
Q

Define a field

A

A set F with two binary operations + and × is a field if both (F, +, 0) and
(F \ {0}, ×, 1) are abelian groups and the distributive law holds:
(a + b)c = ac + bc, for all a, b, c ∈ F.

2
Q

Define the characteristic of F

A

The smallest positive integer p such that
1 + 1 + · · · + 1 (p times) = 0
is called the characteristic of F. If no such p exists, the characteristic of F is defined to be zero.
If such a p exists, it is necessarily prime.
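(Supplementary note, not part of the original card: a quick computational illustration in plain Python, assuming we model Z/pZ by integer arithmetic mod p. For that field the characteristic is exactly p, found below by adding 1 to itself until the sum returns to 0.)

```python
# Minimal sketch (assumption: Z/pZ modelled by integer arithmetic mod p).
def characteristic_mod_p(p: int) -> int:
    """Smallest k > 0 with 1 + 1 + ... + 1 (k times) == 0 in Z/pZ."""
    total, k = 0, 0
    while True:
        total = (total + 1) % p
        k += 1
        if total == 0:
            return k

assert characteristic_mod_p(2) == 2
assert characteristic_mod_p(7) == 7   # prime, as the card states
```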

3
Q

Define a vector space V over a field F in terms of groups

A

A vector space V over a field F is an abelian group (V, +, 0) together with a scalar multiplication F × V → V such that for all a, b ∈ F, v, w ∈ V :

(1) a(v + w) = av + aw
(2) (a + b)v = av + bv
(3) (ab)v = a(bv)
(4) 1 · v = v

4
Q

Let V be a vector space over F

Define a set S ⊆ V being linearly independent

A

(1) A set S ⊆ V is linearly independent if whenever a1, · · · , an ∈ F and
s1, · · · , sn ∈ S are distinct,
a1s1 + · · · + ansn = 0 ⇒ a1 = · · · = an = 0.

5
Q

Let V be a vector space over F

Define what it means for a set S ⊆ V to be spanning

A

(2) A set S ⊆ V is spanning if for all v ∈ V there exists a1, · · · , an ∈ F and s1, · · · , sn ∈ S with
v = a1s1 + · · · + ansn

6
Q

Let V be a vector space over F

Define what it means for a set S ⊆ V to be a basis of V

A

(3) A set B ⊆ V is a basis of V if B is spanning and linearly independent. The size of B is the
dimension of V

7
Q

Define a linear map/transformation

A

Suppose V and W are vector spaces over F. A map T : V → W is a linear
transformation (or just linear map) if for all a ∈ F, v, v′ ∈ V ,
T(av + v’) = aT(v) + T(v’)

8
Q

What is a bijective linear map called?

A

an isomorphism of vector spaces.

9
Q

What is the assignment T → ᵦ’[T]ᵦ ?

Meant to be fancy B’ subscript

A

an isomorphism of vector spaces from Hom(V, W)

to the space of (m × n)-matrices over F (where n = dim V and m = dim W). It takes composition of maps to multiplication of matrices.

10
Q

In particular, if T : V → V and B and B′ are two different bases with ᵦ’[Id]ᵦ the change of basis matrix then:
ᵦ’[T]ᵦ’ = ???

all Bs are meant to be fancy Bs

A

ᵦ’[T]ᵦ’ = ᵦ’[Id]ᵦ ᵦ[T]ᵦ ᵦ[Id]ᵦ’ with ᵦ’[Id]ᵦ ᵦ[Id]ᵦ’ = I the identity matrix
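(Supplementary note, my own numerical check with numpy, not from the original card: writing P = ᵦ[Id]ᵦ’ for the matrix whose columns are the B’-vectors expressed in B-coordinates, the formula reads ᵦ’[T]ᵦ’ = P⁻¹ ᵦ[T]ᵦ P.)

```python
# Hypothetical 2x2 example checking B'[T]B' = (B'[Id]B)(B[T]B)(B[Id]B').
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # B[T]B: matrix of T in the old basis B
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])        # B[Id]B': new basis vectors in B-coordinates
Pinv = np.linalg.inv(P)           # B'[Id]B

A_new = Pinv @ A @ P              # B'[T]B'
assert np.allclose(Pinv @ P, np.eye(2))   # B'[Id]B  B[Id]B' = I

v_new = np.array([1.0, -2.0])     # coordinates of a vector in the basis B'
lhs = A_new @ v_new               # apply T directly in B'-coordinates
rhs = Pinv @ (A @ (P @ v_new))    # convert to B, apply T, convert back
assert np.allclose(lhs, rhs)
```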

11
Q

Define a ring

A

A non-empty set R with two binary operations + and × is a ring if (R, +, 0) is an
abelian group, the multiplication × is associative and the distributive laws hold: for all a, b, c ∈ R,
(a + b)c = ac + bc and a(b + c) = ab + ac.

12
Q

Define a commutative ring

A

The ring R is called commutative if for all a, b ∈ R we have ab = ba.

13
Q

Define a ring homomorphism

A

A map φ : R → S between two rings is a ring homomorphism if for all
r, r′ ∈ R:
φ(r + r’) = φ(r) + φ(r’) and φ(rr’) = φ(r)φ(r’).

14
Q

Define a ring isomorphism

A

A bijective ring homomorphism is called a ring isomorphism.

15
Q

Define an ideal

A

A non-empty subset I of a ring R is an ideal if for all s, t ∈ I and r ∈ R we have s − t ∈ I and sr, rs ∈ I.

16
Q

What is the first isomorphism theorem? (rings)

A

The kernel Ker(φ) := φ⁻¹(0) of a ring homomorphism φ : R → S is an ideal, its image Im(φ) is a subring of S, and φ induces an isomorphism of rings R/Ker(φ) ≅ Im(φ)

17
Q

Prove the first isomorphism theorem (rings)

A

Exercise

18
Q

What is the “division algorithm” for polynomials?

A
Let f(x), g(x) ∈ F[x] be two polynomials
with g(x) ≠ 0. Then there exist q(x), r(x) ∈ F[x] such that
f(x) = q(x)g(x) + r(x) and deg r(x) < deg g(x).
19
Q

Prove the “division algorithm” for polynomials

A

If deg f(x) < deg g(x), put q(x) = 0, r(x) = f(x). Assume now that deg f(x) ≥ deg g(x)
and let
f(x) = aₙxⁿ + aₙ₋₁xⁿ⁻¹ + … + a₀
g(x) = bₖxᵏ + bₖ₋₁xᵏ⁻¹ + … + b₀
Then
deg( f(x) − (aₙ/bₖ) xⁿ⁻ᵏ g(x) ) < n.
By induction on deg f − deg g, there exist s(x), t(x) such that
f(x) − (aₙ/bₖ) xⁿ⁻ᵏ g(x) = s(x)g(x) + t(x) and deg t(x) < deg g(x).
Hence put q(x) = (aₙ/bₖ) xⁿ⁻ᵏ + s(x) and r(x) = t(x).
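(Supplementary note, my own example assuming sympy: its polynomial division reproduces f = qg + r with deg r < deg g.)

```python
# Minimal sketch of the division algorithm in Q[x] via sympy.div.
import sympy as sp

x = sp.symbols('x')
f = x**4 + 3*x**2 + 1
g = x**2 - x

q, r = sp.div(f, g, x)                       # f = q*g + r
assert sp.expand(q*g + r - f) == 0
assert sp.degree(r, x) < sp.degree(g, x)     # deg r < deg g
print(q, r)                                  # x**2 + x + 4, 4*x + 1
```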

20
Q

For all f(x) ∈ F[x] and a ∈ F,

f(a) = 0 ⇒ ???

A

For all f(x) ∈ F[x] and a ∈ F,

f(a) = 0 ⇒ (x − a)|f(x).

21
Q

For all f(x) ∈ F[x] and a ∈ F,
f(a) = 0 ⇒ (x − a)|f(x).
Prove it

A

By the division algorithm for polynomials there exist q(x), r(x) such that
f(x) = q(x)(x − a) + r(x)
where r(x) = r is constant (as deg r(x) < 1). Evaluating at a gives
f(a) = 0 = q(a)(a − a) + r = r
and hence r = 0.

22
Q

Assume f ≠ 0. If deg f ≤ n then f has [ ] roots

A

Assume f ≠ 0. If deg f ≤ n then f has at most n roots.

23
Q

Assume f ≠ 0. If deg f ≤ n then f has at most n roots.

Prove it

A

Follows from
For all f(x) ∈ F[x] and a ∈ F,
f(a) = 0 ⇒ (x − a)|f(x).
and induction

24
Q

Let a(x), b(x) ∈ F[x] be two polynomials. Let c(x) be a monic polynomial of highest degree dividing
both a(x) and b(x) and write c = gcd(a, b) (also written, less commonly, hcf(a, b)).
Let a, b ∈ F[x] be non-zero polynomials and let gcd(a, b) = c. Then there exist
s, t ∈ F[x] such that:
a(x)s(x) + b(x)t(x) =

A

Let a, b ∈ F[x] be non-zero polynomials and let gcd(a, b) = c. Then there exist
s, t ∈ F[x] such that:
a(x)s(x) + b(x)t(x) = c(x)

25
Q

Let a(x), b(x) ∈ F[x] be two polynomials. Let c(x) be a monic polynomial of highest degree dividing
both a(x) and b(x) and write c = gcd(a, b) (also written, less commonly, hcf(a, b)).
Let a, b ∈ F[x] be non-zero polynomials and let gcd(a, b) = c. Then there exist
s, t ∈ F[x] such that:
a(x)s(x) + b(x)t(x) = c(x)

Prove it

A

If c ≠ 1, divide a and b by c. We may thus assume deg(a) ≥ deg(b) and gcd(a, b) = 1, and
will proceed by induction on deg(a) + deg(b).
By the division algorithm there exist q, r ∈ F[x] such that
a = qb + r with deg(b) > deg(r).
Then deg(a) + deg(b) > deg(b) + deg(r) and gcd(b, r) = 1.
If r = 0 then b(x) = λ is constant since gcd(a, b) = 1. Hence
a(x) + b(x)(1/λ)(1 − a(x)) = 1.
Assume r ≠ 0. Then by the induction hypothesis, there exist s′, t′ ∈ F[x] such that
bs′ + rt′ = 1.
Hence
bs′ + (a − qb)t′ = 1 and at′ + b(s′ − qt′) = 1.
So, we may put t = t′ and s = s′ − qt′.
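(Supplementary note, my own example assuming sympy: the extended Euclidean algorithm produces the polynomials s and t of the statement explicitly.)

```python
# Minimal sketch: Bezout coefficients for two polynomials via sympy.gcdex.
import sympy as sp

x = sp.symbols('x')
a = (x - 1)*(x + 2)
b = (x - 1)*(x + 3)

s, t, c = sp.gcdex(a, b, x)                  # a*s + b*t = c = gcd(a, b)
assert sp.expand(a*s + b*t - c) == 0
print(c)                                     # the monic gcd, here x - 1
```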

26
Q

Let A ∈ Mn(F) and f(x) = aₖxᵏ + · · · + a₀ ∈ F[x]. Then

f(A) := [ ]

A

aₖAᵏ + · · · + a₀I ∈ Mn(F).
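(Supplementary note, my own numerical sketch with numpy: evaluating f(x) = x² − 3x + 2 at a matrix replaces x by A and the constant term by a₀I.)

```python
# Minimal sketch: f(A) = A^2 - 3A + 2I for a concrete 2x2 matrix.
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 4.0]])
I = np.eye(2)

fA = A @ A - 3*A + 2*I
print(fA)
```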

27
Q

Let A ∈ Mn(F) and f(x) = aₖxᵏ + · · · + a₀ ∈ F[x]. Then
f(A) := aₖAᵏ + · · · + a₀I ∈ Mn(F).
Since AᵖAʳ = AʳAᵖ and λA = Aλ for p, r ≥ 0 and λ ∈ F, then for all f(x), g(x) ∈ F[x] we have that
f(A)g(A) =
Av = λv ⇒

A
f(A)g(A) = g(A)f(A);
Av = λv ⇒ f(A)v = f(λ)v
28
Q

For all A ∈ Mn(F), there exists a non-zero polynomial f(x) ∈ F[x] such that f(A) = ?

A

For all A ∈ Mn(F), there exists a non-zero polynomial f(x) ∈ F[x] such that f(A) = 0

29
Q

For all A ∈ Mn(F), there exists a non-zero polynomial f(x) ∈ F[x] such that
f(A) = 0
Prove it

A
Note that the dimension dim Mn(F) = n² is finite. Hence {I, A, A², · · · , Aᵏ} as a subset of Mn(F) is linearly dependent for k ≥ n². So there exist scalars aᵢ ∈ F, not all zero, such that
aₖAᵏ + · · · + a₀I = 0,
and f(x) = aₖxᵏ + · · · + a₀ is an annihilating polynomial.
30
Q

For any (n × n)-matrix A,
the assignment f(x) ↦ f(A) defines a ring homomorphism
Eₐ: F[x] → ???
Capital subscript A

A

Eₐ: F[x] → Mₙ(F)

31
Q
For any (n × n)-matrix A,
the assignment f(x) ↦ f(A) defines a ring homomorphism
Eₐ: F[x] → Mₙ(F)

“For all A ∈ Mn(F), there exists a non-zero polynomial f(x) ∈ F[x] such that f(A) = 0” tells us what about the kernel?

A

The kernel is non-zero

32
Q
For any (n × n)-matrix A,
the assignment f(x) ↦ f(A) defines a ring homomorphism
Eₐ: F[x] → Mₙ(F)

As F[x] is commutative so is the [ ]

A

As F[x] is commutative so is the image of Eₐ, that is f(A)g(A) = g(A)f(A) for all polynomials f and g.

33
Q

What is the minimal polynomial of A?

A

The minimal polynomial of A, denoted by mₐ(x), is the monic polynomial
p(x) of least degree such that p(A) = 0.

should be a capital subscript A

34
Q
Thm:
If f(A) = 0 then mₐ divides ...
Furthermore mₐ is [ ] (hence showing that mₐ is well-defined)
A
If f(A) = 0 then mₐ | f.
Furthermore mₐ is unique (hence showing that mₐ is well-defined).
35
Q
If f(A) = 0 then mₐ | f.
Furthermore mₐ is unique (hence showing that mₐ is well-defined).

Prove it

A

By the division algorithm, there exist polynomials q, r with deg r < deg mₐ such that
f = q mₐ + r.
Evaluating both sides at A gives r(A) = 0. By the minimality property of mₐ,
r = 0 and mₐ divides f.
To show uniqueness, let m be another monic polynomial of minimal degree with m(A) = 0. Then by the above mₐ | m. Also m and mₐ must have the same degree, and so
m = a mₐ for some a ∈ F. Since both polynomials are monic it follows that a = 1 and m = mₐ.
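(Supplementary note, my own illustration with numpy: for A = 2I in M₂(F) the characteristic polynomial (x − 2)² annihilates A, but the minimal polynomial is the proper monic divisor x − 2.)

```python
# Minimal sketch: m_A(x) = x - 2 for A = 2I, a proper divisor of chi_A(x) = (x - 2)^2.
import numpy as np

A = 2 * np.eye(2)
I = np.eye(2)

assert np.allclose(A - 2*I, 0)                    # (x - 2) already annihilates A
assert np.allclose((A - 2*I) @ (A - 2*I), 0)      # so does (x - 2)^2, as m_A | chi_A
```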

36
Q

Define the characteristic polynomial of A

A

The characteristic polynomial of A is defined as

χA(x) = det(A − xI).

37
Q

χA(x) = (-1)ⁿxⁿ + …..????

A

χA(x) = (-1)ⁿxⁿ + (-1)ⁿ⁻¹tr(A)xⁿ⁻¹ + … + det(A)

Proof see lin alg 2 (prelims)

38
Q

λ is an eigenvalue of A
⇔ ?? (χA(x))
⇔ ???(mₐ(x))

A

λ is an eigenvalue of A
⇔ λ is a root of χA(x)
⇔ λ is a root of mₐ(x)

39
Q

λ is an eigenvalue of A
⇔ λ is a root of χA(x)
⇔ λ is a root of mₐ(x)

Prove it

A
χA(λ) = 0 ⇔ det(A − λI) = 0
⇔ A − λI is singular
⇔ ∃ v ≠ 0 : (A − λI)v = 0
⇔ ∃ v ≠ 0 : Av = λv
⇒ mₐ(λ)v = mₐ(A)v = 0
⇒ mₐ(λ) = 0 (as v ≠ 0)

Conversely, assume λ is a root of mₐ. Then mₐ(x) = g(x)(x − λ) for some polynomial g. By minimality of mₐ, we have g(A) ≠ 0. Hence there exists w ∈ Fⁿ such that g(A)w ≠ 0. Put
v = g(A)w then
(A − λI)v = mₐ(A)w = 0,
and v is a λ-eigenvector for A.
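(Supplementary note, my own numerical check with numpy: the eigenvalues agree with the roots of the characteristic polynomial; np.poly uses the det(xI − A) convention, which has the same roots as the card's det(A − xI).)

```python
# Minimal sketch: eigenvalues of A = roots of its characteristic polynomial.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals = np.linalg.eigvals(A)        # eigenvalues of A
coeffs = np.poly(A)                   # coefficients of det(xI - A)
roots = np.roots(coeffs)              # its roots

assert np.allclose(sorted(eigvals), sorted(roots))
print(sorted(eigvals))                # [1.0, 3.0]
```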

40
Q

Let C, P, A be (n × n)-matrices such that C = P⁻¹AP. Then m𝒸(x) = mₐ(x) for:
f(C) = f(P⁻¹AP) = P⁻¹f(A)P
for all polynomials f. Thus
0 = m𝒸(C) = [ ] and so m𝒸(A) = 0, and mₐ|m𝒸. Likewise m𝒸|mₐ and therefore mₐ = [ ] as both are monic.

A

Let C, P, A be (n × n)-matrices such that C = P⁻¹AP. Then m𝒸(x) = mₐ(x) for:
f(C) = f(P⁻¹AP) = P⁻¹f(A)P
for all polynomials f. Thus
0 = m𝒸(C) = [P⁻¹m𝒸(A)P] and so m𝒸(A) = 0, and mₐ|m𝒸. Likewise m𝒸|mₐ and therefore mₐ = [m𝒸] as both are monic.

41
Q

Let V be a finite dimensional vector space and T : V → V a linear transformation. Define the minimal polynomial

A

Define the minimal polynomial of T as
mₜ(x) = mₐ(x)
where A = ᵦ[T]ᵦ with respect to some basis B of V. As mₐ(x) = m_{P⁻¹AP}(x), the definition of
mₜ(x) is independent of the choice of basis.

42
Q

For a linear transformation T : V → V define its characteristic polynomial

A

define its characteristic polynomial
as χT (x) = χA(x)
where A = ᵦ[T]ᵦ with respect to some basis B of V. As χA(x) = χ_{P⁻¹AP}(x), the definition of χT(x) is independent of the choice of basis.

43
Q

What does it mean for a field to be algebraically closed?

A

A field F is algebraically closed if every non-constant polynomial in F[x] has a root in F.

44
Q

What is the fundamental theorem of algebra

A

The field of complex numbers C is algebraically closed.

45
Q

What is an algebraic closure of F?

A

An algebraically closed field F¯ containing F with the property that there does not
exist a smaller algebraically closed field L with
F¯ ⊇ L ⊇ F
is called an algebraic closure of F

F¯ is F bar (all fancy Fs)

46
Q

Every field has an algebraic [ ]

A

Every field F has an algebraic closure F¯.

47
Q

Let V be a vector space over a field F and let U be a subspace.
What is the quotient space?

A
The set of cosets
V /U = {v + U | v ∈ V }
with the operations
(v + U) + (w + U) := v + w + U
a(v + U) := av + U
for v, w ∈ V and a ∈ F is a vector space, called the quotient space
48
Q
The set of cosets
V /U = {v + U | v ∈ V }
with the operations
(v + U) + (w + U) := v + w + U
a(v + U) := av + U
for v, w ∈ V and a ∈ F is a vector space, called the quotient space
Prove
A
We need to check that the operations are well-defined. Assume v + U = v′ + U and
w + U = w′ + U. Then v = v′ + u, w = w′ + ũ for u, ũ ∈ U. Hence:
(v + U) + (w + U) = v + w + U
= v′ + u + w′ + ũ + U
= v′ + w′ + U as u + ũ ∈ U
= (v′ + U) + (w′ + U).
Similarly,
a(v + U) = av + U
= av′ + au + U
= av′ + U as au ∈ U
= a(v′ + U)
That these operations satisfy the vector space axioms follows immediately from the fact that the
operations in V satisfy them.
49
Q
Let E be a basis of U, and extend E to a basis B of V (we assume this is possible, which we certainly know to be the case at least for V finite dimensional).
Define
B¯ := {e + U | e ∈ B\E} ⊆ V /U.
Fancy B bar
What is the set B¯ a basis of?
A

V/U

50
Q
Let E be a basis of U, and extend E to a basis B of V (we assume this is possible, which we certainly know to be the case at least for V finite dimensional).
Define
B¯ := {e + U | e ∈ B\E} ⊆ V /U.
Fancy B bar
The set B¯ is a basis of V/U
Prove it
A

pg 12

51
Q

Let U ⊂ V be vector spaces, with E a basis for U, and F ⊂ V a set of vectors
such that
{v + U : v ∈ F} is a basis for the quotient V /U. Then the union E ∪ F is a basis for ??

A

V

52
Q

Let U ⊂ V be vector spaces, with E a basis for U, and F ⊂ V a set of vectors
such that
{v + U : v ∈ F} is a basis for the quotient V /U. Then the union E ∪ F is a basis for V
Prove it

A

Exercise

53
Q

If V is finite dimensional then

dim(V ) = dim(U) + [ ]

A

dim(V ) = dim(U) + dim(V /U).

54
Q

Let T : V → W be a linear map of vector spaces
over F. Then….

What is the first isomorphism theorem?

A

Then
T¯ : V /Ker(T) → Im(T)
v + Ker(T) → T(v)
is an isomorphism of vector spaces.

55
Q
Let T : V → W be a linear map of vector spaces
over F. Then
T¯ : V /Ker(T) → Im(T)
v + Ker(T) → T(v)
is an isomorphism of vector spaces.

Prove it

A

Proof. It follows from the first isomorphism theorem for groups that T¯ is an isomorphism of
(abelian) groups. T¯ is also compatible with scalar multiplication. Thus T¯ is a linear isomorphism.

56
Q

If T : V → W is a linear transformation and V is finite dimensional, then
(Rank-nullity theorem)

A

dim(V ) = dim(Ker(T)) + dim(Im(T)).
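(Supplementary note, my own numerical sketch with numpy: viewing a 3×4 matrix as a map F⁴ → F³, the rank gives dim Im(T) and dim Ker(T) = 4 − rank.)

```python
# Minimal sketch of rank-nullity for a concrete T : F^4 -> F^3.
import numpy as np

T = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # a multiple of row 1, so the rank drops
              [0.0, 1.0, 0.0, 1.0]])

n = T.shape[1]                        # dim(V) = 4
rank = np.linalg.matrix_rank(T)       # dim(Im T) = 2
nullity = n - rank                    # dim(Ker T) = 2
assert rank + nullity == n
```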

57
Q

If T : V → W is a linear transformation and V is finite dimensional, then dim(V ) = dim(Ker(T)) + dim(Im(T)).

prove it

A

Use dim(V ) = dim(U) + dim(V /U), with U = ker(T). Then
dim(V ) = dim(Ker(T)) + dim(V /Ker(T)).
By the First Isomorphism Theorem also:
dim(V /Ker(T)) = dim(Im(T)).

58
Q

Let T : V → W be a linear map and let A ⊆ V, B ⊆ W be subspaces
The formula T¯(v + A) := T(v) + B gives a well-defined linear map of quotients
T¯ : V /A → W/B if and only if [ ]

A

T(A) ⊆ B.

59
Q

Let T : V → W be a linear map and let A ⊆ V, B ⊆ W be subspaces
The formula T¯(v + A) := T(v) + B gives a well-defined linear map of quotients
T¯ : V /A → W/B if and only if T(A) ⊆ B.

Prove it

A

Assume T(A) ⊆ B. Now T¯ will be linear if it is well-defined. Assume v + A = v′ + A. Then
v = v′ + a for some a ∈ A. So
T¯(v + A) = T(v) + B by definition
= T(v′ + a) + B
= T(v′) + T(a) + B as T is linear
= T(v’) + B as T(A) ⊆ B
= T¯(v’ + A).
Hence T¯ is well-defined. Conversely, assume that T¯ is well-defined and let a ∈ A. Then
B = 0_{W/B} = T¯(0_{V/A}) = T¯(A) = T¯(a + A) = T(a) + B.
Thus T(a) ∈ B, and so T(A) ⊆ B.

60
Q

Assume now that V and W are finite dimensional. Let B = {e₁, · · · , eₙ} be a basis for V with E = {e₁, · · · , eₖ} a basis for a subspace A ⊆ V (so k ≤ n). Let B′ = {e′₁, · · · , e′ₘ} be a basis for W with E′ = {e′₁, · · · , e′ℓ} a basis for a subspace B ⊆ W. The induced bases for V /A and W/B are given by
B¯ =
B¯’ =

A
B¯ = {eₖ₊₁ + A, · · · , eₙ + A} and
B¯′ = {e′ₗ₊₁ + B, · · · , e′ₘ + B}
61
Q

Let T : V → W be a linear map such that T(A) ⊆ B. Then T induces a map T¯ on quotients by Lemma 3.7 and restricts to a linear map
T|ₐ : A → B with T|ₐ(v) =[ ]

A

T(v) for v ∈ A.

62
Q

What is the block matrix decomposition for ᵦ’[T]ᵦ?

A

Top left: ɛ’[T|ₐ]ɛ
Top right: *
Bottom left: 0
Bottom right: ᵦ¯’[T¯]ᵦ¯

where ᵦ¯’[T¯]ᵦ¯ = (aᵢⱼ)
for ℓ+1 ≤ i ≤ m, k+1 ≤ j ≤ n

63
Q

Prove the block matrix decomposition for ᵦ’[T]ᵦ

A

For j ≤ k, T(eⱼ) ∈ B and hence aᵢⱼ = 0 for i > ℓ and aᵢⱼ is equal to the (i, j)-entry of ɛ′[T|ₐ]ɛ for i ≤ ℓ. To identify the bottom right corner of the matrix, note that
T¯(eⱼ + A) = T(eⱼ) + B
= a₁ⱼe’₁ + … + aₘⱼe’ₘ + B
= aₗ₊₁,ⱼ(e’ₗ₊₁ + B) + … + aₘⱼ(e’ₘ + B)

64
Q

Let T : V → V be a linear transformation.

What does it mean for a subspace to be T-invariant?

A

A subspace U ⊆ V is called T-invariant if T(U) ⊆ U.

By the result of the previous section, such a T induces a map T¯ : V /U → V /U

65
Q
Let T : V → V be a linear transformation.
Let S : V → V be
another linear map.
If U is T- and S-invariant, then U is also invariant under the following maps:
1. the zero map
2. [ ]
3. aT, ∀a ∈ F
4. [ ] 
5. [ ]
A
  1. the zero map
  2. the identity map
  3. aT, ∀a ∈ F
  4. S + T
  5. S ◦ T
66
Q

Let T : V → V be a linear transformation.
Let S : V → V be
another linear map.
If U is T- and S-invariant, then U is also invariant under any polynomial p(x) evaluated at [ ].
p(T) induces a map of quotients [ ]

A

evaluated at T
p(T)¯ : V /U → V /U

the whole p(T) is barred

67
Q

Let T : V → V be a linear transformation and assume U ⊆ V is T-invariant.
Then
χT (x) =

Note that this formula does not hold for the minimal polynomial

A

χT (x) = χT|U (x) × χT¯(x)

Note that this formula does not hold for the minimal polynomial

68
Q

Let T : V → V be a linear transformation and assume U ⊆ V is T-invariant.
Then
χT (x) = χT|U (x) × χT¯(x)

Prove it

A

Middle pg 16

Extend a basis E for U to a basis B of V

69
Q

Let V be a finite-dimensional vector space, and let T : V → V be a linear map such that its characteristic polynomial is a product of linear factors. Then, there exists a basis B
of V such that ᵦ[T]ᵦ is [ ]

A

Upper triangular

70
Q

Let V be a finite-dimensional vector space, and let T : V → V be a linear map such that its characteristic polynomial is a product of linear factors. Then, there exists a basis B
of V such that ᵦ[T]ᵦ is Upper triangular

Prove it

A

By induction on the dimension of V

End pg 16

71
Q

If A is an n×n matrix with a characteristic polynomial that is a product of linear
factors, then there exists an (n × n)-matrix P such that P⁻¹AP is [ ]

A

upper triangular.

72
Q

Let A be an upper triangular (n × n)-matrix with diagonal entries λ₁, . . . , λₙ.
Then
ⁿ∏ᵢ₌₁ ( A - λᵢI) = ??

A

ⁿ∏ᵢ₌₁ ( A - λᵢI) = 0
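(Supplementary note, my own numerical check with numpy for a 3×3 upper triangular matrix.)

```python
# Minimal sketch: for upper triangular A with diagonal 1, 2, 3,
# (A - 1I)(A - 2I)(A - 3I) = 0.
import numpy as np

A = np.array([[1.0, 5.0, 7.0],
              [0.0, 2.0, 9.0],
              [0.0, 0.0, 3.0]])
I = np.eye(3)

prod = (A - 1*I) @ (A - 2*I) @ (A - 3*I)
assert np.allclose(prod, 0)
```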

73
Q

Let A be an upper triangular (n × n)-matrix with diagonal entries λ₁, . . . , λₙ.
Then
ⁿ∏ᵢ₌₁ ( A - λᵢI) = 0

Prove it

A
Let e₁, . . . , eₙ be the standard basis vectors for Fⁿ. Then
(A − λₙI)v ∈ ⟨e₁, . . . , eₙ₋₁⟩ for all v ∈ Fⁿ, and more generally
(A − λᵢI)w ∈ ⟨e₁, . . . , eᵢ₋₁⟩ for all w ∈ ⟨e₁, . . . , eᵢ⟩.
Hence, since
Im(A − λₙI) ⊆ ⟨e₁, . . . , eₙ₋₁⟩
Im((A − λₙ₋₁I)(A − λₙI)) ⊆ ⟨e₁, . . . , eₙ₋₂⟩
and so on, we have that
ⁿ∏ᵢ₌₁ ( A − λᵢI) = 0
as required.
74
Q

What is the Cayley-Hamilton theorem?

A

If T : V → V is a linear transformation and V is a finite dimensional vector space, then χT (T) = 0. Hence, in particular, mT (x) | χT (x).
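(Supplementary note, my own numerical verification with numpy; np.poly returns the coefficients of det(xI − A), which differs from the card's det(A − xI) by the sign (−1)ⁿ and so still annihilates A.)

```python
# Minimal sketch: plug A into its own characteristic polynomial and get 0.
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])

c = np.poly(A)                              # [1, -tr(A), det(A)] for a 2x2 matrix
chi_of_A = c[0]*(A @ A) + c[1]*A + c[2]*np.eye(2)
assert np.allclose(chi_of_A, 0)
```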

75
Q

Prove the Cayley-Hamilton theorem

A

Bottom pg 19

76
Q

V is a vector space

What does it mean for V to be the direct sum of subspaces W1, …, Wr

A

V = W1 ⊕ … ⊕ Wr
if every vector v ∈ V can be written uniquely as a sum
v = w1 + · · · + wr with wi ∈ Wi

77
Q

Let V be the direct sum of the subspaces W1, …, Wr. Describe a basis of V in terms of bases of the Wi.

A

For each i, let Bi be a basis for Wi. Then
B = ∪ᵢBᵢ
is a basis for V.

78
Q

Let V be the direct sum of the subspaces W1, …, Wr, and assume from now on that V is finite dimensional. If T : V → V is a linear map such that each Wi is T-invariant, then the matrix of T with respect to the basis B is block diagonal.
(What does ᵦ[T]ᵦ look like?)
How does χT(x) factor in terms of the characteristic polynomials of the restrictions T|Wᵢ?

A

pg 21 top

79
Q
Assume f(x) = a(x)b(x) with gcd(a, b) = 1 and f(T) = 0. Then
V = Ker(a(T)) ⊕ [ ]  is a T-invariant direct sum decomposition
A

V = Ker(a(T)) ⊕ Ker(b(T))

80
Q

Assume f(x) = a(x)b(x) with gcd(a, b) = 1 and f(T) = 0. Then
V = Ker(a(T)) ⊕ Ker(b(T))
is a T-invariant direct sum decomposition. Furthermore, if f = mT is the minimal polynomial of T and a and b are monic, then
m_{T|Ker(a(T))}(x) =
m_{T|Ker(b(T))}(x) =

A

m_{T|Ker(a(T))}(x) = a(x)

m_{T|Ker(b(T))}(x) = b(x)

81
Q

Assume f(x) = a(x)b(x) with gcd(a, b) = 1 and f(T) = 0. Then
V = Ker(a(T)) ⊕ Ker(b(T))
is a T-invariant direct sum decomposition. Furthermore, if f = mT is the minimal polynomial of T and a and b are monic, then
m_{T|Ker(a(T))}(x) = a(x)
m_{T|Ker(b(T))}(x) = b(x)
Prove it

A

pg 21

82
Q

What is the Primary Decomposition Theorem?

A

Let mT be the minimal polynomial and write it in the form
mT(x) = f₁^{a₁}(x) · · · fᵣ^{aᵣ}(x)
where the fᵢ are distinct monic irreducible polynomials. Put Wᵢ := Ker(fᵢ^{aᵢ}(T)). Then
1) V = W₁ ⊕ · · · ⊕ Wᵣ
2) Wᵢ is T-invariant
3) m_{T|Wᵢ} = fᵢ^{aᵢ}
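(Supplementary note, my own example assuming sympy: for a 3×3 matrix with minimal polynomial (x − 1)²(x − 2), the kernels of (A − I)² and (A − 2I) have dimensions 2 and 1 and together span F³.)

```python
# Minimal sketch of the primary decomposition for a concrete matrix.
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 2]])
I = sp.eye(3)

W1 = ((A - I)**2).nullspace()     # Ker((A - I)^2), dimension 2
W2 = (A - 2*I).nullspace()        # Ker(A - 2I),   dimension 1

assert len(W1) + len(W2) == 3                      # dimensions add up to dim V
assert sp.Matrix.hstack(*W1, *W2).rank() == 3      # W1 + W2 spans F^3 (direct sum)
```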

83
Q

Prove the primary decomposition theorem

A

pg 22

84
Q

There exist unique distinct irreducible monic polynomials f1, …, fr ∈ F[x]
and positive integers ni ≥ ai > 0 (1 ≤ i ≤ r) such that
mT(x) =
and
χT =

A
mT(x) = f₁^{a₁}(x) · · · fᵣ^{aᵣ}(x)
χT = ± f₁^{n₁} · · · fᵣ^{nᵣ}
85
Q

There exist unique distinct irreducible monic polynomials f1, …, fr ∈ F[x]
and positive integers ni ≥ ai > 0 (1 ≤ i ≤ r) such that
mT(x) = f₁^{a₁}(x) · · · fᵣ^{aᵣ}(x)
and
χT = ± f₁^{n₁} · · · fᵣ^{nᵣ}
Prove it

A

Mid pg 22

86
Q

T is triangularisable (over a given field)
⇐⇒ χT [ ]
⇐⇒ [ ]
⇐⇒ mT [ ]

A

⇐⇒ χT factors as a product of linear polynomials (over that field)
⇐⇒ each fi is linear
⇐⇒ mT factors as a product of linear polynomials

87
Q

T is diagonalisable ⇐⇒ mT [….]

A

mT factors as a product of distinct linear polynomials
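(Supplementary note, my own example with numpy: A = [[1,1],[0,1]] has mₐ(x) = (x − 1)², a repeated linear factor, and is not diagonalisable; diag(1, 2) has minimal polynomial (x − 1)(x − 2), distinct linear factors.)

```python
# Minimal sketch: repeated factor in the minimal polynomial vs. distinct factors.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

assert not np.allclose(A - I, 0)             # (x - 1) does not annihilate A
assert np.allclose((A - I) @ (A - I), 0)     # (x - 1)^2 does: m_A = (x - 1)^2

D = np.diag([1.0, 2.0])
assert np.allclose((D - I) @ (D - 2*I), 0)   # m_D = (x - 1)(x - 2), so D is diagonalisable
```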

88
Q

T is diagonalisable ⇐⇒ mT factors as a product of distinct linear polynomials.
Prove it

A

Bottom pg 22

89
Q

Let V be finite dimensional and T : V → V be a linear transformation
What does nilpotent mean?

A

If Tⁿ = 0 for some n > 0

then T is called nilpotent.

90
Q

If T is nilpotent, then its minimal polynomial has the form mT (x) = [ ]

A

If T is nilpotent, then its minimal polynomial has the form mT (x) = xᵐ for some m

91
Q

If T is nilpotent, then its minimal polynomial has the form mT (x) = xᵐ for some m and there exists a basis B of V such that:
ᵦ[T]ᵦ =

A

pg 24 top

92
Q

If T is nilpotent, then its minimal polynomial has the form mT (x) = xᵐ for some m and there exists a basis B of V such that:
ᵦ[T]ᵦ = (matrix, mostly 0s, one diag of 1s and 0s) pg 24

Prove it

A

Pg 24-26:

Very long

93
Q

Let V be finite dimensional and T : V → V be a linear transformation. Assume
mT (x) = (x−λ)ᵐ for some m. Then, there exists a basis B of V such that ᵦ[T]ᵦ is block diagonal with blocks of the form:

A

Jᵢ(λ) := λIᵢ + Jᵢ = [matrix, 0 bottom right, lambdas down the diagonal, 1s down the next diag, 0 top right]
and 1≤i≤m

94
Q

Let V be finite dimensional and T : V → V be a linear transformation. Assume
mT (x) = (x−λ)ᵐ for some m. Then, there exists a basis B of V such that ᵦ[T]ᵦ is block diagonal with blocks of the form:
Jᵢ(λ) := λIᵢ + Jᵢ = [matrix, 0 bottom right, lambdas down the diagonal, 1s down the next diag, 0 top right]
and 1≤i≤m

Prove it

A

pg 27 middle

95
Q

Let V be finite dimensional and let T : V → V be a linear map with minimal
polynomial
mT (x) = (x − λ1)ᵐ¹· · ·(x − λr)ᵐʳ Then there exists a basis B of V such that ᵦ[T]ᵦ is a [ ] and each diagonal block is of the form [ ]

A

block diagonal

Ji(λj ) for some 1 ≤ i ≤ mj and 1 ≤ j ≤ r.
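(Supplementary note, my own example assuming sympy's Matrix.jordan_form, which returns (P, J) with A = P·J·P⁻¹: the Jordan form exhibits exactly such diagonal blocks.)

```python
# Minimal sketch: Jordan normal form of a concrete 3x3 matrix.
import sympy as sp

A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])

P, J = A.jordan_form()                 # J block diagonal with blocks J_i(lambda_j)
assert sp.simplify(P * J * P.inv() - A) == sp.zeros(3, 3)
print(J)                               # a J_2(2) block and a J_1(3) block
```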

96
Q

Let V be a vector space over F. What is a dual?

A

Its dual V’ is the vector space of linear maps

from V to F, i.e. V’ = Hom(V, F).

97
Q

Let V be a vector space over F. What is a linear functional?

A

Its dual V’ is the vector space of linear maps

from V to F, i.e. V’ = Hom(V, F). Its elements are called linear functionals.

98
Q
Let V be finite dimensional and let B = {e1, . . . , en} be a basis for V . Define the
dual e'i of ei (relative to B) by
e'i(ej) = δij. 
What is the dual basis?
Describe the assignment ei → e'i
dim V = ?
A

Then B′ := {e’1, . . . , e’n} is a basis for V’, the dual basis. The assignment ei → e’i (extended linearly) defines an isomorphism of vector spaces V → V’. In particular, dim V = dim V’.

99
Q

Let V be finite dimensional and let B = {e1, . . . , en} be a basis for V . Define the
dual e’i of ei (relative to B) by
e’i(ej) = δij.

Then B′ := {e’1, . . . , e’n} is a basis for V’, the dual basis. The assignment ei → e’i (extended linearly) defines an isomorphism of vector spaces V → V’. In particular, dim V = dim V’.

Prove it

A

Bottom pg 29

100
Q

Let V be a finite dimensional vector space. Then, V → (V’)’ =: V’’ defined by v → Eᵥ is a natural linear [ ]
How is Eᵥ defined?
What does natural mean?

A

isomorphism
Ev(f) := f(v) for f ∈ V’
“Natural” here means independent of a choice of basis

101
Q

Let V be a finite dimensional vector space. Then, V → (V’)’ =: V’’ defined by v → Eᵥ is a natural linear isomorphism.
Ev(f) := f(v) for f ∈ V’
Prove it

A

pg 30

102
Q

When V has dimension n, the kernel of a non-zero linear functional f : V → F is of dimension n − 1.
The preimage f⁻¹({c}) for a constant c ∈ F is called a [ ] (not necessarily containing zero) of dimension n − 1.

A

hyperplane

103
Q

When V has dimension n, the kernel of a non-zero linear functional f : V → F is of dimension n − 1.
The preimage f⁻¹({c}) for a constant c ∈ F is called a hyperplane (not necessarily containing zero) of dimension n − 1.

When V = Fⁿ (column vectors) every hyperplane
is defined by an equation:

A

a1b1 + · · · + anbn = c

for a fixed scalar c and fixed b = (b1, . . . , bn) ∈ (Fⁿ)ᵗ(row vectors)

104
Q

Let U ⊆ V be a subspace of V

What is an annihilator of U?

A

Define the annihilator of U to be:

U⁰ = {f ∈ V’: f(u) = 0 for all u ∈ U}.

105
Q

Annihilators:

f ∈ U⁰ iff [ ]

A

f |ᵤ = 0

106
Q

Annihilators:

[ ] is a subspace of V’

A

U⁰

107
Q

Prove that U⁰ is a subspace of V’

A

Top pg 31

108
Q

Let V be finite dimensional and U ⊆ V be a subspace. Then dim(U⁰) = [ ]

A

dim(V ) − dim(U)
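(Supplementary note, my own example assuming sympy: identifying functionals on F⁴ with row vectors, U⁰ corresponds to the row vectors b with bM = 0, where the columns of M span U; its dimension is dim V − dim U.)

```python
# Minimal sketch: dim(U^0) = dim(V) - dim(U) for a 2-dimensional U inside F^4.
import sympy as sp

M = sp.Matrix([[1, 0],
               [2, 1],
               [0, 1],
               [1, 1]])               # columns span U

dim_V = M.rows                        # 4
dim_U = M.rank()                      # 2
U0_basis = M.T.nullspace()            # b in U^0  <=>  M^T b^T = 0
assert len(U0_basis) == dim_V - dim_U # 4 - 2 = 2
```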

109
Q

Let V be finite dimensional and U ⊆ V be a subspace. Then dim(U⁰) = dim(V ) − dim(U)

Prove it

A

mid pg 31

110
Q

Let U, W be subspaces of V . Then

1) U ⊆ W ⇒ [ ]
2) (U + W)⁰= [ ]
3) U⁰ + W⁰ ⊆ ([ ])⁰ and equal if [ ]

A

1) U ⊆ W ⇒ W⁰ ⊆ U⁰
2) (U + W)⁰= U⁰ ∩ W⁰
3) U⁰ + W⁰ ⊆ (U ∩ W)⁰ and equal if dim(V ) is finite.

111
Q

Let U, W be subspaces of V . Then

1) U ⊆ W ⇒ W⁰ ⊆ U⁰
2) (U + W)⁰= U⁰ ∩ W⁰
3) U⁰ + W⁰ ⊆ (U ∩ W)⁰ and equal if dim(V ) is finite.

Prove it

A

pg 32

112
Q

Let U be a subspace of a finite dimensional vector space V . Under the natural
map V → V’’(:= (V’)’) given by v → Eᵥ, how is U mapped?

A

It is mapped isomorphically to

U⁰⁰ (:= (U⁰)⁰)

113
Q

Let U be a subspace of a finite dimensional vector space V . Under the natural
map V → V’’(:= (V’)’) given by v → Eᵥ, U is mapped isomorphically to
U⁰⁰ (:= (U⁰)⁰)

Prove it

A

Bottom pg 32

114
Q

Let U ⊆ V be a subspace. Then there exists a natural isomorphism
U⁰ ≃ (V/U)’ given by f → f¯ where f¯(v + U) :=

A

f¯(v + U) := f(v) for v ∈ V

115
Q

Let U ⊆ V be a subspace. Then there exists a natural isomorphism
U⁰ ≃ (V/U)’ given by f → f¯ where f¯(v + U) := f(v) for v ∈ V
Prove it

A

pg 33

116
Q

What is a dual map?

A

Let T : V → W be a linear map of vector spaces. Define the dual map by
T’: W′ → V’, f → f ◦ T
Note that f ◦ T : V → W → F is linear, and hence f ◦ T ∈ V’

117
Q

The dual map T’ is a [ ] map

A

linear

118
Q

Prove that T’ is a linear map

A
Let f, g ∈ W′, λ ∈ F. We need to show T'(f + λg) = T'(f) + λT′(g) (an identity of
functionals on V ). So let v ∈ V . Then, T'(f + λg)(v) 
= ((f + λg) ◦ T)(v)
= (f + λg)(T v)
= f(T v) + λg(T v)
= T'(f)(v) + λT′(g)(v)
= (T'(f) + λT′(g))(v),
as required.
119
Q

Let V and W be two finite dimensional vector spaces. The assignment T → T’ defines a natural isomorphism from [ ] to [ ]

A

Hom(V, W)

Hom(W’, V’)

120
Q

Let V and W be two finite dimensional vector spaces. The assignment T → T’ defines a natural isomorphism from Hom(V, W) to Hom(W’, V’)
Prove it

A

pg 34

121
Q

Let V and W be finite dimensional, and let B𝓌 and Bᵥ be bases for W and V .
Then, for any linear map T : V → W
(ᵦ𝓌[T]ᵦᵥ)ᵗ =

A

(ᵦ𝓌[T]ᵦᵥ)ᵗ = ᵦ’ᵥ[T’]ᵦ’𝓌

where B’𝓌 and B’ᵥ are the dual bases

122
Q

Let V and W be finite dimensional, and let B𝓌 and Bᵥ be bases for W and V .
Then, for any linear map T : V → W
(ᵦ𝓌[T]ᵦᵥ)ᵗ = ᵦ’ᵥ[T’]ᵦ’𝓌
where B’𝓌 and B’ᵥ are the dual bases

Prove it

A

end pg 34