Workshop 1 Flashcards

1
Q

Define the kernel of a linear operator

A

The kernel of a linear map L : V → W is the subset
Ker L = {u ∈ V : L(u) = 0} ⊂ V.

The image Im L is a subset of W, defined as the collection of all w ∈ W for which there exists u ∈ V
such that L(u) = w.

  • 0 is an element of the kernel too: since any linear combination of kernel vectors lies in the kernel, in particular taking α_1 = α_2 = 0 shows that the zero vector lies in it.
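A minimal numeric sketch of these definitions, assuming L is given by a concrete matrix A (so Ker L is the null space of A and Im L its column space); the matrix below is an arbitrary illustration, not from the workshop:

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative matrix for L : R^3 -> R^2 (arbitrary choice)
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Ker L = {u : A u = 0}: an orthonormal basis of the null space
K = null_space(A)
print(K)                       # one column, since nullity = 1 here
print(np.allclose(A @ K, 0))   # True: basis vectors of Ker L are sent to 0

# Im L = {A u : u in R^3}: spanned by the columns of A
print(np.linalg.matrix_rank(A))   # 2, so Im L is all of R^2 in this example
```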
2
Q

Self-Check Question I.8. Can you check, using the definitions above, that the kernel Ker L and the image Im L are indeed subspaces in V and W respectively?

A

Rewrite the basic argument that this is a vector subspace:
Suppose e_1 and e_2 are vectors that lie in the kernel of L, so L(e_1) = 0 and L(e_2) = 0.
Then by linearity of L,
L(α_1e_1 + α_2e_2) =
L(α_1e_1) + L(α_2e_2)
= α_1L(e_1) + α_2L(e_2)
= α_1·0 + α_2·0 = 0.
We have shown that linear combinations of vectors in the kernel also lie in the kernel, and thus it is a vector subspace.
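A quick numeric sanity check of the same argument, assuming L is given by an arbitrary illustrative matrix A: take two null-space vectors e_1, e_2 and verify that a random linear combination of them is also sent to 0.

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative matrix with a 2-dimensional kernel (arbitrary choice)
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])

K = null_space(A)                 # columns form a basis of Ker A
e1, e2 = K[:, 0], K[:, 1]

rng = np.random.default_rng(0)
a1, a2 = rng.standard_normal(2)   # arbitrary coefficients alpha_1, alpha_2
combo = a1 * e1 + a2 * e2

# L(alpha_1 e_1 + alpha_2 e_2) = alpha_1 L(e_1) + alpha_2 L(e_2) = 0
print(np.allclose(A @ combo, 0))  # True
```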

3
Q

Write the matrix for the anticlockwise rotation through angle π/2 around the origin

A

(This was one of the questions in last year's exam; this particular question will not be on the exam this year?)

First we can use the rotation-matrix formula in this same basis (see Q2),
setting θ to the angle:
[cos θ -sin θ]
[sin θ cos θ]
=
[0 -1]
[1 0]

Or we could derive it ourselves: see how the rotation acts on the vectors e_1 = (1,0) and e_2 = (0,1). They become e_2 and -e_1 respectively, and checking this gives the same matrix R_{π/2}.
The image of e_1 is e_2 = 0e_1 + 1e_2, and these coefficients form the first column (likewise the image of e_2 gives the second column).
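A short numeric check of this card: build R_{π/2} from the rotation formula and confirm it sends e_1 to e_2 and e_2 to −e_1.

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

print(np.round(R, 12))              # [[0, -1], [1, 0]] up to rounding
print(np.allclose(R @ e1, e2))      # True: e_1 -> e_2
print(np.allclose(R @ e2, -e1))     # True: e_2 -> -e_1
```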

4
Q

Give an example of an orthogonal transformation

A

An example of an orthogonal transformation is a reflection.

In a 2D Cartesian coordinate system, a reflection across the x-axis is an example of an orthogonal transformation. This transformation preserves distances and angles, and its matrix representation is orthogonal.

The matrix representation of a reflection across the x-axis is:

[1 0]
[0 -1]

This matrix represents the transformation that takes a point (x,y) to its reflection (x,−y) across the x-axis.

This transformation is orthogonal because the columns of the matrix are orthonormal, meaning they are orthogonal (perpendicular) to each other and each has a magnitude of 1. Additionally, the determinant of the matrix is -1, indicating that the transformation is a reflection and not a rotation.
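A brief numeric confirmation of these claims (AᵀA = I, det A = −1, and dot products preserved); the test vectors are arbitrary.

```python
import numpy as np

# Reflection across the x-axis
A = np.array([[1.0,  0.0],
              [0.0, -1.0]])

print(np.allclose(A.T @ A, np.eye(2)))   # True: columns are orthonormal
print(np.linalg.det(A))                  # -1.0: a reflection, not a rotation

u, v = np.array([2.0, 3.0]), np.array([-1.0, 4.0])
print(np.isclose((A @ u) @ (A @ v), u @ v))   # True: dot products preserved
```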

5
Q

What is an orthogonal transformation?

A

It is a linear map L : Rⁿ → Rⁿ such that L(u) · L(v) = u · v for all u, v ∈ Rⁿ.
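To illustrate the "for all u, v" in the definition, a sketch that draws a random orthogonal matrix via a QR factorisation (that construction is an assumption of the example, not part of the card) and checks the condition on random vectors:

```python
import numpy as np

rng = np.random.default_rng(1)

# A random orthogonal matrix in R^3, obtained from a QR factorisation
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

u, v = rng.standard_normal(3), rng.standard_normal(3)

print(np.isclose((Q @ u) @ (Q @ v), u @ v))                   # True: L(u).L(v) = u.v
print(np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u)))   # lengths preserved too
```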

6
Q

Can you write down the matrix of the linear transformation?

A

Let {v_1, . . . , v_n} and {w_1, . . . , w_m} be bases in V and W respectively. Then to any linear operator L : V → W we can assign an m × n matrix A_L; it is taken relative to these bases, and its j-th column consists of the coefficients of L(v_j) expanded in the basis {w_1, . . . , w_m}.

If L : Rⁿ → Rⁿ is an orthogonal transformation, i.e. L(u) · L(v) = u · v for all u, v ∈ Rⁿ, and A is the matrix of our orthogonal transformation in an orthonormal basis, then AᵀA = I, the identity matrix.
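A small sketch of how A_L is assembled column by column from the images of the basis vectors, using the operator u ↦ u − (u · v)v from Problem 2(i) as an illustrative example:

```python
import numpy as np

# Example operator on R^3: L(u) = u - (u.v)v with v a fixed unit vector
# (the operator from Problem 2(i); the choice here is just for illustration)
v = np.array([1.0, 0.0, 0.0])
L = lambda u: u - (u @ v) * v

# The j-th column of A_L is L(e_j) written in the standard (orthonormal) basis
A_L = np.column_stack([L(e) for e in np.eye(3)])
print(A_L)      # diag(0, 1, 1): projection onto the plane orthogonal to v
```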

7
Q

Problem 1. Let ℓᵥ = {tv : t ∈ R}, where |v| = 1, be a line through zero in Rⁿ. For a
point p ∉ ℓᵥ the distance to ℓᵥ is defined as
dist(p, ℓᵥ) := inf{|p − u| : u ∈ ℓᵥ}.

Show that a point w ∈ ℓᵥ satisfies dist(p, ℓᵥ) = |p − w| if and only if p − w is orthogonal
to v. Find a formula for such a point w.

A

The minimising point w is the orthogonal projection of p onto the line: w = (p · v)v. Then p − w is orthogonal to v, and |p − w| = dist(p, ℓᵥ).
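A numeric check of the projection formula with an arbitrary unit vector v and point p: p − w is orthogonal to v, and |p − w| is no larger than |p − tv| for sampled values of t.

```python
import numpy as np

rng = np.random.default_rng(2)

v = rng.standard_normal(3)
v /= np.linalg.norm(v)            # unit direction of the line l_v
p = rng.standard_normal(3)        # an arbitrary point

w = (p @ v) * v                   # candidate minimiser: projection of p onto l_v

print(np.isclose((p - w) @ v, 0))             # True: p - w is orthogonal to v

# |p - w| should not exceed |p - tv| for any t; sample a range of t values
ts = np.linspace(-10, 10, 2001)
dists = np.linalg.norm(p[None, :] - ts[:, None] * v[None, :], axis=1)
print(np.linalg.norm(p - w) <= dists.min() + 1e-12)   # True
```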

8
Q

Problem 2. Determine the kernel and image of linear operators below. Using your answers, deduce the values of nullity and rank.
(i) L : Rⁿ → Rⁿ , L(u) = u − (u · v)v, where v ∈ Rⁿ is a fixed unit vector;

(ii) L : Rⁿ × Rⁿ → Rⁿ , L(u, v) = u + v, where u, v ∈ Rⁿ

A

“Some of you might struggle with this one; let’s focus on this one.”
It’s a linear operator; if it were given by a matrix in a basis, that would be useful.

Ker L = {u ∈ Rⁿ : L(u) = 0}
= {u ∈ Rⁿ : u = (u · v)v} ⊂ {u ∈ Rⁿ : u = tv for some t ∈ R} = ℓᵥ

This subspace is non-zero, since it contains v (L(v) = v − (v · v)v = 0 because |v| = 1), and its dimension is not greater than 1; hence Ker L = ℓᵥ.

Alternatively: if u is proportional to v, say u = tv for some t ∈ R, then taking the dot product with v gives u · v = t(v · v) = t, using that v is a unit vector, so u = (u · v)v and hence
{u ∈ Rⁿ : u = tv for some t ∈ R} ⊂ Ker L as well.

Im L = {w ∈ Rⁿ : L(u) = w for some u ∈ Rⁿ}, which here is the hyperplane of vectors orthogonal to v.

nul L = dim Ker L = 1
rank L = dim Im L = n − 1
(see slides)
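A concrete check of part (i), for an arbitrary unit vector v in R⁴ chosen only for illustration: the matrix of L in the standard basis is I − vvᵀ, and it has nullity 1 and rank n − 1.

```python
import numpy as np

n = 4
rng = np.random.default_rng(3)
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                      # fixed unit vector

# Matrix of L(u) = u - (u.v)v in the standard basis: I - v v^T
A = np.eye(n) - np.outer(v, v)

print(np.linalg.matrix_rank(A))             # n - 1 = 3, so rank L = n - 1
print(np.allclose(A @ v, 0))                # True: v (hence l_v) lies in Ker L

# Everything in the image is orthogonal to v:
u = rng.standard_normal(n)
print(np.isclose((A @ u) @ v, 0))           # True
```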

9
Q

Problem 3. Consider a linear operator R^4 → R^2 given by the matrix
A =

[1 α αβ β³]
[β βγ αγ α²]

in some bases, where α, β, γ ∈ R are unknown parameters. Find all values of the
parameters α, β, and γ such that rank A = 1.

A
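No answer is recorded on this card. As a small aid for experimenting (not a solution), a sketch that computes rank A numerically for chosen parameter values; the sample values below are arbitrary.

```python
import numpy as np

def rank_A(alpha, beta, gamma):
    """Rank of the 2x4 matrix from Problem 3 for given parameter values."""
    A = np.array([[1.0,  alpha,        alpha * beta,  beta ** 3],
                  [beta, beta * gamma, alpha * gamma, alpha ** 2]])
    return np.linalg.matrix_rank(A)

# Arbitrary sample values, just to illustrate the check (not a solution):
print(rank_A(alpha=1.0, beta=1.0, gamma=1.0))
print(rank_A(alpha=2.0, beta=0.0, gamma=1.0))
```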
10
Q

Problem 4. Find all real numbers α, β, and γ such that the matrix
[5/13 0 12/13]
[−48/65 3/5 4/13]
[α β γ]
represents an orthogonal transformation in some orthonormal basis in R^3

A

I'd like to say a few words about this one.
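No working is recorded beyond the remark above. One standard way to approach it (an assumption of this sketch, not necessarily the method intended on the card) is that the rows of an orthogonal matrix form an orthonormal set, so the third row must be a unit vector orthogonal to the first two; numerically, a candidate is the cross product of the first two rows, up to sign.

```python
import numpy as np
from fractions import Fraction

# First two rows from Problem 4
r1 = np.array([5/13, 0.0, 12/13])
r2 = np.array([-48/65, 3/5, 4/13])

# The third row must complete an orthonormal set: r3 = +/- r1 x r2
r3 = np.cross(r1, r2)
print([Fraction(float(x)).limit_denominator(1000) for x in r3])  # candidate (alpha, beta, gamma)

A = np.vstack([r1, r2, r3])
print(np.allclose(A @ A.T, np.eye(3)))                           # True: rows are orthonormal
```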
