Orthogonality and Least Squares Flashcards

1
Q

What does it mean for two vectors to be orthogonal?

A

v dot w = 0

2
Q

What does it mean for a set of vectors to be orthonormal?

A

Vectors u_1, …, u_m are orthonormal if they are all unit vectors and are orthogonal to one another

Ex. u_i dot u_j = 0 for i ≠ j,
but u_i dot u_i = 1

3
Q

Provide a formula for the orthogonal projection of a vector x onto a subspace V of R^n.

A

proj(x) = x'' = (u1 dot x)u1 + … + (um dot x)um

where the vectors u1, …, um form an ORTHONORMAL basis of V
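Assuming an orthonormal basis, the formula above can be sketched in plain Python; the vectors u1, u2, and x below are made-up examples:

```python
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def project(x, basis):
    # proj(x) = (u1 dot x)u1 + ... + (um dot x)um,
    # valid only when `basis` is orthonormal
    proj = [0.0] * len(x)
    for u in basis:
        c = dot(u, x)
        proj = [p + c * ui for p, ui in zip(proj, u)]
    return proj

# Project onto the xy-plane of R^3, spanned by two standard basis vectors
u1 = [1.0, 0.0, 0.0]
u2 = [0.0, 1.0, 0.0]
x = [3.0, 4.0, 5.0]
print(project(x, [u1, u2]))  # [3.0, 4.0, 0.0]
```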

4
Q

Define the orthogonal complement of a subspace.

A

Consider a subspace V of R^n. The orthogonal complement V(perp) of V is the set of those vectors x in R^n that are orthogonal to all vectors in V. Note that V(perp) is the kernel of the orthogonal projection onto V

Shown: x is in V(perp) exactly when T(x) = proj(x) = x'' = 0, i.e. when x = x(perp)

5
Q

4 properties of orthogonal complements

A

1) the orthogonal complement V(perp) of V is a subspace of R^n
2) The intersection of V and V(perp) consists of the zero vector alone
3) dim(V) + dim(V(perp)) = n
4) (V(perp))perp = V

6
Q

How do you find the angle between two vectors?

A

angle = arccos((x dot y)/(length of x times length of y))
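A minimal Python sketch of this formula; the example vectors are arbitrary:

```python
import math

def angle(x, y):
    # arccos of the dot product divided by the product of the lengths
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return math.acos(dot / (nx * ny))

# Perpendicular vectors give pi/2
print(angle([1.0, 0.0], [0.0, 2.0]))  # 1.5707963267948966
```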

7
Q

Prove that the length of the orthogonal projection of x is less than or equal to the length of the vector x.

A

Note x = x'' + x(perp), where x'' = proj(x) and x(perp) are orthogonal to each other
Now we want to prove llproj(x)ll <= llxll
By the Pythagorean theorem, llxll^2 = llproj(x)ll^2 + llx(perp)ll^2
Since llx(perp)ll^2 >= 0, we get llproj(x)ll^2 <= llxll^2, and taking square roots gives llproj(x)ll <= llxll

8
Q

What does the Cauchy-Schwarz inequality state?

A

|x dot y| <= llxll llyll

DO THE PROOF!
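Not a proof, but a quick randomized sanity check of the inequality in Python; the dimension and coordinate ranges below are arbitrary choices:

```python
import math
import random

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-10, 10) for _ in range(4)]
    y = [random.uniform(-10, 10) for _ in range(4)]
    lhs = abs(sum(a * b for a, b in zip(x, y)))
    rhs = math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y))
    # |x dot y| <= llxll llyll, with a small tolerance for rounding
    assert lhs <= rhs + 1e-9
print("Cauchy-Schwarz held on 1000 random pairs")
```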

9
Q

Describe the Gram-Schmidt process.

A
We want to construct an orthonormal basis u1, …, um for a given subspace V of R^n. As defined before, an orthonormal basis consists of unit vectors that are orthogonal to one another. To find the first unit vector, divide the first (nonzero) basis vector v1 by its length: u1 = v1/llv1ll. Each subsequent unit vector is computed similarly, except that we first replace v_j by its component v_j(perp) perpendicular to the span of the unit vectors already found, and then normalize that.
Note v(perp) = v - proj(v) = v - (u1 dot v)u1 - … - (u_{j-1} dot v)u_{j-1}
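A sketch of the process in plain Python, assuming the input vectors are linearly independent; the example vectors are made up:

```python
import math

def gram_schmidt(vectors):
    # Turn linearly independent vectors into an orthonormal basis:
    # subtract each vector's projection onto the unit vectors found so
    # far (giving v(perp)), then normalize the result.
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = sum(ui * vi for ui, vi in zip(u, v))  # u dot v
            w = [wi - c * ui for wi, ui in zip(w, u)]  # subtract (u dot v)u
        norm = math.sqrt(sum(wi * wi for wi in w))
        basis.append([wi / norm for wi in w])
    return basis

print(gram_schmidt([[3.0, 4.0], [1.0, 0.0]]))
```

The first output vector is just v1 divided by its length; the second is built from v2(perp), so the two come out orthonormal.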
10
Q

Provide an alternative characterization of orthogonal projections, using the concept of least-squares solutions

A

Consider a vector x in R^n and a subspace V of R^n. Then the orthogonal projection proj(x) is the vector in V CLOSEST to x, in that
llx - proj(x)ll < llx - vll
for all v in V different from proj(x)

11
Q

Formally, describe the least squares solution for an inconsistent linear system. What if the system is consistent?

A

Consider an inconsistent linear system Ax = b, where A is an n x m matrix. The vector x* is called a least-squares solution of this system if
ll b - Ax* ll <= ll b - Ax ll for all x in R^m.
If the system is consistent, then the least-squares solutions are its exact solutions: the error llb - Axll is zero

12
Q

What is the normal equation? How is it derived?

A
A^T(Ax) = A^T(b) is the normal equation of Ax = b; its solutions are exactly the least-squares solutions x*.
Derivation: x* is a least-squares solution, meaning ll b - Ax* ll <= ll b - Ax ll for all x in R^m, exactly when
Ax* = proj(b), the orthogonal projection of b onto im(A).
Then b - Ax* is orthogonal to im(A), i.e. to every column of A, so
A^T(b - Ax*) = 0
A^T(b) = A^T(Ax*)
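A worked example of the normal equation in plain Python, fitting a line y = c0 + c1*t through three invented data points; the 2x2 system A^T(Ax) = A^T(b) is solved by hand with the adjugate formula:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*N)]
            for row in M]

# Fit through (0, 1), (1, 2), (2, 4): an inconsistent 3x2 system Ax = b
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [[1.0], [2.0], [4.0]]

At = transpose(A)
AtA = matmul(At, A)   # 2x2 matrix [[3, 3], [3, 5]]
Atb = matmul(At, b)   # 2x1 vector [[7], [10]]

# Invert the 2x2 matrix AtA and solve A^T A x* = A^T b
(p, q), (r, s) = AtA
det = p * s - q * r
inv = [[s / det, -q / det], [-r / det, p / det]]
x_star = matmul(inv, Atb)
print(x_star)  # best-fit intercept and slope, about [0.8333, 1.5]
```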
13
Q

If A is an n x m matrix of the linear system Ax = b and ker(A) = {0}, what can you say about its least squares solution?

A

x* = (A^TA)^-1(A^T(b)); the least-squares solution is unique, since ker(A) = {0} makes A^TA invertible
