Chapter 6 Flashcards
Definition: dot products
For vectors u, v in R^n, the inner product (or dot product) of u and v is the scalar u dot v = u^T * v, where u and v are viewed as n x 1 matrices.
Theorem 6.1 (Properties of the Dot Product)
Let u, v, w be vectors in R^n and let c be a real number (a scalar). Then:
a) u dot v = v dot u
b) (u+v) dot w = u dot w + v dot w
c) (cu) dot v = c(u dot v) = u dot (cv)
d) u dot u >=0, and u dot u = 0 if and only if u = zero vector
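The four properties above can be verified numerically; the sketch below uses NumPy with arbitrarily chosen example vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])
w = np.array([2.0, 2.0, -2.0])
c = 3.0

# (a) commutativity: u·v = v·u
assert np.isclose(u @ v, v @ u)
# (b) distributivity over vector addition
assert np.isclose((u + v) @ w, u @ w + v @ w)
# (c) scalars factor out of either argument
assert np.isclose((c * u) @ v, c * (u @ v))
assert np.isclose(c * (u @ v), u @ (c * v))
# (d) u·u >= 0, with equality only for the zero vector
assert u @ u >= 0
assert np.zeros(3) @ np.zeros(3) == 0
```

The `@` operator computes the matrix product, which for two 1-D NumPy arrays is exactly the dot product u^T v.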
Definition: distance between u and v
For vectors u, v in R^n, the distance between u and v, written dist(u,v), is given by dist(u,v) = ||u-v||
Definition: orthogonal vectors
Two vectors u and v are called orthogonal (u is perpendicular to v) if u dot v = 0
Theorem 6.2 (The Pythagorean Theorem)
Two vectors u and v are orthogonal if and only if
|| u + v ||^2 = ||u||^2 + ||v||^2
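A quick numerical check of the Pythagorean Theorem, for one orthogonal pair chosen for illustration:

```python
import numpy as np

u = np.array([3.0, 0.0, 4.0])
v = np.array([0.0, 5.0, 0.0])   # orthogonal to u, since u·v = 0

assert u @ v == 0
lhs = np.linalg.norm(u + v) ** 2        # ||u + v||^2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
assert np.isclose(lhs, rhs)             # both equal 50 here
```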
Orthogonal Complement
Let W be a subspace of R^n and let z be an element of R^n. The vector z is orthogonal to W if z is orthogonal to every vector in W. The orthogonal complement of W, written W perp, is the set of all vectors orthogonal to W.
Theorem 6.3
For any m x n matrix A, (Row A) perp = Nul A and (Col A) perp = Nul A^T
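Theorem 6.3 can be checked numerically for a small example matrix: a basis for Nul A (computed here from the SVD) is orthogonal to every row of A.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so Nul A is 2-dimensional

# Null-space basis from the SVD: right singular vectors whose
# singular values are (numerically) zero span Nul A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                   # columns of N span Nul A

# Each row of A dotted with each null-space vector is zero,
# i.e. Nul A sits inside (Row A) perp.
assert np.allclose(A @ N, 0)
```

The dimensions also match: rank A + dim Nul A = n (here 1 + 2 = 3), which is why the containment is actually equality.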
Other facts about W perp
For a subspace W of R^n:
1: x is an element of W perp if and only if x is orthogonal to every element in a basis for W
2: W perp is a subspace of R^n
Definition of Orthogonality
A set {u1, … , up} in R^n is an orthogonal set if ui dot uj = 0 whenever i is not equal to j.
* An orthogonal set of unit vectors is called an orthonormal set.
Theorem 6.4
If S = {u1, … , up} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent.
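A small sketch of Theorem 6.4, using an example orthogonal set: pairwise orthogonality shows up as a diagonal Gram matrix, and linear independence as full rank.

```python
import numpy as np

# Rows are an orthogonal set of nonzero vectors in R^3
S = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Pairwise orthogonality: off-diagonal entries of the Gram matrix S S^T vanish
G = S @ S.T
assert np.allclose(G - np.diag(np.diag(G)), 0)

# Linear independence: the three vectors have full rank
assert np.linalg.matrix_rank(S) == 3
```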
Orthogonal basis
An orthogonal basis for a subspace W of R^n is a basis that is also an orthogonal set.
Theorem 6.5
Let {u1, … , up} be an orthogonal basis for a subspace W of R^n. Then for each vector y in W, the weights in the representation
y = c1u1 + … + cpup are given by cj = (y dot uj)/(uj dot uj) for j = 1, … , p
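The weight formula can be tested directly: build a vector of W with known coefficients and recover them. The basis below is a made-up example.

```python
import numpy as np

# An orthogonal basis for a plane W in R^3 (u1·u2 = 0)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert u1 @ u2 == 0

y = 2 * u1 + 5 * u2              # a vector known to lie in W

c1 = (y @ u1) / (u1 @ u1)        # weight formula from Theorem 6.5
c2 = (y @ u2) / (u2 @ u2)
assert np.isclose(c1, 2) and np.isclose(c2, 5)   # coefficients recovered
assert np.allclose(c1 * u1 + c2 * u2, y)
```

No linear system needs to be solved; each coefficient comes from a single dot-product ratio, which is the point of having an orthogonal basis.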
Theorem 6.6
An m x n matrix U has orthonormal columns if and only if U^T * U = I
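A sketch of Theorem 6.6 for an example 3 x 2 matrix with orthonormal columns; note that for a non-square U, U U^T is generally not the identity.

```python
import numpy as np

# 3 x 2 matrix whose columns are orthonormal
U = np.array([[1/np.sqrt(2), 0.0],
              [1/np.sqrt(2), 0.0],
              [0.0,          1.0]])

# Orthonormal columns <=> U^T U = I (the 2 x 2 identity)
assert np.allclose(U.T @ U, np.eye(2))

# But U U^T is only a projection onto Col U, not the 3 x 3 identity
assert not np.allclose(U @ U.T, np.eye(3))
```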
Theorem 6.8 The Orthogonal Decomposition Theorem
Let W be a subspace of R^n. Then each vector y in R^n can be written uniquely as y = y hat + z, where y hat is an element of W and z is an element of W perp. In fact, if {u1, … , up} is an orthogonal basis for W, then
1: y hat = projection of y onto W = (y dot u1)/(u1 dot u1) u1 + … + (y dot up)/(up dot up) up, and
2: z = y - y hat
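The decomposition can be computed term by term exactly as the theorem states; the basis below is an example choice for a plane W in R^3.

```python
import numpy as np

# Orthogonal basis for a subspace W of R^3 (example choice)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])

y = np.array([3.0, 1.0, 7.0])

# Projection of y onto W, one term per basis vector (Theorem 6.8)
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

# z lies in W perp (orthogonal to both basis vectors), and y = y_hat + z
assert np.isclose(z @ u1, 0) and np.isclose(z @ u2, 0)
assert np.allclose(y_hat + z, y)
```

Here y_hat turns out to be (3, 1, 0), the part of y in the plane, and z = (0, 0, 7) is the leftover component in W perp.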
Theorem 6.9 The Best Approximation Theorem
Let W be a subspace of R^n, let y be a vector in R^n, and let y hat be the projection of y onto W. Then y hat is the closest point in W to y, in the sense that:
|| y - y hat || < || y - v ||
for all v in W such that v is not equal to y hat.
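The "closest point" claim can be spot-checked by comparing y hat against randomly sampled points of W; the subspace below (the xy-plane) is an example choice.

```python
import numpy as np

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])   # orthonormal basis for W = the xy-plane

y = np.array([2.0, 3.0, 5.0])
y_hat = (y @ u1) * u1 + (y @ u2) * u2   # projection (denominators are 1)

best = np.linalg.norm(y - y_hat)        # distance from y to W

# y_hat beats every other sampled point v of W
rng = np.random.default_rng(0)
for _ in range(100):
    v = rng.normal(size=2) @ np.vstack([u1, u2])   # random v in W
    if not np.allclose(v, y_hat):
        assert best < np.linalg.norm(y - v)
```

Random sampling is only a sanity check, not a proof; the theorem itself follows from the Pythagorean Theorem applied to y - y hat and y hat - v.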