Chapter 4 – Key Concepts Flashcards
What is a vector space?
A vector space is a nonempty set V of objects, called vectors, on which are defined two operations, called addition and multiplication by scalars (real numbers), subject to the ten axioms (or rules) listed below. The axioms must hold for all vectors u, v, and w in V and for all scalars c and d:
- u + v ∈ V.
- u + v = v + u
- (u + v) + w = u + (v + w)
- ∃ 0 ∈ V s.t. v + 0 = v
- ∀ u ∈ V, ∃ (-u) ∈ V s.t. u + (-u) = 0
- The scalar multiple of u by c, denoted by cu, is in V.
- c(u + v) = cu + cv
- (c + d)u = cu + du
- c(du) = (cd)u
- 1u = u
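A minimal numerical sketch of these axioms (Python/NumPy assumed; the vectors and scalars are arbitrary samples, so this only spot-checks the rules rather than proving them):

```python
# Spot-checking the vector space axioms in R^3 on sample vectors (not a proof).
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))   # three sample vectors in R^3
c, d = 2.0, -1.5                        # sample scalars
zero = np.zeros(3)

assert np.allclose(u + v, v + u)                  # commutativity
assert np.allclose((u + v) + w, u + (v + w))      # associativity
assert np.allclose(v + zero, v)                   # zero vector
assert np.allclose(u + (-u), zero)                # additive inverse
assert np.allclose(c * (u + v), c * u + c * v)    # distributes over vector addition
assert np.allclose((c + d) * u, c * u + d * u)    # distributes over scalar addition
assert np.allclose(c * (d * u), (c * d) * u)      # scalar associativity
assert np.allclose(1 * u, u)                      # identity scalar
print("All sampled axiom checks pass.")
```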
For each u in a vector space V and scalar c, evaluate the following:
(1) 0u
(2) c0
(3) -u
(1) 0u = 0
(2) c0 = 0
(3) -u = (-1)u
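A quick spot-check of these three facts on a sample vector (Python/NumPy assumed):

```python
# Illustrating 0u = 0, c0 = 0, and (-1)u = -u with a sample vector in R^3.
import numpy as np

u = np.array([3.0, -1.0, 2.0])   # arbitrary sample vector
c = 4.0                          # arbitrary scalar
zero = np.zeros(3)

assert np.allclose(0 * u, zero)
assert np.allclose(c * zero, zero)
assert np.allclose((-1) * u, -u)
```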
What does it mean for H to be a subspace of a vector space V?
A subspace of a vector space V is a subset H of V that has three properties:
a. 0 ∈ H
b. ∀ u, v ∈ H, u + v ∈ H
c. ∀ u ∈ H and c ∈ ℝ, cu ∈ H
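A sketch of checking the three properties for one illustrative subset, H = {(x, y, z) ∈ ℝ³ : x + y + z = 0} (Python/NumPy assumed; checks on samples illustrate the properties, they do not prove them):

```python
# Spot-checking the three subspace properties for the plane x + y + z = 0 in R^3.
import numpy as np

def in_H(x):
    return np.isclose(x.sum(), 0.0)

u = np.array([1.0, 2.0, -3.0])    # sample element of H
v = np.array([-4.0, 1.0, 3.0])    # another sample element of H
c = 2.5

assert in_H(np.zeros(3))   # a. the zero vector is in H
assert in_H(u + v)         # b. closed under addition (for these samples)
assert in_H(c * u)         # c. closed under scalar multiplication (for these samples)
```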
If v1, …, vp are in a vector space V, what must be true about Span{v1, …, vp}?
Span{v1, …, vp} is a subspace of V.
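One way to illustrate this: combinations of vectors in Span{v1, v2} land back in the span, which can be verified with a rank test (Python/NumPy assumed; the vectors are arbitrary samples):

```python
# Any sum or scalar multiple of elements of Span{v1, v2} is again in the span.
# Membership is checked by appending the vector and confirming the rank is unchanged.
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
V = np.column_stack([v1, v2])

x = 2 * v1 - 3 * v2          # an element of the span
y = 0.5 * v1 + 4 * v2        # another element
w = x + 1.7 * y              # sum / scalar multiple of elements of the span

assert np.linalg.matrix_rank(np.column_stack([V, w])) == np.linalg.matrix_rank(V)
```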
What is the null space of an m×n matrix A?
The null space of an m×n matrix A, written as Nul A, is the set of all solutions of the homogeneous equation Ax = 0. In set notation, Nul A = {x : x ∈ ℝⁿ and Ax = 0}.
The null space of an m×n matrix A is a subspace of what? In other words, the set of all solutions to Ax = 0 is a subspace of what?
ℝⁿ
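A sketch of computing Nul A for a sample matrix (Python/SymPy assumed; nullspace() returns a basis for the solution set of Ax = 0):

```python
# Computing a basis for Nul A and verifying Ax = 0 for each basis vector.
from sympy import Matrix

A = Matrix([[1, 2, 3, 0],
            [2, 4, 6, 0],
            [0, 1, 1, 1]])       # 3x4, so Nul A is a subspace of R^4

basis = A.nullspace()            # list of column vectors spanning Nul A
for x in basis:
    assert x.shape == (4, 1)     # each vector lives in R^n (n = 4 columns)
    assert A * x == Matrix.zeros(3, 1)
print(f"dim Nul A = {len(basis)}")
```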
What is the column space of an n×m matrix A?
The column space of an m×n matrix A, written as Col A, is the set of all linear combinations of the columns of A. If A = [a1 … an], then Col A = Span{a1, …, an}.
The column space of an m×n matrix A is a subspace of what?
ℝᵐ
For an m×n matrix A, describe Col A in set notation.
Col A = {b : b = Ax for some x in ℝⁿ}
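A sketch for a sample matrix (Python/SymPy assumed): columnspace() returns the pivot columns, and membership of b = Ax in Col A can be confirmed with a rank test:

```python
# Col A as {b : b = Ax for some x}: any b of the form A*x lies in the span of the columns.
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [2, 1, 5],
            [3, 1, 7]])              # arbitrary 3x3 example (rank 2)

col_basis = A.columnspace()          # basis for Col A (pivot columns of A)
x = Matrix([1, -2, 3])
b = A * x                            # b is in Col A by construction

# b is in Col A iff appending b to A does not increase the rank.
assert A.row_join(b).rank() == A.rank()
print(f"dim Col A = {len(col_basis)}, and b = A*x is in Col A.")
```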
If the column space of an m×n matrix A is all of ℝᵐ, what must be true about the equation Ax = b?
The column space of an m×n matrix A is all of ℝᵐ if and only if the equation Ax = b has a solution for all b ∈ ℝᵐ.
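A rough illustration of the equivalence via rank A = m, i.e. a pivot position in every row (Python/NumPy assumed; the matrices are arbitrary examples):

```python
# Col A = R^m exactly when rank A = m. One example where Ax = b is solvable
# for every b, and one where it is not.
import numpy as np

A1 = np.array([[1.0, 0.0, 2.0],
               [0.0, 1.0, 3.0]])        # 2x3, rank 2 = m  -> Col A1 = R^2
A2 = np.array([[1.0, 2.0],
               [2.0, 4.0],
               [3.0, 6.0]])             # 3x2, rank 1 < 3  -> Col A2 is a line in R^3

print(np.linalg.matrix_rank(A1) == A1.shape[0])   # True: Ax = b solvable for all b in R^2
print(np.linalg.matrix_rank(A2) == A2.shape[0])   # False: some b in R^3 have no solution
```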
What is the definition of a linear transformation from a vector space V into a vector space W?
A linear transformation from a vector space V into a vector space W is a rule that assigns to each vector x in V a unique vector T(x) in W s.t.:
(i) T(u + v) = T(u) + T(v) ∀ u, v ∈ V, and
(ii) T(cu) = cT(u) ∀ u ∈ V and c ∈ ℝ.
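A spot-check of conditions (i) and (ii) for one sample transformation, differentiation of polynomials (Python/SymPy assumed; checking a few inputs illustrates but does not prove linearity):

```python
# T(p) = p' maps polynomials to polynomials; verify (i) and (ii) on sample inputs.
from sympy import symbols, diff, expand

t = symbols('t')
T = lambda p: diff(p, t)

u = 3*t**2 + t - 5        # sample "vectors" (polynomials)
v = -t**3 + 2*t
c = 7

assert expand(T(u + v) - (T(u) + T(v))) == 0    # (i)  T(u + v) = T(u) + T(v)
assert expand(T(c*u) - c*T(u)) == 0             # (ii) T(cu) = cT(u)
```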
If an indexed set {v1, …, vp} of two or more vectors is linearly dependent and v1 ≠ 0, what must be true about some vector vj, for j > 1, in relation to v1, …, vj-1?
Some vector vj (with j > 1) must be a linear combination of the preceding vectors v1, …, vj-1.
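A sketch of locating such a vj by scanning for the first column that does not raise the rank (Python/NumPy assumed; v3 is deliberately built to depend on its predecessors):

```python
# Find a v_j that is a linear combination of v_1, ..., v_{j-1}.
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - v2                 # dependent on its predecessors
vectors = [v1, v2, v3]

for j in range(1, len(vectors)):
    prev = np.column_stack(vectors[:j])
    both = np.column_stack(vectors[:j + 1])
    if np.linalg.matrix_rank(both) == np.linalg.matrix_rank(prev):
        print(f"v{j + 1} is a linear combination of v1, ..., v{j}")
        break
```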
If H is a subspace of V, what must be true about B = {b1, …, bp} in V for it to be a basis for H?
(i) B is a linearly independent set, and
(ii) the subspace spanned by B coincides with H. That is, H = Span{b1, …, bp}.
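A sketch of checking both conditions for a sample B with H = ℝ³, where each condition reduces to a rank computation (Python/NumPy assumed; the vectors are an arbitrary example):

```python
# (i) B is linearly independent, and (ii) Span B = H. For H = R^3 both are rank checks.
import numpy as np

B = np.column_stack([[1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [1.0, 1.0, 1.0]])

independent = np.linalg.matrix_rank(B) == B.shape[1]   # (i) columns independent
spans_H     = np.linalg.matrix_rank(B) == 3            # (ii) Span B = R^3
print(independent and spans_H)                         # True: B is a basis for R^3
```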
What does the spanning set theorem say?
Let S = {v1, …, vp} be a set in V, and let H = Span{v1, …, vp}.
a. If vk ∈ S is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H.
b. If H ≠ {0}, some subset of S is a basis for H.
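A sketch of part (a) for sample vectors: removing a dependent v3 leaves the span unchanged (Python/NumPy assumed; the span comparison is a rank test):

```python
# v3 is a combination of v1 and v2, so dropping it does not change the span.
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 3 * v2                 # dependent on the remaining vectors

S       = np.column_stack([v1, v2, v3])
S_minus = np.column_stack([v1, v2])      # S with v3 removed

# Same column space: the ranks agree, and combining the two sets adds nothing new.
same_span = (np.linalg.matrix_rank(S) == np.linalg.matrix_rank(S_minus)
             == np.linalg.matrix_rank(np.column_stack([S, S_minus])))
print(same_span)   # True
```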
Which columns of a matrix A form a basis for Col A?
The pivot columns of a matrix A form a basis for Col A.
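A sketch for a sample matrix (Python/SymPy assumed): rref() reports the pivot column indices, and the corresponding columns of the original A form a basis for Col A:

```python
# Extract a basis for Col A from the pivot columns of A itself
# (not the columns of its echelon form).
from sympy import Matrix

A = Matrix([[1, 3, 3, 2],
            [2, 6, 9, 7],
            [-1, -3, 3, 4]])

_, pivot_cols = A.rref()                 # rref() returns (R, pivot column indices)
basis = [A[:, j] for j in pivot_cols]    # corresponding columns of the ORIGINAL A

print(pivot_cols)                        # (0, 2): columns 1 and 3 of A
print(len(basis) == A.rank())            # True: dim Col A = number of pivot columns
```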