Chapter 4: Real vector spaces Flashcards
Vector space
A nonempty set of objects, together with two operations called addition and scalar multiplication, that satisfies the ten vector space axioms — including closure under both operations.
Closure under addition
If u and v are objects in V, then u + v is in V.
Closure under scalar multiplication
If k is any scalar and u is any object in V, then ku is in V.
Subspace
A subset W of a vector space V that is itself a vector space under the addition and scalar multiplication defined on V.
If W is a set of one or more vectors in a vector space V, then W is a subspace of V iff:
- If u and v are vectors in W, then u + v is in W.
- If k is any scalar and u is any vector in W, then ku is in W.
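As a small numeric sketch of the two closure conditions — using the hypothetical subspace W = {(x, y, 0)} of R3, an assumed example not taken from the text:

```python
import numpy as np

# Hypothetical example: W = {(x, y, 0)} is a subspace of R^3.
def in_W(v):
    """Membership test for W: third component must be zero."""
    return bool(np.isclose(v[2], 0.0))

u = np.array([1.0, 2.0, 0.0])   # a vector in W
v = np.array([-3.0, 5.0, 0.0])  # another vector in W
k = 7.0                         # an arbitrary scalar

print(in_W(u + v))  # True: W is closed under addition
print(in_W(k * u))  # True: W is closed under scalar multiplication
```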
Create a new subspace from known subspaces:
If W1, W2, … Wr are subspaces of a vector space V, then the intersection of these subspaces is also a subspace of V.
Span of S
The subspace of a vector space V, that is formed from all possible linear combinations of the vectors in a nonempty set S.
We say that S spans that subspace.
A subspace of Rn using the matrix A.
If A is an m × n matrix, then the solution set of the homogeneous system Ax = 0 in n unknowns is a subspace of Rn (the solution space of the system).
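A minimal sketch of this fact using SymPy's `nullspace`, which returns a basis for the solution set of Ax = 0; the matrix A below is an assumed example:

```python
import sympy as sp

# Assumed example matrix A (2 x 3, so Ax = 0 has 3 unknowns).
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])  # second row is twice the first, so rank(A) = 1

# nullspace() returns a basis for the solution set of Ax = 0,
# which is a subspace of R^3 of dimension 3 - rank(A) = 2.
basis = A.nullspace()
print(len(basis))  # 2
for b in basis:
    assert A * b == sp.zeros(2, 1)  # every basis vector solves Ax = 0
```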
Theorem 4.2.5, considering the equality of the spans of 2 vector spaces.
If S = {v1, v2, …, vr} and S’ = {w1, w2, …, wk} are nonempty sets of vectors in a vector space V, then span{v1, v2, …, vr} = span{w1, w2, …, wk} iff each vector in S is a linear combination of those in S’, and each vector in S’ is a linear combination of those in S.
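One way to test the two-directional condition numerically is the rank criterion: a vector w lies in span(S) iff adjoining w to the matrix of vectors in S leaves the rank unchanged. A sketch with two assumed example sets that both span the xy-plane:

```python
import sympy as sp

def in_span(vectors, w):
    """w is a linear combination of `vectors` iff adjoining w keeps the rank."""
    M = sp.Matrix.hstack(*vectors)
    return M.rank() == sp.Matrix.hstack(M, w).rank()

# Assumed example: both sets span the xy-plane in R^3.
S  = [sp.Matrix([1, 0, 0]), sp.Matrix([0, 1, 0])]
Sp = [sp.Matrix([1, 1, 0]), sp.Matrix([1, -1, 0])]

# Theorem 4.2.5: equal spans iff each set's vectors lie in the other's span.
same_span = all(in_span(Sp, v) for v in S) and all(in_span(S, w) for w in Sp)
print(same_span)  # True
```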
Definition of linear independence:
If S = {v1, v2, …, vr} is a nonempty set of vectors in a vector space V, then the vector equation k1v1 + k2v2 + … + krvr = 0 has at least one solution, namely, k1 = k2 = … = kr = 0. We call this the trivial solution. If this is the only solution, then S is said to be a linearly independent set; if there are solutions other than the trivial one, then S is said to be a linearly dependent set.
Interpretation of linear independence:
A set S with two or more vectors is linearly independent iff no vector in S is expressible as a linear combination of the other vectors in S.
Basic theorems concerning linear independence (sets containing 0, one vector, or two vectors):
- A finite set that contains 0 is linearly dependent.
- A set with exactly one vector is linearly independent iff that vector is not 0.
- A set with exactly 2 vectors is linearly independent iff neither vector is a scalar multiple of the other.
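For vectors in Rn, these tests generalize to a rank computation: a set is linearly independent iff the matrix having those vectors as columns has rank equal to the number of vectors. A sketch with assumed example vectors:

```python
import numpy as np

# Assumed example vectors in R^3, stored as the columns of M.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])  # third column = first column + second column

# Independent iff rank equals the number of vectors (columns).
rank = np.linalg.matrix_rank(M)
print(rank)                # 2
print(rank == M.shape[1])  # False: the set is linearly dependent
```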
Let S = {v1, v2, …, vr} be a set of vectors in Rn. When is the set linearly dependent?
If r > n — that is, if the set contains more vectors than each vector has components — then S is linearly dependent.
Wronskian of f1, f2, …, fn
If f1 = f1(x), f2 = f2(x), …, fn = fn(x) are functions that are n−1 times differentiable on the interval (−∞, ∞), then their Wronskian is the determinant

W(x) = | f1(x)         f2(x)         …   fn(x)         |
       | f1′(x)        f2′(x)        …   fn′(x)        |
       | ⋮             ⋮                  ⋮             |
       | f1^(n−1)(x)   f2^(n−1)(x)   …   fn^(n−1)(x)   |
Theorem 4.3.4 using the Wronskian to determine linear independence of functions
If the functions f1, f2, …, fn have n−1 continuous derivatives on the interval (−∞, ∞), and if the Wronskian of these functions is not identically zero on (−∞, ∞), then these functions form a linearly independent set of vectors in C(n−1)(−∞, ∞).
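SymPy provides a `wronskian` helper; a sketch with the assumed example functions 1, x, x², whose Wronskian is the constant 2 — not identically zero, so they are linearly independent in C(2)(−∞, ∞):

```python
import sympy as sp

x = sp.symbols('x')
fs = [sp.Integer(1), x, x**2]  # assumed example functions in C^2(-oo, oo)

# wronskian() builds the determinant of successive derivatives.
W = sp.wronskian(fs, x)
print(W)  # 2 -- not identically zero, so the set is linearly independent
```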
Define a Basis
If V is any vector space and S = {v1, v2, …, vn} is a finite set of vectors in V, then S is called a basis for V if the following 2 conditions hold:
- S is linearly independent
- S spans V.
Theorem 4.4.1: Uniqueness of Basis Representation
If S = {v1, v2, … vn} is a basis for a vector space V, then every vector v in V can be expressed in the form v = c1v1 + c2v2 + … + cnvn in exactly one way.
Definition for a coordinate vector of v relative to S.
If S = {v1, v2,… vn} is a basis for a vector space V, and v = c1v1+ c2v2 + … + cnvn
is the expression for a vector v in terms of the basis S, then the scalars c1, c2, …, cn are called the coordinates of v relative to the basis S. The vector (c1, c2, …, cn) in Rn constructed from these coordinates is called the coordinate vector of v relative to S; it is denoted by
(v)S = (c1, c2, … cn)
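The coordinates can be computed by solving a linear system whose coefficient matrix has the basis vectors as its columns; a sketch with an assumed basis of R2:

```python
import numpy as np

# Assumed basis S = {v1, v2} for R^2 and a target vector v.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
v  = np.array([3.0, 1.0])

# Put the basis vectors in the columns of B and solve B c = v;
# because S is a basis, B is invertible and c is unique (Theorem 4.4.1).
B = np.column_stack([v1, v2])
c = np.linalg.solve(B, v)
print(c)  # [2. 1.]  i.e. v = 2*v1 + 1*v2, so (v)_S = (2, 1)
```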