Chapter 4: Real vector spaces Flashcards

1
Q

Vector space

A

A set of objects, together with addition and scalar multiplication operations, that is closed under both operations and satisfies the vector space axioms.

2
Q

Closure under addition

A

If u and v are objects in V, then u + v is in V.

3
Q

Closure under scalar multiplication

A

If k is any scalar and u is any object in V, then ku is in V.

4
Q

Subspace

A

A subset of a vector space V that is itself a vector space under the addition and scalar multiplication defined on V.

5
Q

If W is a set of one or more vectors in a vector space V, then W is a subspace of V iff:

A
  • If u and v are vectors in W, then u + v is in W.
  • If k is any scalar and u is any vector in W, then ku is in W.
6
Q

Create a new subspace from known subspaces:

A

If W1, W2, … Wr are subspaces of a vector space V, then the intersection of these subspaces is also a subspace of V.

7
Q

Span of S

A

The subspace of a vector space V formed from all possible linear combinations of the vectors in a nonempty set S.

We say that S spans that subspace.

8
Q

A subspace of Rn using the matrix A.

A

The solution set of the homogeneous system Ax = 0 in n unknowns is a subspace of Rn.
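As a quick sanity check of why this solution set is a subspace, here is a small NumPy sketch (the matrix and solution vectors are made up for illustration):

```python
import numpy as np

# Made-up 2x3 matrix; its second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Two solutions of Ax = 0.
u = np.array([2.0, -1.0, 0.0])
v = np.array([3.0, 0.0, -1.0])
assert np.allclose(A @ u, 0) and np.allclose(A @ v, 0)

# Closure: the sum and any scalar multiple also solve Ax = 0.
assert np.allclose(A @ (u + v), 0)
assert np.allclose(A @ (5.0 * u), 0)
```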

9
Q

Theorem 4.2.5, concerning when two sets of vectors span the same subspace.

A

If S = {v1, v2, …, vr} and S’ = {w1, w2, …, wk} are nonempty sets of vectors in a vector space V, then span{v1, v2, …, vr} = span{w1, w2, …, wk} iff each vector in S is a linear combination of those in S’, and each vector in S’ is a linear combination of those in S.

10
Q

Definition of linear independence:

A

If S = {v1, v2, …, vr} is a nonempty set of vectors in a vector space V, then the vector equation k1v1 + k2v2 + … + krvr = 0 has at least one solution, namely, k1 = k2 = … = kr = 0. We call this the trivial solution. If this is the only solution, then S is said to be a linearly independent set.

11
Q

Interpretation of linear independence:

A

A set S with two or more vectors is linearly independent iff no vector in S is expressible as a linear combination of the other vectors in S.

12
Q

Basic theorems concerning linear independence (sets containing 0, or with 1 or 2 vectors):

A
  • A finite set that contains the zero vector is linearly dependent.
  • A set with exactly one vector is linearly independent iff that vector is not the zero vector.
  • A set with exactly two vectors is linearly independent iff neither vector is a scalar multiple of the other.
13
Q

Let S = {v1, v2, …, vr} be a set of vectors in Rn. When is the set linearly dependent?

A

If r > n (more vectors than the dimension of Rn), then the set is linearly dependent.
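A NumPy sketch of this fact, using four made-up vectors in R3 (so r = 4 > n = 3):

```python
import numpy as np

# Four vectors in R^3: since r = 4 > n = 3, they must be dependent.
vectors = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0],
                    [1.0, 2.0, 3.0]])

# The rank of the matrix whose rows are the vectors is the dimension
# of their span; dependence shows up as rank < r.
rank = np.linalg.matrix_rank(vectors)
print(rank < len(vectors))  # True: the set is linearly dependent
```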

14
Q

Wronskian of f1, f2, …, fn

A

If f1 = f1(x), f2 = f2(x), …, fn = fn(x) are functions that are n−1 times differentiable on the interval (−∞, ∞), their Wronskian is the determinant

W(x) = det |  f1       f2       …   fn      |
           |  f1′      f2′      …   fn′     |
           |  ⋮        ⋮            ⋮       |
           |  f1(n−1)  f2(n−1)  …   fn(n−1) |

whose rows hold the functions and their successive derivatives.

15
Q

Theorem 4.3.4 using the Wronskian to determine linear independence of functions

A

If the functions f1, f2, …, fn have n−1 continuous derivatives on the interval (−∞, ∞), and if the Wronskian of these functions is not identically zero on (−∞, ∞), then these functions form a linearly independent set of vectors in C(n−1)(−∞, ∞).
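A symbolic sketch of the test using SymPy (assumed available), for the made-up pair f1 = x, f2 = sin x:

```python
import sympy as sp

x = sp.symbols('x')
f = [x, sp.sin(x)]
n = len(f)

# Row i of the Wronskian matrix holds the i-th derivatives of the functions.
W = sp.Matrix(n, n, lambda i, j: sp.diff(f[j], x, i))
wronskian = sp.simplify(W.det())

# x*cos(x) - sin(x) is not identically zero on (-inf, inf),
# so {x, sin x} is a linearly independent set.
print(wronskian)
```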

16
Q

Define a Basis

A

If V is any vector space and S = {v1, v2, …, vn} is a finite set of vectors in V, then S is called a basis for V if the following 2 conditions hold:

  • S is linearly independent
  • S spans V.
17
Q

Theorem 4.4.1: Uniqueness of Basis Representation

A

If S = {v1, v2, … vn} is a basis for a vector space V, then every vector v in V can be expressed in the form v = c1v1 + c2v2 + … + cnvn in exactly one way.

18
Q

Definition for a coordinate vector of v relative to S.

A

If S = {v1, v2, …, vn} is a basis for a vector space V, and v = c1v1 + c2v2 + … + cnvn

is the expression for a vector v in terms of the basis S, then the scalars c1, c2, …, cn are called the coordinates of v relative to the basis S. The vector (c1, c2, …, cn) in Rn constructed from these coordinates is called the coordinate vector of v relative to S; it is denoted by

(v)S = (c1, c2, … cn)
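Numerically, finding (v)S means solving a linear system whose coefficient columns are the basis vectors; a NumPy sketch with a made-up basis of R2:

```python
import numpy as np

# Made-up basis S = {v1, v2} for R^2 and a vector v to express in it.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
v = np.array([3.0, 1.0])

# Columns of M are the basis vectors; solve M c = v for the coordinates.
M = np.column_stack([v1, v2])
coords = np.linalg.solve(M, v)
print(coords)  # (v)_S = (2, 1), since v = 2*v1 + 1*v2
```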

19
Q

Theorem 4.5.1 on the number of vectors in a basis.

A

All bases for a finite-dimensional vector space have the same number of vectors.

20
Q

Theorem 4.5.2 on a basis of a vector space and linear independence

A

Let V be a finite-dimensional vector space, and let {v1, v2, …, vn} be any basis.

  • If a set has more than n vectors, then it is linearly dependent.
  • If a set has fewer than n vectors, then it does not span V.
21
Q

Dimension of a finite-dimensional vector space V.

A

The number of vectors in a basis for V.

The zero vector space has a dimension of zero.

22
Q

Dimension of the zero vector space

A

Zero: the zero vector space is defined to have dimension zero.
23
Q

Plus/Minus Theorem

A

Let S be a nonempty set of vectors in a vector space V.

  • If S is a linearly independent set and if v is a vector in V that is outside of span(S), then the set S ⋃ {v} that results by inserting v into S is still linearly independent.
  • If v is a vector in S that is expressible as a linear combination of other vectors in S, and if S - {v} denotes the set obtained by removing v from S, then S and S - {v} span the same space; that is,

span(S) = span(S - {v})

24
Q

Theorem 4.5.4 on the conditions of a basis.

A

Let V be an n-dimensional vector space, and let S be a set in V with exactly n vectors.

Then S is a basis for V if and only if S spans V or S is linearly independent.

25
Q

Theorem 4.5.6 that relates the dimension of a vector space to the dimensions of its subspaces.

A

If W is a subspace of a finite-dimensional vector space V, then:

  • W is finite-dimensional.
  • dim(W) ≤ dim(V).
  • W = V if and only if dim(W) = dim(V).
26
Q

The columns of a transition matrix

A

The coordinate vectors of the old basis, relative to the new basis.
27
Q

Theorem 4.6.1 on the invertibility of transition matrices

A

If P is the transition matrix from a basis B' to a basis B for a finite-dimensional vector space V, then P is invertible and P⁻¹ is the transition matrix from B to B'.
28
Q

The procedure for computing PB→B'

A

  • Form the matrix [B' | B].
  • Use elementary row operations to reduce the matrix to reduced row echelon form.
  • The resulting matrix will be [I | PB→B'].

In short: [new basis | old basis] → [I | transition from old to new].
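The same computation can be sketched numerically: row-reducing [B' | B] to [I | P] amounts to solving B'P = B, so a linear solver recovers P (the bases below are made up):

```python
import numpy as np

# Made-up bases of R^2, written as columns.
B_old = np.column_stack([[1.0, 0.0], [1.0, 1.0]])   # old basis B
B_new = np.column_stack([[1.0, 1.0], [1.0, -1.0]])  # new basis B'

# Row-reducing [B' | B] to [I | P] is equivalent to solving B' P = B.
P = np.linalg.solve(B_new, B_old)
print(P)

# Sanity check: the columns of P express the old basis vectors
# in the new basis, so B_new @ P reproduces B_old.
assert np.allclose(B_new @ P, B_old)
```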
29
Q

Theorem 4.6.2 on the transition to the standard basis

A

Let B' = {u1, u2, …, un} be any basis for the vector space Rn and let S = {e1, e2, …, en} be the standard basis for Rn. If the vectors in these bases are written in column form, then:

PB'→S = [u1 | u2 | … | un]
30
Q

Row space of A

A

The subspace spanned by the row vectors of A.
31
Q

Column space of A

A

The subspace spanned by the column vectors of A.
32
Q

Null space of A

A

The solution space of the homogeneous system of equations Ax = 0.
33
Q

Theorem 4.7.2 on the null space and elementary row operations

A

Elementary row operations do not change the null space of a matrix.
34
Q

Theorem 4.7.4 on the row space and elementary row operations

A

Elementary row operations do not change the row space of a matrix.
35
Q

Theorem 4.8.1 on the dimension of the row and column space of a matrix

A

The row space and column space of a matrix A have the same dimension.
36
Q

Rank of matrix A

A

The common dimension of the row space and column space of A.
37
Q

Nullity of A

A

The dimension of the null space of A.
38
Q

Theorem 4.8.2: Dimension Theorem for Matrices

A

If a matrix A has n columns, then rank(A) + nullity(A) = n.
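A NumPy sketch of the theorem on a made-up 3 × 4 matrix:

```python
import numpy as np

# Made-up matrix: the third row is the sum of the first two,
# so the rank is 2 while n = 4.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 2.0],
              [1.0, 2.0, 1.0, 3.0]])

n = A.shape[1]                   # number of columns
rank = np.linalg.matrix_rank(A)  # dimension of the row/column space
nullity = n - rank               # dimension of the null space
print(rank + nullity == n)       # True
```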
39
Q

Theorem 4.8.3 on rank and nullity

A

If A is an m × n matrix, then:

  • rank(A) = the number of leading variables in the general solution of Ax = 0.
  • nullity(A) = the number of parameters in the general solution of Ax = 0.
40
Q

Number of parameters contained in the general solution of a consistent linear system Ax = b of m equations in n unknowns, where A has rank r

A

n − r parameters.
41
Q

Orthogonal complement of W

A

The set of all vectors that are orthogonal to every vector in W.
42
Q

Theorem 4.8.9 on orthogonal complements and the null, row, and column spaces

A

If A is an m × n matrix:

  • The null space of A and the row space of A are orthogonal complements.
  • The null space of Aᵀ and the column space of A are orthogonal complements.
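A NumPy sketch of the first statement, with a made-up matrix and one of its null-space vectors:

```python
import numpy as np

# Made-up 2x3 matrix.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

# One solution of Ax = 0, found by back substitution:
# set x3 = 1, then x2 = -1 and x1 = -2*(-1) - 3*1 = -1.
x = np.array([-1.0, -1.0, 1.0])
assert np.allclose(A @ x, 0)

# x is orthogonal to every row of A, i.e. to the whole row space.
for row in A:
    assert np.isclose(row @ x, 0.0)
```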
43
Q

Theorem 4.9.1: Properties of matrix transformations

A

  • TA(0) = 0
  • TA(ku) = kTA(u)
  • TA(u + v) = TA(u) + TA(v)
  • TA(u − v) = TA(u) − TA(v)
44
Q

Steps: Finding the Standard Matrix for a Matrix Transformation

A

  • Find the images of the standard basis vectors e1, e2, …, en for Rn in column form.
  • Construct the matrix that has the images obtained in Step 1 as its successive columns.
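The two steps can be sketched in NumPy for a made-up transformation T(x, y) = (x + y, x − y) on R2:

```python
import numpy as np

def T(v):
    # Made-up transformation T(x, y) = (x + y, x - y).
    x, y = v
    return np.array([x + y, x - y])

# Step 1: images of the standard basis vectors e1, e2.
e1, e2 = np.eye(2)
images = [T(e1), T(e2)]

# Step 2: the images become the successive columns of the standard matrix.
A = np.column_stack(images)
print(A)

# Check: multiplying by A reproduces T on an arbitrary vector.
v = np.array([3.0, 2.0])
assert np.allclose(A @ v, T(v))
```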
45
Q

Theorem 4.10.1: Equivalent statements for an n × n transformation matrix A

A

  • A is invertible.
  • The range of TA is Rn.
  • TA is one-to-one.
46
Q

Theorem 4.10.2: Linearity relationships between u, v, and matrix transformations

A

  • T(u + v) = T(u) + T(v)
  • T(ku) = kT(u)