New Deck Flashcards

1
Q

Basis (for a subspace)

A

A basis for a subspace W is a set of vectors v_1, …, v_k in W such that:

  1. v_1, …, v_k are linearly independent; and
  2. v_1, …, v_k span W.
2
Q

Characteristic Polynomial of a matrix:

A

The characteristic polynomial of an n by n matrix A is the polynomial in t given by the formula det(A - t*I)
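
As an illustrative aside (not part of the original card), NumPy's np.poly returns the coefficients of det(t*I - A), which has the same roots as det(A - t*I); the matrix below is made up:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial, highest degree first:
# here t^2 - 4t + 3
coeffs = np.poly(A)
print(coeffs)            # [ 1. -4.  3.]

# Its roots are the eigenvalues of A
print(np.roots(coeffs))  # [3. 1.]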

3
Q

Column Space of a matrix:

A

The column space of a matrix is the subspace spanned by the columns of the matrix, considered as vectors.

See also: row space of a matrix.

4
Q

Defective matrix

A

a matrix A is defective if A has an eigenvalue whose geometric multiplicity is less than its algebraic multiplicity

5
Q

Diagonalizable matrix

A

A matrix A is diagonalizable if it is similar to a diagonal matrix, that is, if there is a nonsingular matrix S such that S^{-1} A S is diagonal.

Dimension of a subspace: the dimension of a subspace W is the number of vectors in any basis of W (if W is the subspace {0}, we say that its dimension is 0)
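
A minimal NumPy sketch of the diagonalization S^{-1} A S = D, using a made-up matrix with distinct eigenvalues:

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of S are eigenvectors; entries of d are the matching eigenvalues
d, S = np.linalg.eig(A)

# A is diagonalizable when S is nonsingular (n independent eigenvectors);
# then S^{-1} A S equals the diagonal matrix of eigenvalues
D = np.linalg.inv(S) @ A @ S
print(np.allclose(D, np.diag(d)))  # True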

6
Q

Row echelon form of a matrix

A

A matrix is in row echelon form if

  1. all rows that consist entirely of zeros are grouped together at the bottom of the matrix; and
  2. the first (counting left to right) nonzero entry in each nonzero row appears in a column to the right of the first nonzero entry in the preceding row (if there is a preceding row).
7
Q

Reduced row echelon form of a matrix:

A

A matrix is in reduced row echelon form if

  1. the matrix is in row echelon form;
  2. the first nonzero entry in each nonzero row is the number 1; and
  3. the first nonzero entry in each nonzero row is the only nonzero entry in its column.
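
If SymPy is available, its rref() method illustrates the definition; the matrix below is just an example:

from sympy import Matrix

M = Matrix([[1, 2, 3],
            [2, 4, 7],
            [1, 2, 4]])

# rref() returns the reduced row echelon form and the pivot columns
R, pivots = M.rref()
print(R)       # Matrix([[1, 2, 0], [0, 0, 1], [0, 0, 0]])
print(pivots)  # (0, 2)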
8
Q

Eigenspace of a matrix

A

The eigenspace associated with the eigenvalue c of a matrix A is the null space of A-c*I

9
Q

Eigenvalue of a matrix:

A

An eigenvalue of an n by n matrix A is a scalar c such that Ax = cx holds for some nonzero vector x

10
Q

Eigenvector of a matrix:

A

An eigenvector of an n by n matrix A is a nonzero vector x such that Ax = cx holds for some scalar c
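
A quick NumPy check of Ax = cx, using a made-up matrix:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigenvectors are stored as the columns of the second return value
eigenvalues, eigenvectors = np.linalg.eig(A)
c = eigenvalues[0]
x = eigenvectors[:, 0]

print(np.allclose(A @ x, c * x))  # True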

11
Q

equivalent linear systems

A

Two systems of linear equations in n unknowns are equivalent if they have the same set of solutions

12
Q

homogeneous linear system

A

A system of linear equations A*x=b is homogeneous if b=0

13
Q

inconsistent linear system

A

A system of linear equations is inconsistent if it has no solutions

14
Q

inverse of a matrix

A

The matrix B is an inverse for the matrix A if AB = BA = I, the identity matrix
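
A short NumPy sketch of AB = BA = I, assuming the example matrix is invertible:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.linalg.inv(A)          # the inverse, when it exists

I = np.eye(2)
print(np.allclose(A @ B, I))  # True
print(np.allclose(B @ A, I))  # True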

15
Q

Least squares solution of a linear system

A

A least-squares solution to a system of linear equations Ax = b is a vector x that minimizes the length of the vector Ax - b
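
np.linalg.lstsq computes such a minimizer; the overdetermined system below is made up for illustration:

import numpy as np

# 3 equations, 2 unknowns: in general no exact solution exists
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x, residual, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)                          # the least-squares solution
print(np.linalg.norm(A @ x - b))  # the minimized length of Ax - b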

16
Q

Linear Combination of vectors

A

A vector v is a linear combination of the vectors v_1, …, v_k if there exist scalars a_1, …, a_k such that v = a_1*v_1 + … + a_k*v_k

17
Q

Linearly Dependent vectors

A

Vectors v_1, …, v_k are linearly dependent if the equation a_1*v_1 + … + a_k*v_k = 0 has a solution in which not all the scalars a_1, …, a_k are zero

18
Q

Linearly Independent vectors

A

Vectors v_1, …, v_k are linearly independent if the only solution to the equation a_1*v_1 + … + a_k*v_k = 0 is the one where all the scalars a_1, …, a_k are zero
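
One common numerical test (a sketch, not the only way): stack the vectors as columns; they are linearly independent exactly when the rank equals the number of vectors:

import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])   # v3 = v1 + v2, so the set is dependent

V = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(V) == V.shape[1])  # False -> linearly dependent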

19
Q

Linear Transformation

A

a linear transformation from V to W is a function T from V to W such that:

  1. T(u+v) = T(u) + T(v) for all vectors u and v in V; and
  2. T(av) = aT(v) for all vectors v in V and all scalars a
20
Q

algebraic multiplicity of an eigenvalue:

A

The algebraic multiplicity of an eigenvalue c of a matrix A is the number of times the factor (t-c) occurs in the characteristic polynomial of A

21
Q

Geometric multiplicity of an eigenvalue

A

The geometric multiplicity of an eigenvalue c of a matrix A is the dimension of the eigenspace of c.
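
A small NumPy/SciPy sketch contrasting the two multiplicities, using a standard defective example:

import numpy as np
from scipy.linalg import null_space

# c = 2 has algebraic multiplicity 2, since det(A - t*I) = (2 - t)^2
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
c = 2.0

# Geometric multiplicity = dimension of the eigenspace = dim null(A - c*I)
E = null_space(A - c * np.eye(2))
print(E.shape[1])  # 1, so A is defective (1 < 2)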

22
Q

Symmetric matrix

A

a matrix is symmetric if it equals its transpose

23
Q

Subspace

A

A subset W of n-space is a subspace if

  1. the zero vector is in W;
  2. x+y is in W whenever x and y are in W; and
  3. a*x is in W whenever x is in W and a is any scalar.
24
Q

Span of a set of vectors

A

The span of the vectors v_1, …, v_k is the subspace V consisting of all linear combinations of v_1, …, v_k. One can also say that the subspace V is spanned by the vectors v_1, …, v_k and that these vectors span V

25
Q

Singular matrix

A

An n by n matrix A is singular if the equation A*x = 0 (where x is an n-tuple) has a nonzero solution for x

26
Q

Row space of a matrix

A

the row space of a matrix is the subspace spanned by the rows of the matrix considered as vectors

27
Q

Row equivalent matrices

A

Two matrices are row equivalent if one can be obtained from the other by a sequence of elementary row operations:
The elementary row operations performed on a matrix are :
* interchange two rows;
* multiply a row by a nonzero scalar;
* add a constant multiple of one row to another.

28
Q

Rank

A

See "Rank of a matrix" and "Rank of a linear transformation" below.

29
Q

Similar Matrices

A

Matrices A and B are similar if there is a square nonsingular matrix S such that S^{-1} A S = B

30
Q

Nonsingular matrix

A

An n by n matrix A is nonsingular if the only solution to the equation A*x = 0 is x = 0

31
Q

Null Space of a matrix

A

The null space of an m by n matrix A is the set of all n-tuples x such that A*x = 0

32
Q

Null Space of a linear transformation

A

The null space of a linear transformation T is the set of vectors v in its domain such that T(v) = 0

33
Q

Nullity of a matrix

A

The nullity of a matrix is the dimension of its null space
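
A NumPy/SciPy sketch, using a made-up matrix: the nullity equals the number of columns minus the rank (rank-nullity):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)
N = null_space(A)                       # columns form a basis of the null space
print(N.shape[1])                       # nullity = 2
print(N.shape[1] == A.shape[1] - rank)  # True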

34
Q

Nullity of a linear transformation

A

The nullity of a linear transformation is the dimension of its null space

35
Q

Orthogonal set of vectors

A

A set of n-tuples is orthogonal if the dot product of any two of them is 0

36
Q

Orthogonal matrix

A

A matrix A is orthogonal if A is invertible and its inverse equals its transpose.

Orthogonal linear transformation: a linear transformation T from V to W is orthogonal if T(v) has the same length as v for all vectors v in V
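
A NumPy check of both statements, using a rotation matrix as the example:

import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation, hence orthogonal

print(np.allclose(np.linalg.inv(Q), Q.T))         # inverse equals transpose

v = np.array([3.0, -1.0])
print(np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))  # lengths preserved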

37
Q

Orthonormal set of vectors

A

A set of n-tuples is orthonormal if it is orthogonal and each vector has length 1

38
Q

Range of a matrix

A

The range of an m by n matrix A is the set of all m-tuples A*x, where x is any n-tuple

39
Q

Range of a linear transformation

A

The range of a linear transformation T is the set of all vectors T(v), where v is any vector in its domain

40
Q

Rank of a matrix

A

The rank of a matrix is the number of nonzero rows in any row equivalent matrix that is in row echelon form

41
Q

Rank of a linear transformation

A

The rank of a linear transformation (and hence of any matrix regarded as a linear transformation) is the dimension of its range. A theorem tells us that the two definitions of the rank of a matrix are equivalent

42
Q

row equivalent matrices

A

Two matrices are row equivalent if one can be obtained from the other by a sequence of elementary row operations:
The elementary row operations performed on a matrix are:
* interchange two rows;
* multiply a row by a nonzero scalar;
* add a constant multiple of one row to another.

43
Q

Sheldon Axler Notes:

Vectors are defined first

Linear independence, span, basis, and dimension are defined in the chapter, which presents the basic theory of finite-dimensional vector spaces

Linear maps are introduced next, leading to the Fundamental Theorem of Linear Maps: if T is a linear map on V, then dim V - dim null T = dim range T. Quotient spaces and duality are topics of higher abstraction

The theory of polynomials is needed to understand linear operators (this part contains no linear algebra).

Studying linear operators by restricting them to small subspaces leads to eigenvectors. On complex vector spaces, eigenvalues always exist. Each linear operator on a complex vector space has an upper triangular matrix with respect to some basis.

Inner product spaces are defined in this chapter, and their basic properties are developed along with orthonormal bases and the Gram-Schmidt procedure. Orthogonal projections can be used to solve certain minimization problems

The highlight is the spectral theorem, which characterizes the linear operators for which there is an orthonormal basis consisting of eigenvectors.

A

notes

44
Q

Invertible Matrix

A

How do you know whether A^{-1} exists? Each of the following is equivalent:

* the only solution to Ax = 0 is x = 0;
* the columns of A are linearly independent;
* the rows of A are linearly independent;
* det A != 0;
* Ax = b has a unique solution;
* all eigenvalues of A are nonzero.

Nonsingular means invertible.
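
Several of these equivalent conditions can be spot-checked in NumPy; the matrix below is made up for illustration:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]

print(np.linalg.det(A) != 0)              # det A != 0
print(np.linalg.matrix_rank(A) == n)      # columns (and rows) independent
print(np.all(np.linalg.eigvals(A) != 0))  # all eigenvalues nonzero

b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)                 # the unique solution to Ax = b
print(np.allclose(A @ x, b))              # True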

45
Q

If A is invertible, the only solution to Ax = 0 is x = 0, and thus A^{-1} A = I

A

How do you know if you can invert A? If A is invertible, A must be nonsingular (and A^{-1} is also nonsingular)

det(A) != 0 is the same as saying that the only solution to Ax = 0 is x = 0

46
Q

Solutions to system

Consistent: iff the rightmost column of the augmented matrix is not a pivot column

Inconsistent: the rightmost column of the augmented matrix is a pivot column (no solutions)

A

consistent and inconsistent

47
Q

Reduced Row Echelon Form (RREF)

A

Free variable: a variable whose column contains no pivot

Unique solution: no free variables

Infinitely many solutions: at least 1 free variable (for a consistent system)

48
Q

Linear combination

A

y = c_1*x_1 + … + c_n*x_n

49
Q

set

A

{[ ] , [ ] , [ ] }

50
Q

Span

A

the set of all linear combinations of the vectors in the set

51
Q

identity matrix

A

1 0 0
0 1 0
0 0 1