Linear Algebra Flashcards
5.1: Distance Between Two Vectors
5.1: The Cauchy-Schwarz Inequality
5.1: The Triangle Inequality
5.1 The Pythagorean Theorem
5.2: Definition of Inner Product
5.2 Orthogonal Projection and Distance
5.3 Orthonormal Basis
A set of vectors that are both mutually orthogonal and unit vectors.
5.3 Gram-Schmidt Orthonormalization Process
- B = {v₁, v₂, …, vₙ}, a set of vectors that form a basis for an inner product space V.
- B′ = {w₁, w₂, …, wₙ}, where w₁ = v₁, w₂ = v₂ − proj_w₁ v₂, w₃ = v₃ − proj_w₁ v₃ − proj_w₂ v₃, and so on; this is the orthogonalization step.
- Normalize each wᵢ (divide it by its norm) to obtain the orthonormal basis (see the sketch below).
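A minimal numerical sketch of the process above (the function name and the example vectors are illustrative, not from the cards):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (illustrative helper)."""
    ws = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        w = v.copy()
        # orthogonalization: subtract the projection of v onto each earlier w
        for u in ws:
            w -= (np.dot(v, u) / np.dot(u, u)) * u
        ws.append(w)
    # normalization: divide each w by its norm to get unit vectors
    return [w / np.linalg.norm(w) for w in ws]

# example with an assumed basis of R^3
for q in gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]]):
    print(q)
```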
3.1 Minor
3.1 Cofactor
3.1 Determinant of a Triangular Matrix
The product of all the entries on the principal diagonal.
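For example (an illustrative upper triangular matrix, not from the original cards):

$$\det\begin{bmatrix} 2 & 5 & 1 \\ 0 & 3 & 4 \\ 0 & 0 & -1 \end{bmatrix} = (2)(3)(-1) = -6.$$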
3.2 Elementary Row Operations and Determinants
3.3 Determinant of a Matrix Product
3.3 Determinant of a Scalar Multiple of a Matrix
3.3 Determinant of an Inverse Matrix
3.3 Determinant of a Transpose of a Matrix
3.4 Adjoint of a Matrix
Adj(A) = the transpose of the cofactor matrix of A.
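An illustrative 2×2 case (not from the original cards), which also previews the inverse-by-adjoint card below:

$$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix},\qquad
\operatorname{adj}(A) = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix},\qquad
A^{-1} = \frac{1}{\det(A)}\operatorname{adj}(A) = \frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}.$$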
3.4 Inverse of an nxn Matrix Using its Adjoint
3.4 Cramer’s Rule
3.4 Area of a Triangle with vertices
(x₁, y₁), (x₂, y₂), and (x₃, y₃)
3.4 Two-Point Form of the Equation of a Line (x₁, y₁), (x₂, y₂)
3.4 Volume of a Tetrahedron with vertices
(x₁, y₁,z₁), (x₂, y₂, z₂), (x₃, y₃, z₃), and (x₄, y₄, z₄)
3.4 Three-Point Form of the Equation of a Plane
(x₁, y₁,z₁), (x₂, y₂, z₂), and (x₃, y₃, z₃)
2.3 Inverse of a 2x2 matrix
2.3 Inverse of an nxn matrix
2.3: Solve a system of equations
2.4 Definition of an Elementary Matrix
An nxn matrix that can be obtained from the identity matrix I by a single elementary row operation.
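For example (an assumed 3×3 case):

$$E = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 3 & 0 & 1 \end{bmatrix}$$

is elementary: it is I₃ after the single row operation of adding 3 times row 1 to row 3.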
2.5 Stochastic Matrices
P = probability (stochastic) matrix in which each column adds up to 1; the next state matrix is PX, where X is the current state matrix.
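A small sketch of one step X′ = PX, with an illustrative two-state probability matrix (the values are assumptions, not from the cards):

```python
import numpy as np

# illustrative 2-state probability matrix: each column adds up to 1
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
X = np.array([0.5, 0.5])   # current state matrix

X_next = P @ X             # next state: PX
print(X_next)              # [0.55 0.45]
```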
2.5 Leontief Input-Output Models
2.5 Matrix Form for Linear Regression
2.5 Encryption
1.1: Row Echelon Form
Any rows consisting entirely of zeros are at the bottom; the first nonzero entry of each other row is a leading 1; and each leading 1 is to the right of the leading 1 in the row above (a stair-step pattern).
1.2: Reduced Row Echelon Form
Every column that has a leading 1 has zeros in every position above and below its leading 1.
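Illustrative examples of the two forms (not from the original cards):

$$\text{row-echelon form: }\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 4 \\ 0 & 0 & 1 \end{bmatrix},\qquad
\text{reduced row-echelon form: }\begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \end{bmatrix}.$$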
1.2: Gaussian Elimination with Back Substitution
- Write the augmented matrix of the system of linear equations.
- Use elementary row operations to rewrite the matrix in row-echelon form.
- Use back-substitution to find the solution.
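A compact sketch of these three steps (illustrative code; it assumes the pivots are nonzero and does no row interchanges):

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by elimination to row-echelon form, then back-substitution."""
    M = np.hstack([np.array(A, float), np.array(b, float).reshape(-1, 1)])  # augmented matrix
    n = len(b)
    # elementary row operations: reduce to row-echelon form with leading 1s
    for i in range(n):
        M[i] = M[i] / M[i, i]             # scale row i so its pivot is 1
        for j in range(i + 1, n):
            M[j] = M[j] - M[j, i] * M[i]  # zero out the entries below the pivot
    # back-substitution, starting from the last equation
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = M[i, -1] - M[i, i + 1:n] @ x[i + 1:n]
    return x

print(gaussian_solve([[1, 1], [2, -1]], [3, 0]))  # [1. 2.]
```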
1.2: Gauss-Jordan Elimination
Same as Gaussian Elimination but instead of row-echelon form you rewrite the matrix in reduced row-echelon form.
1.2: Homogeneous System of Linear Equations
- Systems of linear equations in which each of the constant terms is zero.
- The trivial solution is the one in which all variables equal 0.
- Must have at least one solution, which is the trivial solution.
- If the system has fewer equations than variables, then it must have infinitely many solutions.
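An illustrative case of the last property (one equation, two variables, so infinitely many solutions):

$$x_1 + 2x_2 = 0 \;\Rightarrow\; x_1 = -2t,\quad x_2 = t \quad (t \in \mathbb{R}),$$

with the trivial solution corresponding to t = 0.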
1.3: Polynomial Curve Fitting
Substitute each of the given points into the polynomial function, then solve the resulting system of linear equations for the coefficients (see the sketch below).
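A short sketch of this for p(x) = c₀ + c₁x + c₂x² through three assumed points (the data values are illustrative):

```python
import numpy as np

points = [(1, 4), (2, 0), (3, 12)]   # illustrative data points

# substituting each point (x, y) into p(x) = c0 + c1*x + c2*x^2 gives one linear equation
A = np.array([[1, x, x**2] for x, _ in points], dtype=float)
b = np.array([y for _, y in points], dtype=float)

coeffs = np.linalg.solve(A, b)       # [c0, c1, c2]
print(coeffs)                        # approximately [24, -28, 8]
```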
4.1: Properties of Vector Addition and Scalar Multiplication in Rn
4.1: Properties of Additive Identity and Additive Inverse
4.2: Properties of Scalar Multiplication
4.3: Test for a Subspace
A nonempty subset W of a vector space V is a subspace of V if and only if W is closed under addition and closed under scalar multiplication; in particular, every subspace must contain the zero vector.
4.4: Finding a Linear Combination
4.5: Definition of Basis
A set of vectors S in a vector space V that span V and are linearly independent.
4.5: Characteristics of Bases
- Uniqueness of Basis Representation
- Bases and Linear Dependence
- Number of Vectors in a Basis
- If S is a basis for a vector space V then every vector in V can be written in one and only one way as a linear combination of vectors in S.
- If S is a basis with n vectors for a vector space V, then every set containing more than n vectors in V is linearly dependent.
- If a vector space V has one basis with n vectors, then every basis for V has n vectors.
4.5: Number of Dimensions in a Vector Space
Rn, Pn, Mm,n
Rn: n, Pn: n + 1, Mm,n: mn
4.6 Basis for the Row Space of a Matrix
If a matrix A is row-equivalent to a matrix B in row-echelon form, then the nonzero row vectors of B form a basis for the row space of A.
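A quick cross-check of this with sympy (the matrix values are illustrative, not from the cards):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])

B, pivot_cols = A.rref()   # B is the reduced row-echelon form of A
# the nonzero rows of B form a basis for the row space of A
row_basis = [B.row(i) for i in range(B.rows) if any(B.row(i))]
print(row_basis)           # [Matrix([[1, 0, 1]]), Matrix([[0, 1, 1]])]
```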
4.6: Basis for a Column Space of a Matrix
If Aᵀ is row-equivalent to a matrix B in row-echelon form, then the nonzero row vectors of B form a basis for the row space of Aᵀ, which is the column space of A.
4.6: Rank of a Matrix
The dimension of the row (or column) space of a matrix A is called the rank of A and is denoted by rank(A).
4.6: Nullspace
If A is an m x n matrix, then the set of all solutions of the homogeneous system of linear equations Ax= 0 is a subspace of Rn called the nullspace of A and is denoted by N(A).
4.6: Finding the Nullspace
- Write the coefficient matrix in row-echelon form.
- Solve for the variables, making use of parametric form.
- Write the solutions as linear combinations of vectors, using the free variables as the coefficients; those vectors form a basis for the nullspace.
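A sympy cross-check of these steps (the matrix is illustrative, not from the cards):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 4, 2]])

# sympy row-reduces A, solves Ax = 0 in parametric form, and returns basis vectors for N(A)
basis = A.nullspace()
print(basis)   # basis vectors (-2, 1, 0) and (-1, 0, 1), written as column matrices
```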
4.6: Dimension of the Solution Space
If A is an m x n matrix of rank r, the dimension of the solution space of Ax = 0 (the nullity of A) is n − r; that is, rank(A) + nullity(A) = n.
4.6: Finding the Solution Set of a Nonhomogeneous System of Linear Equations Ax=b
- Write the augmented matrix in row-echelon form.
- Solve for the variables, making use of parametric form.
- Write the general solution as x = xp + xh, where xp is a particular solution of Ax = b and xh is the general solution of the corresponding homogeneous system Ax = 0.
4.6: Summary of Equivalent Conditions for Square Matrices
4.7: Finding a Coordinate Matrix Relative to a Standard Basis
4.7: Finding a Coordinate Matrix Relative to a Nonstandard Basis.
- Write x as a linear combination of the nonstandard basis vectors: x = c₁u₁ + c₂u₂ + c₃u₃.
- Rewrite this as a system of linear equations (a matrix equation).
- Solve for the coefficients c₁, c₂, c₃; they form the coordinate matrix of x relative to the nonstandard basis (see the sketch below).
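A numerical sketch of those steps, with an illustrative basis and vector: the columns of U are u₁, u₂, u₃, and solving Uc = x gives the coordinates.

```python
import numpy as np

# nonstandard basis vectors u1, u2, u3 as the columns of U (illustrative values)
U = np.column_stack([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
x = np.array([2, 3, 5])

c = np.linalg.solve(U, x)   # coordinate matrix of x relative to the basis: (c1, c2, c3)
print(c)                    # [2. 3. 0.]
```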
4.7: Change of Basis of Rn
P [x]B′ = [x]B;  P⁻¹ [x]B = [x]B′
P is the transition matrix from B′ to B
P⁻¹ is the transition matrix from B to B′
[x]B is the coordinate matrix of x relative to B
[x]B′ is the coordinate matrix of x relative to B′
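A minimal sketch with illustrative bases for R2: the basis vectors are written relative to the standard basis as columns, and the transition matrix from B′ to B works out to P = B⁻¹B′.

```python
import numpy as np

# basis vectors as columns, written relative to the standard basis (illustrative values)
B_mat  = np.column_stack([[1, 0], [1, 1]])   # basis B  = {(1, 0), (1, 1)}
Bp_mat = np.column_stack([[1, 2], [0, 1]])   # basis B' = {(1, 2), (0, 1)}

# transition matrix P from B' to B, so that P [x]_B' = [x]_B
P = np.linalg.solve(B_mat, Bp_mat)           # equivalent to inv(B_mat) @ Bp_mat

x_Bp = np.array([1, 1])      # [x]_B'
x_B  = P @ x_Bp              # [x]_B
print(P)                     # [[-1. -1.], [ 2.  1.]]
print(x_B)                   # [-2.  3.]
```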
4.8: The Wronskian