Linear Algebra Flashcards

1
Q

Eigenvector

A

An eigenvector of a linear transformation is a non-zero vector that is only scaled when the linear transformation is applied to it: it stays on its own line through the origin rather than being rotated off it (a negative eigenvalue reverses its direction). The factor by which the eigenvector is scaled is called the corresponding eigenvalue.

Eigenvalue problem: Av = λv, where A is the matrix of the transformation, v is a non-zero eigenvector, and λ is its eigenvalue.
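
A minimal NumPy sketch (illustrative only; the 2×2 matrix A below is an assumed example, not from the card) showing that an eigenvector is merely scaled by its eigenvalue:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # example transformation matrix (assumed)

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

v = eigenvectors[:, 0]              # first eigenvector
lam = eigenvalues[0]                # its corresponding eigenvalue

# Verify the eigenvalue equation A v = lambda v
print(np.allclose(A @ v, lam * v))  # True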

2
Q

Singular Value Decomposition (SVD)

A

Singular Value Decomposition (SVD) is a factorization of a matrix that generalizes the eigendecomposition of a square matrix to any m × n matrix:

M = UΣVᵀ

The columns of U are called the left-singular vectors; those corresponding to non-zero singular values form an orthonormal basis for the column space of M.

The columns of V are called the right-singular vectors; those corresponding to non-zero singular values form an orthonormal basis for the row space of M.

Σ is a rectangular diagonal matrix of non-negative singular values, conventionally sorted in decreasing order. The singular values are the square roots of the eigenvalues of MᵀM (equivalently, of MMᵀ), and in a PCA setting they determine the variance explained by each pair of left- and right-singular vectors.

The product of the three matrices denotes a composition of linear transformations, with U and Vᵀ being orthogonal transformations (rotations and/or reflections) and Σ being a scaling transformation.
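
A minimal NumPy sketch (illustrative; the 2×3 matrix M is an assumed example) showing the factorization and the link between singular values and the eigenvalues of MMᵀ:

import numpy as np

M = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])    # example m x n matrix (assumed)

# Compact SVD: U is m x r, s holds the singular values, Vt is r x n
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# The factors reconstruct M
print(np.allclose(M, U @ np.diag(s) @ Vt))   # True

# Singular values are the square roots of the eigenvalues of M Mᵀ
print(np.allclose(np.sort(s**2), np.sort(np.linalg.eigvalsh(M @ M.T))))  # True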

3
Q

Determinant

A

The determinant of a square matrix is a scalar value. Geometrically, it can be viewed as the volume scaling factor of the linear transformation described by the matrix; it is the signed volume of the n-dimensional parallelepiped spanned by the column (or row) vectors of the matrix. The determinant is positive or negative according to whether the linear transformation preserves or reverses the orientation of the real vector space.

The determinant measures how much volumes change when the linear transformation is applied.

If the determinant is 0: (1) the transformation squashes the space into a lower-dimensional subspace, (2) the columns of the matrix are linearly dependent, (3) a system of linear equations with this coefficient matrix has no unique solution, and (4) the matrix has no inverse.

If the determinant is < 0, the linear transformation reverses the orientation of space.
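
A minimal NumPy sketch (illustrative; the example matrices are assumed) showing the determinant as a signed volume scaling factor:

import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])          # scales areas by 6, preserves orientation
print(np.linalg.det(A))             # 6.0

B = np.array([[0.0, 3.0],
              [2.0, 0.0]])          # same scaling, but orientation is reversed
print(np.linalg.det(B))             # -6.0

C = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # linearly dependent columns
print(np.isclose(np.linalg.det(C), 0.0))   # True: no inverse, no unique solution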

4
Q

Cosine Similarity

A

Cosine Similarity is a measure of the similarity between two non-zero vectors and is the cosine of the angle between them.

It is thus a judgment of orientation and not magnitude. The values range from 1 (same direction), through 0 (orthogonal), to -1 (opposite direction).

Formula: cos(θ) = (A · B) / (‖A‖ ‖B‖), where A · B is the dot product and ‖·‖ the Euclidean norm.
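
A minimal NumPy sketch (illustrative; the helper cosine_similarity and the example vectors are assumed) showing that only orientation matters:

import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two non-zero vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([1.0, 2.0, 3.0])
print(cosine_similarity(a, 2 * a))   # ≈ 1.0: same direction, magnitude ignored
print(cosine_similarity(a, -a))      # ≈ -1.0: opposite direction
print(cosine_similarity(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # 0.0: orthogonal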

5
Q

Euclidean Distance

A

Euclidean Distance is the length of the line segment between the points defined by two vectors; it is commonly used as a measure of their (dis)similarity.

Unlike cosine similarity, it is sensitive to the vectors' magnitudes, not just their orientation. The more similar the vectors, the closer the Euclidean distance is to 0.

Formula: d(p, q) = √((q₁ − p₁)² + … + (qₙ − pₙ)²) = ‖q − p‖
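
A minimal NumPy sketch (illustrative; the helper euclidean_distance and the example points are assumed):

import numpy as np

def euclidean_distance(p, q):
    # Length of the line segment between the points p and q
    return float(np.linalg.norm(q - p))

p = np.array([1.0, 2.0])
q = np.array([4.0, 6.0])
print(euclidean_distance(p, q))   # 5.0 (a 3-4-5 right triangle)
print(euclidean_distance(p, p))   # 0.0 (identical vectors)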

6
Q

Frobenius Norm

A

The Frobenius Norm is a measure of the magnitude of a matrix and represents an extension of the Euclidean norm of vectors.

Formula: ‖A‖F = √( Σᵢ Σⱼ aᵢⱼ² ), the square root of the sum of the squared entries of A.
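
A minimal NumPy sketch (illustrative; the 2×2 matrix A is an assumed example) computing the Frobenius norm two equivalent ways:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Square root of the sum of squared entries
frob = np.linalg.norm(A, ord='fro')
print(np.isclose(frob, np.sqrt(np.sum(A**2))))   # True: sqrt(1 + 4 + 9 + 16) = sqrt(30)

# It also equals the Euclidean norm of the vector of singular values
s = np.linalg.svd(A, compute_uv=False)
print(np.isclose(frob, np.linalg.norm(s)))       # True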
