Linear Transformation And Linear System Flashcards

1
Q

Affine transformation

A

An affine transform is a combination of a linear transform with a translation

2
Q

Affine Transform - Formal Definition

A
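In standard form: a function f from Rⁿ to Rᵐ is an affine transform if it can be written as f(x) = Ax + c for some m × n matrix A and translation vector c; it is a linear transform exactly when c = 0.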
3
Q

Householder reflection matrix

A
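The standard construction: for a unit vector v ∈ Rⁿ, the Householder reflection matrix is H = I − 2vvᵀ. It reflects any vector across the hyperplane orthogonal to v and satisfies H = Hᵀ = H⁻¹.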
4
Q

Anisotropic

A

When the scaling factors across different dimensions are different, the scaling is said to be anisotropic
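For example, the diagonal matrix diag(2, 1/2) scales the first dimension by 2 and the second by 1/2, and is therefore anisotropic; diag(3, 3) is isotropic.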

5
Q

All linear transformations defined by matrix multiplication can be expressed as a sequence of

A

Rotations/reflections, together with a single anisotropic scaling
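This follows from the singular value decomposition: any matrix A can be written as A = UΣVᵀ, where U and V are rotation/reflection matrices and the diagonal matrix Σ performs the single anisotropic scaling.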

6
Q

Dimensionality of a vector space

A

The number of members in every possible basis set of a vector space V is always the same. This value is referred to as the dimensionality of the vector space.

7
Q

Matrix Invertibility and Linear Independence

A

An n × n square matrix A has linearly independent columns/rows if and only if it is invertible

8
Q

The normal equation

A
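Standard form: for the best-fit (least-squares) solution of Ax ≈ b, the normal equation is AᵀAx = Aᵀb; when the columns of A are linearly independent, x = (AᵀA)⁻¹Aᵀb.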
9
Q

Left-inverse of the matrix A in the normal equation

A
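When the n × d matrix A has linearly independent columns, its left-inverse is L = (AᵀA)⁻¹Aᵀ, which satisfies LA = I; applying it to b gives the least-squares (best-fit) solution x = Lb.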
10
Q

Right-inverse of the matrix A in the normal equation

A
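When the n × d matrix A has linearly independent rows, its right-inverse is R = Aᵀ(AAᵀ)⁻¹, which satisfies AR = I; it gives the most concise (minimum-norm) solution x = Rb of Ax = b.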
11
Q

Converting coordinates from one system to another

A
  • A is the original coordinate system
  • B is the target coordinate system
  • x_a and x_b are the vectors being transformed (see the sketch below)
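A plausible reading, with the columns of A and B serving as the basis vectors of the two systems: the same underlying vector satisfies A x_a = B x_b, so the conversion is x_b = B⁻¹A x_a.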
12
Q

The normal equation expressed as a change of coordinate system

A
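One plausible reading: the least-squares solution x = (AᵀA)⁻¹Aᵀb contains the coordinates of the projection of b onto the column space of A, expressed in the (possibly non-orthogonal) coordinate system formed by the columns of A.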
13
Q

Disjoint Vector Spaces

A

Two vector spaces U ⊆ Rⁿ and W ⊆ Rⁿ are disjoint if and only if the two spaces do not contain any vector in common other than the zero vector

14
Q

Orthogonal Vector Spaces

A

Two vector spaces U ⊆ Rⁿ and W ⊆ Rⁿ are orthogonal if and only if, for any pair of vectors u ∈ U and w ∈ W, the dot product of the two vectors is 0

15
Q

Rectangular matrices are said to be of full rank when either the rows or the columns are linearly independent

A

The former is referred to as full row rank, whereas the latter is referred to as full column rank

16
Q

Left Null Space

A

The left null space of an n × d matrix A is the subspace of Rⁿ containing all column vectors x ∈ Rⁿ such that Aᵀx = 0. The left null space of A is the orthogonal complementary subspace of the column space of A

17
Q

The four fundamental subspaces of linear algebra

A
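For an n × d matrix A, the four fundamental subspaces are the column space and the left null space (orthogonal complements in Rⁿ), and the row space and the (right) null space (orthogonal complements in Rᵈ).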
18
Q

Linear Independence and Gram Matrix

A

The matrix AᵀA is said to be the Gram matrix of the column space of an n × d matrix A. The columns of the matrix A are linearly independent if and only if AᵀA is invertible
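Sketch of the reasoning: if AᵀAx = 0, then xᵀAᵀAx = ||Ax||² = 0, so Ax = 0; hence AᵀA fails to be invertible exactly when some nonzero x satisfies Ax = 0, i.e., when the columns of A are linearly dependent.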

19
Q

Orthogonality of Basis Vectors for the discrete cosine transform

A

The dot product of any pair of basis vectors b_p and b_q of the discrete cosine transform, for p ≠ q, is 0
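Under one common convention (the DCT-II form, given here only as a sketch), the p-th basis vector of length n has components b_p[j] = cos(π(j + 1/2)p/n) for j = 0, ..., n − 1, and distinct basis vectors are mutually orthogonal.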

20
Q
A
21
Q

Regularized left and right inverse

A
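In the usual Tikhonov-regularized form with parameter λ > 0: the regularized left-inverse is (AᵀA + λI)⁻¹Aᵀ and the regularized right-inverse is Aᵀ(AAᵀ + λI)⁻¹. Both exist even when A is not of full rank, and both converge to the Moore-Penrose pseudoinverse as λ → 0⁺.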
22
Q

Orthogonal projection of a point onto a line

A
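Standard formula: the orthogonal projection of a point x onto the line spanned by a nonzero vector v (through the origin) is (xᵀv / vᵀv) v, which simplifies to (xᵀv) v when v is a unit vector.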
23
Q

Projection of a point onto a line

A

The Subspace goes in the Subscript (i.e., the notation for the projection of a point onto a subspace places that subspace in the subscript).

24
Q

Moore-Penrose pseudoinverse

A
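One standard characterization: using the singular value decomposition A = UΣVᵀ, the pseudoinverse is A⁺ = VΣ⁺Uᵀ, where Σ⁺ inverts the nonzero singular values and leaves the zero ones in place; equivalently, A⁺ is the limit of the regularized inverse (AᵀA + λI)⁻¹Aᵀ as λ → 0⁺.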
25
Q

Moore-Penrose pseudoinverse relation to other inverses

A
  • The conventional inverse, the left-inverse, and the right-inverse are special cases of the Moore-Penrose pseudoinverse.
  • When the matrix A is invertible, all four inverses are the same.
  • When only the columns of A are linearly independent, the Moore-Penrose pseudoinverse is the left-inverse.
  • When only the rows of A are linearly independent, the Moore-Penrose pseudoinverse is the right-inverse.
  • When neither the rows nor columns of A are linearly independent, the Moore-Penrose pseudoinverse provides a generalized inverse that none of these special cases can provide.
  • Therefore, the Moore-Penrose pseudoinverse respects both the best-fit and the conciseness criteria, like the left- and right-inverses.
26
Q

Orthonormal matrix relation to the identity matrix

A
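For a square matrix V with orthonormal columns, VᵀV = VVᵀ = I, so Vᵀ = V⁻¹; for a rectangular n × d matrix V with orthonormal columns, only VᵀV = I holds in general.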
27
Q

Projection matrix in the normal equation

A
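When the columns of the n × d matrix A are linearly independent, the projection matrix onto the column space of A is P = A(AᵀA)⁻¹Aᵀ; the best-fit approximation of b is then Pb = Ax, with x from the normal equation.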
28
Q

Computing the projection matrix via QR decomposition

A
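Sketch: if A = QR, where Q has orthonormal columns and R is invertible and upper-triangular, then P = A(AᵀA)⁻¹Aᵀ = QR(RᵀR)⁻¹RᵀQᵀ = QQᵀ, so the projection matrix is obtained without inverting AᵀA.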
29
Q

Condition Number

A

Let A be a d × d invertible matrix, and let ||Ax||/||x|| be the scaling ratio of the vector x. Then, the condition number of A is defined as the ratio of the largest scaling ratio of A (over all d-dimensional vectors) to the smallest scaling ratio (over all d-dimensional vectors).
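Equivalently, the condition number is the ratio of the largest to the smallest singular value of A.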

30
Q

Inner Products: Restricted Definition

A

A mapping from x, y ∈ Rⁿ to 〈x, y〉 ∈ R is an inner product if and only if 〈x, y〉 is always equal to the dot product between Ax and Ay for some n × n non-singular matrix A. The inner product can also be expressed using the Gram matrix S = AᵀA as 〈x, y〉 = xᵀSy

31
Q

Inner product: cosines and distances

A

When the linear transformation A is a rotreflection matrix, the matrix S is the identity matrix, and the inner product specializes to the normal dot product
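An inner product also induces cosines and distances in the standard way: cos(θ) = 〈x, y〉 / (√〈x, x〉 · √〈y, y〉) and dist(x, y) = √〈x − y, x − y〉.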

32
Q

Inner-Product: General Definition

A

The real value 〈u, v〉 is an inner product between u and v if it satisfies the following axioms for all u and v:
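The standard axioms (for all vectors u, v, w and scalars c): additivity, 〈u + w, v〉 = 〈u, v〉 + 〈w, v〉; homogeneity, 〈cu, v〉 = c〈u, v〉; symmetry, 〈u, v〉 = 〈v, u〉; and positive definiteness, 〈u, u〉 ≥ 0 with equality only when u = 0.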

33
Q

Complex-valued inner product

A
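Under one common convention (conjugation applied to the second argument), the standard inner product of u, v ∈ Cⁿ is 〈u, v〉 = Σ u_j · conj(v_j) = v*u, which satisfies 〈u, v〉 = conj(〈v, u〉) and 〈u, u〉 ≥ 0.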
34
Q

Conjugate Transpose of Vector and Matrix

A

The conjugate transpose v* of a complex vector v is obtained by transposing the vector and replacing each entry with its complex conjugate. The conjugate transpose V* of a complex matrix V is obtained by transposing the matrix and replacing each entry with its complex conjugate

35
Q
A
36
Q

Orthogonality in Cⁿ

A
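Presumably the standard definition: two vectors u, v ∈ Cⁿ are orthogonal when their complex inner product is zero, i.e., v*u = 0, where v* is the conjugate transpose of v.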
37
Q

Orthogonal Matrix with Complex Entries

A
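Standard definition (such a matrix is usually called unitary): a complex square matrix V whose columns are mutually orthonormal under the complex inner product, so that V*V = VV* = I, where V* is the conjugate transpose of V.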
38
Q

The Discrete Fourier Transform

A

Used for finding an orthonormal basis for time-series in the complex domain
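A common orthonormal form of the basis (conventions on sign and scaling vary): the k-th basis vector of the length-n DFT has components b_k[j] = (1/√n) · e^(2πi·jk/n) for j = 0, ..., n − 1, and distinct basis vectors are orthogonal under the complex inner product.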