Chapter 16 Singular Value Decomposition Flashcards
What’s matrix decomposition/matrix factorization?
P 139
Matrix decomposition, also known as matrix factorization, involves describing a given matrix using its constituent elements.
Perhaps the most known and widely used matrix decomposition method is ____.
P 139
the Singular-Value Decomposition, or SVD
What makes SVD more stable than other methods of matrix factorization?
P 139
All matrices have an SVD, which makes it more stable than other methods, such as the eigendecomposition.
The formula for SVD matrix decomposition is as below:
A = U · Σ · Vh
What is each constituent part size?
And what are the column vectors of U and V called?
P 140
A is the real m × n matrix that we wish to decompose,
U is an m × m matrix,
Σ (sigma) is an m × n diagonal matrix,
Vh is an n × n matrix (the transpose of V).
The columns of the U matrix are called the left-singular vectors of A, and the columns of V are called the right-singular vectors of A.
The diagonal values in the Σ matrix are known as ____
P 140
the singular values of the original matrix A.
The SVD can be calculated by calling the ____ function (from scipy). The function takes a matrix and returns the ____, ____ and ____ elements.
P 140
svd()
U, Σ and Vh
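As a minimal sketch (assuming SciPy is available), calling svd() on a small matrix shows the three returned elements; note that the singular values come back as a 1-D vector, not a matrix:

```python
# scipy's svd() returns U, the singular values as a 1-D vector s,
# and Vh (the transpose of V).
from numpy import array
from scipy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])  # a 3 x 2 matrix (m = 3, n = 2)
U, s, Vh = svd(A)
print(U.shape)   # (3, 3)
print(s.shape)   # (2,) -- a vector of singular values, not a matrix
print(Vh.shape)  # (2, 2)
```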
Can we reconstruct the original matrix directly, using what the svd() function returns?
P 141
The elements returned from svd() cannot be multiplied directly. The s vector of singular values must first be converted into a diagonal matrix using the diag() function, and then placed into an m × n matrix of zeros so the shapes align for multiplication.
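A sketch of this reconstruction step (assuming m ≥ n, as in the example matrix): the singular-value vector is diagonalized, embedded in an m × n zero matrix, and the three factors are multiplied back together.

```python
from numpy import array, diag, zeros, allclose
from scipy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])  # m = 3, n = 2
U, s, Vh = svd(A)

# diag(s) is n x n; place it in the top-left of an m x n matrix of zeros
Sigma = zeros(A.shape)
Sigma[:A.shape[1], :A.shape[1]] = diag(s)

B = U.dot(Sigma).dot(Vh)
print(allclose(A, B))  # True -- A is recovered up to floating-point error
```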
What’s the pseudoinverse?
P 143
The pseudoinverse is the generalization of the matrix inverse from square matrices to rectangular matrices, where the numbers of rows and columns are not equal.
How is pseudoinverse for rectangular matrices calculated?
P 143
The pseudoinverse is denoted as A+, where A is the matrix being inverted. The pseudoinverse is calculated using the singular value decomposition of A:
A+ = VhT · D+ · UT
Where D+ is the pseudoinverse of the diagonal matrix Σ, UT is the transpose of U, and VhT is the transpose of Vh (i.e. the matrix V itself).
NumPy provides the function ____ for calculating the pseudoinverse of a rectangular matrix.
P 144
pinv()
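A short sketch using NumPy's pinv() (which computes the SVD-based formula above internally); the example matrix is illustrative:

```python
from numpy import array, allclose, eye
from numpy.linalg import pinv

A = array([[0.1, 0.2],
           [0.3, 0.4],
           [0.5, 0.6],
           [0.7, 0.8]])  # a rectangular 4 x 2 matrix
A_plus = pinv(A)
print(A_plus.shape)  # (2, 4) -- dimensions are swapped relative to A
# For a full-column-rank A, the pseudoinverse is a left inverse: A+ . A = I
print(allclose(A_plus.dot(A), eye(2)))  # True
```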
How is dimensionality reduction using SVD done?
P 145
To do this, we can perform an SVD operation on the original data and select the top k largest singular values in Σ. These columns can be selected from Σ, and the respective rows selected from Vh.
If B is A after reducing dimension to k, using SVD how is B constructed? (formula)
P 145
Answer ⬇️
It’s constructed as below:
B = U · Σk · Vhk
In practice, we can retain and work with a descriptive subset of the data called T.
This is a dense summary of the matrix or a projection.
T = U · Σk
This transform can be calculated and applied to the original matrix A, as well as to other similar matrices:
T = A · VhTk
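The three formulas above can be sketched as follows (assuming SciPy; the 3 × 10 matrix and k = 2 are illustrative choices):

```python
from numpy import array, diag, zeros, allclose
from scipy.linalg import svd

A = array([[ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10],
           [11, 12, 13, 14, 15, 16, 17, 18, 19, 20],
           [21, 22, 23, 24, 25, 26, 27, 28, 29, 30]])  # 3 x 10
U, s, Vh = svd(A)

# rebuild the full 3 x 10 Sigma, then keep only the top-k parts
Sigma = zeros(A.shape)
Sigma[:A.shape[0], :A.shape[0]] = diag(s)
k = 2
Sigma_k = Sigma[:, :k]   # 3 x 2: top-k columns of Sigma
Vh_k = Vh[:k, :]         # 2 x 10: respective rows of Vh

B = U.dot(Sigma_k).dot(Vh_k)  # rank-k approximation of A
T = U.dot(Sigma_k)            # dense summary of A, 3 x 2
T2 = A.dot(Vh_k.T)            # the same transform applied directly to A
print(allclose(T, T2))  # True
```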
scikit-learn provides a ____ class that implements SVD dimensionality reduction directly.
P 147
TruncatedSVD
from sklearn.decomposition import TruncatedSVD
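A minimal sketch of TruncatedSVD (the matrix and n_components are illustrative); the result is a k-column summary like T above, though the component signs may differ from the manual calculation:

```python
from numpy import array
from sklearn.decomposition import TruncatedSVD

A = array([[ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10],
           [11, 12, 13, 14, 15, 16, 17, 18, 19, 20],
           [21, 22, 23, 24, 25, 26, 27, 28, 29, 30]])

svd_reducer = TruncatedSVD(n_components=2)  # keep the top 2 components
svd_reducer.fit(A)
result = svd_reducer.transform(A)
print(result.shape)  # (3, 2)
```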
Do we need to use all the components of Sigma to reconstruct the original matrix A?
| P 146 code
No. We can use the first k columns of Sigma (those containing the largest singular values) and, correspondingly, the first k rows of Vh to reconstruct the original matrix with good precision.
The maximum k needed to reconstruct the original matrix is min(A.shape); using more than that makes no difference, because the remaining values in the Sigma matrix are all zeros.
The singular values in the Sigma matrix of an SVD decomposition are sorted from largest to smallest. True/False
| P 145
True