0 - Special Topic: Eigenvalues + Eigenvectors Flashcards
Define an eigenvector and eigenvalue in plain English.
Eigenvector = a vector whose direction does not change (it is not rotated or sheared) when a given linear transformation / matrix multiplication is applied
Eigenvalue = the factor by which that eigenvector is stretched or scaled by the transformation
Define an eigenvector and eigenvalue mathematically in one algebraic line.
A·x = lambda · x
where lambda = the eigenvalue
and x = the eigenvector
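The defining equation can be checked numerically. A minimal NumPy sketch (the matrix A below is a made-up example, not from the cards):

```python
import numpy as np

# A hypothetical 2x2 matrix chosen for illustration.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns (eigenvalues, matrix whose columns are eigenvectors).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining equation A·x = lambda·x for each pair.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)  # for a diagonal matrix, the eigenvalues are the diagonal entries
```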
Starting with A·x = lambda·x ,
show how to solve for the eigenvectors
A·x = lambda·x
set to zero
A·x - lambda·x = 0
multiply by the Identity matrix
A·x - lambda·I·x = 0
factor out the eigenvector x
(A - lambda·I) · x = 0
Multiply both sides by the multiplicative inverse (aka - divide)
(A - lambda·I)^-1 · (A - lambda·I) · x = (A - lambda·I)^-1 · 0
x = (A - lambda·I)^-1 · 0
x = 0
However, we want non-trivial solutions to this equation (x ≠ 0, unlike the above), so the matrix (A - lambda·I) must not be invertible. This is equivalent to saying the below:
det(A - lambda·I) = 0
where the determinant of a matrix is the factor by which volumes are scaled after applying the corresponding linear transformation.
The LHS of the equation above is also known as the “characteristic polynomial”.
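As an aside on the determinant's meaning, a minimal NumPy check (the matrix here is a made-up example):

```python
import numpy as np

# A hypothetical shear-and-stretch matrix acting on the plane.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# det(A) is the factor by which A scales areas (volumes in higher dimensions).
det_A = np.linalg.det(A)
print(det_A)  # area scaling factor: 3*2 - 1*0 = 6, up to float rounding
```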
The solutions to this equation are the eigenvalues lambda. We can then plug these eigenvalues lambda back into the equation below to solve for the eigenvectors x.
(A - lambda·I) · x = 0
subtract lambda from each diagonal entry of our matrix A, multiply by our vector x, and set to 0:
(A - lambda·I) · [x1, x2, …] = [0, 0, …]
and solve via the usual Gauss-Jordan Elimination on the augmented matrix
[ A - lambda·I | 0, 0, … ]
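The whole procedure can be sketched in NumPy: build the characteristic polynomial, find its roots, and compare with what `np.linalg.eig` computes directly (the matrix A is a made-up example):

```python
import numpy as np

# Hypothetical example matrix.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# For a 2x2 matrix, det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
lambdas = np.roots(coeffs)        # eigenvalues = roots of the characteristic polynomial

# For each eigenvalue, (A - lambda*I)·x = 0 has a non-trivial solution x;
# np.linalg.eig does the elimination work and returns those eigenvectors.
vals, vecs = np.linalg.eig(A)
assert np.allclose(sorted(lambdas), sorted(vals))
```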
The set of all eigenvectors of a linear transformation, each paired with its corresponding eigenvalue is called the _________ of that transformation.
The set of all eigenvectors of a linear transformation, each paired with its corresponding eigenvalue is called the eigensystem of that transformation.
The set of all eigenvectors of T corresponding to the same eigenvalue, together with the zero vector, is called the ______ or the ________ of T associated with that eigenvalue.
The set of all eigenvectors of T corresponding to the same eigenvalue, together with the zero vector, is called the eigenspace or the characteristic space of T associated with that eigenvalue.
What is the…
- matrix
- characteristic polynomial
- eigenvalues
- eigenvectors
…associated with the following linear transformation?
In Principal Component Analysis (PCA) the method is done on the _____ matrix and the eigenvectors correspond to the ________ and the eigenvalues correspond to the ________.
In Principal Component Analysis (PCA) the method is done on the covariance matrix or correlation matrix (in which each variable is scaled to have its sample variance equal to one); the eigenvectors correspond to the principal components, while the eigenvalues correspond to the variance explained by the components.
In PCA, what do the largest eigenvalues correspond to?
Principal component analysis of the correlation matrix provides an orthogonal basis for the space of the observed data: in this basis, the largest eigenvalues correspond to the principal components associated with most of the covariability in the observed data.
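A minimal PCA sketch via eigendecomposition of the covariance matrix (the data set is synthetic and made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2-D data, strongly correlated along one direction.
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])
X -= X.mean(axis=0)                      # center the data

C = np.cov(X, rowvar=False)              # covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)     # eigh: for symmetric matrices, ascending order

# Largest eigenvalue -> principal component explaining the most variance.
explained = eigvals[::-1] / eigvals.sum()
print(explained)   # fraction of variance explained by each component
```

Here the first component should explain nearly all the variance, since the data were generated with one dominant direction.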
In vibration analysis, the eigenvalues are the ________ of a vibration, while the eigenvectors are the _______.
The eigenvalues are the natural frequencies (or eigenfrequencies) of vibration, and the eigenvectors are the shapes of these vibrational modes.
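A toy sketch of this idea, assuming a hypothetical two-mass, three-spring chain with unit masses and unit stiffness (M·x'' + K·x = 0, so natural frequencies are square roots of the eigenvalues of M⁻¹K):

```python
import numpy as np

# Hypothetical two-mass chain: M·x'' + K·x = 0.
M = np.diag([1.0, 1.0])                 # unit masses (assumption)
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])            # stiffness matrix, unit springs (assumption)

# Eigenvalues of M^-1·K are the squared natural frequencies;
# the eigenvectors are the corresponding mode shapes.
omega_sq, modes = np.linalg.eig(np.linalg.inv(M) @ K)
frequencies = np.sqrt(omega_sq)
print(sorted(frequencies))              # omega = 1 and sqrt(3) for this chain
```

The lower frequency's mode shape has both masses moving together; the higher one has them moving in opposition.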
PCA is used for _______-________. Name one common benefit of this, one common pitfall, and one uncommon benefit.
PCA is used for dimensionality-reduction.
One common benefit is to reduce our computational burden when we have an incredibly large data set.
One common pitfall is that we lose interpretability (since our features are no longer the original features, but projections of them).
One uncommon benefit is that it can help anonymize data against reverse-engineering, since it is hard to expand the compressed feature space back out into the original features.
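The dimensionality-reduction step itself is just a projection onto the top eigenvectors. A minimal sketch on synthetic, made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 5-feature data set; we keep only the top-2 principal components.
X = rng.normal(size=(200, 5))
X -= X.mean(axis=0)

C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order

top2 = eigvecs[:, -2:]                   # eigenvectors of the 2 largest eigenvalues
X_reduced = X @ top2                     # project: 5 features -> 2 features
print(X_reduced.shape)                   # (200, 2)
```

The reduced columns are linear combinations of the originals, which is exactly the interpretability pitfall mentioned above.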
In spectral graph theory, an eigenvalue of a graph is defined as an eigenvalue of the graph’s _______ or ______ matrix. The former is represented by ____, and the latter is calculated from the former by ______.
The kth principal eigenvector of a graph is defined as the eigenvector corresponding to the kth largest or kth smallest eigenvalue of the Laplacian. The first principal eigenvector is used to measure the ______ of its vertices.
In spectral graph theory, an eigenvalue of a graph is defined as an eigenvalue of the graph’s adjacency or Laplacian matrix. The former is represented by A, and the latter is calculated from the former by L = D - A, where D is the diagonal degree matrix.
The kth principal eigenvector of a graph is defined as the eigenvector corresponding to the kth largest or kth smallest eigenvalue of the Laplacian. The first principal eigenvector is used to measure the centrality of its vertices.
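Both matrices can be built and decomposed directly. A minimal sketch on a hypothetical 4-vertex path graph (0-1-2-3):

```python
import numpy as np

# Hypothetical path graph 0-1-2-3, given by its adjacency matrix A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))    # degree matrix
L = D - A                     # graph Laplacian

# The smallest Laplacian eigenvalue of a connected graph is 0.
lap_vals = np.linalg.eigvalsh(L)
assert np.isclose(lap_vals[0], 0.0)

# Eigenvector centrality: the eigenvector of A for its largest eigenvalue.
adj_vals, adj_vecs = np.linalg.eigh(A)
centrality = np.abs(adj_vecs[:, -1])   # entries rank vertices by centrality
print(centrality)                      # the two middle vertices score highest
```

For a path graph, the two interior vertices get the highest centrality scores, matching the intuition that they are the best connected.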