Test 3 Flashcards
If Ax = λx for some vector x, then λ is an eigenvalue of A
FALSE - the vector x must be nonzero; the zero vector satisfies A0 = λ0 for every scalar λ, so the equation must have a nontrivial solution
If Ax = λx for some scalar λ, then x is an eigenvector of A
FALSE - by definition an eigenvector must be nonzero; x = 0 satisfies Ax = λx for every λ but is not an eigenvector
A matrix A is invertible if and only if 0 is an eigenvalue of A
FALSE - it is the reverse: A is invertible if and only if 0 is NOT an eigenvalue of A
Finding an Eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy
TRUE - just compute Ax and check whether the result is a scalar multiple of x
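Quick check: for A = [[2,0],[0,3]] and x = (1,0), Ax = (2,0) = 2x, so x is an eigenvector with eigenvalue 2; for x = (1,1), Ax = (2,3), which is not a multiple of (1,1), so (1,1) is not an eigenvector.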
To find eigenvalues of A, reduce A to echelon form
FALSE - row operations change the eigenvalues; the eigenvalues come from the characteristic equation det(A − λI) = 0
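Example: A = [[0,1],[1,0]] has eigenvalues 1 and −1 (det(A − λI) = λ² − 1 = 0), but its reduced echelon form is the identity matrix, whose only eigenvalue is 1.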
If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues
FALSE - two linearly independent eigenvectors can share the same eigenvalue (any eigenspace of dimension ≥ 2); the converse IS true: eigenvectors for distinct eigenvalues are linearly independent
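Example: for the 2×2 identity matrix, e1 = (1,0) and e2 = (0,1) are linearly independent eigenvectors that both correspond to the eigenvalue 1.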
If v is an eigenvector with eigenvalue 2, then 2v is an eigenvector with eigenvalue 4.
FALSE - any nonzero scalar multiple of an eigenvector is an eigenvector for the SAME eigenvalue, so 2v has eigenvalue 2, not 4
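Why: A(2v) = 2(Av) = 2(2v), so A sends 2v to 2 times itself; the eigenvalue is still 2.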
If 0 is an eigenvalue of A, then A is invertible
FALSE - the implication goes the other way: if 0 is an eigenvalue, then Ax = 0 has a nontrivial solution, so A is NOT invertible (and conversely, a non-invertible A must have 0 as an eigenvalue)
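Equivalently: 0 is an eigenvalue of A exactly when det(A − 0·I) = det(A) = 0, which is exactly when A is not invertible.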
The matrices A and B⁻¹AB have the same sets of eigenvalues for every invertible matrix B
TRUE - A and B⁻¹AB are similar, and similar matrices have the same characteristic polynomial, hence the same eigenvalues
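Derivation: det(B⁻¹AB − λI) = det(B⁻¹(A − λI)B) = det(B⁻¹)·det(A − λI)·det(B) = det(A − λI), so the two characteristic polynomials are identical.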
A is diagonalizable if A = PDP⁻¹ for some matrix D and some invertible matrix P
FALSE - D must be a DIAGONAL matrix; as stated, the condition is satisfied by every matrix (take P = I and D = A)
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities
FALSE - counting multiplicities, every n×n matrix has n (possibly complex) eigenvalues; the correct criterion is that A has n linearly independent eigenvectors (equivalently, the eigenvectors form a basis of Rⁿ)
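Example: A = [[1,1],[0,1]] has the eigenvalue 1 with multiplicity 2, but its eigenspace is only one-dimensional (spanned by (1,0)), so A is not diagonalizable.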
A is diagonalizable if A has n eigenvectors
FALSE - the n eigenvectors must be linearly independent; any nonzero multiple of an eigenvector is again an eigenvector, so merely "having n eigenvectors" says nothing
Similar matrices have the same eigenvalues
TRUE - if B = P⁻¹AP, then A and B have the same characteristic polynomial; similarity changes the basis in which the transformation is represented, not its eigenvalues
Similar matrices have the same eigenvectors
FALSE - similar matrices share the same "stretching factors" (eigenvalues), but the specific directions along which the stretching occurs (the eigenvectors) generally differ
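Why: if Av = λv and B = P⁻¹AP, then B(P⁻¹v) = P⁻¹Av = λ(P⁻¹v), so the corresponding eigenvector of B is P⁻¹v, which in general is not v.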
Only linear transformations on finite-dimensional vector spaces have eigenvectors
FALSE - the concept is not limited to finite dimensions; linear operators on infinite-dimensional spaces can have eigenvectors too (a separate caveat: a real matrix may have no real eigenvectors when its eigenvalues are complex)
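Example: on the (infinite-dimensional) space of smooth functions, the differentiation operator d/dx satisfies d/dx(e^(λx)) = λ·e^(λx), so e^(λx) is an eigenvector (eigenfunction) with eigenvalue λ.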