eigensystems and canonical forms take 2 Flashcards
eigenvector:
a vector x is an eigenvector of A (a square matrix) if x is nonzero and Ax=λx where λ is a scalar
eigenvalue:
a scalar λ is an eigenvalue of A (a square matrix) if Ax=λx for some nonzero vector x - note λ itself may be zero (0 is an eigenvalue iff A is singular)
eigenpair:
an eigenvector and its associated eigenvalue; each eigenvector has exactly one eigenvalue, but each eigenvalue has infinitely many associated eigenvectors (any nonzero scalar multiple of an eigenvector is again an eigenvector)
characteristic polynomial:
p(λ)=det(λI-A); setting p(λ)=0 gives the characteristic equation, whose roots are the eigenvalues of A
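a quick numerical sanity check: numpy can recover the characteristic polynomial coefficients and its roots (the 2x2 matrix here is a made-up example):

```python
import numpy as np

# made-up symmetric 2x2 example
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly gives the coefficients of det(lambda*I - A), highest degree first
coeffs = np.poly(A)          # -> [1., -4., 3.], i.e. p(lambda) = lambda^2 - 4*lambda + 3
eigvals = np.roots(coeffs)   # the roots of p are the eigenvalues: 3 and 1

# cross-check against the direct eigenvalue routine
assert np.allclose(np.sort(eigvals), np.sort(np.linalg.eigvals(A)))
```

for anything beyond tiny examples, computing eigenvalues via polynomial roots is numerically bad - np.linalg.eigvals is the practical route.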
spectrum:
set of all eigenvalues of A, written Λ(A)
algebraic multiplicity:
of λ, its multiplicity as a zero of the characteristic polynomial, i.e. the number of times λ appears as a root of p(λ)
invariant subspace:
a subspace X ⊆ C^n is invariant for A if AX ⊆ X, that is, x in X implies Ax in X
matrices and subspaces:
let the columns of X in C^(nxp), p<=n, form a basis for a subspace Y of C^n, then Y is an invariant subspace for A iff AX=XB for some B in C^(pxp). when the latter holds, the spectrum of B is contained within the spectrum of A
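a numerical check of this, using a made-up block upper triangular A whose first two coordinate directions span an invariant subspace:

```python
import numpy as np

# made-up example: span{e1, e2} is invariant because the lower-left block is zero
A = np.array([[1.0, 2.0, 7.0],
              [0.0, 3.0, 8.0],
              [0.0, 0.0, 5.0]])
X = np.eye(3)[:, :2]               # basis for the subspace span{e1, e2}

# solve AX = XB for B (least squares recovers the exact B here)
B, *_ = np.linalg.lstsq(X, A @ X, rcond=None)

assert np.allclose(A @ X, X @ B)   # invariance: AX = XB holds
# Lambda(B) = {1, 3} is contained in Lambda(A) = {1, 3, 5}
print(np.sort(np.linalg.eigvals(B)), np.sort(np.linalg.eigvals(A)))
```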
similar:
A and B (both square) are similar if there exists a nonsingular matrix P such that B=P^(-1)AP
similarity transformation:
P^(-1)AP
transforming matrix:
P in a similarity transformation
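a small made-up example checking that a similarity transformation preserves the spectrum:

```python
import numpy as np

# hand-picked example: A is triangular, so its eigenvalues (2, 3) are visible
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])           # nonsingular transforming matrix

B = np.linalg.inv(P) @ A @ P         # similarity transformation P^(-1)AP

# similar matrices share the same spectrum
assert np.allclose(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(B)))
print(np.sort(np.linalg.eigvals(B)))   # [2. 3.]
```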
unitarily similar:
A and B are unitarily similar if B=U*AU for some unitary matrix U
orthogonally similar:
A and B are real, B=U^(T)AU where U is a real orthogonal matrix
diagonalisable:
if a matrix A is similar to a diagonal matrix, A is diagonalisable or simple
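in floating point this can be checked with np.linalg.eig (the 2x2 matrix is an arbitrary example with distinct eigenvalues):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # distinct eigenvalues (5 and 2), hence diagonalisable

lam, V = np.linalg.eig(A)            # columns of V are eigenvectors
D = np.diag(lam)

# V is nonsingular, so V^(-1) A V = D: A is similar to a diagonal matrix
assert np.allclose(np.linalg.inv(V) @ A @ V, D)
assert np.allclose(A, V @ D @ np.linalg.inv(V))
```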
schur’s theorem:
let A be square, then there exists a unitary matrix U and an upper triangular matrix T such that T=U^(-1)AU=U*AU
schur decomposition:
A=UTU*, equivalent to T=U^(-1)AU
schur vectors:
the columns of U
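scipy computes the schur decomposition directly; a sketch on an arbitrary 3x3 example (output='complex' forces a genuinely triangular T, since the real schur form is only quasi-triangular when A has complex eigenvalues):

```python
import numpy as np
from scipy.linalg import schur

# arbitrary made-up example
A = np.array([[0.0, 2.0, 2.0],
              [0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0]])

T, U = schur(A, output='complex')

assert np.allclose(A, U @ T @ U.conj().T)       # A = U T U*
assert np.allclose(U.conj().T @ U, np.eye(3))   # U is unitary (its columns are the schur vectors)
assert np.allclose(np.tril(T, -1), 0)           # T is upper triangular
```

the eigenvalues of A appear on the diagonal of T.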
normal:
a matrix is normal if A*A=AA*
spectral theorem:
let A be square, then A is normal iff there exists a unitary matrix U and diagonal matrix Λ such that A=UΛU*
normalcy and orthogonal vectors:
an nxn matrix A is normal iff it has n orthonormal eigenvectors, i.e. an orthonormal basis of eigenvectors
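a made-up check on a rotation matrix, which is normal but not symmetric:

```python
import numpy as np

# a 90-degree rotation: normal (A*A = AA*) but not symmetric
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

assert np.allclose(A.conj().T @ A, A @ A.conj().T)   # A is normal

lam, V = np.linalg.eig(A)        # eigenvalues are +i and -i
# distinct eigenvalues of a normal matrix give orthogonal eigenvectors,
# so V is unitary and A = V diag(lam) V*
assert np.allclose(V.conj().T @ V, np.eye(2))
assert np.allclose(A, V @ np.diag(lam) @ V.conj().T)
```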
diagonalisability and eigenvectors:
a square matrix A is diagonalisable iff A has n linearly independent eigenvectors
diagonalisability and eigenvalues:
an nxn matrix with n distinct eigenvalues is diagonalisable (the converse is false - e.g. the identity is diagonalisable with a repeated eigenvalue)
the jordan canonical form:
any square matrix can be expressed in the form X^(-1)AX=J=
[J1(λ1)
    …
       Jp(λp)]
where each diagonal block is
Jk=Jk(λk)=
[λk 1
    λk …
        … 1
          λk] in C^(mkxmk)
where X is nonsingular and m1+…+mp=n
jordan block:
the mkxmk matrices Jk(λk) on the diagonal of the jcf
number of jordan blocks:
the number p of jordan blocks is the number of linearly independent eigenvectors of A, so A is diagonalisable iff p=n
algebraic multiplicity of an eigenvalue:
the algebraic multiplicity of a given eigenvalue λ is the sum of dimensions of the jordan blocks in which it appears
geometric multiplicity of an eigenvalue:
the geometric multiplicity of a given eigenvalue λ is the number of associated jordan blocks, so the number of associated linearly independent eigenvectors, so dim(null(A-λI))
defective (eigenvalue):
if the eigenvalue appears in a jordan block of size greater than 1, or equivalently if its algebraic multiplicity > its geometric multiplicity
defective (matrix):
a matrix is defective if it has a defective eigenvalue, or equivalently if it does not have a complete set of linearly independent eigenvectors
how to find the jcf (but not the transforming matrix X, just J itself):
find all the distinct eigenvalues, e.g. as the roots of the characteristic polynomial
for each distinct eigenvalue λi of A, form (A-λiI), (A-λiI)^2, … and analyse the sequence of ranks:
- the smallest ki for which rank((A-λiI)^ki) attains its minimum value is the order of the largest jordan block with eigenvalue λi, called the index of λi
- the number of jordan blocks of size k with eigenvalue λi is rank((A-λiI)^(k-1)) + rank((A-λiI)^(k+1)) - 2rank((A-λiI)^k), with the convention rank((A-λiI)^0) = n
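the rank recipe above can be sketched in numpy - the helper name jordan_block_counts and the test matrix are made up for illustration (the jordan structure is hidden behind a similarity transformation and then recovered from ranks alone):

```python
import numpy as np

def jordan_block_counts(A, lam, tol=1e-8):
    """Count jordan blocks of each size for eigenvalue lam, via the rank formula:
    #blocks of size k = rank(M^(k-1)) + rank(M^(k+1)) - 2*rank(M^k),
    where M = A - lam*I and rank(M^0) = n."""
    n = A.shape[0]
    M = A - lam * np.eye(n)
    ranks = [n]                      # ranks of M^0, M^1, M^2, ...
    P = np.eye(n)
    while True:
        P = P @ M
        ranks.append(np.linalg.matrix_rank(P, tol=tol))
        if ranks[-1] == ranks[-2]:   # ranks have stabilised at their minimum
            break
    index = len(ranks) - 2           # order of the largest block for lam
    counts = {k: ranks[k - 1] + ranks[k + 1] - 2 * ranks[k]
              for k in range(1, index + 1)}
    return {k: c for k, c in counts.items() if c > 0}

# made-up test matrix: jordan form with blocks J2(2), J1(2), J1(5),
# hidden by a similarity transformation A = X J X^(-1)
J = np.array([[2.0, 1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 5.0]])
X = np.eye(4) + np.triu(np.ones((4, 4)), 1)   # nonsingular, det = 1
A = X @ J @ np.linalg.inv(X)

print(jordan_block_counts(A, 2.0))   # {1: 1, 2: 1}: one 1x1 and one 2x2 block
print(jordan_block_counts(A, 5.0))   # {1: 1}: a single 1x1 block
```

note that ranks are computed with a tolerance - in floating point this procedure is fragile, which is why jordan forms are usually computed symbolically if at all.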
generalised eigenvectors:
take the X in X^(-1)AX=J (J = the jcf) - the columns of X in positions 1, m1+1, m1+m2+1, …, m1+…+m(p-1)+1 are linearly independent eigenvectors of A; the other columns are the generalised eigenvectors
jordan chain:
equating the first m1 columns on each side of AX=XJ (the part corresponding to the first jordan block J1) gives Ax1=λ1x1 and Axi=λ1xi+x(i-1) for i=2,…,m1 - the vectors x1,x2,…,xm1 are called a jordan chain. the columns of X form p jordan chains; the next one is x(m1+1),…,x(m1+m2), and so on
cayley-hamilton theorem:
if p is the characteristic polynomial of an nxn matrix A, then p(A)=0
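a quick numerical check of cayley-hamilton on an arbitrary 2x2 matrix, evaluating p(A) by horner's rule with matrix arguments:

```python
import numpy as np

# arbitrary made-up example
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

c = np.poly(A)                 # characteristic polynomial coefficients, highest first
# evaluate p(A) by horner's rule: pA = ((c0*A + c1*I)*A + c2*I) ...
pA = np.zeros_like(A)
for coeff in c:
    pA = pA @ A + coeff * np.eye(2)

assert np.allclose(pA, 0)      # cayley-hamilton: p(A) = 0
```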
minimal polynomial:
let A be an nxn matrix with s distinct eigenvalues λ1,…,λs. the minimal polynomial of A is q(λ)=Π(i=1 to s)(λ-λi)^ni, where ni is the dimension of the largest jordan block in which λi appears - q is the monic polynomial of lowest degree such that q(A)=0
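a tiny made-up illustration: [[2,1],[0,2]] and 2I share the characteristic polynomial (λ-2)^2, but their minimal polynomials differ with the largest jordan block size:

```python
import numpy as np

I = np.eye(2)
D = 2.0 * I                          # two J1(2) blocks: largest block size 1
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])           # a single J2(2) block: largest block size 2

# both have characteristic polynomial (lambda - 2)^2, but:
assert np.allclose(D - 2 * I, 0)                  # q(lambda) = (lambda - 2) kills D
assert not np.allclose(B - 2 * I, 0)              # ...but not B
assert np.allclose((B - 2 * I) @ (B - 2 * I), 0)  # B needs q(lambda) = (lambda - 2)^2
```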