final exam review Flashcards
solution for continuous dynamical systems with eigenvalues p +/- iq
for the linear system dx/dt = Ax, where A is a 2 x 2 matrix with eigenvalues p +/- iq (and q != 0),
consider an eigenvector v + iw with eigenvalue p + iq.
Then x(t) = e^(pt) * S * [ cos(qt) -sin(qt) | sin(qt) cos(qt) ] * S^-1 * x0,
where S = [w v]. (S^-1 * x0) is the coordinate vector of x0 with respect to the basis w, v.
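A quick numerical sketch of this card (the matrix and initial value are my own toy example, not from the card): the real-form solution e^(pt) S R(qt) S^-1 x0 should agree with the matrix exponential computed via a complex eigendecomposition.

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [2.0,  1.0]])          # eigenvalues 1 +/- 2i, so p = 1, q = 2
p, q = 1.0, 2.0
# an eigenvector for 1 + 2i is v + iw = [1, -i], so v = [1, 0], w = [0, -1]
v = np.array([1.0, 0.0])
w = np.array([0.0, -1.0])
S = np.column_stack([w, v])          # S = [w v]

def x_formula(t, x0):
    R = np.array([[np.cos(q*t), -np.sin(q*t)],
                  [np.sin(q*t),  np.cos(q*t)]])
    return np.exp(p*t) * S @ R @ np.linalg.solve(S, x0)

def x_expm(t, x0):
    # e^{At} via A = V diag(lam) V^{-1} (A is diagonalizable over C)
    lam, V = np.linalg.eig(A)
    E = V @ np.diag(np.exp(lam*t)) @ np.linalg.inv(V)
    return (E @ x0).real

x0 = np.array([3.0, -1.0])
print(np.allclose(x_formula(0.7, x0), x_expm(0.7, x0)))   # True
```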
trajectories of continuous dynamical systems with eigenvalues p +/- iq
ellipses (linearly distorted circles) if p = 0
spirals inward if p is negative
spirals outward if p is positive.
stability of continuous dynamical systems
for the system dx/dt = Ax where A is a real 2 x 2 matrix, the zero state is an asymptotically stable equilibrium solution iff tr A < 0 and det A > 0. Equivalently (and for A of any size): iff the real parts of all eigenvalues of A are negative.
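A sketch checking that the two criteria on this card agree, using random 2 x 2 matrices (my own test, not from the card):

```python
import numpy as np

# for 2x2 matrices, tr A < 0 and det A > 0 should hold exactly when
# both eigenvalues have negative real part
rng = np.random.default_rng(0)
for _ in range(1000):
    A = rng.normal(size=(2, 2))
    trace_det = (np.trace(A) < 0) and (np.linalg.det(A) > 0)
    eig_test = bool(np.all(np.linalg.eigvals(A).real < 0))
    assert trace_det == eig_test
print("criteria agree")
```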
euler’s formula
e^it = cos(t) + i sin(t)
complex exponential function characterization
if Y is a complex number, then z = e^(Yt) is the unique complex-valued function such that
dz/dt = Yz and z(0) = 1
general solution of a continuous dynamical system
for the system dx/dt = Ax, suppose there is an eigenbasis v1, …, vn for A, with associated eigenvalues Y1, …, Yn.
Then the general solution of the system is
x(t) = c1*e^(Y1*t)*v1 + … + cn*e^(Yn*t)*vn
OR
x(t) = [eigenbasis as columns] * [diagonal matrix of e^(Yi*t)] * [coordinate vector c of x0]
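A toy check of this card (the matrix, eigenbasis, and x0 are my own example): build x(t) from the eigenbasis and verify that it actually satisfies dx/dt = Ax and x(0) = x0.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams = np.array([3.0, 1.0])          # eigenvalues of A
S = np.array([[1.0,  1.0],
              [1.0, -1.0]])          # columns are the eigenvectors v1, v2
x0 = np.array([2.0, 0.0])
c = np.linalg.solve(S, x0)           # coordinates of x0 in the eigenbasis

def x(t):
    # c1 e^{Y1 t} v1 + c2 e^{Y2 t} v2, written as S diag(e^{Yi t}) c
    return S @ (c * np.exp(lams * t))

def dxdt(t):
    return S @ (c * lams * np.exp(lams * t))

t = 0.5
print(np.allclose(dxdt(t), A @ x(t)), np.allclose(x(0.0), x0))   # True True
```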
how to solve a linear differential equation
for the linear differential equation dx/dt = kx, with initial value x0,
the solution is x(t) = (e^kt)x0
ways a linear dynamical system can be modeled
discrete: x(t + 1) = Bx(t)
continuous: dx/dt = Ax
singular value decomposition
any n x m matrix A can be written as A = U Z V^T.
Z is the matrix with the singular values on its diagonal. (It always has the same dimensions as the original matrix.) The singular values are the square roots of the eigenvalues of the symmetric matrix A^T A.
The columns of V form an orthonormal eigenbasis of A^T A. Since A^T A is symmetric, eigenvectors with distinct eigenvalues are already perpendicular, so orthonormalizing usually just means scaling them to unit length.
Each column vector of U is produced by
ui = 1/(singular value i) * A * vi
for each i with a nonzero singular value (extend to an orthonormal basis of Rn if needed).
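A sketch of the recipe above on a small example matrix of my own choosing: build a singular value decomposition by hand from the eigendecomposition of A^T A and check that U Z V^T reproduces A.

```python
import numpy as np

A = np.array([[ 6.0, 2.0],
              [-7.0, 6.0]])
evals, V = np.linalg.eigh(A.T @ A)       # eigenvalues in ascending order
order = np.argsort(evals)[::-1]          # singular values go largest first
evals, V = evals[order], V[:, order]
sigma = np.sqrt(evals)                   # singular values (10 and 5 here)
U = (A @ V) / sigma                      # u_i = (1/sigma_i) A v_i
print(np.allclose(U @ np.diag(sigma) @ V.T, A))   # True
print(np.allclose(U.T @ U, np.eye(2)))            # True: U is orthogonal
```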
significance of singular value decompositions
for L(x) = Ax (a linear transformation from Rm to Rn), there is an orthonormal basis v1, v2, …, vm of Rm such that
the vectors L(v1), …, L(vm) are orthogonal and their lengths are the singular values of the matrix A.
v1, …, vm form the orthonormal eigenbasis of A^T A - the V in the singular value decomposition.
definiteness of a quadratic form
For a quadratic form q(x) = x * Ax, where A is a symmetric n x n matrix,
A is positive definite if q(x) is positive for all nonzero x,
positive semidefinite if q(x) >= 0 for all x,
and negative definite/semidefinite analogously.
Indefinite otherwise (q takes both signs).
A symmetric matrix is positive definite iff all of its eigenvalues are positive, positive semidefinite iff all are >= 0, and so forth.
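The eigenvalue test can be sketched as a small helper (the function name and test matrices are mine, for illustration):

```python
import numpy as np

def definiteness(A):
    # classify a symmetric matrix by the signs of its (real) eigenvalues
    lam = np.linalg.eigvalsh(A)
    if np.all(lam > 0):  return "positive definite"
    if np.all(lam >= 0): return "positive semidefinite"
    if np.all(lam < 0):  return "negative definite"
    if np.all(lam <= 0): return "negative semidefinite"
    return "indefinite"

print(definiteness(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(definiteness(np.array([[1.0, 0.0], [0.0, -3.0]])))  # indefinite
```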
how to diagonalize a quadratic form
for q(x) = x * Ax, find an orthonormal eigenbasis B = v1, …, vn with eigenvalues Y1, …, Yn.
Then q(x) = Y1*c1^2 + … + Yn*cn^2,
where the ci are the coordinates of x with respect to B.
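A numerical check of this identity on an example of my own: the quadratic form evaluated directly should equal the weighted sum of squared eigenbasis coordinates.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
lam, B = np.linalg.eigh(A)     # orthonormal eigenbasis in the columns of B
x = np.array([2.0, -1.0])
c = B.T @ x                    # coordinates of x (B orthogonal, so B^-1 = B^T)
print(np.isclose(x @ A @ x, np.sum(lam * c**2)))   # True
```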
principal axes
the eigenspaces of A are the principal axes of q when q(x) = x * Ax.
They are one-dimensional when A has n distinct eigenvalues.
orthogonal diagonalization of a symmetric matrix A
find the eigenvalues of A and a basis of each eigenspace.
Use Gram-Schmidt to find an orthonormal basis of each eigenspace.
Form an orthonormal eigenbasis v1, v2, …, vn for A by concatenating the orthonormal bases found.
S = [v1 v2 … vn] is orthogonal and S^-1AS will be diagonal. Computing the latter finishes the diagonalization.
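The steps above can be sketched numerically (my example matrix; note np.linalg.eigh already returns an orthonormal eigenbasis for a symmetric matrix, so the Gram-Schmidt step is done for us):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, S = np.linalg.eigh(A)               # columns of S: orthonormal eigenbasis
D = S.T @ A @ S                          # S^{-1} = S^T since S is orthogonal
print(np.allclose(D, np.diag(lam)))      # True: D is diagonal
print(np.allclose(S.T @ S, np.eye(3)))   # True: S is orthogonal
```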
eigenvalues of a symmetric matrix
a symmetric n x n matrix A has n real eigenvalues if they are counted with their algebraic multiplicities
spectral theorem
a matrix A is orthogonally diagonalizable (there exists an orthogonal matrix S such that S^-1AS = S^TAS is diagonal) iff A is symmetric (A^T = A)
orthogonality of eigenvectors
if A is symmetric and v1 and v2 are eigenvectors of A with distinct eigenvalues, then v1 * v2 = 0, i.e. the two are orthogonal.
modulus and argument
modulus: |z|, where z = b + ai, equals sqrt(b^2 + a^2)
argument: the polar angle of the complex number.
Found by drawing [b a] and then using trig to find the angle (usually involves arctan(a/b)).
arg(zw) = arg(z) + arg(w)
polar form
z = r(cos(Z) + i*sin(Z))
where Z is the polar angle (argument) and r is the modulus of the complex number
de moivre’s formula
(cosZ + isinZ)^n = cos(nZ) + isin(nZ)
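A one-liner numerical check of de Moivre's formula (sample angle and power are mine):

```python
import cmath

Z, n = 0.7, 5
z = cmath.rect(1.0, Z)                   # cos(Z) + i sin(Z), modulus 1
lhs = z ** n                             # (cos Z + i sin Z)^n
rhs = cmath.rect(1.0, n * Z)             # cos(nZ) + i sin(nZ)
print(abs(lhs - rhs) < 1e-12)            # True
```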
trace, determinant, complex eigenvalues
tr A = Y1 + … + Yn
det A = Y1 * … * Yn
when all (complex) eigenvalues are counted with their algebraic multiplicities.
stability and eigenvalues
for a dynamical system x(t + 1) = Ax(t), the zero state is asymptotically stable iff the moduli of all the complex eigenvalues of A are less than 1.
discrete model with complex eigenvalues - solution
for x(t + 1) = Ax(t), with eigenvalues p +/- iq = r(cos(Z) +/- i*sin(Z)),
let v + iw be an eigenvector of A with eigenvalue p + iq. Then
x(t) = r^t * S * [ cos(Zt) -sin(Zt) | sin(Zt) cos(Zt) ] * S^-1 * x0
where S = [w v]
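A sketch checking the discrete closed form against direct iteration (the matrix and x0 are my own example):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [2.0,  1.0]])              # eigenvalues 1 +/- 2i
r, Z = np.hypot(1.0, 2.0), np.arctan2(2.0, 1.0)
# an eigenvector for 1 + 2i is v + iw = [1, -i]: v = [1, 0], w = [0, -1]
S = np.column_stack([[0.0, -1.0], [1.0, 0.0]])   # S = [w v]
x0 = np.array([1.0, 1.0])

def closed_form(t):
    R = np.array([[np.cos(Z*t), -np.sin(Z*t)],
                  [np.sin(Z*t),  np.cos(Z*t)]])
    return r**t * S @ R @ np.linalg.solve(S, x0)

x = x0.copy()
for _ in range(4):
    x = A @ x                            # iterate up to x(4)
print(np.allclose(x, closed_form(4)))    # True
```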
how to determine if a matrix is diagonalizable
we wish to find an invertible matrix S such that S^-1 A S = B is diagonal.
a. find the eigenvalues of A
b. for each eigenvalue, find a basis of its eigenspace
c. A is diagonalizable iff the dimensions of the eigenspaces add up to n.
If so, S = [v1 … vn] (the concatenated eigenbasis) and B = the diagonal matrix of the eigenvalues.
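The recipe, sketched on a small example of my own (np.linalg.eig gathers the eigenvectors into the columns of S for us):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])               # eigenvalues 5 and 2 (distinct)
lam, S = np.linalg.eig(A)                # columns of S are eigenvectors
# diagonalizable iff the eigenvectors span R^n, i.e. S is invertible
assert np.linalg.matrix_rank(S) == A.shape[0]
B = np.linalg.inv(S) @ A @ S
print(np.allclose(B, np.diag(lam)))      # True
```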
eigenvalues of similar matrices
if A is similar to B,
A and B have the same characteristic polynomial,
rank A = rank B and nullity A = nullity B,
A and B have the same eigenvalues (with the same algebraic multiplicities), etc,
same determinant, same trace
characterizations of invertibility
Ax = b has a unique solution for all b in Rn
rref A = In
rank A = n
im A = Rn
ker A = {0}
columns of A form a basis of Rn
columns of A span Rn
columns of A are linearly independent
det A != 0
0 fails to be an eigenvalue of A
volume of a parallelepiped
for a matrix A = [v1 v2 v3] and the parallelepiped defined by its column vectors, the volume is |det A|.
For an n x m matrix A (m vectors in Rn), the m-dimensional volume is sqrt(det(A^T * A)).
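Both formulas checked on vectors of my own choosing: a 3 x 3 box via |det A|, and a parallelogram spanned by two vectors in R^3 via sqrt(det(A^T A)).

```python
import numpy as np

# parallelepiped spanned by three vectors in R^3 (volume 1*2*3 = 6)
A3 = np.column_stack([[1.0, 0.0, 0.0], [1.0, 2.0, 0.0], [0.0, 1.0, 3.0]])
print(np.isclose(abs(np.linalg.det(A3)), 6.0))    # True

# parallelogram spanned by two vectors in R^3 (area 1*2 = 2)
A = np.column_stack([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
area = np.sqrt(np.linalg.det(A.T @ A))
print(np.isclose(area, 2.0))                       # True
```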
expansion factor
|det A| = (area of T(Ω)) / (area of Ω) for the transformation T(x) = Ax and a region Ω
determinant of the transpose
if A is square, det(A^T) = det A
elementary row operations and determinants
if B comes from dividing a row of A by a scalar k, det B = (1/k) det A
row swap: det B = -det A
adding a multiple of a row to another row: det B = det A
determinants of powers and products
det(AB) = det A * det B
hence det(A^n) = (det A)^n
determinants of similar matrices
det A = det B
determinant of the inverse of a matrix
det(A^-1) = 1/detA
imA and A^T
(im A)^perp = ker(A^T)
^perp is the orthogonal complement: the set of vectors x in Rn orthogonal to all vectors in the subspace (here, im A)
kernels and transpose
ker A = ker(A^T A); if ker(A) = {0}, then A^T A is invertible.
least squares solution
for Ax = b, a least-squares solution is one that minimizes ||b - Ax|| over all x in R^m.
the least-squares solutions are the exact solutions of the (always consistent) system
A^T A x = A^T b. That's the normal equation of Ax = b.
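A quick check on an inconsistent system of my own: solving the normal equation should match numpy's built-in least-squares solver.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])            # Ax = b has no exact solution
x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equation A^T A x = A^T b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))    # True
```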
the matrix of the orthogonal projection using the transpose
A (A^T A)^-1 A^T is the matrix of the orthogonal projection onto im A (when the columns of A are linearly independent).
matrix of orthogonal projection found by orthonormal basis
for a subspace V with orthonormal basis u1, …, um, the matrix P of the orthogonal projection onto V is
P = QQ^T, where Q is the matrix with columns u1, …, um
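The two projection formulas should produce the same matrix; a sketch on my own example (np.linalg.qr supplies the orthonormal basis of im A):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
P1 = A @ np.linalg.inv(A.T @ A) @ A.T    # A (A^T A)^{-1} A^T
Q, _ = np.linalg.qr(A)                   # columns of Q: orthonormal basis of im A
P2 = Q @ Q.T                             # Q Q^T
print(np.allclose(P1, P2), np.allclose(P1 @ P1, P1))   # True True
```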
properties of the transpose
(A + B)^T = A^T + B^T
(kA)^T = k(A^T)
(AB)^T = B^T A^T
rank(A^T) = rank A
(A^T)^-1 = (A^-1)^T
characteristics of orthogonal matrices
A is orthogonal.
L(x) = Ax preserves length.
columns of A form an orthonormal basis of Rn
A^T A = In
A^-1 = A^T
A preserves the dot product for all x and y: (Ax) * (Ay) = x * y
angle between two vectors
arccos [ (x * y) / (||x|| ||y||) ]
the cosine of this angle is the correlation coefficient (for de-meaned data vectors)
formula for the orthogonal projection
for an orthonormal basis u1, …, um of V (this is a direct formula, different from Gram-Schmidt):
proj_V(x) = (u1 * x)u1 + … + (um * x)um
coordinates in a subspace of Rn
[x]_B is the coordinate vector (c1, …, cm) of x = c1v1 + c2v2 + … + cmvm,
where the coordinates express x in terms of the vectors of the basis B of V.
Coordinates are linear: [x + y]_B = [x]_B + [y]_B and [kx]_B = k[x]_B.
rank-nullity theorem
dim(ker A) + dim(im A) = m for any n x m matrix A.
matrix for orthogonal projection onto a line
[ u1^2   u1u2
  u1u2   u2^2 ]
where [u1 u2] is a unit vector parallel to L, the line being projected onto
also proj_L(x) = (u * x)u
reflection
[ a   b
  b  -a ]
found with 2*projL(x) - x for a specific line; a^2 + b^2 = 1
rotation
[ cos Z  -sin Z
  sin Z   cos Z ]
or [ a  -b
     b   a ] where a^2 + b^2 = 1.
Z is the angle. Multiply by r to get a rotation combined with scaling.
horizontal shear
[ 1  k
  0  1 ]
vertical shear: [ 1  0
                  k  1 ]
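The projection and reflection cards can be tied together numerically (the line, through a unit vector u of my choosing, is my own example): the reflection matrix [ a b | b -a ] should equal 2*(projection matrix) - I.

```python
import numpy as np

u = np.array([3.0, 4.0]) / 5.0           # unit vector along the line L
P = np.outer(u, u)                       # projection matrix [[u1^2, u1u2], [u1u2, u2^2]]
R = 2 * P - np.eye(2)                    # reflection = 2*projection - identity
a, b = R[0, 0], R[0, 1]
print(np.allclose(R, [[a, b], [b, -a]]), np.isclose(a**2 + b**2, 1.0))   # True True
```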