Linear Algebra Flashcards
Define a field
A set F with two binary operations + and × is a field if both (F, +, 0) and (F \ {0}, ×, 1) are abelian groups and the distributive law holds: (a + b)c = ac + bc for all a, b, c ∈ F.
Define a field’s characteristic
The characteristic of a field F is the smallest positive integer p such that 1 + 1 + · · · + 1 (p times) = 0. If no such p exists, the characteristic of F is defined to be zero.
Define a vector space
A vector space V over a field F is an abelian group (V, +, 0) together with a scalar multiplication F × V → V such that for all a, b ∈ F and v, w ∈ V: a(v + w) = av + aw, (a + b)v = av + bv, (ab)v = a(bv) and 1·v = v.
Define linear independence
A subset S of a vector space V is linearly independent if whenever a1, …, an ∈ F and s1, …, sn are distinct elements of S, a1s1 + … + ansn = 0 ⇒ a1 = … = an = 0.
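Example (illustrative, not from the original card; a NumPy sketch over R with made-up vectors): finitely many vectors are linearly independent exactly when the matrix having them as columns has full column rank.
    import numpy as np
    # Columns are the candidate vectors s1, s2, s3 in R^3 (made-up example).
    S = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0],
                  [0.0, 0.0, 0.0]])
    # Full column rank <=> the columns are linearly independent.
    print(np.linalg.matrix_rank(S) == S.shape[1])   # False: s3 = 2*s1 + 3*s2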
Define spanning
A subset S of a vector space V is spanning if for every v ∈ V there exist a1, …, an ∈ F and s1, …, sn ∈ S with v = a1s1 + … + ansn.
Define a basis
A subset S of a vector space is a basis if it is linearly independent and spanning.
Define a linear transformation
If V and W are vector spaces over F, then T: V → W is a linear transformation if for all a ∈ F and v,w ∈ V , T(av + w) = aT(v) + T(w).
Define an isomorphism of vector spaces
A bijective linear map between vector spaces.
Define a ring
A non-empty set R with two binary operations + and × is a ring if (R, +, 0) is an abelian group, the multiplication × is associative and the distributive laws hold: for all a, b, c ∈ R, (a + b)c = ac + bc and a(b + c) = ab + ac.
Define a ring homomorphism
A map ϕ: R → S between two rings is a ring homomorphism if for all r, s ∈ R: ϕ(r + s) = ϕ(r) + ϕ(s) and ϕ(rs) = ϕ(r)ϕ(s).
Define a ring isomorphism
A bijective ring homomorphism.
Define an ideal
A non-empty subset I of a ring R is an ideal if for all s, t ∈ I and r ∈ R we have s − t ∈ I and sr, rs ∈ I.
Give the first isomorphism theorem (of rings)
The kernel Ker(ϕ) of a ring homomorphism ϕ: R → S is an ideal, its image Im(ϕ) is a subring of S, and ϕ induces an isomorphism between the rings R/Ker(ϕ) and Im(ϕ).
Give the division algorithm theorem
Let f(x), g(x) ∈ F[x] be two polynomials with g(x) ≠ 0. Then there exist q(x), r(x) ∈ F[x] such that f(x) = q(x)g(x) + r(x) and either r(x) = 0 or deg r(x) < deg g(x).
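Worked example (illustrative, with made-up polynomials): dividing f(x) = x^3 + x + 1 by g(x) = x^2 + 1 gives q(x) = x and r(x) = 1, since x·(x^2 + 1) = x^3 + x. A NumPy sketch, with coefficients listed from lowest to highest degree:
    from numpy.polynomial import polynomial as P
    f = [1.0, 1.0, 0.0, 1.0]    # f(x) = 1 + x + x^3
    g = [1.0, 0.0, 1.0]         # g(x) = 1 + x^2
    q, r = P.polydiv(f, g)      # quotient and remainder of f divided by g
    print(q, r)                 # q(x) = x, r(x) = 1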
Give Bezout’s Lemma for polynomials
Let a, b ∈ F[x] be non-zero polynomials and let gcd(a, b) = c. Then there exist s, t ∈ F[x] such that: a(x)s(x) + b(x)t(x) = c(x).
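A minimal sketch, assuming SymPy's gcdex (the polynomials below are made up): it returns s, t and c = gcd(a, b) with s·a + t·b = c.
    import sympy as sp
    x = sp.symbols('x')
    a = x**3 + x + 1
    b = x**2 + 1
    s, t, c = sp.gcdex(a, b)           # s*a + t*b == c == gcd(a, b)
    print(sp.expand(s*a + t*b - c))    # 0, confirming Bezout's identity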
Define the minimal polynomial of a matrix
The minimal polynomial of A, denoted by m_A(x), is the monic polynomial p(x) of least degree such that p(A) = 0.
Define the characteristic polynomial of a matrix
The characteristic polynomial of A is defined as det(A − xI).
Define eigenvalues, eigenvectors
λ ∈ F is an eigenvalue of A if there exists a non-zero v ∈ F^n such that Av = λv; such a v is called an eigenvector of A with eigenvalue λ.
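Example (illustrative matrix, not from the card; a NumPy sketch): np.linalg.eig returns eigenvalues and eigenvectors, and np.poly returns the characteristic polynomial coefficients in the det(xI − A) convention, which differs from det(A − xI) only by the sign (−1)^n.
    import numpy as np
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    vals, vecs = np.linalg.eig(A)
    v = vecs[:, 0]
    print(vals)                               # eigenvalues 3 and 1 (in some order)
    print(np.allclose(A @ v, vals[0] * v))    # True: A v = lambda v
    print(np.poly(A))                         # approx [1, -4, 3], i.e. x^2 - 4x + 3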
Define a quotient vector space
The set of cosets V/U = {v + U : v ∈ V} with the operations (v + U) + (w + U) = (v + w) + U and a(v + U) = av + U, for v, w ∈ V and a ∈ F, is a vector space called the quotient space.
Give the first isomorphism theorem for vector spaces
Let T: V → W be a linear map of vector spaces over F. Then the induced map T̄: V/Ker(T) → Im(T), v + Ker(T) ↦ T(v), is an isomorphism of vector spaces.
Give the rank-nullity theorem
If T: V → W is a linear transformation and V is finite dimensional, then dim(V) = rank(T) + nullity(T).
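Numerical check (a made-up matrix viewed as a map R^3 → R^2, assuming NumPy): the rank plus the nullity recovers the dimension of the domain.
    import numpy as np
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])          # rank 1: the second row is twice the first
    rank = np.linalg.matrix_rank(A)
    nullity = A.shape[1] - rank               # dimension of the kernel
    print(rank, nullity, rank + nullity)      # 1 2 3 = dim(V)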
Define invariance under a linear transformation
A subspace U of V is T-invariant, for a linear transformation T: V → V, if T(U) ⊆ U.
Define an upper triangular matrix
If A = (aij) is an n × n matrix, it is upper triangular if aij = 0 for all i > j.
Give the Cayley-Hamilton Theorem
If T: V → V is a linear transformation and V is a finite dimensional vector space, then χT(T) = 0. Hence, the minimal polynomial divides the characteristic polynomial.
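Numerical check (illustrative 2 × 2 matrix, assuming NumPy): evaluating the characteristic polynomial at the matrix itself gives the zero matrix.
    import numpy as np
    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    coeffs = np.poly(A)                       # characteristic polynomial, highest degree first
    n = A.shape[0]
    chi_A = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
    print(np.allclose(chi_A, np.zeros((n, n))))   # True: chi_A(A) = 0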
Define a direct sum
Let V be a vector space. V is the direct sum V = W1 ⊕ · · · ⊕ Wr of subspaces W1, …, Wr if every v ∈ V can be written uniquely as v = w1 + … + wr with wi ∈ Wi; equivalently, W1 + … + Wr = V and Wi ∩ (W1 + … + Wi−1 + Wi+1 + … + Wr) = {0} for each i.
State the primary decomposition theorem
Let mT be the minimal polynomial of a linear transformation T: V → V and write it in the form mT(x) = f1^(q1)(x)…fr^(qr)(x), where the fi are distinct monic irreducible polynomials. Let Wi = Ker(fi^(qi)(T)). Then V = W1 ⊕ · · · ⊕ Wr, each Wi is T-invariant, and the minimal polynomial of T restricted to Wi is fi^(qi)(x).
Give the condition for triangularisability
T is triangularisable ⇐⇒ mT factors as a product of linear polynomials.
Give the condition for diagonalisability
T is diagonalisable ⇐⇒ mT factors as a product of distinct linear polynomials.
Define nilpotency
Given a linear transformation T, if T^n=0 for some n > 0 then T is called nilpotent.
Define a Jordan block
A Jordan block J_i is the i × i matrix with 1s on the superdiagonal (the diagonal immediately above the main diagonal) and 0s everywhere else.
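A small sketch (assuming NumPy): np.eye(n, k=1) builds exactly this matrix, and raising it to the n-th power gives zero, so a Jordan block is nilpotent.
    import numpy as np
    J = np.eye(4, k=1)                        # 4x4: 1s on the superdiagonal, 0s elsewhere
    print(np.linalg.matrix_power(J, 4))       # the zero matrix, so J^4 = 0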
Define a dual
Let V be a vector space over F. Its dual V’ is the vector space of linear maps from V to F, that is V’ = Hom(V, F).
Define linear functionals
The elements of a dual are called linear functionals.
Define a natural homomorphism
A homomorphism that is independent of the choice of basis.
Define a hyperplane
The preimage f^(-1)({c}), for a constant c ∈ F, of a non-zero linear functional f: V → F is called a hyperplane.
Define an annihilator
Let U ⊆ V be a subspace of a vector space V. The annihilator of U is the set U^0 = {f ∈ V’ : f(u) = 0 for all u ∈ U}.
Define a dual map
Let T: V → W be a linear map of vector spaces. Define the dual map T’: W’ → V’ by f ↦ f ∘ T.
Define a bilinear form
Let V be a vector space over a field F. A bilinear form on V is a map B: V × V → F such that for all u, v, w ∈ V and λ ∈ F: B(u + v, w) = B(u, w) + B(v, w), B(u, v + w) = B(u, v) + B(u, w) and B(λv, w) = λB(v, w) = B(v, λw).
Define a symmetric bilinear form
A bilinear form B: V × V → F is symmetric if B(v, w) = B(w, v) for all v, w ∈ V.
Define a non-degenerate bilinear form
A bilinear form B: V × V → F is non-degenerate if B(v, w) = 0 for all v ∈ V implies w = 0.
Define an inner product
A bilinear form on a real vector space is an inner product if it is symmetric and positive definite, i.e. B(v, v) > 0 for all non-zero v ∈ V.
Define a sesquilinear form
Let V be a vector space over C. A sesquilinear form on V is a map B: V × V → C such that for all u, v, w ∈ V and λ ∈ C: B(u + v, w) = B(u, w) + B(v, w), B(u, v + w) = B(u, v) + B(u, w) and B(λ*v, w) = λB(v, w) = B(v, λw).
Define a conjugate symmetric sesquilinear form
A sesquilinear form B: V × V → C is conjugate symmetric if B(v, w) = B(w, v)* for all v, w ∈ V.
Define an inner product space
A real vector space V is an inner product space when endowed with an inner product. A complex vector space V is an inner product space when endowed with a sesquilinear, conjugate symmetric, positive definite form.
Define an orthogonal basis
A basis {w1, …, wn} of an inner product space is orthogonal if ⟨wi, wj⟩ = 0 for all i ≠ j.
Define an orthonormal basis
A basis {w1, …, wn} of an inner product space is orthonormal if it is orthogonal and ⟨wi, wi⟩ = 1 for each i.
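A NumPy sketch (made-up vectors): a reduced QR factorisation performs Gram–Schmidt, so the columns of Q form an orthonormal basis of the span of the input columns.
    import numpy as np
    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])                # two independent vectors in R^3 as columns
    Q, _ = np.linalg.qr(A)                    # reduced QR: Q is 3x2 with orthonormal columns
    print(np.allclose(Q.T @ Q, np.eye(2)))    # True: <q_i, q_j> = delta_ij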
Define an orthogonal complement
Let U ⊆ V be a subspace of an inner product space V. Then the orthogonal complement is U^⊥ = {v ∈ V | ⟨u, v⟩ = 0 for all u ∈ U}.
Define an adjoint map
Given a linear map T: V → V on an inner product space, a linear map T*: V → V is its adjoint if for all v, w ∈ V, ⟨v, T(w)⟩ = ⟨T*(v), w⟩.
Define a self-adjoint map
A linear map T: V → V is self-adjoint if T = T*.
Define an orthogonal transformation
Let V be a finite dimensional real inner product space and T: V → V be a linear transformation. If T* = T^(−1) then T is called orthogonal.
Define a unitary transformation
Let V be a finite dimensional complex inner product space and T: V → V be a linear transformation. If T* = T^(−1) then T is called unitary.
Define the orthogonal group
O(n) = {A ∈ Mn×n(R) | AᵀA = Id}.
Define the special orthogonal group
SO(n) = {A ∈ O(n)| det A = 1}.
Define the unitary group
U(n) = {A ∈ Mn×n(C) | A*A = Id}, where A* denotes the conjugate transpose of A.
Define the special unitary group
SU(n) = {A ∈ U(n)| det A = 1}.
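Quick membership checks (illustrative matrices, assuming NumPy): a 2 × 2 rotation lies in SO(2), and a complex matrix lies in U(n) when its conjugate transpose is its inverse.
    import numpy as np
    theta = 0.3
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    print(np.allclose(R.T @ R, np.eye(2)),            # True: R is orthogonal
          np.isclose(np.linalg.det(R), 1.0))          # True: det R = 1, so R is in SO(2)
    U = np.array([[0.0, 1.0j],
                  [1.0j, 0.0]])
    print(np.allclose(U.conj().T @ U, np.eye(2)))     # True: U*U = Id, so U is in U(2)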
State the singular value decomposition theorem
Let F = R or C. Every matrix A ∈ F^(m×n) with m ≥ n can be written as A = UΣV* , where U and V are matrices with orthonormal columns and Σ is a diagonal matrix with nonnegative diagonal entries σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0.
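A NumPy sketch (made-up 3 × 2 real matrix): np.linalg.svd with full_matrices=False returns the thin decomposition described above, with the singular values in decreasing order.
    import numpy as np
    A = np.array([[3.0, 0.0],
                  [4.0, 5.0],
                  [0.0, 0.0]])
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    print(s)                                      # singular values, sigma1 >= sigma2 >= 0
    print(np.allclose(U @ np.diag(s) @ Vh, A))    # True: A = U Sigma V*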