Definitions Flashcards
Matrix
For m,n>=1 an m x n matrix is a rectangular array with m rows and n columns
Row/column vector
A row vector is a 1 x n matrix. A column vector is an m x 1 matrix
Scalar (in terms of matrices)
The entries of a matrix are called scalars. They come from a field, usually denoted F
Addition and scalar multiplication of matrices
Addition: let A=(aij) and B=(bij) be m x n matrices. We define A+B=(cij) to have (i,j) entry cij=aij+bij We describe this addition as coordinatewise
Scalar multiplication: Let A=(aij) be an m x n matrix over F and p be in F. We define pA to be the m x n matrix with (i,j) entry paij
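A minimal sketch of coordinatewise addition and scalar multiplication, using NumPy and taking the field F to be the reals:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
p = 3

C = A + B   # (i,j) entry is aij + bij (coordinatewise)
D = p * A   # (i,j) entry is p * aij

print(C)  # [[ 6  8] [10 12]]
print(D)  # [[ 3  6] [ 9 12]]
```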
Commute (in terms of matrices)
AB=BA
Upper triangular matrix
If aij=0 whenever i>j
So the matrix only has non-zero entries on and above the diagonal
Lower triangular matrix
If aij=0 whenever i<j
So the matrix only has non-zero entries on and below the diagonal
Invertible (in terms of matrices)
For a square n x n matrix A, if there exists an n x n matrix B such that
AB=In=BA
Orthogonal matrix
AB=In=BA
Where B is the transpose of A
Unitary matrix
AB=In=BA
Where B is the transpose of the complex conjugate of A
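The three definitions above can be checked numerically; a sketch using NumPy, with np.allclose to allow for floating-point rounding (the sample matrices are illustrative choices, not from the cards):

```python
import numpy as np

I2 = np.eye(2)

# Invertible: some B with AB = In = BA
A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.linalg.inv(A)
assert np.allclose(A @ B, I2) and np.allclose(B @ A, I2)

# Orthogonal: B is the transpose of A (here A is rotation by 90 degrees)
Q = np.array([[0.0, -1.0], [1.0, 0.0]])
assert np.allclose(Q @ Q.T, I2) and np.allclose(Q.T @ Q, I2)

# Unitary: B is the conjugate transpose of A
U = np.array([[1j, 0.0], [0.0, -1j]])
assert np.allclose(U @ U.conj().T, I2) and np.allclose(U.conj().T @ U, I2)
```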
Augmented matrix A|b
The m x (n+1) matrix formed from A with the column vector b adjoined as the (n+1)th column
Echelon form
1) Leading coeffs are 1
2) Leading entries of lower rows occur to the right of leading entries of higher rows
3) Zero rows, if any, appear below any non-zero rows
Determined and free variables
Let E|d be an augmented matrix where E is in echelon form. A variable xi is determined if there is a leading entry in column i
Otherwise xi is free
Reduced row echelon form
A matrix is in reduced row echelon form if it is in echelon form and every column containing the leading entry of a row has all its other entries equal to 0
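SymPy computes the reduced row echelon form exactly; a sketch (the sample matrix is an illustrative choice):

```python
from sympy import Matrix

# rref() returns the reduced row echelon form together with the
# indices of the pivot (leading-entry) columns.
A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [0, 0, 1]])
R, pivots = A.rref()
print(R)       # Matrix([[1, 2, 0], [0, 0, 1], [0, 0, 0]])
print(pivots)  # (0, 2) -> x1 and x3 are determined, x2 is free
```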
Elementary matrix
For an ERO on an m x n matrix we define the corresponding elementary matrix to be the result of applying that ERO to Im
Vector space
A vector space over F is a non-empty set V together with a map
VxV->V given by (v,v’)|->v+v’ and a map
FxV->V given by (p,v)|->pv
That satisfy the vector space axioms
Vector space axioms
1) u+v=v+u (addition is commutative)
2) u+(v+w)=(u+v)+w (addition is associative)
3) there is 0v such that v+0v = v = 0v+v (existence of additive identity)
4) for each v there is w such that v+w=0v (existence of additive inverses)
5) p(u+v)=pu+pv (distributivity of scalar multiplication over vector addition)
6) (p+q)v=pv+qv (distributivity of scalar multiplication over field addition)
7) (pq)v=p(qv) (scalar multiplication interacts well with field multiplication)
8) 1v=v (identity for scalar multiplication)
Subspace
A subspace of V is a non-empty subset of V that is closed under addition and scalar multiplication
Proper subspace
A subspace of V other than V
Linear combination
Take u1,u2,…,um in V
A linear combination of u1,…,um is a vector
a1u1+…+amum for some a1,…,am in F
Span
We define the span of u1,…,um to be {a1u1+…+amum : a1,…,am in F}
Spanning set
A set S is a spanning set for V if Span(S)=V
We say S spans V
Linearly dependent
We say that v1,…,vm in V are linearly dependent if there exist a1,…,am not all 0 in F such that
a1v1+…+amvm=0
Linearly independent
We say v1,…,vm in V are linearly independent if the only a1,…,am in F with
a1v1+…+amvm=0 are a1=…=am=0
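A numerical sketch: over the reals, v1,…,vm are linearly independent exactly when the matrix with those vectors as columns has rank m. The helper name `independent` is illustrative:

```python
import numpy as np

def independent(*vectors):
    # Stack the vectors as columns; full column rank means only the
    # trivial combination a1 = ... = am = 0 gives the zero vector.
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1, v2, v3 = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 1, 0])
print(independent(v1, v2))      # True
print(independent(v1, v2, v3))  # False: v3 = v1 + v2
```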
Basis
A basis of V is a linearly independent spanning set
Dimension
Let V be a finite dimensional vector space. The dimension of V is the size of any basis of V
Row space
For an m x n matrix A over F, we define the row space of A to be the span of the subset of F^n consisting of the rows of A
Direct sum
For subspaces U,W of V. We say that V is the direct sum of U and W if
U+W=V and
U∩W = {0v}
Direct complement
For subspaces U, W of V
We say that U is the direct complement of W in V if V is the direct sum of U and W
Column space
For an m x n matrix A over F, we define the column space of A to be the span of the subset of F^m consisting of the columns of A
Row Rank
dim(rowsp(A))
Column Rank
dim(columnsp(A))
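Row rank and column rank always agree (a standard theorem); a numerical sketch using NumPy's matrix_rank, since transposing swaps rows and columns:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],   # twice the first row
              [0, 1, 1]])
print(np.linalg.matrix_rank(A))    # 2 = row rank
print(np.linalg.matrix_rank(A.T))  # 2 = column rank
```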
Linear transformation / map
Let V,W be vector spaces over F. We say that a map T:V->W is linear if
I) T(v1+v2)=T(v1)+T(v2) for all v1,v2 in V
II) T(pv)=pT(v) for all v in V and p in F
Invertible (in terms of linear maps)
Let T:V->W
We say T is invertible if there is a linear transformation S:W->V such that
ST=idv and TS=idw (where idv and idw are the identity maps on V and W respectively)
T is a function, so it has a unique inverse; there is no ambiguity in writing T^-1
(See intro to uni maths course)
Kernel (or null space)
ker(T)={v∈V:T(v)=0w}
All the things that are mapped to 0
Image (of a linear map)
Im(T)={T(v):v∈V}
Nullity
For a finite dimensional V. Let T:V->W
We define null(T)=dim(kerT)
Rank (of a linear map)
For a finite dimensional V. Let T:V->W
We define rank(T)=dim(ImT)
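For the map T(x) = Ax both quantities can be computed exactly with SymPy; this sketch also illustrates the rank-nullity theorem, rank(T) + null(T) = dim(V) (a standard fact, not one of the cards above):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])          # T : F^3 -> F^2
rank = A.rank()                  # dim(Im T)
nullity = len(A.nullspace())     # dim(ker T): number of basis vectors
print(rank, nullity)             # 1 2, and 1 + 2 = 3 = dim(F^3)
```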
A matrix for a linear transformation T:V->W with respect to specific ordered bases for V, W.
Let v1,…,vn be a basis for V and let w1,…,wm be a basis for W. The matrix for T with respect to these bases is the m x n matrix whose jth column consists of the coefficients a1j,…,amj defined by
T(vj)=a1jw1+a2jw2+…+amjwm
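A sketch with the standard bases of R^2, where the jth column is simply T applied to the jth basis vector; the map T(x, y) = (x + y, 2y) is an illustrative choice:

```python
import numpy as np

def T(v):
    x, y = v
    return np.array([x + y, 2 * y])

e1, e2 = np.array([1, 0]), np.array([0, 1])
M = np.column_stack([T(e1), T(e2)])   # columns are T(v1), T(v2)
print(M)  # [[1 1] [0 2]]

# The matrix reproduces the map: Mv = T(v)
assert (M @ np.array([3, 4]) == T(np.array([3, 4]))).all()
```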
Similar Matrices
A and B (∈Mnxn(F)) are similar if there exists an invertible n x n matrix P such that B=P^(-1)AP
Rank of a matrix
For an m x n matrix A, the rank of A is the row rank of A
Bilinear Form
For a vector space V over F, a bilinear form is a function, often written [v,v'] : VxV->F, such that:
(i) [a1v1+a2v2,v3]=a1[v1,v3]+a2[v2,v3] (linear in first variable), and
(ii) [v1,a2v2+a3v3]=a2[v1,v2]+a3[v1,v3] (linear in second variable)
Gram Matrix
Take v1,…,vn in V. The Gram matrix of v1,…,vn with respect to a bilinear form [v,v'] is the n x n matrix ([vi,vj]). So the (i,j)th entry is [vi,vj]
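A sketch for the standard dot product on R^3: if the vi are the rows of a matrix V, the Gram matrix is V V^T, since its (i,j) entry is the dot product of row i with row j:

```python
import numpy as np

V = np.array([[1, 0, 1],    # v1
              [0, 1, 1]])   # v2
G = V @ V.T                 # G[i, j] = [vi, vj] (dot product)
print(G)  # [[2 1] [1 2]]
```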
Symmetric bilinear form
We say that a bilinear form [v,v’] :VxV->F is symmetric if
[v1,v2]=[v2,v1] for all v1,v2 in V
Positive definite bilinear form
REAL VECTOR SPACE ONLY
We say that a bilinear form [v,v'] : VxV->F is positive definite if [v,v]>=0 for all v in V, with [v,v]=0 if and only if v=0
Inner Product
An inner product on a real vector space V is a positive definite symmetric bilinear form on V
Inner Product Space
We say that a real vector space is an inner product space if it is equipped with an inner product,
Norm
Let V be a real inner product space. For v in V, we define the norm of v to be ||v||:= sqrt([v,v])
Orthonormal Set
Let V be an inner product space. We say that {v1,…,vn} a subset of V is an orthonormal set if for all i, j we have
[vi,vj]=δij (1 if i=j, 0 if i/=j)
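A sketch of the norm and the orthonormal condition for the standard inner product on R^n: the set {v1,…,vn} is orthonormal exactly when its Gram matrix of pairwise inner products is the identity:

```python
import numpy as np

# Norm: ||v|| = sqrt([v, v])
v = np.array([3.0, 4.0])
print(np.sqrt(v @ v))  # 5.0

# Orthonormal set: rows of V are the vi; [vi, vj] = delta_ij
# means V @ V.T is the identity matrix.
V = np.array([[1.0, 0.0],
              [0.0, 1.0]])
assert np.allclose(V @ V.T, np.eye(2))
```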
Sesquilinear Form
Let V be a complex vector space. A function [v,v’] : VxV->C is a sesquilinear form if
(i) [a1v1+a2v2,v3]= a1[v1,v3] + a2[v2,v3] (linear in the first variable)
(ii) [v1,v2]= complex conjugate of [v2,v1] (conjugate symmetry)
Positive definite sesquilinear form
We say a sesquilinear form is positive definite if [v,v]>=0 for all v, with [v,v]=0 iff v=0 (note [v,v] is real by conjugate symmetry)
Complex inner product space
A complex inner product space is a complex vector space equipped with a positive definite sesquilinear form