Proofs Flashcards
If a1, a2, . . . , ak are mutually orthogonal non-zero vectors in V (i.e. ai′aj = 0 for every pair i ≠ j), then the set of vectors is linearly independent.
1.7.3
Proof
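A minimal numeric sketch of the claim (the three vectors below are illustrative, not from the text): mutually orthogonal non-zero vectors stacked as columns give a diagonal Gram matrix, and the matrix has rank equal to the number of vectors, i.e. the columns are linearly independent.

```python
import numpy as np

# Three mutually orthogonal non-zero vectors in R^4 (illustrative choice).
a1 = np.array([1.0, 1.0, 0.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0, 0.0])
a3 = np.array([0.0, 0.0, 2.0, 0.0])
A = np.column_stack([a1, a2, a3])

# Pairwise orthogonality: every off-diagonal entry of A'A vanishes.
gram = A.T @ A
off_diag_zero = np.allclose(gram - np.diag(np.diag(gram)), 0.0)

# Linear independence: rank of A equals the number of vectors.
rank = np.linalg.matrix_rank(A)
```

This mirrors the proof idea: premultiplying a vanishing linear combination by ai′ isolates the coefficient of ai via the diagonal Gram matrix.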
Let A: n × m = [ a(1) , a(2) , . . . , a(m) ]. The null space of A′ and the orthogonal complement of the column space of A are the same.
1.9.1
Proof
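A numeric sketch of the statement, assuming a random full-column-rank A (the matrix and tolerances are illustrative): a basis for the null space of A′ can be read off from the SVD, and each basis vector is orthogonal to every column of A, with dimension n − rank(A).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))            # n = 5, m = 3; full column rank a.s.

# Null space of A': right singular vectors of A' with zero singular value.
U, s, Vt = np.linalg.svd(A.T)
r = int(np.sum(s > 1e-10))                 # rank of A
N = Vt[r:].T                               # columns span null(A'), shape (5, n - r)

in_null = np.allclose(A.T @ N, 0.0)        # A'z = 0 for each basis vector z
orth_to_cols = np.allclose(N.T @ A, 0.0)   # z'a(j) = 0 for every column a(j)
dim_matches = N.shape[1] == 5 - r          # dim null(A') = n - rank(A)
```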
For any matrix A: n × m it is true that AA′ is symmetric, positive semi-definite and has the same rank as A.
1.9.14
Proof
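A quick numeric check of all three properties on an illustrative random matrix: AA′ equals its transpose, its eigenvalues are non-negative, and its rank matches that of A.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))
B = A @ A.T                                # the matrix AA'

symmetric = np.allclose(B, B.T)
eigs = np.linalg.eigvalsh(B)               # real eigenvalues of a symmetric matrix
psd = bool(np.all(eigs >= -1e-10))         # non-negative up to roundoff
same_rank = np.linalg.matrix_rank(B) == np.linalg.matrix_rank(A)
```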
i) If A^c : m × n is a conditional inverse of A: n × m and the equations Ax = y are consistent (i.e. a solution for x exists), then x1 = A^c y is a solution of Ax = y.
ii) If x1 = A^c y is a solution of the equations Ax = y for every y for which these equations are consistent, then A^c must be a conditional inverse of A.
1.10.3
Proof
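A numeric sketch of part i), using the Moore–Penrose inverse as one particular conditional inverse (it satisfies AA^cA = A; the rank-deficient example matrix is illustrative): for a consistent right-hand side y, x1 = A^c y indeed solves Ax = y.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3)) @ rng.standard_normal((3, 5))  # 4x5, rank <= 3

Ac = np.linalg.pinv(A)                     # Moore-Penrose inverse: A Ac A = A
is_cond_inverse = np.allclose(A @ Ac @ A, A)

x0 = rng.standard_normal(5)
y = A @ x0                                 # consistent by construction
x1 = Ac @ y
solves = np.allclose(A @ x1, y)            # x1 = Ac y solves Ax = y
```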
Let A be an n × m matrix. If (A′A)^c is a conditional inverse of A′A, then {(A′A)^c }′ is also a conditional inverse of A′A. Conversely, if {(A′A)^c }′ is a conditional inverse of A′A, then (A′A)^c is also a conditional inverse of A′A.
1.10.4
Proof
The system of equations A′Ax = A′y (A of size n × m) for determining x is consistent, and if x is any solution, then Ax is uniquely determined.
1.11.2
Proof
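A numeric sketch with a deliberately rank-deficient A (all matrices illustrative): the normal equations have many solutions, but two distinct solutions give the same fitted value Ax. A second solution is built by adding a null vector of A to the first.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))  # 6x4, rank 2
y = rng.standard_normal(6)                 # arbitrary y, not in col(A) in general

G = np.linalg.pinv(A.T @ A)                # one conditional inverse of A'A
x1 = G @ A.T @ y
consistent = np.allclose(A.T @ A @ x1, A.T @ y)

# A second solution: add any null vector of A (rank 2 < 4, so one exists).
U, s, Vt = np.linalg.svd(A)
z = Vt[-1]                                 # right singular vector with Az = 0
x2 = x1 + z
also_solution = np.allclose(A.T @ A @ x2, A.T @ y)
same_fit = np.allclose(A @ x1, A @ x2)     # Ax is the same for both solutions
```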
For A′Ax = A′y (A of size n × m), x is a solution if and only if Ax is the projection of y onto the vector space generated by the columns of A (or, equivalently, x′A′ is the projection of y′ onto the vector space generated by the rows of A′).
1.11.3
Proof
For any vector y, A(A′A)^c A′ y is the projection of y onto the vector space V(A) generated by the columns of A: n × m, where (A′A)^c is any conditional inverse of A′A (or y′A(A′A)^c A′ is the projection of y′ onto the vector space generated by the rows of A′).
1.11.4
Proof
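A numeric sketch of the projector property (random rank-deficient A, pinv standing in for one conditional inverse of A′A): P = A(A′A)^c A′ is idempotent, fixes the columns of A, and leaves a residual y − Py orthogonal to col(A).

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))  # 5x3, rank 2
y = rng.standard_normal(5)

G = np.linalg.pinv(A.T @ A)                # one conditional inverse of A'A
P = A @ G @ A.T                            # candidate projector onto V(A)

idempotent = np.allclose(P @ P, P)
fixes_columns = np.allclose(P @ A, A)                     # identity on col(A)
resid_orthogonal = np.allclose(A.T @ (y - P @ y), 0.0)    # residual _|_ col(A)
```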
If L is a linear function of k independent variables x1, x2, . . . , xk and L = a′x = x′a where a′ = (a1, a2, . . . , ak), then ∂L/∂x = a.
1.13.1
Proof
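A quick finite-difference check of the gradient formula on an illustrative a and x: the central-difference gradient of L(x) = a′x recovers a in every coordinate.

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0])
L = lambda x: a @ x                        # L = a'x

x = np.array([0.5, 1.5, -2.0])
eps = 1e-6
# Central differences along each coordinate direction.
grad = np.array([(L(x + eps * e) - L(x - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
matches = np.allclose(grad, a)             # dL/dx = a
```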
Derivation of the MGF of the multivariate normal distribution, starting from the standard multivariate normal Y ~ normalp(0, Ip).
Proof
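For reference, the standard form the derivation arrives at (the factorization over independent coordinates is the key step):

```latex
% MGF of the standard multivariate normal Y ~ N_p(0, I_p):
M_Y(t) = E\!\left[e^{t'Y}\right] = \prod_{i=1}^{p} e^{t_i^2/2} = e^{\frac{1}{2}t't}
% and hence, for X = \mu + \Sigma^{1/2} Y \sim N_p(\mu,\Sigma):
M_X(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}
```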
If X: p × 1 ~ normalp(μ;Σ), then Y : q × 1 = CX +b is normalq(Cμ +b;CΣC′) distributed where C: q × p is of rank q ≤ p.
2.2.1
Proof
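A deterministic sketch of the parameter map (μ, Σ, C, b are illustrative): the theorem says Y = CX + b has mean Cμ + b and covariance CΣC′, which is direct to compute.

```python
import numpy as np

mu = np.array([1.0, 2.0, 3.0])
Sigma = np.diag([1.0, 2.0, 3.0])
C = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, -1.0]])          # q = 2, p = 3
b = np.array([0.5, -0.5])

new_mu = C @ mu + b                        # mean of Y = CX + b
new_Sigma = C @ Sigma @ C.T                # covariance of Y
full_rank = np.linalg.matrix_rank(C) == 2  # rank condition q <= p of the theorem
```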
Let X : p × 1 ~ normalp(μ; Σ), then the elements of X are statistically independently distributed if and only if (iff) the elements of X are uncorrelated, i.e. iff Σ is a diagonal matrix.
2.2.2
Proof
Let X: p × 1 ~ normal(μ; Σ) be partitioned as in (2.3.2). The marginal distribution of Xi is the normal(μi; Σii) distribution.
2.3.1
Proof
Let X: p × 1 ~ normal(μ; Σ) be partitioned as in (2.3.2). The sub-vectors X1 and X2 are statistically independently distributed iff X1 and X2 are uncorrelated, i.e. iff Σ12 = 0: q × (p − q).
2.3.2
Proof
Let X: p × 1 ~ normalp( μ ; Σ), then Q = (X – μ )′Σ–1(X – μ ) ~ χ2(p).
2.6.1
Proof
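A numeric sketch of the algebraic identity behind the proof (Σ, μ, x are illustrative): writing Σ = LL′ (Cholesky) and z = L⁻¹(x − μ), the quadratic form Q equals z′z, a sum of p squares of standardized coordinates, whence Q ~ χ2(p) under the model.

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((3, 3))
Sigma = M @ M.T + np.eye(3)                # a positive definite covariance
mu = np.array([1.0, -2.0, 0.5])
x = rng.standard_normal(3)

Q = (x - mu) @ np.linalg.inv(Sigma) @ (x - mu)

# Whitening: Sigma = LL', z = L^{-1}(x - mu), so Q = z'z.
L = np.linalg.cholesky(Sigma)
z = np.linalg.solve(L, x - mu)
same = np.allclose(Q, z @ z)
nonnegative = Q >= 0.0
```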