5. Limit Theorems Flashcards
Multivariate Normal Distribution
-suppose X_ = (X1,X2,…,Xn) is a vector of random variables
-if X_ has pdf:
f(x_) = (2π)^(-n/2) det(Σ_)^(-1/2) exp(-1/2 (x_-µ_)^T Σ_^(-1) (x_-µ_))
-then we say that X_ has a multivariate normal distribution:
X_~Multivariate Normal(µ_, Σ_)
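As a quick numerical sketch of the pdf above (the values of µ_ and Σ_ below are made-up examples), the formula can be evaluated directly with NumPy; at x_ = µ_ the exponent vanishes, so the density equals the normalising constant:

```python
import numpy as np

# Sketch: evaluate the multivariate normal pdf from the formula above.
def mvn_pdf(x, mu, Sigma):
    n = len(mu)
    diff = x - mu
    const = (2 * np.pi) ** (-n / 2) * np.linalg.det(Sigma) ** (-0.5)
    quad = diff @ np.linalg.inv(Sigma) @ diff  # (x-mu)^T Sigma^-1 (x-mu)
    return const * np.exp(-0.5 * quad)

# example parameters (assumptions, not from the flashcards)
mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

peak = mvn_pdf(mu, mu, Sigma)  # density at the mean = normalising constant
print(peak)
```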
Properties of the Multivariate Normal Distribution
Expectation
-suppose X_=(X1,X2,…,Xn) and X_~Multivariate Normal(µ_, Σ_)
-then:
E(X_) = µ_
Properties of the Multivariate Normal Distribution
Covariance
-suppose X_=(X1,X2,…,Xn) and X_~Multivariate Normal(µ_, Σ_)
-then:
cov(Xi,Xj) = Σij
-where Σij is the (i,j) element of the matrix Σ_
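This covariance property can be checked by simulation (µ_ and Σ_ below are made-up example values): the empirical covariance matrix of a large sample should be close to Σ_:

```python
import numpy as np

# Sketch: cov(Xi, Xj) = Sigma_ij, checked with a large simulated sample.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, -0.4],
                  [0.0, -0.4, 1.5]])

samples = rng.multivariate_normal(mu, Sigma, size=200_000)
empirical_cov = np.cov(samples, rowvar=False)  # rows are observations
print(np.round(empirical_cov, 2))
```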
Properties of the Standard Multivariate Normal Distribution
Bivariate Special Case
-suppose X_=(X1,X2) and X_~Multivariate Normal(µ_, Σ_) with:
µ_ = (0,0)
Σ_ = 2x2 identity matrix
-then:
f(x1,x2) = product of the pdfs of two univariate standard normally distributed random variables
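The factorisation in the bivariate special case can be verified numerically: with µ_ = (0,0) and Σ_ = I2, det(Σ_) = 1 and Σ_^(-1) = I2, so the joint pdf reduces to a product of standard normal pdfs:

```python
import numpy as np

# Sketch: standard bivariate normal pdf vs product of univariate pdfs.
def std_normal_pdf(x):
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

def std_bivariate_pdf(x1, x2):
    # (2*pi)^(-1) * exp(-(x1^2 + x2^2)/2), since det(I2)=1 and I2^-1=I2
    return np.exp(-0.5 * (x1 ** 2 + x2 ** 2)) / (2 * np.pi)

x1, x2 = 0.7, -1.3  # an arbitrary example point
print(std_bivariate_pdf(x1, x2), std_normal_pdf(x1) * std_normal_pdf(x2))
```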
Standard Multivariate Normal Distribution
-suppose Z_=(Z1,Z2,…,Zn) is distributed:
Z_~Multivariate Normal(0_, In)
-where In is an nxn identity matrix
-then Z_ is the standard multivariate normal
Independence and the Multivariate Normal Distribution
-we have shown that if X and Y are independent then cov(X,Y)=0
-but zero covariance does NOT generally imply independence
-HOWEVER
- in the case of the multivariate normal:
X_~Multivariate Normal(µ_, Σ_)
-where X_=(X1,X2,…,Xn)
-IF
cov(Xi,Xj) = 0
-THEN Xi and Xj are independent
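The mechanism behind this special property can be seen in the pdf: when the off-diagonal covariances are zero, Σ_ is diagonal and the joint pdf factorises into univariate normal pdfs, which is exactly independence for densities. A sketch with made-up example parameters:

```python
import numpy as np

# Sketch: diagonal covariance => joint normal pdf = product of marginals.
def mvn_pdf(x, mu, Sigma):
    n = len(mu)
    diff = x - mu
    const = (2 * np.pi) ** (-n / 2) * np.linalg.det(Sigma) ** (-0.5)
    return const * np.exp(-0.5 * diff @ np.linalg.inv(Sigma) @ diff)

def normal_pdf(x, m, s2):
    return np.exp(-0.5 * (x - m) ** 2 / s2) / np.sqrt(2 * np.pi * s2)

mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 0.25])  # cov(X1, X2) = 0
x = np.array([0.3, -1.1])     # arbitrary evaluation point

joint = mvn_pdf(x, mu, Sigma)
product = normal_pdf(x[0], 1.0, 4.0) * normal_pdf(x[1], -2.0, 0.25)
print(joint, product)
```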
Change of Coordinates
-suppose X is a random variable and Y=g(X), where g:ℝ->ℝ is a monotonic function
-define g^(-1):ℝ->ℝ such that g^(-1)(g(x)) = x; if g^(-1) is differentiable, then:
fy(y) = fx(g^(-1)(y)) * |d g^(-1)(y)/dy| if y=g(x) for some x, and 0 otherwise
-this can be used to derive many pdfs
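One classic derivation of this kind (an illustrative choice, not from the flashcards): take X ~ N(0,1) and the monotone map g(x) = exp(x), so g^(-1)(y) = log(y) and |d g^(-1)(y)/dy| = 1/y, giving the log-normal pdf. A numerical sanity check is that the resulting density integrates to about 1:

```python
import numpy as np

# Sketch: change of coordinates Y = exp(X) with X ~ N(0, 1).
def fx(x):
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

def fy(y):
    return fx(np.log(y)) / y  # fx(g^-1(y)) * |d g^-1 / dy|, the lognormal pdf

# trapezoidal integration of fy over (almost all of) its support
y = np.linspace(1e-6, 60.0, 200_000)
vals = fy(y)
total = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(y))
print(total)  # should be close to 1
```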
Markov’s Inequality
-suppose X is a random variable that takes on non-negative values then for all a>0 :
P(X≥a) ≤ E(X)/a
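A simulated sketch of Markov's inequality, using an exponential random variable (non-negative, mean 2) and a = 5 as made-up example values:

```python
import numpy as np

# Sketch: P(X >= a) <= E(X)/a for a non-negative X, checked by simulation.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)  # E(X) = 2

a = 5.0
tail_prob = np.mean(x >= a)  # true value exp(-5/2) ~ 0.082
bound = x.mean() / a         # ~ 2/5 = 0.4
print(tail_prob, bound)
```

The bound is loose here (0.08 vs 0.4), which is typical: Markov only uses non-negativity and the mean.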
Chebyshev’s Inequality
-if X is a random variable with mean μ and variance σ² then for all k>0 :
P( |X-μ| ≥ k) ≤ σ²/k²
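A simulated sketch of Chebyshev's inequality with X ~ N(0,1) and k = 2 (example values):

```python
import numpy as np

# Sketch: P(|X - mu| >= k) <= sigma^2 / k^2, checked by simulation.
rng = np.random.default_rng(2)
x = rng.normal(loc=0.0, scale=1.0, size=100_000)  # mu = 0, sigma^2 = 1

k = 2.0
tail_prob = np.mean(np.abs(x - 0.0) >= k)  # true value ~ 0.0455 for N(0,1)
bound = 1.0 / k ** 2                       # sigma^2 / k^2 = 0.25
print(tail_prob, bound)
```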
Weak Law of Large Numbers
-suppose X1,X2,…,Xn is a random sample from a distribution with mean μ and var(Xi) = σ²
-then for all ε>0 :
P{ |(X1+X2+…+Xn)/n - μ| ≥ ε} ≤ σ²/(nε²)
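The bound σ²/(nε²) shrinks as n grows, which can be seen by simulating sample means of uniform(0,1) draws (μ = 0.5, σ² = 1/12; the choice of distribution and ε is an example):

```python
import numpy as np

# Sketch: deviation probability of the sample mean vs the WLLN bound.
rng = np.random.default_rng(3)
mu, var, eps = 0.5, 1.0 / 12.0, 0.05

for n in (10, 100, 1000):
    # 10,000 independent sample means, each from a sample of size n
    means = rng.uniform(0.0, 1.0, size=(10_000, n)).mean(axis=1)
    deviation_prob = np.mean(np.abs(means - mu) >= eps)
    bound = var / (n * eps ** 2)
    print(n, deviation_prob, min(bound, 1.0))
```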
Sequence of Random Variables
- sequence of independent and identically distributed random variables
- sequence: X1,X2,…,Xn
- think of a sequence as sampled data:
  - suppose we are drawing a sample of n observations
  - each observation will be a random variable Xi
  - with realisation xi
Mean / Variance of Sample Mean
- let X1,X2,…,Xn be a random sample from a distribution with mean μ and variance σ²
- let X̄n be the sample mean
- then E(X̄n)=μ and Var(X̄n)=σ²/n
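A simulated sketch of these two facts, using samples of size n = 50 from an exponential with mean 2 (so σ² = 4, and Var of the sample mean should be 4/50 = 0.08); the distribution and n are example choices:

```python
import numpy as np

# Sketch: E(sample mean) = mu and Var(sample mean) = sigma^2 / n.
rng = np.random.default_rng(4)
n = 50

# 100,000 independent samples of size n; one sample mean per row
sample_means = rng.exponential(scale=2.0, size=(100_000, n)).mean(axis=1)
print(sample_means.mean(), sample_means.var())  # ~ 2.0 and ~ 0.08
```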
Weak Law of Large Numbers
Limit
-suppose X1,X2,…,Xn is a random sample from a distribution with mean μ and variance σ², then for all ε>0:
P{ |(X1+X2+…+Xn)/n - μ| ≥ ε} -> 0 as n->∞
Convergence Definition
-for a sequence of real numbers
{ai} = {a1,a2,a3,…,an,…}
-we say that the sequence {ai} converges to a real number A if for any ε>0 there is a positive integer N such that, for n≥N:
|an-A| < ε
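The ε-N definition can be made concrete with the example sequence a_n = 1 + 1/n (an illustrative choice), which converges to A = 1: for any ε, N = ⌈1/ε⌉ + 1 works.

```python
import math

# Sketch: the epsilon-N definition of convergence for a_n = 1 + 1/n -> 1.
def a(n):
    return 1.0 + 1.0 / n

A = 1.0
eps = 0.001
N = math.ceil(1.0 / eps) + 1  # any n >= N gives |a_n - A| = 1/n < eps

print(all(abs(a(n) - A) < eps for n in range(N, N + 10_000)))
```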
Pointwise Convergence Definition
-for a sequence of functions:
{fi} = {f1,f2,f3,…,fn,…}
-suppose fi:X->ℝ for all i
-then {fi} converges pointwise to f if, for all x∈X and ε>0 there is an N such that for all n≥N:
|fn(x) - f(x)| < ε
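A standard example of pointwise convergence (an illustrative choice) is f_n(x) = x^n on [0,1), which converges pointwise to f(x) = 0; the key feature is that the N needed depends on x, with x closer to 1 requiring a larger N:

```python
import math

# Sketch: pointwise convergence of f_n(x) = x**n on [0, 1) to 0.
def f_n(n, x):
    return x ** n

def N_for(x, eps):
    # smallest N with x**n < eps for all n >= N (for 0 < x < 1)
    return math.ceil(math.log(eps) / math.log(x)) + 1

eps = 1e-3
for x in (0.5, 0.9, 0.99):
    N = N_for(x, eps)
    print(x, N, f_n(N, x) < eps)  # N grows as x approaches 1
```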