Lecture 1 Flashcards
Define the concept of Weak Consistency.
An estimator based on a sample of n observations is weakly consistent if it converges in probability to the true parameter value as n -> inf.
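A standard illustration (not from the cards; assumes an i.i.d. sample with finite variance): the sample mean is weakly consistent for the population mean by Chebyshev's inequality.

```latex
% Weak consistency of the sample mean \bar{X}_n for \mu,
% assuming X_1,\dots,X_n i.i.d. with E[X_i]=\mu and Var(X_i)=\sigma^2<\infty.
\[
P\left(\left|\bar{X}_n - \mu\right| > \delta\right)
\;\le\; \frac{\operatorname{Var}(\bar{X}_n)}{\delta^2}
\;=\; \frac{\sigma^2}{n\,\delta^2}
\;\xrightarrow[n\to\infty]{}\; 0
\qquad \text{for every } \delta > 0 .
\]
```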
Define Convergence in Probability.
A sequence of random variables Xn converges in probability to X if for all δ>0,
lim P(|Xn - X| > δ) = 0 as n -> inf.
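A minimal hypothetical example (not on the card): indicator variables whose success probability shrinks to zero converge in probability to 0.

```latex
% Let X_n satisfy P(X_n = 1) = 1/n and P(X_n = 0) = 1 - 1/n. Then X_n -> 0 in probability,
% since for any 0 < \delta < 1
\[
P(|X_n - 0| > \delta) = P(X_n = 1) = \frac{1}{n} \xrightarrow[n\to\infty]{} 0 ,
\]
% and for \delta \ge 1 the probability is 0 for every n.
```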
Define Almost Sure convergence.
A sequence of random variables Xn almost surely converges to X if for all δ>0,
lim P(|Xm - X| > δ for some m≥n) = 0 as n -> inf.
What is an alternative way of writing almost sure convergence?
P(lim |Xn - X| = 0 as n -> inf) = 1; the limit is taken inside the probability.
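An illustrative example (assumed, not on the card), using the unit interval with Lebesgue measure as the probability space:

```latex
% On ([0,1], Borel sets, Lebesgue measure), let X_n(\omega) = \omega^n and X = 0.
% Then X_n(\omega) -> 0 for every \omega in [0,1), i.e. everywhere except the null set {1}, so
\[
P\Big( \lim_{n\to\infty} |X_n - X| = 0 \Big) = P\big([0,1)\big) = 1 .
\]
```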
Between convergence in probability and almost sure convergence, which implies the other? Why?
Almost sure convergence implies convergence in probability, but not the reverse. For any δ > 0, the event {|Xn - X| > δ} is contained in the event {|Xm - X| > δ for some m ≥ n}, so P(|Xn - X| > δ) is upper bounded by P(|Xm - X| > δ for some m ≥ n); if the latter tends to 0 (almost sure convergence), so does the former (convergence in probability).
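The standard counterexample for the reverse direction (not on the card; assumes independence, and uses the same shrinking indicators as in the example above):

```latex
% Let X_1, X_2, ... be independent with P(X_n = 1) = 1/n and P(X_n = 0) = 1 - 1/n.
% As shown above, X_n -> 0 in probability. However, \sum_n 1/n = \infty, so by the
% second Borel--Cantelli lemma
\[
P\big( X_n = 1 \ \text{infinitely often} \big) = 1 ,
\]
% hence X_n(\omega) fails to converge to 0 for almost every \omega: no almost sure convergence.
```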
Give an alternative definition of convergence in probability.
For all δ > 0 and ε > 0, there exists some n0(δ, ε) s.t. for all n ≥ n0,
P(|Xn - X| > δ) < ε.
Define the concept of uniform convergence.
A sequence of random variables {Xn(θ)} converges uniformly (in θ) to a r.v. X(θ) if, for all ε > 0,
lim P(sup_θ |Xn(θ) - X(θ)| < ε) = 1 as n -> inf.
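A classical example of uniform convergence (not on the card): the Glivenko–Cantelli theorem, where the supremum is taken over x rather than a parameter θ.

```latex
% For an i.i.d. sample with CDF F and empirical CDF \hat{F}_n, the convergence is uniform over x:
\[
\sup_{x \in \mathbb{R}} \big| \hat{F}_n(x) - F(x) \big| \xrightarrow{\ a.s.\ } 0 ,
\]
% which in particular gives \lim_{n} P\big( \sup_x |\hat{F}_n(x) - F(x)| < \varepsilon \big) = 1
% for every \varepsilon > 0, i.e. uniform convergence in the sense defined above.
```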
Define the concept of complete convergence.
We say that Xn converges completely to X if for all δ > 0,
∑ P(|Xn - X| > δ) < inf.
The sum is taken from n = 0 to infinity. In other words, for every δ > 0 the probabilities P(|Xn - X| > δ) must sum to a finite value.
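A minimal sketch of how the definition is checked, using a hypothetical tail bound (not from the cards):

```latex
% Suppose P(|X_n - X| > \delta) \le c/n^2 for some constant c. Then
\[
\sum_{n=1}^{\infty} P(|X_n - X| > \delta)
\;\le\; c \sum_{n=1}^{\infty} \frac{1}{n^2}
\;=\; \frac{c\,\pi^2}{6} \;<\; \infty ,
\]
% so X_n converges completely to X. A bound of order 1/n would not be enough,
% since \sum_n 1/n diverges.
```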
What are the relationships between Almost sure convergence, convergence in probability and complete convergence?
Complete convergence implies almost sure convergence (via the first Borel–Cantelli lemma), and almost sure convergence implies convergence in probability. The reverse implications do not hold in general.
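A sketch of the first implication (the argument is standard, not spelled out on the card):

```latex
% Complete convergence => almost sure convergence, via the first Borel--Cantelli lemma.
% If \sum_n P(|X_n - X| > \delta) < \infty for every \delta > 0, then
\[
P\big( |X_n - X| > \delta \ \text{infinitely often} \big) = 0
\qquad \text{for every } \delta > 0 ,
\]
% so with probability one, |X_n - X| \le \delta for all sufficiently large n.
% Applying this to \delta = 1/k for k = 1, 2, ... gives X_n -> X almost surely.
```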
Define Slutsky’s Theorem.
Given Xn, a sequence of k x 1 random vectors, and X a k x 1 random vector such that Xn converges to X in probability, and a function g(*) which is continuous on the relevant domain. Then:
g(Xn) converges in probability to g(X).
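A simple consequence (standard example, not from the cards), taking g to be the square function:

```latex
% Take g(x) = x^2, which is continuous. If \hat{\theta}_n \xrightarrow{\ p\ } \theta, then
\[
g(\hat{\theta}_n) = \hat{\theta}_n^{\,2} \xrightarrow{\ p\ } \theta^2 = g(\theta) .
\]
% More generally, continuous transformations (sums, products, ratios with nonzero
% denominators) of weakly consistent estimators remain weakly consistent.
```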