Probability Flashcards
Probability
What is the distribution function of a random variable X? (D)
F(x)=P(X<=x)
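A quick worked example (my own illustration, assuming X ~ Bernoulli(1/2)):

```latex
% Illustrative example: CDF of X ~ Bernoulli(1/2)
F(x) =
\begin{cases}
0   & x < 0 \\
1/2 & 0 \le x < 1 \\
1   & x \ge 1
\end{cases}
% F is non-decreasing, right-continuous, F(-\infty)=0, F(\infty)=1.
```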
Probability
Define the median of a distribution X (D)
The point m where P(X<=m)>=1/2 and P(X>=m)>=1/2; if this holds over an interval, the midpoint of that interval
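A worked example of the interval case (my own, same Bernoulli(1/2) as above):

```latex
% X ~ Bernoulli(1/2): P(X <= m) >= 1/2 and P(X >= m) >= 1/2 hold for
% every m in [0,1], so the condition holds over an interval and the
% median is taken as its midpoint:
m = \tfrac{0+1}{2} = \tfrac{1}{2}
```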
Probability
What is the definition of convergence in distribution? (D)
For all points x where F is continuous, Fn(x) -> F(x) as n->∞
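Why the continuity restriction matters (a standard illustration, not from the deck):

```latex
% Let X_n = 1/n deterministically and X = 0. Then
F_n(x) = \mathbf{1}\{x \ge 1/n\}, \qquad F(x) = \mathbf{1}\{x \ge 0\},
% so F_n(0) = 0 for every n while F(0) = 1. The point x = 0 is a
% discontinuity of F, so it is excluded, and X_n -> X in distribution
% as one would hope.
```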
Probability
What is the definition of convergence in probability? (D)
For all ε>0,
P( |Xn-X| > ε ) -> 0 as n->∞
Probability
What is the definition of convergence almost surely? (D)
P ( Xn -> X ) = 1
i.e. with probability 1: for all ε>0 there exists an N where |Xn-X| < ε for all n>=N
Probability
What is that annoying lemma you need for the convergence of random variables proofs? (T)
If An is an increasing sequence of events, A1 ⊆ A2 ⊆ A3 ⊆ … (so A1 has the smallest probability), then
lim[n->∞] P (An) = P ( U An )
Probability
What is the rough proof that:
For an increasing sequence of events An
lim[n->∞] P (An) = P ( U An ) (T)
Start on the RHS. Write U An as a disjoint union, then write the probability as a sum rather than a union. Write the sum as a limit, take the limit outside, then turn the finite sum back into a union, which is An.
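The sketch written out (standard argument, filling in the card's steps):

```latex
% Disjointify: B_1 = A_1,\; B_k = A_k \setminus A_{k-1}, so the B_k are
% disjoint, \bigcup_{k=1}^{n} B_k = A_n and \bigcup_k B_k = \bigcup_n A_n.
P\Big(\bigcup_n A_n\Big)
  = \sum_{k=1}^{\infty} P(B_k)
  = \lim_{n\to\infty} \sum_{k=1}^{n} P(B_k)
  = \lim_{n\to\infty} P\Big(\bigcup_{k=1}^{n} B_k\Big)
  = \lim_{n\to\infty} P(A_n)
```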
Probability
Give the rough proof that convergence in probability => convergence in distribution (T)
Note if Xn <= x then either X <= x+ε or |Xn-X| > ε (these events may overlap, but a union bound still works).
So Fn(x) <= P(X<=x+ε) + P(|Xn-X|>ε) <= F(x+ε) + ε for n large enough. Symmetrically, F(x-ε) <= Fn(x) + ε, so Fn(x) >= F(x-ε) - ε. Since F is continuous at x, squeezing gives Fn(x) -> F(x)
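The sandwich written out (standard argument; the liminf/limsup step is my own filling-in):

```latex
F_n(x) = P(X_n \le x) \le P(X \le x+\varepsilon) + P(|X_n - X| > \varepsilon),
\qquad
F(x-\varepsilon) \le P(X_n \le x) + P(|X_n - X| > \varepsilon).
% Let n -> \infty; the error terms vanish by convergence in probability:
F(x-\varepsilon) \le \liminf_n F_n(x) \le \limsup_n F_n(x) \le F(x+\varepsilon),
% then let \varepsilon -> 0 and use continuity of F at x to get F_n(x) -> F(x).
```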
Probability
Give the rough proof that convergence almost surely => convergence in probability (T)
Fix ε>0 and let AN = { |Xn-X| <= ε for all n>=N }. The AN are increasing, and almost sure convergence guarantees (almost) every outcome lies in some AN. So P ( U AN ) = 1 = lim[N->∞] P(AN) by the weird lemma. But AN ⊆ { |XN-X| <= ε }, so P( |XN-X| > ε ) <= 1 - P(AN) -> 0
Probability
Give an example as to why convergence in probability =/=> convergence almost surely (T)
Let Xn be independent variables st. P(Xn=1)=1/n, P(Xn=0)=(n-1)/n. Then P ( |Xn-0| <= ε ) = P( Xn = 0 ) -> 1, so Xn->0 in probability. Since Xn discrete, {Xn->0} = {Xn=0 eventually} = U BN where
BN = { Xn = 0 for all n>=N }. Then for any N,K>0, by independence
P(BN) <= P ( Xn = 0 for all n=N,…,N+K )
= (N-1)/N * N/(N+1) * … * (N+K-1)/(N+K) = (N-1)/(N+K)
Since K arbitrary, P(BN)=0
The BN are increasing, so lim[N->∞] P (BN) = 0 = P( U BN ) = P(Xn->0), so Xn -/-> 0 a.s
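A quick simulation of the two behaviours (my own sketch): the failure probability 1/n shrinks, yet a single sample path keeps hitting 1 (indeed infinitely often, since Σ1/n diverges, by the second Borel-Cantelli lemma):

```python
import random

# Illustrative sketch: independent X_n with P(X_n = 1) = 1/n, else 0.
random.seed(0)
N = 10**6
hits = [n for n in range(1, N + 1) if random.random() < 1.0 / n]

print("P(X_n = 1) -> 0, e.g. 1/10, 1/1000, 1/100000")
print("n <= 10^6 with X_n = 1 on this sample path:", len(hits))
print("most recent hits:", hits[-5:])
# The hit count grows like log N (about 14 here), so the path never
# settles at 0: X_n -> 0 in probability but not almost surely.
```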
Probability
Give an example as to why convergence in distribution =/=> convergence in probability (T)
Let Y be st. P(Y=0)=1/2=P(Y=1) and let Xn have the same distribution as Y but with P(Xn=Y)=0 for all n
Xn clearly -> Y in distribution but P ( |Xn-Y| > 1/2 ) = 1 for all n, so no convergence in probability
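A concrete choice that realises the card (my own illustration): take Xn = 1 - Y.

```latex
% X_n = 1 - Y is also Bernoulli(1/2), so F_n = F and X_n -> Y in
% distribution trivially; but P(X_n = Y) = P(Y = 1 - Y) = 0 and
|X_n - Y| = |1 - 2Y| = 1 \quad\text{always, so}\quad
P(|X_n - Y| > 1/2) = 1 \ \text{for all } n.
```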
Probability
Under what conditions does convergence in distribution => convergence in probability? (T)
When the limit is a constant: if Xn -d-> c for a constant c, then Xn -P-> c
Probability
State Markov’s inequality (T)
For non-negative X and z>0
P(X>=z)<=E[X]/z
Probability
Prove Markov’s inequality (T)
Let Xz be st.
Xz = 0 when 0 <= X < z
Xz = z when X >= z
So X >= Xz, hence
E[X] >= E[Xz] = 0*P(X<z) + z*P(X>=z) = z*P(X>=z), then rearrange
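The same proof in one line via indicators (a standard alternative, not from the card):

```latex
X \ \ge\ z\,\mathbf{1}\{X \ge z\}
\quad\Longrightarrow\quad
E[X] \ \ge\ z\,E[\mathbf{1}\{X \ge z\}] \ =\ z\,P(X \ge z)
```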
Probability
State Chebyshev’s inequality (T)
For Y with finite mean and variance and ε>0
P ( |Y-E[Y]| >= ε ) <= Var(Y)/ε^2
Probability
Prove Chebyshev’s inequality (T)
P( |Y-E[Y]| >= ε ) = P( (Y-E[Y])^2 >= ε^2 ) <= E[ (Y-E[Y])^2 ]/ε^2 by Markov = Var(Y)/ε^2
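A numeric sanity check of both inequalities (my own sketch, assuming X ~ Exponential(1), so E[X] = 1 and Var(X) = 1):

```python
import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(200_000)]

z, eps = 3.0, 2.0
markov = sum(x >= z for x in xs) / len(xs)              # P(X >= z)
cheby = sum(abs(x - 1.0) >= eps for x in xs) / len(xs)  # P(|X - E[X]| >= eps)

print(f"Markov:    P(X >= {z}) = {markov:.4f} <= E[X]/z = {1/z:.4f}")
print(f"Chebyshev: P(|X-1| >= {eps}) = {cheby:.4f} <= Var/eps^2 = {1/eps**2:.4f}")
```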
Probability
State the weak law of large numbers (T)
Let Xi be i.i.d with mean μ and Sn=X1+…+Xn
Then Sn/n -P-> μ
Probability
Prove the weak law of large numbers (T)
Assume also Var(Xi) = σ^2 < ∞. E[Sn/n]=μ by linearity. Var(Sn/n)=Var(Sn)/n^2=( Var(X1)+Var(X2)+…+Var(Xn) ) / n^2 = nσ^2/n^2 = σ^2/n, using independence.
By Chebyshev, P ( |Sn/n - μ| > ε ) <= Var(Sn/n)/ε^2 = σ^2/(nε^2) -> 0 as n->∞
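A quick simulation (my own sketch, using Uniform(0,1) draws, so μ = 1/2 and σ^2 = 1/12) showing the Chebyshev bound from the proof in action:

```python
import random

random.seed(0)

def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

eps = 0.05
for n in (10, 100, 1000, 10000):
    # Empirical P(|S_n/n - mu| > eps) over 1000 trials, vs the
    # Chebyshev bound sigma^2/(n eps^2) = 1/(12 n eps^2) from the proof.
    miss = sum(abs(sample_mean(n) - 0.5) > eps for _ in range(1000)) / 1000
    print(f"n={n:>5}  P(|S_n/n - 1/2| > {eps}) ~ {miss:.3f}"
          f"  (bound {1/(12*n*eps**2):.3f})")
```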
Probability
State the strong law of large numbers (T)
Let Xi be i.i.d with mean μ and Sn=X1+…+Xn
Then Sn/n -> μ almost surely
Probability
State the central limit theorem (T)
Let Xi be i.i.d with mean μ and variance σ^2. Let Sn=X1+…+Xn
Then (Sn - nμ)/(σ root(n)) -d-> N(0,1)
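A quick simulation (my own sketch, again with Uniform(0,1) draws, μ = 1/2, σ = root(1/12)) checking that the standardised sum looks N(0,1):

```python
import random
import statistics

random.seed(0)
mu, sigma = 0.5, (1 / 12) ** 0.5
n, trials = 1000, 10_000

zs = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (sigma * n ** 0.5))  # (S_n - n mu)/(sigma root(n))

print("mean  ~ 0:", round(statistics.mean(zs), 3))
print("stdev ~ 1:", round(statistics.stdev(zs), 3))
print("P(Z <= 1.96) ~ 0.975:", sum(z <= 1.96 for z in zs) / trials)
```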
Probability
What 3 things does the CLT tell us? (Q)
- That the distribution of Sn concentrates around nμ
- That the fluctuations of Sn are of order root(n)
- That the asymptotic distribution of these fluctuations is normal