QE 1: Probability Theory Flashcards

1
Q

What are Kolmogorov’s axioms of probability?

A

(1) Non-negativity: P(A) ≥ 0 for every event A.
(2) Normalisation: P(Ω) = 1.
(3) Countable additivity: P(A1 or A2 or …) = P(A1) + P(A2) + … for any sequence of pairwise mutually exclusive events; in particular, P(A or B) = P(A) + P(B) when A and B are mutually exclusive.

2
Q

What is independence? Define three different ways.

A

Events A and B are independent if any of the following hold:
P(A|B) = P(A) (requires P(B) > 0)
P(B|A) = P(B) (requires P(A) > 0)
P(A and B) = P(A)P(B)
The third is the most general definition, since it needs no conditioning.
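A quick sanity check of the product definition, sketched in Python; the two-dice events here are my own illustration, not from the cards:

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    """Exact probability of an event (a predicate on outcomes)."""
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

A = lambda o: o[0] % 2 == 0       # first die is even
B = lambda o: o[0] + o[1] == 7    # the sum is 7

p_a, p_b, p_ab = prob(A), prob(B), prob(lambda o: A(o) and B(o))
print(p_a, p_b, p_ab)             # 1/2 1/6 1/12
print(p_ab == p_a * p_b)          # True: A and B are independent
```

Enumerating outcomes exactly (with `Fraction`) avoids any floating-point ambiguity in the equality check.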

3
Q

What does it mean to say that the expectation is a ‘linear operator’? Why is it one?

A

E[aX + bY] = aE[X] + bE[Y] for any constants a, b and random variables X, Y. It is linear because an expectation is a sum (discrete case) or an integral (continuous case), and sums and integrals are themselves linear.
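The identity can be verified exactly on a small discrete distribution; the joint pmf below is an arbitrary example of mine:

```python
from fractions import Fraction

# A small joint pmf for (X, Y), with exact probabilities.
joint = {
    (0, 1): Fraction(1, 4),
    (1, 2): Fraction(1, 4),
    (2, 0): Fraction(1, 2),
}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in joint.items())

a, b = 3, -2
lhs = E(lambda x, y: a * x + b * y)                   # E[aX + bY]
rhs = a * E(lambda x, y: x) + b * E(lambda x, y: y)   # aE[X] + bE[Y]
print(lhs, lhs == rhs)                                # 9/4 True
```

Note that linearity holds with no independence assumption on X and Y.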

4
Q

What is the ‘law of the unconscious statistician’?

A

E[g(X)] = ∫g(x)f(x)dx, where f is the density of X (in the discrete case, Σ g(x)P(X = x)). It lets you compute E[g(X)] directly from the distribution of X, without first deriving the distribution of g(X).
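A discrete illustration of why the law is convenient, using a fair die and a g of my choosing: the LOTUS sum agrees with the "long way" of first deriving the pmf of g(X).

```python
from collections import defaultdict
from fractions import Fraction

# X is a fair six-sided die; g(X) = (X - 3)^2.
pmf_x = {x: Fraction(1, 6) for x in range(1, 7)}
g = lambda x: (x - 3) ** 2

# LOTUS: E[g(X)] = sum of g(x) P(X = x), no pmf of g(X) needed.
lotus = sum(g(x) * p for x, p in pmf_x.items())

# The long way: derive the pmf of Y = g(X), then take E[Y].
pmf_y = defaultdict(Fraction)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p          # values of g collide, e.g. g(2) == g(4)
direct = sum(y * p for y, p in pmf_y.items())

print(lotus, lotus == direct)   # 19/6 True
```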

5
Q

What is the law of iterated expectations?

A

E[E[Y|X]] = E[Y]
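The law can be checked exactly on a small joint distribution (the pmf below is my own example): compute E[Y|X = x], average it over the distribution of X, and compare with E[Y].

```python
from collections import defaultdict
from fractions import Fraction

# A small joint pmf for (X, Y), with exact probabilities.
joint = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4), (1, 1): Fraction(1, 2)}

# Marginal of X and the conditional mean E[Y | X = x].
p_x = defaultdict(Fraction)
num = defaultdict(Fraction)        # accumulates sum over y of y * P(X = x, Y = y)
for (x, y), p in joint.items():
    p_x[x] += p
    num[x] += y * p
cond_mean = {x: num[x] / p_x[x] for x in p_x}

# Outer expectation: average E[Y|X] over the distribution of X.
iterated = sum(cond_mean[x] * p_x[x] for x in p_x)
# E[Y] computed directly from the joint pmf.
direct = sum(y * p for (x, y), p in joint.items())

print(iterated, iterated == direct)   # 3/4 True
```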

6
Q

What is ‘i.i.d.’ data?

A

Observations are independent and identically distributed: each observation is drawn from the same distribution, and the observations are mutually independent.

7
Q

In what sense is E[Y|X] the ‘best possible predictor for Y on the basis of X’?

A

E[Y|X] minimises the mean squared prediction error E[(Y − m(X))²] over all functions m(X); no other predictor built from X alone achieves a lower MSE.
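A simulation sketch of the MSE-minimising property, under a model of my own choosing where the conditional mean is known in closed form:

```python
import random

random.seed(0)

# Simulated data: Y = 2X + e with E[e | X] = 0, so E[Y | X] = 2X.
n = 50_000
xs = [random.random() for _ in range(n)]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]

def mse(m):
    """Mean squared prediction error of the predictor m(X) for Y."""
    return sum((y - m(x)) ** 2 for x, y in zip(xs, ys)) / n

mse_cond = mse(lambda x: 2 * x)   # the conditional mean E[Y|X]
mse_alt = mse(lambda x: x)        # some other function of X
print(mse_cond < mse_alt)         # True: E[Y|X] has the smaller MSE
```

Any other m you try should do worse; the conditional mean's MSE here is approximately Var(e) = 0.25, the irreducible noise.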

8
Q

Define convergence in probability.

A

Xn converges in probability to c if, for every ε > 0, P{|Xn − c| ≤ ε} → 1 as n → ∞ (equivalently, P{|Xn − c| > ε} → 0).
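The definition can be seen numerically: estimate P{|X̄n − μ| ≤ ε} for growing n and watch it approach 1. The coin-flip example and ε = 0.05 are my own choices:

```python
import random

random.seed(1)

# Xbar_n is the mean of n fair coin flips; mu = 0.5, eps = 0.05.
mu, eps, reps = 0.5, 0.05, 2000

def prob_close(n):
    """Monte Carlo estimate of P(|Xbar_n - mu| <= eps)."""
    hits = 0
    for _ in range(reps):
        xbar = sum(random.random() < 0.5 for _ in range(n)) / n
        hits += abs(xbar - mu) <= eps
    return hits / reps

probs = [prob_close(n) for n in (10, 100, 1000)]
print(probs)   # the probability climbs toward 1 as n grows
```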

9
Q

What is the Law of Large Numbers?

A

The (weak) Law of Large Numbers states:

Suppose Xi is i.i.d. with finite mean and variance. Then the sample mean converges in probability to the population mean; that is, the sample mean is a consistent estimator of the population mean.
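A minimal sketch of the theorem in action, using fair die rolls (μ = 3.5) as my example:

```python
import random
import statistics

random.seed(2)

# Sample means of fair die rolls for growing n: they settle near mu = 3.5.
def sample_mean(n):
    return statistics.mean(random.randint(1, 6) for _ in range(n))

means = {n: sample_mean(n) for n in (100, 10_000, 100_000)}
print(means)   # the largest-n mean should be very close to 3.5
```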

10
Q

What is the Central Limit Theorem?

A

If Xi is i.i.d. with finite mean μ and variance σ², then √n(X̄ − μ)/σ converges in distribution to N(0, 1).
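A simulation sketch: standardised means of Uniform(0, 1) draws (a distribution far from normal) should already look standard normal at moderate n. The choices n = 30 and 2000 replications are mine:

```python
import math
import random
import statistics

random.seed(3)

# X ~ Uniform(0, 1): mu = 0.5, sigma = sqrt(1/12).
mu, sigma = 0.5, math.sqrt(1 / 12)
n, reps = 30, 2000

# One standardized sample mean: sqrt(n) * (xbar - mu) / sigma.
def z_stat():
    xbar = sum(random.random() for _ in range(n)) / n
    return math.sqrt(n) * (xbar - mu) / sigma

zs = [z_stat() for _ in range(reps)]
print(statistics.mean(zs), statistics.stdev(zs))  # close to 0 and 1
within = sum(abs(z) <= 1.96 for z in zs) / reps
print(within)   # close to 0.95, the standard normal value
```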
