2P7 Mathematics Flashcards

1
Q

What is a sample space?

A

The set of all possible outcomes of an experiment, denoted Ω

2
Q

What are the three axioms of probability?

A

P(A) >= 0
P(Ω) = 1
P(A ∪ B) = P(A) + P(B) for A, B ⊆ Ω with A ∩ B = ∅,
where ∅ is the empty set

3
Q

How is the probability of A given B defined?

A

P(A|B) = P(A∩B)/P(B)
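As a quick sanity check (a hypothetical example, not from the card): roll one fair die, with A = "even" and B = "greater than 3".

```python
from fractions import Fraction

# Hypothetical events on one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even
B = {4, 5, 6}   # greater than 3

def P(event):
    # Uniform probability: |event| / |Ω|
    return Fraction(len(event), len(omega))

# P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = P(A & B) / P(B)
```

Here P(A∩B) = 2/6 and P(B) = 3/6, so P(A|B) = 2/3.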

4
Q

How are the probability mass function and cumulative distribution function defined for discrete random variables?

A

Px(x) = P(X=x)
Fx(x) = P(X<=x)
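A concrete (made-up) case, the PMF and CDF of a fair die:

```python
from fractions import Fraction

# Hypothetical example: X = one roll of a fair die.
support = [1, 2, 3, 4, 5, 6]
pmf = {k: Fraction(1, 6) for k in support}           # Px(x) = P(X = x)
cdf = {k: sum(pmf[j] for j in support if j <= k)     # Fx(x) = P(X <= x)
       for k in support}
```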

5
Q

How is the joint PMF defined for two variables?

A

P(x,y) = P( X=x ∩ Y=y )

6
Q

What is Bayes’ rule?

A

P(B|A) = P(A|B)P(B)/P(A)
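A worked example with invented numbers (B = "has condition", A = "test positive"):

```python
# All numbers here are made up for illustration.
p_B = 0.01               # P(B): prior
p_A_given_B = 0.90       # P(A|B)
p_A_given_Bc = 0.05      # P(A|B^c)

# P(A) via the law of total probability
p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)

# Bayes' rule: P(B|A) = P(A|B) P(B) / P(A)
p_B_given_A = p_A_given_B * p_B / p_A
```

Despite the accurate test, P(B|A) is only about 0.15 because the prior P(B) is small.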

6
Q

What is marginalisation for joint PMFs?

A

Px(x) = sum over y (P(x,y))
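For instance, marginalising a small made-up joint PMF:

```python
from fractions import Fraction

# A small made-up joint PMF P(x, y).
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

# Px(x) = sum over y of P(x, y)
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, Fraction(0)) + p
```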

6
Q

What is the law of total probability?

A

P(A) = P(A|B)P(B) + P(A|B^c) P(B^c) where B^c is the complement of B
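A quick numeric check with a made-up two-urn setup:

```python
from fractions import Fraction

# Made-up example: pick urn 1 (event B) with probability 1/3, else urn 2;
# A = "draw a red ball".
p_B = Fraction(1, 3)
p_A_given_B = Fraction(1, 2)    # red fraction in urn 1
p_A_given_Bc = Fraction(1, 4)   # red fraction in urn 2

# P(A) = P(A|B)P(B) + P(A|B^c)P(B^c)
p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)
```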

7
Q

Definition of independence of two events.

A

P(A∩B)=P(A)P(B)

7
Q

Define expectation

A

E[g(X)] = sum(g(x) Px(x)) over all x in the support of X
It is linear.
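For example (fair die, g(x) = x^2, both hypothetical choices):

```python
from fractions import Fraction

# Hypothetical example: fair die, g(x) = x^2.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

e_X = sum(x * p for x, p in pmf.items())        # E[X]
e_gX = sum(x**2 * p for x, p in pmf.items())    # E[g(X)] with g(x) = x^2

# Linearity: E[2X + 3] = 2 E[X] + 3
e_lin = sum((2 * x + 3) * p for x, p in pmf.items())
```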

8
Q

What is the expectation rule for two independent random variables?

A

E(XY) = E(X)E(Y) if X and Y are independent

9
Q

Define Var[X]

A

Var[X] = E[(X-E[X])^2] = E[X^2] - E[X]^2
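The two forms can be checked against each other on a fair die (a hypothetical example):

```python
from fractions import Fraction

# Verify both forms of the variance agree on a fair die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
e_X = sum(x * p for x, p in pmf.items())

var_def = sum((x - e_X) ** 2 * p for x, p in pmf.items())    # E[(X-E[X])^2]
var_short = sum(x**2 * p for x, p in pmf.items()) - e_X**2   # E[X^2] - E[X]^2
```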

10
Q

Define entropy

A

H[X] = E[-log2P(X)]
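As a sketch, entropy in bits for a fair coin versus a (made-up) biased coin:

```python
import math

# H[X] = E[-log2 P(X)] = -sum over x of Px(x) log2 Px(x), in bits.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

h_fair = entropy([0.5, 0.5])     # fair coin: exactly 1 bit
h_biased = entropy([0.9, 0.1])   # biased coin: less than 1 bit
```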

11
Q

What is the Bernoulli Distribution?

A

A trial with a binary outcome
X ∈ {0,1}, p ∈ [0,1]
X ~ Ber(p)
Px(k) = { p if k=1, 1-p if k=0, 0 otherwise }

12
Q

What is the geometric distribution used for?

A

How many trials until I’m successful. X ~ Geo(p)

Px(k) is the probability that the first success occurs on trial k.

12
Q

What is the formula for the geometric distribution?

A

P(k) = p(1-p)^(k-1)
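A numerical sanity check with an arbitrary p: the PMF should sum to 1 and have mean 1/p (the sum is truncated, so both are approximate):

```python
# Geometric PMF P(k) = p(1-p)^(k-1), truncated at a large k.
p = 0.3
ks = range(1, 2000)
pmf = [p * (1 - p) ** (k - 1) for k in ks]

total = sum(pmf)                          # should be ~1
mean = sum(k * q for k, q in zip(ks, pmf))  # should be ~1/p
```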

13
Q

What does the binomial distribution describe?

A

How many times was I successful after n trials?

X ~ B(n,p), where n is the number of trials and p is the probability of success.

Px(k) = (n choose k) p^k (1-p)^(n-k)

14
Q

What does the poisson distribution describe?

A

How many times was I successful, given a success rate λ?

X ~ Pois(λ),

λ is the average number of events per time interval

Px(k) is the probability that there are k of these events in the time interval:

Px(k) = λ^k e^(-λ) / k!

15
Q

How does the Bernoulli distribution relate to the binomial?

A

Xj ~ Ber(p), independent for j = 1, ..., n

sum over j (Xj) ~ B(n,p)
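This can be verified exactly by convolving Bernoulli PMFs (n and p are arbitrary example values):

```python
from math import comb

# Convolving n Ber(p) PMFs should reproduce the B(n, p) PMF.
n, p = 5, 0.4
ber = {0: 1 - p, 1: p}

pmf = {0: 1.0}              # PMF of the running sum, starts at "sum = 0"
for _ in range(n):
    new = {}
    for s, q in pmf.items():
        for x, r in ber.items():
            new[s + x] = new.get(s + x, 0.0) + q * r
    pmf = new

binom = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}
```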

16
Q

How does the binomial distribution relate to the poisson distribution?

A

B(n, λ/n) -> Pois(λ) as n -> ∞
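A numerical illustration of the limit, at a single (arbitrary) value of k:

```python
from math import comb, exp, factorial

# B(n, lam/n) at fixed k approaches Pois(lam) as n grows.
lam, k = 2.0, 3

def binom_pmf(n):
    p = lam / n
    return comb(n, k) * p**k * (1 - p) ** (n - k)

poisson_pmf = exp(-lam) * lam**k / factorial(k)
approx = binom_pmf(100_000)
```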

17
Q

What are some properties of the cumulative distribution function Fx(x)?

A

Non-decreasing:
Fx(a) <= Fx(b) for a <= b

Limits are 0 and 1 as x -> -∞ and x -> +∞ respectively

Fx(b) - Fx(a) = P(a < X <= b)

18
Q

Definition of the joint CDF?

A

F(x,y) = P(X <= x ∩ Y <= y)

19
Q

What is the exponential density function?

A

X ~ Exp(λ), where λ is the rate of successes and X is the waiting time between two successes.

fx(x) = λ e^(-λx) for x >= 0

20
Q

What is the beta density function?

A

The PDF of the trial probability if we observe α−1 successes and β−1 failures.

X∼Beta(α,β)

X ∈ [0,1], so X can itself be read as a probability; here that probability has its own distribution.

21
Q

If X and Y are discrete random variables and Y = g(X), how do you extract Py(y)?

A

Py(y) = sum(Px(x)) over all x such that g(x) = y

22
Q

If X and Y are continuous random variables and Y = g(X), how do you extract fy(y)?

A

given g is strictly monotonic,

fy(y) = fx(g^-1(y)) / |g'(g^-1(y))|

23
Q

What is the normalised Gaussian distribution?

A

(X-µ)/σ ~ N(0,1)

24
Q

How do you find the distribution of S = X+Y for discrete random variables?

A

Ps(s) = sum over y (P(s-y, y)) = sum over x (P(x, s-x))

if X and Y are independent:

Ps(s) = sum over y (Px(s-y) Py(y)) [convolution]
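The independent case can be sketched with the (hypothetical) sum of two fair dice:

```python
from fractions import Fraction

# Sum of two independent fair dice via Ps(s) = sum over y of Px(s-y) Py(y).
die = {k: Fraction(1, 6) for k in range(1, 7)}

ps = {s: sum(die.get(s - y, Fraction(0)) * py for y, py in die.items())
      for s in range(2, 13)}
```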

25
Q

How do you find the distribution of S = X+Y for continuous random variables?

A

fs(s) = integral( fxy(s-y, y) dy )

for independent X and Y:

fs(s) = integral( fx(s-y) fy(y) dy ), i.e. fs = fx * fy where * denotes convolution

26
Q

How do expectation and variance change with S = X+Y?

A

E[S] = E[X] + E[Y]
Var[S] = Var[X] + Var[Y] + 2Cov[X,Y]

where Cov[X,Y] = E[XY] - E[X]E[Y]

if X and Y are independent, Cov[X,Y] = 0

27
Q

What is correlation of two random variables?

A

ρ = Cov[X,Y]/sqrt(Var[X]Var[Y])

28
Q

What is the probability generating function?

A

Gx(z) = sum over k (z^k Px(k)) = E[z^X]

29
Q

What are some properties of the probability generating function for discrete random variables?

A

E[X] = G'x(1)

Var[X] = G''x(1) + G'x(1) - G'x(1)^2

if X and Y are independent,
GX+Y(z) = Gx(z) Gy(z)
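These identities can be checked on a binomial example (n and p are arbitrary), using G(z) = sum over k of Px(k) z^k, so G'(1) = sum k Px(k) and G''(1) = sum k(k-1) Px(k):

```python
from math import comb

# Check E[X] = G'(1) and Var[X] = G''(1) + G'(1) - G'(1)^2 on B(n, p).
n, p = 6, 0.25
pk = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

g1 = sum(k * q for k, q in enumerate(pk))            # G'(1)
g2 = sum(k * (k - 1) * q for k, q in enumerate(pk))  # G''(1)

mean = g1
var = g2 + g1 - g1**2
```

For B(n, p) the known values are E[X] = np and Var[X] = np(1-p).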

30
Q

What’s the moment generating function for continuous random variables?

A

gx(s) = integral( fx(x) e^(sx) dx ) = E[e^(sX)]

31
Q

What are some properties of the moment generating function?

A

E[X] = g'(0)

Var[X] = g''(0) - g'(0)^2

if X and Y are independent,
gX+Y(s) = gx(s) gy(s)

32
Q

What’s the central limit theorem?

A

If X1, ..., Xn are independent random variables (of any distribution) with means µi and variances σi^2,

then for large n,
[sum(Xi) - sum(µi)]/sqrt(sum(σi^2)) ~ N(0,1) approximately
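An empirical sketch (uniform draws chosen arbitrarily; each has mean 1/2 and variance 1/12): the standardised sums should look roughly N(0,1).

```python
import random

# Standardise sums of n iid Uniform(0,1) draws and check the first two
# moments of the resulting sample are near 0 and 1.
random.seed(0)
n, trials = 50, 2000
mu, var = 0.5, 1.0 / 12.0

zs = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (n * var) ** 0.5)

sample_mean = sum(zs) / trials
sample_var = sum(z * z for z in zs) / trials
```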

33
Q

What is the analogue of variance in a multivariate Gaussian?

A

The covariance matrix Σ, where Σij = E[(Xi-µi)(Xj-µj)] = Cov[Xi, Xj]