SU2 - The Properties of Random Variables, The Normal and Its Related Distributions Flashcards

1
Q

What is the expected value of a random variable?

A

It is the probability-weighted average of all possible values of X, also known as the mean: for a discrete X, E(X) = Σ x P(X = x).

2
Q

What is the Expectation of Sums?

E(X1+X2)

A

E(X1)+E(X2)
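This additivity does not require X1 and X2 to be independent. A minimal simulation sketch, assuming numpy is available (the distributions chosen here are arbitrary illustrations):

```python
# Sketch: E(X1 + X2) = E(X1) + E(X2) even when X1 and X2 are dependent.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(loc=2.0, scale=1.0, size=100_000)
x2 = 3.0 * x1 + rng.normal(size=100_000)   # deliberately dependent on x1

print(np.mean(x1 + x2))           # ~ 8.0
print(np.mean(x1) + np.mean(x2))  # same value, up to floating point
```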

3
Q

What is the result of E(c1X1+c2X2)?

A

E(c1X1) + E(c2X2) = c1E(X1) + c2E(X2)

4
Q

What is the expectation of a binomial distribution?

A

np
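A quick numerical check of E(X) = np, assuming numpy is available; n and p below are arbitrary:

```python
# Sketch: the sample mean of binomial draws is close to n * p.
import numpy as np

rng = np.random.default_rng(0)
n, p = 10, 0.3
draws = rng.binomial(n, p, size=200_000)
print(draws.mean())  # close to n * p = 3.0
```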

5
Q

What is Jensen’s inequality?

A

For a convex function g, g(E(X)) ≤ E(g(X)); the inequality reverses for a concave g.
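A small sketch of the inequality for the convex function g(x) = x², assuming numpy is available (the exponential distribution below is an arbitrary choice):

```python
# Sketch: g(E(X)) <= E(g(X)) for the convex function g(x) = x^2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)

g_of_mean = np.mean(x) ** 2    # g(E(X)), ~ 4
mean_of_g = np.mean(x ** 2)    # E(g(X)), ~ 8
print(g_of_mean <= mean_of_g)  # True
```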

6
Q

What is the result of E (aX + b)?

A

E (aX + b) = aE(X) + b

7
Q

What is the expectation of a Bernoulli distribution?

A

p

8
Q

Is the expectation of a function of a random variable equal to the function of the random variable’s expectation?
E [g (X)] = g (E [X])

A

No, not in general; equality holds only when g is a linear (affine) function. For convex g, Jensen’s inequality gives g(E(X)) ≤ E(g(X)).

9
Q

Is the expectation of the product of random variables equal to the product of their expectations?

A

No; it holds if the random variables in the product are independent (more generally, whenever they are uncorrelated).
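A sketch contrasting the independent and dependent cases, assuming numpy is available (the normal distributions are arbitrary choices):

```python
# Sketch: E(XY) = E(X)E(Y) under independence, but not in general.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=200_000)
y_indep = rng.normal(loc=2.0, scale=1.0, size=200_000)  # independent of x
y_dep = x                                               # perfectly dependent on x

print(np.mean(x * y_indep), np.mean(x) * np.mean(y_indep))  # both ~ 2.0
print(np.mean(x * y_dep), np.mean(x) * np.mean(y_dep))      # ~ 2.0 vs ~ 1.0
```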

10
Q

What is the Cauchy distribution?

A

The Student-t distribution with 1 degree of freedom. Its expectation does not exist.
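One way to see the missing expectation is that the running mean of Cauchy draws never settles down. A minimal sketch, assuming numpy is available:

```python
# Sketch: the running mean of Cauchy draws keeps jumping, unlike normal draws.
import numpy as np

rng = np.random.default_rng(0)
cauchy = rng.standard_cauchy(size=100_000)
normal = rng.standard_normal(size=100_000)

def running_mean(a):
    return np.cumsum(a) / np.arange(1, a.size + 1)

print(running_mean(normal)[-5:])  # hovers near 0
print(running_mean(cauchy)[-5:])  # no stable value to converge to
```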

11
Q

What is the variance?

A

It measures how “spread out” the values of a random variable are:
Var(X) = E[(X - μ)²] or E(X²) - μ²
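A quick check that the two formulas agree, assuming numpy is available (the normal distribution is an arbitrary choice):

```python
# Sketch: E[(X - mu)^2] and E(X^2) - mu^2 give the same variance.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)
mu = x.mean()

print(np.mean((x - mu) ** 2))     # E[(X - mu)^2]
print(np.mean(x ** 2) - mu ** 2)  # E(X^2) - mu^2, same value
print(np.var(x))                  # numpy's built-in (population) variance
```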

12
Q

For any constant c, what is the value of Var(c)?

A
For any constant c, Var(c) = 0, because a constant does not vary at all.
13
Q

For any constants a and b, Var(aX+b) = ??

A

For any constants a and b, Var(aX+b) = a^2 Var(X)

When we add b to X, we only shift the distribution of X laterally. The shape of the distribution of X stays the same; therefore, the variance of X remains unchanged
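A sketch of the shift-and-scale rule, assuming numpy is available (a and b are arbitrary constants):

```python
# Sketch: adding b changes nothing; multiplying by a scales the variance by a^2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=3.0, size=100_000)
a, b = 2.0, 7.0

print(np.var(a * x + b))   # ~ a^2 * Var(X) = 4 * 9 = 36
print(a ** 2 * np.var(x))  # same, up to floating point
```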

14
Q

For any constants a and b, sd(aX + b) = ?

A

|a| sd (X)

15
Q

What does standardization mean?

A

To re-centre the expectation of the random variable at 0 and to normalise its variance to 1, i.e. Z = (X - μ)/σ, so that E(Z) = 0 and Var(Z) = 1.
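A minimal sketch of standardisation, assuming numpy is available:

```python
# Sketch: Z = (X - mu) / sigma has mean ~0 and variance ~1.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=4.0, size=100_000)

z = (x - x.mean()) / x.std()
print(z.mean())  # ~ 0
print(z.var())   # ~ 1
```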

16
Q

If X1 and X2 are uncorrelated/independent, Var(X1 + X2) = ?

A

Var(X1) + Var(X2)

17
Q

Is the variance of the sum of random variables always equal to the sum of their variances?

A

No. In general, Var(X1 + X2) = Var(X1) + Var(X2) + 2Cov(X1, X2), so the variances add up only when the variables are uncorrelated (independence is sufficient).

18
Q

What happens to the covariance when X and Y are independent?

A

If X and Y are independent, then Cov(X,Y) = 0, and E(XY) = E(X)E(Y)

19
Q

What is the correlation coefficient?

A

The correlation coefficient, corr(X,Y) = Cov(X,Y) / (sd(X) sd(Y)), can be thought of as a standardised covariance that does not depend on units of measurement.

corr(X,Y) = 1 (or -1) means a perfect positive (negative) linear relationship
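A sketch showing the rescaling, assuming numpy is available (the coefficients are chosen so the true correlation is 0.8):

```python
# Sketch: corr(X, Y) is Cov(X, Y) divided by the product of standard deviations.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.8 * x + 0.6 * rng.normal(size=100_000)  # Var(Y) = 1, Cov(X, Y) = 0.8

cov_xy = np.cov(x, y)[0, 1]
corr_manual = cov_xy / (x.std(ddof=1) * y.std(ddof=1))
print(corr_manual)              # ~ 0.8
print(np.corrcoef(x, y)[0, 1])  # matches
```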

20
Q

What is the variance of a Bernoulli distribution?

A

Var(X) = p(1-p). (For a binomial distribution with n trials, the variance is np(1-p).)

21
Q

What is the alternate expression for Cov(X, Y)?

A

E(XY) - μ_X μ_Y

E(XY) - E(X)E(Y)

22
Q

What are the three properties of covariances?

A
  • For any constant a, Cov(a,X) = 0.
  • For random variable Z, Cov(X+Y,Z) = Cov(X,Z)+Cov(Y,Z).
  • For any constants a1 and a2, Cov(a1 X,a2 Y) = a1 a2 Cov(X,Y).
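A numerical sketch of the three rules, assuming numpy is available (a1, a2 and the distributions are arbitrary):

```python
# Sketch: constants have zero covariance, covariance is additive, constants factor out.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)
z = rng.normal(size=100_000)
a1, a2 = 2.0, -3.0

def cov(u, v):
    return np.cov(u, v)[0, 1]

print(cov(np.full_like(x, 5.0), x))              # ~ 0
print(cov(x + y, z), cov(x, z) + cov(y, z))      # equal
print(cov(a1 * x, a2 * y), a1 * a2 * cov(x, y))  # equal
```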
23
Q

Does zero correlation imply independence?

A

No. Independence implies zero correlation, but zero correlation does not imply independence. A classic counterexample: if X is symmetric around 0 and Y = X², then corr(X,Y) = 0 even though Y is completely determined by X. (For jointly normal random variables, however, zero correlation does imply independence.)
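The counterexample above is easy to check numerically; a minimal sketch, assuming numpy is available:

```python
# Sketch: Y = X^2 with X symmetric about 0 is uncorrelated with X yet fully dependent on it.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200_000)   # symmetric around 0
y = x ** 2                     # completely determined by x

print(np.corrcoef(x, y)[0, 1])  # ~ 0 despite the exact functional dependence
```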

24
Q

What happens if you add a constant to a random variable in a covariance?

A

It does not affect its covariance with the other random variable:

Cov(X + a, Y) = Cov(X, Y)

25
Q

What happens if you multiply a random variable by a constant in a covariance?

A

It scales the covariance by the same multiple: Cov(aX, Y) = a Cov(X, Y).

26
Q

What is a conditional expectation?

A

It is the expected value of one random variable given the value of another, i.e. it explains one variable in terms of the other variable.

27
Q

What is the law of Iterated Expectation?

A

E[E(Y|X)] = E(Y)
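A sketch of the law with a binary conditioning variable, assuming numpy is available (all numbers are made up for illustration):

```python
# Sketch: averaging the conditional means E(Y|X) recovers the unconditional mean E(Y).
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.4, size=200_000)   # P(X = 1) = 0.4
y = np.where(x == 1,                     # E(Y|X=1) = 5, E(Y|X=0) = 2
             rng.normal(5.0, 1.0, size=x.size),
             rng.normal(2.0, 1.0, size=x.size))

e_y_given_x = np.where(x == 1, y[x == 1].mean(), y[x == 0].mean())
print(e_y_given_x.mean())  # ~ 0.4 * 5 + 0.6 * 2 = 3.2
print(y.mean())            # ~ 3.2 as well
```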

28
Q

How to write a conditional expectation of a random variable Y given a random variable X?

A

E(Y|X)

29
Q

E(X|X) = ?

A

X

30
Q

E(c(X)|X) = ?

A

c(X)

31
Q

If X and Y are independent, E(Y|X) = ?

A

E(Y)

32
Q

If E(Y|X) = E(Y), then Cov(X,Y) = ?

A

0

33
Q

Can E(XY) be rewritten directly as E(E(XY|Y))?

A

Yes, by the law of iterated expectations. Since Y is known given Y, E(XY|Y) = Y E(X|Y), so E(XY) = E(Y E(X|Y)).

34
Q

What is a normal random variable?

A

A normal random variable is symmetrically distributed around its mean μ. The peak of the normal density function occurs at the mean μ, and the density is concentrated around μ.

35
Q

How to read this?

X ∼ N(µ, σ²)

A

The normal distribution with mean µ and variance σ²

36
Q

If X ∼ N(µ, σ²), Y = aX + b, and a ≠ 0, then Y ∼ ?

A

Y has a normal distribution with mean aµ + b and variance a²σ².

Every linear function of X (with a ≠ 0) also has a normal distribution.
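A quick check of the mean and variance of the transformed variable, assuming numpy is available (µ, σ, a, b below are arbitrary):

```python
# Sketch: Y = aX + b has mean a*mu + b and variance a^2 * sigma^2.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
a, b = 3.0, -4.0

x = rng.normal(mu, sigma, size=200_000)
y = a * x + b
print(y.mean(), a * mu + b)          # both ~ -1.0
print(y.var(), a ** 2 * sigma ** 2)  # both ~ 36.0
```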

37
Q

What is the standard normal?

A

A normal random variable with mean 0 and variance 1.

38
Q

What are the mean and variance of the chi-square distribution?

A
E(Y) = m and Var(Y) = 2m, where m is the degrees of freedom.
39
Q

How is the chi-square related to the standard normal?

A

If Z1, …, Zm are independent standard normal random variables, then Z1² + … + Zm² follows a chi-square distribution with m degrees of freedom. (Relatedly, as its degrees of freedom go to infinity, the Student-t converges to the standard normal.)
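A sketch of the sum-of-squares construction and the E(Y) = m, Var(Y) = 2m facts, assuming numpy is available (m is arbitrary):

```python
# Sketch: summing m squared independent standard normals gives chi-square(m) behaviour.
import numpy as np

rng = np.random.default_rng(0)
m = 5
z = rng.standard_normal(size=(200_000, m))
y = (z ** 2).sum(axis=1)   # one chi-square(m) draw per row

print(y.mean())  # ~ m = 5
print(y.var())   # ~ 2m = 10
```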