Week 4: Expected Value & Covariance Flashcards

1
Q

If a random variable X takes finitely many values x1, x2, ..., xn with equal (uniform) probability, the average of X is:

A

E[X] = (1/n) * (x1 + x2 + ... + xn), where n is the number of values
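A minimal Python sketch of this card, using arbitrary example values:

```python
# Uniform-probability average: E[X] = (1/n) * (x1 + x2 + ... + xn).
values = [2, 5, 7, 10]              # arbitrary example outcomes
n = len(values)
expected_value = sum(values) / n
print(expected_value)               # 6.0
```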

2
Q

How do you calculate the weighted average when a random variable X takes finitely many values x1, x2, ..., xn with probabilities p1, p2, ..., pn (not necessarily uniform)?

A

Weighted average: E[X] = sum over i of pi * xi, where pi is the probability of the value xi
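A minimal Python sketch of the weighted average, with arbitrary example values and probabilities:

```python
# Weighted average: E[X] = sum over i of p_i * x_i.
values = [1, 2, 3, 4]               # arbitrary example outcomes
probs  = [0.1, 0.2, 0.3, 0.4]       # probabilities must sum to 1
expected_value = sum(p * x for p, x in zip(probs, values))
print(expected_value)               # 0.1*1 + 0.2*2 + 0.3*3 + 0.4*4 = 3.0
```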

3
Q

What is the expected value of a continuous random variable?

A

The probability-weighted mean of all possible outcomes, computed with an integral over the density instead of a sum.

E(X) = integral from negative infinity to infinity of x * fX(x) dx
- where fX(x) is the probability density function (PDF) of X
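A small numerical check of this integral, assuming an Exponential(1) density as the example PDF (its true mean is 1):

```python
# E(X) = integral of x * f_X(x) dx, approximated numerically with scipy.
import numpy as np
from scipy.integrate import quad

def pdf(x):
    # Exponential(1) density: e^(-x) for x >= 0; zero below 0.
    return np.exp(-x) if x >= 0 else 0.0

# The density is 0 for x < 0, so integrating from 0 upward covers the whole line.
expected_value, _err = quad(lambda x: x * pdf(x), 0, np.inf)
print(expected_value)   # ~1.0
```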

4
Q

What does it mean for a random variable to be integrable?

A

Integrable means the expected value formula for the random variable (the sum in the discrete case, the integral in the continuous case) converges absolutely, i.e. E[|X|] is finite.
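A simulation sketch of the idea: the standard normal is integrable (E[|X|] = sqrt(2/pi), about 0.8), while the standard Cauchy is not, so its running average of |X| never settles. The distributions and sample sizes are arbitrary choices for illustration:

```python
# Integrable: E[|X|] is finite, so sample averages of |X| stabilise.
# Non-integrable (Cauchy): E[|X|] diverges, so the averages stay large and erratic.
import numpy as np

rng = np.random.default_rng(0)
for n in (10_000, 1_000_000):
    normal_avg = np.abs(rng.standard_normal(n)).mean()    # settles near ~0.80
    cauchy_avg = np.abs(rng.standard_cauchy(n)).mean()    # typically large, unstable
    print(n, round(normal_avg, 3), round(cauchy_avg, 3))
```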

5
Q

What does 1(a, b)(x) equal?

A

1(a, b)(x) is the indicator function of the interval (a, b): it equals 1 when x is between a and b, and 0 otherwise.
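A one-function Python sketch of the indicator, with arbitrary example calls:

```python
# Indicator of the open interval (a, b): 1 inside, 0 outside.
def indicator(a, b, x):
    return 1 if a < x < b else 0

print(indicator(0, 5, 3))   # 1, since 0 < 3 < 5
print(indicator(0, 5, 7))   # 0, since 7 lies outside (0, 5)
```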

6
Q

E[aX + b] =

A

aE[X] + b

7
Q

Expected value of a product of independent random variables is:

A

The product of the expected values: if X and Y are independent, E[XY] = E[X] * E[Y]
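A Monte Carlo check of E[XY] = E[X] * E[Y] for independently drawn samples; the two distributions are arbitrary choices:

```python
# E[XY] should match E[X] * E[Y] when X and Y are independent.
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(0, 1, size=1_000_000)   # E[X] = 0.5
y = rng.normal(2, 1, size=1_000_000)    # E[Y] = 2, drawn independently of x

print(np.mean(x * y))                   # ~1.0
print(np.mean(x) * np.mean(y))          # ~1.0 as well
```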

8
Q

Variance formula, given that X is a real-valued random variable with mean = E(X)

A

Var(X) = E[(X - mean)^2] (the general definition; for a discrete X this is sum over i of pi * (xi - mean)^2)

Var(X) = integral from negative infinity to infinity of (x - mean)^2 * fX(x) dx, if continuous
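A minimal discrete example of the definition, with arbitrary values and probabilities:

```python
# Var(X) = E[(X - mean)^2], written out for a small discrete distribution.
values = [1, 2, 3]
probs  = [0.2, 0.5, 0.3]

mean = sum(p * x for p, x in zip(probs, values))
var  = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))
print(mean, var)    # 2.1, 0.49
```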

9
Q

Using the variance translation theorem, if X satisfies E(X) = 0 (X is centered), what is the variance?

A

Var(X) = E(X^2), because E(X) = mean = 0

10
Q

Variance Translation Theorem

A

Var(X) = E(X^2) - mean^2 = E(X^2) - (E(X))^2
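A quick numerical confirmation that the two forms agree, on an arbitrary sample:

```python
# Var(X) = E[(X - mean)^2] and Var(X) = E[X^2] - E[X]^2 give the same number.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(3, 2, size=1_000_000)        # arbitrary example: mean 3, std 2

var_definition  = np.mean((x - x.mean()) ** 2)
var_translation = np.mean(x ** 2) - x.mean() ** 2
print(var_definition, var_translation)      # both ~4.0 and equal up to rounding
```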

11
Q

If X is a real-valued random variable, then Var(aX + b) =

A

(a^2) * Var(X)
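A sample-based check of Var(aX + b) = a^2 * Var(X); the distribution and the constants a, b are arbitrary:

```python
# Shifting by b does not change the variance; scaling by a multiplies it by a^2.
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.5, size=1_000_000)   # Var(X) = 1.5^2 = 2.25
a, b = 3.0, 10.0

print(np.var(a * x + b))       # ~ 9 * 2.25 = 20.25
print(a ** 2 * np.var(x))      # same value on the same sample
```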

12
Q

Covariance Formula

A

Cov(X, Y) = E[(X - mean_X)(Y - mean_Y)]

13
Q

Correlation formula (between two random variables)

A

rho(X, Y) = Cov(X, Y) / (std_X * std_Y)

rho(X, Y) = E[(X - mean_X)(Y - mean_Y)] / (std_X * std_Y)
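A NumPy sketch computing both formulas by hand and comparing them with the library's built-ins; Y is constructed to be correlated with X on purpose:

```python
# Covariance and correlation from the formulas, checked against NumPy.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0, 1, size=100_000)
y = x + rng.normal(0, 1, size=100_000)              # correlated with x by construction

cov  = np.mean((x - x.mean()) * (y - y.mean()))     # Cov(X, Y) = E[(X - mean_X)(Y - mean_Y)]
corr = cov / (x.std() * y.std())                    # rho(X, Y) = Cov / (std_X * std_Y)

print(cov,  np.cov(x, y, bias=True)[0, 1])          # both ~1.0
print(corr, np.corrcoef(x, y)[0, 1])                # both ~0.71 = 1/sqrt(2)
```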

14
Q

What happens to the correlation if Cov(X, Y) = 0?

A

If Cov(X, Y) = 0, then rho(X, Y) = 0, which means the correlation is 0. Therefore X and Y are uncorrelated.

15
Q

What does independence mean for correlation?

A

Independence implies uncorrelated (but remember: uncorrelated does not imply independent).
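The classic counterexample, sketched in Python: Y = X^2 is fully determined by X (so they are dependent), yet their covariance is 0 when X is symmetric around 0:

```python
# Uncorrelated does not imply independent: Y = X^2 depends on X, but Cov(X, Y) ~ 0.
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, size=1_000_000)   # symmetric around 0
y = x ** 2                               # a deterministic function of x

print(np.cov(x, y)[0, 1])                # ~0, even though X and Y are dependent
```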

16
Q

Var(X) + 2Cov(X, Y) + Var(Y) =

A

Var(X + Y)

17
Q

Var(X) - 2Cov(X, Y) + Var(Y) =

A

Var(X - Y)
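A sample-based check of both identities (this card and the previous one); X and Y are made deliberately correlated so the covariance term matters:

```python
# Var(X + Y) = Var(X) + 2Cov(X, Y) + Var(Y); Var(X - Y) flips the sign of the middle term.
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0, 1, size=200_000)
y = 0.5 * x + rng.normal(0, 1, size=200_000)            # correlated with x

cov = np.cov(x, y, bias=True)[0, 1]
print(np.var(x + y), np.var(x) + 2 * cov + np.var(y))   # equal up to rounding
print(np.var(x - y), np.var(x) - 2 * cov + np.var(y))   # equal up to rounding
```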

18
Q

For jointly Gaussian random variables, when are X1, X2, ..., Xd independent?

A

When C (the covariance matrix) is a diagonal matrix (zeros everywhere except the diagonal).
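A sampling sketch with an arbitrary diagonal covariance matrix C: the estimated covariance of the samples has near-zero off-diagonal entries, which for a jointly Gaussian vector means the components are independent:

```python
# Jointly Gaussian components are independent exactly when C is diagonal.
import numpy as np

rng = np.random.default_rng(6)
mean = np.zeros(3)
C = np.diag([1.0, 4.0, 0.25])              # zero covariance between components

samples = rng.multivariate_normal(mean, C, size=100_000)
print(np.round(np.cov(samples.T), 2))      # ~C, with off-diagonal entries near 0
```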