Conditional expectation Flashcards

1
Q

Loss function
▪ Definition

A

Consider two random variables, X and Y.
Assume:
▪ X is Rⁿ-valued
▪ Y is real valued

The idea is that X is observable and Y has to be “predicted” on the basis of X

We consider the following
“Best prediction problem”
find a function
φ: Rⁿ → R
such that the “loss function” is minimized:

L(φ) = E[(Y − φ(X))²] → the least-squares
loss function
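
A minimal numerical sketch (not part of the card; X, Y and their joint distribution below are a made-up example) of estimating the loss L(φ) = E[(Y − φ(X))²] by Monte Carlo and comparing two candidate predictors:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X ~ N(0, 1), Y = 2X + standard normal noise
n = 100_000
X = rng.normal(size=n)
Y = 2 * X + rng.normal(size=n)

def loss(phi, X, Y):
    # Monte Carlo estimate of L(phi) = E[(Y - phi(X))^2]
    return np.mean((Y - phi(X)) ** 2)

print(loss(lambda x: 2 * x, X, Y))  # about 1: only the noise variance remains
print(loss(lambda x: 0 * x, X, Y))  # about 5: ignoring X costs Var(2X) + 1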

2
Q

Conditional density
▪ Definition

A

We assume (X,Y) is a discrete random vector, with joint density p_{X,Y}

p_{Y|X}(y|x) = P(Y=y | X=x)
p_{Y|X}(y|x) = P(X=x, Y=y) / P(X=x)
p_{Y|X}(y|x) = p_{X,Y}(x,y) / p_X(x)
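
A small sketch (the joint pmf is a made-up example, not from the card) of recovering the conditional density by dividing the joint pmf by the marginal:

import numpy as np

# Hypothetical joint pmf p_{X,Y}(x,y): rows indexed by x in {0,1}, columns by y in {0,1,2}
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])

p_x = p_xy.sum(axis=1)             # marginal p_X(x)
p_y_given_x = p_xy / p_x[:, None]  # p_{Y|X}(y|x) = p_{X,Y}(x,y) / p_X(x)

print(p_y_given_x)                 # each row is a pmf in y
print(p_y_given_x.sum(axis=1))     # rows sum to 1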

3
Q

Conditional expectation
▪ Definition

A

We define the conditional expectation of Y given X=x as follows

E(Y | X=x) = Σ_y y · p_{Y|X}(y|x)

More generally

E(g(X,Y) | X=x) = Σ_y g(x,y) · p_{Y|X}(y|x)
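
Continuing the made-up joint pmf from the previous sketch, E(Y|X=x) is just the y-weighted sum of each conditional row:

import numpy as np

p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])
y_vals = np.array([0, 1, 2])

p_x = p_xy.sum(axis=1)
p_y_given_x = p_xy / p_x[:, None]

# E(Y | X = x) = sum_y y * p_{Y|X}(y|x), one number per value of x
cond_exp = p_y_given_x @ y_vals
print(cond_exp)                    # E(Y|X=0), E(Y|X=1)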

4
Q

Conditional expectation
▪ Properties

A

1) E[aY+bZ|X=x] = aE(Y|X=x) + bE(Z|X=x)
2) E[f(X)|X=x] = f(x)
3) E[f(X)Y|X=x] = f(x) · E[Y|X=x]
4) E[ E(Y|X) ] = E(Y)
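
A quick numerical check of property 4 (the tower property) on the same made-up pmf:

import numpy as np

p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])
y_vals = np.array([0, 1, 2])

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)
cond_exp = (p_xy / p_x[:, None]) @ y_vals   # E(Y | X = x) for each x

lhs = np.sum(p_x * cond_exp)   # E[ E(Y|X) ] = sum_x p_X(x) E(Y|X=x)
rhs = np.sum(p_y * y_vals)     # E(Y)
print(lhs, rhs)                # both equal 1.05 here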

5
Q

Conditional expectation
▪ Theorem

A

The function φ(x) = E(Y|X=x) solves the best prediction problem

In other words, for any
g: Rⁿ → R

L(g) ⩾ L(φ)
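
A sketch (same made-up pmf as above) comparing the exact loss of φ(x) = E(Y|X=x) against another predictor; the conditional mean always comes out no worse:

import numpy as np

p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])
y_vals = np.array([0, 1, 2])

p_x = p_xy.sum(axis=1)
phi = (p_xy / p_x[:, None]) @ y_vals       # phi(x) = E(Y | X = x), one value per x

def loss(g_vals):
    # Exact L(g) = sum_{x,y} (y - g(x))^2 * p_{X,Y}(x,y); g_vals holds g(x) for each x
    return np.sum(p_xy * (y_vals[None, :] - g_vals[:, None]) ** 2)

print(loss(phi))                           # loss of the conditional mean
print(loss(np.full(2, 0.5)))               # e.g. the constant predictor 0.5: always >= the line above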
