Lecture 1 Flashcards

1
Q

What are the equations for the expectation of X in the discrete and continuous cases?

A
Discrete: E[X] = Σ_x x · P(X = x)
Continuous: E[X] = ∫ x · f(x) dx, where f is the density of X
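
A minimal numerical sketch (assuming numpy is available; the fair-die and standard-normal distributions are illustrative choices, not from the lecture):

    import numpy as np

    # discrete: E[X] = sum over x of x * P(X = x), here a fair six-sided die
    x = np.arange(1, 7)
    p = np.full(6, 1 / 6)
    print((x * p).sum())                          # 3.5

    # continuous: E[X] = integral of x * f(x) dx, here X ~ N(0, 1),
    # approximated with a simple Riemann sum on a grid
    g = np.linspace(-8.0, 8.0, 100_001)
    f = np.exp(-g**2 / 2) / np.sqrt(2 * np.pi)    # standard normal density
    print(np.sum(g * f) * (g[1] - g[0]))          # ~ 0
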
2
Q

For a single draw of a random variable Y, what is the best constant predictor c in the sense of minimising the MSE?

A
The unconditional mean, c = E[Y] = μ_Y: it minimises MSE = E[(Y − c)²].
3
Q

What is MSE?

A

Mean Squared Error: MSE = E[(Y − c)²]
1. Y is our random variable.
2. We subtract the constant predictor c from Y, whatever value Y takes.
3. We square the result.
4. We take the expectation: a probability-weighted sum (or integral) over all scenarios.

μ_Y = E[Y] is an unbiased predictor of Y, and its
MSE is E[(Y − μ_Y)²] = Var[Y].
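
A minimal sketch (numpy assumed; the distribution of Y is an illustrative choice) showing numerically that the MSE is smallest at c = μ_Y:

    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(loc=2.0, scale=1.5, size=100_000)  # draws of Y

    def mse(c):
        return np.mean((y - c) ** 2)   # sample analogue of E[(Y - c)^2]

    # MSE at the sample mean vs. two other constants
    print(mse(y.mean()), mse(0.0), mse(3.0))
    # mse(y.mean()) is the smallest and approximately equals Var[Y] = 2.25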

4
Q

What is a conditional moment?

A

A moment of Y computed from the conditional distribution of Y given X (derived from the joint probability distribution of the two random variables X and Y), e.g., the conditional mean E[Y | X] or the conditional variance Var[Y | X]

5
Q

Linear Regression

A

Assumes that the regression function, or Conditional Expectation Function (CEF), is linear: E[Y | X] = β0 + β1 · X

6
Q

Linear Regression: what happens if Y and X are independent?

A

If Y and X are independent, then E[Y | X] = E[Y]: the CEF is constant, so the slope is β1 = 0 and the intercept is β0 = E[Y].

7
Q

Linear Regression: what if Y and X are jointly normally distributed?

A

The linear model holds exactly, with β1 = Cov[X, Y] / Var[X] and β0 = E[Y] − β1 · E[X].

8
Q

Linear Regression: what if the true CEF is not linear?

A

The linear regression is then called a (linear) projection: the best linear approximation to the CEF, i.e., the linear function that minimises the MSE.

9
Q

What is the best predictor h(X) of Y, in the sense of minimising the MSE, for a single draw of a pair of random variables (X, Y)?

A

The conditional mean, h(X) = E[Y | X], which minimises
MSE = E[(Y − h(X))²].

μ_{Y|X} = E[Y | X] is an unbiased predictor, i.e. E[Y − E[Y | X]] = 0.
Its MSE is E[(Y − E[Y | X])²] = E[Var[Y | X]].
The linear projection of Y on X is the best predictor among the class of all linear functions.
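
A small numerical sketch (numpy assumed; the quadratic CEF Y = X² + u is an illustrative choice, not from the lecture) comparing the conditional mean with the best linear predictor:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(size=100_000)
    y = x**2 + rng.normal(size=x.size)    # true CEF: E[Y|X] = X^2

    # linear projection of Y on X (best linear predictor)
    b1 = np.cov(x, y)[0, 1] / np.var(x)
    b0 = y.mean() - b1 * x.mean()

    mse_cef = np.mean((y - x**2) ** 2)            # predictor E[Y|X]
    mse_lin = np.mean((y - (b0 + b1 * x)) ** 2)   # linear projection
    print(mse_cef, mse_lin)   # the CEF attains the lower MSE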

10
Q

Types of Data

A

Cross-section: one or more variables observed at a single point in time.
Time series: data collected over a period of time.

11
Q

Random Sampling

A

Assume a random sample can be obtained from the underlying population:
- each member has the same chance of being picked
- a random sample of size T can be described as a collection of iid (independent and identically distributed) random variables

12
Q

Modelling a Time Series

A

We often want to capture a temporal relationship between observations.
- An (infinite) stochastic process is a sequence of random variables indexed by time.

13
Q

Goal of modeling a time series

A
  1. Describe the probabilistic behaviour of the underlying stochastic process that is believed to have generated the data.
  2. Use the data to estimate the moments of the process.
14
Q

Modelling a time series (2): how do we estimate the moments of the process?

A

Make assumptions regarding the joint behaviour of the random variables:
- preserve the identical-distribution assumption from cross-sectional analysis
- allow for dependence between variables close together in time

15
Q

White noise variables are independent standard normally distributed, with:

A

E[Yt] = 0 for all t
Var[Yt] = 1 for all t
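
A quick sketch (numpy assumed; seed and sample size are arbitrary) drawing Gaussian white noise and checking the unconditional moments:

    import numpy as np

    rng = np.random.default_rng(42)
    y = rng.standard_normal(10_000)   # iid N(0, 1) white noise

    print(y.mean())   # ~ 0 = E[Yt]
    print(y.var())    # ~ 1 = Var[Yt]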

16
Q

White noise: what is the conditional expectation of Yt given the adjacent observation Yt−1?

A

E[Yt | Yt−1] = E[Yt] = 0
It is zero because, by independence, the expectation of Yt given Yt−1 equals the unconditional expectation E[Yt]; Yt is normally distributed with mean zero.
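
A quick numerical check (numpy assumed) that the past carries no information for white noise: the slope from regressing Yt on Yt−1 is approximately zero:

    import numpy as np

    rng = np.random.default_rng(3)
    y = rng.standard_normal(100_000)   # iid N(0, 1) white noise

    # slope of Yt on Yt-1 is Cov/Var, ~ 0 under independence
    slope = np.cov(y[1:], y[:-1])[0, 1] / np.var(y[:-1])
    print(slope)   # ~ 0, consistent with E[Yt | Yt-1] = E[Yt] = 0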

17
Q

Autoregression: to introduce serial correlation, let's construct the following:
1. draw T = 100 observations from ut ∼ N(0, 1)
2. set Y0 = u0
3. generate the remaining Yt's as follows:
Yt = 0.9 · Yt−1 + σ · ut   (t = 1, 2, ..., 99)
with σ = √(1 − 0.9²)

A

Because Y1 is a linear combination of two independent N(0, 1) variables, with variance σ_Y² = 0.9² + σ² = 1 (and the same argument applies at every t), it follows that

Yt ∼ N(0, 1) ∀t

However, they are not independent:

C[Yt, Yt+1] = C[Yt, 0.9 · Yt + σ · ut+1]
            = 0.9 · C[Yt, Yt] + σ · C[Yt, ut+1]
            = 0.9 · 1 + σ · 0        (since Var[Yt] = 1 and C[Yt, ut+1] = 0)
            = 0.9

Can also do this for any 2 observations which are 2 periods apart:
C[Yt, Yt+2] = 0.9²
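
A runnable sketch of this simulation (numpy assumed; T is increased from 100 so the sample moments are stable), checking the implied autocovariances:

    import numpy as np

    rng = np.random.default_rng(7)
    T = 100_000
    rho = 0.9
    sigma = np.sqrt(1 - rho**2)      # keeps Var[Yt] = 1

    u = rng.standard_normal(T)
    y = np.empty(T)
    y[0] = u[0]                      # Y0 = u0
    for t in range(1, T):
        y[t] = rho * y[t - 1] + sigma * u[t]   # Yt = 0.9 * Yt-1 + sigma * ut

    print(np.var(y))                     # ~ 1
    print(np.cov(y[:-1], y[1:])[0, 1])   # ~ 0.9  = C[Yt, Yt+1]
    print(np.cov(y[:-2], y[2:])[0, 1])   # ~ 0.81 = C[Yt, Yt+2]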

18
Q

What do we call the covariance of a series with itself, even though the observations are at different points in time?

A

Autocovariance

19
Q

Stationarity conditions

A

▶ E(Yt) is constant for all t

▶ 0 < Var(Yt) < ∞:
Var(Yt) is constant and finite for all t

▶ 0 ⩽ |C(Yt, Yt±s)| < ∞:
C(Yt, Yt±s) is constant for all t and any s, depending only on the lag:
C(Yt, Yt±s) = f(s, Θ)

The three moments don't change over time.

20
Q

What does statistical independence imply?

A

Mean independence, and hence uncorrelatedness:
statistical independence ⇒ mean independence ⇒ uncorrelatedness

21
Q

A time series process is stationary if:

A
  1. the unconditional mean E[yt] is not a function of time (it is constant)
  2. the unconditional variance Var(yt) is constant
  3. the autocovariance C(yt, yt+s) is not a function of time and depends only on the lag s; it is constant for any s