Lecture 2 Flashcards
Linear regression with time series data
model assumptions
Method of Ordinary Least Squares
Method of moments
Properties of OLS in finite and in infinite samples
Model assumptions
The model can be written as
* yt = B0 + B1 xt + et
If we assume E[et|xt] = 0, then
* E[yt|xt] = B0 + B1 xt + E[et|xt]
* = B0 + B1 xt
holds because the conditional mean of et is 0, so its unconditional mean is also 0 (law of iterated expectations)
holds because et is pure random noise
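A minimal simulation sketch of the points above (the parameter values B0 = 2.0, B1 = 0.5 are my own illustrative choices, not from the lecture): when et is drawn independently of xt, its unconditional mean is ≈ 0 and E[yt|xt] is the linear function B0 + B1 xt.

```python
import numpy as np

# Illustrative simulation of yt = B0 + B1*xt + et with E[et|xt] = 0.
# B0, B1 and the noise scale are assumed values for the example.
rng = np.random.default_rng(0)
B0, B1 = 2.0, 0.5
T = 100_000
x = rng.normal(size=T)
e = rng.normal(size=T)        # pure noise, independent of x
y = B0 + B1 * x + e

# Unconditional mean of et: close to 0 (law of iterated expectations)
print(e.mean())

# E[y|x] at x ~ 0 should equal B0: average y over observations with x near 0
print(y[np.abs(x) < 0.05].mean())   # close to B0 = 2.0
```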
Homoscedasticity
The variance of et is constant for all values of xt: Var(et|xt) = σ^2
Assumptions on the relation between xt and et
crucial for interpreting the model and deriving the properties of any estimator of B0 and B1
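A sketch of the OLS estimators in their method-of-moments form, B1^ = sample cov(x, y) / sample var(x) and B0^ = ȳ − B1^ x̄ (the true values 2.0 and 0.5 below are assumed, chosen only for the demonstration):

```python
import numpy as np

# OLS / method-of-moments estimates for yt = B0 + B1*xt + et.
rng = np.random.default_rng(1)
T = 50_000
x = rng.normal(size=T)
y = 2.0 + 0.5 * x + rng.normal(size=T)   # assumed true B0 = 2.0, B1 = 0.5

b1_hat = np.cov(x, y)[0, 1] / np.var(x, ddof=1)   # sample cov / sample var
b0_hat = y.mean() - b1_hat * x.mean()             # intercept from the means
print(b0_hat, b1_hat)   # both close to the assumed true values
```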
Overview of model assumptions - Normal regression w/fixed regressors
E [y|x] = XB
X = fixed (not random)
Rank (x) = full
e = fully independent, homoscedastic, no serial correlation, i.e. Var(e) = σ^2 I, et ~ N(0, σ^2)
B^ols ~ Normal, unbiased (BUE)
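A Monte Carlo sketch of the unbiasedness claim under these assumptions (the coefficients, sample size, and noise scale are assumed for the illustration): with a fixed design matrix X held constant across replications and normal errors, the OLS estimates average out to the true coefficients.

```python
import numpy as np

# Monte Carlo check: fixed X, et ~ N(0, σ^2) => OLS is unbiased,
# so the average estimate over many samples matches the true beta.
rng = np.random.default_rng(2)
T, R = 30, 5_000                  # small sample, many replications
beta = np.array([1.0, -0.5])      # assumed true (B0, B1)
x = np.linspace(-1, 1, T)         # fixed (non-random) regressor
X = np.column_stack([np.ones(T), x])

estimates = np.empty((R, 2))
for r in range(R):
    y = X @ beta + rng.normal(scale=0.8, size=T)
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

print(estimates.mean(axis=0))   # close to [1.0, -0.5]
```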
Overview of model assumptions - Normal regression w/strictly exog. regressors
E [y|x] = XB
X = stochastic
Rank (x) = full
e = mean independence with E[et|X] = 0 ∀t, homoscedastic, no serial correlation: Var(e|X) = σ^2 I, et|X ~ N(0, σ^2)
B^ols|X ~ Normal, unbiased
Overview of model assumptions - regression w/weakly exog. regressors
E [y|x] = XB
X = stochastic
Rank (x) = full
e = contemporaneous mean independence: E[et|xt] = 0 for each t (et may be correlated with future regressors), homoscedastic, no serial correlation
B^ols consistent and asymptotically normal (generally biased in finite samples)
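A sketch of why weak exogeneity matters (my own example, not from the slides): in an AR(1) model yt = b·y(t-1) + et, the regressor y(t-1) is weakly but not strictly exogenous since it depends on past errors. OLS is then biased in small samples but consistent as T grows. The coefficient b = 0.9 is an assumed value.

```python
import numpy as np

# AR(1) example of a weakly exogenous regressor: xt = y_{t-1}.
rng = np.random.default_rng(3)
b = 0.9   # assumed true AR coefficient

def ols_ar1(T, reps=2_000):
    """Average OLS estimate of b over many simulated AR(1) paths."""
    est = []
    for _ in range(reps):
        e = rng.normal(size=T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = b * y[t - 1] + e[t]
        x, yy = y[:-1], y[1:]
        est.append((x @ yy) / (x @ x))   # OLS slope without intercept
    return np.mean(est)

small_T = ols_ar1(T=25)
large_T = ols_ar1(T=2_000, reps=200)
print(small_T)   # noticeably below 0.9: finite-sample bias
print(large_T)   # close to 0.9: consistency
```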