Simple regression model Flashcards

1
Q

SLR.1

A

The population model is linear in parameters: y = β0 + β1·x + u

2
Q

SLR.2

A

Random sampling: {(xi, yi): i = 1, …, n} is a random sample from the population model

3
Q

SLR.3

A

Sample variation in x: the xi are not all the same value, i.e. Σ(xi − x̄)² > 0

4
Q

SLR.4

A

Zero conditional mean: E(u|x) = 0, i.e. u is mean independent of x

5
Q

SLR.5

A

Homoskedasticity: Var(u|x) = σ², the error variance is constant across x

6
Q

SST

A

Total sum of squares: Σ(yi − ȳ)², the sum of squared differences of the observations from their mean

7
Q

SSE

A

Explained sum of squares: Σ(ŷi − ȳ)², the sum of squared differences of the fitted values from the mean

8
Q

SSR

A

Sum of squared residuals: Σ(yi − ŷi)², the sum of squared differences between the observed and fitted values

9
Q

SST =

A

SSE + SSR (explained variation plus residual variation)

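The decomposition on this card is easy to verify numerically. A minimal sketch using made-up sample values (the data and variable names are illustrative, not from the deck):

```python
# Numeric check of SST = SSE + SSR on an illustrative sample
# (data values are made up; any sample with variation in x works).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS fit via the closed-form formulas
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total variation
sse = np.sum((y_hat - y.mean()) ** 2)  # explained variation
ssr = np.sum((y - y_hat) ** 2)         # residual variation

print(sst, sse + ssr)  # the two totals agree
```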
10
Q

Error

A

Deviation of the observed value from the value implied by the true population model

Hence the error term is unobservable

11
Q

Residual

A

Deviation between the observed value and the fitted (estimated) value

Hence the residual is observable

12
Q

Perfect collinearity

A

An explanatory variable is an exact linear function of another explanatory variable, so it carries no independent variation

13
Q

ûi =

(written out in full)

A

yi − ŷi = yi − β̂0 − β̂1·xi

14
Q

F.O.C. for β̂0

A

−2·Σ(yi − β̂0 − β̂1·xi) = 0

15
Q

F.O.C. for β̂1

A

−2·Σxi(yi − β̂0 − β̂1·xi) = 0

16
Q

What is the OLS estimator trying to do?

A

Minimise the SSR: differentiate it with respect to β̂0 and β̂1 and set the resulting first order conditions to zero

17
Q

Finding β̂0

A

Divide both sides by −2:
Σyi − n·β̂0 − β̂1·Σxi = 0
Divide everything by n:
ȳ − β̂0 − β̂1·x̄ = 0
β̂0 = ȳ − β̂1·x̄
(bar = sample mean)
18
Q

Finding β̂1

A

Divide both sides by −2
We know β̂0 = ȳ − β̂1·x̄, so plug that in:
Σxi(yi − ȳ + β̂1·x̄ − β̂1·xi) = 0
Σxi(yi − ȳ) − β̂1·Σxi(xi − x̄) = 0
Σxi(yi − ȳ) = β̂1·Σxi(xi − x̄)
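The two estimators derived above can be computed directly from their closed forms. A minimal sketch on made-up data, using numpy's polyfit only as an independent cross-check:

```python
# Compute beta0_hat and beta1_hat from the derived formulas
# (illustrative data, not from the deck).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.1, 3.8, 5.1])

# beta1_hat = sum((xi - xbar)(yi - ybar)) / sum((xi - xbar)^2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# beta0_hat = ybar - beta1_hat * xbar
b0 = y.mean() - b1 * x.mean()

# Cross-check against numpy's least-squares line fit
b1_chk, b0_chk = np.polyfit(x, y, 1)
print(b0, b1)
```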

19
Q

Σxi(yi − ȳ) =

A

Σ(xi − x̄)(yi − ȳ)

20
Q

Average value of x

A

(1/n)·Σxi = x̄

therefore Σxi = n·x̄

21
Q

Σ(xi − x̄)(yi − ȳ) =

useful trick

A

= Σxi(yi − ȳ) − x̄·Σ(yi − ȳ)
= Σxi(yi − ȳ) − x̄·n·ȳ + x̄·n·ȳ
= Σxi(yi − ȳ)

22
Q

Σ(xi − x̄)² =

A

= Σ(xi − x̄)(xi − x̄)
= Σxi(xi − x̄) − x̄·Σ(xi − x̄)
= Σxi(xi − x̄) − x̄·n·x̄ + x̄·n·x̄
= Σxi(xi − x̄)
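Both demeaning tricks above are easy to confirm numerically. A quick sketch with arbitrary made-up numbers:

```python
# Check: sum((xi - xbar)(yi - ybar)) == sum(xi * (yi - ybar))
# and    sum((xi - xbar)^2)          == sum(xi * (xi - xbar)).
import numpy as np

x = np.array([2.0, 4.0, 5.0, 7.0])  # illustrative values
y = np.array([1.0, 3.0, 4.0, 6.0])

lhs1 = np.sum((x - x.mean()) * (y - y.mean()))
rhs1 = np.sum(x * (y - y.mean()))

lhs2 = np.sum((x - x.mean()) ** 2)
rhs2 = np.sum(x * (x - x.mean()))

print(lhs1, rhs1, lhs2, rhs2)
```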

23
Q

β̂1

A

Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²

24
Q

Only assumption needed to calculate the OLS estimator β̂1

A

Σ(xi − x̄)² > 0, i.e. some sample variation in x

25
Q

PRF

A

Population regression function - the true relationship between x and y in the population
26
Q

SRF

A

Sample regression function: ŷ = β̂0 + β̂1·x
27
Q

3 mathematical properties that hold in any sample of data (the first two follow from the two F.O.C.s)

A

Σûi = 0
Σxi·ûi = 0
(x̄, ȳ) is always on the regression line
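These three properties hold by construction for any OLS fit. A minimal check on made-up data:

```python
# Verify: residuals sum to zero, residuals are orthogonal to x,
# and (xbar, ybar) lies on the fitted line (illustrative data).
import numpy as np

x = np.array([1.0, 3.0, 4.0, 6.0, 8.0])
y = np.array([2.0, 4.5, 5.0, 8.0, 9.5])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)  # residuals

print(u_hat.sum())                      # ~0
print((x * u_hat).sum())                # ~0
print(y.mean() - (b0 + b1 * x.mean()))  # ~0
```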
28
Q

Unbiasedness (holds under SLR.1-SLR.4)

A

E(β̂1) = β1 and E(β̂0) = β0
29
Q

Var(x)

A

1/(n−1) · Σ(xi − x̄)²
30
Q

ZCM assumptions in terms of Y: E(Y|X) = ? and Var(Y|X) = ?

A

E(Y|X) = β0 + β1·x and Var(Y|X) = σ²
31
Q

y = β0 + β1·x + u (Level - Level)

A

If we change x by one unit, we expect y to change by β1 units

32
Q

ln(y) = β0 + β1·x + u (Log - Level)

A

If we change x by one unit, we expect y to change by roughly 100·β1 %

33
Q

y = β0 + β1·ln(x) + u (Level - Log)

A

If we increase x by 1%, we expect y to increase by roughly β1/100 units

34
Q

ln(y) = β0 + β1·ln(x) + u (Log - Log)

A

If we change x by 1%, we expect y to change by roughly β1 % (β1 is an elasticity)
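The interpretations above can be sanity-checked by plugging numbers into the model. A sketch of the log-log case with arbitrary made-up parameters (β0 = 1.0, β1 = 0.8):

```python
# Log-log model: ln(y) = b0 + b1*ln(x), so a 1% increase in x
# should change y by roughly b1 percent (parameters are illustrative).
import numpy as np

b0, b1 = 1.0, 0.8
x = 50.0

y = np.exp(b0 + b1 * np.log(x))
y_new = np.exp(b0 + b1 * np.log(x * 1.01))  # raise x by 1%

pct_change = 100 * (y_new / y - 1)
print(pct_change)  # close to b1 = 0.8 percent
```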