2. Multiple Variable Linear Regression Model Flashcards

1
Q

How often is the two variable model used?

A

Rarely, because important explanatory factors are often left in the error term, which makes the zero conditional mean assumption invalid

2
Q

What happens if one out of 50 explanatory variables is correlated with u?

A

The OLS estimators will be biased since the zero conditional mean won’t hold

3
Q

What are the normal equations?

A

They are the first-order conditions we get when we minimise the sum of squared residuals (SSR). With k regressors plus an intercept, there are k+1 equations
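As a quick illustration (hypothetical simulated data, using numpy only), the k+1 normal equations can be written in matrix form as X'X b = X'y and solved directly; the resulting residuals are orthogonal to every column of X, which is exactly what the first-order conditions say:

```python
import numpy as np

# Hypothetical data: n = 5 observations, k = 2 explanatory variables.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=(5, 2))])  # intercept, x1, x2
y = rng.normal(size=5)

# The k+1 normal equations in matrix form: X'X b = X'y.
b = np.linalg.solve(X.T @ X, X.T @ y)

# The first-order conditions hold: X'(y - Xb) = 0 for every column of X.
residuals = y - X @ b
print(np.allclose(X.T @ residuals, 0))  # True
```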

4
Q

What happens to our estimators if there is a strong correlation between x1 and x2?

A

Our estimators become less precise (their variance increases): the data contain little independent variation in each regressor, so it is hard to tell which of x1 and x2 is driving the changes in y

5
Q

Partialing out

A

We partial out from x1 the variation that can be explained by the other regressors; the coefficient on x1 is then estimated using only the variation in x1 that is not shared with x2.
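A minimal numpy sketch of partialing out (the Frisch-Waugh result), on simulated data: the slope on x1 from the full multiple regression equals the slope from regressing y on the residuals of x1 after x2 has been partialed out:

```python
import numpy as np

# Simulated data where x1 and x2 are correlated.
rng = np.random.default_rng(1)
n = 200
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Full multiple regression of y on (1, x1, x2).
X = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: partial the intercept and x2 out of x1.
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.lstsq(Z, x1, rcond=None)[0]
r1 = x1 - Z @ g                      # variation in x1 not shared with x2

# Step 2: regress y on those residuals; the slope matches the full model.
b1_partial = (r1 @ y) / (r1 @ r1)
print(np.isclose(b_full[1], b1_partial))  # True
```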

6
Q

Gauss-Markov assumptions

A
  1. Linear in parameters
  2. Random sampling
  3. No perfect collinearity
  4. Zero conditional mean
  5. Homoskedasticity
7
Q

What do we do if two variables are perfectly collinear?

A

Exclude one of them

8
Q

What happens to our estimates if MLR.4 (zero conditional mean) isn’t valid?

A

Estimation is still possible, but the estimators will be biased. This problem is hard to solve

9
Q

What is Rj^2?

A

The R-squared from regressing xj on all the other explanatory variables; it measures how much of the variation in xj those variables explain
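A sketch of how Rj^2 could be computed with numpy on simulated data: run the auxiliary regression of xj on the other explanatory variables (with an intercept) and take that regression's R-squared:

```python
import numpy as np

# Simulated data: x1 is partly explained by x2 and x3.
rng = np.random.default_rng(2)
n = 200
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
x1 = 0.8 * x2 + 0.3 * x3 + rng.normal(size=n)

# Auxiliary regression of x1 on the *other* regressors (plus intercept).
Z = np.column_stack([np.ones(n), x2, x3])
coef = np.linalg.lstsq(Z, x1, rcond=None)[0]
fitted = Z @ coef

# R_1^2 = 1 - SSR / SST of the auxiliary regression.
R2_1 = 1 - np.sum((x1 - fitted) ** 2) / np.sum((x1 - x1.mean()) ** 2)
print(0 <= R2_1 < 1)  # True; values close to 1 signal strong multicollinearity
```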

10
Q

What factors affect the variance of our estimator of Bj?

A
  • increased var(u) increases it
  • increased sample size decreases it
  • increased sample variation in xj decreases it
  • increased Rj^2 increases it
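All four effects can be read off the standard sampling-variance formula that holds under the Gauss-Markov assumptions:

```latex
\operatorname{Var}(\hat\beta_j) \;=\; \frac{\sigma^2}{\mathrm{SST}_j\,(1 - R_j^2)},
\qquad
\mathrm{SST}_j = \sum_{i=1}^{n} (x_{ij} - \bar{x}_j)^2
```

Here sigma^2 = var(u) sits in the numerator, while more observations or more variation in xj raise SSTj, and a higher Rj^2 shrinks the (1 - Rj^2) term; both of the latter sit in the denominator.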
11
Q

Multicollinearity

A

High (but not perfect) correlation among the explanatory variables. As Rj^2 increases, so does the variance of our estimators; this is the main downside of using MLR over SLR

12
Q

How does the problem of multicollinearity affect our model?

A
  • it doesn’t cause bias
  • it seriously inflates the variance of the affected estimators when the correlation is very high
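The inflation can be quantified with the standard variance inflation factor, VIF_j = 1/(1 - Rj^2), which says how many times larger Var(Bj-hat) is than it would be if xj were uncorrelated with the other regressors. A tiny illustration with hypothetical Rj^2 values:

```python
# Variance inflation factor: the multiplier on Var(beta_j hat) relative to
# the no-multicollinearity case (R_j^2 = 0).
def vif(r2_j):
    return 1.0 / (1.0 - r2_j)

print(vif(0.0))  # 1.0 (no inflation)
print(vif(0.9))  # ~10: variance inflated roughly tenfold
```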

13
Q

How can we help the problem of multicollinearity?

A
  • Increase the sample size
  • It is tempting to drop one of the correlated variables, but if that variable belongs in the model this causes omitted-variable bias

14
Q

When are OLS estimators BLUE?

A

When the Gauss-Markov assumptions hold

15
Q

Null hypothesis

A

H0: the default (maintained) hypothesis, which we test and try to find evidence against
