Topic 3: Multiple Regression Model Flashcards

1
Q

In an MLR model with k independent variables, the OLS estimates are chosen to minimize what?

A

the sum from i = 1 to n of [y_i - beta^0 - beta^1*xi1 - beta^2*xi2 - … - beta^k*xik]^2, i.e., the sum of squared residuals (SSR)
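
A minimal Stata sketch, using the built-in auto dataset, that the reported coefficients do minimize this sum: the residuals from regress reproduce e(rss), and any perturbed coefficients give a larger sum of squares.

    sysuse auto, clear
    regress price mpg weight
    predict uhat, resid                  // residuals at the OLS estimates
    gen uhat2 = uhat^2
    summarize uhat2
    display r(sum)                       // sum of squared residuals
    display e(rss)                       // same number, as reported by regress
    * perturb beta^mpg by 10%: the sum of squares can only get larger
    gen alt2 = (price - _b[_cons] - 1.1*_b[mpg]*mpg - _b[weight]*weight)^2
    summarize alt2
    display r(sum)                       // larger than e(rss)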

2
Q

In MLR, holding all other independent variables fixed, what is the change in y^ from a change in x1?

A

change in y^ = beta^1 * (change in x1). For example, if beta^1 = 0.5, a one-unit increase in x1 raises y^ by 0.5.

3
Q

What does it mean to say that an OLS estimator is unbiased?

A

It means that E(beta^j) = betaj: across repeated random samples, the expected value of the estimator equals the population parameter.
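
A small simulation sketch in Stata (the true values beta0 = 1 and beta1 = 2 are assumptions chosen for the illustration): across many random samples, the average of the OLS estimates comes out close to the true parameter.

    clear all
    set seed 1
    program define olsdraw, rclass
        clear
        set obs 100
        gen x = rnormal()
        gen y = 1 + 2*x + rnormal()      // true beta1 = 2
        regress y x
        return scalar b1 = _b[x]
    end
    simulate b1 = r(b1), reps(500) nodots: olsdraw
    summarize b1                         // mean of the estimates is close to 2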

4
Q

What is an important assumption for the unbiasedness of estimators?

A

The zero conditional mean assumption, E(u | x1, …, xk) = 0; it is what makes the expected value of the estimators equal to the parameters.
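
A Stata sketch of what goes wrong when the assumption fails (the data-generating numbers are assumptions for the illustration): if u contains something correlated with x, E(u|x) is not 0 and the slope estimate is biased.

    clear
    set seed 1
    set obs 10000
    gen x = rnormal()
    gen u = 0.5*x + rnormal()            // violates E(u|x) = 0
    gen y = 1 + 2*x + u                  // true slope is 2
    regress y x                          // slope comes out near 2.5: biased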

5
Q

What is the bias in a simple regression from leaving out an important variable x2 (with coefficient beta2)?

A

beta2 * [the sum of (xi1 - x1bar)xi2] / [the sum of (xi1 - x1bar)^2], i.e., beta2 times the slope (delta~1) from regressing x2 on x1
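
A Stata sketch of this formula in action (all coefficients here are assumptions chosen for the illustration): the short regression's slope is off by beta2 times the slope from regressing x2 on x1.

    clear
    set seed 1
    set obs 10000
    gen x1 = rnormal()
    gen x2 = 0.5*x1 + rnormal()          // corr(x1,x2) > 0
    gen y = 1 + 2*x1 + 3*x2 + rnormal()  // true beta1 = 2, beta2 = 3
    regress y x1 x2                      // long regression: slope on x1 near 2
    regress y x1                         // short regression: near 2 + 3*0.5 = 3.5
    regress x2 x1                        // the omitted-variable slope: near 0.5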

6
Q

When the correlation between x1 and x2 is greater than zero, and x2 is omitted from the estimated equation, when is the bias positive and when is it negative?

A

When corr(x1,x2) > 0 and beta2 > 0, the bias is positive (the expected value is greater than the parameter); when corr(x1,x2) > 0 and beta2 < 0, the bias is negative.

7
Q

When the correlation between x1 and x2 is less than zero, and x2 is omitted from the estimated equation, when is the bias positive and when is it negative?

A

When corr(x1,x2) < 0 and beta2 > 0, the bias is negative (the expected value is less than the parameter); when corr(x1,x2) < 0 and beta2 < 0, the bias is positive.

8
Q

How can you remember the sign of the bias in beta~1 when x2 is omitted?

A

When corr(x1,x2) and beta2 have the same sign, the bias is positive; when they have different signs, the bias is negative:

corr(x1,x2) > 0, beta2 > 0: positive bias
corr(x1,x2) > 0, beta2 < 0: negative bias
corr(x1,x2) < 0, beta2 > 0: negative bias
corr(x1,x2) < 0, beta2 < 0: positive bias

9
Q

Stata: How do you leave out certain observations from a regression?

A

reg y x1 x2 … if condition — the regression uses only the observations for which the if expression is true (e.g., reg y x1 x2 if var != 1 leaves out observations where var equals 1)
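
A concrete sketch using Stata's built-in auto dataset (the variable names come from that dataset, not from the card):

    sysuse auto, clear
    regress price mpg weight if foreign == 0    // leaves out foreign cars
    regress price mpg weight if price < 10000   // leaves out expensive cars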

10
Q

What is multicollinearity?

A

Correlation among the independent variables in an MLR model. An Rj^2 of 1 (from regressing xj on the other x's) means perfect collinearity, but there is no standard amount under 1 that is always too much.
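
One standard diagnostic (not named on the card) is the variance inflation factor, VIFj = 1/(1 - Rj^2), which Stata reports after a regression; a sketch with the built-in auto dataset:

    sysuse auto, clear
    regress price mpg weight length
    estat vif            // large values flag strong collinearity among the x's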

11
Q

How do you interpret the effect on y in a model where two variables are correlated (but not perfectly) with the variable of interest?

A

Combine (add) the coefficients of the two correlated variables in the regression; the sum gives the estimated overall effect of the variable of interest on y.

12
Q

Why is data from an experiment better than observational data when it comes to the MLR assumptions?

A

In an experiment, treatment is randomly assigned by design, so the zero conditional mean assumption holds.

13
Q

Would omitting a variable cause bias in your estimators?

A

Yes, in general: if the omitted variable affects y and is correlated with an included regressor, the estimators suffer from omitted variable bias.

14
Q

Would including an irrelevant (uncorrelated) variable cause bias in your estimators?

A

No.

15
Q

Even though including uncorrelated variables does not cause bias, why would you not want to include such variables?

A

Because they could have undesirable effects on the variances of the OLS estimators.

16
Q

Why are OLS estimators used?

A

Because, under MLR assumptions 1-5, they are the Best Linear Unbiased Estimators (BLUE).

17
Q

Why do we need multiple regression analysis?

A

Because it is difficult to draw a ceteris paribus conclusion about x affecting y from a simple regression; adding other explanatory variables explains more of the variation in y and keeps those factors out of the error term u.

18
Q

What is goodness of fit?

A

It is the proportion of the sample variation in y that is explained by the regression, R^2 = SSE/SST. It never decreases, and usually increases, as independent variables are added.
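
A quick Stata sketch of R-squared never falling when a regressor is added, using the built-in auto dataset:

    sysuse auto, clear
    regress price mpg
    display e(r2)                // R-squared with one regressor
    regress price mpg weight
    display e(r2)                // at least as large with two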

19
Q

What is the formula for the sampling variance of OLS estimators?

A

Var(beta^j) = sigma^2 / [SSTj * (1 - Rj^2)], where SSTj is the total sample variation in xj, and Rj^2 is the R-squared from regressing xj on all the other x's.
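
A sketch verifying each piece of the formula in Stata with the built-in auto dataset (mpg stands in for xj, weight for the other x's):

    sysuse auto, clear
    regress price mpg weight
    scalar sig2 = e(rmse)^2              // estimated sigma^2
    regress mpg weight                   // regress xj on the other x's
    scalar rj2 = e(r2)                   // Rj^2
    summarize mpg
    scalar sstj = (r(N)-1)*r(Var)        // SSTj
    display sqrt(sig2/(sstj*(1-rj2)))    // equals the std. err. on mpg above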

20
Q

How could you find SSTj?

A

SSTj = the sum over i of (xij - xbarj)^2
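
In Stata, SSTj can be computed from summarize, since the sample variance equals SSTj/(n-1); a sketch with the built-in auto dataset:

    sysuse auto, clear
    summarize mpg
    display (r(N)-1)*r(Var)              // SSTj for mpg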

21
Q

How could you find Rj^2?

A

Run a regression of xj on the other independent variables, then read the R-squared value from the Stata output (stored as e(r2)).

22
Q

What is the unbiased estimator of sigma^2 (the error variance)?

A

sigma^2 = [the sum of (u^i)^2] / (n - k - 1) = SSR / (n - k - 1), where the u^i are the OLS residuals
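
In Stata, this estimator is the squared root mean squared error; a sketch with the built-in auto dataset:

    sysuse auto, clear
    regress price mpg weight
    display e(rss)/e(df_r)               // SSR/(n-k-1)
    display e(rmse)^2                    // the same value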

23
Q

The variance of beta^j depends on what three factors?

A

The error variance (sigma^2), SSTj (the total sample variation in xj), and Rj^2 (from regressing xj on the other x's)

24
Q

What does a larger Rj^2 do to Var(beta^j)?

A

var(beta^j) increases

25
Q

A small sample size (and SSTj) can lead to what kind of sampling variance?

A

large sampling variances

26
Q

What is Var(beta^j)?

A

Var(beta^j) = sigma^2 / [SSTj * (1 - Rj^2)]

27
Q

What is the Gauss-Markov Theorem?

A

The theorem stating that, under MLR assumptions 1-5, the OLS estimators are BLUE: among all linear unbiased estimators, they have the smallest variance.