Econometrics Final Quizzes Flashcards

1
Q

The interpretation of the slope coefficient in the model Yi = β0 + β1 ln(Xi) + ui is as follows:

A

a 1% change in X is associated with a change in Y of 0.01 β1.
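To see why: ΔY ≈ β1 · Δln(X) ≈ β1 · (ΔX/X), so a 1% change in X (ΔX/X = 0.01) gives ΔY ≈ 0.01β1.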

2
Q

An example of a quadratic regression model is:

A

Yi = β0 + β1Xi + β2Xi² + ui.

3
Q

In the log-log model, the slope coefficient indicates:

A

the elasticity of Y with respect to X.
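Reason: in the log-log model ln(Yi) = β0 + β1 ln(Xi) + ui, β1 = Δln(Y)/Δln(X) ≈ %ΔY / %ΔX, which is the definition of the elasticity of Y with respect to X.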

4
Q

Misspecification of the functional form of the regression function:

A

Results in Omitted Variable Bias

5
Q

Which of the following are potential causes of model misspecification?

A
  • choice of variables
  • functional form
  • error structure
6
Q

In nonlinear models, the expected change in the dependent variable for a change in one of the explanatory variables is given by:
ΔY = f(X1 + ΔX1, X2, …, Xk) - f(X1, X2, …, Xk).

A

True
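As a minimal sketch in Python (the coefficient values are taken from the quadratic test-score example later in this deck; the function and variable names are illustrative, with the other regressors held fixed):

    # Quadratic fit: Y = b0 + b1*X1 + b2*X1**2, other regressors held constant
    def f(x1, b0=607.3, b1=3.85, b2=-0.0423):
        return b0 + b1 * x1 + b2 * x1 ** 2

    x1, dx1 = 10, 1
    delta_y = f(x1 + dx1) - f(x1)   # change in Y = f(X1 + dX1, ...) - f(X1, ...)
    print(round(delta_y, 2))        # 2.96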

7
Q

You estimate a model of student test scores on the student-teacher ratio using a sample of 420 California school districts. Using OLS, the estimated standard error on the slope coefficient is 0.51, but with heteroskedasticity-robust estimation (White's estimation) it is 0.48. Which estimation should you use, and what does this imply for the t-statistic?

A

Use White's (heteroskedasticity-robust) estimation; since the robust standard error (0.48) is smaller than the OLS standard error (0.51) here, the t-statistic will be larger in absolute value than with conventional OLS standard errors.

8
Q

Which of the following is a difference between the White test and the Breusch-Pagan test?

A

The Breusch-Pagan test assumes that we have knowledge of the variables appearing in the variance function of the heteroskedasticity, whereas the White test does not require specifying them.
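For illustration, a sketch of both tests using statsmodels on simulated data (the data-generating process and variable names here are my own assumptions, not part of the quiz):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan, het_white

    rng = np.random.default_rng(0)
    x = rng.uniform(1, 10, 500)
    y = 2 + 0.5 * x + rng.normal(scale=x)       # error variance grows with x
    X = sm.add_constant(x)
    resid = sm.OLS(y, X).fit().resid

    # Breusch-Pagan: you specify the variables thought to drive the variance
    bp_lm, bp_pval, _, _ = het_breuschpagan(resid, X)
    # White: automatically uses the regressors, their squares, and cross products
    w_lm, w_pval, _, _ = het_white(resid, X)
    print(bp_pval, w_pval)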

9
Q

A simple way to visually inspect whether the results are likely to be heteroskedastic is to:

A

examine a scatterplot of the residuals (estimated error terms) against X.
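For example (a sketch with simulated data of my own; a fan or funnel shape in the plot is the typical visual sign of heteroskedasticity):

    import numpy as np
    import matplotlib.pyplot as plt
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.uniform(1, 10, 200)
    y = 2 + 0.5 * x + rng.normal(scale=x)       # error variance grows with x
    resid = sm.OLS(y, sm.add_constant(x)).fit().resid

    plt.scatter(x, resid)                       # fan shape suggests heteroskedasticity
    plt.axhline(0, linewidth=1)
    plt.xlabel("X")
    plt.ylabel("OLS residuals")
    plt.show()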

10
Q

Which of the following statements related to heteroskedasticity are correct?

A

The OLS estimator is still linear in the parameters and gives unbiased estimates of the betas, but it is no longer the best (minimum-variance) linear unbiased estimator.

11
Q

The Harvey-Godfrey test assumes that the heteroskedasticity has a linear functional form in a specific X.

A

False; it assumes an exponential (multiplicative) variance function, estimated by regressing ln(resid²) on X.

12
Q

When testing for heteroskedasticity, you will reject the null hypothesis of homoskedasticity if the t-statistic is greater than the critical t-value.

A

False; reject if LM > LM* (the chi-squared critical value).

13
Q

The binary dependent variable model is an example of a:

A

limited dependent variable model

14
Q

In the binary dependent variable model, a predicted value of 0.6 means that:

A

given the values for the explanatory variables, there is a 60 percent probability that the dependent variable will equal one.

15
Q

E(Y|X1,…, Xk) = Pr(Y = 1| X1,…, Xk) means that:

A

for a binary variable model, the predicted value from the population regression is the probability that Y=1, given X.

16
Q

In the linear probability model, the interpretation of the slope coefficient is:

A

the change in the probability that Y = 1 associated with a unit change in X, holding other regressors constant.
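A minimal linear probability model sketch (the simulated data and variable names are my own assumptions; robust standard errors are used because the LPM error term is heteroskedastic by construction):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    income = rng.uniform(20, 100, 1000)              # hypothetical regressor
    p = np.clip(0.05 + 0.008 * income, 0, 1)         # true Pr(Y = 1 | income)
    buys_car = rng.binomial(1, p)                    # binary dependent variable

    lpm = sm.OLS(buys_car, sm.add_constant(income)).fit(cov_type="HC1")
    print(lpm.params)  # slope = change in Pr(Y = 1) per one-unit change in income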

17
Q

The following tools from multiple regression analysis carry over in a meaningful manner to the linear probability model, with the exception of the:

A

Regression R²

18
Q

An alternative method of estimating Binary Outcome Models is the

A

Logit Model
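A sketch of the logit alternative on the same kind of simulated, hypothetical data; unlike the linear probability model, its fitted probabilities stay strictly between 0 and 1:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    income = rng.uniform(20, 100, 1000)
    p = 1 / (1 + np.exp(-(-4 + 0.06 * income)))      # true logistic probability
    buys_car = rng.binomial(1, p)

    logit = sm.Logit(buys_car, sm.add_constant(income)).fit()
    print(logit.predict().min(), logit.predict().max())   # fitted Pr(Y = 1) stay in (0, 1)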

19
Q

Provide an example of a Binary Outcome (Limited Dependent Variable).

A

Getting a new car or not getting a new car based on income, wealth, location, and job status.

20
Q

For the polynomial regression model:

A

the techniques for estimation and inference developed for multiple regression can be applied.

21
Q

Assume that you had estimated the following quadratic regression model: testscore^ = 607.3 + 3.85 Income - 0.0423 Income². If income increased from 10 to 11 ($10,000 to $11,000), then the predicted effect on test scores would be:

A

2.96
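Worked out: Δtestscore = 3.85(11 - 10) - 0.0423(11² - 10²) = 3.85 - 0.0423(21) ≈ 2.96.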

22
Q

Consider the following regression model: savingsi = β0 + β1age + β2age² + ui. The overall change in savings caused by a one-year change in age is equal to β1.

A

False; the effect of a one-year increase in age is β1 + β2(2age + 1), which depends on age.

23
Q

Heteroskedasticity means that:

A

the variance of the error term is not constant.

24
Q

When a model has heteroskedastic errors, you can use OLS with heteroskedasticity-robust standard errors because

A

the exact structure of the heteroskedasticity is rarely known.
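In statsmodels, for example, robust (White) standard errors can be requested directly; a sketch on simulated data (the variable names and data-generating process are assumptions of mine):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    x = rng.uniform(1, 10, 300)
    y = 1 + 2 * x + rng.normal(scale=x)         # heteroskedastic errors
    X = sm.add_constant(x)

    ols = sm.OLS(y, X).fit()                    # conventional standard errors
    robust = sm.OLS(y, X).fit(cov_type="HC1")   # heteroskedasticity-robust (White) SEs
    print(ols.bse, robust.bse)                  # same coefficients, different standard errors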

25
Q

In the presence of heteroskedasticity, if using White’s estimation the OLS estimator is:

A

unbiased and more precise

26
Q

White's test is a very general heteroskedasticity test that tests for several different structures of heteroskedasticity.

A

True

27
Q

The linear probability model is

A

the application of the linear multiple regression model to a binary dependent variable.

28
Q

The major flaw of the linear probability model is that

A

the predicted values can lie above 1 and below 0.

29
Q

Consider the following least squares specification between test scores and district income: testscores^ = 557.8 + 36.42 ln(Income). According to this equation, a 1% increase in income is associated with an increase in test scores of:

A

0.36 points (36.42/100)
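Worked out: Δtestscores ≈ 36.42 × Δln(Income) = 36.42 × ln(1.01) ≈ 36.42 × 0.01 ≈ 0.36.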

30
Q

Breusch-Pagan test: uses resid² in an auxiliary regression. Hypotheses tested:

A

H0: a1 = 0 (homoskedasticity)
Ha: a1 ≠ 0 (heteroskedasticity)

31
Q

Calculate LM

A

LM = n·R² (sample size times the R² of the auxiliary regression)
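A sketch of the by-hand LM calculation (auxiliary regression of the squared residuals on the suspected variance regressors; the simulated data are my own assumption):

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import chi2

    rng = np.random.default_rng(5)
    x = rng.uniform(1, 10, 400)
    y = 1 + 2 * x + rng.normal(scale=x)
    X = sm.add_constant(x)

    resid = sm.OLS(y, X).fit().resid
    aux = sm.OLS(resid ** 2, X).fit()       # auxiliary regression: resid^2 on X
    LM = aux.nobs * aux.rsquared            # LM = n * R^2 of the auxiliary regression
    LM_star = chi2.ppf(0.95, 1)             # critical value, df = number of slope regressors
    print(LM, LM_star, LM > LM_star)        # reject H0 (homoskedasticity) if LM > LM*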

32
Q

LM > LM*
LM < LM*

A

Reject H0 (heteroskedasticity)
Do not reject H0 (homoskedasticity)

33
Q

Correct for Heteroskedasticity

A

Use a robust OLS model (OLS with heteroskedasticity-robust standard errors).

34
Q

Log changes

A

always interpreted as percentage changes (Δln X ≈ ΔX/X)

35
Q

|elasticity| < 1
|elasticity| > 1

A

Inelastic
Elastic

36
Q

Variables measured in years and variables measured in percentages

A

should not be logged

37
Q

Logs are not used if

A

variables take on 0 or negative values

38
Q

Heteroskedasticity is

A

the error term has non-constant variance across observations

39
Q

Consequences of Heteroskedasticity

A
  • OLS estimates are still unbiased
  • Estimated variances and covariances of the OLS estimates are biased and inconsistent (higher variance → higher standard errors)
  • Hypothesis tests are not valid (higher standard errors → low t-statistics)
40
Q

External Validity

A

statistical inferences can be generalized from the population and setting studied to other populations and settings
- Threats: assessed using knowledge and judgment on a case-by-case basis

41
Q

Internal Validity

A

statistical inferences about causal effects are valid for the population being studied
- Threats: omitted variable bias, wrong functional form, errors-in-variables bias, sample selection bias, simultaneous causality bias (all violations of Gauss-Markov assumption 2)

42
Q

Improve Validity

A

Obtain better data, develop a model for the measurement error, or use instrumental variables regression.