Midterm Flashcards
The average of the OLS fitted values for any sample is always zero
False
When one includes an irrelevant independent variable in a regression, we call it “over controlling.”
False
If we were to change the units of measurement for one of the independent variables, the coefficient estimates for all independent variables would change.
False
If we run a regression in Stata and obtain a p-value equal to .06, we would reject the null hypothesis that the coefficient is equal to zero at the 5 percent level, but would fail to reject that null at the 10 percent level.
False
Multicollinearity refers to the situation where the independent variables are highly correlated. Multicollinearity does not cause the OLS estimator to be biased, but it does generally increase the standard errors.
True
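Why: in the usual textbook notation, the sampling variance of the OLS slope on x_j under homoskedasticity is
\[
\operatorname{Var}(\hat{\beta}_j) = \frac{\sigma^2}{SST_j\,(1 - R_j^2)},
\]
where R_j^2 is the R-squared from regressing x_j on the other regressors. High correlation among the regressors pushes R_j^2 toward 1, inflating the variance (and standard error) without introducing bias.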
If an estimator is consistent, then as the size of the random sample increases the estimator moves towards the true population parameter value.
True
The zero conditional mean assumption implies that u_i = 0 for all i regardless of the value of x_i.
False
We do not need the normality of the error term assumption to perform valid statistical inference if the other multiple linear regression model assumptions hold and we have a large sample.
True
Omitting an independent variable that is correlated with the dependent variable from an OLS regression always causes the estimated coefficients on the included independent variables to be biased.
False
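Why: in the two-regressor case with true model y = β_0 + β_1 x_1 + β_2 x_2 + u, omitting x_2 gives
\[
\operatorname{E}[\tilde{\beta}_1] = \beta_1 + \beta_2 \tilde{\delta}_1,
\]
where \tilde{\delta}_1 is the slope from regressing x_2 on x_1. The bias vanishes if the omitted variable is uncorrelated with the included regressor (\tilde{\delta}_1 = 0), so correlation with the dependent variable alone is not enough.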
A confidence interval for a prediction is always at least as narrow as (and in practice narrower than) the corresponding prediction interval.
True
One of the most important differences between an applied regression analysis course from the Statistics Department and an econometrics course from the Economics Department is the degree to which the course focuses on estimation bias caused by endogenous variables.
True
The central limit theorem states that the sample mean of a variable, when it is standardized (by the population standard deviation), has a standard normal distribution, even if the variable itself is not normally distributed.
True
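In symbols, for a random sample with population mean μ and standard deviation σ:
\[
\frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}} \;\xrightarrow{d}\; N(0,1) \quad \text{as } n \to \infty.
\]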
Under assumptions SLR.1–SLR.4, the OLS estimates equal the true population parameters.
False
A violation of the zero conditional mean assumption would cause the OLS estimator to be biased.
True
Overspecifying a model (by adding irrelevant control variables) would cause the OLS estimator to be biased.
False
The sum of the squared residuals (SSR) is equal to the difference between the total sum of squares (SST) and the explained sum of squares (SSE).
True
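In the usual (Wooldridge-style) notation, where SSE denotes the explained sum of squares:
\[
SST = SSE + SSR, \qquad SST = \sum_i (y_i - \bar{y})^2, \quad SSE = \sum_i (\hat{y}_i - \bar{y})^2, \quad SSR = \sum_i \hat{u}_i^2,
\]
so SSR = SST − SSE and R^2 = SSE/SST = 1 − SSR/SST.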
A sample correlation coefficient of 0.95 between the regressor of interest and another regressor in the model would cause the OLS estimator to be biased.
False
The reason OLS is commonly used is because it is the most computationally efficient unbiased linear estimator.
False
Omitting a variable that is correlated with the regressor of interest would cause the OLS estimator to be biased if the omitted variable is uncorrelated with the outcome variable.
False
Imperfect multicollinearity causes OLS estimates to be biased.
False
Multicollinearity increases the…
Variance of the estimator
Heteroskedasticity causes the OLS estimator to be biased
False
Heteroskedasticity causes the OLS estimator's…
standard errors to be wrong
Multicollinearity causes the OLS estimates to have large standard errors.
True
If Stata reports a p-value equal to .04 then we would reject the null hypothesis that the coefficient is equal to zero at the 1 percent level, but would fail to reject that null at the 5 percent level.
False
The p-value is the probability of the null hypothesis being true, given the observations.
False
What is the probability of obtaining an estimate as extreme as, or more extreme than, the one obtained if the null hypothesis were true?
P-Value
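A minimal sketch of the computation for a two-sided t test, using a hypothetical t-statistic and degrees of freedom (not values from the course):
```python
# Two-sided p-value: P(|T| >= |t_obs|) when the null hypothesis is true.
from scipy import stats

t_obs = 1.9   # hypothetical t-statistic
df = 120      # hypothetical residual degrees of freedom (n - k - 1)

p_value = 2 * stats.t.sf(abs(t_obs), df)  # sf(x) = 1 - cdf(x)
print(round(p_value, 3))
```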
The t-statistic is the ratio of the parameter estimate to the variance of the parameter estimate.
False
The standard error of a parameter estimate is also called the root mean squared error of the regression.
False
If a parameter estimate is consistent, then as the size of the random sample increases to infinity the parameter estimate will move towards the true population parameter value.
True
If an estimator of a population parameter is unbiased, then in repeated sampling from the population, the average value of the parameter estimates will equal the true population parameter as the number of random samples goes to infinity.
True
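A minimal simulation sketch (with an assumed data-generating process, not one from the course) illustrating both properties for the OLS slope: its average across repeated samples stays near the true value (unbiasedness), and its spread shrinks as n grows (consistency).
```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 2.0            # assumed true population parameters

def ols_slope(n):
    x = rng.normal(size=n)
    y = beta0 + beta1 * x + rng.normal(size=n)  # error has zero conditional mean
    return np.polyfit(x, y, 1)[0]               # fitted OLS slope

for n in (30, 300, 3000):
    draws = np.array([ols_slope(n) for _ in range(2000)])
    print(n, draws.mean().round(3), draws.std().round(3))
# The mean stays near 2.0 at every n (unbiasedness); the spread falls with n (consistency).
```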
If the sample size is large, we can perform statistical inference even if assumption MLR.6 (Normality) is violated.
True
Adding an additional independent variable to the regression cannot cause a decrease in the value of R-squared.
True
Adding an additional independent variable to the regression cannot cause a decrease in the value of adjusted R-squared.
False
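The adjusted R-squared formula shows why:
\[
\bar{R}^2 = 1 - \frac{(1 - R^2)(n - 1)}{n - k - 1},
\]
where k is the number of regressors. The degrees-of-freedom penalty means adding an irrelevant variable can lower adjusted R-squared even though R-squared itself never falls.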
Over controlling in a multiple regression model occurs when one includes an explanatory variable that is a pathway through which the independent variable of interest affects the dependent variable.
True
The prediction interval is always wider than the confidence interval for the prediction.
True
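Sketch of the standard argument: the prediction error includes the unobserved error term, so its variance is
\[
\operatorname{Var}(\hat{e}^0) = \operatorname{Var}(\hat{y}^0) + \sigma^2,
\]
which exceeds the variance behind the confidence interval for E(y | x); the prediction interval is built on this larger variance and is therefore wider.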
To model whether there are increasing or decreasing returns to a particular independent variable, one should include an interaction term.
False
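Increasing or decreasing returns call for a quadratic term rather than an interaction. For example, in
\[
y = \beta_0 + \beta_1 x + \beta_2 x^2 + u, \qquad \frac{\partial y}{\partial x} = \beta_1 + 2\beta_2 x,
\]
the sign of β_2 determines whether the marginal effect of x rises or falls with x.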
If the dependent variable is a binary variable, the error term is obviously not normally distributed. This may result in biased OLS estimates.
False
Overspecifying the model causes bias.
False
Overspecifying the model can increase the variance of the OLS estimator.
True
Measures how many standard deviations beta j hat is away from the hypothesized value of beta j.
t-statistic
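In symbols, for a hypothesized value a_j (zero in the default regression output):
\[
t = \frac{\hat{\beta}_j - a_j}{\operatorname{se}(\hat{\beta}_j)}.
\]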
The correlation between the residuals from a regression and each of the X variables is positive.
False
What is the percent change in one variable given a 1% ceteris paribus increase in another variable?
Elasticity
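In symbols:
\[
\text{elasticity} = \frac{\partial \ln y}{\partial \ln x} \approx \frac{\%\Delta y}{\%\Delta x},
\]
so in a log-log regression, ln y = β_0 + β_1 ln x + u, the slope β_1 is read directly as an elasticity.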
What is the property that, as we get more and more data, the standard error of beta j hat gets smaller and smaller at the rate of one over the square root of n?
Root-n convergence
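In symbols, for some constant c:
\[
\operatorname{se}(\hat{\beta}_j) \approx \frac{c}{\sqrt{n}},
\]
so quadrupling the sample size roughly halves the standard error.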