Econometrics Flashcards
OLS
Why use Ordinary Least Squares?
- OLS is relatively easy to use.
- The goal of minimizing the sum of squared errors is appropriate from a theoretical point of view.
- OLS estimates have a number of useful characteristics.
Ordinary Least Squares (OLS)
is a regression estimation technique that calculates the estimated slope coefficients so as to minimize the sum of the squared residuals
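The definition above can be sketched numerically. This is a minimal illustration with hypothetical data: the slope and intercept come from the textbook normal equations, and perturbing the slope confirms that no nearby coefficient pair gives a smaller sum of squared residuals.

```python
import numpy as np

# Hypothetical data: roughly y = 2 + 3x plus a small fixed disturbance.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 3.0 * x + np.array([0.1, -0.2, 0.05, -0.05, 0.1])

# OLS estimates from the normal equations:
# beta1_hat = cov(x, y) / var(x),  beta0_hat = ybar - beta1_hat * xbar
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

residuals = y - (beta0 + beta1 * x)
ssr = np.sum(residuals ** 2)

# Any other slope gives a larger sum of squared residuals:
ssr_perturbed = np.sum((y - (beta0 + (beta1 + 0.01) * x)) ** 2)
assert ssr_perturbed > ssr
```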
In a simple linear regression model, the standard error of the slope coefficient is expected to shrink at a rate that is equal to the inverse of the:
A) Square root of the number of parameters in the model
B) Square of the sample size
C) The sample size minus the number of parameters in the model
D) Square root of the sample size
d) Square root of the sample size
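The 1/sqrt(n) rate can be checked with the analytic formula se(β̂₁) = σ / sqrt(Σ(xᵢ − x̄)²), which holds under the classical assumptions. In this sketch a fixed hypothetical x-pattern is tiled so that Σ(xᵢ − x̄)² grows in proportion to n; quadrupling the sample size then halves the standard error.

```python
import numpy as np

def slope_se(x, sigma):
    # Analytic s.e. of the OLS slope: sigma / sqrt(sum of squared deviations of x)
    return sigma / np.sqrt(np.sum((x - x.mean()) ** 2))

sigma = 1.0                               # assumed error standard deviation
block = np.array([1.0, 2.0, 3.0, 4.0])    # hypothetical x pattern
x_n  = np.tile(block, 25)                 # n = 100
x_4n = np.tile(block, 100)                # n = 400 (sample size quadrupled)

# The s.e. shrinks at rate 1/sqrt(n): quadrupling n halves it.
ratio = slope_se(x_n, sigma) / slope_se(x_4n, sigma)
```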
In a multiple linear regression model, the OLS estimates are inconsistent if:
A) There is correlation between the dependent variables and the error term
B) There is correlation between the independent variables and the error term
C) There is correlation between the independent variables
D) The sample size is less than the number of parameters in the model
b) There is correlation between the independent variables and the error term
If Betaj^ is an unbiased and consistent estimator of Betaj, then which of the following statements is correct?
A) The distribution of Betaj^ becomes more and more spread around the Betaj as the sample size grows
B) The distribution of Betaj^ collapses to a single point Betaj when the sample size tends to infinity
C) The distribution of Betaj^ tends toward a standard normal distribution as the sample size grows
D) None of the other statements are correct
b) The distribution of Betaj^ collapses to a single point Betaj when the sample size tends to infinity
In a multiple linear regression model, if the variance of the error term conditional on an explanatory variable is not constant then:
A) The t statistics are invalid and confidence intervals are valid for small sample sizes
B) The t statistics and confidence intervals are invalid no matter how large the sample size is
C) The t statistics are valid and confidence intervals are invalid for small sample sizes
D) The t statistics and confidence intervals are valid no matter how large the sample size is
b) The t statistics and confidence intervals are invalid no matter how large the sample size is
In a simple linear regression model, a change in the dependent variable’s unit of measurement does not lead to a change in:
A) The confidence intervals of the regression
B) The sum of squared residuals of the regression
C) The goodness-of-fit of the regression
D) The standard error of the regression
C) The goodness-of-fit of the regression
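The invariance of the goodness-of-fit can be verified directly: R² is a ratio of sums of squares in the same units, so rescaling y (e.g. dollars to cents) cancels out. A small sketch with hypothetical data:

```python
import numpy as np

def r_squared(x, y):
    # Simple-regression R-squared = 1 - SSR / SST
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    ssr = np.sum((y - b0 - b1 * x) ** 2)
    sst = np.sum((y - y.mean()) ** 2)
    return 1 - ssr / sst

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])   # hypothetical data, roughly y = 2x

# Measuring y in different units (multiply by 100) leaves R-squared unchanged,
# even though the SSR and the standard error of the regression both change.
assert abs(r_squared(x, y) - r_squared(x, 100 * y)) < 1e-12
```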
Measurement error occurs in a simple linear regression model when:
A) The observed value of a variable used in the model differs from its actual value
B) The model includes more than two independent variables
C) The partial effect of an independent variable depends on unobserved factors
D) The dependent variable is binary, but the independent variable is continuous
A) The observed value of a variable used in the model differs from its actual value
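A consequence of classical measurement error in the regressor is attenuation bias: the slope is pulled toward zero by the factor var(x)/(var(x) + var(error)). A simulated sketch (seeded, hypothetical parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta1 = 2.0                                # true slope (assumed)

x_true = rng.normal(0.0, 1.0, n)           # actual value of the regressor
y = beta1 * x_true + rng.normal(0.0, 1.0, n)
x_obs = x_true + rng.normal(0.0, 1.0, n)   # observed value differs from actual

def ols_slope(x, y):
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Classical errors-in-variables attenuates the slope toward zero:
# plim of the slope = beta1 * var(x) / (var(x) + var(meas. error)) = 2 * 1/2 = 1.
b_noisy = ols_slope(x_obs, y)
b_clean = ols_slope(x_true, y)
```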
In a hypothesis test, the significance level used is:
A) The probability of rejecting the null hypothesis when it is true
B) One minus the probability of rejecting the null hypothesis when it is true
C) One minus the probability of rejecting the null hypothesis when it is false
D) The probability of accepting the null hypothesis when it is true
A) The probability of rejecting the null hypothesis when it is true
The F statistic can be used to test non-nested models
True
False
False
Predictions made for the dependent variable in a multiple linear regression model are subject to sampling variation
True
False
True
In a multiple regression model with an independent variable that is a dummy variable, the coefficient on the dummy variable for a particular group represents the estimated difference in intercepts between that group and the base group.
True
False
True
In a multiple linear regression where the Gauss-Markov assumptions hold, why can you interpret each coefficient as a ceteris paribus effect?
Because the Ordinary Least Squares (OLS) estimator of the coefficient on variable xj is based on the covariance between the dependent variable and the variable xj after the effects of the other regressors have been removed.
In a random sample:
All the individuals or units from the population have the same probability of being chosen.
What assumption is necessarily violated if the weekly endowment of time (168 hours) is entirely spent either studying, or sleeping, or working, or in leisure activities?
No perfect multicollinearity
Take an observed (that is, estimated) 95% confidence interval for a parameter of a multiple linear regression. Then:
We cannot assign a probability to the event that the true parameter value lies inside that interval.
In testing multiple exclusion restrictions in the multiple regression model under the classical assumptions, we are more likely to reject the null that some coefficients are zero if:
the R-squared of the unrestricted model is large relative to the R-squared of the restricted model.
In the Chow test the null hypothesis is:
all the coefficients in a regression model are the same in two separate populations.
The significance level of a test is:
the probability of rejecting the null hypothesis when it is true.
What is true of confidence intervals?
Confidence intervals are also called interval estimates.
Define the upper bound of a confidence interval for population parameter β
The upper bound of the confidence interval for a population parameter β is given by β̂ + critical value · s.e.(β̂).
Which of the following is true of the standard error of the OLS slope estimator, s.e. β ?
It is an estimate of the standard deviation of the OLS slope estimator.
A restricted model will always have fewer parameters than the unrestricted model
True
False
True
The F statistic is always nonnegative as SSRr is never smaller than SSRur.
True
False
True
The population parameter in the null hypothesis…
A) Is always equal to 0
B) Never equal to 0
C) Is not always equal to 0
c) Is not always equal to 0
Changing the unit of measurement of an independent variable whose log appears in the regression affects…
only the intercept coefficient.
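The reason is the identity log(c·x) = log(c) + log(x): rescaling x only adds a constant to the regressor, which the intercept absorbs. A sketch with hypothetical data:

```python
import numpy as np

def ols(xcol, y):
    # Simple OLS of y on a single regressor, returning (intercept, slope)
    b1 = np.sum((xcol - xcol.mean()) * (y - y.mean())) / np.sum((xcol - xcol.mean()) ** 2)
    b0 = y.mean() - b1 * xcol.mean()
    return b0, b1

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([1.0, 1.8, 2.9, 4.2, 5.1])    # hypothetical data

b0, b1 = ols(np.log(x), y)          # x in original units
c0, c1 = ols(np.log(100 * x), y)    # x rescaled by 100

# Since log(100x) = log(100) + log(x), the slope is unchanged and
# only the intercept shifts, by -b1 * log(100):
assert abs(b1 - c1) < 1e-10
assert abs(c0 - (b0 - b1 * np.log(100))) < 1e-8
```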
A predicted value of a dependent variable…
represents the expected value of the dependent variable given particular values for the explanatory variables.
Which Gauss-Markov assumption is violated by the linear probability model?
The assumption of constant variance of the error term (heteroskedasticity).
The heteroskedasticity-robust t statistics are justified only if the sample size is large.
True
False
True
What does the regression slope coefficient indicate?
by how many units the conditional mean of y increases, given a one unit increase in x
beta 1 hat has a smaller standard error, ceteris paribus, if
there is more variation in the explanatory variable, x
What is BLUE?
Best Linear Unbiased Estimator: under the Gauss-Markov assumptions, OLS has the smallest variance in the class of linear unbiased estimators of the parameters.
R-squared can never decrease when another independent variable is added to a regression
True
False
True
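The "never decreases" fact follows because OLS can always set the new coefficient to zero and reproduce the smaller model's fit. A sketch with hypothetical data, adding an arbitrary extra regressor:

```python
import numpy as np

def r2(X, y):
    # OLS via least squares on a design matrix X (first column = intercept)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0])   # arbitrary extra regressor
y  = np.array([2.0, 4.1, 5.9, 8.2, 9.8, 12.1])  # hypothetical data

ones = np.ones_like(y)
r2_small = r2(np.column_stack([ones, x1]), y)
r2_big   = r2(np.column_stack([ones, x1, x2]), y)

# Adding a regressor can never lower R-squared:
assert r2_big >= r2_small - 1e-12
```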
When regressors are highly correlated (multicollinearity), the OLS estimators can still be unbiased, but the parameters are estimated with lower precision
True
False
True
Increasing the sample size
A) Increases variance
B) Keeps variance the same
C) Reduces variance
C) Reduces variance
What is the p-value?
The smallest significance level at which the null hypothesis would be rejected.
The rejection rule in terms of the p-value is…
If the p-value < alpha, we reject the null.
At what point can the null not be rejected?
When the significance level is reduced below the p-value.
a large p value is in favour of the null
True
False
True
the larger F is, the larger the SSR restricted relative to SSR unrestricted, so the worse the explanatory power of the restricted model. What does this say about the null H0?
It is evidence against the null H0: the larger F is, the more likely we are to reject H0.
the r squared of the restricted model is…
Zero by definition when testing the overall significance of the regression, since the restricted model then contains only an intercept.
the adjusted r squared takes into account the number of variables in a model and it may…
A) Decrease
B) Stay unchanged
C) Increase
A) Decrease
What do you do if MLR3 is violated with a perfect linear function
drop one of the independent variables
if Fstat > Fcritical…
then we reject the null: x1 and x2 are jointly significant.
if there is heteroskedasticity
OLS is no longer the most efficient estimator, and the usual standard errors are not valid for inference.
The F statistic is always nonnegative because…
SSRr is never smaller than SSRur, so SSRr − SSRur ≥ 0.
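This can be checked numerically: imposing restrictions can never improve the fit, so the F statistic's numerator is nonnegative. A sketch with hypothetical data, where the restricted model is intercept-only (testing joint significance of x1 and x2):

```python
import numpy as np

def ssr(X, y):
    # Sum of squared residuals from OLS on design matrix X
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0])
y  = np.array([1.2, 2.1, 3.3, 3.9, 5.2, 5.8, 7.1, 7.9])
ones = np.ones_like(y)

ssr_r  = ssr(np.column_stack([ones]), y)          # restricted: intercept only
ssr_ur = ssr(np.column_stack([ones, x1, x2]), y)  # unrestricted

q, n, k = 2, len(y), 3   # q restrictions; k parameters in the unrestricted model
F = ((ssr_r - ssr_ur) / q) / (ssr_ur / (n - k))

assert ssr_r >= ssr_ur   # restrictions never reduce the SSR
assert F >= 0
```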
changing the unit of measurement of any independent variable, where log of the independent variable appears in the regression affects
A) Only the beta slope coefficient(s)
B) Only the intercept coefficient
C) All coefficients
B) Only the intercept coefficient
the assumption of constant variance of the error term is violated by the linear probability model
True
False
True
What are the assumptions of MLR? (1-5)
MLR1: Linear in parameters; MLR2: Random sampling; MLR3: No perfect collinearity; MLR4: Zero conditional mean; MLR5: Homoskedasticity
What does MLR1-4 ensure?
Unbiasedness of the OLS estimators
In a regression Y = beta0 + x1beta1 + x2beta2 + u, if x2 is omitted, which of the following are correct?
A) When beta2 > 0 and corr(x1, x2) > 0, there is a positive bias
B) When beta2 < 0 and corr(x1, x2) > 0, there is a negative bias
C) When beta2 > 0 and corr(x1, x2) < 0, there is a positive bias
D) When beta2 < 0 and corr(x1, x2) < 0, there is a negative bias
E) A and B are correct
F) All of the above are correct
E) A and B are correct
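Case A above can be illustrated with a seeded simulation: with β₂ > 0 and corr(x1, x2) > 0, the short regression's slope is biased upward by β₂·δ₁, where δ₁ is the slope from regressing x2 on x1 (δ₁ = 0.5 in this hypothetical setup, so the bias is +1).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
beta1, beta2 = 1.0, 2.0                    # true coefficients, beta2 > 0 (assumed)

x1 = rng.normal(0.0, 1.0, n)
x2 = 0.5 * x1 + rng.normal(0.0, 1.0, n)    # corr(x1, x2) > 0
y = beta1 * x1 + beta2 * x2 + rng.normal(0.0, 1.0, n)

# Short regression of y on x1 alone (x2 omitted):
b1_short = np.sum((x1 - x1.mean()) * (y - y.mean())) / np.sum((x1 - x1.mean()) ** 2)

# Omitted-variable-bias formula: plim b1_short = beta1 + beta2 * delta1
#                                             = 1 + 2 * 0.5 = 2 (positive bias).
```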