OLS Flashcards
OLS
Why use Ordinary Least Squares?
- OLS is relatively easy to use.
- The goal of minimizing the sum of squared errors is appropriate from a theoretical point of view.
- OLS estimates have a number of useful characteristics.
Ordinary Least Squares (OLS)
A regression estimation technique that calculates the estimated coefficients so as to minimize the sum of the squared residuals.
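A minimal sketch of this definition, using made-up data and numpy: the simple-regression slope from the closed form cov(x, y)/var(x), with a check that the fitted coefficients really do minimize the sum of squared residuals.

```python
import numpy as np

# Illustrative data only: true intercept 2, true slope 3, standard normal noise.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 + 3.0 * x + rng.normal(size=200)

# Closed-form OLS estimates for the simple regression of y on x.
beta1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
beta0 = y.mean() - beta1 * x.mean()

def ssr(b0, b1):
    """Sum of squared residuals for a candidate (intercept, slope)."""
    return np.sum((y - b0 - b1 * x) ** 2)

# The OLS estimates minimize the SSR: nudging either coefficient raises it.
assert ssr(beta0, beta1) < ssr(beta0, beta1 + 0.1)
assert ssr(beta0, beta1) < ssr(beta0 + 0.1, beta1)
```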
In a simple linear regression model, the standard error of the slope coefficient is expected to shrink at a rate that is equal to the inverse of the:
A) Square root of the number of parameters in the model
B) Square of the sample size
C) The sample size minus the number of parameters in the model
D) Square root of the sample size
D) Square root of the sample size
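A hedged Monte Carlo sketch of this rate: quadrupling the sample size should roughly halve the slope's standard error, since 1/sqrt(4n) = (1/2)(1/sqrt(n)). The data-generating process is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def slope_se(n, reps=2000):
    """Monte Carlo standard deviation of the OLS slope over `reps` samples of size n."""
    slopes = np.empty(reps)
    for r in range(reps):
        x = rng.normal(size=n)
        y = 1.0 + 2.0 * x + rng.normal(size=n)
        slopes[r] = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return slopes.std(ddof=1)

# Expect a ratio near sqrt(200/50) = 2, up to simulation noise.
ratio = slope_se(50) / slope_se(200)
```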
In a multiple linear regression model, the OLS estimates are inconsistent if:
A) There is correlation between the dependent variables and the error term
B) There is correlation between the independent variables and the error term
C) There is correlation between the independent variables
D) The sample size is less than the number of parameters in the model
B) There is correlation between the independent variables and the error term
If Betaj^ is an unbiased and consistent estimator of Betaj, then which of the following statements is correct?
A) The distribution of Betaj^ becomes more and more spread around the Betaj as the sample size grows
B) The distribution of Betaj^ collapses to a single point Betaj when the sample size tends to infinity
C) The distribution of Betaj^ tends toward a standard normal distribution as the sample size grows
D) None of the other statements are correct
B) The distribution of Betaj^ collapses to a single point Betaj when the sample size tends to infinity
In a multiple linear regression model, if the variance of the error term conditional on an explanatory variable is not constant then:
A) The t statistics are invalid and confidence intervals are valid for small sample sizes
B) The t statistics and confidence intervals are invalid no matter how large the sample size is
C) The t statistics are valid and confidence intervals are invalid for small sample sizes
D) The t statistics and confidence intervals are valid no matter how large the sample size is
B) The t statistics and confidence intervals are invalid no matter how large the sample size is
In a simple linear regression model, a change in the dependent variable’s unit of measurement does not lead to a change in:
A) The confidence intervals of the regression
B) The sum of squared residuals of the regression
C) The goodness-of-fit of the regression
D) The standard error of the regression
C) The goodness-of-fit of the regression
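A small numpy demo of this card, with made-up data: dividing y by 1000 (say, dollars to thousands of dollars) rescales the SSR by (1/1000)^2 but leaves R-squared unchanged, since both SSR and total sum of squares scale by the same factor.

```python
import numpy as np

# Illustrative data only.
rng = np.random.default_rng(3)
x = rng.normal(size=100)
y = 5.0 + 2.0 * x + rng.normal(size=100)

def fit_r2_ssr(yv):
    """Simple-regression R-squared and SSR of yv on x."""
    b1 = np.cov(x, yv, ddof=1)[0, 1] / np.var(x, ddof=1)
    b0 = yv.mean() - b1 * x.mean()
    ssr = np.sum((yv - b0 - b1 * x) ** 2)
    r2 = 1.0 - ssr / np.sum((yv - yv.mean()) ** 2)
    return r2, ssr

r2_a, ssr_a = fit_r2_ssr(y)
r2_b, ssr_b = fit_r2_ssr(y / 1000.0)  # change the units of y
# R-squared is unchanged; the SSR shrinks by a factor of 1000^2.
```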
Measurement error occurs in a simple linear regression model when:
A) The observed value of a variable used in the model differs from its actual value
B) The model includes more than two independent variables
C) The partial effect of an independent variable depends on unobserved factors
D) The dependent variable is binary, but the independent variable is continuous
A) The observed value of a variable used in the model differs from its actual value
In a hypothesis test, the significance level used is:
A) The probability of rejecting the null hypothesis when it is true
B) One minus the probability of rejecting the null hypothesis when it is true
C) One minus the probability of rejecting the null hypothesis when it is false
D) The probability of accepting the null hypothesis when it is true
A) The probability of rejecting the null hypothesis when it is true
The F statistic can be used to test non-nested models
True
False
False
Predictions made for the dependent variable in a multiple linear regression model are subject to sampling variation
True
False
True
In a multiple regression model with an independent variable that is a dummy variable, the coefficient on the dummy variable for a particular group represents the estimated difference in intercepts between that group and the base group.
True
False
True
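A hedged sketch of the dummy-intercept card, with made-up data: in a regression of y on a constant and a single dummy, the dummy's coefficient equals the difference between the group's mean and the base group's mean, i.e. the estimated intercept shift.

```python
import numpy as np

# Illustrative data: base-group intercept 10, intercept shift 3 for group d = 1.
rng = np.random.default_rng(4)
d = rng.integers(0, 2, size=500)            # group indicator (dummy)
y = 10.0 + 3.0 * d + rng.normal(size=500)

# OLS of y on [1, d] via least squares.
X = np.column_stack([np.ones(d.size), d.astype(float)])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# The dummy coefficient matches the difference in group means.
gap = y[d == 1].mean() - y[d == 0].mean()
```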
In a multiple linear regression where the Gauss-Markov assumptions hold, why can you interpret each coefficient as a ceteris paribus effect?
Because the Ordinary Least Squares (OLS) estimator of the coefficient on variable xj is based on the covariance between the dependent variable and the variable xj after the effects of the other regressors have been removed.
In a random sample:
All the individuals or units from the population have the same probability of being chosen.
What assumption is necessarily violated if the weekly endowment of time (168 hours) is entirely spent either studying, or sleeping, or working, or in leisure activities?
No perfect multicollinearity
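A sketch of why this violates the no-perfect-multicollinearity assumption, with made-up time-use data: if study + sleep + work + leisure always sum to 168, the four regressors plus an intercept are exactly linearly dependent, so the design matrix loses a rank and the OLS coefficients are not identified.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
study = rng.uniform(0, 40, n)
sleep = rng.uniform(40, 60, n)
work = rng.uniform(0, 40, n)
leisure = 168.0 - study - sleep - work  # exact linear dependence with the rest

# Design matrix with an intercept and all four time uses.
X = np.column_stack([np.ones(n), study, sleep, work, leisure])
rank = np.linalg.matrix_rank(X)  # 4 rather than 5: one column is redundant
```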
Take an observed (that is, estimated) 95% confidence interval for a parameter of a multiple linear regression. Then:
We cannot assign a probability to the event that the true parameter value lies inside that interval.
In testing multiple exclusion restrictions in the multiple regression model under the classical assumptions, we are more likely to reject the null that some coefficients are zero if:
the R-squared of the unrestricted model is large relative to the R-squared of the restricted model.
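A hedged sketch of the R-squared form of that F statistic, F = ((R2_ur - R2_r)/q) / ((1 - R2_ur)/(n - k - 1)), where q is the number of exclusion restrictions and k the number of regressors in the unrestricted model. The numbers plugged in are made up for illustration.

```python
def f_stat(r2_ur, r2_r, n, k, q):
    """F statistic for q exclusion restrictions, from the two R-squareds."""
    return ((r2_ur - r2_r) / q) / ((1.0 - r2_ur) / (n - k - 1))

# A larger gap between the unrestricted and restricted R-squareds gives a
# larger F statistic, hence a more likely rejection of the restrictions.
assert f_stat(0.40, 0.20, 100, 5, 2) > f_stat(0.40, 0.35, 100, 5, 2)
```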
In the Chow test the null hypothesis is:
all the coefficients in a regression model are the same in two separate populations.
The significance level of a test is:
the probability of rejecting the null hypothesis when it is true.
What is true of confidence intervals?
Confidence intervals are also called interval estimates.