Multiple Choice Questions Flashcards
The normality assumption implies that:
The population error u is independent of the explanatory variables and is normally distributed with mean zero and variance σ²
A normal Variable is standardised by:
Subtracting its mean and dividing by its standard deviation
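A minimal sketch of standardisation using Python's stdlib (the sample values are illustrative, not from the flashcards):

```python
import statistics

# Standardise a sample: subtract the mean, divide by the standard deviation.
x = [2.0, 4.0, 6.0, 8.0]
mean = statistics.mean(x)   # 5.0
sd = statistics.pstdev(x)   # population standard deviation
z = [(xi - mean) / sd for xi in x]

print(z)  # standardised values have mean 0 and standard deviation 1
```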
Consider the equation, Y=β_1+β_2 X_2+u. A null hypothesis H_0:β_2=0 states that:
X_2 has no effect on the expected value of Y
The general t-statistic can be written as:
t= (estimate - hypothesised value)/standard error
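The general t-statistic above can be sketched directly; the estimate and standard error below are made-up illustrative numbers:

```python
# Generic t-statistic: (estimate - hypothesised value) / standard error.
def t_stat(estimate, hypothesised, std_error):
    return (estimate - hypothesised) / std_error

# Test H0: beta_2 = 0 for a hypothetical estimated slope of 0.56 with SE 0.21.
t = t_stat(0.56, 0.0, 0.21)
print(round(t, 3))  # 2.667
```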
The significance level of a test is:
The probability of rejecting the null hypothesis when it is true
Which of the following statements is true of confidence intervals:
a. Confidence intervals in a CLM are also referred to as point estimates.
b. Confidence intervals in a CLM provide a range of likely values for the population parameter.
c. Confidence intervals in a CLM do not depend on the degrees of freedom of a distribution
d. Confidence intervals in a CLM can be truly estimated when heteroskedasticity is present
Confidence intervals in a CLM provide a range of likely values for the population parameter
Which of the following is true:
a. When the standard error of an estimate increases, the confidence interval for the estimate narrows down.
b. Standard error of an estimate does not affect the confidence interval for the estimate.
c. The lower bound of the confidence interval for a regression coefficient, say β_j, is given by β_j - [standard error x (β_j)]
d. The upper bound of the confidence interval for a regression coefficient, say β_j, is given by β_j + [Critical value x standard error (β_j)]
The upper bound of the confidence interval for a regression coefficient, say β_j, is given by β_j + [Critical value x standard error (β_j)]
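A large-sample sketch of the interval β_j ± critical value × standard error(β_j), using the normal critical value as a stand-in for the t critical value; the coefficient and standard error are illustrative assumptions:

```python
from statistics import NormalDist

# 95% confidence interval for a coefficient beta_j:
#   beta_j +/- critical value * standard error(beta_j).
# beta_hat and se are hypothetical numbers, not from the flashcards.
beta_hat, se = 0.56, 0.21
crit = NormalDist().inv_cdf(0.975)   # approx. 1.96
lower = beta_hat - crit * se
upper = beta_hat + crit * se
print(round(lower, 3), round(upper, 3))
```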
Consider the following regression equation y = β0 + β1X1 + β2X2 + u. What does β1 imply?
β1 implies the ceteris paribus effect of X1 on y
If the explained sum of squares is 35 and the total sum of squares is 49, what is the residual sum of squares?
14
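The arithmetic follows from the sum-of-squares identity SST = SSE + SSR:

```python
# Sum-of-squares identity: SST = SSE (explained) + SSR (residual),
# so SSR = SST - SSE. Using the numbers from the question:
sst, sse = 49, 35
ssr = sst - sse
print(ssr)  # 14
```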
R-squared shows…
R-squared shows what percentage of the total variation in the dependent variable, y, is explained by the explanatory variables.
The value of R-squared
lies between 0 and 1
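Computing R-squared from the sum-of-squares figures in the earlier question shows it landing inside [0, 1]:

```python
# R-squared = explained sum of squares / total sum of squares.
# Reusing the SST = 49, SSE = 35 figures from the earlier question:
sst, sse = 49.0, 35.0
r_squared = sse / sst
print(round(r_squared, 3))  # 0.714
```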
If an independent variable in a multiple linear regression model is an exact linear combination of other independent variables, the model suffers from the problem of…
Perfect Collinearity
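Perfect collinearity can be demonstrated with a stdlib-only sketch: when one regressor is an exact linear combination of the others, the Gram matrix X'X is singular, so OLS has no unique solution. The data below is illustrative:

```python
# Perfect collinearity: x3 is an exact linear combination of x1 and x2,
# so the Gram matrix X'X is singular (determinant zero) and cannot be inverted.
x1 = [1.0, 2.0, 3.0, 4.0]
x2 = [2.0, 1.0, 4.0, 3.0]
x3 = [a + b for a, b in zip(x1, x2)]   # x3 = x1 + x2 exactly

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

cols = [x1, x2, x3]
xtx = [[dot(u, v) for v in cols] for u in cols]  # 3x3 Gram matrix

def det3(m):
    # Cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(xtx))  # 0.0: no unique OLS solution exists
```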
High (but not perfect) correlation between two or more independent variables is called…
Multicollinearity
The term “linear” in a **multiple regression model** means that the equation is linear in parameters (True or False)
True
The key assumption for the general multiple regression model is that all factors in the unobserved error term be uncorrelated with the explanatory variables (True or False)
True