General Flashcards
The variance of a random variable X is:
A) Equal to the standard deviation of X
B) Strictly smaller than zero
C) Equal to zero if E[X] = 0
D) Strictly larger than zero
D) Strictly larger than zero
Explanation:
Variance measures the spread of values around the mean. It is non-negative and strictly positive unless X is constant, in which case it is zero. The standard deviation is the square root of the variance, so the two are generally not equal, and neither can be negative.
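A quick numerical check (an illustrative NumPy simulation; the distributions and sample sizes are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)

# A non-constant random sample: variance is strictly positive.
x = rng.normal(loc=0.0, scale=2.0, size=10_000)
print(np.var(x) > 0)  # True

# A constant "random variable": variance is exactly zero.
c = np.full(10_000, 3.0)
print(np.var(c))      # 0.0

# Option C is a trap: E[X] = 0 does not make Var(X) zero.
z = rng.normal(loc=0.0, scale=1.0, size=10_000)
print(z.mean(), np.var(z))  # mean near 0, variance near 1
```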
The best predictor for Y to minimize the mean squared error (MSE) after observing that X takes value x is:
A) E[Y]
B) Var(Y)
C) E[Y|X=x]
D) x
C) E[Y|X=x]
Explanation:
The conditional expectation E[Y|X=x] minimizes the MSE when predicting Y. It represents the best estimate for Y given the observed value of X.
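A small simulation of this claim (the data-generating process Y = 2X + u is an arbitrary example, chosen so that E[Y|X=x] = 2x is known exactly):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative model: Y = 2X + u with unit-variance noise, so E[Y|X=x] = 2x.
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)

mse_conditional = np.mean((y - 2 * x) ** 2)       # predictor E[Y|X=x] = 2x
mse_unconditional = np.mean((y - y.mean()) ** 2)  # predictor E[Y]

print(mse_conditional)    # ≈ 1 (the noise variance)
print(mse_unconditional)  # ≈ 5 (= Var(Y) = 4 + 1)
```

The conditional-expectation predictor leaves only the irreducible noise variance, while the unconditional mean leaves all of Var(Y).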
The standard error of an estimator (such as the OLS estimator) is:
A) Equal to the square root of the variance of the sampling distribution of the estimator
B) Equal to the squared mean of the dependent variable Y
C) Equal to the variance of the dependent variable Y
D) Equal to the variance of the sampling distribution of the estimator
A) Equal to the square root of the variance of the sampling distribution of the estimator
Explanation:
The standard error is the square root of the variance of the estimator's sampling distribution, providing a measure of variability in the estimates.
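This can be verified by brute force for the simplest estimator, the sample mean, whose standard error has the known closed form σ/√n (all parameter values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, n, reps = 3.0, 50, 20_000

# Draw many samples and compute the estimator (here: the sample mean) each time.
estimates = rng.normal(loc=10.0, scale=sigma, size=(reps, n)).mean(axis=1)

# Standard error = sqrt of the variance of the sampling distribution ...
se_empirical = estimates.std(ddof=1)
# ... which for the sample mean equals sigma / sqrt(n).
se_formula = sigma / np.sqrt(n)

print(se_empirical, se_formula)  # both ≈ 0.424
```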
An estimator for the population parameter β is said to be unbiased if:
A) The estimated value converges to β as the sample size increases
B) The estimate obtained from a single sample is equal to β
C) The expected value of the estimator is equal to β
D) The estimator has the smallest possible sampling variance
C) The expected value of the estimator is equal to β
Explanation:
An unbiased estimator has an expected value equal to the true parameter value, ensuring that on average, it accurately estimates the parameter.
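The "on average" part can be made concrete with a Monte-Carlo sketch: no single OLS estimate equals β, but their average over repeated samples does (β = 1.5 and the sample size are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(3)
beta, n, reps = 1.5, 30, 5_000

slopes = np.empty(reps)
for r in range(reps):
    x = rng.normal(size=n)
    y = beta * x + rng.normal(size=n)           # strictly exogenous regressor
    slopes[r] = np.sum(x * y) / np.sum(x ** 2)  # OLS slope (no intercept)

# Unbiasedness: the *average* estimate over repeated samples is close to beta,
# even though any single estimate differs from it.
print(slopes.mean())  # ≈ 1.5
```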
A sample of time-series data:
A) Always contains only stationary variables
B) Can be obtained by drawing a random sample from the population
C) Is not useful for economic analysis
D) Is a single realization from a stochastic process
D) Is a single realization from a stochastic process
Explanation:
Time-series data represents one specific path or realization from an underlying stochastic process, observed sequentially over time.
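A tiny sketch of that idea: two different seeds produce two different realizations of the *same* AR(1) process, and an observed time series corresponds to just one such draw (φ = 0.8 and the length are arbitrary):

```python
import numpy as np

def ar1_path(phi, n, seed):
    """One realization of an AR(1) process y_t = phi * y_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal()
    return y

# Two different realizations from the SAME stochastic process:
path_a = ar1_path(0.8, 200, seed=4)
path_b = ar1_path(0.8, 200, seed=5)
print(np.allclose(path_a, path_b))  # False: an observed series is one draw
```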
An estimator is said to be consistent if:
A) It converges to the true population parameter when the sample size increases to infinity
B) It is unbiased
C) It converges to the sample mean of the data when the sample size increases to infinity
D) It is efficient
A) It converges to the true population parameter when the sample size increases to infinity
Explanation:
A consistent estimator approaches the true value of the parameter as the sample size becomes infinitely large, ensuring accuracy over larger samples.
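Consistency in action, using the sample mean as the estimator (μ = 4 and the noise scale are illustrative; the law of large numbers drives the error toward zero):

```python
import numpy as np

rng = np.random.default_rng(6)
mu = 4.0
draws = rng.normal(loc=mu, scale=5.0, size=1_000_000)

# Consistency of the sample mean: the estimation error shrinks toward zero
# as the sample size n grows.
for n in (100, 10_000, 1_000_000):
    print(n, draws[:n].mean())
```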
The OLS estimator of a stationary AR(1) model:
A) Is biased but consistent because the error terms are serially correlated
B) Is biased but consistent because the explanatory variables are strictly exogenous
C) Is biased but consistent because the error terms are non-stationary
D) Is biased but consistent because the explanatory variables are weakly exogenous
D) Is biased but consistent because the explanatory variables are weakly exogenous
Explanation:
OLS estimates in an AR(1) model are biased in small samples but become consistent as sample size increases if the explanatory variables are weakly exogenous.
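Both halves of the claim can be seen in a simulation (φ = 0.8, the sample lengths, and the replication count are all arbitrary demo choices; the downward small-sample bias of the AR(1) OLS estimate is the well-known Hurwicz-type bias):

```python
import numpy as np

rng = np.random.default_rng(7)
phi, reps = 0.8, 2_000

def mean_ols_phi(T):
    """Average OLS estimate of phi in y_t = phi * y_{t-1} + e_t over many samples."""
    est = np.empty(reps)
    for r in range(reps):
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = phi * y[t - 1] + rng.normal()
        ylag, ynow = y[:-1], y[1:]
        est[r] = np.sum(ylag * ynow) / np.sum(ylag ** 2)
    return est.mean()

phi_small = mean_ols_phi(25)   # noticeably below 0.8: small-sample bias
phi_large = mean_ols_phi(400)  # ≈ 0.8: bias vanishes as T grows (consistency)
print(phi_small, phi_large)
```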
The conditional expectation E[Y|X=x]:
A) Is equal to E[Y|X=x]=E[Y] if Y and X are correlated
B) Can always be modeled as a linear regression
C) Gives the expected value of Y given that X equals x
D) Is equal to E[Y|X=x]=E[Y]/x
C) Gives the expected value of Y given that X equals x
Explanation:
The conditional expectation represents the expected value of Y when the value of X is known, minimizing prediction error.
If a Durbin-Watson test rejects its null hypothesis, the data provides evidence that the error terms in the regression are:
A) Homoscedastic
B) Not serially correlated
C) Serially correlated
D) Heteroscedastic
C) Serially correlated
Explanation:
The Durbin-Watson test is used to detect serial correlation. Rejecting the null hypothesis indicates that there is serial correlation in the residuals.
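The DW statistic itself is easy to compute from residuals; this illustrative simulation contrasts white-noise residuals (DW near 2) with positively autocorrelated ones (DW well below 2; the AR coefficient 0.7 is arbitrary):

```python
import numpy as np

def durbin_watson(resid):
    """DW statistic: sum of squared first differences over sum of squares.
    Values near 2 suggest no serial correlation; near 0 or 4 suggest it."""
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(8)
n = 2_000

white = rng.normal(size=n)  # serially uncorrelated residuals
ar = np.zeros(n)            # positively autocorrelated residuals
for t in range(1, n):
    ar[t] = 0.7 * ar[t - 1] + rng.normal()

print(durbin_watson(white))  # ≈ 2
print(durbin_watson(ar))     # well below 2 (roughly 2 * (1 - 0.7) = 0.6)
```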
Serial correlation in the error terms of a linear regression model:
A) Requires a robust estimator of the standard errors, such as the Newey-West estimator
B) Implies that the explanatory variables are only weakly exogenous
C) Implies that OLS is BLUE
D) Implies that the error terms are a function of the explanatory variables
A) Requires a robust estimator of the standard errors, such as the Newey-West estimator
Explanation:
Serial correlation affects the validity of standard errors, necessitating the use of robust estimators like the Newey-West to obtain correct inference.
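A minimal sketch of the idea behind Newey-West, here as a Bartlett-kernel long-run variance for a single mean-zero series (the AR coefficient and lag truncation are illustrative; for regression standard errors one would use a library implementation, e.g. statsmodels' HAC covariance option):

```python
import numpy as np

def newey_west_lrv(u, max_lags):
    """Bartlett-kernel (Newey-West) long-run variance of a mean-zero series u."""
    lrv = np.mean(u * u)                     # lag-0 autocovariance
    for lag in range(1, max_lags + 1):
        w = 1.0 - lag / (max_lags + 1.0)     # Bartlett weights
        gamma = np.mean(u[lag:] * u[:-lag])  # lag-`lag` autocovariance
        lrv += 2.0 * w * gamma
    return lrv

rng = np.random.default_rng(9)
n = 50_000
u = np.zeros(n)
for t in range(1, n):                        # AR(1) errors with rho = 0.5
    u[t] = 0.5 * u[t - 1] + rng.normal()

naive_var = np.mean(u * u)                   # ignores serial correlation
hac_var = newey_west_lrv(u, max_lags=20)
print(naive_var, hac_var)  # the HAC estimate is markedly larger
```

With positively autocorrelated errors, ignoring the cross-lag terms understates the variance, which is exactly why naive standard errors are too small.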
If the explanatory variables in a linear regression are weakly exogenous:
A) The error terms are homoscedastic
B) The error term in period t may be correlated with explanatory variables in future periods
C) OLS is unbiased in finite samples
D) OLS is BLUE
B) The error term in period t may be correlated with explanatory variables in future periods
Explanation:
Weak exogeneity allows for correlation between current errors and future explanatory variables, unlike strict exogeneity.
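An AR(1) model gives a concrete case (φ = 0.8 is arbitrary): the regressor is x_t = y_{t-1}, and the error u_t is uncorrelated with the current regressor but correlated with the future regressor x_{t+1} = y_t.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 100_000
u = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * y[t - 1] + u[t]

# u_t vs the current regressor x_t = y_{t-1}: uncorrelated (weak exogeneity holds)
corr_current = np.corrcoef(u[1:], y[:-1])[0, 1]
# u_t vs the future regressor x_{t+1} = y_t: clearly correlated
corr_future = np.corrcoef(u[1:], y[1:])[0, 1]
print(corr_current, corr_future)  # ≈ 0 and clearly positive
```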
If the error terms in a linear regression model with strictly exogenous regressors are heteroscedastic:
A) OLS is biased but consistent
B) OLS is unbiased but not efficient
C) Statistical inference can proceed in the same way as with homoscedastic errors
D) The error terms from two consecutive periods are correlated
B) OLS is unbiased but not efficient
Explanation:
Heteroscedasticity affects the efficiency of OLS: the estimator remains unbiased but is less precise than an efficient (GLS) estimator, and robust standard errors are needed for valid inference.
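The unbiasedness half of the claim can be checked directly: below, the error variance grows with the regressor (an arbitrary demo design), yet the average OLS slope still hits the true β.

```python
import numpy as np

rng = np.random.default_rng(11)
beta, n, reps = 2.0, 200, 5_000

slopes = np.empty(reps)
for r in range(reps):
    x = rng.uniform(0.5, 2.0, size=n)
    u = rng.normal(size=n) * x  # error variance grows with x: heteroscedastic
    y = beta * x + u
    slopes[r] = np.sum(x * y) / np.sum(x ** 2)

# OLS remains unbiased under heteroscedasticity (with exogenous regressors) ...
print(slopes.mean())  # ≈ 2.0
# ... but it is no longer efficient, and naive standard errors are invalid.
```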
If we assume that the error terms in a linear regression with strictly exogenous regressors are normally distributed:
A) OLS is the best unbiased estimator
B) OLS is not efficient
C) The sampling distribution of the OLS estimator can only be derived asymptotically
D) The sampling distribution of the OLS estimator is the t-distribution
A) OLS is the best unbiased estimator
Explanation:
With normally distributed errors and strictly exogenous regressors, OLS is not merely BLUE (Best Linear Unbiased Estimator) but the best among all unbiased estimators, as it attains the Cramer-Rao lower bound.
The finite sampling distribution of the OLS estimator is:
A) The distribution of the OLS estimates from repeated samples
B) The distribution of the dependent variable
C) The distribution of the residuals
D) Always the normal distribution
A) The distribution of the OLS estimates from repeated samples
Explanation:
The finite sampling distribution shows the variability of OLS estimates from repeated sampling, not the distribution of residuals or the dependent variable.
A weakly stationary time-series:
A) Has an unconditional variance that changes over time
B) Has unconditional moments that do not change over time
C) Exhibits a trend over time
D) Has a conditional expectation that does not change over time
B) Has unconditional moments that do not change over time
Explanation:
A weakly stationary time series has a constant mean and variance over time, and its autocovariance depends only on the lag.
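A simulation-based check of constant unconditional moments: many independent AR(1) paths are started from the stationary distribution, and the cross-path mean and variance are compared at several dates (φ = 0.6 and the dates are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(12)
phi, reps, T = 0.6, 20_000, 100

# Many independent paths, each started from the stationary distribution,
# so the process is stationary from t = 0 onward.
stat_var = 1 / (1 - phi ** 2)
y = rng.normal(scale=np.sqrt(stat_var), size=reps)

means, variances = [], []
for t in range(T):
    y = phi * y + rng.normal(size=reps)
    if t in (9, 49, 99):
        means.append(y.mean())
        variances.append(y.var())

# Weak stationarity: the unconditional mean and variance are the same at every t.
print(np.round(means, 2))      # all ≈ 0
print(np.round(variances, 2))  # all ≈ 1 / (1 - 0.6^2) ≈ 1.56
```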