Prelim Exam Prep Deck Flashcards
Sandwich (robust) estimator for variance is given by ….
Gibbs sampler: to draw from a joint distribution p(\theta_1, \ldots, \theta_p | y), repeatedly cycle through the components, sampling each \theta_j from its full conditional p(\theta_j | \theta_{-j}, y) given the current values of all the others. The resulting Markov chain has the target joint distribution as its stationary distribution.
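A minimal Gibbs sampler sketch in Python (an illustrative example, not from the deck): sampling a standard bivariate normal with correlation rho, whose full conditionals are the univariate normals x | y ~ N(rho y, 1 - rho^2) and y | x ~ N(rho x, 1 - rho^2):

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = np.empty((n_iter, 2))
    sd = np.sqrt(1 - rho**2)
    for t in range(n_iter):
        x = rng.normal(rho * y, sd)   # draw x from p(x | y = current y)
        y = rng.normal(rho * x, sd)   # draw y from p(y | x = new x)
        samples[t] = (x, y)
    return samples

draws = gibbs_bivariate_normal(rho=0.8)
```

The empirical correlation of the draws should be close to the target rho = 0.8.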
What’s consistency?
What’s the Fisher information matrix?
M-estimator
Any solution \hat\theta to \sum_{i=1}^n \psi(Y_i, \hat\theta) = 0.
\psi does not depend on i or n.
The true parameter value \theta_0 satisfies E[\psi(Y_i, \theta_0)] = 0.
The asymptotic distribution (derived through a Taylor expansion) is of the form
\hat\theta \sim AN(\theta_0, V(\theta_0) / n)
V = A^{-1} B A^{-T}, where A = E[-\partial\psi(Y_i, \theta_0)/\partial\theta] and B = E[\psi(Y_i, \theta_0)\psi(Y_i, \theta_0)^T]
See sandwich variance estimator for more details
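A sketch of the sandwich variance for the simplest M-estimator, the sample mean with \psi(y, \theta) = y - \theta (an illustrative choice, not from the deck); here A = 1, so the sandwich reduces to the usual variance of the mean:

```python
import numpy as np

# Sandwich variance for the sample mean viewed as an M-estimator,
# with psi(y, theta) = y - theta.
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=1000)   # simulated data

theta_hat = y.mean()            # solves sum_i (y_i - theta) = 0
psi = y - theta_hat
A_hat = 1.0                     # -d psi / d theta = 1 for every i
B_hat = np.mean(psi**2)         # empirical E[psi^2]
V_hat = B_hat / (A_hat**2)      # A^{-1} B A^{-T} in one dimension
se = np.sqrt(V_hat / len(y))    # standard error from AN(theta_0, V/n)
```

For less trivial \psi (e.g. regression scores), A_hat and B_hat become averaged Jacobians and outer products, but the recipe is the same.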
What’s the gamma-exponential model?
prior: p(theta) = Gamma(alpha, beta)
likelihood: p(y | theta) = Exponential(theta)
posterior: p(theta | y) = Gamma(alpha + 1, beta + y)
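The conjugate update above can be checked in one line (for a single observation y; with n i.i.d. observations the same algebra gives Gamma(\alpha + n, \beta + \sum_i y_i)):

```latex
p(\theta \mid y) \propto p(\theta)\, p(y \mid \theta)
  \propto \theta^{\alpha - 1} e^{-\beta\theta} \cdot \theta e^{-\theta y}
  = \theta^{(\alpha + 1) - 1} e^{-(\beta + y)\theta}
```

which is the kernel of a Gamma(\alpha + 1, \beta + y) density.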
What’s the Poisson-Gamma Model?
What are Jeffreys priors?
What’s the normal-normal model?
What’s the beta-binomial model?
What’s the Metropolis Algorithm?
What’s the Metropolis-Hastings algorithm?
What’s importance sampling?
What’s the normal pdf?
What’s the gamma pdf?
What’s the gamma mean?
What’s the weak law of large numbers?
What’s the CLT?
What’s asymptotic normality of the MLE?
What’s the Gauss-Markov Theorem?
The Gauss-Markov theorem states that, under certain conditions, the ordinary least squares (OLS) estimator of the coefficients of a linear regression model is the best linear unbiased estimator (BLUE): among all estimators that are both unbiased and linear in the observed responses, OLS has the smallest variance.
What are the conditions required for the Gauss-Markov Theorem to apply?
The errors must have mean zero, constant variance (homoskedasticity), and be uncorrelated with one another; uncorrelated errors are the key condition. Normality of the errors is not required.
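A minimal OLS sketch in Python via the normal equations (simulated data; the true coefficients and noise scale are illustrative choices), under errors satisfying the conditions above:

```python
import numpy as np

# OLS via the normal equations: beta_hat = (X'X)^{-1} X'y.
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta_true = np.array([2.0, -1.5])
# mean-zero, homoskedastic, uncorrelated errors (Gauss-Markov conditions)
y = X @ beta_true + rng.normal(scale=0.5, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])   # unbiased error-variance estimate
cov_beta = sigma2_hat * np.linalg.inv(X.T @ X)  # classical OLS covariance matrix
```

With the true error standard deviation 0.5, sigma2_hat should land near 0.25 and beta_hat near (2.0, -1.5).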
What’s the OLS estimator?
What’s the sum of an infinite power series?
What’s the sum of an infinite geometric series?
What's small o_p notation (convergence in probability)?