Pinboard Flashcards
what is a random walk
the accumulated sum of a stationary series of error terms: yt=yt-1+et, so yt=e1+e2+…+et
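A minimal Python sketch of this card, assuming Gaussian errors (purely illustrative):

```python
import random

# A random walk accumulates a stationary series of error terms:
# y_t = y_{t-1} + e_t, so y_t is the sum of all shocks up to time t.
random.seed(0)  # fixed seed so the example is reproducible
errors = [random.gauss(0, 1) for _ in range(100)]

walk = []
total = 0.0
for e in errors:
    total += e        # accumulate the error terms
    walk.append(total)
```

Because the shocks never decay, the variance of walk[t] grows with t, which is why a random walk is non-stationary.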
what is the static model
yt=β0+β1zt+ut
what is the finite distributed lag model
yt=β0+β1zt+β2zt-1+…+ut
what is a stochastic process
sequence of random variables indexed by time
what does weak stationary mean
Mean, variance, and covariances are stable: the mean and variance are constant over time, and the covariance between yt and yt-j depends only on the distance j between the two terms, not on t
what is an AR(1) model
Autoregressive:
yt=θyt-1+εt
what is a MA(1) model
Moving average:
yt=εt+αεt-1
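The two cards above can be simulated side by side; a sketch with hypothetical parameter values θ = α = 0.5:

```python
import random

random.seed(1)  # reproducible shocks
theta, alpha = 0.5, 0.5
eps = [random.gauss(0, 1) for _ in range(200)]

# AR(1): y_t = theta * y_{t-1} + eps_t  (today depends on yesterday's y)
y_ar = [eps[0]]
for t in range(1, len(eps)):
    y_ar.append(theta * y_ar[-1] + eps[t])

# MA(1): y_t = eps_t + alpha * eps_{t-1}  (today depends on yesterday's shock;
# the first observation uses the convention eps_{-1} = 0)
y_ma = [eps[0]] + [eps[t] + alpha * eps[t - 1] for t in range(1, len(eps))]
```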
what is weak dependence
correlations between observations of the series become smaller and smaller as the lag grows. Weakly dependent if Corr(yt,yt-j)->0 as j->∞ (asymptotically uncorrelated)
what is the correlogram equation
ρj=Cov(yt,yt-j)/Var(yt)=γj/γ0
what is the variance part of the correlogram equation γ0: (ρj=Cov(yt,yt-j)/Var(yt)=γj/γ0)
Var: γ0=E((yt-μ)^2)
what is the autocovariance part of the correlogram equation γj: (ρj=Cov(yt,yt-j)/Var(yt)=γj/γ0)
Autocov: γj=E((yt-μ)(yt-j-μ))
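Putting the three cards together, a sample version of ρj=γj/γ0 can be computed directly (hypothetical helper name):

```python
def autocorrelation(y, j):
    """Sample correlogram value rho_j = gamma_j / gamma_0."""
    n = len(y)
    mu = sum(y) / n
    # Var: gamma_0 = E((y_t - mu)^2)
    gamma0 = sum((x - mu) ** 2 for x in y) / n
    # Autocov: gamma_j = E((y_t - mu)(y_{t-j} - mu))
    gammaj = sum((y[t] - mu) * (y[t - j] - mu) for t in range(j, n)) / n
    return gammaj / gamma0
```

At lag j = 0 the autocovariance equals the variance, so ρ0 = 1 by construction.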
what does the fact that E(et^2)=σ^2 mean
σ^2 is the variance of et: since E(et)=0, Var(et)=E(et^2)-(E(et))^2=E(et^2)=σ^2
what does efficient mean
smallest variance
what does consistent mean
plim(αhat)=α
what does a unit root mean
yt=θyt-1+et
Unit root: θ=1
what is a way of showing et and es are serially uncorrelated when E(et)=0
E(etes)=0 for t≠s, since Cov(et,es)=E(etes)-E(et)E(es)=E(etes) when E(et)=0
what is the stability condition
|θ|<1
how do you do the test of order of integration
check whether the series is weakly stationary: check whether the mean and variance are constant over time, then check the covariance between yt and yt-j. If not stationary, difference the series and re-check; the order of integration is the number of differences needed to reach stationarity
what is the test for serial correlation
OLS of yt on xt to get β1hat -> form the residuals uthat -> regress uthat on ut-1hat (and further lags) and xt,… to get ρhat -> F test of whether the lag coefficients are zero
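The key middle step, estimating ρ from the lagged residuals, looks like this (a simplified no-intercept sketch that omits the xt terms):

```python
def ar1_rho(resid):
    """OLS estimate of rho in u_t = rho * u_{t-1} + error (no intercept)."""
    num = sum(resid[t] * resid[t - 1] for t in range(1, len(resid)))
    den = sum(resid[t - 1] ** 2 for t in range(1, len(resid)))
    return num / den
```

On residuals that decay by exactly one half each period, this returns 0.5.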
what is the unit root test
∆yt=c+(θ-1)yt-1+et, with γ=(θ-1) -> Dickey-Fuller test of H0: γ=0 (unit root) against adjusted critical values. DF=γhat/se(γhat)=γhat/Var(γhat)^1/2
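The regression part of the test can be sketched as follows (γhat only; the full DF statistic divides by its standard error and is compared with the adjusted critical values):

```python
def df_gamma(y):
    """OLS slope gamma-hat in: Delta y_t = c + gamma * y_{t-1} + e_t,
    where gamma = theta - 1 (gamma = 0 under a unit root)."""
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]   # Delta y_t
    ylag = y[:-1]                                      # y_{t-1}
    n = len(dy)
    mx, my = sum(ylag) / n, sum(dy) / n
    num = sum((ylag[i] - mx) * (dy[i] - my) for i in range(n))
    den = sum((ylag[i] - mx) ** 2 for i in range(n))
    return num / den
```

On a noiseless series with θ = 0.5, the slope recovers γ = θ - 1 = -0.5 exactly.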
How do you do the Breusch-Pagan test for homoskedasticity
Null of homoskedasticity H0: E(ui^2|xi)=σ^2, i.e. the variance is not a function of the explanatory variables. We can't observe ui^2, so replace it by the squared OLS residuals and test H0: δ1=δ2=…=δk=0 in ui^2hat=δ0+δ1x1i+δ2x2i+…+δkxki+ε. Let R^2u^2hat be the R^2 from this regression of ui^2hat on the xi. The Breusch-Pagan statistic is nR^2u^2hat (n = sample size); under the null, nR^2u^2hat ->d χ^2k, and the null is rejected if nR^2u^2hat is larger than the critical value of the χ^2k distribution. No alternative has to be specified
In the Breusch-pagan test do you expect a high or low R^2 under the null of homoskedasticity: (H0:δ1=δ2=…=δk=0 in ui^2hat=δ0+δ1x1i+δ2x2i+…+δkxki+ε)
R^2 should be small under the null, because then none of the variation in u^2 is explained by the regressors
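A single-regressor sketch of the statistic (hypothetical helper; the real test uses all k regressors):

```python
def breusch_pagan_stat(resid, x):
    """n * R^2 from regressing squared residuals on one regressor x;
    compare with a chi-squared(1) critical value."""
    u2 = [u ** 2 for u in resid]   # squared OLS residuals stand in for u_i^2
    n = len(u2)
    mx, mu2 = sum(x) / n, sum(u2) / n
    sxy = sum((x[i] - mx) * (u2[i] - mu2) for i in range(n))
    sxx = sum((x[i] - mx) ** 2 for i in range(n))
    syy = sum((v - mu2) ** 2 for v in u2)
    r2 = sxy ** 2 / (sxx * syy)    # R^2 of a one-regressor OLS fit
    return n * r2
```

When the squared residuals are perfectly explained by x, R^2 = 1 and the statistic equals n, its maximum.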
what is the definition of heteroskedasticity
conditional variance of the error term in the linear model is different for different values of the explanatory variable
E(ui^2|xi)=Var(yi|xi)=σ^2(xi),
a function of the explanatory variables
what is the equation for heteroskedasticity
E(ui^2|xi)=Var(yi|xi)=σ^2(xi),
a function of the explanatory variables
what does robust mean
allows for heteroskedasticity
what does less noise do
improves efficiency
how does the weighted least squares method work (in words)
more noise=less weight
less noise=more weight,
less noise improves efficiency
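The simplest instance of the idea is a weighted mean, where less noisy observations get larger weights (hypothetical helper):

```python
def wls_mean(y, weights):
    """WLS estimate of a constant mean: a larger weight marks a less
    noisy observation, so it counts for more in the estimate."""
    return sum(w * yi for w, yi in zip(weights, y)) / sum(weights)
```

With equal weights this reduces to the ordinary sample mean; unequal weights pull the estimate toward the low-noise observations.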
what is the variance (words and equation that matches words)
sum of squared distances of each term from the mean (μ), divided by the number of terms N; equivalently, the mean of the squares minus the square of the mean:
σ^2=(Σ(X-μ)^2)/N = (ΣX^2)/N - μ^2
what is the variance formula
Var(X)=E((X-E(X))^2) = E(X^2)-(E(X))^2
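Both forms of the formula give the same number, which a few lines of Python can confirm:

```python
def var_two_ways(xs):
    """Population variance computed both ways from the cards above."""
    n = len(xs)
    mu = sum(xs) / n
    v1 = sum((x - mu) ** 2 for x in xs) / n       # E((X - mu)^2)
    v2 = sum(x ** 2 for x in xs) / n - mu ** 2    # E(X^2) - (E(X))^2
    return v1, v2
```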
what is homoskedasticity
conditional variance of u given x1,…,xk is constant: Var(u|x1,…,xk)=σ^2
what is the total sum of squares (TSS)
Σ(yi-ybar)^2,
measure of total sample variation in the y: how spread out the yi are in the sample.
If we divide the TSS by n-1 we obtain the sample variance
what is the model sum of squares (MSS)
Σ(yihat-ybar)^2.
Sample variation in yihat (where we use the fact that ybarhat=ybar)
what is the residual sum of squares (RSS or SSR)
Σuihat^2.
RSS measures sample variation in uihat
what is the relationship between the RSS the MSS and the TSS
the total variation in y can be expressed as the sum of the explained variation and the unexplained variation.
TSS=MSS+RSS
what is the R^2 test
R^2=MSS/TSS = 1 - RSS/TSS
if TSS=MSS+RSS what is the F stat that allows for testing whether all parameters, apart from the constant are zero
F = [(TSS-RSS)/k] / [RSS/(n-k-1)]
= [MSS/k] / [RSS/(n-k-1)] ~ F(k, n-k-1)
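The decomposition TSS = MSS + RSS behind R^2 and the F statistic can be checked for a simple regression (hypothetical helper fitting y = a + b*x by OLS):

```python
def ols_decomposition(x, y):
    """Fit y = a + b*x by OLS and return (TSS, MSS, RSS)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((x[i] - mx) * (y[i] - my) for i in range(n))
         / sum((x[i] - mx) ** 2 for i in range(n)))
    a = my - b * mx
    yhat = [a + b * xi for xi in x]
    tss = sum((yi - my) ** 2 for yi in y)               # total variation
    mss = sum((yh - my) ** 2 for yh in yhat)            # explained variation
    rss = sum((y[i] - yhat[i]) ** 2 for i in range(n))  # unexplained
    return tss, mss, rss
```

R^2 is then mss/tss = 1 - rss/tss, and the two sides of TSS = MSS + RSS agree up to rounding.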
what is R^2 a measure of
measure of how much variation in y is explained by variables in model
what is the Gauss-Markov property
under the additional assumption of homoskedasticity, the estimator βhat is efficient (smallest variance), and is the BLUE of β, with variance Var(βhat)=σ^2(X’X)^-1
what is the covariance equation
Cov(X,Y)=E((X-E(X))(Y-E(Y)))
=E(XY)-E(X)E(Y)
Correlation equation
ρxy=Cov(x,y)/σxσy
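Both cards in a few lines of Python (sample versions, hypothetical helper name):

```python
def cov_corr(xs, ys):
    """Sample covariance via E(XY) - E(X)E(Y), and the correlation
    rho = Cov(x, y) / (sigma_x * sigma_y)."""
    n = len(xs)
    ex, ey = sum(xs) / n, sum(ys) / n
    exy = sum(x * y for x, y in zip(xs, ys)) / n
    cov = exy - ex * ey                                # E(XY) - E(X)E(Y)
    sx = (sum(x ** 2 for x in xs) / n - ex ** 2) ** 0.5
    sy = (sum(y ** 2 for y in ys) / n - ey ** 2) ** 0.5
    return cov, cov / (sx * sy)
```

For an exact linear relationship (ys = 2*xs) the correlation comes out as 1.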
what is the zero conditional mean assumption
random error u satisfies E(u|x1,…,xk)=0 -> needed for consistency
what does no perfect collinearity mean
matrix X full (column) rank