Econometrics Flashcards

1
Q

Student’s t-test for correlation coefficient

A

t = r*sqrt(n - 2) / sqrt(1 - r^2)

The test is valid only if the two variables come from normally distributed populations
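A minimal pure-Python sketch of this statistic (the sample correlation r = 0.5 and n = 27 are hypothetical inputs):

```python
import math

def t_corr(r: float, n: int) -> float:
    """t-statistic for H0: population correlation = 0, with n - 2 df."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r**2)

# Hypothetical sample: r = 0.5 observed over n = 27 pairs
t = t_corr(0.5, 27)
print(round(t, 4))  # 2.8868, compared against t_crit with n - 2 = 25 df
```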

2
Q

Formula for slope coefficient in a univariate regression

A

cov(X,Y) / var(X)
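A quick sanity check in pure Python (toy data, hypothetical): the OLS slope equals cov(X, Y) / var(X), so a series Y = 2X recovers a slope of 2:

```python
def slope(xs, ys):
    """OLS slope for a univariate regression: cov(X, Y) / var(X)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    var = sum((x - mx) ** 2 for x in xs) / n
    return cov / var

# Hypothetical data: Y = 2X exactly, so the slope is 2
print(slope([1, 2, 3, 4], [2, 4, 6, 8]))  # 2.0
```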

3
Q

List Gauss-Markov assumptions

A

OLS is BLUE under:

1) Correct specification (linear model)
2) Spherical errors (constant variance and zero correlation terms)
3) Exogeneity of independent variables
4) Sample data matrix has full rank

4
Q

What if residuals are not normally distributed?

A

This does not lead to biased estimation of the coefficients

However, t-tests may be unreliable in small samples

5
Q

What would heteroscedasticity lead to?

A

Heteroscedasticity does not lead to biased estimation of the coefficients
Incorrect measurement of standard errors of the coefficients

6
Q

Student’s t-test for regression coefficients

A

Two-tailed test: t = b_j / SE(b_j)

Degrees of freedom: n - k - 1
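A minimal sketch (the estimate b = 0.24, its SE = 0.10, n = 50 and k = 3 are all hypothetical):

```python
def t_coef(b: float, se: float) -> float:
    """t-statistic for H0: the true regression coefficient is zero."""
    return b / se

# Hypothetical estimate: b = 0.24, SE = 0.10, from n = 50 obs and k = 3 regressors
t = t_coef(0.24, 0.10)
df = 50 - 3 - 1  # n - k - 1 = 46
print(round(t, 2), df)  # 2.4 46
```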

7
Q

(M)ANOVA parameters

A

SST = RSS + SSE
SST = sum((Y - Y_bar)^2), RSS = sum((Y_est - Y_bar)^2), SSE = sum((Y - Y_est)^2)
MSR = RSS / k
MSE = SSE / (n - k - 1)
Degrees of freedom for total = n - 1
F-test = MSR / MSE (H0: all slope coefficients equal zero; H0 is rejected if F > F_crit - one-tailed test!)
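The decomposition above can be sketched in pure Python. The data below are hypothetical; y_hat is the actual OLS fit of y on x = [1..5], so SST = RSS + SSE holds exactly:

```python
def anova(y, y_hat, k):
    """SST = RSS + SSE decomposition and the F-statistic MSR / MSE."""
    n = len(y)
    y_bar = sum(y) / n
    sst = sum((yi - y_bar) ** 2 for yi in y)
    rss = sum((fi - y_bar) ** 2 for fi in y_hat)              # regression (explained) SS
    sse = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))     # residual SS
    msr = rss / k
    mse = sse / (n - k - 1)
    return sst, rss, sse, msr / mse

# Hypothetical data; y_hat is the OLS fit of y on x = [1, 2, 3, 4, 5] (k = 1)
y     = [2.0, 1.0, 4.0, 3.0, 5.0]
y_hat = [1.4, 2.2, 3.0, 3.8, 4.6]
sst, rss, sse, f = anova(y, y_hat, k=1)
print(round(sst, 4), round(rss, 4), round(sse, 4), round(f, 4))  # 10.0 6.4 3.6 5.3333
```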

8
Q

R_2 and R_2_adj

A

R_2_adj = 1 - (1 - R_2)*(n - 1)/(n - k - 1)
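A one-line sketch of the formula (the R^2 = 0.80, n = 21 and k = 4 are hypothetical):

```python
def r2_adj(r2: float, n: int, k: int) -> float:
    """Adjusted R^2: penalises additional regressors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Hypothetical fit: R^2 = 0.80 with n = 21 observations and k = 4 regressors
print(round(r2_adj(0.80, 21, 4), 4))  # 0.75
```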

9
Q

Testing for heteroscedasticity

A

Breusch-Pagan test for conditional heteroscedasticity
BP = n * R_2 (from the regression of squared residuals on the independent variables). The test is chi-square, one-tailed, with k degrees of freedom
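A minimal sketch of the BP statistic (n = 100 and the auxiliary R^2 = 0.08 are hypothetical inputs; running the auxiliary regression itself is omitted):

```python
def breusch_pagan(n: int, r2_aux: float) -> float:
    """BP statistic: n * R^2 from the auxiliary regression of
    squared residuals on the independent variables."""
    return n * r2_aux

# Hypothetical auxiliary regression: R^2 = 0.08 over n = 100 observations
bp = breusch_pagan(100, 0.08)
print(round(bp, 4))  # 8.0, compared against a chi-square critical value with k df
```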

10
Q

Correction for heteroscedasticity

A

White (heteroscedasticity-consistent) SEs - usually higher than the uncorrected ones

11
Q

Serial correlations - outcomes

A

In a cross-sectional setting, positive serial correlation leads to artificially low SEs but does not bias the coefficient estimates
In a time-series setting, serial correlation may make the parameter estimates inconsistent (e.g. when a lagged dependent variable is among the regressors)

12
Q

Durbin - Watson statistics

A

DW = sum((Resid_t - Resid_t-1)^2) / sum((Resid_t)^2) ≈ 2(1 - r) for large samples, where r is the first-order autocorrelation of the residuals
Serial correlation exists if DW differs significantly from 2
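A pure-Python sketch (the residual series is a made-up example; its alternating signs produce strong negative serial correlation, pushing DW toward 4):

```python
def durbin_watson(resid):
    """DW = sum of squared first differences of residuals / sum of squared residuals."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e ** 2 for e in resid)
    return num / den

# Hypothetical residuals alternating in sign -> negative serial correlation
print(round(durbin_watson([1, -1, 1, -1, 1, -1]), 4))  # 3.3333, i.e. well above 2
```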

13
Q

Correction for serial correlations

A

Hansen method to adjust SEs

14
Q

Assumptions of AR models

A

Covariance stationarity:

1) Constant and finite expected value (the mean-reversion level is found by setting Y_t = Y_t-1 and solving; for an AR(1) it equals b0 / (1 - b1))
2) Constant and finite variance of Y
3) Constant and finite covariance of Y_t and Y_t-1
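The mean-reversion step above as a one-liner (the AR(1) parameters b0 = 1.0, b1 = 0.5 are hypothetical):

```python
def mean_reversion_level(b0: float, b1: float) -> float:
    """Long-run mean of an AR(1): set Y_t = Y_t-1 = Y* in Y_t = b0 + b1 * Y_t-1
    and solve, giving Y* = b0 / (1 - b1). Requires |b1| < 1."""
    return b0 / (1 - b1)

# Hypothetical AR(1): Y_t = 1.0 + 0.5 * Y_t-1
print(mean_reversion_level(1.0, 0.5))  # 2.0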

15
Q

Test for serial correlation of AR models

A

Special type of t-test: t = corr(Resid_t, Resid_t-1)*sqrt(T)

Two-tailed test with T - 2 degrees of freedom
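A sketch of this statistic in pure Python (the residual series is a made-up example); it uses the fact that the SE of a sample autocorrelation is roughly 1/sqrt(T):

```python
import math

def autocorr_t(resid):
    """t-stat for first-order residual autocorrelation: r1 * sqrt(T)."""
    T = len(resid)
    m = sum(resid) / T
    num = sum((resid[t] - m) * (resid[t - 1] - m) for t in range(1, T))
    den = sum((e - m) ** 2 for e in resid)
    return (num / den) * math.sqrt(T)

# Hypothetical residuals with alternating signs: r1 = -0.75, T = 4
print(autocorr_t([1, -1, 1, -1]))  # -1.5
```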

16
Q

Test for unit-roots

A

Dickey - Fuller test:

Y_t - Y_t-1 = b0 + g * Y_t-1 + e_t, where g = b1 - 1. H0: g = 0 (the series has a unit root).

17
Q

When is a regression of one time series on another valid?

A

1) If both X and Y are covariance stationary

2) If both X and Y have unit roots but are cointegrated

18
Q

Chow test

A

Test for a structural break between two subsamples

F(k, n - 2k) = (RSS_full - RSS_1 - RSS_2) /(RSS_1 + RSS_2) * (n - 2k)/k
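The formula as a pure-Python sketch (all the RSS values, n = 84 and k = 2 are hypothetical inputs; the three regressions producing them are omitted):

```python
def chow_f(rss_full, rss_1, rss_2, n, k):
    """Chow F-statistic with (k, n - 2k) degrees of freedom."""
    return (rss_full - rss_1 - rss_2) / (rss_1 + rss_2) * (n - 2 * k) / k

# Hypothetical sums of squared residuals from the pooled and split regressions
f = chow_f(rss_full=120.0, rss_1=40.0, rss_2=40.0, n=84, k=2)
print(f)  # 20.0, compared against F_crit(k, n - 2k) = F_crit(2, 80)
```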