Quantitative Methods Flashcards

1
Q

Covariance

A

Cov(X,Y) = SUM[(Xi - Xbar) * (Yi - Ybar)] / (n - 1)

2
Q

Correlation Coefficient

A

r = Cov(x,y) / [Sigma(x)*Sigma(y)]

where -1 <= r <= 1
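
A minimal sketch of both formulas in Python (numpy assumed; the data are made up for illustration):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

# Sample covariance: sum of cross-deviations over (n - 1)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)

# Correlation: covariance scaled by the two sample standard deviations
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))
print(cov_xy, r)  # r always lands in [-1, 1]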

3
Q

t-test for the significance of a correlation coefficient

A

t = [r * sqrt(n-2)] / sqrt(1-r^2)

Reject H0 if t > +t_critical or t < -t_critical.
Fail to reject if t lies between the two critical values.
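
A sketch of the test, with scipy used only for the critical value (the correlation and sample size are made-up values):

import numpy as np
from scipy.stats import t as t_dist

r, n = 0.85, 30  # hypothetical correlation and sample size
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)

# Two-tailed test with n - 2 degrees of freedom at the 5% level
t_crit = t_dist.ppf(0.975, df=n - 2)
print(t_stat, t_crit, abs(t_stat) > t_crit)  # True means reject H0: rho = 0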

4
Q

Slope of regression line

A

b1 = Cov(X,Y) / Var(X)
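
As a sketch (numpy assumed, toy data), the slope is the ratio of the two sample moments and the intercept follows from the means:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# b1 = Cov(X, Y) / Var(X); the (n - 1) denominators cancel in the ratio
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()  # the fitted line passes through (Xbar, Ybar)
print(b0, b1)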

5
Q

Total Sum of Squares (SST)

A

SUM(Yi - Ybar)^2

6
Q

Sum of Squared Errors (SSE)

A

SUM(Yi-Yhat)^2

This is the unexplained variation in the regression

7
Q

Regression Sum of Squares (RSS)

A

SUM(Yhat-Ybar)^2

This is the explained variation in the regression

8
Q

Total Variation = Explained Variation + Unexplained Variation

A

SST = RSS + SSE
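
A sketch verifying the decomposition numerically (numpy assumed; b0 and b1 computed as in card 4 on the same toy data):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total variation
sse = np.sum((y - y_hat) ** 2)         # unexplained variation
rss = np.sum((y_hat - y.mean()) ** 2)  # explained variation

assert np.isclose(sst, rss + sse)  # SST = RSS + SSE holds
print(rss / sst)                   # this ratio is R^2 (card 12)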

9
Q

Mean Regression Sum of Squares (MSR)

A

MSR = RSS / k, where k = number of slope parameters

10
Q

Mean Squared Error (MSE)

A

MSE = SSE / (n - k - 1); for a simple regression (k = 1), MSE = SSE / (n - 2)

11
Q

Regression degrees of freedom = k
Error degrees of freedom = n - k - 1

where k = number of slope parameters

A

12
Q

R^2 = (SST - SSE) / SST = RSS / SST

A

R^2 = (Total Variation - Unexplained Variation) / Total Variation = Explained Variation / Total Variation

13
Q

Standard Error of the Estimate (SEE)

A

SEE = sqrt(MSE) = sqrt(SSE / (n - 2)) for a simple regression

A smaller SEE means the regression fits better.

14
Q

The F-test assesses the statistical significance of the regression as a whole

A

F = MSR / MSE = (RSS / k) / (SSE / [n - k - 1])

Always a one-tailed (right-tail) test.

Reject H0 if F > F_critical
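
A sketch tying MSR, MSE, SEE, and F together for a simple regression (k = 1; numpy and scipy assumed, same toy data as above):

import numpy as np
from scipy.stats import f as f_dist

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n, k = len(x), 1

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

rss = np.sum((y_hat - y.mean()) ** 2)
sse = np.sum((y - y_hat) ** 2)

msr = rss / k            # mean regression sum of squares
mse = sse / (n - k - 1)  # mean squared error
see = np.sqrt(mse)       # standard error of the estimate (card 13)

f_stat = msr / mse
f_crit = f_dist.ppf(0.95, dfn=k, dfd=n - k - 1)  # one-tailed, 5% level
print(f_stat, f_crit, f_stat > f_crit)           # True means reject H0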

15
Q

F-test hypothesis testing

A
  1. H0: b1 = b2 = ... = bk = 0 vs. Ha: at least one bi is not equal to 0
  2. Decision: reject H0 if F (the test statistic) > F_critical
  3. If H0 is rejected, at least one slope coefficient is statistically significant

16
Q

Adjusted R^2

A

Ra^2 = 1 - {[(n-1)/(n-k-1)]*(1-R^2)}
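
A one-function sketch of the adjustment (pure Python; the inputs are made up):

def adjusted_r2(r2, n, k):
    # Penalizes R^2 for extra regressors; it can fall when a weak variable is added
    return 1 - ((n - 1) / (n - k - 1)) * (1 - r2)

print(adjusted_r2(0.82, n=60, k=4))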

17
Q

t-statistic for an individual regression coefficient

A

t = coefficient / standard error, with n - k - 1 degrees of freedom. Reject H0: bj = 0 if |t| > t_critical.

18
Q

The Breusch-Pagan chi-square test is used to detect heteroskedasticity

A

Chi-square = n * (R^2 of a regression of the squared residuals on the independent variables),
with k degrees of freedom.

To correct heteroskedasticity, calculate the robust standard errors.
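
A sketch of the BP statistic (numpy and scipy assumed; the data are simulated so that the error variance grows with x):

import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100) * (1 + np.abs(x))  # heteroskedastic errors

# Fit the main regression and keep the residuals
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Regress the squared residuals on the regressors and take that R^2
e2 = resid ** 2
gamma, *_ = np.linalg.lstsq(X, e2, rcond=None)
r2_resid = 1 - np.sum((e2 - X @ gamma) ** 2) / np.sum((e2 - e2.mean()) ** 2)

n, k = len(x), 1
bp = n * r2_resid                     # BP = n * R^2 of the residual regression
chi2_crit = chi2.ppf(0.95, df=k)      # one-tailed chi-square with k df
print(bp, chi2_crit, bp > chi2_crit)  # True means reject H0 (homoskedasticity)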

19
Q

Durbin-Watson Statistic is used to detect serial correlation (residual terms are correlated with each other)

A

DW = SUM[(e_hat(t) - e_hat(t-1))^2] / SUM[e_hat(t)^2]

Decision rule:
DW < dl: reject the null; evidence of positive serial correlation
dl <= DW <= du: inconclusive
DW > du: fail to reject the null; no evidence of positive serial correlation
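
The statistic itself is a short computation on the residuals (numpy assumed; the residuals here are made up):

import numpy as np

def durbin_watson(resid):
    # Sum of squared first differences of the residuals over their sum of squares;
    # values near 2 suggest no serial correlation, values well below 2 suggest
    # positive serial correlation
    diff = np.diff(resid)
    return np.sum(diff ** 2) / np.sum(resid ** 2)

resid = np.array([0.5, 0.4, 0.6, -0.2, -0.3, -0.1, 0.2])
print(durbin_watson(resid))  # compare against dl and du from a DW table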

20
Q

Use the Hansen method to correct for serial correlation

A

The Hansen method adjusts the coefficient standard errors; it also corrects for heteroskedasticity, so it can address both problems at once.

21
Q

Detect multicollinearity when: 1. the t-tests indicate that no individual variable is statistically significant, yet 2. the F-test is statistically significant and R^2 is high.

A

Correct multicollinearity by omitting one or more of the highly correlated variables.
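
A quick diagnostic sketch: inspect pairwise correlations among the regressors (numpy assumed, simulated data; a variance inflation factor check would be the fuller treatment):

import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)  # nearly a copy of x1
x3 = rng.normal(size=200)

# A large off-diagonal entry flags a candidate variable to omit
print(np.corrcoef(np.column_stack([x1, x2, x3]), rowvar=False).round(2))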

22
Q

Log-Linear Models

A
Yt = e^(b0 + b1*t), which is equivalent to
ln(Yt) = b0 + b1*t
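
A sketch of fitting the model by OLS on the logged series (numpy assumed; the exponential data are simulated):

import numpy as np

t = np.arange(20, dtype=float)
noise = np.random.default_rng(2).normal(scale=0.02, size=20)
y = np.exp(0.5 + 0.08 * t + noise)  # exponential growth with small noise

# Taking logs makes the trend linear: ln(Yt) = b0 + b1*t
b1, b0 = np.polyfit(t, np.log(y), 1)
print(b0, b1)  # should recover roughly 0.5 and 0.08
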
23
Q

Autoregressive Models

A

Xt = b0 + b1*X(t-1) + b2*X(t-2) + ... + bp*X(t-p) + e(t)

24
Q

Chain rule of forecasting

A
Xhat(t+1) = b0 + b1*X(t)
Xhat(t+2) = b0 + b1*Xhat(t+1)
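
A sketch of an AR(1) fit and the chain-rule forecast (numpy assumed, simulated series); the last printed value previews the mean-reverting level of card 26:

import numpy as np

rng = np.random.default_rng(3)
x = np.zeros(200)
for i in range(1, 200):  # simulate AR(1): x_t = 1 + 0.6*x_(t-1) + e_t
    x[i] = 1.0 + 0.6 * x[i - 1] + rng.normal(scale=0.5)

# Fit x_t on x_(t-1) by OLS
b1, b0 = np.polyfit(x[:-1], x[1:], 1)

# Chain rule: the same b0 and b1 are reused, each forecast feeding the next
x_t1 = b0 + b1 * x[-1]  # one step ahead
x_t2 = b0 + b1 * x_t1   # two steps ahead
print(x_t1, x_t2, b0 / (1 - b1))
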
25
Q

t-test for residual autocorrelation in an AR model

A

t = (residual autocorrelation) / [1 / sqrt(T)],
with T - 2 degrees of freedom.

If H0 is rejected (the autocorrelation is statistically significant), the model is misspecified: add more lag variables and re-test.
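
A sketch of the check on the lag-1 residual autocorrelation (numpy assumed; stand-in residuals, since in practice they would come from the fitted AR model):

import numpy as np

resid = np.random.default_rng(4).normal(size=100)  # stand-in residuals
T = len(resid)

# Lag-1 autocorrelation of the residuals
rho1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]

# The standard error of a residual autocorrelation is 1 / sqrt(T)
t_stat = rho1 / (1 / np.sqrt(T))
print(t_stat)  # compare against a t table with T - 2 degrees of freedom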

26
Q

The Mean Reverting Level is the level toward which the series tends to move back

A

Xt = b0/(1-b1)

27
Q

Dickey-Fuller Test

A
  1. Start with the AR(1) model: Xt = b0 + b1*X(t-1) + e(t)
  2. Subtract X(t-1) from both sides: Xt - X(t-1) = b0 + (b1 - 1)*X(t-1) + e(t)
  3. Test the (b1 - 1) coefficient with a t-test, using Dickey-Fuller critical values. If H0: b1 - 1 = 0 is rejected, the series has no unit root.

A series with a unit root (b1 = 1) is not covariance stationary.
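
In practice the t-statistic is compared with Dickey-Fuller (not standard t) critical values; a sketch using the statsmodels implementation (assumed installed):

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
random_walk = np.cumsum(rng.normal(size=300))  # has a unit root by construction

adf_stat, pvalue, *_ = adfuller(random_walk)
print(adf_stat, pvalue)  # a large p-value: fail to reject H0, unit root present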

28
Q

Autoregressive Conditional Heteroskedasticity (ARCH) exists if the variance of the residuals in one period depends on the variance of the residuals in a previous period.

A

e(t)^2 = a0 + a1*e(t-1)^2 + u(t)

ARCH(1) is present if a1 is statistically significant.

29
Q

Variance of ARCH series

A

sigma(t+1)^2 = a0 + a1*e(t)^2
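
A sketch of the ARCH(1) regression and the one-step variance forecast (numpy assumed; the residuals are simulated with a mid-sample volatility shift):

import numpy as np

rng = np.random.default_rng(6)
resid = rng.normal(size=300) * np.repeat([0.5, 2.0], 150)  # volatility shifts mid-sample

e2 = resid ** 2
# Regress e(t)^2 on e(t-1)^2: e(t)^2 = a0 + a1*e(t-1)^2 + u(t)
a1, a0 = np.polyfit(e2[:-1], e2[1:], 1)

# If a1 is significant, ARCH is present and the fitted equation
# forecasts next period's variance
sigma2_next = a0 + a1 * resid[-1] ** 2
print(a0, a1, sigma2_next)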