02. Quantitative Methods Flashcards

1
Q

Correlation Equation:

A

r = Cov(X, Y) / (sX × sY)

  • Cov(X, Y) = sample covariance between X and Y.
  • sX, sY = sample standard deviations of X and Y.
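The correlation formula, r = Cov(X, Y) / (sX × sY), can be checked numerically. A minimal pure-Python sketch using sample (n − 1) statistics; the data are made up:

```python
import math

def correlation(x, y):
    # r = Cov(X, Y) / (sX * sY), using sample (n - 1) denominators.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / (n - 1))
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / (n - 1))
    return cov / (sx * sy)

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
print(round(correlation(x, y), 4))  # 0.7746
```

The same value comes out of any statistics package; the point is that r is just the covariance rescaled by the two standard deviations.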
2
Q

What are the three limitations of correlation analysis?

A
  1. Outliers
  2. Spurious correlation
  3. Only measures linear relationships.
3
Q

What does a t-test determine?

A
  • A t-test is used to determine if a correlation coefficient, r, is statistically significant.
  • t = r√(n − 2) / √(1 − r²)
  • Significance is supported if the test statistic is less than −tcritical or greater than +tcritical with n − 2 degrees of freedom.
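The significance test for a correlation coefficient, t = r√(n − 2) / √(1 − r²) with n − 2 degrees of freedom, as a small sketch; the r and n values here are hypothetical:

```python
import math

def t_stat(r, n):
    # Test statistic for H0: population correlation = 0.
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# Hypothetical sample: r = 0.6 from n = 32 observations (30 df).
t = t_stat(0.6, 32)
print(round(t, 3))  # 4.108
# Compare |t| with the two-tailed critical value for 30 df
# (about 2.042 at the 5% level): reject H0 of zero correlation if |t| exceeds it.
```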
4
Q

General linear regression equation:

A

Yi = b0 + b1Xi + εi.

  • Yi and Xi are the ith observations of the dependent and independent variable, respectively.
  • b0 = intercept.
  • b1 = slope coefficient.
  • εi = residual error for the ith observation.
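Fitting Yi = b0 + b1Xi + εi by ordinary least squares takes only a few lines of pure Python for one independent variable; the data here are made up so the fit is exact:

```python
def ols(x, y):
    # Slope: b1 = Cov(X, Y) / Var(X); intercept: b0 = mean(y) - b1 * mean(x).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    b0 = my - b1 * mx
    return b0, b1

x = [1, 2, 3, 4]
y = [3, 5, 7, 9]          # exactly y = 1 + 2x, so every residual is zero
b0, b1 = ols(x, y)
print(b0, b1)             # 1.0 2.0
```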
5
Q

Linear regression assumptions:

A
  • A linear relationship exists between the dependent and independent variables.
  • The independent variables are not random, and there is no exact linear relation between any two or more independent variables.
  • The expected value of the error term is zero.
  • The variance of the error terms is constant.
  • The error for one observation is not correlated with that of another observation.
  • The error term is normally distributed.
6
Q

The confidence interval for the regression coefficient, b1, is calculated as:

A

b1 ± (tc × sb1)

  • tc = two-tailed critical t-value with n − 2 degrees of freedom.
  • sb1 = standard error of the regression coefficient.
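The coefficient confidence interval, estimate ± (critical t-value)(standard error), as a sketch; the b1, standard-error, and critical-value figures are hypothetical:

```python
def coef_ci(b1, se, t_crit):
    # Confidence interval: b1 +/- t_crit * se.
    return b1 - t_crit * se, b1 + t_crit * se

# Hypothetical: b1 = 0.76, se = 0.33, two-tailed 5% critical t = 2.045 (29 df).
lo, hi = coef_ci(0.76, 0.33, 2.045)
print(round(lo, 3), round(hi, 3))  # 0.085 1.435
# The interval excludes zero, so b1 differs significantly from zero at this level.
```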
7
Q

Standard Error of the Estimate Calculation:

A

SEE = √(SSE / (n − 2)) = √MSE

  • The SEE measures the degree of variability of the actual Y-values relative to the estimated Y-values; a smaller SEE indicates a better fit.
8
Q

What does the coefficient of determination, R2, measure?

A

The proportion of the total variation of the dependent variable explained by the regression.

9
Q

R2 equation:

A

R² = (SST − SSE) / SST = RSS / SST

  • SST = total sum of squares.
  • RSS = regression (explained) sum of squares.
  • SSE = sum of squared errors.
10
Q

F-Stat calculation:

A

F = MSR / MSE, with k and n − k − 1 degrees of freedom

Used to test the significance of all (or any subset of) the independent variables (i.e., the overall fit of the model) using a one-tailed test.

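The F-stat, F = MSR / MSE with MSR = RSS / k and MSE = SSE / (n − k − 1), as a sketch; the ANOVA figures are hypothetical:

```python
def f_stat(rss, sse, n, k):
    # MSR = regression sum of squares / k; MSE = error sum of squares / (n - k - 1).
    msr = rss / k
    mse = sse / (n - k - 1)
    return msr / mse

# Hypothetical ANOVA output: RSS = 80, SSE = 20, n = 25, k = 2.
print(round(f_stat(80, 20, 25, 2), 2))  # MSR = 40, MSE = 20/22 -> F = 44.0
```

With k and n − k − 1 degrees of freedom, an F this large would reject the null that all slope coefficients equal zero.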
11
Q

MSR calculation:

A

MSR = RSS / k

  • RSS = regression sum of squares; k = number of independent variables.
12
Q

MSE Calculation:

A

MSE = SSE / (n − k − 1)

  • SSE = sum of squared errors; n = number of observations; k = number of independent variables.
13
Q

Describe the p-value:

A

The p-value is the smallest level of significance for which the null hypothesis can be rejected.

  • If p-value is less than the significance level, the null hypothesis can be rejected.
  • If p-value is greater than the significance level, the null hypothesis cannot be rejected.
14
Q

What is the equation for a t-test used for hypothesis testing of regression parameter estimates?

A

t = (estimated regression coefficient − hypothesized value) / coefficient standard error

with n − k − 1 degrees of freedom

15
Q

Confidence interval for regression coefficient calculation:

A

estimated regression coefficient ± (critical t-value)(coefficient standard error)

16
Q

R2 Adjusted Calculation? Why is it relevant?

A
  • R2 increases as the number of independent variables increases, even when the added variables explain little; this can be a problem.
  • The adjusted R2 corrects R2 for the number of independent variables:
    adjusted R2 = 1 − [(n − 1) / (n − k − 1)] × (1 − R2)
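The adjusted R², 1 − [(n − 1) / (n − k − 1)](1 − R²), penalizes extra independent variables; a sketch with hypothetical figures showing the penalty growing with k:

```python
def adjusted_r2(r2, n, k):
    # Scales the unexplained share (1 - R2) up by (n - 1)/(n - k - 1).
    return 1 - ((n - 1) / (n - k - 1)) * (1 - r2)

# Same R2 = 0.80 on n = 25 observations, with different numbers of regressors:
print(round(adjusted_r2(0.80, 25, 2), 4))  # k = 2 -> 0.7818
print(round(adjusted_r2(0.80, 25, 8), 4))  # k = 8 -> 0.7
```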
17
Q

What is Conditional Heteroskedasticity?

A

When the residual variance is related to the level of the independent variables.

18
Q

What is the effect of Conditional Heteroskedasticity?

A
  • Coefficients are consistent.
  • Standard errors are underestimated.
  • Too many Type I errors.
19
Q

How is Conditional Heteroskedasticity detected?

A

Breusch-Pagan chi-square test: BP = n × R², where R² comes from a regression of the squared residuals on the independent variables (chi-square test with k degrees of freedom, one-tailed).
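The Breusch-Pagan statistic is n × R², with R² taken from regressing the squared residuals on the independent variables; a sketch of the decision rule with hypothetical figures:

```python
def breusch_pagan(n, r2_resid):
    # BP = n * R2 from the auxiliary regression of squared residuals
    # on the independent variables; chi-square with k df, one-tailed.
    return n * r2_resid

# Hypothetical: n = 50 observations, auxiliary R2 = 0.12, k = 2.
bp = breusch_pagan(50, 0.12)
print(bp)  # 6.0; the 5% chi-square critical value for 2 df is 5.99,
           # so reject homoskedasticity (barely) at that level.
```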

20
Q

How is Conditional Heteroskedasticity fixed?

A

By using White-corrected standard errors

21
Q

What is serial correlation?

A

When the residuals are correlated across observations.

22
Q

What are the effects of serial correlation?

A
  • Coefficients are consistent.
  • Standard errors are underestimated.
  • Too many Type I errors (positive correlation).
23
Q

How is serial correlation detected?

A

Durbin-Watson statistic ≈ 2(1 − r), where r is the correlation between consecutive residuals (DW near 2 indicates no serial correlation).
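The Durbin-Watson statistic can be computed directly from the residual series: DW = Σ(et − et−1)² / Σet², which is approximately 2(1 − r). The residuals here are made up:

```python
def durbin_watson(e):
    # DW = sum of squared successive differences / sum of squared residuals.
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    return num / sum(v ** 2 for v in e)

# Hypothetical residuals; they alternate in sign, so DW comes out above 2
# (suggesting negative, not positive, serial correlation).
resid = [0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2]
print(round(durbin_watson(resid), 3))  # 2.632
```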

24
Q

How is serial correlation corrected?

A

By using the Hansen method to adjust standard errors

25
Q

What is multicollinearity?

A

When two or more independent variables are correlated

26
Q

What are the effects of multicollinearity?

A
  • Coefficients are consistent (but unreliable).
  • Standard errors are overestimated.
  • Too many Type II errors.
27
Q

How is multicollinearity detected?

A

Conflicting t and F statistics (a significant F-statistic but individually insignificant t-statistics); if k = 2, a high correlation between the two independent variables.

28
Q

What is the correction for multicollinearity?

A

Drop one of the correlated variables.