Session 3 - Quantitative Methods Flashcards

1
Q

Covariance

A

Statistical measure of the degree to which 2 variables move together. Its range is unbounded (-∞ to +∞), so its magnitude alone is hard to interpret.

= Σ[(X - mean of X)(Y - mean of Y)] / (n - 1)
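
A minimal sketch of the sample covariance computed directly from the definition (the data below are purely illustrative):

```python
import numpy as np

# Illustrative paired observations of two variables X and Y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Sample covariance: sum of cross-deviations divided by (n - 1)
n = len(x)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)
print(cov_xy)  # matches np.cov(x, y)[0, 1], which also uses n - 1
```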

2
Q

Correlation Coefficient

A

A measure of the strength of the linear relationship (correlation) between 2 variables. Range of -1 to 1.

= (covariance of X and Y) / [(sample SD of X)(sample SD of Y)]
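
Using the same illustrative data, the correlation coefficient simply rescales the covariance by the two sample standard deviations:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

n = len(x)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)

# Correlation = covariance scaled by the sample standard deviations (ddof=1)
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))
print(r)  # bounded in [-1, 1]; matches np.corrcoef(x, y)[0, 1]
```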

3
Q

T-test

A

Used to determine if a correlation coefficient, r, is statistically significant.

= r√(n - 2) / √(1 - r²)

It follows a t-distribution with n - 2 degrees of freedom.
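
A sketch of the test with an assumed correlation and sample size (both illustrative); scipy's t-distribution supplies the two-tailed critical value:

```python
import numpy as np
from scipy import stats

r, n = 0.85, 30  # illustrative correlation and sample size

# Test statistic for H0: population correlation = 0
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)

# Two-tailed critical value at the 5% significance level, n - 2 df
t_crit = stats.t.ppf(0.975, df=n - 2)
print(t_stat, t_crit, abs(t_stat) > t_crit)
```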

4
Q

Slope Coefficient

A

The estimated change in the dependent variable for a 1-unit change in the independent variable.

= (covariance of X and Y) / (variance of X)
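
A quick illustration with the same hypothetical data; the (n - 1) terms in the covariance and the variance cancel, so slope = Cov(X, Y) / Var(X):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # independent variable
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])  # dependent variable

# Slope = Cov(X, Y) / Var(X); intercept follows from the means
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (len(x) - 1)
slope = cov_xy / x.var(ddof=1)
intercept = y.mean() - slope * x.mean()
print(slope, intercept)
```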

5
Q

Sum of Squared Errors (SSE)

A

The sum of the squared vertical distances between the estimated and actual Y-values.

6
Q

Standard Error of Estimate (SEE)

A

Gauges the fit of the regression line. Smaller error = better fit.

= √[SSE / (n - 2)]
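
A short sketch computing SEE for an illustrative simple regression (np.polyfit stands in for whatever fitting routine you prefer):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit the simple regression, then measure the scatter of the residuals
slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

sse = np.sum((y - y_hat) ** 2)     # sum of squared errors
see = np.sqrt(sse / (len(x) - 2))  # standard error of estimate
print(see)  # smaller SEE = tighter fit around the regression line
```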

7
Q

Regression Sum of Squares (RSS)

A

Measures the variation in the dependent variable that is explained by the independent variable.

It is the sum of the squared vertical distances between the predicted Y-values and the mean of Y.

8
Q

Total Sum of Squares (SST)

A

Measures the total variation in the dependent variable. It is equal to the sum of the squared differences between the actual Y-values and the mean of Y.

9
Q

Coefficient of Determination (R²)

A

The % of the total variation in the dependent variable explained by the independent variable.

= RSS/SST
= (SST - SSE)/SST
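
An illustrative check that both forms of the formula agree, using the same hypothetical data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

sst = np.sum((y - y.mean()) ** 2)      # total variation
rss = np.sum((y_hat - y.mean()) ** 2)  # explained variation
sse = np.sum((y - y_hat) ** 2)         # unexplained variation

print(rss / sst, (sst - sse) / sst)  # both forms give the same R²
```

The same run also verifies the ANOVA identity on the next card: sst equals rss + sse up to floating-point error.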

10
Q

Total Variation (ANOVA)

A

= Explained variation (RSS) + Unexplained variation (SSE)

11
Q

F-statistic

A

Assesses how well a set of independent variables, as a group, explains the variation in the dependent variable.

= (RSS / k) / [SSE / (n - k - 1)]

where k = # of independent variables and n = # of observations
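
A sketch with assumed ANOVA quantities (the RSS, SSE, k, and n values are illustrative, not from any real regression):

```python
from scipy import stats

# Illustrative ANOVA quantities: k independent variables, n observations
rss, sse, k, n = 80.0, 40.0, 2, 30

msr = rss / k            # mean regression sum of squares
mse = sse / (n - k - 1)  # mean squared error
f_stat = msr / mse

# One-tailed critical value with k and n - k - 1 degrees of freedom
f_crit = stats.f.ppf(0.95, dfn=k, dfd=n - k - 1)
print(f_stat, f_crit, f_stat > f_crit)
```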

12
Q

P-value

A

The smallest level of significance for which the null hypothesis can be rejected.
An alternative to comparing the test statistic with the critical value is to compare the p-value to the significance level: if the p-value is less than the significance level, the null hypothesis can be rejected.
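
A minimal sketch, assuming an illustrative t-statistic; the degrees of freedom follow the n - k - 1 convention from the F-statistic card:

```python
from scipy import stats

t_stat, n, k = 2.40, 30, 2  # illustrative values

# Two-tailed p-value for a coefficient's t-statistic (n - k - 1 df)
p_value = 2 * stats.t.sf(abs(t_stat), df=n - k - 1)

alpha = 0.05
print(p_value, p_value < alpha)  # reject H0 if p-value < significance level
```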

13
Q

Confidence Interval

A

Estimated regression coefficient +/- (critical t-value)(coefficient standard error)
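
An illustrative computation, with assumed values for the coefficient estimate and its standard error:

```python
from scipy import stats

b_hat, se_b = 0.76, 0.22  # illustrative coefficient estimate and std error
n, k = 30, 2

# Critical t-value for a 95% confidence interval, n - k - 1 df
t_crit = stats.t.ppf(0.975, df=n - k - 1)
lower, upper = b_hat - t_crit * se_b, b_hat + t_crit * se_b
print(lower, upper)  # if 0 lies outside, the coefficient is significant at 5%
```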

14
Q

Adjusted R²

A

Overcomes the problem of overestimating the impact of additional variables on the explanatory power of a regression model.
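
The standard adjustment is adjusted R² = 1 - [(n - 1) / (n - k - 1)](1 - R²). A small sketch (with made-up R² values) showing how adding a variable can raise R² while lowering adjusted R²:

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R² = 1 - [(n - 1) / (n - k - 1)] * (1 - R²)."""
    return 1 - (n - 1) / (n - k - 1) * (1 - r2)

# Adding a variable raises R² mechanically, but adjusted R² can fall
print(adjusted_r2(0.70, n=30, k=2))
print(adjusted_r2(0.71, n=30, k=3))  # higher R², lower adjusted R²
```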

15
Q

Dummy Variables

A

Independent variables that are binary in nature. They are often used to quantify the impact of qualitative events. They are assigned a value of 0 or 1.
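
A tiny illustration of constructing a dummy variable (the quarterly data are made up); note that a qualitative variable with m categories needs only m - 1 dummies to avoid perfect multicollinearity:

```python
import numpy as np

# Encode a qualitative event: whether an observation falls in Q4
quarters = np.array([1, 2, 3, 4, 1, 2, 3, 4])
q4_dummy = (quarters == 4).astype(int)  # 1 if Q4, else 0
print(q4_dummy)  # [0 0 0 1 0 0 0 1]
```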

16
Q

Heteroskedasticity

A

Occurs when the variance of the residuals is not the same across all observations in the sample. This happens when there are subsamples that are more spread out than the rest of the sample.

17
Q

Conditional Heteroskedasticity

A

Related to the level of independent variables. For example, it exists if the variance of the residual term increases as the value of the independent variable increases.

18
Q

Unconditional Heteroskedasticity

A

Not related to the level of the independent variables, which means it doesn’t systematically increase or decrease w/ changes in the value of the independent variables.

19
Q

How to detect Heteroskedasticity

A
  1. Examine a scatter plot of the residuals
  2. Run the Breusch-Pagan chi-square test (see the sketch below)
    * To correct for heteroskedasticity, calculate robust (White-corrected) standard errors
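
A hedged sketch of the Breusch-Pagan idea, implemented by hand with numpy rather than a statistics library: regress the squared residuals on the independent variable(s); the statistic n·R² of that auxiliary regression is chi-square distributed with k degrees of freedom under the null of no conditional heteroskedasticity. All data here are simulated:

```python
import numpy as np
from scipy import stats

def breusch_pagan(x, resid):
    """BP statistic: n * R² from regressing squared residuals on x;
    chi-square with k degrees of freedom under the null."""
    n = len(resid)
    X = np.column_stack([np.ones(n), x])  # add an intercept column
    beta, *_ = np.linalg.lstsq(X, resid ** 2, rcond=None)
    fitted = X @ beta
    sq = resid ** 2
    r2 = 1 - np.sum((sq - fitted) ** 2) / np.sum((sq - sq.mean()) ** 2)
    k = X.shape[1] - 1  # number of independent variables
    return n * r2, stats.chi2.ppf(0.95, df=k)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
resid = rng.normal(0, 1 + 0.5 * x)  # residual spread grows with x
bp_stat, crit = breusch_pagan(x, resid)
print(bp_stat, crit, bp_stat > crit)  # True suggests heteroskedasticity
```
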
20
Q

Serial Correlation (auto-correlation)

A

Refers to the situation in which the residual terms are correlated with one another.

21
Q

Positive Vs. Negative Serial Correlation

A

Positive: a positive regression error in one period increases the probability of observing a positive regression error in the next period.

Negative: a positive regression error in one period increases the probability of observing a negative regression error in the next period.

22
Q

Multicollinearity

A

The condition in which two or more independent variables in a multiple regression are highly correlated with each other.

23
Q

Computing a Test Statistic

A

= coefficient / standard error of the coefficient

If the absolute value of this statistic exceeds the critical t-value, the coefficient is statistically significant.
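
A final illustrative computation, with assumed values for the coefficient and its standard error:

```python
from scipy import stats

coef, se = 0.48, 0.19  # illustrative slope estimate and its standard error
n, k = 30, 2

t_stat = coef / se                         # tests H0: coefficient = 0
t_crit = stats.t.ppf(0.975, df=n - k - 1)  # two-tailed, 5% significance
print(t_stat, t_crit, abs(t_stat) > t_crit)
```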