Quantitative Analysis Flashcards

1
Q

Spurious Correlation

A

Refers to: 1) correlation between two variables that reflects chance relationships in a particular data set; 2) correlation induced by a calculation that mixes each of two variables with a third; and 3) correlation between two variables arising not from a direct relation between them but from their relation to a third variable.

2
Q

Linear Regression

A

Also known as linear least squares; computes the line that best fits the observations by choosing values for the intercept b0 and the slope b1 that minimize the sum of the squared vertical distances between the observations and the regression line.
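
A minimal sketch of the least-squares fit in Python, using numpy and illustrative data (the closed-form estimates are the standard simple-regression formulas):

```python
import numpy as np

# Illustrative data: x is the independent variable, y the dependent variable.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form estimates: b1 = cov(x, y) / var(x), b0 = ybar - b1 * xbar.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

residuals = y - (b0 + b1 * x)  # the vertical distances whose squares are minimized
print(b0, b1)
```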

3
Q

Standard Error of Estimate (SEE)

A

Measures uncertainty; it is similar to the standard deviation for a single variable, except that it measures the standard deviation of the residual term in the regression. It is an indicator of the strength of the relationship between the dependent and independent variables: the SEE is low if the relationship is strong and high if it is weak.
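
A sketch of the computation for a simple (one-variable) regression, assuming numpy and illustrative residuals:

```python
import numpy as np

# Residuals from a fitted simple regression (illustrative values).
residuals = np.array([0.1, -0.2, 0.3, -0.1, 0.0])
n = residuals.size

# SEE is the standard deviation of the residual term, with n - 2 degrees
# of freedom because two coefficients (b0 and b1) were estimated.
see = np.sqrt(np.sum(residuals ** 2) / (n - 2))
print(see)
```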

4
Q

Coefficient of Determination (R^2)

A

The fraction of the total variation that is explained by the regression.
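
A sketch of the computation, assuming numpy and illustrative observed and fitted values:

```python
import numpy as np

y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])      # observed values
y_hat = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # fitted values from a regression

sst = np.sum((y - y.mean()) ** 2)  # total variation
sse = np.sum((y - y_hat) ** 2)     # unexplained (residual) variation
r_squared = 1 - sse / sst          # fraction of total variation explained
print(r_squared)
```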

5
Q

Type 1 Error

A

Rejecting the null when it is true.

6
Q

Type 2 Error

A

Failing to reject the null when it is in fact false.
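
A small simulation (not part of the deck) that illustrates both error types, assuming numpy and scipy are available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, trials = 0.05, 2000

# Type 1: the null (mean = 0) is true; count how often we wrongly reject.
type1 = sum(stats.ttest_1samp(rng.normal(0.0, 1.0, 30), 0.0).pvalue < alpha
            for _ in range(trials)) / trials

# Type 2: the null is false (true mean = 0.5); count failures to reject.
type2 = sum(stats.ttest_1samp(rng.normal(0.5, 1.0, 30), 0.0).pvalue >= alpha
            for _ in range(trials)) / trials

print(type1, type2)  # type1 should be close to alpha by construction
```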

7
Q

Analysis of Variance (ANOVA)

A

A statistical procedure for dividing the total variability into components that can be attributed to different sources. Used to determine the usefulness of the independent variable or variables in explaining variation in the dependent variable.
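
A sketch of the decomposition for a one-variable regression, assuming numpy and illustrative fitted values (the MSR line is the Regression MSS from card 9):

```python
import numpy as np

y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])      # observed values
y_hat = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # fitted values
n, k = y.size, 1                               # k = number of independent variables

rss = np.sum((y_hat - y.mean()) ** 2)  # explained (regression) sum of squares
sse = np.sum((y - y_hat) ** 2)         # unexplained (residual) sum of squares
msr = rss / k                          # mean regression sum of squares
mse = sse / (n - k - 1)                # mean squared error
f_stat = msr / mse                     # F-statistic for the regression's usefulness
print(f_stat)
```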

8
Q

Multiple R

A

The correlation between the two variables; for a one-variable (simple) regression it is also the square root of R^2.
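
A quick check of the relationship, assuming numpy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

multiple_r = np.corrcoef(x, y)[0, 1]  # correlation of the two variables
print(multiple_r ** 2)                # equals R^2 for the one-variable regression
```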

9
Q

Regression MSS

A

Explained variation (the regression sum of squares, RSS) divided by its degrees of freedom, k, the number of independent variables: MSR = RSS / k.

10
Q

Multiple Linear Regression

A

Allows you to determine the effect of more than one independent variable on a particular dependent variable.
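
A minimal sketch with two independent variables, assuming numpy (data illustrative):

```python
import numpy as np

# Two independent variables plus an intercept column of ones.
X = np.column_stack([np.ones(5),
                     [1.0, 2.0, 3.0, 4.0, 5.0],   # x1
                     [2.0, 1.0, 4.0, 3.0, 5.0]])  # x2
y = np.array([3.0, 4.5, 8.0, 9.0, 12.5])

# Least-squares coefficients b = [b0, b1, b2].
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)
```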

11
Q

Heteroskedastic

A

The variance of the errors differs across observations.

12
Q

Unconditional Heteroskedasticity

A

Occurs when the heteroskedasticity of the error variance is not correlated with the independent variables in the multiple regression.

13
Q

Conditional Heteroskedasticity

A

Heteroskedasticity in the error variance that is correlated with (conditional on) the values of the independent variables in the regression.
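
A simulated example, assuming numpy: the error variance grows with the independent variable, so the heteroskedasticity is conditional on x.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 200)

# Error standard deviation depends on x, so the error variance is
# correlated with the independent variable.
errors = rng.normal(0.0, 0.5 * x)
y = 1.0 + 2.0 * x + errors

print(errors[:100].std(), errors[100:].std())  # spread widens with x
```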

14
Q

Serial Correlation

A

When regression errors are correlated across observations. Also known as autocorrelation.

15
Q

Positive Serial Correlation

A

Serial correlation in which a positive error for one observation increases the chance of a positive error for another observation. It also means that a negative error for one observation increases the chance of a negative error for another observation. The residual terms are correlated with one another, leading to coefficient standard errors that are too small.
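
One standard diagnostic (not named in the card) is the Durbin-Watson statistic: values near 2 suggest no serial correlation, values well below 2 suggest positive serial correlation. A sketch assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Positively serially correlated residuals: each error carries over part
# of the previous one (an AR(1) error process).
e = np.zeros(200)
for t in range(1, 200):
    e[t] = 0.8 * e[t - 1] + rng.normal()

dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(dw)  # well below 2 here
```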

16
Q

Multicollinearity

A

Occurs when two or more independent variables (or combinations of independent variables) are highly (but not perfectly) correlated with each other.
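
One common diagnostic (not named in the card) is the variance inflation factor, VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing independent variable j on the others. A sketch for two predictors, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + 0.1 * rng.normal(size=100)  # nearly a copy of x1

# With only two predictors, R_j^2 is their squared correlation.
r2 = np.corrcoef(x1, x2)[0, 1] ** 2
vif = 1.0 / (1.0 - r2)  # large values (say, above 5 or 10) flag multicollinearity
print(vif)
```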

17
Q

Nonstationarity

A

A variable’s properties, such as mean and variance, are not constant through time.

18
Q

P-Value of Regression

A

The smallest level of significance at which we can reject a null hypothesis that the population value of the coefficient is 0, in a two-sided test.
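
A sketch of the two-sided p-value for a coefficient's t-statistic, assuming scipy and illustrative numbers:

```python
from scipy import stats

b1, se_b1, n, k = 1.8, 0.7, 30, 1  # illustrative coefficient and standard error

t_stat = b1 / se_b1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - k - 1)  # two-sided test of b1 = 0
print(p_value)  # the null is rejected at any significance level above this
```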

19
Q

Multiple R-Squared in Multiple Regression

A

The percentage of the variation in the dependent variable that is explained by the model.

20
Q

Autoregressive Model (AR)

A

A time series regressed on its own past values.
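
A sketch of an AR(1) model estimated by regressing the series on its own lagged value, assuming numpy and simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 1.0 + 0.6 * x[t - 1] + rng.normal()  # AR(1) process

# Regress x_t on x_{t-1}: x_t = b0 + b1 * x_{t-1} + e_t.
lagged, current = x[:-1], x[1:]
b1 = (np.sum((lagged - lagged.mean()) * (current - current.mean()))
      / np.sum((lagged - lagged.mean()) ** 2))
b0 = current.mean() - b1 * lagged.mean()
print(b0, b1)  # estimates should be near 1.0 and 0.6
```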

21
Q

Covariance Stationary

A

A time series’ properties, such as the mean and the variance, do not change over time. Assumed in an autoregressive model in order to conduct valid statistical inference.

22
Q

Random Walk

A

A time series in which the value of the time series in one period is the value of the series in the previous period plus an unpredictable random error.
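
A minimal simulation, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# x_t = x_{t-1} + e_t: each value is the previous value plus a random error,
# so the series is the cumulative sum of the errors.
errors = rng.normal(size=300)
walk = np.cumsum(errors)
print(walk[:5])
```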

23
Q

Cointegrated

A

Two time series are cointegrated if a long-term financial or economic relationship exists between them, such that they do not diverge from each other without bound in the long run.

24
Q

Out-of-Sample Forecast Errors

A

The difference between the realized value and the forecasted value for dates beyond the estimation period.

25
Q

Calculating RMSE for Out-of-Sample Forecast Errors

A

1) Take the difference between each actual value and its forecast (the forecast error)
2) Square each error
3) Sum the squared errors
4) Divide by the number of forecasts
5) Take the square root of the average (see the sketch below)
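
A sketch of the five steps, assuming numpy and illustrative values:

```python
import numpy as np

actual = np.array([4.2, 5.1, 4.8, 5.6])    # realized values beyond the sample
forecast = np.array([4.0, 5.0, 5.0, 5.5])  # forecasts for those dates

errors = actual - forecast                 # step 1: forecast errors
rmse = np.sqrt(np.mean(errors ** 2))       # steps 2-5: square, sum, average, root
print(rmse)
```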

26
Q

The Hansen Procedure

A

Used to correct regression standard errors for both heteroskedasticity and serial correlation.

27
Q

Unit Root

A

The presence of a unit root means that the least squares regression procedure that we have been using to estimate an AR(1) model cannot be used without transforming the data first. Most likely to occur in time series that trend over time or have a seasonal element.
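
The usual transformation is first-differencing: a random walk has a unit root, but its first differences are just the random errors, which are covariance stationary. A sketch assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=300))  # a random walk has a unit root

# First-difference the series before estimating an AR model.
diffs = np.diff(walk)
print(diffs.mean(), diffs.std())  # differences behave like the stationary errors
```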