Quantitative Methods Flashcards
Linear Regression Equation
Linear regression equation: Y_i = b_0 + b_1 X_i + ε_i, i = 1, …, n, where Y_i is the dependent variable, X_i the independent variable, b_0 the intercept, and b_1 the slope coefficient
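A minimal sketch of estimating this equation by ordinary least squares in Python, using made-up example data and assuming statsmodels is installed (the variable names are illustrative, not from the flashcards):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical example data: Y_i = b_0 + b_1*X_i + error
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=50)

X = sm.add_constant(x)        # adds the column of 1s for the intercept b_0
model = sm.OLS(y, X).fit()    # estimates b_0 and b_1
print(model.params)           # [b_0 hat, b_1 hat]
```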
Confidence Interval
Confidence interval: \hat{b}_1 ± t_c · s_{\hat{b}_1}, where t_c is the critical t-value, \hat{b}_1 is the estimated slope coefficient, and s_{\hat{b}_1} is its standard error
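A sketch of the same interval computed two ways, continuing from the hypothetical `model` fitted in the sketch above:

```python
from scipy import stats

alpha = 0.05
t_c = stats.t.ppf(1 - alpha / 2, df=model.df_resid)   # critical t-value
b1_hat, se_b1 = model.params[1], model.bse[1]
print(b1_hat - t_c * se_b1, b1_hat + t_c * se_b1)     # manual b1_hat ± t_c * s
print(model.conf_int(alpha=alpha)[1])                 # statsmodels' interval for b_1
```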
Hypothesis Test (b1)
Hypothesis test: t = (\hat{b}_1 − b_1) / s_{\hat{b}_1} = (estimated coefficient − H_0 value) / (standard error), where the standard error can be backed out of regression output as coefficient / t-statistic. If |t| > t_c, reject the null hypothesis that b_1 equals the hypothesized value (when testing for statistical significance from zero, the null is b_1 = 0).
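A sketch of the t-test on the slope, again using the hypothetical `model` and the `t_c` computed above; the null value of 0 is the usual significance test:

```python
b1_null = 0.0                               # hypothesized value of b_1 under H_0
t_stat = (model.params[1] - b1_null) / model.bse[1]
print(t_stat, model.tvalues[1])             # equal when the null value is 0
print(abs(t_stat) > t_c)                    # True -> reject the null hypothesis
```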
Standard Error of the Estimate (SEE)
Standard error of the estimate of a regression model: SEE = √(SSE / (n − k − 1)) = √MSE, where SSE is the unexplained (residual) sum of squares; with one independent variable this reduces to √(SSE / (n − 2))
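A sketch computing SEE from the hypothetical `model`; `mse_resid` is the residual mean square, so its square root should match the manual calculation:

```python
see = np.sqrt(model.ssr / model.df_resid)   # sqrt(SSE / (n - k - 1))
print(see, np.sqrt(model.mse_resid))        # the two values should agree
```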
Regression Degrees of Freedom
Regression degrees of freedom = k, the number of independent variables
Residual Degrees of Freedom
Residual degrees of freedom = total df – regression df = n – (k+1)
MSS Regression
MSS Regression = Regression SS / Regression df
MSS Residual
MSS Residual = Residual SS / Residual df
F =
F = MSS Regression / MSS Residual
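A sketch tying the ANOVA cards above together (degrees of freedom, mean sums of squares, and the F-statistic), using the hypothetical `model` from the first sketch:

```python
reg_df = model.df_model                   # regression df = k
resid_df = model.df_resid                 # residual df = n - (k + 1)
mss_reg = model.ess / reg_df              # regression SS / regression df
mss_resid = model.ssr / resid_df          # residual SS / residual df
print(mss_reg / mss_resid, model.fvalue)  # F-statistic; should match statsmodels' F
```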
Correlation =
Correlation: r = Cov(X, Y) / (s_X s_Y), where s is the standard deviation (the square root of the variance)
T-Test (correlation is different from zero)
T-test (correlation is different from zero): t = r√(n − 2) / √(1 − r²), with n − 2 degrees of freedom, where r is the sample correlation; if |t| > t_c, reject the null hypothesis that the correlation is zero
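A sketch of the correlation and its t-test on the hypothetical x and y from the first sketch:

```python
from scipy import stats

n = len(x)
r = np.corrcoef(x, y)[0, 1]                        # sample correlation
t_corr = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
t_crit = stats.t.ppf(0.975, df=n - 2)              # two-tailed 5% critical value
print(t_corr, abs(t_corr) > t_crit)                # True -> reject the null of zero correlation
```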
Multiple Linear Regression
Multiple linear regression: Y_i = b_0 + b_1 X_1i + b_2 X_2i + … + b_k X_ki + ε_i, i = 1, 2, …, n
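A sketch extending the OLS fit to two hypothetical independent variables, reusing the data from the first sketch:

```python
x2 = rng.normal(size=50)
y_multi = 1.0 + 2.0 * x - 0.5 * x2 + rng.normal(scale=0.5, size=50)

X_multi = sm.add_constant(np.column_stack([x, x2]))   # columns: [1, X_1, X_2]
multi_model = sm.OLS(y_multi, X_multi).fit()
print(multi_model.params)                             # [b_0, b_1, b_2] estimates
```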
Durbin Watson
Durbin-Watson: tests the residuals for serial correlation, with DW ≈ 2(1 − r). If DW ≈ 2, the residuals are not serially correlated; if DW < 2, they are positively correlated; if DW > 2, they are negatively correlated. Compare the statistic with the critical values d_l and d_u: below d_l, reject the null of no positive serial correlation; above d_u, fail to reject it; between the two, the test is inconclusive.
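A sketch computing the statistic from the residuals of the hypothetical multiple regression above; statsmodels gives the statistic, while the critical values d_l and d_u come from a Durbin-Watson table:

```python
from statsmodels.stats.stattools import durbin_watson

dw = durbin_watson(multi_model.resid)   # ~2 none, <2 positive, >2 negative serial correlation
print(dw)
```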
Multicollinearity (definition)
Multicollinearity – a regression assumption violation that occurs when two or more independent variables (or combinations of independent variables) are highly but not perfectly correlated with each other
Heteroscedasticity (how to notice)
Heteroscedasticity – the variance of the residuals is not constant across observations, which makes the standard errors unreliable. Notice it when the residual scatter fans out against the fitted values or an independent variable, or test for it formally with a Breusch-Pagan test.
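A sketch of a Breusch-Pagan check on the hypothetical multiple regression's residuals, assuming statsmodels' diagnostic module:

```python
from statsmodels.stats.diagnostic import het_breuschpagan

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(multi_model.resid, multi_model.model.exog)
print(lm_pvalue)   # a small p-value suggests conditional heteroscedasticity
```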
Multicollinearity (how to notice)
Multicollinearity – a high R² and a significant F-statistic for the regression as a whole, but low (insignificant) t-statistics on the individual slope coefficients
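A sketch of a variance inflation factor (VIF) check, a common supplementary diagnostic, using the hypothetical X_multi design matrix above:

```python
from statsmodels.stats.outliers_influence import variance_inflation_factor

# VIF for each independent variable (column 0 is the constant, so skip it)
vifs = [variance_inflation_factor(X_multi, i) for i in range(1, X_multi.shape[1])]
print(vifs)   # values well above roughly 5-10 are a common warning sign
```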
Mean-Reverting Level =
Mean-reverting level of an AR(1) model (x_t = b_0 + b_1 x_{t−1} + ε_t) = b_0 / (1 − b_1), defined only when b_1 ≠ 1 (no unit root)
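A sketch of the calculation with assumed AR(1) coefficients (the numbers are hypothetical):

```python
# Hypothetical AR(1) estimates: x_t = b0 + b1 * x_{t-1} + error
b0, b1 = 0.6, 0.7
mean_reverting_level = b0 / (1 - b1)   # the series tends to drift back toward this value
print(mean_reverting_level)            # 2.0 for these assumed coefficients
```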
Unit Root
First use a unit root test on each of the two time series to determine whether either has a unit root: 1) if neither has a unit root, linear regression can safely be used to analyze the relationship between the two series; 2) if only one has a unit root, linear regression cannot be used to analyze the relationship; 3) if both have a unit root, establish whether the two series are cointegrated before relying on the regression.
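A self-contained sketch of an augmented Dickey-Fuller unit root test on a hypothetical series, using statsmodels:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=200))   # a random walk, which has a unit root by construction
adf_stat, p_value, *_ = adfuller(series)
print(adf_stat, p_value)                   # a large p-value: fail to reject "has a unit root"
```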
Testing for ARCH
Test for autoregressive conditional heteroscedasticity (ARCH): regress the squared residuals on their own first lag and test whether a_1 is statistically different from 0 (a_1 plays the role of b_1 in this error regression, where the lagged squared error term is the independent variable)
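A sketch of the ARCH(1) regression done by hand with OLS on the residuals of the hypothetical multiple regression above:

```python
eps_sq = multi_model.resid ** 2                  # squared residuals
lagged = sm.add_constant(eps_sq[:-1])            # lagged squared residuals with an intercept
arch_fit = sm.OLS(eps_sq[1:], lagged).fit()
print(arch_fit.tvalues[1], arch_fit.pvalues[1])  # is a_1 statistically different from zero?
```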