Multiple Regression - Reading 5 Flashcards

1
Q

How to interpret the intercept coefficient?

A

If the dividend payout ratio is 0 and the slope of the yield curve is 0, we would expect the subsequent 10-year real earnings growth rate to be -11.67 (the value of the intercept).

2
Q

How to interpret the slope coefficient?

A

If the payout ratio increases by 1%, we would expect the dependent variable to increase by b (its estimated slope coefficient), holding all other independent variables constant.
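A minimal Python sketch of how such coefficients are estimated and read (assuming statsmodels; the data and the names payout_ratio and yield_curve_slope are hypothetical stand-ins for the card's example):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: dividend payout ratio, yield-curve slope, and
# subsequent 10-year real earnings growth (illustrative values only).
rng = np.random.default_rng(42)
payout_ratio = rng.uniform(20, 80, size=60)        # in percent
yield_curve_slope = rng.uniform(-1, 3, size=60)    # in percent
growth = -11.67 + 0.25 * payout_ratio + 0.5 * yield_curve_slope + rng.normal(0, 2, 60)

X = sm.add_constant(np.column_stack([payout_ratio, yield_curve_slope]))
model = sm.OLS(growth, X).fit()

# params[0] is the intercept: expected growth when both independent variables are 0.
# params[1] is the slope on payout_ratio: expected change in growth for a
# one-unit change in the payout ratio, holding the yield-curve slope constant.
print(model.params)
```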

3
Q

How to formulate a hypothesis for statistical significance?

A

H0: b = 0
Ha: b ≠ 0

4
Q

How to calculate the t-statistic for statistical significance?

A

t = (b̂ - 0) / s(b̂), where b̂ is the estimated coefficient and s(b̂) its standard error.
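A short sketch of the same calculation (hypothetical data, assuming statsmodels): the manual ratio of each estimated coefficient to its standard error matches the reported t-values.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative data (hypothetical): one dependent and two independent variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=40)

res = sm.OLS(y, sm.add_constant(X)).fit()

# t-statistic for each coefficient: (estimate - 0) / standard error.
t_manual = (res.params - 0) / res.bse
print(t_manual)       # same values as below
print(res.tvalues)
```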

5
Q

How many degrees of freedom to use for statistical significance?

A

df = n - k - 1 (n = number of observations, k = number of independent variables)
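A quick illustration (hypothetical n and k, assuming scipy) of the degrees of freedom and the corresponding two-tailed critical value:

```python
from scipy.stats import t

# Degrees of freedom for coefficient t-tests: n observations, k independent variables.
n, k = 40, 2
df = n - k - 1                      # 37

# Two-tailed critical value at a 5% significance level.
t_crit = t.ppf(1 - 0.05 / 2, df)
print(df, round(t_crit, 3))         # 37, ~2.026
```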

6
Q

What is the p-value?

A

The p-value is the smallest level of significance at which the null hypothesis can be rejected.
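A small sketch (hypothetical t-statistic and degrees of freedom, assuming scipy) of turning a coefficient t-statistic into a two-tailed p-value:

```python
from scipy.stats import t

# Two-tailed p-value for a coefficient t-statistic (hypothetical numbers):
t_stat = 2.50
df = 37                              # n - k - 1

p_value = 2 * t.sf(abs(t_stat), df)  # survival function = 1 - CDF
print(round(p_value, 4))             # ~0.017: reject H0 at 5%, but not at 1%
```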

7
Q

What eight important pieces of information can you calculate from an ANOVA table?

A
  • Coefficient of determination (R²)
  • F-statistic
  • Standard error of estimate (SEE)
  • SST (total sum of squares)
  • RSS (regression sum of squares)
  • SSE (sum of squared errors)
  • MSE (mean squared error)
  • MSR (mean regression sum of squares)
(The relationships among these quantities are sketched below.)
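A worked sketch (hypothetical data, assuming statsmodels/numpy) showing how these eight quantities relate; note the CFA naming convention, where RSS is the regression (explained) sum of squares and SSE the residual sum of squares.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data with k = 2 independent variables.
rng = np.random.default_rng(1)
n, k = 50, 2
X = rng.normal(size=(n, k))
y = 3.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(X)).fit()

# ANOVA quantities (CFA naming: RSS = regression/explained, SSE = residual).
SST = np.sum((y - y.mean()) ** 2)        # total sum of squares
SSE = np.sum(res.resid ** 2)             # sum of squared errors (residual)
RSS = SST - SSE                          # regression sum of squares
MSR = RSS / k                            # mean regression sum of squares
MSE = SSE / (n - k - 1)                  # mean squared error
R2 = RSS / SST                           # coefficient of determination
SEE = np.sqrt(MSE)                       # standard error of estimate
F = MSR / MSE                            # F-statistic (df: k and n - k - 1)

print(round(R2, 4), round(res.rsquared, 4))   # the two R^2 values agree
print(round(F, 2), round(res.fvalue, 2))      # and so do the F-statistics
```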
8
Q

Why do we need to calculate an adjusted coefficient of determination?

A

Unfortunately, R² by itself may not be a reliable measure of the explanatory power of a multiple regression. This is because the coefficient of determination almost always increases as variables are added to the model, even if the marginal contribution of the new variables is not statistically significant. Consequently, a relatively high R² may reflect the impact of a large set of independent variables rather than how well the set explains the dependent variable.
* Adjusted R² will always be less than or equal to R².

9
Q

How to calculate an adjusted coefficient of determination?

A

Adj. R² = 1 - {[(n - 1) / (n - k - 1)] x [1 - R²]}
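A tiny worked example of the formula (hypothetical n, k, and R²):

```python
# Adjusted R^2 from R^2, n observations, and k independent variables
# (hypothetical numbers):
n, k, r2 = 50, 2, 0.64

adj_r2 = 1 - ((n - 1) / (n - k - 1)) * (1 - r2)
print(round(adj_r2, 4))   # 0.6247 -- always <= R^2
```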

10
Q

How to interpret the coefficient of dummy variables?

A
The estimated regression coefficient for a dummy variable indicates the difference in the dependent variable between the category represented by that dummy variable and the omitted (base) category.
For n categories, use n - 1 dummy variables.
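A minimal sketch of the n - 1 dummy-coding rule (assuming pandas; the quarter categories are hypothetical). The omitted category becomes the base class the other coefficients are measured against.

```python
import pandas as pd

# Hypothetical categorical variable with 4 classes -> use 4 - 1 = 3 dummies.
df = pd.DataFrame({"quarter": ["Q1", "Q2", "Q3", "Q4", "Q1", "Q3"]})

# drop_first=True omits the base category (Q1 here); each remaining dummy's
# coefficient would measure the difference versus that omitted class.
dummies = pd.get_dummies(df["quarter"], drop_first=True)
print(dummies)
```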
11
Q

What is heteroskedasticity? What are its types, and which one is the major problem?

A

Heteroskedasticity occurs when the variance of the residuals is not the same across all observations in the sample.
CONDITIONAL (related to the level of the independent variables) -> MAJOR PROBLEM
UNCONDITIONAL (not related to the independent variables)

12
Q

What are the effects of heteroskedasticity?

A
  • Standard errors are unreliable
  • Coefficient estimates are not affected
  • t-statistics are unreliable
  • F-statistic is unreliable
13
Q

How to detect heteroskedasticity? How to calculate the test statistic?

A

Scatter plot of the residuals, or the Breusch-Pagan test
BP statistic = n x R²_resid (R² from a regression of the squared residuals on the independent variables)
* One-tailed chi-square test
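A sketch of the Breusch-Pagan calculation (hypothetical heteroskedastic data, assuming statsmodels/scipy), computing n x R² from the auxiliary regression of squared residuals and comparing it with the library's test:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2
from statsmodels.stats.diagnostic import het_breuschpagan

# Hypothetical regression with k = 2 independent variables and errors whose
# spread grows with the first independent variable (conditional heteroskedasticity).
rng = np.random.default_rng(2)
n, k = 80, 2
X = sm.add_constant(rng.normal(size=(n, k)))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n) * (1 + np.abs(X[:, 1]))

res = sm.OLS(y, X).fit()

# Auxiliary regression: squared residuals on the independent variables.
aux = sm.OLS(res.resid ** 2, X).fit()
bp_stat = n * aux.rsquared                    # BP = n x R^2 of the residual regression
p_value = chi2.sf(bp_stat, df=k)              # one-tailed chi-square test, df = k

lm_stat, lm_pvalue, _, _ = het_breuschpagan(res.resid, X)
print(round(bp_stat, 3), round(lm_stat, 3))   # manual and library statistics agree
print(round(p_value, 4), round(lm_pvalue, 4))
```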

14
Q

How to correct heteroskedasticity?

A

Use robust standard errors (White-corrected standard errors).
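A minimal sketch (hypothetical data, assuming statsmodels) of refitting with White-corrected, heteroskedasticity-robust standard errors; the coefficient estimates themselves do not change.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical heteroskedastic data: error variance grows with x.
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=100)
y = 2 + 0.5 * x + rng.normal(size=100) * x

X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()                       # ordinary standard errors
robust = sm.OLS(y, X).fit(cov_type="HC0")      # White-corrected (heteroskedasticity-robust)

print(ols.bse)      # unreliable under heteroskedasticity
print(robust.bse)   # robust standard errors; coefficients are unchanged
```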

15
Q

What is serial correlation?

A

Serial correlation (autocorrelation) exists when the residual terms are correlated with one another across observations. Because of a tendency of the data to cluster together from observation to observation, positive serial correlation typically results in coefficient standard errors that are too small, even though the estimated coefficients are consistent. This leads to too many Type I errors.

16
Q

How to spot serial correlation?

A

Residual plot or Durbin-Watson (DW) statistic

For large n: DW ≈ 2(1 - r)

17
Q

How to detect serial correlation?

A

For large n: DW ≈ 2(1 - r)
H0: no positive serial correlation

DW = 2 -> error terms are homoskedastic and not serially correlated (r = 0)
DW < 2 -> error terms are positively serially correlated (r > 0)
DW > 2 -> error terms are negatively serially correlated (r < 0)

Decision rule (test for positive serial correlation, with lower and upper critical values dl and du):
DW < dl -> reject H0
dl ≤ DW ≤ du -> inconclusive
DW > du -> do not reject H0
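A sketch (hypothetical AR(1) errors, assuming statsmodels) of computing the DW statistic and the large-sample approximation 2(1 - r):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical time-series regression with positively autocorrelated errors.
rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):                      # AR(1) errors with r = 0.7
    e[t] = 0.7 * e[t - 1] + rng.normal()
y = 1 + 2 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()

dw = durbin_watson(res.resid)
print(round(dw, 3))                        # well below 2 -> positive serial correlation

# Large-sample approximation: DW ~= 2(1 - r), where r is the lag-1
# autocorrelation of the residuals.
r = np.corrcoef(res.resid[:-1], res.resid[1:])[0, 1]
print(round(2 * (1 - r), 3))
```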

18
Q

How to correct serial correlation?

A
  • Hansen method -> corrects for both serial correlation and heteroskedasticity (see the sketch below)
  • But if serial correlation is not a problem, it is best to use White-corrected standard errors instead
  • Improve the specification of the model
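A minimal sketch of a fit that is robust to both serial correlation and heteroskedasticity (assuming statsmodels, whose HAC option implements the closely related Newey-West estimator; the data are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical time-series data with serially correlated errors.
rng = np.random.default_rng(5)
n = 120
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):                      # AR(1) errors
    e[t] = 0.6 * e[t - 1] + rng.normal()
y = 0.5 + 1.5 * x + e

X = sm.add_constant(x)

# HAC covariance: standard errors robust to serial correlation and
# heteroskedasticity, in the spirit of the Hansen-type correction on this card.
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(hac.bse)
```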
19
Q

What is multicollinearity?

A

The condition when two or more of the independent variables, or linear combinations of the independent variables, in a multiple regression are highly correlated with each other. This condition distorts the standard error of estimate and the coefficient standard errors, leading to problems when conducting t-tests for the statistical significance of parameters.

20
Q

What are the effects of multicollinearity?

A
  • Coefficient estimates are consistent but unreliable
  • Standard errors are inflated
  • We may incorrectly conclude that a variable is not statistically significant -> Type II errors
21
Q

How to detect multicollinearity?

A
  • None of the individual coefficients is significantly different from zero
  • The F-test is statistically significant
  • The coefficient of determination (R²) is high (see the sketch below)
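A small sketch (hypothetical data where x2 is nearly a copy of x1, assuming statsmodels) reproducing the classic symptoms listed above:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical example of multicollinearity: x2 is almost identical to x1.
rng = np.random.default_rng(6)
n = 60
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)       # highly correlated with x1
y = 1 + x1 + x2 + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print(round(res.rsquared, 3))    # high R^2
print(round(res.f_pvalue, 4))    # F-test strongly significant
print(res.tvalues[1:])           # yet individual slope t-stats are insignificant
```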
22
Q

How to correct multicollinearity?

A

omit one or more of the correlated variables