Multiple regression analysis Flashcards

1
Q

Why do you need a multiple regression

A

To keep relevant explanatory variables out of the error term u, which would otherwise break the zero conditional mean (ZCM) assumption.

2
Q

MLR. 3

A

No perfect collinearity: no exact linear relationships among the explanatory variables, and no explanatory variable that is constant.

3
Q

MLR. 6

A

No correlation between the error terms (no autocorrelation).

4
Q

Significance level

A

The strength of evidence required before the null hypothesis is rejected; it equals the probability of a Type 1 error (e.g. 5%).

5
Q

Critical value

A

A cut-off point to which the test statistic is compared: if |t| is greater than the critical value, H0 can be rejected.

6
Q

Type 1 error

A

Rejecting H0 when it is actually true (a false positive): the effect is not really there, but the analysis says it is.

7
Q

Type 2 error

A

Failing to reject H0 when it is actually false (a false negative): the effect is really there, but the analysis says it is not.

8
Q

P value with example

A

Summary of the strength of evidence against the null: the probability of observing a test statistic at least as extreme as the one found, if H0 were true.

E.g. p = 0.05 means the result is significant at exactly the 5% level; the smaller p is, the stronger the evidence against H0.
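
For example, a two-sided p-value can be computed from a t statistic (hypothetical numbers, using scipy):

```python
from scipy import stats

# Hypothetical t statistic and degrees of freedom (illustrative only).
t_stat, df = 2.1, 60

# Two-sided p-value: probability of a |t| at least this large under H0.
p_value = 2 * stats.t.sf(abs(t_stat), df)
print(f"p = {p_value:.3f}")   # small p => strong evidence against H0
```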

9
Q

F test

A

Used to compare regression models against each other, e.g. an unrestricted model against a restricted model (a joint test of the excluded coefficients).
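
A minimal sketch of the F statistic for q exclusion restrictions, on simulated data (all numbers hypothetical): the restricted model drops x2 from the unrestricted model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

def ssr(regressors, y):
    """Sum of squared residuals from an OLS fit (with intercept)."""
    X = np.column_stack([np.ones(len(y))] + list(regressors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

ssr_ur = ssr([x1, x2], y)      # unrestricted: y on x1 and x2
ssr_r = ssr([x1], y)           # restricted: x2 dropped (q = 1 restriction)

q, k = 1, 2                    # restrictions; regressors in unrestricted model
F = ((ssr_r - ssr_ur) / q) / (ssr_ur / (n - k - 1))
print(f"F = {F:.2f}")
```

Imposing the restriction can only raise the SSR, so F is never negative; a large F means the restricted model fits much worse.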

10
Q

Reading f statistic (the degrees of freedom)

A

Numerator degrees of freedom along the top (equal to q, the number of restrictions); denominator degrees of freedom down the left-hand side.

11
Q

Why are quadratics needed in regression?

A

To capture increasing or decreasing marginal effects. E.g. the marginal effect of experience on wage diminishes.

12
Q

Why are logs needed in regression?

A

Useful for variables with large values or skewed distributions, and gives coefficients a percentage interpretation.

Also often reduces heteroskedasticity.

13
Q

Finding a maximum point from a quadratic

A

When B1 (the coefficient on x, not the intercept) is positive and B2 (the coefficient on x^2) is negative. The maximum is at x* = B1 / (2|B2|), equivalently x* = -B1 / (2 B2).
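
With hypothetical coefficients from a wage-experience quadratic, the turning point works out as:

```python
# Hypothetical quadratic profile (illustrative numbers only):
# wage = b0 + b1*exper + b2*exper^2 with b1 > 0 and b2 < 0.
b1, b2 = 0.30, -0.006

# Turning point: x* = b1 / (2*|b2|), equivalently -b1 / (2*b2).
x_star = b1 / (2 * abs(b2))
print(f"wage peaks at about {x_star:.0f} years of experience")
```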

14
Q

Level - Level

A

Elasticity = B hat × (mean of X / mean of Y)

15
Q

Level - Log

A

Elasticity = B hat × (1 / mean of Y)

16
Q

Log - Level

A

Elasticity = B hat × mean of X

17
Q

Log - Log

A

Elasticity = B hat
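
Assuming the four level/log cards above give the elasticity of y with respect to x evaluated at the sample means, they can be collected in one sketch (hypothetical estimate and means):

```python
# Hypothetical estimate and sample means (illustrative only).
beta_hat = 0.8
x_bar, y_bar = 10.0, 40.0

# Elasticity of y with respect to x, evaluated at the means.
elasticity = {
    "level-level": beta_hat * (x_bar / y_bar),  # y = b0 + b1*x
    "level-log":   beta_hat * (1 / y_bar),      # y = b0 + b1*ln(x)
    "log-level":   beta_hat * x_bar,            # ln(y) = b0 + b1*x
    "log-log":     beta_hat,                    # ln(y) = b0 + b1*ln(x)
}
for form, e in elasticity.items():
    print(f"{form}: {e:.3f}")
```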

18
Q

What is the t statistic when H0: Bj = aj

A

t = (estimate - hypothesised value) / standard error, i.e. t = (Bj hat - aj) / se(Bj hat)

19
Q

Log - level level quadratic

A

%change in y ≈ (B1 hat + 2 × B2 hat × X bar) × 100, the approximate percentage change in y from a one-unit increase in x, evaluated at the mean of x.
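
With hypothetical coefficients from a log-level quadratic, the calculation is:

```python
# Hypothetical log-level quadratic (illustrative numbers only):
# ln(wage) = b0 + b1*exper + b2*exper^2.
b1_hat, b2_hat = 0.040, -0.0006
x_bar = 10.0

# Approximate % change in wage from one more year, at the mean.
pct_effect = (b1_hat + 2 * b2_hat * x_bar) * 100
print(f"about {pct_effect:.1f}% per extra year")
```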

20
Q

When the question talks about goodness of fit

A

R^2 values

21
Q

Effect of rescaling

A

The affected coefficients and standard errors are divided or multiplied by the change in the units of measurement; so is the SSR when y is rescaled (by the square of the factor).
However, R^2 is exactly the same, since it is a proportion.
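
A quick simulated check (hypothetical data): rescaling x multiplies its slope by the same factor, while R^2 is unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)

def fit(x, y):
    """Simple OLS of y on x with intercept; returns slope, SSR, R^2."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ssr = resid @ resid
    r2 = 1 - ssr / ((y - y.mean()) @ (y - y.mean()))
    return beta[1], ssr, r2

slope, ssr, r2 = fit(x, y)
slope_k, ssr_k, r2_k = fit(x / 1000, y)   # x now measured in "thousands"

print(slope_k / slope)        # slope scaled by the rescaling factor
print(np.isclose(r2, r2_k))   # R^2 (a proportion) is unchanged
```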

22
Q

Partialling out approach

A
- Regress x1 on the other explanatory variables (x2) to obtain the residuals
- Regress y on those residuals to obtain B1 hat

23
Q

What does partialling out show

A

That the multiple regression coefficient B1 hat measures the effect of x1 after the influence of the other explanatory variables has been netted out: both approaches give exactly the same estimate.
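
A simulated check of the partialling-out result (all numbers hypothetical): the coefficient from the two-step residual regression matches the multiple regression coefficient.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)        # x1 correlated with x2
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

def ols(regressors, y):
    """OLS with intercept; returns (coefficients, residuals)."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

# Full multiple regression: y on x1 and x2.
beta_full, _ = ols([x1, x2], y)
b1_multiple = beta_full[1]

# Partialling out: (1) regress x1 on x2, keep the residuals;
# (2) regress y on those residuals.
_, r1 = ols([x2], x1)
beta_part, _ = ols([r1], y)
b1_partial = beta_part[1]

print(np.isclose(b1_multiple, b1_partial))   # same estimate either way
```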