7.2. Regression Analysis II Flashcards

1
Q

Testing hypotheses about the regression coefficients…

A

H0: beta (the slope) equals zero and therefore there is no influence of X on Y.

H1: beta (the slope) doesn’t equal zero and therefore there is influence of X on Y.

Compute the test statistic.

Find the critical value (from the t-distribution, two-tailed).

If |test statistic| > critical value, we reject the null hypothesis and conclude that the slope is statistically significant.
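The steps above can be sketched in Python. This is a minimal illustration with invented data; the slope estimate, its standard error, and the t-statistic follow the standard simple-regression formulas.

```python
# Sketch of the t-test for the slope (H0: beta = 0). Data is invented.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])

n = len(x)
b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)  # slope estimate
a = y.mean() - b * x.mean()                          # intercept
resid = y - (a + b * x)
# Standard error of the slope: s / sqrt(Sxx), with s^2 = SSR / (n - 2)
se_b = np.sqrt((resid @ resid) / (n - 2) / ((x - x.mean()) ** 2).sum())

t_stat = b / se_b                        # test statistic
t_crit = stats.t.ppf(0.975, df=n - 2)    # two-tailed critical value, 5% level
reject = abs(t_stat) > t_crit            # reject H0 if True
```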

2
Q

Testing the significance of R2…

A

Another check of the quality of the regression equation, testing to see if R2 is significantly greater than zero.

This uses the F-distribution.

H0: R2 equals zero, implying that beta (the slope) equals zero.

H1: R2 is greater than zero (R2 cannot be negative), implying that beta (the slope) doesn’t equal zero.

Compute the test statistic.

Find the critical value, using the F-distribution table.
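A minimal sketch of this F-test, using the standard formula F = (R²/k) / ((1 − R²)/(n − k − 1)); the R², n and k below are invented for illustration.

```python
# Sketch of the F-test that R^2 is significantly greater than zero.
# The figures below are illustrative, not taken from the flashcards.
from scipy import stats

r2 = 0.81  # coefficient of determination
n = 30     # sample size
k = 1      # number of explanatory variables

f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))      # test statistic
f_crit = stats.f.ppf(0.95, dfn=k, dfd=n - k - 1)  # 5% critical value
reject = f_stat > f_crit                          # reject H0 if True
```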

3
Q

Prediction…

A

We can calculate prediction or confidence intervals for data:
- Prediction interval for an individual observation on Y when X = a particular value.
- Confidence interval for the position of the regression line at X = a particular value.
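The two intervals can be computed by hand from the textbook formulas: the prediction interval carries an extra "+1" inside the square root, so it is always the wider of the two. A sketch with invented data:

```python
# Sketch: confidence interval for the line vs prediction interval for an
# individual observation, both at X = x0. Data is invented.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2])
x0 = 3.5

n = len(x)
Sxx = ((x - x.mean()) ** 2).sum()
b = ((x - x.mean()) * (y - y.mean())).sum() / Sxx
a = y.mean() - b * x.mean()
resid = y - (a + b * x)
s = np.sqrt((resid @ resid) / (n - 2))   # standard error of the regression
t = stats.t.ppf(0.975, df=n - 2)

y0 = a + b * x0
se_line = s * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / Sxx)      # line position
se_pred = s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / Sxx)  # individual Y

conf_int = (y0 - t * se_line, y0 + t * se_line)  # confidence interval
pred_int = (y0 - t * se_pred, y0 + t * se_pred)  # prediction interval (wider)
```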

4
Q

Both confidence and prediction interval estimates are most precise when…

A
  • Xp = X̄ (the value of X at which we predict equals the sample mean of X).
  • The sample observations lie close to the regression line (i.e. the standard error is small).
  • The spread of the sample X values is large (i.e. Σ(Xi − X̄)² is large).
  • The sample size is large.
5
Q

For Regression Analysis Part II (single regression)…

A

Please see 7.3. Casino Expansion Example.

6
Q

Multiple regression…

A

We have more than one explanatory (independent) variable. This is multivariate regression.

Single regression is very limited.

Generally:
- The principle is to fit a plane, not a line as in simple regression.
- This minimises the sum of squares of vertical distances from each point to the plane.
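Fitting the plane is a least-squares problem, which can be sketched with NumPy. The data below is generated from known coefficients (b0 = 1, b1 = 2, b2 = −0.5, my invented values) so the fit can be checked against them.

```python
# Sketch: fitting the plane Y = b0 + b1*X1 + b2*X2 by least squares.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(0, 10, 50)
x2 = rng.uniform(0, 10, 50)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(0, 0.1, 50)  # known coefficients

# Design matrix: a column of ones (intercept) plus each explanatory variable.
X = np.column_stack([np.ones_like(x1), x1, x2])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)  # minimises sum of squares
b0, b1, b2 = coefs
```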

7
Q

A regression model with two explanatory variables…

A

b0: signifies the constant, or intercept, on the Y axis.

b1: is the slope of the plane in the direction of X1 axis.

b1: shows the effect of a unit change in X1 on Y, assuming X2 remains constant.

b2: is the slope of the plane in the direction of X2 axis.

b2: shows the effect of a unit change in X2 on Y, assuming X1 remains constant.

If both X1 and X2 change by 1, the effect on Y is b1+b2.
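These interpretations can be checked with a tiny numeric illustration (the coefficient values are invented):

```python
# Sketch: unit-change interpretation of b1 and b2. Coefficients are invented.
b0, b1, b2 = 3.0, 2.0, -0.5

def predict(x1, x2):
    return b0 + b1 * x1 + b2 * x2

base = predict(10, 10)
effect_x1 = predict(11, 10) - base   # X1 up by 1, X2 held constant -> b1
effect_both = predict(11, 11) - base # both up by 1 -> b1 + b2
```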

8
Q

Data transformation…

A

We need to transform nominal data to adjust it for inflation.

To calculate real income, we must divide GDP by the GDP deflator.

To calculate real import prices, we must divide the price of imports by the retail price index.
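The deflation step is a simple division, sketched here with invented figures (deflator base year = 1.00):

```python
# Sketch: deflating a nominal series to real terms. Figures are invented.
nominal_gdp = [100.0, 110.0, 121.0]  # nominal GDP by year
gdp_deflator = [1.00, 1.05, 1.10]    # GDP deflator (base year = 1.00)

real_gdp = [g / d for g, d in zip(nominal_gdp, gdp_deflator)]
```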

9
Q

Interpreting results…

A

η(GDP) can be calculated, which shows the elasticity of one variable (imports) with respect to the other variable (GDP).

Imagine the value is 2.14. This means that a 1% rise in the latter variable (GDP) leads to a 2.14% rise in the former variable (imports).

Using elasticity, we can work in percentages rather than simply ‘units’, making our interpretations more accurate.

By a similar calculation, we can calculate this for the other way:
For example, a 1% rise in import prices leads to a 0.03% rise in import demand.

10
Q

Improving the model, using logarithms…

A

We might need to find more explanatory variables.

We might need to try lagged variables as explanatory variables.

We might need to try a different functional form for the equations.

The use of natural logarithms:
- Non-linear transformation of the data.
- More direct estimates of the elasticities.
- Common practice.
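The "more direct estimates of the elasticities" point can be sketched: in a log-log regression the slope is the elasticity. The data below is generated with a known elasticity of 2.14 (the figure quoted earlier, used here purely as an invented example).

```python
# Sketch: the slope of a log-log regression estimates the elasticity directly.
# Data generated with a known elasticity of 2.14 (invented example).
import numpy as np

rng = np.random.default_rng(1)
gdp = rng.uniform(50, 150, 40)
imports = 0.01 * gdp ** 2.14 * np.exp(rng.normal(0, 0.05, 40))

lx = np.log(gdp)
ly = np.log(imports)
# Slope of ln(imports) on ln(GDP) = elasticity of imports w.r.t. GDP
slope = np.cov(lx, ly, ddof=1)[0, 1] / np.var(lx, ddof=1)
```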

11
Q

Issues with multivariate regression…

A

Autocorrelation: when one error observation is correlated with an earlier one.

Multicollinearity: when some or all of the explanatory variables are highly correlated.

Omitted variable bias: bias introduced in the estimated coefficients of independent variables due to the omission of relevant variables from the regression model.
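Multicollinearity, at least, is easy to diagnose numerically. A sketch of one standard diagnostic, the variance inflation factor (VIF), computed by regressing each explanatory variable on the others; the data and the deliberate near-collinearity are invented.

```python
# Sketch: detecting multicollinearity with variance inflation factors.
# VIF = 1 / (1 - R^2) from regressing one X on the other Xs. Data is invented.
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.normal(0, 1, 100)
x2 = x1 + rng.normal(0, 0.1, 100)  # deliberately near-collinear with x1
x3 = rng.normal(0, 1, 100)         # independent of the others

def vif(target, others):
    X = np.column_stack([np.ones(len(target))] + others)
    coefs, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coefs
    r2 = 1 - (resid @ resid) / ((target - target.mean()) ** 2).sum()
    return 1 / (1 - r2)

vif_x1 = vif(x1, [x2, x3])  # large: x1 is nearly duplicated by x2
vif_x3 = vif(x3, [x1, x2])  # near 1: x3 is not collinear with the others
```

A common rule of thumb treats a VIF above about 10 as a sign of problematic multicollinearity.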
