12.1 - 12.3 Predicting the Outcome of a Variable Flashcards

1
Q

Define the (least squares) regression line. What is the equation?

A

the line that makes the sum of the squares of the vertical distances from the data points (x, y) to the line as small as possible
only for linear associations
ŷ = mx + b
-ŷ = predicted value of the response variable, y
-x = explanatory variable
-m = slope = the amount that ŷ changes when x increases by 1 unit
-b = y-intercept = the predicted value of ŷ when x = 0
NOT resistant to outliers
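A minimal Python sketch of computing the least-squares slope and intercept from the definitions above (the data values and variable names are made up for illustration; numpy is assumed available):

```python
import numpy as np

# hypothetical example data (x = explanatory variable, y = response variable)
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8], dtype=float)

# least-squares slope and intercept:
# m = sum((x - x̄)(y - ȳ)) / sum((x - x̄)^2),  b = ȳ - m·x̄
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b = y.mean() - m * x.mean()

y_hat = m * x + b  # predicted values ŷ along the fitted line
print(f"slope m = {m:.3f}, intercept b = {b:.3f}")
```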

2
Q

Define interpolating. Define extrapolating.

A

*interpolating - predicting ŷ for x-values that are BETWEEN observed x-values
*extrapolating - predicting ŷ for x-values that are BEYOND observed x-values; can produce unrealistic forecasts
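A quick sketch of the difference, assuming a line fitted to x-values observed between 1 and 5 (the slope and intercept below are hypothetical, roughly what the earlier sketch produces):

```python
m, b = 1.96, 0.14  # assumed slope and intercept for illustration

print(m * 3.5 + b)  # interpolation: x = 3.5 lies between the observed x-values
print(m * 50 + b)   # extrapolation: x = 50 lies far beyond the observed x-values,
                    # so the linear pattern may no longer hold and the forecast
                    # can be unrealistic
```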

3
Q

What is r^2?

A

the square of the correlation coefficient r; the proportion of the variation in the response variable that is explained by the linear relationship with the explanatory variable; NOT resistant to outliers
interpretation template (e.g., for r² = 0.48): "48% of the variation in [the response variable, y] can be explained by the variation in [the explanatory variable, x]"
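A minimal sketch showing the two equivalent ways to get r² for simple linear regression (made-up data; numpy assumed available):

```python
import numpy as np

# hypothetical data, same idea as the earlier fitting sketch
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8], dtype=float)

r = np.corrcoef(x, y)[0, 1]   # correlation coefficient r
m, b = np.polyfit(x, y, 1)    # least-squares slope and intercept
y_hat = m * x + b             # predicted values ŷ

ss_res = np.sum((y - y_hat) ** 2)     # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot

print(r ** 2, r_squared)  # the two values agree for simple linear regression
```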

4
Q

Define slope of the regression line.

A

the amount that ŷ changes when x increases by 1 unit

5
Q

Define the y-intercept of the regression line.

A

the predicted value of ŷ when x=0

6
Q

Define residual.

A

the difference between the actual value and the predicted value (y - ŷ)
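A small sketch of computing one residual from a fitted line (the slope, intercept, and observed point are hypothetical, for illustration only):

```python
m, b = 1.96, 0.14          # assumed slope and intercept
x_obs, y_obs = 3.0, 6.2    # one observed (x, y) data point

y_hat = m * x_obs + b      # predicted value ŷ at x_obs
residual = y_obs - y_hat   # residual = actual minus predicted (y - ŷ)
print(residual)            # positive here: the point lies above the line
```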
