Chapter 10: Simple Linear Regression Flashcards

1
Q

Regression analysis

A

Allows us to test hypotheses about the relationship between two variables, to quantify the strength of that relationship, and to use one variable to make predictions about the other.

2
Q

Sum of squares total (SST)

A

A measure of the total variation in the dependent variable in a simple linear regression. It is calculated by subtracting the mean of the observed values, Ȳ, from each of the observed values, Yi, squaring each of these differences, and then summing all of the squared differences.
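As a quick illustration, SST can be computed in a few lines of Python (the data values here are made up):

```python
# SST: total variation of the observed Y values around their mean
y = [2, 4, 5, 4, 5]                        # made-up observations of Y
mean_y = sum(y) / len(y)                   # Y-bar = 4.0
sst = sum((yi - mean_y) ** 2 for yi in y)  # sum of squared deviations
print(sst)  # -> 6.0
```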

3
Q

Simple linear regression (SLR)

A

An approach for estimating the linear relationship between a dependent variable and a single independent variable by minimizing the sum of the squared deviations between the fitted line and the observed values.
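The least-squares estimates have a closed form in the single-variable case. A minimal sketch in Python, using a small made-up dataset:

```python
# Closed-form OLS estimates for Y = b0 + b1 * X + error:
#   b1 = sum((Xi - X-bar)(Yi - Y-bar)) / sum((Xi - X-bar)^2)
#   b0 = Y-bar - b1 * X-bar
x = [1, 2, 3, 4, 5]   # made-up independent variable
y = [2, 4, 5, 4, 5]   # made-up dependent variable
mean_x = sum(x) / len(x)
mean_y = sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
print(b0, b1)  # close to 2.2 and 0.6 for this data
```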

4
Q

Regression coefficients

A

The collective term for the intercept and slope coefficients in the regression model.

5
Q

Residual

A

The amount of deviation of an observed value of the dependent variable from its estimated value based on the fitted regression line.

6
Q

Sum of squares error (SSE)

A

A measure of the total deviation between observed and estimated values of the dependent variable. It is calculated by subtracting each estimated value, Ŷi, from its corresponding observed value, Yi, squaring each of these differences, and then summing all of the squared differences.
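SSE is the sum of the squared residuals from the fitted line. A standalone sketch with made-up data (the fit is recomputed inside the snippet):

```python
# Fit the line, then sum the squared residuals (SSE)
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]              # fitted values
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
print(round(sse, 6))  # -> 2.4
```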

7
Q

Homoskedasticity

A

Constant variance across all observations.

8
Q

Heteroskedasticity

A

Non-constant variance across observations.

9
Q

Estimated parameters

A

The intercept and slope of the fitted line.

10
Q

Sum of squares regression (SSR)

A

A measure of the explained variation in the dependent variable, calculated as the sum of the squared differences between the predicted values of the dependent variable, Ŷi, based on the estimated regression line, and the mean of the dependent variable, Ȳ.
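SSR can be computed the same way, and for a regression fitted with an intercept the three sums of squares satisfy SST = SSR + SSE. A standalone sketch with made-up data:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]
ssr = sum((yh - mean_y) ** 2 for yh in y_hat)          # explained variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # unexplained variation
sst = sum((yi - mean_y) ** 2 for yi in y)              # total variation
print(round(ssr, 6), round(sse, 6), round(sst, 6))     # SST = SSR + SSE
```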

11
Q

Coefficient of determination (R²)

A

The percentage of the variation of the dependent variable that is explained by the independent variable. It is a measure of goodness of fit of a regression model.
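In terms of the sums of squares, R² = SSR/SST = 1 − SSE/SST. A standalone sketch with made-up data:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]
sst = sum((yi - mean_y) ** 2 for yi in y)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
r2 = 1 - sse / sst   # equivalently SSR / SST
print(round(r2, 6))  # -> 0.6
```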

12
Q

Mean square regression (MSR)

A

Calculated as the sum of squares regression (SSR) divided by the number of independent variables in the regression model. In simple linear regression, there is only one independent variable, so MSR equals SSR.

13
Q

Mean square error (MSE)

A

Calculated as the sum of squares error (SSE) divided by the degrees of freedom, which are the number of observations minus the number of independent variables minus one. Since simple linear regression has just one independent variable, the degrees of freedom calculation is the number of observations minus 2.
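MSR and MSE together give the F-statistic for the regression. A standalone sketch with made-up data (n = 5 observations, k = 1 independent variable):

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n, k = len(y), 1
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]
ssr = sum((yh - mean_y) ** 2 for yh in y_hat)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
msr = ssr / k            # MSR = SSR in SLR, since k = 1
mse = sse / (n - k - 1)  # degrees of freedom = n - 2 here
f_stat = msr / mse       # F-statistic for the regression
print(round(msr, 6), round(mse, 6), round(f_stat, 6))
```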

14
Q

Standard error of the slope coefficient

A

Calculated for simple linear regression by dividing the standard error of the estimate by the square root of the sum of squared deviations of the independent variable from its mean.
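Numerically, that is sqrt(MSE) divided by the square root of the sum of squared X deviations. A standalone sketch with made-up data:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(y)
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
sxx = sum((xi - mean_x) ** 2 for xi in x)   # variation of X
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / sxx
b0 = mean_y - b1 * mean_x
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
see = (sse / (n - 2)) ** 0.5   # standard error of the estimate
se_b1 = see / sxx ** 0.5       # standard error of the slope coefficient
print(round(se_b1, 6))
```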

15
Q

Indicator variable

A

A variable that takes on only one of two values, 0 or 1, based on a condition. In simple linear regression with an indicator variable as the independent variable, the slope coefficient is the difference in the mean of the dependent variable between the two conditions. Also referred to as a dummy variable.
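A sketch of that interpretation with made-up data: the fitted slope on a 0/1 indicator equals the difference between the two group means.

```python
x = [0, 0, 0, 1, 1, 1]   # indicator: 0 = condition A, 1 = condition B
y = [1, 2, 3, 5, 6, 7]   # made-up dependent variable
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
mean_a = sum(yi for xi, yi in zip(x, y) if xi == 0) / 3
mean_b = sum(yi for xi, yi in zip(x, y) if xi == 1) / 3
print(b1, mean_b - mean_a)  # slope equals the difference in group means
```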

16
Q

Analysis of variance (ANOVA)

A

A table that presents the sums of squares, degrees of freedom, mean squares, and F-statistic for a regression model.

17
Q

Standard error of the estimate

A

A measure of the distance between the observed values of the dependent variable and those predicted from the estimated regression. The smaller this value, the better the fit of the model. Also known as the standard error of the regression and the root mean square error.
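In simple linear regression this is sqrt(SSE / (n − 2)), i.e. the square root of MSE. A standalone sketch with made-up data:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(y)
mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
see = (sse / (n - 2)) ** 0.5   # aka root mean square error
print(round(see, 6))
```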

18
Q

Standard error of the forecast

A

Used to provide an interval estimate around the estimated regression line. It is necessary because the regression line does not describe the relationship between the dependent and independent variables perfectly.

19
Q

Log-lin model

A

A functional form for transforming regression model data in which the dependent variable is logarithmic but the independent variable is linear.

20
Q

Lin-log model

A

A functional form for transforming regression model data in which the dependent variable is linear but the independent variable is logarithmic.

21
Q

Log-log model

A

A functional form for transforming regression model data in which both the dependent and independent variables are in logarithmic form.
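To see the transformations in action, the sketch below fits a log-log model to made-up data generated from Y = 2·X^1.5, so the recovered slope should be about 1.5. The log-lin and lin-log forms would transform only the Y list or only the X list, respectively:

```python
import math

x = [1, 2, 4, 8]
y = [2 * xi ** 1.5 for xi in x]     # made-up data: Y = 2 * X^1.5
log_x = [math.log(xi) for xi in x]  # log-log: transform both variables
log_y = [math.log(yi) for yi in y]
mx, my = sum(log_x) / len(log_x), sum(log_y) / len(log_y)
b1 = sum((a - mx) * (b - my) for a, b in zip(log_x, log_y)) \
     / sum((a - mx) ** 2 for a in log_x)
b0 = my - b1 * mx
print(round(b1, 6), round(b0, 6))  # slope ~ 1.5, intercept ~ ln(2)
```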