Week 2 Flashcards

1
Q

What are residuals?

A

Residuals are the portion of each score (case) that is not predicted by the regression. Since no real-world relationship is perfect, some error is always expected. However, large errors indicate that the regression model is inadequate; patterns in the residuals can indicate heteroscedasticity, non-linearity, or both; and isolated cases with large residuals are likely to be outliers.

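A minimal sketch of computing residuals, assuming numpy and hypothetical data:

```python
import numpy as np

# Hypothetical data: y roughly follows a straight line in x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit Y' = A + B*X by least squares (polyfit returns slope first)
b, a = np.polyfit(x, y, 1)
y_pred = a + b * x

# Residuals: the part of each score not predicted by the regression
residuals = y - y_pred
```

With an intercept in the model, ordinary least-squares residuals sum to (numerically) zero; what matters for diagnostics is their pattern and size, not their total.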
2
Q

What does Mahalanobis distance show?

A

Mahalanobis distance provides a way to measure how similar a set of conditions is to a known set of conditions. It accounts for the covariance among variables.

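A sketch with hypothetical two-variable data (numpy assumed); the inverse covariance matrix is what lets the distance account for how the variables co-vary:

```python
import numpy as np

# Hypothetical cases measured on two correlated variables
data = np.array([[2.0, 2.0], [2.0, 5.0], [6.0, 5.0], [7.0, 3.0],
                 [4.0, 7.0], [6.0, 4.0], [5.0, 3.0], [4.0, 6.0],
                 [2.0, 5.0], [1.0, 3.0]])

centroid = data.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(data, rowvar=False))

def mahalanobis(point):
    """Distance from `point` to the centroid, weighted by the
    inverse covariance so correlated directions are discounted."""
    d = point - centroid
    return float(np.sqrt(d @ cov_inv @ d))
```

Cases with a large Mahalanobis distance from the centroid are candidate multivariate outliers.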
3
Q

What does linear regression aim to show?

A

Linear regression looks to predict an outcome (DV) using predictor variables (IVs)

4
Q

Linear regression looks to predict a single quantitative DV from multiple quantitative and/or qualitative variables. Are relationships always linear? How can non-linear IVs be made to appear linear?

A

No, relationships are not always linear. Non-linear IVs can be made to appear linear by transforming them, using log functions and so forth

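A sketch of the idea with a hypothetical exponential relationship: the raw correlation with x is below 1 because the curve bends, but log-transforming y makes the relationship exactly linear.

```python
import numpy as np

# Hypothetical non-linear (exponential) relationship
x = np.linspace(1.0, 10.0, 50)
y = 2.0 * np.exp(0.5 * x)

r_raw = np.corrcoef(x, y)[0, 1]          # curved, so r < 1
r_log = np.corrcoef(x, np.log(y))[0, 1]  # log(y) = log(2) + 0.5*x, so r = 1
```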
5
Q

What does the Pearson correlation measure?

A

The Pearson correlation measures the strength and direction of the linear relationship between two variables

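Pearson's r can be computed directly from deviation scores; a sketch with hypothetical data (numpy assumed):

```python
import numpy as np

def pearson_r(x, y):
    """Strength and direction of the linear relationship (-1 to +1)."""
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # perfectly linear in x
```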
6
Q

Does regression model imperfect relationships?

A

Yes

7
Q

In regression, do we predict Y from X?

A

Yes

8
Q

What does the equation to predict Y from X look like?

A

Y’ = A + BX

Where ‘ indicates “predicted”

9
Q

The prediction of Y will not always be perfect. How is the error calculated?

A

Actual Y minus predicted Y

Y - Y’

10
Q

What does the sum of squares have to do with regression?

A

The method of least squares is used to find the values of A and B that minimise the sum of squared errors, giving the equation of the line of best fit

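For a single predictor the least-squares estimates have a closed form; a sketch with hypothetical data, checked against numpy's polyfit:

```python
import numpy as np

# Hypothetical data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# B = cov(X, Y) / var(X); A = mean(Y) - B * mean(X)
# These are the values that minimise the sum of squared errors, sum((Y - Y')**2)
b = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
a = y.mean() - b * x.mean()
```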
11
Q

Natural variation in Y indicates that the points of Y are dispersed in a scatter; how is this denoted?

A

SSY

12
Q

What is the total variation of Y separated into?

A

Total variation can be separated into “regression” and “error”

13
Q

How is the proportion of total variance explained calculated?

A

R2 is equal to the Sum of Squares due to Regression divided by the Sum of Squares Total

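The decomposition can be verified numerically; a sketch with hypothetical data (numpy assumed):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.1, 2.9, 4.2, 4.6])

b, a = np.polyfit(x, y, 1)
y_pred = a + b * x

ss_total = ((y - y.mean()) ** 2).sum()      # SSY: total variation in Y
ss_reg = ((y_pred - y.mean()) ** 2).sum()   # variation due to regression
ss_error = ((y - y_pred) ** 2).sum()        # variation due to error

r_squared = ss_reg / ss_total               # proportion of variance explained
```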
14
Q

What is the R value

A

The R value is the correlation between the DV and the IVs

15
Q

What is the R2 value?

A

The R2 value is the total amount of variation accounted for by the predictors

16
Q

What is the R2 adjusted?

A

Adjusted R2 represents the R2 after accounting for the number of predictors and the sample size
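A sketch using the standard adjustment formula, with hypothetical values for R2, the number of cases n, and the number of predictors k:

```python
# Hypothetical values: R2 = .80 from n = 50 cases and k = 3 predictors
r_squared = 0.80
n, k = 50, 3

# Adjusted R2 shrinks R2 to account for predictors and sample size
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)
```

The adjustment is always downward, and it bites harder when k is large relative to n.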

17
Q

What is the standard error of the estimate?

A

The standard deviation of the scores about the predicted values (i.e. the standard deviation of the residuals)
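A sketch with hypothetical data (numpy assumed); in simple regression the residual sum of squares is divided by n - 2 degrees of freedom:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

b, a = np.polyfit(x, y, 1)
residuals = y - (a + b * x)

# Standard error of the estimate: typical deviation of the scores
# about the predicted values
see = np.sqrt((residuals ** 2).sum() / (len(y) - 2))
```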

18
Q

What are the regression coefficients?

A

Regression coefficients are the values of the B’s for the model.
The unstandardised values are in the units of the original measures.
The standardised values are in z-score units and give some indication of which variables have the biggest influence on the outcome
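A sketch contrasting the two with hypothetical data: the unstandardised B is in raw units, while the standardised beta is the slope after converting both variables to z-scores (for a single predictor it equals Pearson's r):

```python
import numpy as np

# Hypothetical predictor (in raw units) and outcome
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([1.0, 2.2, 2.9, 4.1, 5.0])

# Unstandardised B: change in Y per one raw unit of X
b = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()

# Standardised beta: slope between z-scores, so the units cancel out
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
beta = (zx * zy).sum() / (zx ** 2).sum()
```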