W7 multiple regression analysis Flashcards

1
Q

Ordinary least squares regression

A

When we have more than one predictor (X) variable, we need a way to combine them. The regression procedure extends naturally to do this.

Essentially, we extend the least squares procedure to estimate b0, b1, b2, …, bk to give the best possible prediction of Y from all the predictors jointly.

This type of regression is referred to in some texts as OLS, Ordinary Least Squares Regression

2
Q

Equation for multiple regression

A

Y = b0 + b1X1 + b2X2 + … + bkXk + e

The prediction equation omits the error term:

Y’ = b0 + b1X1 + b2X2 + … + bkXk
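As a sketch of how the b coefficients are estimated jointly, here is a minimal least-squares fit in Python with two predictors (the data are made up for illustration):

```python
import numpy as np

# Hypothetical example data: one outcome Y, two predictors X1 and X2
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
Y = np.array([3.0, 4.0, 6.0, 9.0, 11.0])

# Design matrix: a column of 1s for the intercept b0, then X1 and X2
X = np.column_stack([np.ones_like(X1), X1, X2])

# Least squares chooses b0, b1, b2 to minimise the sum of squared residuals
b, *_ = np.linalg.lstsq(X, Y, rcond=None)

Y_pred = X @ b    # Y' = b0 + b1*X1 + b2*X2
e = Y - Y_pred    # residuals
```

With an intercept included, the residuals always sum to zero, which is one quick sanity check on a fit.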

3
Q

Testing the significance of R

A

ρ (rho) = population correlation coefficient

H0 : ρ = 0
H1 : ρ ≠ 0

H0 : there is no linear relationship between X and Y in the population
H1 : there is a linear relationship between X and Y in the population

4
Q

R squared and variance

A

In correlation, r squared = the proportion of variance shared by X and Y, i.e., the overlap between the two variables

5
Q

Multiple R squared and variance

A
  • Multiple R2 is the variance in Y accounted for by the set of predictors X1 … Xk; usually referred to simply as R2
  • Not specific to any one single X; it belongs to the set of them.
  • The regression equation is also referred to as the Model; we check the value of R2 in the Model Summary box in the SPSS output
6
Q

R squared equation

A

R squared = SSreg / SStot

  • both sums of squares can be calculated by hand
  • they are found in the ANOVA table of the SPSS output
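A quick way to see the ratio in action; a minimal Python sketch using made-up data and a least-squares fit:

```python
import numpy as np

# Hypothetical data and a two-predictor least-squares fit
Y = np.array([3.0, 4.0, 6.0, 9.0, 11.0])
X = np.column_stack([np.ones(5),
                     [1.0, 2.0, 3.0, 4.0, 5.0],   # X1
                     [2.0, 1.0, 4.0, 3.0, 5.0]])  # X2
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ b

ss_tot = np.sum((Y - Y.mean()) ** 2)       # total variability in Y
ss_reg = np.sum((Y_pred - Y.mean()) ** 2)  # variability captured by the model
r_squared = ss_reg / ss_tot                # proportion of variance explained
```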
7
Q

SStot

A

Σ(Y - Y bar)²

i.e., the deviations of each observed Y from the mean of Y, squared and summed

8
Q

SSreg

A

Σ(Y’ - Y bar)²

i.e., the deviations of each predicted Y (Y’) from the mean of Y, squared and summed

9
Q

SSres

A

Σ(Y - Y’)²

i.e., the residuals (observed minus predicted Y), squared and summed
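The three sums of squares on these cards fit together: with an intercept in the model, SStot = SSreg + SSres. A minimal Python check on made-up data:

```python
import numpy as np

# Hypothetical data and fitted values from a two-predictor regression
Y = np.array([3.0, 4.0, 6.0, 9.0, 11.0])
X = np.column_stack([np.ones(5),
                     [1.0, 2.0, 3.0, 4.0, 5.0],
                     [2.0, 1.0, 4.0, 3.0, 5.0]])
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ b

ss_tot = np.sum((Y - Y.mean()) ** 2)       # sum of (Y - Y bar) squared
ss_reg = np.sum((Y_pred - Y.mean()) ** 2)  # sum of (Y' - Y bar) squared
ss_res = np.sum((Y - Y_pred) ** 2)         # sum of (Y - Y') squared
```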

10
Q

Extending our interpretation of regression predictors

A

Semipartial correlations: sr and sr2

  • The relationship between Xk and Y when all other predictors are partialled out of Xk only.
  • A good indicator of the unique relationship between Xk and Y.
  • Always squared (i.e., sr2) to give the proportion of unique variance in Y accounted for by Xk.
11
Q

Calculating the unique variance

A
  • the unique variance of X1 in Y equals the square of X1’s “part” correlation, found in the correlation coefficients table when the ZPP statistics are requested in SPSS
  • the unique variance of X2 in Y equals the square of X2’s “part” correlation from the same table
  • the “part” (semipartial) correlation has the variance shared with the other predictor removed, so squaring it gives the variance attributable to that predictor alone
12
Q

Shared variance

A
  • the difference between the combined ‘unique variance’ (sr squared X1 + sr squared X2) and ‘total variance explained’ (R squared)

Shared variance = R squared - (sr squared X1 + sr squared X2)
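The sr2 and shared-variance arithmetic can be checked directly. In the sketch below (made-up data), each semipartial correlation is computed by residualising one predictor on the other and correlating that residual with Y, which is the standard definition of the part correlation:

```python
import numpy as np

def part_correlation(y, x, other):
    """Semipartial (part) correlation of y with x, partialling `other` out of x only."""
    A = np.column_stack([np.ones_like(other), other])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    x_resid = x - A @ coef          # the part of x not shared with `other`
    return np.corrcoef(y, x_resid)[0, 1]

X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
Y = np.array([3.0, 4.0, 6.0, 9.0, 11.0])

sr1 = part_correlation(Y, X1, X2)
sr2 = part_correlation(Y, X2, X1)

# R squared from the full two-predictor model
X = np.column_stack([np.ones(5), X1, X2])
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ b
r_squared = np.sum((Y_pred - Y.mean()) ** 2) / np.sum((Y - Y.mean()) ** 2)

# Shared variance = R squared - (sr1^2 + sr2^2)
shared = r_squared - (sr1 ** 2 + sr2 ** 2)
```

For these data the two predictors overlap substantially, so the shared component is large and positive; with suppressor variables it can be negative.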
