lecture 4- multiple regression Flashcards

1
Q

describing correlation:

A

Correlation: relationship between 2 variables
* calculate r and R2
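As a quick sketch of the calculation (using made-up data, not anything from the lecture), r and r² can be computed directly with NumPy:

```python
import numpy as np

# Hypothetical example data (not from the lecture)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r = np.corrcoef(x, y)[0, 1]   # Pearson correlation coefficient
r_squared = r ** 2            # proportion of variance shared by x and y
```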

2
Q

predicting simple linear regression:

A

Simple linear regression: predicting one variable from another variable
* calculate R and R2

3
Q

describing partial correlation:

A

Partial correlation: relationship between 2
variables while accounting for another variable
or variables
* calculate partial r and R2
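One way to make "accounting for another variable" concrete is the residual method: regress each of the two variables on the control variable, then correlate the residuals. A minimal sketch with simulated data (all values hypothetical):

```python
import numpy as np

# Hypothetical data: x and y both influenced by a control variable z
rng = np.random.default_rng(0)
z = rng.normal(size=200)
x = z + rng.normal(scale=0.5, size=200)
y = z + rng.normal(scale=0.5, size=200)

def residuals(v, z):
    # Residuals of v after removing the linear effect of z
    slope, intercept = np.polyfit(z, v, 1)
    return v - (intercept + slope * z)

zero_order_r = np.corrcoef(x, y)[0, 1]  # ignores z: inflated by shared z
partial_r = np.corrcoef(residuals(x, z), residuals(y, z))[0, 1]  # z removed
```

Here x and y are related only through z, so the partial r should be near zero even though the zero-order r is large.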

4
Q

predicting multiple linear regression:

A

Multiple linear regression: predicting one variable
from 2+ other variables
* calculate multiple R and multiple R2

5
Q

least squares- regressions

A

simple linear regression equation: Y = a + bX + error
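A minimal sketch of estimating a and b by least squares (hypothetical data, roughly following Y = 1 + 2X plus noise):

```python
import numpy as np

# Hypothetical data roughly following Y = 1 + 2X + error
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.1, 10.8])

b, a = np.polyfit(x, y, 1)   # least-squares slope (b) and intercept (a)
y_hat = a + b * x            # predicted values
error = y - y_hat            # residuals: the part of Y not explained
```

With a least-squares fit the residuals sum to zero by construction.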

6
Q

variance accounted for (R2)- regressions

A

How well does variable X predict variable Y?
How much variance in Y can be predicted from X?

7
Q

equation for simple and multiple linear regression

A

simple: Ŷ = a + b1X1
multiple: Ŷ = a + b1X1 + b2X2 + b3X3 + … + bkXk
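The multiple equation can be fitted the same way by stacking the predictors into a design matrix; a sketch with two hypothetical predictors (k = 2):

```python
import numpy as np

# Hypothetical data generated from Y = 0.5 + 2*X1 - 1*X2 + small noise
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 0.5 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.1, size=100)

# Design matrix: a column of ones for the constant a, then x1 and x2
X = np.column_stack([np.ones_like(x1), x1, x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coeffs           # constant and partial regression coefficients
y_hat = X @ coeffs           # predicted values Ŷ
```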

8
Q

simple linear regression

A

R is the correlation between the criterion variable and a single predictor (ignoring all other possible predictors).
R2 (coefficient of determination) is the amount of variance explained by that single predictor.

9
Q

multiple regression:

A

Multiple R is the correlation between a criterion variable and multiple, weighted predictors (i.e., the effect of each
predictor after controlling for the effects of the other predictors).
Multiple R2 (coefficient of multiple determination) is the amount of variance explained by those multiple predictors.
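Multiple R can also be read as the correlation between the observed Y and the Ŷ predicted from all the weighted predictors together; a sketch with simulated data (values hypothetical):

```python
import numpy as np

# Hypothetical criterion built from two predictors plus noise
rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=(2, 100))
y = 1.0 + 0.8 * x1 + 0.5 * x2 + rng.normal(scale=1.0, size=100)

# Fit the weights, then correlate observed Y with predicted Ŷ
X = np.column_stack([np.ones(100), x1, x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coeffs

multiple_R = np.corrcoef(y, y_hat)[0, 1]  # multiple R
R2 = multiple_R ** 2                       # variance explained
```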

10
Q

simple and multiple linear regression

A

The multiple regression equation is an extension of the bivariate equation.
* the constant (a), which represents the value of Y when all predictor variables (x1, x2, x3…) are zero.
* There is a b weight for each of the predictors (x1, x2, x3…).
→ These are partial regression coefficients.
→ These weights represent the change in Y associated with a 1-unit change in a particular X,
when all other Xs are held constant

11
Q

Ŷ (Y with a hat/accent on top) denotes a predicted value of Y = there will be error in prediction

A
12
Q

error=

A

variance in the model that is not explained

13
Q

least squares:

A

find the regression line that provides the best prediction possible, i.e., a regression line that minimises error

14
Q

regressions=

A

how much of the variance in a data set is accounted for by the predictor(s)

15
Q

what are the three types of multiple regression analyses?

A

Simultaneous (direct entry) regression:
* all the variables are entered in together, irrespective of their absolute or relative importance
Hierarchical regression:
* you decide (you can enter variables in blocks, with your decisions being driven by previous research and hypotheses)
Stepwise (statistical) regression:
* Forward regression: your computer programme (e.g., SPSS) will find the single best predictor and enter it as
the first variable; the variable that accounts for the highest proportion of the remaining variance is entered
next and so on
* Backward regression: all variables are entered initially and the worst predictors (i.e., the predictors that
account for the least variance) are removed in turn
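A toy sketch of the forward-selection logic: at each step, greedily add the predictor that raises R² the most. This simplified version keeps adding until all predictors are entered, whereas SPSS stops according to entry criteria (e.g., F-to-enter); the data and variable names are made up:

```python
import numpy as np

def r2(predictors, y):
    """R² of a least-squares fit of y on the given predictors (with intercept)."""
    X = np.column_stack([np.ones(len(y)), predictors])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return 1 - resid.var() / y.var()

def forward_select(predictors, y, names):
    """Greedy forward regression: repeatedly add the predictor that raises R² most."""
    chosen, remaining = [], list(range(predictors.shape[1]))
    while remaining:
        scores = {j: r2(predictors[:, chosen + [j]], y) for j in remaining}
        best = max(scores, key=scores.get)
        chosen.append(best)
        remaining.remove(best)
    return [names[j] for j in chosen]

# Hypothetical data: x1 is by far the strongest predictor, x3 the weakest
rng = np.random.default_rng(3)
x1, x2, x3 = rng.normal(size=(3, 200))
y = 3.0 * x1 + 1.0 * x2 + 0.1 * x3 + rng.normal(size=200)

order = forward_select(np.column_stack([x1, x2, x3]), y, ["x1", "x2", "x3"])
```

With these coefficients the strongest predictor (x1) should be entered first.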
