SPSS: Multiple Regressions Flashcards

Formulas

1
Q

Types of Regression

A

Multiple Regression: A statistical technique used to make predictions about the scores on an outcome variable (y) based on the scores of multiple independent variables (x).
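
A minimal Python sketch of fitting a multiple regression, using made-up data and hypothetical variable names (stress, support, burnout); SPSS produces the equivalent output through Analyze > Regression > Linear.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: two predictors (x) and one outcome (y)
df = pd.DataFrame({
    "stress":  [12, 18, 9, 22, 15, 20, 7, 17],
    "support": [30, 22, 35, 18, 27, 20, 40, 25],
    "burnout": [40, 55, 33, 62, 47, 58, 28, 51],
})

X = sm.add_constant(df[["stress", "support"]])  # adds the constant (a)
model = sm.OLS(df["burnout"], X).fit()

print(model.params)      # a (const) and the b coefficient for each predictor
print(model.predict(X))  # the predicted y-hat values
```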

2
Q

Assumptions (Cohen 2013)

A

Independent Random Sampling
Linearity
Normality
Homoscedasticity

3
Q

Independent Random Sampling

A

The data points should be independent of each other

4
Q

Linearity

A

The assumption is that the best way to describe the data pattern is using a straight line.

5
Q

Normality

A

The assumption is that the population data points for both variables are normally distributed.

6
Q

Homoscedasticity

A

The assumption that for each possible x-value, the y-variable has the same variance in the population (same as homogeneity of variance in the independent samples t-test).
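
A rough Python sketch of informally checking the normality and homoscedasticity assumptions on the residuals of a simple fit; the simulated data and the |residual|-versus-x check are illustrative additions, not part of the cards.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)   # simulated predictor and outcome

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# Normality of residuals: Shapiro-Wilk (p > .05 is consistent with normality)
print(stats.shapiro(residuals))

# Homoscedasticity: a near-zero correlation between |residuals| and x
# suggests the spread of y around the line is similar across x-values
print(stats.pearsonr(np.abs(residuals), x))
```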

7
Q

Assumptions (Abu-Bader, 2010)

A

Scale of measurement for the variables: Factors, Criterion
Multicollinearity
Sample size

8
Q

Factors

A

Nominal (as dummy variables only), ordinal, interval, or ratio

9
Q

Criterion

A

Interval or Ratio

10
Q

Multicollinearity

A

Occurs when two or more predictor variables are too strongly correlated with each other
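
A minimal Python sketch of screening for multicollinearity with variance inflation factors (VIF), using hypothetical predictor names; a common rule of thumb flags VIF values above about 10 (tolerance below .10).

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical predictors
predictors = pd.DataFrame({
    "stress":  [12, 18, 9, 22, 15, 20, 7, 17],
    "support": [30, 22, 35, 18, 27, 20, 40, 25],
})
X = sm.add_constant(predictors)

# One VIF per predictor (the constant is skipped)
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, variance_inflation_factor(X.values, i))
```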

11
Q

Sample Size

A

Recommended minimum: N = 50 + 8m, where m is the number of factors (predictors)
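
A one-line check of the rule, worked for a hypothetical study with 3 factors.

```python
m = 3                      # number of factors (predictors), hypothetical
minimum_n = 50 + 8 * m     # 50 + 8*3
print(minimum_n)           # 74 cases recommended as a minimum
```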

12
Q

ŷ

A

The value for the outcome variable that we are trying to predict

13
Q

x

A

Value that we know for the predictor variable(s)

14
Q

i

A

The number of each factor; the subscript i identifies which predictor an x-value and its slope belong to

15
Q

a

A

The y-intercept: where the regression line crosses the y-axis; the predicted outcome value when x = 0 (also known as the constant or the regression constant).

16
Q

b

A

Slope of the regression equation; the angle of the regression line; the amount of change in the dependent variable for every one-unit change in the independent variable (also known as the unstandardized regression coefficient).

17
Q

Y-intercept formula

A

ayx = ȳ - byx(x̄)  (the mean of y minus the slope times the mean of x)

18
Q

Regression Equation

A

ŷ = byx(x) + ayx
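
A small worked sketch of the two preceding formulas with made-up numbers; the slope byx is simply assumed here (its own formula appears on a later card).

```python
y_mean = 20.0    # hypothetical mean of y
x_mean = 5.0     # hypothetical mean of x
b_yx = 2.5       # hypothetical slope

a_yx = y_mean - b_yx * x_mean   # y-intercept: 20 - 2.5*5 = 7.5
y_hat = b_yx * 8 + a_yx         # predicted y for x = 8: 2.5*8 + 7.5 = 27.5
print(a_yx, y_hat)
```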

19
Q

Regression Variance

A

The spread of the predicted y-values around the mean of y; quantifies how much of the variability in y the regression equation accounts for (how well it predicts the y-values):
Sum of Squares Regression / N

20
Q

Residual Variance

A

The spread of the observed y-values around their predicted y-values; quantifies the prediction error left over by the regression equation:
Sum of Squares Residual / N

21
Q

Slope Formula

A

byx = r(sy/sx)
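
A worked sketch of the slope formula with made-up summary statistics.

```python
r = 0.60    # hypothetical Pearson correlation between x and y
s_y = 4.0   # hypothetical standard deviation of y
s_x = 2.0   # hypothetical standard deviation of x

b_yx = r * (s_y / s_x)
print(b_yx)  # 0.60 * (4.0 / 2.0) = 1.2
```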

22
Q

Linear Regression Interpretation

A

Pearson’s r; the b coefficients (slopes); whether the model is statistically significant (F, df regression, df residual, p-value); coefficient of determination (R²); coefficient of nondetermination (1 - R²)
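
A Python sketch, with simulated data, of pulling these interpretation statistics out of a fitted model; SPSS reports the same values in its Model Summary, ANOVA, and Coefficients tables.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=50), rng.normal(size=50)
y = 1.5 * x1 - 0.8 * x2 + rng.normal(size=50)   # simulated outcome

X = sm.add_constant(np.column_stack([x1, x2]))
model = sm.OLS(y, X).fit()

# F, df regression, df residual, and the model p-value
print(model.fvalue, model.df_model, model.df_resid, model.f_pvalue)
print(model.rsquared, 1 - model.rsquared)   # R^2 and 1 - R^2
print(model.params[1:])                     # the b (slope) for each predictor
```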

23
Q

Pearson’s r

A

It is used to determine the RELATIONSHIP between two variables

24
Q

Total Variance

A

Quantifies the total spread/distance between the observed y-values and the mean of y:
Sum of Squares Total / N
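
A short Python sketch, with made-up data, computing the three variance pieces from this card and the regression and residual variance cards above, and showing that for a least-squares fit the total splits into the regression and residual parts.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # hypothetical predictor
y = np.array([10.0, 14.0, 9.0, 17.0, 15.0])  # hypothetical outcome
n = len(y)

b, a = np.polyfit(x, y, 1)   # slope and intercept of the least-squares line
y_hat = b * x + a            # predicted y-values

ss_total = np.sum((y - y.mean()) ** 2)      # observed y around the mean of y
ss_reg   = np.sum((y_hat - y.mean()) ** 2)  # predicted y around the mean of y
ss_res   = np.sum((y - y_hat) ** 2)         # observed y around predicted y

print(ss_total / n, ss_reg / n, ss_res / n)   # total, regression, residual variance
print(np.isclose(ss_total, ss_reg + ss_res))  # True: total = regression + residual
```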