SPSS: Multiple Regressions Flashcards
Formulas
Types of Regression
Multiple Regression: A statistical technique used to make predictions about the scores on an outcome variable (y) based on the scores of multiple independent variables (x).
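A minimal sketch, not part of the original flashcards, of what this looks like in practice: fitting a two-predictor regression by least squares in Python (all data values are invented for illustration).

# Minimal multiple-regression sketch; the scores below are invented for illustration.
import numpy as np

x1 = np.array([2.0, 4.0, 5.0, 7.0, 9.0])     # first predictor
x2 = np.array([1.0, 3.0, 2.0, 5.0, 6.0])     # second predictor
y  = np.array([4.0, 9.0, 10.0, 15.0, 19.0])  # outcome

# Design matrix with a leading column of 1s for the constant (a).
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares solution gives [a, b1, b2].
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coefs

# Predicted outcome scores: y-hat = a + b1*x1 + b2*x2
y_hat = X @ coefs
print(a, b1, b2)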
Assumptions (Cohen 2013)
Independent Random Sampling
Linearity
Normality
Homoscedasticity
Independent Random Sampling
The data points should be independent of each other
Linearity
The assumption is that the pattern in the data is best described by a straight line.
Normality
The assumption is that the population data points for both variables are normally distributed.
Homoscedasticity
The assumption that for each possible x-value, the y-variable has the same variance in the population (same as homogeneity of variance in the independent samples t-test).
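One informal way to check this assumption, sketched in Python with invented data: compare the spread of the residuals for cases with low versus high predicted values; roughly equal spread is consistent with homoscedasticity.

# Crude homoscedasticity check on invented data: residual spread for
# low vs. high fitted values should be roughly equal.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.3])

b, a = np.polyfit(x, y, 1)      # simple-regression slope and intercept
y_hat = a + b * x
residuals = y - y_hat

cutoff = np.median(y_hat)
low  = residuals[y_hat <= cutoff]
high = residuals[y_hat >  cutoff]
print(low.var(ddof=1), high.var(ddof=1))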
Assumptions (ABU-BADER, 2010)
Scale of measurement for the variables: Factors, Criterion
Multicollinearity
Sample size
Factors
Nominal (as dummy variables only), ordinal, interval, or ratio
Criterion
Interval or Ratio
Multicollinearity
Occurs when two or more predictor variables are too strongly correlated with each other
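A quick screen for this, sketched in Python with invented predictor scores: inspect the correlations among the predictors; very high values (a common rule of thumb is |r| around .80 or above) are a warning sign.

# Screen the predictors for multicollinearity via their intercorrelations.
# The scores are invented; x3 is deliberately almost a copy of x1.
import numpy as np

x1 = np.array([2.0, 4.0, 5.0, 7.0, 9.0, 11.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 6.0, 8.0])
x3 = np.array([2.1, 4.2, 5.1, 7.3, 9.2, 11.1])

r_matrix = np.corrcoef([x1, x2, x3])
print(np.round(r_matrix, 2))    # off-diagonal values near 1.0 flag trouble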
Sample Size
Recommended minimum sample size: 50 + 8m, where m = the number of factors
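A worked instance of this rule of thumb (the number of factors here is chosen only for illustration): with m = 4 predictors, the recommended minimum is 50 + 8(4) = 82 cases.

# Minimum recommended N under the 50 + 8m rule of thumb.
m = 4                  # number of factors (predictors); illustrative only
n_min = 50 + 8 * m
print(n_min)           # 82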
ŷ
The value for the outcome variable that we are trying to predict
x
Value that we know for the predictor variable(s)
i
The index number for each factor (e.g., x1, x2, ... xi)
a
Y-intercept: where the regression line crosses the y-axis; the value of the outcome when x = 0 (also known as the constant or the regression constant).
b
Slope of the regression equation; the steepness of the regression line; the amount of change in the dependent variable for every one-unit change in the independent variable (also known as the unstandardized regression coefficient).
Y-intercept formula
ayx = ȳ − byx(x̄)
Regression Equation
ŷ = byx(x) + ayx
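A worked example tying the last two cards together (all numbers are invented): with a mean of 4 on x, a mean of 10 on y, and a slope of 2, the intercept is 10 − 2(4) = 2, so a case with x = 6 is predicted to score ŷ = 2(6) + 2 = 14.

# Intercept from the means and slope, then a prediction; numbers are invented.
x_mean, y_mean = 4.0, 10.0
b_yx = 2.0                        # slope, assumed known here

a_yx = y_mean - b_yx * x_mean     # y-intercept formula: a = mean of y - b * mean of x
y_hat = b_yx * 6.0 + a_yx         # regression equation for a case with x = 6
print(a_yx, y_hat)                # 2.0 14.0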
Regression Variance
The spread of the predicted y-values around the mean of y; quantifies how much of the variability in y the regression equation accounts for (how well the equation predicts the y-values):
Sum of Squares Regression / N
Residual Variance
The spread of the observed y-values around the predicted y-values; quantifies the prediction error the regression equation leaves unexplained:
Sum of Squares Residual / N
Slope Formula
byx = r(sy / sx)
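A sketch (invented data) that computes the slope from the correlation and the two standard deviations, then checks it against a direct least-squares fit.

# Slope from the correlation and standard deviations: byx = r(sy/sx).
# The data are invented for illustration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.5, 5.5, 8.0, 10.0])

r = np.corrcoef(x, y)[0, 1]
b_yx = r * (y.std(ddof=1) / x.std(ddof=1))

b_check, a_check = np.polyfit(x, y, 1)   # direct fit should give the same slope
print(b_yx, b_check)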
Linear Regression Interpretation
Pearson’s r; Bonferroni (slope); whether the model is statistically significant (F, df regression, df residual, p-value); coefficient of determination (R^2); coefficient of nondetermination (1 − R^2)
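A sketch of pulling these interpretation pieces from a fitted model in Python; the data are invented, and statsmodels is just one way to get the F test, degrees of freedom, p-value, and R^2 outside of SPSS.

# Obtain F, df regression, df residual, p-value, and R^2 for an invented data set.
import numpy as np
import statsmodels.api as sm

x1 = np.array([2.0, 4.0, 5.0, 7.0, 9.0, 11.0, 3.0, 6.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 6.0, 8.0, 2.0, 4.0])
y  = np.array([4.0, 9.0, 10.0, 15.0, 19.0, 23.0, 7.0, 13.0])

X = sm.add_constant(np.column_stack([x1, x2]))
results = sm.OLS(y, X).fit()

print(results.fvalue, results.df_model, results.df_resid, results.f_pvalue)
print(results.rsquared, 1 - results.rsquared)   # R^2 and 1 - R^2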
Pearson’s r
It is used to determine the strength and direction of the RELATIONSHIP between two variables
Total Variance
Quantifies the total spread of the observed y-values around the mean of y:
Sum of Squares Total / N
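A sketch (invented data) showing how the last few variance cards fit together: the total sum of squares splits into the regression and residual sums of squares, and R^2 = SS regression / SS total.

# Variance decomposition for a simple regression on invented data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.0, 4.5, 5.5, 8.0, 10.0, 11.5])
n = len(y)

b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

ss_total      = np.sum((y - y.mean()) ** 2)      # observed y around the mean of y
ss_regression = np.sum((y_hat - y.mean()) ** 2)  # predicted y around the mean of y
ss_residual   = np.sum((y - y_hat) ** 2)         # observed y around predicted y

print(ss_total, ss_regression + ss_residual)     # equal, up to rounding
print(ss_regression / n, ss_residual / n)        # regression and residual variance
print(ss_regression / ss_total)                  # R^2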