Simple -> Multiple Regression Flashcards

1
Q

SIMPLE REGRESSION

A

R^2 - GOODNESS OF FIT
- for simple regression, R^2 = square of the correlation coefficient (r)
- reflects the proportion of variance in the data accounted for by the best-fit line
- takes values between 0 (0%) and 1 (100%)
- frequently expressed as a percentage rather than a decimal
- high values indicate a good fit; low values a poor fit
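The equivalence on this card can be checked numerically. A minimal pure-Python sketch (made-up data, no stats libraries assumed): it computes r directly, fits the least-squares line, and shows that r^2 matches the proportion of variance accounted for by that line.

```python
import math

# Illustrative sketch with made-up data: for simple regression,
# R^2 equals the squared Pearson correlation between X and Y.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

r = sxy / math.sqrt(sxx * syy)  # correlation coefficient

# Least-squares best-fit line: y = a + b*x
b = sxy / sxx
a = my - b * mx

# R^2 as "variance accounted for by the best-fit line"
ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
r_squared = 1 - ss_res / syy

print(round(r ** 2, 6), round(r_squared, 6))  # the two values agree
```

The agreement is exact for ordinary least squares, which is why the two definitions on this card are interchangeable in simple regression.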

2
Q

LOW R^2 VALUES

A
  • R^2 = 0
  • 0% = randomly scattered points; no clear relation between X and Y
  • implies that a best-fit line will be a very poor description of the data
  • conversely, a good best-fit line = a high proportion of variance explained
3
Q

HIGH R^2 VALUES

A
  • R^2 = 1 (effectively impossible in practice!)
  • 100% = points lie directly on the line; perfect relation between X and Y
  • implies the best-fit line = a very good description of the data
  • conversely, a moderate best-fit line = less variance explained
5
Q

SIGNIFICANCE TESTS

A
  • reported in SPSS output
    SIMPLE REGRESSION
  • t-test; establishes whether the model describes a statistically significant (statsig) proportion of the variance in the data
    MULTIPLE REGRESSION
  • uses ANOVA to establish whether the proportion of variance in the data explained by the model is statsig
6
Q

MULTIPLE REGRESSION

A

R^2 - GOODNESS OF FIT
- R^2 never decreases, and usually gets larger, every time another IV (regressor/predictor) is added to the model
- new regressors may provide only a small improvement in the amount of variance in the data explained by the model
- need to establish the “added value” of each additional regressor in predicting the DV

7
Q

EFFECTIVENESS (VS EFFICIENCY)

A
  • includes all possible contributory causes (IVs)
  • explains the largest possible proportion of variance in the DV/outcome
  • maximises R^2 (ie. maximises the proportion of variance explained by the model)
8
Q

EFFICIENCY (VS EFFECTIVENESS)

A
  • only includes the most important variables (IVs)
  • each retained regressor gives a worthwhile step increase in adjusted R^2 (R^2adj)
  • maximises the increase in R^2adj upon adding another regressor (ie. if a new regressor doesn’t add much to the variance explained, it isn’t worth adding)
9
Q

EFFECTIVENESS SCALE

A

0-25%
- very poor; likely unacceptable
25-50%
- poor BUT may be acceptable
50-75%
- good
75-90%
- very good
>90%
- likely there’s something wrong w/analysis
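The rubric above can be encoded as a simple lookup. A hypothetical helper (the function name and exact boundary handling are illustrative choices, not part of the card):

```python
def effectiveness_verdict(r2_percent: float) -> str:
    """Map % variance explained onto the rough rubric above (illustrative)."""
    if r2_percent > 90:
        return "suspicious - check the analysis"
    if r2_percent > 75:
        return "very good"
    if r2_percent > 50:
        return "good"
    if r2_percent > 25:
        return "poor but may be acceptable"
    return "very poor; likely unacceptable"

print(effectiveness_verdict(62))
```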

10
Q

ARE REGRESSORS STATSIG ASSOCIATED W/DV?

A
  • ANOVA test checks if the model (as a whole) has a statsig relation w/the DV
  • part of the predictive value of each regressor may be shared by one or more of the other regressors in the model
  • aka. the model must be considered as a whole (ie. all IVs together)
  • read off the ANOVA table in SPSS output; report as in ANOVA (ie. F(3,12) = 4.33; p = .028)
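The overall-model F in that ANOVA table is a function of R^2, the sample size n, and the number of regressors k. A sketch of the standard relationship (the R^2 = 0.52, n = 16 values are chosen only to roughly reproduce the card's example report):

```python
def model_f(r2: float, n: int, k: int) -> tuple[float, int, int]:
    """Overall-model F statistic from R^2, sample size n, k regressors.

    This is the F reported in the regression ANOVA table,
    with degrees of freedom (k, n - k - 1).
    """
    df1, df2 = k, n - k - 1
    f = (r2 / df1) / ((1 - r2) / df2)
    return f, df1, df2

# With R^2 = 0.52, n = 16, k = 3 this gives approximately the card's
# example report, F(3, 12) = 4.33.
f, df1, df2 = model_f(0.52, 16, 3)
print(f"F({df1},{df2}) = {f:.2f}")
```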
11
Q

INDIVIDUAL REGRESSORS x DV

A
  • SPSS output table entitled COEFFICIENTS
  • column headed “unstandardised coefficients - B” gives the regression coefficient for each regressor variable (IV)
  • units of the coefficient = same as for the regressor (IV)
  • interpretation assumes all other variables are held CONSTANT
12
Q

REGRESSOR W/GREATEST EFFECT ON DV

A
  • units for each regression coefficient differ, aka. we must standardise them to compare
  • column headed “standardised coefficients - beta”
  • can compare beta weights across regressor variables (IVs) to compare the effect of each on the DV
  • a larger beta weight indicates a stronger effect of that regressor on DV values
13
Q

ARE REGRESSOR RELATIONS W/DV STATSIG?

A
  • assessed using a t-test
  • check the values in the columns headed t and sig
  • if the regression coefficient is negative, the t-value will also be negative (the sign doesn’t matter; the size of t is what’s important)