Week 1 - GLM Flashcards

1
Q

3 things modelling consists of

A

X variables
Relationship among X variables (correlation)
Relationship between the X and Y variables

2
Q

Regression Model

A

X variables are jointly and simultaneously related to Y

The X's form a linear weighted composite (a sum of the X's, each weighted by its b)

Assumes linearity (a straight-line relationship, NOT curvilinear - covered later)

Predicted score and residual are uncorrelated (the systematic and error parts are not correlated)

Relationship between the X's and Y is additive (the weighted X's combine by summing)
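
For illustration (not part of the original card), the linear weighted composite can be written as follows, assuming k predictors with weights b_1 … b_k and an intercept b_0:

```latex
\hat{Y}_i = b_0 + b_1 X_{i1} + b_2 X_{i2} + \dots + b_k X_{ik}
```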

3
Q

Basis of GLM

A

Model + error = data
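
Spelled out for a single case i (an illustrative rendering in the same notation as the card above):

```latex
\underbrace{Y_i}_{\text{data}} = \underbrace{b_0 + b_1 X_{i1} + \dots + b_k X_{ik}}_{\text{model}} + \underbrace{e_i}_{\text{error}}
```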

4
Q

Error (Residual)

A

The part of the score left over after the systematic variation due to the X variables has been accounted for

The baseline against which the strength of the X-Y relationship is tested in significance testing

Want the error as small as possible (it should be, if the model is correctly specified)
- It should then contain only random/measurement error (this only happens if all systematic variation in Y is accounted for)

Not linearly related to the X's
- uncorrelated with the X's and not accounted for by the model
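
A small numpy sketch of these points using hypothetical toy data: least squares splits each score into a systematic part and a residual, and the residual comes out (numerically) uncorrelated with the X's and with the predicted scores.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))                            # two toy predictors
y = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), X])                  # add an intercept column
b = np.linalg.lstsq(X1, y, rcond=None)[0]              # least-squares b-weights

y_hat = X1 @ b                                         # model: the systematic part
e = y - y_hat                                          # error: the leftover part

print(np.corrcoef(e, X[:, 0])[0, 1])                   # ~ 0
print(np.corrcoef(e, X[:, 1])[0, 1])                   # ~ 0
print(np.corrcoef(e, y_hat)[0, 1])                     # ~ 0
```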

5
Q

Additivity GLM

A

Utilised in hierarchical modelling (predictors can be added in blocks across several regressions because their contributions to the model sum)

6
Q

GLM in Regression

A

Model = Systematic relation among X’s

Error = random, uncorrelated individual differences

7
Q

Important to consider in GLM regression

A

Specification of model

Mis-specification leads to lower reliability and raises the question of whether the model is truly systematic

Importance of theory to highlight important factors

8
Q

What are the 4 stages of Box and Jenkins' linear modelling?

A

1) Model specification
2) Parameter estimation
3) Model checking and fit assessment
4) Prediction
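
A minimal sketch of the four stages using Python's statsmodels, with hypothetical variable names and simulated data standing in for a real dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data (hypothetical): y depends on two predictors, x1 and x2
rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = 2 + 0.6 * df.x1 - 0.4 * df.x2 + rng.normal(size=200)

# 1) Model specification: choose which constructs enter the model
model = smf.ols("y ~ x1 + x2", data=df)

# 2) Parameter estimation: derive the b-weights (and R, F)
fit = model.fit()

# 3) Model checking / fit assessment: how much variance is accounted for?
print(fit.rsquared)

# 4) Prediction: apply the estimated parameters to new observations
new_cases = pd.DataFrame({"x1": [0.5, -1.0], "x2": [0.2, 0.3]})
print(fit.predict(new_cases))
```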

9
Q

1) Model Specification

A

Choose and specify constructs to be included in the model

Specification error is the most serious error you can make

  • Leaving out important variables or including variables that are irrelevant
  • Every other error can be fixed, but not this one
10
Q

2) Parameter estimation

A

Deriving the F, R, b-weights
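
Continuing the statsmodels sketch under card 8 (so `fit` is the hypothetical fitted results object from that example), these attributes hold the estimates in question:

```python
print(fit.params)                  # b-weights: intercept and partial slopes
print(fit.rsquared)                # R^2 (multiple R is its square root)
print(fit.fvalue, fit.f_pvalue)    # omnibus F test for the whole model
```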

11
Q

3) Model Checking/ fit assessment

A

Rarely performed in social science

Look at the R² (how much variance is accounted for by the model)

12
Q

4) Prediction

A

Never done in social science

Seek to predict new data with existing parameters

Does the observed Y fit with the model's predictors and the predicted relationship?

Use data to test our prediction

13
Q

Incorrectly specified model

A

The error term may contain the systematic effect of a variable that has been left out of the model

14
Q

Consequences of incorrectly specified model

A

Inflates the error

Inflates the b-weights (makes it look like there is a bigger effect than there really is)

15
Q

How to increase statistical control

A

Add control predictors to the model to take their systematic variance out of the error term
- The variance that the control variables share with Y is partialed out (the other variables can then exert a unique effect on Y whilst controlling for these nuisance variables)
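
A hypothetical simulation of this partialling (it also illustrates card 14's point): without the control, the nuisance variance sits in the error term and the focal b-weight is inflated; adding the control shrinks the error and pulls the b-weight toward its unique effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
z = rng.normal(size=n)                        # nuisance (control) variable
x = 0.7 * z + rng.normal(size=n)              # focal predictor, correlated with z
y = 0.5 * x + 0.8 * z + rng.normal(size=n)    # x's unique effect is 0.5
df = pd.DataFrame({"y": y, "x": x, "z": z})

no_control = smf.ols("y ~ x", data=df).fit()
with_control = smf.ols("y ~ x + z", data=df).fit()

print(no_control.params["x"], no_control.mse_resid)      # inflated b, larger error
print(with_control.params["x"], with_control.mse_resid)  # b near 0.5, smaller error
```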

16
Q

What are the three types of Multiple regression?

A

1) Simultaneous (entry of all predictors at once; forced entry)
2) Hierarchical (uses the additive property of the GLM to assess predictors over several regressions)
3) Stepwise regression (used in data mining and marketing to find the best predictors for a given sample)
- Uses statistical rules rather than theory
- Consecutively removes predictors from the model so long as R² does not drop by more than a predetermined amount

17
Q

What is b-weight?

A

Partial slope

The effect of that particular X on the outcome, adjusting for every other X in the model
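
A numpy sketch (toy data, hypothetical names) of why this is called a partial slope: residualise the focal X on the other X's, and the simple slope of Y on that residual equals the multiple-regression b-weight (the Frisch-Waugh result).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)                 # correlated predictors
y = 1.0 + 0.4 * x1 + 0.7 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]           # multiple-regression b-weights

Z = np.column_stack([np.ones(n), x2])              # the "other" predictors
x1_res = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]   # residualised x1

partial_slope = np.dot(x1_res, y) / np.dot(x1_res, x1_res)
print(b[1], partial_slope)                         # the two agree
```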

18
Q

What happens when predictors highly correlated?

A

Can partial each other out
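
An illustrative simulation (hypothetical data) with two nearly duplicate predictors: each correlates strongly with Y on its own, yet adding the second barely raises R², and neither may show a significant unique b-weight once the other is in the model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 500
common = rng.normal(size=n)
x1 = common + 0.1 * rng.normal(size=n)
x2 = common + 0.1 * rng.normal(size=n)            # r(x1, x2) is close to 1
y = common + rng.normal(size=n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

print(df.corr()["y"])                             # strong zero-order correlations
only_x1 = smf.ols("y ~ x1", data=df).fit()
both = smf.ols("y ~ x1 + x2", data=df).fit()
print(only_x1.rsquared, both.rsquared)            # R^2 barely changes
print(both.pvalues[["x1", "x2"]])                 # unique effects largely partialed out
```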

19
Q

Model Comparison in Hierarchical

A

Compare models to explain Y

A new predictor must account for a part of Y not explained by the previous predictors in the nested model

Each new regression starts with the residualised Y outcome from the previous block

R² becomes the change in R² (ΔR²) across models (blocks)
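
A minimal sketch of a two-block hierarchical comparison with statsmodels (hypothetical variable names): fit the nested models, take the change in R² across blocks, and test the gain with an F-change test.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
df = pd.DataFrame({"control": rng.normal(size=n), "focal": rng.normal(size=n)})
df["y"] = 0.5 * df.control + 0.4 * df.focal + rng.normal(size=n)

block1 = smf.ols("y ~ control", data=df).fit()            # block 1: control only
block2 = smf.ols("y ~ control + focal", data=df).fit()    # block 2: add focal predictor

delta_r2 = block2.rsquared - block1.rsquared               # change in R^2 across blocks
f_change, p_change, df_diff = block2.compare_f_test(block1)
print(delta_r2, f_change, p_change)
```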