The Linear Model Flashcards

1
Q

Why do we want to fit models?

A

To make predictions; everything we do in statistics is just a variation on this theme

2
Q

How do we generalise our model?

A

We work on a small sample and develop our model from it, hoping it represents the larger population. For example, you wouldn’t just turn up and build a bridge: you plan it, build a model (a small bridge) and see how it performs under different conditions, which tells you how it would work in the real world

3
Q

What are statistical tests?

A

Special cases of the linear model; you can run any test as a linear model and end up with the same results

4
Q

What is the equation of a straight line?

A

outcome = (b0 + b1X) + error
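
As a quick illustration, a minimal Python sketch with made-up values for b0 and b1 (not from any real data), showing how the equation turns each predictor score into a prediction and how the error is whatever is left over:

```python
# Hypothetical parameter estimates (illustrative only)
b0 = 2.0   # intercept: predicted outcome when X = 0
b1 = 0.5   # slope: change in outcome per one-unit change in X

x_values = [1, 2, 3, 4]          # predictor scores
observed = [2.4, 3.1, 3.3, 4.2]  # made-up observed outcomes

for x, y in zip(x_values, observed):
    predicted = b0 + b1 * x      # outcome = b0 + b1X
    error = y - predicted        # error = observed - predicted
    print(f"X={x}: predicted={predicted:.1f}, observed={y}, error={error:+.1f}")
```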

5
Q

What does B1 represent?

A

Estimate of the parameter for the predictor:
the direction/strength of the relationship
the difference in group means (when the predictor is categorical)

6
Q

What does B0 represent?

A

The intercept: the estimate of the value of the outcome when the predictor(s) are 0

7
Q

What does X represent?

A

The predictors

8
Q

What is the mean?

A

A very simple model, with one parameter and no predictors. It is not a value actually observed, so there will be error

9
Q

What does the mean give rise to?

A

The smallest deviations: it has the least squared error of any single value

10
Q

Why does the mean have the least squared error?

A

It is the value from which all of the scores deviate the least
estimation is based on minimising error

11
Q

How to estimate squared errors?

A

Take the difference between each raw score and the mean

square these differences and add them up
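
A small Python sketch of this calculation with made-up scores, which also shows that the sum of squared errors is smallest when the mean is used as the model:

```python
scores = [2, 4, 4, 6, 9]           # made-up raw scores
mean = sum(scores) / len(scores)   # the model: one parameter, no predictors

def sum_squared_error(centre):
    # difference between each raw score and the centre, squared and added up
    return sum((x - centre) ** 2 for x in scores)

print("SSE around the mean:", sum_squared_error(mean))      # 28.0
print("SSE around mean + 1:", sum_squared_error(mean + 1))  # larger
print("SSE around mean - 1:", sum_squared_error(mean - 1))  # larger
```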

12
Q

What do most models use?

A

Ordinary least squares (OLS), which minimises the squared error
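
A minimal sketch of ordinary least squares for a single predictor, using the standard closed-form estimates (the data here are made up): the slope is the covariance of X and Y divided by the variance of X, and the intercept follows from the means.

```python
x = [1, 2, 3, 4, 5]            # made-up predictor scores
y = [2.1, 2.9, 4.2, 4.8, 6.1]  # made-up outcome scores

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# OLS estimates: the values of b0 and b1 that minimise the sum of squared errors
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x

print(f"b0 = {b0:.2f}, b1 = {b1:.2f}")  # b0 = 1.05, b1 = 0.99
```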

13
Q

What is it called when you extend the model to more than one predictor?

A

Multiple regression

14
Q

How do you enter predictors into the model?

A

Hierarchical - best way - as the researcher, you decide what goes in first; good for theory testing and building on past knowledge rather than just guessing

Forced entry - all predictors are entered at the same time (like ingredients in a cake)

Stepwise - predictors are selected using their semi-partial correlation with the outcome; SPSS picks whichever makes the biggest contribution; a non-human decision, only for exploratory analyses; once one predictor is in, it affects all of the others

15
Q

What does b represent as a rate of change?

A

As the predictor increases by one unit, how much does the outcome increase?
The change in outcome associated with a unit change in the predictor

16
Q

What are standardised parameters?

A

Parameters expressed in standard deviation units, which makes them comparable because they are on the same scale
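
A brief sketch of the usual conversion (an assumption, not stated on the card): multiply the unstandardised b by the ratio of the predictor's standard deviation to the outcome's standard deviation, which re-expresses it as the change in the outcome, in standard deviations, for a one-standard-deviation change in the predictor.

```python
import statistics

x = [1, 2, 3, 4, 5]            # made-up predictor scores
y = [2.1, 2.9, 4.2, 4.8, 6.1]  # made-up outcome scores
b1 = 0.99                      # unstandardised slope from the OLS sketch above

# Standardised parameter: change in outcome (in SDs) per one-SD change in predictor
beta = b1 * statistics.stdev(x) / statistics.stdev(y)
print(f"standardised beta = {beta:.3f}")  # with one predictor this equals r
```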

17
Q

What does deviation refer to?

A

The observed value minus the predicted value

18
Q

Why is adding up each person's error problematic?

A

Because the positive and negative errors will cancel each other out

19
Q

Solution to adding up each person's error

A

Square each value

20
Q

What is the sum of squared errors?

A

The total of all the squared errors in the data set:
find the deviation
square each deviation
add all of these up

21
Q

What is wrong with the sum of squared errors?

A

It depends on how many scores there are in the data set, so data sets with different numbers of scores cannot be compared fairly

22
Q

What do we use instead of sum of squared errors?

A

Mean squared error/variance

23
Q

How do you calculate mean squared error/variance?

A

Sum of squares divided by degrees of freedom
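
A small sketch of that calculation with made-up scores, taking the degrees of freedom as n - 1 because the deviations are measured from the sample mean:

```python
scores = [2, 4, 4, 6, 9]   # made-up data
n = len(scores)
mean = sum(scores) / n

sum_of_squares = sum((x - mean) ** 2 for x in scores)
df = n - 1                        # degrees of freedom
variance = sum_of_squares / df    # mean squared error around the mean

print(f"SS = {sum_of_squares}, df = {df}, variance = {variance}")  # SS = 28.0, df = 4, variance = 7.0
```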

24
Q

What does the variance refer to?

A

The average squared error between the mean and the observations made

25
Q

What are degrees of freedom?

A

The number of scores that are free to vary (to take any value they want)
the last score in a data set is not free to vary, because it has to be the one that makes the scores fit the mean

26
Q

What is SST?

A

Total variability between the scores and the mean

the squared deviations from the mean, added up

27
Q

What is SSR?

A

The variability between the model and the data, the error in the model

28
Q

What is SSM/regression?

A

Variability between the model and the mean

the improvement due to the model

29
Q

If the model is better than the mean, what should we expect?

A

A bigger SSM compared to SSR

30
Q

How to calculate F?

A

MSM divided by MSR - how good it is compared to how bad it is
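
To pull cards 26-30 together, a sketch with made-up data that fits a one-predictor line by least squares and then works out SST, SSR, SSM, and F (the degrees of freedom here assume one predictor, so k = 1 and n - k - 1 for the residuals):

```python
x = [1, 2, 3, 4, 5]            # made-up predictor scores
y = [2.1, 2.9, 4.2, 4.8, 6.1]  # made-up observed outcomes
n = len(y)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Fit the line by ordinary least squares so the sums of squares are consistent
b1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
     / sum((xi - mean_x) ** 2 for xi in x)
b0 = mean_y - b1 * mean_x
y_hat = [b0 + b1 * xi for xi in x]

ss_t = sum((yi - mean_y) ** 2 for yi in y)               # SST: scores vs the mean
ss_r = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # SSR: scores vs the model
ss_m = sum((yh - mean_y) ** 2 for yh in y_hat)           # SSM: model vs the mean

k = 1                         # number of predictors
ms_m = ss_m / k               # MSM
ms_r = ss_r / (n - k - 1)     # MSR
f_ratio = ms_m / ms_r         # how good the model is vs how bad it is

print(f"SST={ss_t:.2f}  SSR={ss_r:.2f}  SSM={ss_m:.2f}  F={f_ratio:.1f}")
```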

31
Q

What does R mean?

A

Correlation between predictors and outcome

32
Q

What does R squared mean?

A

Variability accounted for by the model
SSM divided by SST
the squared correlation between observed and predicted scores
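
Continuing with the same made-up fit as in the sketch after card 30 (values repeated here so it runs on its own; statistics.correlation needs Python 3.10+), a quick check that SSM divided by SST matches the squared correlation between observed and predicted scores:

```python
import statistics

y = [2.1, 2.9, 4.2, 4.8, 6.1]           # observed outcomes (as above)
y_hat = [2.04, 3.03, 4.02, 5.01, 6.00]  # fitted values from the least-squares line above
mean_y = sum(y) / len(y)

ss_t = sum((yi - mean_y) ** 2 for yi in y)
ss_m = sum((yh - mean_y) ** 2 for yh in y_hat)

r_squared = ss_m / ss_t                   # SSM divided by SST
r = statistics.correlation(y, y_hat)      # correlation of observed with predicted
print(f"R^2 = {r_squared:.3f}, r(observed, predicted)^2 = {r ** 2:.3f}")
```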

33
Q

What does R squared adjusted mean?

A

An estimate of R square in the population

34
Q

What does R square change mean?

A

How much model fit improves as more predictors are added

35
Q

What is the F change?

A

Tests whether the improvement in the model (the change in R squared) when new predictors are added is significant
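
A sketch of the usual F-change calculation for comparing a smaller model with a larger one (the numbers here are hypothetical): divide the improvement in R squared per added predictor by the variance the larger model still leaves unexplained per residual degree of freedom.

```python
# Hypothetical values: a model with 1 predictor extended to 3 predictors
n = 100                     # sample size
r2_old, k_old = 0.20, 1     # R squared and predictor count of the smaller model
r2_new, k_new = 0.30, 3     # R squared and predictor count of the larger model

k_added = k_new - k_old
# Improvement per added predictor, relative to what the larger model leaves unexplained
f_change = ((r2_new - r2_old) / k_added) / ((1 - r2_new) / (n - k_new - 1))
print(f"R^2 change = {r2_new - r2_old:.2f}, F change = {f_change:.2f}")
```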

36
Q

What does T mean?

A

Evaluates each individual predictor, to see which ones make a useful (significant) contribution

37
Q

What does P mean?

A

The probability of getting a test statistic at least as big as the one you have, if the null hypothesis is true