Logistic Regression and an Introduction to Hierarchical Analysis Flashcards

1
Q

Simplest Linear Model

A

Just a constant term
Returns the mean of our data
Positive beta = positive mean (a change from zero)
Beta divided by its standard error gives a t-statistic for testing this
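A minimal sketch of this intercept-only model in Python (the data values here are made up for illustration):

```python
import numpy as np
from scipy import stats

y = np.array([2.1, 1.8, 2.5, 2.0, 2.3])   # hypothetical scores

beta0 = y.mean()                           # the constant term is the sample mean
se = y.std(ddof=1) / np.sqrt(len(y))       # standard error of that beta
t = beta0 / se                             # beta divided by its standard error
p = 2 * stats.t.sf(abs(t), df=len(y) - 1)  # two-tailed test against zero
print(beta0, t, p)
```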

2
Q

One Continuous Regressor

A

Add one more term, returning the intercept and slope of our graph
Explains variation in the score
Beta value divided by its standard error tests whether the slope differs significantly from zero
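A sketch of the one-regressor model using statsmodels (simulated data; the coefficient values are assumptions for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 50)                    # one continuous regressor
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, 50)   # score with a known slope of 2

X = sm.add_constant(x)                       # design matrix: intercept + slope
fit = sm.OLS(y, X).fit()
print(fit.params)                            # [intercept, slope]
print(fit.tvalues)                           # each beta / its standard error
```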

3
Q

Multiple Continuous Regressors

A

Add more terms
Returns each slope controlling for the effect of the other regressors
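A sketch showing that each beta is estimated controlling for the other regressors, even when the regressors are correlated (simulated data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.5 * x1 + rng.normal(size=100)             # deliberately correlated with x1
y = 1.0 + 2.0 * x1 + 0.0 * x2 + rng.normal(size=100)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # close to [1, 2, 0]: x2's slope is ~0 once x1 is controlled for
```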

4
Q

Multiple Categories

A

Add a ‘dummy variable’ for each additional category
Returns the mean for the reference category, and the difference from this reference for each other category
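A sketch of dummy coding with a statsmodels formula (group names and scores are made up); C(group) expands the factor into dummy variables with the first level as the reference:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "score": [2.0, 2.2, 1.9, 3.1, 3.0, 3.3, 4.0, 4.2, 3.9],
    "group": ["a"] * 3 + ["b"] * 3 + ["c"] * 3,
})
fit = smf.ols("score ~ C(group)", data=df).fit()
print(fit.params)
# Intercept                       = mean of the reference category "a"
# C(group)[T.b] and C(group)[T.c] = each group's difference from that reference
```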

5
Q

Multiple Independent Variables

A

Add extra sets of ‘dummy variables’ for each additional discrete factor, plus their interaction terms
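A sketch of a two-factor design; in a statsmodels formula, `*` adds dummy variables for each factor plus their interaction (the factor names and values are invented):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "score": [2.1, 2.3, 3.0, 3.2, 2.0, 2.2, 4.1, 4.3],
    "drug":  ["placebo", "placebo", "active", "active"] * 2,
    "dose":  ["low"] * 4 + ["high"] * 4,
})
fit = smf.ols("score ~ C(drug) * C(dose)", data=df).fit()
print(fit.params)   # main effects of each factor plus the interaction term
```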

6
Q

Logistic Regression

A

p(y = right) = 1 / (1 + exp(-(B0 + B1 × coherence)))
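A direct translation of this equation into Python (the parameter values here are assumptions for illustration):

```python
import numpy as np

def p_right(coherence, b0, b1):
    """p(y = right) = 1 / (1 + exp(-(b0 + b1 * coherence)))."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * coherence)))

coherence = np.linspace(-1, 1, 5)
print(p_right(coherence, b0=0.0, b1=4.0))   # rises smoothly from near 0 to near 1
```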

7
Q

Probability to Odds

A

Odds = p / (1 - p)
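For example, if p = 0.75 then odds = 0.75 / 0.25 = 3 (3-to-1 in favour of the outcome).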

8
Q

Log Odds

A

Log odds of a binary outcome can be modelled with a straight line
Logarithm of odds
Continuous function
Log odds = log(p / (1 - p))
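Continuing the example above, for p = 0.75 the log odds are log(3) ≈ 1.10; at p = 0.5 they are 0, and they head toward minus and plus infinity as p approaches 0 or 1, mapping probabilities onto the whole real line where a straight line can model them.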

9
Q

2 Ways of Expressing the Same Idea

A

The sigmoid form, p(y = right) = 1 / (1 + exp(-(B0 + B1 × coherence))), and the linear form, log odds(y = right) = B0 + B1 × coherence, express the same model
B1 = impact of coherence
B0 = intercept term, which biases choices toward one option
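A quick numerical check that the two forms agree (the parameter values are arbitrary):

```python
import numpy as np

b0, b1, coherence = 0.5, 4.0, 0.2

# Way 1: probability from the sigmoid
p = 1.0 / (1.0 + np.exp(-(b0 + b1 * coherence)))

# Way 2: the log odds of that probability equal the linear predictor
print(np.log(p / (1 - p)), b0 + b1 * coherence)   # identical
```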

10
Q

Summary

A

When the outcome variable is binary rather than continuous, we use logistic regression to predict outcomes
Equivalent to a linear model predicting the log odds of the probability of the outcome
The log odds transform is the 'link function'
Generalises linear models to predict outcomes that are binary in nature
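A sketch of this equivalence in statsmodels: fitting a logistic regression directly, and fitting a GLM with a binomial family (whose default link is the logit, i.e. the log odds), gives the same estimates. The data are simulated:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
coherence = rng.uniform(-1, 1, 200)
p = 1.0 / (1.0 + np.exp(-(0.5 + 4.0 * coherence)))
y = rng.binomial(1, p)                     # binary outcomes

X = sm.add_constant(coherence)
logit_fit = sm.Logit(y, X).fit(disp=0)     # logistic regression
glm_fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()  # GLM, logit link
print(logit_fit.params, glm_fit.params)    # same coefficients
```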

11
Q

Aggregate Information across Participants

A

Estimate models ‘hierarchically’
Advantages - interpretability, can apply to any 'summary statistic' from each participant, computational simplicity
Disadvantages - requires many observations at the 'first level' (each participant), assumes each subject is equally reliable, doesn't explicitly account for 'correlated observations' within participants
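A minimal sketch of this summary-statistic approach with simulated data: fit a slope per participant at the first level, then test those slopes across participants at the second level:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
slopes = []
for subj in range(20):                              # 20 simulated participants
    x = rng.uniform(-1, 1, 100)                     # many first-level observations
    true_slope = rng.normal(2.0, 0.5)               # slope varies across people
    y = 1.0 + true_slope * x + rng.normal(0, 1, 100)
    slopes.append(np.polyfit(x, y, 1)[0])           # per-subject summary statistic

t, p = stats.ttest_1samp(slopes, 0.0)               # second-level inference
print(np.mean(slopes), t, p)
```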

12
Q

Conclusion

A

Repeat the experiment across many different individuals
Aggregate information across individuals by performing a 'hierarchical' analysis - estimate a GLM for each subject at the first level, and then take the parameters of this model to the second level, performing inference across the population
Straightforward and usually valid
Some limitations are dealt with by a more sophisticated method of aggregating information across participants, the mixed-effects model
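For comparison, a sketch of a mixed-effects model on the same kind of data, using statsmodels' MixedLM with a random intercept per subject (a simple choice; a random slope could be added too):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for subj in range(20):
    x = rng.uniform(-1, 1, 30)
    y = rng.normal(1.0, 0.5) + 2.0 * x + rng.normal(0, 1, 30)  # subject-specific intercept
    rows.append(pd.DataFrame({"subject": subj, "x": x, "y": y}))
df = pd.concat(rows, ignore_index=True)

# The grouping structure models correlated observations within participants
fit = smf.mixedlm("y ~ x", df, groups=df["subject"]).fit()
print(fit.params)
```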
