Logistic Regression Flashcards

1
Q

How do we fix the problem of using categorical data?

A

We predict the probability of Y rather than Y itself.

2
Q

What do we have to do next when we are predicting the probability of Y?

A

We have to linearise the S-shaped curve so that we have a straight line rather than a curve!
This transformation changes how the results are interpreted.

3
Q

What does ML stand for?

A

Maximum likelihood.

4
Q

What is Y in the logistic regression equation?

A

It is log(Odds), i.e. log(p / (1 - p)).
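A minimal sketch of this transformation in Python (the helper names are my own, and the probabilities are illustrative):

```python
import math

def logit(p):
    """Transform a probability p into log(odds) -- the Y of the equation."""
    return math.log(p / (1 - p))

def inv_logit(y):
    """Map log(odds) back to a probability (the S-shaped curve)."""
    return 1 / (1 + math.exp(-y))

print(logit(0.5))    # 0.0 -- even odds give a log(odds) of zero
print(inv_logit(0))  # 0.5 -- the two transforms undo each other
```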

5
Q

How do you conduct a logistic regression in SPSS?

A

We don’t need to know this for the exam.

6
Q

What is the first step in interpreting the SPSS results?

A

We check the null model (block 0).

7
Q

What is the null model?

A

It is the model without including any predictor.

This value has to be beat when we include the predictors.

8
Q

What are the two Pseudo R-squareds that we use for model fit, and what are their ranges?

A

Cox & Snell - ranges from 0 to 0.75.
Nagelkerke - ranges from 0 to 1.

Both measure how well you can predict the DV from the IVs.
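Both pseudo R-squareds come from comparing the log-likelihoods of the null and full models; a sketch with made-up log-likelihood values:

```python
import math

def pseudo_r2(ll_null, ll_model, n):
    """Cox & Snell and Nagelkerke pseudo R-squared from log-likelihoods."""
    cox_snell = 1 - math.exp(2 * (ll_null - ll_model) / n)
    max_cox_snell = 1 - math.exp(2 * ll_null / n)  # upper bound, below 1
    nagelkerke = cox_snell / max_cox_snell         # rescaled to top out at 1
    return cox_snell, nagelkerke

# Illustrative values: null log-likelihood -70, full model -55, n = 100
cs, nk = pseudo_r2(-70.0, -55.0, 100)
print(cs, nk)  # Nagelkerke is always the larger of the two
```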

9
Q

What is the other model fit we can use and how do we know if the model fits or not?

A

Goodness of fit - the Hosmer-Lemeshow test.

Its p-value must be non-significant (p > .05) for the model to fit.

10
Q

What must Block 1 overall percentage be for the results to be significant?

A

It must be larger than the Block 0 overall percentage.

11
Q

How do we test for multicollinearity in logistic regression?

A

We can’t do it within logistic regression… We have to run the original model as a linear regression and check the VIF values there.
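The VIFs can also be computed by hand: they are the diagonal of the inverse of the predictors' correlation matrix. A sketch with simulated data (the variable names and the 0.9 correlation are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=200)  # nearly a copy of x1
x3 = rng.normal(size=200)                   # unrelated predictor

X = np.column_stack([x1, x2, x3])
corr = np.corrcoef(X, rowvar=False)
vif = np.diag(np.linalg.inv(corr))  # VIF_j = 1 / (1 - R_j^2)

print(vif)  # the collinear pair x1, x2 gets very large VIFs; x3 stays near 1
```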

12
Q

How do we test for linearity in logistic regression?

A

Compute the natural log of each predictor and test its interaction with the original predictor (the Box-Tidwell approach).
A significant interaction = a linearity problem.
Look at the significance of the predictor-by-log-transformed-predictor term in the table.

13
Q

What is B in the SPSS table?

A

Log(Odds) - the coefficient on the log-odds scale.

14
Q

How do we interpret B?

A

It is the change in Log(Odds) for a one-unit increase in our predictor variable.
Not very intuitive.

15
Q

What should we interpret instead of B?

A

It is better to interpret the results in terms of probabilities.

16
Q

What is OddsRatio?

A

It is the change in odds resulting from a one-unit change in the predictor.

17
Q

How do we interpret OddsRatio?

A

If greater than 1: as the predictor increases, the odds of being in the target group (coded as 1) increase. A significant p = the increase is significant.
If smaller than 1: as the predictor increases, the odds of being in the target group decrease. A significant p = the decrease is significant.
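A worked example with made-up probabilities, showing where an OddsRatio above 1 comes from:

```python
# Made-up probabilities of being in the target group (coded 1)
p_low  = 0.20                      # at predictor value x
p_high = 0.33                      # at predictor value x + 1

odds_low  = p_low / (1 - p_low)    # 0.25
odds_high = p_high / (1 - p_high)  # about 0.49
odds_ratio = odds_high / odds_low  # > 1: odds of the target group rise with x

print(odds_ratio > 1)  # True
```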

18
Q

What is the OddsRatio if the probability is the same to be in any of the groups?

A

1.

19
Q

What number must not be between the lower and upper CI?

A

1 (in all the other sessions the value was 0!!!).

20
Q

What is OddsRatio in the SPSS table?

A

Exp(B).

21
Q

How do you calculate Odds?

A

p / (1 - p).

22
Q

How do you calculate OddsRatio?

A

e to the power of B.
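The two formulas above as code (the coefficient value is illustrative):

```python
import math

def odds(p):
    """Odds = p / (1 - p)."""
    return p / (1 - p)

B = math.log(2)            # illustrative coefficient from the SPSS B column
odds_ratio = math.exp(B)   # e to the power of B = Exp(B) in the SPSS output

print(odds(0.5))           # 1.0 -- even odds
print(round(odds_ratio))   # 2 -- the odds double per unit increase
```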

23
Q

What can Logistic Regression be used for?

A

It forms the building blocks of deep learning networks, and is used to study neural pathways and AI.

24
Q

What two assumptions aren’t necessary in Logistic Regression?

A

The error term doesn’t need to be normally distributed.

Homoscedasticity is not required.

25
Q

When do we use a Logistic Regression?

A

When the outcome (DV) is categorical.