Week 9 day 2 Flashcards

1
Q

When do we do a multiple linear regression?

A

When we want to see whether there is a relationship between one numeric outcome variable and more than one numeric predictor variable.

2
Q

What kind of analysis do we need to do if we have more than one predictor variable, all numeric, and the outcome is also numeric?

A

Multiple regression.

3
Q

If you have known predictor variables for a given outcome and you do not do a multiple regression, what are you risking?

A

You are risking failing to take into account how those other predictor variables explain the variation in the outcome. In other words, you will end up misinterpreting how a given predictor is correlated with a given outcome, e.g. attributing variance to it that really belongs to an omitted predictor.

4
Q

What are the models used to describe a correlation between:
1. One outcome variable and one predictor variable (continuous variables) - the data appear to have a linear relationship.
2. One outcome variable and one predictor variable (continuous variables) - the data appear to have a non-linear but monotonic relationship.
3. One outcome variable and two predictor variables (all numeric), with no interaction term.
4. One outcome variable and two predictor variables (all numeric), with an interaction term.

A
  1. Pearson’s r correlation - line of best fit for the raw data.
  2. Spearman’s correlation - line of best fit for the ranked data.
  3. Multiple regression - plane of best fit.
  4. Multiple regression - curved plane of best fit.
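Cases 3 and 4 can be sketched numerically. A minimal numpy illustration with invented data (the coefficients and noise level are made up for the example): the "curved plane" model simply adds the product of the two predictors as an extra column.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 10, n)
# Made-up outcome with a genuine interaction between the two predictors.
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x2 + rng.normal(0, 0.5, n)

# Plane of best fit: intercept + two predictors, no interaction term.
X_plane = np.column_stack([np.ones(n), x1, x2])
coef_plane, *_ = np.linalg.lstsq(X_plane, y, rcond=None)

# Curved plane of best fit: add the product of the predictors (the interaction).
X_curved = np.column_stack([np.ones(n), x1, x2, x1 * x2])
coef_curved, *_ = np.linalg.lstsq(X_curved, y, rcond=None)
```

With the interaction column included, the fitted surface is allowed to twist: the slope with respect to one predictor now depends on the value of the other.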
5
Q

What do interactions tell us?

A

They tell us that in order to know what one predictor does to the outcome variable we need to know the value of the other predictor.

6
Q

What is the null hypothesis for a regression analysis?

A

The null hypothesis in a regression is that there is no relationship between the outcome and the predictor, and therefore y = intercept + error. In other words, whatever the value of the predictor variable, the outcome stays the same.

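A one-line sketch of what the null model looks like in practice (invented numbers): under the null, least squares makes the intercept the mean of the outcome, so the prediction is the same whatever the predictor's value.

```python
import numpy as np

# Made-up outcome values for illustration.
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# Null model: y = intercept + error. The least-squares intercept is the mean of y,
# so the prediction is identical for every value of the predictor.
null_prediction = np.full_like(y, y.mean())
```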
7
Q

What is the test stat for regression?

A

F stat.

8
Q

How is the F stat for regression calculated? What is it a ratio between?

A

It is a ratio between the model sum of squares and the residual sum of squares (each divided by its degrees of freedom): F = (model SS / model df) / (residual SS / residual df).

Model sum of squares is the sum of squared deviations between the model’s predicted values and the mean of the outcome variable. (df = number of predictors)

Residual sum of squares is the sum of squared deviations between each data point and the line of best fit. (df = N - number of predictors - 1)

The residuals capture what our model does NOT capture.

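The calculation can be sketched in numpy with made-up data (a simple one-predictor regression, so model df = 1):

```python
import numpy as np

# Invented data: one numeric predictor, one numeric outcome.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# Line of best fit by least squares.
slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

# Model SS: squared deviations of the fitted values from the mean of y.
ss_model = np.sum((y_hat - y.mean()) ** 2)
# Residual SS: squared deviations of the data from the line of best fit.
ss_resid = np.sum((y - y_hat) ** 2)

df_model = 1                       # number of predictors
df_resid = len(y) - df_model - 1   # N - predictors - 1

# F is the ratio of the two mean squares.
F = (ss_model / df_model) / (ss_resid / df_resid)
```

Note that model SS and residual SS add up to the total sum of squares (deviations of the data from the mean), which is why the residuals capture exactly what the model does not.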
9
Q

In ANOVA we have a sum of squares between and sum of squares within.
What are their analogues in a regression?

A

Model sum of squares (the analogue of sum of squares between) and residual sum of squares (the analogue of sum of squares within).

10
Q

How do we determine whether a specific predictor is significant or not?

A

A t-test compares the actual coefficient from the line of best fit against the null hypothesis that the coefficient is zero (t = coefficient / standard error of the coefficient).

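A minimal numpy sketch of that t statistic for a one-predictor regression (invented data; the standard-error formula is the usual one for a simple regression slope):

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)

# Standard error of the slope under the usual regression assumptions.
df_resid = len(y) - 2                     # N - predictors - 1
sigma2 = np.sum(resid ** 2) / df_resid    # residual variance
se_slope = np.sqrt(sigma2 / np.sum((x - x.mean()) ** 2))

# t compares the fitted coefficient against the null value of zero.
t_slope = (slope - 0.0) / se_slope
```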
11
Q

Why do we have to worry less about doing multiple comparisons in a regression than we do in an ANOVA?

A

Because we are doing fewer tests, and the tests we are doing are usually theoretically motivated: the only reason we are doing a regression in the first place is that we think there may be some relationship between the predictor(s) and the outcome.

12
Q

What is the measure of effect size used for regression?

A

R-squared.
It captures the proportion of the variance in the outcome accounted for by the model (the line/plane of best fit).
R-squared = 1 - residual sum of squares / (residual sum of squares + model sum of squares).

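The formula can be sketched directly from the two sums of squares (invented data, one-predictor fit):

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

ss_model = np.sum((y_hat - y.mean()) ** 2)
ss_resid = np.sum((y - y_hat) ** 2)

# Proportion of variance in the outcome accounted for by the line of best fit.
r_squared = 1 - ss_resid / (ss_resid + ss_model)
```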
13
Q

When is R-squared 1?
When is R-squared 0?

A

R-squared is 1 when the residual sum of squares is 0: the line/plane of best fit passes through every data point, so the model accounts for all of the variance in the outcome.
R-squared is 0 when the model sum of squares is 0: the model accounts for none of the variance, i.e. it predicts no better than the mean of the outcome.
14
Q

When we do a correlation test, like Pearson’s r, are we just doing a regression with one predictor?

A

Yes.

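This can be checked numerically: for a single predictor, Pearson’s r squared equals the R-squared of the regression. A small numpy sketch with invented data:

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# Pearson's r between x and y.
r = np.corrcoef(x, y)[0, 1]

# R-squared of the one-predictor regression of y on x.
slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x
ss_resid = np.sum((y - y_hat) ** 2)
ss_total = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_resid / ss_total

# r ** 2 and r_squared agree (up to floating-point error).
```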
15
Q

What are standardised coefficients in regression models and when do we use them?

A

Standardised coefficients are the regression coefficients obtained when all variables are standardised (converted to z-scores) before fitting, so each coefficient is expressed in standard-deviation units. We use them when we want to compare the relative strength of predictors that are measured on different scales.