Regression Flashcards

1
Q

What does a correlation allow us to do?

A

It allows us to quantify (measure) the strength and direction of a relationship between two variables.

2
Q

What does a Pearson’s correlation assume, and what are its outputs?

A

A Pearson’s correlation assumes that the relationship is linear (a straight line). It reports a t-value, r (cor), and a p-value; only r and p need to be reported.

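A minimal sketch of running this in R, using made-up vectors x and y:

x <- c(1, 2, 3, 4, 5, 6, 7, 8)
y <- c(2.1, 2.9, 4.2, 4.8, 6.1, 6.8, 8.3, 8.9)
cor.test(x, y)   # method defaults to "pearson"
# Output includes t, df, a p-value, and the sample estimate cor (r);
# only r and p need to be reported.
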
3
Q

How does a Spearman’s test work? What does it assume?

A

A Spearman’s test assumes the relationship is monotonic. It ranks the x and y values in ascending order and correlates the ranks. It prints an S-value, rho, and a p-value.

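A minimal sketch in R with the same made-up vectors; the only change from Pearson’s is the method argument:

x <- c(1, 2, 3, 4, 5, 6, 7, 8)
y <- c(2.1, 2.9, 4.2, 4.8, 6.1, 6.8, 8.3, 8.9)
cor.test(x, y, method = "spearman")
# Output includes S, a p-value, and the sample estimate rho.
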
4
Q

What are the main characteristics of a regression line?

A

Outcome variable (Y) = slope × X + intercept.

5
Q

What is the slope and what is the intercept?

A

The intercept is the value at which the regression line begins on the y-axis. The slope is how much (+ or -) y is predicted to change for every one-unit increase along the x-axis. If x = zero, y = the intercept.

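A minimal sketch in R with made-up data, showing where the intercept and slope come from:

x <- c(1, 2, 3, 4, 5, 6)
y <- c(3.2, 5.1, 6.9, 9.2, 10.8, 13.1)
fit <- lm(y ~ x)
coef(fit)                                  # (Intercept) and the slope for x
predict(fit, newdata = data.frame(x = 0))  # when x = 0, the prediction equals the intercept
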
6
Q

Why do we always square the differences in our test equations?

A

So our positives and negatives don’t cancel out.

7
Q

What other name is the slope known as?

A

The regression coefficient (b). An lm() function will print the intercept and a coefficient (slope) for each predictor, labelled by the predictor’s name (e.g. ‘diversity’).

8
Q

What is a regression line called when it has one outcome variable and two predictor variables?

A

A multiple regression (each predictor gets its own regression coefficient).

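A minimal sketch in R of a multiple regression; the data and variable names (x1, x2, y) are made up:

set.seed(1)
dat <- data.frame(x1 = rnorm(40), x2 = rnorm(40))
dat$y <- 2 + 0.8 * dat$x1 - 0.5 * dat$x2 + rnorm(40)
fit <- lm(y ~ x1 + x2, data = dat)
summary(fit)   # one intercept plus a regression coefficient (slope) per predictor
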
9
Q

Why does variation allocation change when you add new predictors?

A

R now has the data to express more accurately which variables account for the variation; the additional variables claim some of the variation.

The same applies to ANOVA.

10
Q

What does an interaction variable indicate?

A

That the effect of variable 1 depends on variable 2.

11
Q

What do positive and negative interactions do to a regression sheet (3D), and what do they imply for the relationship?

A

Positive interaction: bends the sheet up. As V1 increases, the effect of V2 on the outcome becomes stronger.

Negative interaction: bends the sheet down. As V1 increases, the effect of V2 on the outcome becomes weaker.

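A minimal sketch in R of fitting an interaction; the data are made up. The formula y ~ x1 * x2 fits both main effects plus the x1:x2 interaction:

set.seed(2)
dat <- data.frame(x1 = rnorm(60), x2 = rnorm(60))
dat$y <- 1 + 0.5 * dat$x1 + 0.3 * dat$x2 + 0.7 * dat$x1 * dat$x2 + rnorm(60)
fit <- lm(y ~ x1 * x2, data = dat)
summary(fit)   # the x1:x2 row is the interaction; a positive estimate bends the sheet up,
               # a negative estimate bends it down
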
12
Q

What does the printed F-statistic tell us in regression?

A

It represents how well the whole model accounts for the variation (model variance relative to residual variance).

13
Q

What does a t-statistic tell us in regression?

A

How likely it would be to get a regression slope that large for that variable if the null were true. Null = no difference between the regression line and the mean of Y (a ‘flat line’).

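A minimal sketch in R (made-up data) showing where the t- and F-statistics appear in the model output:

set.seed(3)
x <- rnorm(30)
y <- 2 + 0.6 * x + rnorm(30)
fit <- lm(y ~ x)
summary(fit)
# Coefficients table: a t value (and p-value) for each coefficient against the null of 0
# Last line: the F-statistic, its degrees of freedom, and the p-value for the whole model
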
14
Q

What are the two sources of variance used to calculate if a regression model is significant?

A

Model Sum of Squares (SSmod)
Residual Sum of Squares (SSres)

15
Q

What is the (SSmod) calculating?

A

The difference between the regression line’s predictions and the mean of Y. The null predicts no difference between the regression line and the mean.

df = k

16
Q

What is the (SSres) calculating?

A

The difference between the data and regression line predictions

17
Q

How are the df’s calculated for (SSmod) and (SSres), and what are these figures analogous to?

A

(SSmod): df = k, analogous to SSb (between-groups).
(SSres): df = N - k - 1, analogous to SSw (within-groups).

18
Q

Taking (SSmod) and (SSres), how would they most likely appear if the model were significant?

A

Similar to ANOVA, we would want a high (SSmod), indicating that the regression line is more different from the mean of Y, and a low (SSres), meaning our regression line more accurately captures the data.

Low (SSres) = less noise
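
A worked sketch in R (made-up data) of the two sums of squares and the F-ratio built from them:

set.seed(4)
x <- rnorm(30)
y <- 1 + 0.7 * x + rnorm(30)
fit <- lm(y ~ x)
ss_mod <- sum((fitted(fit) - mean(y))^2)   # model SS: predictions vs the mean of Y
ss_res <- sum((y - fitted(fit))^2)         # residual SS: data vs predictions
k <- 1
n <- length(y)
(ss_mod / k) / (ss_res / (n - k - 1))      # matches the F-statistic in summary(fit)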

19
Q

What does the effect size in a regression model represent?

R^2

A

How well the overall regression model actually accounts for the outcome variable.

20
Q

How is effect size calculated for regression?

A

R^2 = 1 - (SSres / SStotal)
Zero = the model does nothing
One = the model is a perfect fit

Printed as ‘Multiple R-squared’ in the model summary.
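
A minimal sketch in R (made-up data) computing R^2 by hand and checking it against the printed value:

set.seed(5)
x <- rnorm(30)
y <- 3 + 0.5 * x + rnorm(30)
fit <- lm(y ~ x)
ss_res <- sum(residuals(fit)^2)
ss_tot <- sum((y - mean(y))^2)
1 - ss_res / ss_tot   # same value as summary(fit)$r.squared ("Multiple R-squared")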

21
Q

How else can you get R^2?

A

Pearson’s correlation (r) squared.

R^2 is also achieved by 1 - (SSres / SStotal).
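
A quick check in R (made-up data) that squaring Pearson’s r reproduces R^2 for a single-predictor model:

set.seed(6)
x <- rnorm(30)
y <- 3 + 0.5 * x + rnorm(30)
fit <- lm(y ~ x)
cor(x, y)^2            # Pearson's r, squared
summary(fit)$r.squared # same number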

22
Q

What do you need to do before comparing the effects of different variables?

A

Convert them into z-scores.

23
Q

What is a standardised coefficient?

A

Variables that have been converted into z-scores. Denoted as beta (β).

Not to be confused with b (the slope).

24
Q

Why are standardised coefficients useful?

A

Standardising our coefficients to a z-score scale allows us to compare the effect of each variable.
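
A minimal sketch in R (made-up data and variable names) of getting beta weights by z-scoring each variable with scale():

set.seed(7)
dat <- data.frame(x1 = rnorm(50), x2 = rnorm(50) * 10)
dat$y <- 2 + 0.4 * dat$x1 + 0.03 * dat$x2 + rnorm(50)
fit_std <- lm(scale(y) ~ scale(x1) + scale(x2), data = dat)
coef(fit_std)   # the slopes are now standardised betas, directly comparable across predictors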