Linear regression and correlation Flashcards

1
Q

Fitting a line to a scatterplot?

A

Method of least squares: the differences between the points and the line can be positive or negative, so they are squared (similar to the standard deviation). Choose the line with the smallest sum of squared differences.
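
A minimal sketch of the least-squares calculation, using made-up height/FEV1 numbers (the data and variable names are illustrative, not taken from the card):

```python
import numpy as np

# Illustrative data, not the original FEV1/height sample
height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

# Least squares: choose a and b to minimise the sum of squared vertical
# differences between the points and the line y = a + b*x.
x_bar, y_bar = height.mean(), fev1.mean()
b = np.sum((height - x_bar) * (fev1 - y_bar)) / np.sum((height - x_bar) ** 2)
a = y_bar - b * x_bar

residuals = fev1 - (a + b * height)
print(a, b, np.sum(residuals ** 2))  # no other line gives a smaller sum of squares
```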

2
Q

In regression, does σ of FEV1 (Y) depend on height (X axis)?

A

No: the SD is assumed to be constant for all heights (i.e. the spread of FEV1 about the mean FEV1 is the same at every height).

3
Q

In regression, is it true that the FEV1 of a given student depends on his height?

A

No - only that the mean FEV1 of the population depends linearly on the other variable. The Y in the regression equation is a mean; the X is an actual value. Consider the FEV1s for the population of students with a given height and assume that this mean varies linearly with height.

4
Q

Three parameters to estimate in regression?

A

σ (SD about the line), α (intercept) and β (slope). Just as for a single variable we estimate the population mean μ and SD σ, here the mean is α + βx; α and β are population parameters and cannot be known. Instead, use the sample to fit a line y = a + bx and treat a and b as sample estimates. Also estimate σ about the line using s. The intercept a is on the same scale as y so has the same units; b is in units of y per unit of x.
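
A sketch of estimating all three parameters from a sample (illustrative data; s uses the usual n − 2 denominator):

```python
import numpy as np

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])
n = len(height)

# a and b are the sample estimates of the population intercept alpha and slope beta
b = (np.sum((height - height.mean()) * (fev1 - fev1.mean()))
     / np.sum((height - height.mean()) ** 2))
a = fev1.mean() - b * height.mean()

# s estimates sigma, the SD of FEV1 about the fitted line
residuals = fev1 - (a + b * height)
s = np.sqrt(np.sum(residuals ** 2) / (n - 2))

print(f"a = {a:.2f} L, b = {b:.4f} L per cm, s = {s:.2f} L")
```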

5
Q

b and β?

A

Just as we use a SE to see whether m is a good estimate of μ, we can see whether b is a good estimate of β. Again a SE is used, but its formula is more complicated.
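
A sketch of the standard error of the slope, assuming the usual formula SE(b) = s / √Σ(x − x̄)² (illustrative data):

```python
import numpy as np

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])
n = len(height)

sxx = np.sum((height - height.mean()) ** 2)
b = np.sum((height - height.mean()) * (fev1 - fev1.mean())) / sxx
a = fev1.mean() - b * height.mean()
s = np.sqrt(np.sum((fev1 - (a + b * height)) ** 2) / (n - 2))

se_b = s / np.sqrt(sxx)  # standard error of b as an estimate of beta
print(f"b = {b:.4f}, SE(b) = {se_b:.4f}")
```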

6
Q

Hypothesis testing in regression?

A

Has a limited role; regression is more about how good the estimate of β is. The one hypothesis of interest is whether β = 0, because if β = 0 then the mean of Y does not change with the x variable (i.e. there is no association).
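
A sketch of the test of β = 0 as a t test with n − 2 degrees of freedom (illustrative data; this is the standard formula-based version, not any particular package's output):

```python
import numpy as np
from scipy import stats

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])
n = len(height)

sxx = np.sum((height - height.mean()) ** 2)
b = np.sum((height - height.mean()) * (fev1 - fev1.mean())) / sxx
a = fev1.mean() - b * height.mean()
s = np.sqrt(np.sum((fev1 - (a + b * height)) ** 2) / (n - 2))
se_b = s / np.sqrt(sxx)

t = b / se_b                          # test statistic for H0: beta = 0
p = 2 * stats.t.sf(abs(t), df=n - 2)  # two-sided P value
print(f"t = {t:.2f}, P = {p:.4g}")
```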

7
Q

Regression in Minitab?

A

Skip all but "the regression equation is". Minitab prints "FEV1" when it should really be mean FEV1. Also look at Predictor, Constant and Ht (x); Constant = intercept. The SE of the Ht coefficient is listed under SE Coef, along with a P value for Ht. The next part of the output is S, the same as our s, i.e. the estimate of the SD about the fitted line (the spread of FEV1 about the line).

In summary, the output gives the estimated slope and intercept (under Coef), the SE of the slope (under SE Coef), the P value for β = 0, and the SD about the line (s).
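
For comparison, a sketch of the same quantities from statsmodels in Python (an analogue of the Minitab output described above, not Minitab itself; the data are made up):

```python
import numpy as np
import statsmodels.api as sm

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

model = sm.OLS(fev1, sm.add_constant(height)).fit()
print(model.summary())        # coefficient table with SEs and P values

print(model.params)           # intercept and slope ("Coef")
print(model.bse)              # their standard errors ("SE Coef")
print(model.pvalues)          # P values, including the test of beta = 0
print(np.sqrt(model.scale))   # s, the SD of the points about the fitted line
```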

8
Q

How can the intercept be negative?

A

Because the intercept is the mean FEV1 when height = 0, which obviously does not correspond to any real student. For this reason the intercept is only important for drawing the fitted line.

9
Q

What does a P value of 0.005 mean for the hypothesis β = 0?

A

It means there is very strong evidence that mean FEV1 does depend on height. It does not mean that FEV1 can be determined once height is known, just that mean FEV1 changes with height.

10
Q

“Naming” regression?

A

Regression of y on x (seeing how mean y varies with x).

11
Q

What does s account for in regression?

A

The variation left in the FEV1 values AFTER height has been accounted for.

12
Q

Clinical applications of regression?

A

Some important variables may be very difficult or invasive to measure, so predicting them from other, related variables can be helpful. In practice, natural variability often gives wide limits.

13
Q

Single-value estimate of FEV1 for a height of 180 cm?

A

Mean FEV1 = α + βh, but we must use a and b: the mean FEV1 for a student of this height is -9.19 + 0.0744 × 180 ≈ 4.20 L. (With the slope rounded to 0.07 the arithmetic gives 3.41 L, so the unrounded slope is needed to reproduce 4.20 L.)
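
The arithmetic as a short snippet (the intercept is the card's quoted value; the extra decimal places on the slope are inferred so the result reproduces 4.20 L):

```python
a = -9.19      # intercept in litres, as quoted
b = 0.0744     # slope in litres per cm, inferred from -9.19 + 180*b = 4.20
height_cm = 180

mean_fev1 = a + b * height_cm   # point estimate of mean FEV1 at this height
print(f"{mean_fev1:.2f} L")     # about 4.20 L
```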

14
Q

Intervals in regression?

A

Before, we acknowledged the uncertainty in m as an estimate of μ by computing an interval within which we expect μ to lie. These intervals narrow as the sample from which they are calculated grows (the estimate of μ becomes more precise). The same applies in regression: estimating the mean using a + bh means that the uncertainty in a and b gives uncertainty in the estimate at height h (4.20 L). The formula is more complicated, but the intervals still get narrower as the sample size increases.

15
Q

Why are confidence intervals not appropriate for predicting an individual's FEV1?

A

Because when estimating the FEV1 of one person of a given height, making the sample as large as possible will not make the interval indefinitely smaller: the natural variation between individuals remains. Therefore we cannot use a formula that allows the sample size to shrink the interval towards 0.

16
Q

What interval is used instead of confidence intervals?

A

Prediction intervals. Confidence intervals can still be calculated (indicating, say, that the mean FEV1 for students with height 180 cm is 3.8-4.6 L with 95% confidence), but the prediction interval indicates that 95% of students with height 180 cm have FEV1 2.9-5.5 L. Confidence limits curve because the estimate of mean FEV1 is better near the "centre" of the data; prediction limits are nearly straight because they depend mainly on the intrinsic variability, which is constant, and their slight curvature comes from incorporating the uncertainty in the fitted line.
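
A sketch of both interval types for a new height using statsmodels (illustrative data, so the numbers will not match the 3.8-4.6 and 2.9-5.5 intervals quoted above; the mean_ci_* columns are the confidence interval for the mean, obs_ci_* the prediction interval for an individual):

```python
import numpy as np
import statsmodels.api as sm

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

model = sm.OLS(fev1, sm.add_constant(height)).fit()

new_height = sm.add_constant(np.array([180.0]), has_constant="add")
frame = model.get_prediction(new_height).summary_frame(alpha=0.05)
print(frame[["mean", "mean_ci_lower", "mean_ci_upper"]])  # 95% CI for the mean FEV1
print(frame[["obs_ci_lower", "obs_ci_upper"]])            # 95% prediction interval
```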

17
Q

How do confidence intervals and prediction intervals change with sample size?

A

Confidence intervals will get much smaller; prediction intervals will change only slightly.

18
Q

Pitfalls in using regression?

A

The fitted line often merely describes the relationship between the two variables in the sample itself; it is therefore unwise to use the line for other, superficially similar data, e.g. children, older males or female students. Also be wary of outliers, as they can alter the estimates.

19
Q

Why can regression not simply be reversed?

A

The equation is MEAN FEV1 = a + bh, so we cannot simply swap h and FEV1. The regression of each variable on the other is different, and it is important to choose whichever is appropriate.
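
A sketch showing that the two regressions really are different lines: the slope of FEV1 on height is not simply the reciprocal of the slope of height on FEV1, and the product of the two slopes equals r² (a standard identity):

```python
import numpy as np

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

sxy = np.sum((height - height.mean()) * (fev1 - fev1.mean()))
b_y_on_x = sxy / np.sum((height - height.mean()) ** 2)  # regression of FEV1 on height
b_x_on_y = sxy / np.sum((fev1 - fev1.mean()) ** 2)      # regression of height on FEV1

r = np.corrcoef(height, fev1)[0, 1]
print(b_y_on_x, 1 / b_x_on_y)       # different slopes on the FEV1-vs-height scale
print(b_y_on_x * b_x_on_y, r ** 2)  # product of the two slopes equals r squared
```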

20
Q

Assumptions in regression?

A

That the mean of the y variable at a given value of the x variable changes linearly with x; that the spread of the data about this line is constant and so does not change as x changes; and that the DEVIATIONS from this line follow a normal distribution (important when calculating confidence or prediction intervals or hypothesis tests).

21
Q

Assessing the linearity assumption in regression?

A

Draw a scatterplot and judge by eye whether linearity is plausible.

22
Q

Assessing whether the spread about the line in regression is constant?

A

Use residuals (the vertical distance of a point from the fitted line): positive if above the line, negative if below. If the fitted line truly reflects the structure of the data then the residuals are a sample from a distribution with population mean 0, and they should all have the same SD. Check this by plotting the residuals against the height of each individual; the spread should change little with height.
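
A sketch of this residual check (illustrative data; the residuals are plotted against height and should show roughly constant scatter about zero):

```python
import numpy as np
import matplotlib.pyplot as plt

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

b = (np.sum((height - height.mean()) * (fev1 - fev1.mean()))
     / np.sum((height - height.mean()) ** 2))
a = fev1.mean() - b * height.mean()
residuals = fev1 - (a + b * height)  # positive above the line, negative below

plt.scatter(height, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("Height (cm)")
plt.ylabel("Residual (L)")
plt.show()                           # look for roughly constant spread across heights
```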

23
Q

Assessing the normality of deviations from the line?

A

As residuals = deviations from the line, this means checking that the residuals are from a common normal distribution. The best way of doing this is using a normal probability plot for the residuals.
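
A sketch of the normal probability plot of the residuals using scipy (points lying close to the reference line suggest the normality assumption is reasonable):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

b = (np.sum((height - height.mean()) * (fev1 - fev1.mean()))
     / np.sum((height - height.mean()) ** 2))
a = fev1.mean() - b * height.mean()
residuals = fev1 - (a + b * height)

stats.probplot(residuals, dist="norm", plot=plt)  # normal probability (Q-Q) plot
plt.show()
```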

24
Q

Are any assumptions made regarding the x variable in regression?

A

No! Can even be discrete.

25
Q

The correlation coefficient?

A

We use the product-moment (Pearson) correlation, given the symbol r (ρ, rho, for the population value).
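
A sketch of computing r with scipy (illustrative data):

```python
import numpy as np
from scipy import stats

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

r, p = stats.pearsonr(height, fev1)  # product-moment (Pearson) correlation and P value
print(f"r = {r:.2f}")
```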

26
Q

Properties of r?

A

Always between -1 and 1; if the points lie exactly on a straight line then r is -1 or +1; a value of 0 means no LINEAR relation (the relationship could still be, e.g., circular); it can be computed for data consisting of pairs of continuous variables.
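
A quick illustration of the "r = 0 but not unrelated" point: synthetic points lying on a circle are perfectly related, yet their correlation is essentially zero:

```python
import numpy as np

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
x = np.cos(theta)
y = np.sin(theta)          # x and y are exactly related, but not linearly

r = np.corrcoef(x, y)[0, 1]
print(f"r = {r:.3f}")      # approximately 0
```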

27
Q

What does a negative/positive value of r mean?

A

A negative r means the y variable tends to decrease as the x variable increases. If r is positive, the two variables tend to increase or decrease together.

28
Q

Does the sign of r affect strength?

A

No - just the direction.

29
Q

Hypothesis testing for r?

A

Often done (e.g. r = 0.35, P = 0.07), but if we are just testing that the population correlation ρ = 0 (i.e. that there is no linear relation between the variables), this is exactly the same test as β = 0 in a regression of y on x (the P value will be the same). At least one of the variables needs to be continuous for this; if both are continuous, confidence intervals for ρ can also be calculated, but they are not very useful.
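
A sketch confirming that the P value for testing ρ = 0 matches the P value for β = 0 in the regression of y on x (illustrative data; scipy and statsmodels used for the two tests):

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

height = np.array([165.0, 170.0, 172.0, 175.0, 180.0, 183.0, 186.0, 190.0])
fev1 = np.array([3.2, 3.6, 3.5, 3.9, 4.1, 4.3, 4.4, 4.8])

_, p_corr = stats.pearsonr(height, fev1)            # test of rho = 0
model = sm.OLS(fev1, sm.add_constant(height)).fit()
p_slope = model.pvalues[1]                          # test of beta = 0 for the slope
print(p_corr, p_slope)                              # the two P values agree
```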

30
Q

Real problem with hypothesis testing in correlation?

A

Even with a very weak relationship (r = 0.3, for example), a large enough sample may give a significant P value when testing ρ = 0. This is evidence against the two variables being unrelated, but it does not really tell you that they are closely related. In general, values of r between 0 and about 0.6 are hard to interpret. Regression is usually more comprehensive.
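
A small simulation illustrating this: with a true correlation of only about 0.3, a large enough sample still gives a tiny P value for the test of ρ = 0 (the sample size and seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5000
x = rng.standard_normal(n)
y = 0.3 * x + np.sqrt(1 - 0.3 ** 2) * rng.standard_normal(n)  # true correlation ~0.3

r, p = stats.pearsonr(x, y)
print(f"r = {r:.2f}, P = {p:.1e}")  # r is weak, yet P is extremely small
```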