BIO 300 Lab Quiz 3 Flashcards

1
Q

correlation

A

strength of a linear association between 2 numerical variables

2
Q

correlation uses

A

correlation coefficient

3
Q

correlation coefficient

A

r
[-1,1]
unitless
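
A minimal Python sketch of computing r (the deck's own steps are menu-driven); the numbers are invented for illustration:

```python
import numpy as np

# invented measurements, not the lab data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Pearson's r is the off-diagonal entry of the 2x2 correlation matrix
r = np.corrcoef(x, y)[0, 1]
print(r)  # close to +1: strong positive linear association; unitless, always in [-1, 1]
```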

4
Q

negative r

A

as one variable increases, the other decreases

5
Q

inferences from correlation

A

cannot infer causality

6
Q

regression

A

implies causality between the 2 variables
used to predict the value of the response variable from the explanatory variable
can determine how much of the variability in the response is due to the relationship with the explanatory variable

7
Q

regression statistic

A

R^2 = SS_regression / SS_total

[0,1]
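
A tiny worked example of this definition, with invented sums of squares:

```python
# invented values for illustration only
ss_regression = 80.0   # variation explained by the regression
ss_total = 100.0       # total variation in the response

r_squared = ss_regression / ss_total
print(r_squared)  # 0.8: the regression accounts for 80% of the variation, so R^2 is in [0, 1]
```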

8
Q

linear regression assumptions

A
  • relationship between response (Y) and explanatory (X) is linear
  • Y values at each value of X are normally distributed
  • variance of Y values is same at all values of X
  • Y measurements are sampled randomly from the population at each value of X
9
Q

are there any outliers

A

yes; they can be seen in a boxplot

10
Q

how to do regression with multiple groups (N/S)

A

“regression with groups”

then add the total regression line by right-clicking and going to the regression fit option

11
Q

do the data need to be transformed

A

are the data clumped in one corner of the scatterplot
is there greater spread in one section of the scatterplot
do the variables span different orders of magnitude

12
Q

how to SLR

A

stat– regression– regression– fit regression model
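
The step above is a menu path; an equivalent simple linear regression can be sketched in Python with statsmodels. The column names chl_a and log_P and all values are assumptions for illustration:

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical worksheet; chl_a and log_P are assumed column names, values invented
df = pd.DataFrame({
    "chl_a": [1.2, 2.3, 2.9, 4.1, 5.0, 5.8, 7.1, 8.0],
    "log_P": [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8],
})

# fit Chl-a as a linear function of Log P
slr = smf.ols("chl_a ~ log_P", data=df).fit()

# the summary reports the constant (intercept), the log_P coefficient (slope), R^2, and the F-test
print(slr.summary())
```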

13
Q

options to check for regression

A
responses: Chl-a
continuous predictor: Log P
graphs: residuals vs. fits
results: everything but Durbin-Watson
storage: residuals
14
Q

SSregression

A

variation in the response variable accounted for by the regression (a sum of squares; the proportion is R^2)

15
Q

SSresidual

A

variation in the response variable left unexplained by the regression

16
Q

MS

A

mean square; a measure of variance obtained by averaging a sum of squares over its degrees of freedom: MS = SS/df

17
Q

F

A

F-ratio

MSregression/MSresidual
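
A tiny worked example tying MS and F together, with invented sums of squares and degrees of freedom:

```python
# invented values for illustration only (1 predictor, hypothetical n = 20)
ss_regression, df_regression = 80.0, 1
ss_residual, df_residual = 20.0, 18     # residual df = n - 2 for simple linear regression

ms_regression = ss_regression / df_regression   # MS = SS / df
ms_residual = ss_residual / df_residual

f_ratio = ms_regression / ms_residual           # F = MS_regression / MS_residual
print(ms_regression, ms_residual, f_ratio)      # 80.0  ~1.11  ~72
```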

18
Q

constant

A

y-intercept

note that this value will appear in the equation

19
Q

log P coefficient

A

the slope

will appear in the equation

20
Q

better predictor of response variable

A

higher R^2

21
Q

multiple regression

A

stat– regression– fit regression model
responses: chl-a
continuous predictors: Log P, Log N
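
A Python sketch of the same multiple regression, with assumed column names chl_a, log_P, log_N and invented values:

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical worksheet; column names and values are assumptions
df = pd.DataFrame({
    "chl_a": [1.2, 2.3, 2.9, 4.1, 5.0, 5.8, 7.1, 8.0],
    "log_P": [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8],
    "log_N": [0.5, 0.7, 1.0, 1.2, 1.4, 1.7, 1.9, 2.2],
})

# multiple regression: Chl-a predicted from both Log P and Log N
mr = smf.ols("chl_a ~ log_P + log_N", data=df).fit()
print(mr.summary())  # compare this R^2 to the simple regression's R^2
```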

22
Q

don't forget

A
to label residual columns
check for normality
check for equal variance
check if assumptions are met
test residuals for normality
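
A sketch of those residual checks in Python, reusing the same assumed column names and invented data:

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# hypothetical worksheet, as in the earlier sketches
df = pd.DataFrame({
    "chl_a": [1.2, 2.3, 2.9, 4.1, 5.0, 5.8, 7.1, 8.0],
    "log_P": [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8],
    "log_N": [0.5, 0.7, 1.0, 1.2, 1.4, 1.7, 1.9, 2.2],
})
fit = smf.ols("chl_a ~ log_P + log_N", data=df).fit()

# store and label the residuals (the "storage: residuals" step)
df["resid_mr"] = fit.resid

# normality of residuals: Shapiro-Wilk (p >= alpha is consistent with normality)
print(stats.shapiro(df["resid_mr"]))

# equal variance: look at residuals against fitted values for a funnel shape
print(pd.DataFrame({"fitted": fit.fittedvalues, "residual": df["resid_mr"]}))
```
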
23
Q

when R^2 of the SLR ≈ R^2 of the MR

A

possibly the 2 explanatory variables are correlated

24
Q

if two explanatory variables are correlated

A

collinearity

25
Q

why did Log N lose its significance in MR

A

the variation explained by Log N is already accounted for by Log P; there is not much variation left for Log N to describe

26
Q

test for correlation

A

stat– basic stat– correlation– variables: log P, log N– ok
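
The same test sketched with scipy, using invented Log P and Log N values:

```python
from scipy import stats

# invented values for the two explanatory variables
log_P = [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8]
log_N = [0.5, 0.7, 1.0, 1.2, 1.4, 1.7, 1.9, 2.2]

# Pearson correlation with a p-value; a strong, significant r suggests collinearity
r, p_value = stats.pearsonr(log_P, log_N)
print(r, p_value)
```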

27
Q

stepwise multiple regression

A

looks at all combinations of explanatory variables to retain the ones that explain the most variation
eliminates explanatory variables that do not add any new explanatory power
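
One flavor of this idea, backward elimination (drop the weakest predictor until everything left is significant), sketched in Python with the same assumed columns; this is an illustration, not the deck's menu procedure:

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical worksheet, as in the earlier sketches
df = pd.DataFrame({
    "chl_a": [1.2, 2.3, 2.9, 4.1, 5.0, 5.8, 7.1, 8.0],
    "log_P": [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8],
    "log_N": [0.5, 0.7, 1.0, 1.2, 1.4, 1.7, 1.9, 2.2],
})

predictors = ["log_P", "log_N"]
alpha = 0.05

# backward elimination: refit after dropping the least significant predictor
while predictors:
    fit = smf.ols("chl_a ~ " + " + ".join(predictors), data=df).fit()
    pvals = fit.pvalues.drop("Intercept")   # p-values for the predictors only
    worst = pvals.idxmax()
    if pvals[worst] < alpha:
        break                               # every remaining predictor adds explanatory power
    predictors.remove(worst)                # this one adds no new explanatory power

print("retained predictors:", predictors)
```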

28
Q

do you need different predictive equations for the 2 sampling locations

A

are the intercepts and slopes of the regression equations significantly different?

29
Q

test for significant differences in y-intercept of regression lines

A

ANCOVA

30
Q

ANCOVA

A

analysis of covariance

31
Q

ANCOVA assumes

A

equal slopes (parallel lines)

32
Q

testing for equal slopes

A

are the lines parallel? if the location*log P interaction is not significant, the slopes can be treated as equal
we'll assume that they are

33
Q

if y-intercepts are not significantly different (p-value ≥ alpha)

A

free to use one regression equation for both locations

34
Q

running an ANCOVA

A

stat– anova– GLM– fit general linear model

35
Q

options in ANCOVA

A
responses: chl-a
factors: location
covariates: log P
model: location, log P, location*log P
results: only ANOVA
storage: residuals
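
A Python sketch of this GLM, mirroring the model line (location, log P, and their interaction); the location column and all values are invented:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# hypothetical worksheet; "location" stands in for the N/S sampling sites
df = pd.DataFrame({
    "chl_a":    [1.2, 2.3, 2.9, 4.1, 5.0, 5.8, 7.1, 8.0],
    "log_P":    [0.1, 0.4, 0.6, 0.9, 1.1, 1.3, 1.6, 1.8],
    "location": ["N", "N", "N", "N", "S", "S", "S", "S"],
})

# full ANCOVA model: location, log P, and the location*log P interaction
full = smf.ols("chl_a ~ location * log_P", data=df).fit()
print(sm.stats.anova_lm(full, typ=2))      # is the interaction (equal-slopes test) significant?

# if the interaction is not significant, re-build the model without it
reduced = smf.ols("chl_a ~ location + log_P", data=df).fit()
print(sm.stats.anova_lm(reduced, typ=2))   # does location still matter? if so, use 2 equations
```
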
36
Q

ANCOVA output

A
  • if the interaction is not significant, re-build the model without the interaction
  • determine if the effect of sampling location is important: decide if you need 2 equations or not
37
Q

if sampling location has significant effect

A

need 2 equations

38
Q

how to get separate equations for sampling location

A

separate the data by sampling location and repeat the analysis from the beginning for each location