Biostats test 2 Flashcards

1
Q

deviation score

A

xi - x bar (an individual observation minus the sample mean)

2
Q

sample variance

A

sum of all squared deviation scores divided by n-1 (written s squared)
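
A minimal numeric sketch (data made up) showing the deviation scores from the previous card and the sample variance computed from them, checked against NumPy:

import numpy as np

x = np.array([4.0, 7.0, 6.0, 3.0, 5.0])      # hypothetical sample
deviations = x - x.mean()                     # deviation scores: xi - x bar
s2 = (deviations ** 2).sum() / (len(x) - 1)   # squared deviations summed, divided by n-1

print(s2)                  # 2.5, computed by hand
print(np.var(x, ddof=1))   # 2.5, NumPy's sample variance (ddof=1 gives the n-1 denominator)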

3
Q

purpose of ANOVA

A

splitting the total variance into two parts: one pertaining to differences between groups and one pertaining to differences within groups.

4
Q

F

A

mean squares between groups (explained variance; effect) divided by mean squares within groups (unexplained variance; error). If F takes on a sufficiently large value, we can reject H0 (we still need to compare p versus alpha).
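
A hedged sketch (groups are made up) of the F test in practice, using scipy's one-way ANOVA and the usual p-versus-alpha decision:

from scipy import stats

low    = [3.0, 4.0, 5.0]     # hypothetical groups
medium = [6.0, 7.0, 8.0]
high   = [9.0, 10.0, 11.0]

F, p = stats.f_oneway(low, medium, high)
alpha = 0.05
print(F, p, p < alpha)   # a large F and a small p lead us to reject H0 of equal group means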

5
Q

another definition of variance

A

total sum of squares divided by n-1

6
Q

total sum of squares

A

the numerator of the variance: the sum of squared deviation scores of all n data points (pooled over all groups) relative to the grand mean (the mean of all those n data points). Equals the between groups sum of squares + the within groups sum of squares.

7
Q

within groups sum of squares

A

sum of squared deviation scores of each group’s individual values relative to that group’s mean, summed over all k groups

8
Q

between groups sum of squares

A

sum of squared deviation scores of each of the k group means relative to the grand mean, each weighted by its group size
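
A small sketch (made-up groups) verifying that the three sums of squares from the last three cards fit together: total = between + within.

import numpy as np

groups = [np.array([3.0, 4.0, 5.0]),
          np.array([6.0, 7.0, 8.0]),
          np.array([9.0, 10.0, 11.0])]   # hypothetical k = 3 groups

all_x = np.concatenate(groups)
grand_mean = all_x.mean()

ss_total = ((all_x - grand_mean) ** 2).sum()                              # all n points vs grand mean
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)              # each point vs its group mean
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)   # group means vs grand mean

print(ss_total, ss_between + ss_within)   # 60.0 and 60.0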

9
Q

degrees of freedom

A

number of values that are free to vary given a boundary condition

10
Q

how many degrees of freedom does variance have

A

n-1, because given the mean value, one degree of freedom is lost when computing the variance around the mean

11
Q

df total sum of squares

A

n-1, because once the grand mean is fixed, only n-1 of the data points are free to vary

12
Q

df between groups sum of squares

A

k-1, because given the grand mean, one group mean is constrained by the others, so only k-1 group means are free to vary

13
Q

df within groups sum of squares

A

n-k (one degree of freedom is used up in calculating each group’s mean; since there are k groups, we lose k degrees of freedom, one for each group mean)

14
Q

mean squares

A

computed by dividing each sum of squares by its degrees of freedom; so the mean square for the total sum of squares is simply the variance!
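
A follow-up sketch (same made-up groups as above) dividing each sum of squares by its degrees of freedom; it checks that the total mean square equals the ordinary sample variance and that the ratio of the other two is the F statistic.

import numpy as np

groups = [np.array([3.0, 4.0, 5.0]),
          np.array([6.0, 7.0, 8.0]),
          np.array([9.0, 10.0, 11.0])]
all_x = np.concatenate(groups)
n, k = len(all_x), len(groups)
grand_mean = all_x.mean()

ss_total = ((all_x - grand_mean) ** 2).sum()
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)

ms_total = ss_total / (n - 1)       # total SS over its df: the sample variance
ms_between = ss_between / (k - 1)   # explained variance (effect)
ms_within = ss_within / (n - k)     # unexplained variance (error)

print(ms_total, np.var(all_x, ddof=1))   # 7.5 and 7.5
print(ms_between / ms_within)            # 27.0, the F statistic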

15
Q

one way ANOVA

A

a single factor, e.g. the effect of drug dose on mean reaction time

16
Q

Assumptions for one way anova

A

within-group variability is unexplained and treated as error variance; the error variance is assumed to be equal across groups (homogeneity of variance), which we check using Levene’s test

17
Q

Levene’s test for homogeneity of error variance

A

H0: the error variances of all groups are approximately equal, regardless of each group’s mean (homogeneity of variance).
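
A minimal sketch of Levene’s test on made-up groups (one of which deliberately has a larger spread), using scipy:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
g1 = rng.normal(0, 1, 25)
g2 = rng.normal(0, 1, 25)
g3 = rng.normal(0, 3, 25)   # deliberately larger spread

stat, p = stats.levene(g1, g2, g3)
print(stat, p)   # a small p suggests the error variances differ, so homogeneity is questionable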

18
Q

Factorial anova

A

two or more factors, each of which can have several levels; for example, three drug dose levels and two biological sexes give 3x2 = 6 mean values. We can then also test interactions.
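
A hedged sketch of a 3x2 factorial ANOVA with an interaction term, using statsmodels’ formula interface; the column names (dose, sex, rt) and the data are made up for illustration:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "dose": np.repeat(["low", "medium", "high"], 10),   # factor 1: 3 levels
    "sex": np.tile(["F", "M"], 15),                      # factor 2: 2 levels
    "rt": rng.normal(300, 20, 30),                       # hypothetical reaction times
})

# C() marks categorical factors; '*' expands to both main effects plus the interaction.
model = smf.ols("rt ~ C(dose) * C(sex)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # rows for dose, sex, and the dose:sex interaction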

19
Q

effect modification

A

the effect of one IV on the DV differs across the levels of another IV

20
Q

mushrooming

A

when adding factors to a design multiplies the number of effect modification (interaction) terms

21
Q

independent samples t-test

A

compares the means of two independent groups; like a one-way ANOVA with just two levels

22
Q

repeated measures t-test

A

matched, dependent samples t-test: comparison of paired values

23
Q

one-sample t-test

A

one sample mean compared to a given, fixed value
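
One hedged sketch (made-up data) covering the three t-test flavours from the last three cards, via scipy:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(10, 2, 20)
b = rng.normal(11, 2, 20)

print(stats.ttest_ind(a, b))        # independent samples: two separate groups
print(stats.ttest_rel(a, b))        # repeated measures / paired: same subjects measured twice
print(stats.ttest_1samp(a, 10.0))   # one sample mean compared to the fixed value 10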

24
Q

MANOVA

A

multivariate analysis of variance: used when we wish to look at mean differences across several dependent variables that are believed to be meaningfully related (e.g. multiple facets of a construct such as biodiversity or health)

25
Q

Hierarchy of when to use MANOVA and ANOVA

A

if we want to assess the effect of the IVs on the collective body of DVs, we first use MANOVA. If there is no effect, we do not need to proceed. If there is an effect, we conduct an ANOVA per DV to find out for which DVs the IVs have an effect. For significant ANOVAs, we may then want to compare the effects of the factor levels.
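
A sketch of this hierarchy on made-up data with two hypothetical DVs (dv1, dv2) and one grouping factor, using statsmodels’ MANOVA followed by per-DV ANOVAs:

import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], 10),
    "dv1": rng.normal(10, 2, 30),
    "dv2": rng.normal(5, 1, 30),
})

# Step 1: MANOVA on the collective body of DVs.
print(MANOVA.from_formula("dv1 + dv2 ~ group", data=df).mv_test())

# Step 2 (only if the MANOVA shows an effect): one ANOVA per DV.
for dv in ["dv1", "dv2"]:
    samples = [df.loc[df.group == g, dv] for g in ["a", "b", "c"]]
    print(dv, stats.f_oneway(*samples))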

26
Q

When do we use repeated measures ANOVA

A

when we are comparing more than 2 means and the subjects are measured either at different time instants or in several conditions. time and condition are within-subject factors that create dependencies in the data.
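
A hedged sketch of a one-way repeated measures ANOVA with a within-subject factor “time”, using statsmodels’ AnovaRM; the subject IDs, column names, and scores are all made up:

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "subject": np.repeat(np.arange(8), 4),           # 8 subjects
    "time": np.tile(["t1", "t2", "t3", "t4"], 8),    # each measured at 4 time points
    "score": rng.normal(50, 5, 32),
})

res = AnovaRM(data=df, depvar="score", subject="subject", within=["time"]).fit()
print(res)   # F test for the within-subject factor "time"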

27
Q

difference between factors and levels

A

factor: drug dose. levels: low, medium, high

28
Q

Repeated measures ANOVA -mixed

A

if both within-subject and between-subject factors are included in one model, e.g. comparing the effects of treatment type (BS, 2 levels) over 4 time points (WS, 4 levels)

29
Q

Sphericity assumption

A

difference scores between levels of a within-subjects factor have the same variance for the comparison of any two levels

30
Q

Mauchly’s test

A

tests sphericity (equality of variance of the difference scores between any two levels). If Mauchly’s W is significant, sphericity cannot be assumed. If the sphericity assumption is violated, the df’s must be corrected (made smaller, more conservative) by some kind of correction scheme.

31
Q

MANOVA for repeated measures

A

the sets of values measured at different time instants are considered different (but correlated) dependent variables. Since we cannot look at difference scores when comparing scores on different variables, there is no sphericity assumption left that could be violated.

32
Q

What an interaction tells us in a two-way ANOVA (or repeated measures ANOVA with between-subjects factors)

A

an interaction effect occurs when the effect of one independent variable (treatment) on the dependent variable (CO2 uptake) is not consistent across the levels of another independent variable (population).