SG 10 Flashcards

1
Q

require a number of assumptions about one or more population parameters

A

Parametric methods

2
Q

distribution-free methods

A

Nonparametric methods

3
Q

ADVANTAGES OF NONPARAMETRIC METHODS

A
  • When the underlying probability distribution is unknown or is known to be different from what a parametric method requires.
  • When the level of measurement falls below what is required by a parametric technique.
  • When there is no suitable parametric technique.
4
Q

The chi-square distribution tends to shift to the ____ and become more spread out with ____ df values.

A

right
larger

5
Q

Chi-square

A

test for independence

6
Q

The degree of skewness also decreases with increasing df such that the chi-square distribution approaches a normal distribution. t or f

A

t

7
Q

Chi-square cannot be negative since it sums squared differences divided by ______

A

positive expected frequencies

8
Q

properties of chi-square

A
  • It cannot be negative since it sums squared differences divided by positive expected frequencies
  • It is skewed to the right.
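
A minimal sketch in Python (hypothetical counts, plain arithmetic only) illustrating why the statistic cannot be negative: every term is a squared difference divided by a positive expected frequency.

    # Hypothetical observed and expected cell counts.
    observed = [18, 22, 20, 40]
    expected = [25, 25, 25, 25]

    # Chi-square statistic: sum of (O - E)^2 / E over all cells.
    chi_square = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    print(chi_square)  # always >= 0, since each term is non-negative
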
9
Q

Two variables are______ if, for all cases, the classification of a case to a particular category of one variable has no effect on the probability that the case will fall into any particular category of the second variable.

A

independent

10
Q
  • Having a particular attribute has no effect on the probability of having another attribute
A

Independence

11
Q

test for independence (chi-square) 3 assumptions

A

Independent random samples
Nominal variables
Data must be organized in a contingency table
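
A minimal sketch of the test for independence using SciPy's chi2_contingency on a hypothetical 2x3 contingency table (counts of two nominal variables); the table values and labels are assumptions for illustration.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical contingency table: rows = group, columns = response category.
    table = np.array([[30, 10, 20],
                      [20, 25, 15]])

    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p, dof)   # test statistic, p-value, degrees of freedom
    print(expected)       # expected frequencies under independence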

12
Q

displays the scores on two or more variables at the same time

A

Contingency table

13
Q

3 Limitations of the Chi-square Test

A
  • Data must be in the form of frequencies (i.e., counted data within categories).
  • The contingency table must have at least two columns.
  • The expected frequency of any cell should not be less than 5 (although up to 20% of cells may fall below 5 if the contingency table is larger than 2x2).

14
Q

subtracts 0.5 from the absolute value of the difference between the observed and expected frequencies for each cell

A

Yates’ Continuity Correction

15
Q
  • corrected chi-square
  • subtracts 0.5 from the absolute value of the difference between the observed and expected frequencies for each cell

Note: For tables larger than 2x2, there is no correction formula for computing chi-square

A

Yates’ Continuity Correction
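
A minimal sketch of Yates' correction with SciPy on a hypothetical 2x2 table; chi2_contingency applies the correction (subtracting 0.5 from |O - E| in each cell) when correction=True, which is its default for 2x2 tables.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 contingency table.
    table_2x2 = np.array([[12, 8],
                          [5, 15]])

    chi2_corrected, p_corrected, _, _ = chi2_contingency(table_2x2, correction=True)
    chi2_raw, p_raw, _, _ = chi2_contingency(table_2x2, correction=False)
    print(chi2_corrected, p_corrected)  # with Yates' continuity correction
    print(chi2_raw, p_raw)              # uncorrected chi-square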

16
Q

Used to determine whether the observed frequencies differ significantly from an even distribution; in other words, it determines whether the sample data are consistent with a hypothesized distribution

A

Chi-square test

17
Q

______ is applicable to a single categorical variable from a single population.

A

Chi-square test

18
Q

Chi-Square Test: Goodness-of-Fit-Test

A

If the observed and expected frequencies are similar, it is said that there is a “good fit”

19
Q

Chi-square test 2 assumptions

A

Independent random samples
The expected frequency of each category must be at least 5
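
A minimal sketch of the goodness-of-fit test with SciPy's chisquare on hypothetical category counts; when no expected frequencies are supplied, the observed counts are compared against an even distribution.

    from scipy.stats import chisquare

    observed = [22, 18, 25, 35]      # hypothetical category counts (total 100)

    # Against an even distribution (expected = 25 per category).
    stat, p = chisquare(observed)
    print(stat, p)

    # Against a specific hypothesized distribution (must sum to the same total).
    stat2, p2 = chisquare(observed, f_exp=[20, 20, 30, 30])
    print(stat2, p2)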

20
Q
  • It is useful when there are cell frequencies less than 5.
  • It is therefore useful in the same situation as the chi-square test.
  • It avoids the main limitation of the chi-square test, namely the requirement for sufficient observations within each cell, by comparing cumulative frequencies rather than cell frequencies.
A

Kolmogorov-Smirnov Two-Sample Test (K-S two-sample test)

21
Q
  • This test determines whether there is a statistical difference between two independent samples.
  • more than 2 groups
A

Kolmogorov-Smirnov Two-Sample Test

22
Q

Kolmogorov-Smirnov Two-Sample Test 3 Assumptions

A

Unless the sample sizes are equal, this test can only be used if both sample sizes are greater than 40
Independent random samples
At least ordinal-scale categories
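
A minimal sketch of the Kolmogorov-Smirnov two-sample test with SciPy's ks_2samp on hypothetical independent samples (both larger than 40); the test compares the two cumulative distributions rather than individual cell frequencies.

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    sample_a = rng.normal(loc=0.0, scale=1.0, size=50)   # hypothetical group A
    sample_b = rng.normal(loc=0.5, scale=1.0, size=60)   # hypothetical group B

    stat, p = ks_2samp(sample_a, sample_b)
    print(stat, p)   # small p suggests the two distributions differ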

23
Q
  • It can be applied to small samples and is useful for samples with unequal sizes.
A

Mann-Whitney U Test

24
Q
  • It is also called the Wilcoxon Rank-Sum Test.
  • It is a test of equality between two independent samples.
  • It can be applied to small samples and is useful for samples with unequal sizes.
  • <20 samples
A

Mann-Whitney U Test
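
A minimal sketch of the Mann-Whitney U test with SciPy's mannwhitneyu on small hypothetical samples of unequal size.

    from scipy.stats import mannwhitneyu

    group_1 = [3, 5, 7, 9, 11, 13]   # hypothetical scores, n = 6
    group_2 = [4, 6, 8, 10]          # hypothetical scores, n = 4

    u_stat, p = mannwhitneyu(group_1, group_2, alternative='two-sided')
    print(u_stat, p)   # small p suggests the two groups differ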

25
Q

test of independence =
test of equality (small/unequal sample) =
test of statistical difference of 2 samples (less than 5 frequency) =
test of equality of 2 or more groups using variance =
test for medians =
statistical difference when H0 is rejected =
absolute value of the difference between O and E frequency =

A

Chi-square test
Mann-Whitney U test
Kolmogorov-Smirnov two-sample test
ANOVA
Kruskal-Wallis H test
Fisher's LSD test
Yates' continuity correction
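
A minimal sketch of the Kruskal-Wallis H test (the "test for medians" above) with SciPy's kruskal on hypothetical groups of scores.

    from scipy.stats import kruskal

    group_a = [7, 9, 12, 15, 21]     # hypothetical scores
    group_b = [5, 8, 14, 16, 19]
    group_c = [6, 11, 13, 18, 25]

    h_stat, p = kruskal(group_a, group_b, group_c)
    print(h_stat, p)   # small p suggests at least one group's median differs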