Statistics: Inferential Stats Concepts and Terms Flashcards
Inferential Statistics: overview
Descriptive stats = summarize data
Inferential stats = make inferences about a population based on a sample drawn from that population
Central Limit Theorem
The sampling distribution of the mean approaches a normal curve as sample size increases
The mean of the sampling distribution = pop mean
SD of the sampling distribution = Standard Error of the Mean (SEM)
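A minimal NumPy sketch of this idea (the exponential population, sample size, and seed are illustrative assumptions, not from the card): even when the population is skewed, the sample means center on the population mean with an SD close to σ/√n, the SEM.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed non-normal population: exponential with mean 1 and SD 1
pop_mean, n, n_samples = 1.0, 50, 10_000
sample_means = rng.exponential(scale=pop_mean, size=(n_samples, n)).mean(axis=1)

# CLT: mean of the sampling distribution ~= population mean,
# and its SD (the SEM) ~= sigma / sqrt(n)
print(sample_means.mean())        # close to 1.0 (population mean)
print(sample_means.std(ddof=1))   # close to 1.0 / sqrt(50) ~= 0.14 (SEM)
```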
Type I Error (α)
Rejection of a true null hypothesis
Research erroneously shows significant effects
Type II Error (β)
Retain a false null hypothesis
Research misses actual significant effects
Power (1-β)
Probability of rejecting a false null hypothesis (i.e., of detecting a real effect)
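A hedged simulation sketch of power (the effect size d = 0.5, group size n = 30, and α = .05 are assumed values for illustration): power is estimated as the proportion of simulated studies that reject the false null; each non-rejection here is a Type II error.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Assumed scenario: true effect d = 0.5, n = 30 per group, alpha = .05
d, n, alpha, reps = 0.5, 30, 0.05, 5_000
rejections = 0
for _ in range(reps):
    control = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(d, 1.0, n)   # the null is false: means differ by d SDs
    _, p = stats.ttest_ind(treatment, control)
    rejections += p < alpha

print(rejections / reps)   # estimated power (1 - beta)
```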
Parametric v Nonparametric Tests:
Measurement Scales
Parametric Tests: Interval or Ratio Scales
Non-Parametric Tests: Nominal or Ordinal Scales
Parametric v Nonparametric Tests:
Commonalities and Differences
Both assume random selection and independent observations
Parametric tests (e.g., t-test, ANOVA) evaluate hypotheses about population means, variances, or other parameters.
Nonparametric tests (e.g., chi-square, Mann-Whitney U) do not evaluate hypotheses about population parameters and make fewer assumptions about the population distribution.
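A small SciPy sketch (the interval-scale scores are made up) contrasting a parametric test with a nonparametric counterpart run on the same two groups:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(50, 10, 25)
group_b = rng.normal(55, 10, 25)

# Parametric: evaluates a hypothesis about population means (interval/ratio data)
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Nonparametric counterpart: compares ranks, no normality assumption
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

print(t_p, u_p)
```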
Parametric Tests:
Assumptions
Normal Distribution
Homoscedasticity
Homoscedasticity
Assumption that the variances of the populations the groups represent are roughly equal
[For studies with more than one group]
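One common way to check this assumption is Levene's test, whose null hypothesis is that the population variances are equal; a minimal sketch with three hypothetical groups:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Three hypothetical treatment groups with similar spread
g1 = rng.normal(10, 2, 30)
g2 = rng.normal(12, 2, 30)
g3 = rng.normal(11, 2, 30)

# A non-significant p value is consistent with homoscedasticity
stat, p = stats.levene(g1, g2, g3)
print(p)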
One-way ANOVA vs Factorial ANOVA vs MANOVA
One-way ANOVA: ONE IV, ONE DV
Factorial ANOVA: two-way = 2 IVs, three-way = 3 IVs
MANOVA: used whenever there is more than one DV
(MULTIvariate analysis)
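A one-way ANOVA sketch (one hypothetical IV, teaching method with three levels, and one DV, exam score); factorial ANOVA and MANOVA follow the same logic with additional IVs or DVs but require a fuller modeling package than shown here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# One IV (teaching method, 3 levels), one DV (exam score) -> one-way ANOVA
method_a = rng.normal(70, 8, 20)
method_b = rng.normal(75, 8, 20)
method_c = rng.normal(72, 8, 20)

f_stat, p = stats.f_oneway(method_a, method_b, method_c)
print(f_stat, p)
```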
Effect Size:
What is it?
Name two types
Measure of the practical or clinical significance of statistically significant results
Cohen’s d
Eta squared (η²)
Cohen’s d
Effect size in terms of SD (d = 1.0 means a 1 SD difference between means)
Small effect size = 0.2
Medium effect size = 0.5
Large effect size = 0.8
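A sketch of Cohen's d using a pooled standard deviation (the two-group scores below are made up for illustration):

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d with a pooled SD (assumes a two-group design)."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

treatment = np.array([12.0, 14.0, 15.0, 13.0, 16.0])
control   = np.array([10.0, 11.0, 12.0,  9.0, 13.0])
print(cohens_d(treatment, control))   # ~1.9, well beyond the 0.8 "large" benchmark above
```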
Eta squared (η²)
Effect size in terms of variance accounted for by treatment
*Variance = σ², so think: squared Greek letter = variance
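A minimal sketch computing η² as SS_between / SS_total on made-up groups, i.e., the proportion of DV variance accounted for by the IV:

```python
import numpy as np

def eta_squared(*groups):
    """Eta squared: proportion of total variance accounted for by group membership."""
    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((all_scores - grand_mean) ** 2).sum()
    return ss_between / ss_total

g1 = np.array([3.0, 4.0, 5.0])
g2 = np.array([6.0, 7.0, 8.0])
print(eta_squared(g1, g2))   # share of DV variance explained by the IV (~0.77 here)
```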
Bivariate correlation assumptions
Linearity
Unrestricted range of scores on both variables
Homoscedasticity
Bivariate correlation “language” (X, Y)
X = predictor variable
Y = criterion variable
Simple Regression Analysis
Allows predictions to be made with:
One predictor (X) One criterion (Y)
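A sketch tying the correlation and regression cards together (hours studied and exam scores are hypothetical): r describes the X–Y relationship, and the regression line predicts Y from a single X.

```python
import numpy as np
from scipy import stats

# Hypothetical predictor (X = hours studied) and criterion (Y = exam score)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([55.0, 60.0, 58.0, 70.0, 72.0, 78.0])

r, r_p = stats.pearsonr(x, y)       # bivariate correlation
result = stats.linregress(x, y)     # simple regression: one X, one Y

# Prediction for a new X value using Y' = a + bX
new_x = 4.5
print(result.intercept + result.slope * new_x)
```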
F ratio calculation
MSB/MSW
Mean square between divided by mean square within
F ratio range
F is always positive (0 or greater); it cannot be negative, and values near 1 suggest little or no treatment effect
Larger F ratio = increased likelihood of stat significance
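A sketch computing F = MSB / MSW by hand on made-up groups and checking it against SciPy's one-way ANOVA:

```python
import numpy as np
from scipy import stats

g1 = np.array([4.0, 5.0, 6.0])
g2 = np.array([7.0, 8.0, 9.0])
g3 = np.array([5.0, 6.0, 7.0])
groups = [g1, g2, g3]

all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()
k, N = len(groups), len(all_scores)

ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

msb = ss_between / (k - 1)   # mean square between
msw = ss_within / (N - k)    # mean square within
print(msb / msw)             # F ratio
print(stats.f_oneway(g1, g2, g3).statistic)   # matches SciPy's F
```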