Exploring data / Statistical Testing Flashcards
The interpretation of a p-value has to start with IF (if the null hypothesis is true, ...)
A p-value does not tell you anything about the probability that something is true
So, given that the null hypothesis is true, we would observe a difference this large 4% of the time, purely because of our samples (e.g. a p-value of 0.04)
Why can a difference in stigma between 1996 & 2006 still occur, even if nothing has changed?
Pure chance from random sampling
What is the p-value?
- Defined as the probability - under the assumption of no effect/difference (the null hypothesis) - of obtaining a test statistic equal to or more extreme than the one actually observed (across random, repeated samples from the population)
- Ergo: measures how likely it is that any observed difference/effect between groups is due to chance alone (see the simulation sketch below)
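One way to make this definition concrete is a permutation simulation: repeatedly generate the "null world" by shuffling group labels and count how often a difference at least as extreme as the observed one appears. A minimal sketch, using hypothetical data (`group_a`, `group_b` are assumptions, not from these cards):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example data: two groups of scores.
group_a = rng.normal(loc=50, scale=10, size=40)
group_b = rng.normal(loc=55, scale=10, size=40)

observed_diff = abs(group_a.mean() - group_b.mean())

# Under the null hypothesis the group labels are exchangeable,
# so shuffle the labels many times and recompute the difference.
pooled = np.concatenate([group_a, group_b])
n_a = len(group_a)
n_perm = 10_000
perm_diffs = np.empty(n_perm)
for i in range(n_perm):
    shuffled = rng.permutation(pooled)
    perm_diffs[i] = abs(shuffled[:n_a].mean() - shuffled[n_a:].mean())

# p-value: fraction of null-world differences at least as extreme as observed.
p_value = (perm_diffs >= observed_diff).mean()
print(f"observed diff = {observed_diff:.2f}, permutation p-value = {p_value:.4f}")
```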
What would a p-value of 0.05 mean?
IF there is no difference, then 5% of the time we run the experiment we will still get a p-value of 0.05 or less, purely due to random events –> false positive
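To see where that 5% false-positive rate comes from, one can simulate many experiments in which the null hypothesis is true by construction and check how often p < 0.05 anyway. A minimal sketch, assuming a two-sample t-test (the test choice is an assumption, not named in the cards):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha = 0.05
n_experiments = 5_000

false_positives = 0
for _ in range(n_experiments):
    # The null hypothesis is true by construction: both samples share the same mean.
    a = rng.normal(loc=0, scale=1, size=30)
    b = rng.normal(loc=0, scale=1, size=30)
    _, p = stats.ttest_ind(a, b)
    if p < alpha:
        false_positives += 1

# Should land near alpha = 0.05: the false-positive rate under the null.
print(f"false-positive rate: {false_positives / n_experiments:.3f}")
```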
What is statistical significance?
- Means that - given the null hypothesis - the result we got is unlikely to be explained solely by chance or random factors
- A statistically significant result has a very low chance of occurring if there were no true effect in the research study
- Arbitrary threshold: in most studies a p-value of 0.05 or less is considered statistically significant, but this threshold can also be set higher or lower (see the sketch below)
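Since "statistically significant" is just a comparison of the p-value to a chosen threshold, the same result can be significant at one alpha and not at another. A small sketch with hypothetical samples (the data and effect size are assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical samples with a modest true difference in means.
a = rng.normal(loc=0.0, scale=1.0, size=50)
b = rng.normal(loc=0.4, scale=1.0, size=50)

_, p = stats.ttest_ind(a, b)

# "Statistically significant" is just p <= threshold; the threshold is a choice.
for alpha in (0.10, 0.05, 0.01):
    verdict = "significant" if p <= alpha else "not significant"
    print(f"p = {p:.4f}: {verdict} at alpha = {alpha}")
```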
How to deal with skewness?
Often the univariate distribution is the most interesting
What is the empirical rule?
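For reference, the empirical rule for roughly normal data: about 68% of values fall within 1 standard deviation of the mean, 95% within 2, and 99.7% within 3. A minimal numerical check, using simulated normal data as an assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=100, scale=15, size=1_000_000)  # simulated normal data

mean, sd = x.mean(), x.std()
for k, expected in [(1, 68.3), (2, 95.4), (3, 99.7)]:
    # Share of values within k standard deviations of the mean.
    within = np.mean(np.abs(x - mean) <= k * sd) * 100
    print(f"within {k} SD: {within:.1f}% (empirical rule: ~{expected}%)")
```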
Why does SE matter?
example: gender difference in variance
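One common reason the SE matters: it quantifies how much a sample statistic (such as the mean) would vary across repeated samples, and it shrinks as the sample size grows. A minimal sketch with a hypothetical population (not the gender/variance example above):

```python
import numpy as np

rng = np.random.default_rng(4)
population = rng.normal(loc=50, scale=10, size=100_000)  # hypothetical population

for n in (25, 100, 400):
    # Standard error of the mean estimated from one sample: s / sqrt(n).
    sample = rng.choice(population, size=n, replace=False)
    se_formula = sample.std(ddof=1) / np.sqrt(n)

    # Empirical check: spread of means across many repeated samples of size n.
    means = [rng.choice(population, size=n, replace=False).mean() for _ in range(2_000)]
    print(f"n={n:4d}  SE (formula) = {se_formula:.2f}  SD of sample means = {np.std(means):.2f}")
```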
What does it mean if a p-value is closer to 0?
more disagreement with the null hypothesis –> more confidence that there is a difference not due to chance/random events
Type 1 / 2 error?
type 1 error (false positive) = rejecting the null hypothesis when it is true; its probability is alpha, the level of significance - arbitrarily defined
type 2 error (false negative) = failing to reject the null hypothesis when it is false; its probability is beta
What is power?
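Power is usually defined as 1 - beta: the probability of correctly rejecting the null hypothesis when a real effect exists. A hedged simulation sketch, assuming a two-sample t-test and an arbitrary effect size of 0.5 SD (both are assumptions, not from the cards):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
alpha = 0.05
effect_size = 0.5    # assumed true difference in means (in SD units)
n_per_group = 64
n_experiments = 5_000

rejections = 0
for _ in range(n_experiments):
    # A real effect exists by construction: group b is shifted by effect_size.
    a = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    b = rng.normal(loc=effect_size, scale=1.0, size=n_per_group)
    _, p = stats.ttest_ind(a, b)
    if p < alpha:
        rejections += 1

# Power = share of experiments in which the true effect was detected (1 - beta).
print(f"estimated power: {rejections / n_experiments:.3f}")
```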