Exam 1 Flashcards
Exam 1 Study Deck for REC 4450
Define reliability
The probability of rejecting the null hypothesis when the null hypothesis is true is known as what type of error?
Identify two ways to increase statistical power and thus minimize the risk of committing a Type II error.
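Two common ways to raise power are increasing the sample size and studying a larger effect. A minimal sketch of this, assuming a one-sided one-sample z-test at the .05 level (the function name and the 1.645 critical value are illustrative choices, not from the deck):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def power_one_sided_z(effect_size, n, alpha_z=1.645):
    """Power of a one-sided one-sample z-test at alpha = .05.

    effect_size is Cohen's d (true mean shift in SD units);
    alpha_z is the critical z for a one-sided .05 test.
    """
    return norm_cdf(effect_size * math.sqrt(n) - alpha_z)

# Larger samples raise power (lowering Type II error risk)
for n in (20, 50, 100):
    print("n =", n, "power =", round(power_one_sided_z(0.5, n), 3))

# Larger effect sizes also raise power
for d in (0.2, 0.5, 0.8):
    print("d =", d, "power =", round(power_one_sided_z(d, 50), 3))
```

Running this shows power climbing toward 1 as either n or the effect size grows.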
Describe the shape of various correlations as they would appear on a scatter plot. What’s the difference between linear and non-linear as well as negative and positive correlations?
What does a Pearson correlation coefficient measure? What is the range of values associated with this coefficient? How do you interpret the values of this coefficient?
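As a quick worked sketch of the Pearson coefficient (the variable names below are made-up illustration data, not from the course): r measures the strength and direction of a linear relationship and always falls in [-1, 1], with values near +1 meaning a strong positive linear relationship, near -1 a strong negative one, and near 0 little linear relationship.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient.

    Measures the strength and direction of the *linear*
    relationship between two variables; always in [-1, 1].
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

hours_studied = [1, 2, 3, 4, 5]
exam_score = [60, 65, 70, 78, 85]  # rises with hours, so r is near +1
print(pearson_r(hours_studied, exam_score))
```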
Describe the criteria that must be met to use t-Tests to analyze data.
Be able to identify different data reduction techniques in analyzing qualitative research (open, axial, and selective coding).
What are ways researchers characterize and ensure the credibility of a qualitative study?
What are the appropriate statistical analyses a person can use based on the level of measurement for the independent and dependent variables?
Describe different techniques from which a qualitative researcher can triangulate to ensure the credibility of their research (i.e., Memoing, Observations, Field Notes, Interviews).
Identify 3 measures of central tendency and 3 measures of dispersion.
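The three usual measures of central tendency (mean, median, mode) and three of dispersion (range, variance, standard deviation) can be computed with Python's standard library, as a minimal sketch with made-up data:

```python
import statistics

data = [2, 4, 4, 5, 7, 9, 11]

# Three measures of central tendency
mean = statistics.mean(data)      # arithmetic average
median = statistics.median(data)  # middle value when sorted
mode = statistics.mode(data)      # most frequent value

# Three measures of dispersion
rng = max(data) - min(data)           # range
variance = statistics.variance(data)  # sample variance
stdev = statistics.stdev(data)        # sample standard deviation

print(mean, median, mode)   # 6, 5, 4
print(rng, variance, stdev)
```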
Understand the normal distribution of data, as well as positively and negatively skewed data.
Describe statistical significance (including the p-value and the .05 probability threshold).
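A result is called statistically significant when its p-value falls below the chosen threshold (conventionally .05). A minimal sketch using a one-sample z-test, assuming a known population SD (the numbers are illustrative, not from the course):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def z_test_p_value(sample_mean, pop_mean, pop_sd, n):
    """Two-tailed p-value for a one-sample z-test
    (population SD assumed known)."""
    z = (sample_mean - pop_mean) / (pop_sd / math.sqrt(n))
    return 2 * (1 - norm_cdf(abs(z)))

# Is a sample mean of 105 significantly different from a
# population mean of 100 (population SD 15, n = 36)?
p = z_test_p_value(105, 100, 15, 36)
print(round(p, 4), "significant" if p < .05 else "not significant")
```

Here z = 2.0, giving p ≈ .0455, which is below .05, so the difference would be judged statistically significant.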
The Trilogy of Evaluation
Criteria + Evidence + Judgment = Evaluation
What are criteria?
Standards or ideals by which something is being evaluated
* Planning
* Goals and Objectives
* Determined before data collection
Judgment
Presentation of findings, conclusions, and recommendations
Benefit types
Individual
Communal
Economic
Environmental
The Leisure Programming Cycle
- Needs Assessment
- Development
- Implementation (Formative)
- Evaluation (Summative)
- Revision
Levels of Program Evaluation
- Inputs
- Activities
- People Involvement
- Reactions
- KASA Outcomes
- Practice changes
- End Results
What is “KASA”?
Knowledge, Attitudes, Skills, and Aspirations
Designing Quality Programs (8 Steps)
- Ask (assess) participants
- Ask staff
- Assess current practices
- Brainstorm
- Choose strategies
- Take Action
- Share your plan
- Evaluate and share results
What is evaluation?
Evaluation is the systematic process of collecting and analyzing data to address criteria so that one can make judgments about the worth or improvement of something
Why do we do evaluation?
- To enhance effectiveness and efficiency
- To determine accountability
- –TO BE ADDED–
- To establish a fidelity baseline (consistent program implementation)
- To establish the impact of a program (short and long-term)
- To secure funding for future program implementation
- To comply with standards
- To improve and set external future directions
What is the aim of evaluation?
To make objective decisions