Lecture 9 - Content Analysis, Secondary Data, & Evaluation Research Flashcards

1
Q

Content Analysis

A
  • Systematic study of messages
  • Focus on communication & social artifacts
2
Q

Units of Analysis in Content Analysis

A

Units of analysis:
* The “thing” we are analyzing or drawing conclusions about
Ex. Person, behaviour, etc.

  • Can differ in content analysis
  • Understand your RQ to choose the right unit
    Ex. Image, video, speech

Units of observation:
* Where the data are coming from

Ex. Students, surveys

3
Q

Sampling

A

Determine:
* What you’re going to read, watch, or listen to
* Establish units of analysis/observation and the sampling frame

Sampling frame: the list or source from which you draw your sample; it includes all the elements (people, items, etc.) that have a chance of being selected for your study.

Ex. If studying university students, the sampling frame could be the student enrollment list

When:
* Time-frame of observations

4
Q

Sampling in content analysis

A

Ex. Are articles reporting on gang-related homicides featured more prominently than those reporting on non-gang-related homicides?

What:
* Unit of observation: Newspaper articles
* Unit of analysis: Newspaper articles
* Sampling frame: All articles reporting on a homicide 2004-2015
* Sample: All articles reporting on a homicide

When:
* Period of study: Jan. 2004-Dec. 2015
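A minimal sketch of this sampling step in Python (the record fields and example articles are hypothetical):

    import random
    from datetime import date

    # Hypothetical sampling frame: one record per article reporting on a homicide.
    frame = [
        {"title": "Article A", "published": date(2004, 3, 1)},
        {"title": "Article B", "published": date(2016, 1, 5)},
        # ... the full frame would list every qualifying article
    ]

    # Keep only articles inside the study period (Jan. 2004 - Dec. 2015).
    start, end = date(2004, 1, 1), date(2015, 12, 31)
    in_period = [a for a in frame if start <= a["published"] <= end]

    # Here the sample appears to be the whole frame (a census); a random
    # subsample could be drawn instead with random.sample:
    sample = in_period
    # sample = random.sample(in_period, k=100)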

5
Q

Coding in Content Analysis

A

Coding: The measurement process in content analysis

Requires the logic of conceptualization and operationalization

Manifest vs. latent content
* Manifest: surface-level content that can be observed and counted directly
* Latent: underlying, implied meaning that requires interpretation
* Roughly parallels the quant vs. qual distinction
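As a rough illustration (not from the lecture), manifest content can often be coded automatically because it sits on the surface of the text, while latent content requires human interpretation. A minimal Python sketch of a manifest coding pass, with a made-up keyword scheme:

    # Manifest coding: count surface features that need no interpretation.
    # The keyword list below is a made-up coding-scheme category.
    GANG_TERMS = ("gang", "gang-related", "gang violence")

    def code_article(text: str) -> dict:
        """Apply a simple manifest coding scheme to one article."""
        lowered = text.lower()
        return {
            "word_count": len(text.split()),
            "mentions_gang": any(term in lowered for term in GANG_TERMS),
        }

    print(code_article("Police say the shooting was gang related."))
    # {'word_count': 7, 'mentions_gang': True}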

6
Q

Validity & Reliability

A
  • Pretesting coding scheme
  • Inter-rater & test-retest reliability
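A small sketch of how inter-rater reliability might be checked in Python, computing percent agreement and Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), on toy labels from two coders:

    from collections import Counter

    def percent_agreement(a, b):
        """Share of items the two coders labelled identically."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def cohens_kappa(a, b):
        """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
        n = len(a)
        p_o = percent_agreement(a, b)
        counts_a, counts_b = Counter(a), Counter(b)
        # Chance agreement expected from each coder's marginal label rates.
        p_e = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    coder1 = ["violent", "violent", "nonviolent", "violent"]
    coder2 = ["violent", "nonviolent", "nonviolent", "violent"]
    print(percent_agreement(coder1, coder2))  # 0.75
    print(cohens_kappa(coder1, coder2))       # 0.5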
7
Q

Secondary Data

A

Any data you did not collect yourself

Data collected by:
* Govt or non-govt organizations, schools, prisons, other researchers

Can be publicly accessible, or permission may be necessary

Can also include analysis of data originally collected for a different study/purpose
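As a sketch, a first pass at secondary data in Python with pandas might look like this; the file name and columns are hypothetical, and the codebook is what documents how each variable was collected:

    import pandas as pd

    # Hypothetical file released by a government agency.
    df = pd.read_csv("agency_crime_stats.csv")

    # Check the variables against the codebook before analyzing anything.
    print(df.columns)
    print(df.isna().sum())  # missingness / data errors to watch for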

8
Q

Key consideration

A
  • Have a clear and complete understanding of how the data was collected
9
Q

Limitations of Secondary Data

A
  • Access
  • Validity issues
  • Data errors
  • Limited variable operationalizations
10
Q

Summary of Data Collection Methods

A
  • Surveys:
    Pros: Good for large samples, inexpensive, convenient, privacy
    Cons: Low response rates, generalizability issues
  • Qualitative Interviewing:
    Pros: Good for understanding “How” & “Why”, rich and thick data, reaches sensitive populations
    Cons: Resource intensive, not good for broad samples/comparisons
  • Field Research:
    Pros: Good for understanding social roles and processes
    Cons: Ethical challenges (deception), resource intensive, researcher bias
  • Content Analysis:
    Pros: Good for understanding messages, inexpensive
    Cons: Limited to what is available, less control, researcher bias, time consuming
  • Secondary Data:
    Pros: Resource efficient, can access data you can’t collect yourself
    Cons: Access challenges, validity risks, limited to what was collected previously & how
11
Q

Evaluation Research

A

Purpose: Assess the outcomes of a policy, practice, or program and the extent to which its goals and objectives are met

Policy process: Policies and interventions are planned with specific goals intended to meet the demands and needs of the community
* But does the intervention deliver its intended purpose?

12
Q

Types of Evaluation Research

A
  • Impact: Is the intervention having the desired outcome?
  • Process: Is the intervention being implemented as intended?
  • Summative: Combines both process and impact evaluations
13
Q

Evaluation Development

A

Evaluability Assessment:
* Is an evaluation possible?
* Adequate support?
* Available data?
* Stakeholders?

Problem (RQ) Formulation:
* Identify & specify intervention goals and objectives
Goal: the aim of the intervention
Objective: the operationalization of the goal

Measurement:
* Define target population(s) and outcomes

Determine measures for:
* Population
* Outcomes
* Delivery
* Program context

14
Q

Theory and Practice: Program Theory

A

What needs to be done to accomplish program goals, what the anticipated outcomes are, and how these goals and outcomes are achieved

  • Defines the problem, identifies and logically links program components, and outlines program activities
15
Q

Program Theory: Ex. Grade 7 girls transitioning into high school

A

Goals and Theoretical Approach: Help grade 7 girls transition into high school

Inputs: What we invest
* Volunteers

Activities: What we do
* Discussion, games, activities

Who we reach:
* Self-selected grade 7 girls

Outcomes:
Short-term: More comfortable with transition into high school
Long-term: Develop skills necessary to transition into high school

16
Q

Theory and Practice: Evaluation Theory

A

How evaluation research should be conducted in order to be valid

Basic best practices of research:
* Conceptualize & operationalize the RQ, process, outcomes, and hypotheses
* Identify ethical considerations
* Determine the best, yet feasible, procedures
* Give special consideration to stakeholders

17
Q

Evaluation Designs

A

Randomized evaluation designs
* True experiment = the goal (random assignment; see the sketch below)
* Less feasible:
  - Ethical and practical issues
  - Requires staff acceptance and appropriate case flow
  - Maintaining treatment and evaluation fidelity is difficult; not appropriate for new interventions

Quasi-experimental evaluation designs
* Most common in evaluation research
* Often not possible to establish equivalent groups:
  - Can’t control who was or was not exposed
  - Evaluation occurs after implementation
  - No appropriate comparison group
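For contrast with the quasi-experimental limitations above, a minimal Python sketch of the random assignment step that a randomized (true-experimental) design depends on; the participant IDs are made up:

    import random

    def random_assignment(participants, seed=42):
        """Randomly split participants into treatment and control groups,
        which is what makes the groups equivalent in expectation."""
        rng = random.Random(seed)
        shuffled = list(participants)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return shuffled[:half], shuffled[half:]

    treatment, control = random_assignment(["P01", "P02", "P03", "P04", "P05", "P06"])
    print("Treatment:", treatment)
    print("Control:  ", control)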