Final Exam (Week 11-13) Flashcards

1
Q

What is KT?

A

Knowledge translation
Sharing findings, insights, experiences
Process that involves synthesis, dissemination, exchange, and application of knowledge
aka knowledge mobilization or knowledge sharing

2
Q

What is a knowledge user?

A

Anyone who can take knowledge from research and use it to make informed decisions. Can be:
Decision makers (e.g., policy makers, governments)
Practitioners (e.g., educators, physicians)
Individuals (e.g., general public, patients, athletes)

3
Q

What does a knowledge user make decisions about?

A

policies, programs, or practices

4
Q

What are the categories of people that are involved in KT?

A

Researchers, policy makers, caregivers, providers, persons with lived experience, other stakeholders
All are trying to reach each other

5
Q

Core elements of KT?

A

Synthesis
Dissemination
Exchange
Ethically sound application of knowledge
Overlap exists, but all are integral to KT process

6
Q

What is synthesis (KT)?

A

Contextualizing/integrating/situating study findings within the larger body of knowledge (ex. systematic reviews, narrative reviews, meta-analyses, practice guidelines)
Basically figuring out where results fit in the context of the field

7
Q

What is dissemination (KT)?

A

Tailoring KT to a particular audience (both the message and the medium of info) (ex. providing summaries for knowledge users, delivering educational sessions with parents and athletes)
Basically condensing info into a more palatable medium for the intended audience, figure out what they NEED to understand

8
Q

What is exchange (KT)?

A

Engagement between researchers and knowledge users
Results in mutual learning through planning, producing, disseminating, and applying existing or new research
Very important step
Basically passing knowledge between people

9
Q

What is ethically sound application of knowledge (KT)?

A

Putting knowledge into practice
Should be consistent with ethical principles, social values, and legal regulatory frameworks
Basically actually using the knowledge

10
Q

What are the ways of planning timing of KT activities?

A

End-of-project: Activities occur at the end of research study (sharing a finished product)
Ex. published journal articles, conference presentations, etc.
Integrated: Activities integrated throughout duration of research study (share as you go)

11
Q

Traditional vs. Innovative KT strategies examples?

A

Traditional:
- Publications (journal articles, guidelines, manuals, reports)
- Conference presentations (verbal presentations, poster presentations, symposia)
Innovative:
- Text-based (stories, narratives, fictional narratives, poems)
- Media-based (social media, websites, online tools, TED talks, 3 min thesis competitions)
- Arts-based (short film, interpretive dance, ethnodrama, visual art, musical performances)
- Relationship-oriented (community engagement, gatherings)

12
Q

How to select KT strategies?

A

Should be driven primarily by the research question and the intended audience
Should also be ethically sound

13
Q

Why is KT important?

A

Same reason we care about research: it adds to knowledge about a problem
Relevance to practice:
- Improving services
- Applying new knowledge
- Answering questions
- Thinking critically

14
Q

4 Strategies for improving knowledge exchange?

A

Co-production: Scientist and decision-maker are working together side-by-side, some overlap
Embedding: Scientist is either embedded in a decision-maker context or vice-versa, one is working within the other and learning
Knowledge broker: Scientist works mostly separately; an intermediary sits within the scientific context. The scientist and the intermediary each have only a weak direct connection to the decision-maker
Boundary organization: Intermediary is working between scientist and decision-maker and is the communication boundary, no direct connection between scientist and decision-maker

15
Q

What is the information explosion?

A

Number of articles skyrocketed after WWII because of research about and during the war
Millions of articles are now published; literature reviews used to be done by specialists who knew most others in their field, but now there is far too much to keep up with

16
Q

What are research reviews?

A

Provide the context in which we can evaluate the merits of a study
Two types:
Non-systematic (narrative, integrative)
Systematic (meta-analysis, scoping review, meta-synthesis)

17
Q

Criticism of narrative reviews?

A

Narrative review: type of non-systematic review
- Subjective
- Scientifically unsound
- An inefficient way to extract useful information
Basically no guarantee that two scientists will come to the same conclusion

18
Q

What is a systematic review?

A

Follows a methodologically rigorous process that is transparent and replicable to minimize bias
Identifies relevant studies, checks quality, and summarizes results using scientific methodology
Can be qualitative or quantitative, addresses some criticism of narrative reviews

19
Q

What does a systematic review need?

A
  • A predefined protocol and structured search strategy that is available to anyone (replicable)
  • Clearly outlined procedure for how articles were selected, with inclusion/exclusion criteria
  • If there was a quality check of individual studies, and if not, why
  • Structured process for extracting and summarizing results (replicable)
  • Discussion of what was found, plus the review's own limitations and contributions
20
Q

Key components of a systematic review?

A

Articulate the research question (PICO = Participants, Interventions, Comparisons, and Outcomes)
Clear description of the protocol (PRISMA = Preferred Reporting Items for Systematic Reviews and Meta-Analyses)
Register the review and protocol (PROSPERO is an international database of prospectively registered systematic reviews with a health-related outcome)
Description of how the strength of evidence will be assessed (GRADE = Grading of Recommendations, Assessment, Development and Evaluations)

21
Q

Flow chart of PRISMA would include:

A

Identification - of all possibly related articles through whatever databases you’re searching (then record the number after duplicates are removed; there are bound to be doubles in there)
Screening - actually read through the titles and abstracts to see if they really are related to the research question
Eligibility - read through the article itself and see if it fits what you’re trying to assess; be sure to record why something was excluded and categorize the exclusions!
Included - the number of studies actually used
RECORD EVERYTHING so if someone tries to challenge your numbers, there’s a defence lined up to protect your findings
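Most of the flow chart is bookkeeping, so a minimal sketch of that record-keeping may help; this is Python with entirely made-up counts and hypothetical key names (PRISMA prescribes the stages, not any particular code):

```python
# Hypothetical PRISMA-style tally: stage names follow the flow chart above,
# all numbers are made-up placeholders for illustration only.
prisma = {
    "identified": 412,                 # records found across the databases searched
    "after_duplicates_removed": 350,
    "screened": 350,                   # titles/abstracts read
    "excluded_at_screening": 280,
    "assessed_for_eligibility": 70,    # full texts read
    "excluded_full_text": {            # record WHY each full text was dropped
        "wrong population": 20,
        "no relevant outcome": 15,
        "not peer reviewed": 5,
    },
    "included": 30,                    # studies actually used in the review
}

# Quick consistency checks so the numbers in the flow diagram add up.
assert prisma["after_duplicates_removed"] - prisma["excluded_at_screening"] == prisma["assessed_for_eligibility"]
assert prisma["assessed_for_eligibility"] - sum(prisma["excluded_full_text"].values()) == prisma["included"]
```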

22
Q

Grade certainty ratings

A

Very low - The actual effect is probably markedly different from the research findings
Low - The actual effect might be markedly different from the research findings
Moderate - We believe that the actual effect is probably close to the research findings
High - We have a lot of confidence that the actual effect is similar to the research findings

23
Q

What is a meta-analysis?

A

The statistical integration of the results of independent studies
A collection of systematic techniques for resolving apparent contradictions in research findings
Can only be done with quantitative research

24
Q

Why do we conduct meta-analyses?

A

Too much info/literature
- Contradictions
- Narrative reviews aren’t efficient
Need to know population effect
Hard to determine relationship(s) between IVs (independent variables) and DVs (dependent variables) across studies

25
Q

Steps to a meta-analysis

A

Systematic review of literature, which includes:
- Problem specification (identify question(s) we are addressing)
- Literature search (exhaustive comprehensive searching, inclusion criteria)
- Retrieving studies
- Coding studies
- Data extraction (quantitative data converted to effect size)
- Data analyses and interpretation (putting data into context)
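For the data analysis step, one common approach (not spelled out in these notes) is to weight each study's effect size by the inverse of its variance and average them; a minimal fixed-effect sketch with made-up numbers:

```python
# Fixed-effect pooling sketch: each study contributes an effect size (d) and its
# variance; more precise studies (smaller variance) get more weight.
# All numbers are hypothetical; real reviews typically use a meta-analysis package.
studies = [
    {"d": 0.40, "var": 0.04},
    {"d": 0.25, "var": 0.09},
    {"d": 0.55, "var": 0.02},
]

weights = [1 / s["var"] for s in studies]                             # inverse-variance weights
pooled_d = sum(w * s["d"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5                                 # SE of the pooled effect

print(f"pooled d = {pooled_d:.2f} (SE = {pooled_se:.2f})")            # pooled d = 0.47 (SE = 0.11)
```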

26
Q

What is effect size?

A

Measure of the strength of a relation
An estimate of the degree to which a phenomenon is present in a population and/or the extent to which the null hypothesis is false (the extent to which something different is actually happening)
Basically the magnitude of the effect or relationship
We use it to represent study findings in a meta-analysis (BUT the type of ES must be the same across all studies)

27
Q

Calculating/estimating effect sizes

A

If we know the sample size, we can calculate/estimate ES from many different types of descriptive (means & SD, r) and inferential statistics (e.g., OR, F, t, p)
d = (Me - Mc) / SDc
d = [(Me1 - Me2) - (Mc1 - Mc2)] / SDpooled
Or
Pearson r (convert r to Fisher's Z)
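A worked sketch of the formulas above (Python, with made-up numbers); the last line assumes the standard Fisher transform z = 0.5 * ln((1 + r) / (1 - r)):

```python
import math

# First formula above: standardized mean difference using the control group's SD.
mean_exp, mean_ctrl, sd_ctrl = 24.0, 20.0, 5.0
d = (mean_exp - mean_ctrl) / sd_ctrl                     # 0.80

# Second formula: difference of the two groups' changes, divided by the pooled SD.
me1, me2 = 30.0, 24.0       # experimental group, measurements 1 and 2 (hypothetical)
mc1, mc2 = 26.0, 25.0       # control group, measurements 1 and 2 (hypothetical)
sd_pooled = 5.0
d_change = ((me1 - me2) - (mc1 - mc2)) / sd_pooled       # 1.00

# Converting a Pearson r to Fisher's Z before combining correlations across studies.
r = 0.45
fisher_z = 0.5 * math.log((1 + r) / (1 - r))             # about 0.48
```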

28
Q

Limits of meta-analyses?

A

Applies only to quantitative findings (descriptive or inferential)
Findings must be conceptually comparable and configured in similar statistical forms
Studies should include similar research designs

29
Q

Which PW do most mixed-methods researchers usually hold?

A

Pragmatism
Focused on solutions, no commitment to any single notion of reality, application-focused

30
Q

Forms of data collection/generation, analysis, and interpretation in mixed methods studies (remember the chart)

A

Both predetermined and emerging methods, open and closed-ended questions, multiple forms of data drawing on all possibilities, statistical and text analysis, interpretation of many databases (stats and themes)

31
Q

When is a mixed-methods study appropriate?

A

Driven by research question
Some lend themselves to:
Breadth = quantitative design
Depth = qualitative design
Both = mixed methods
REMEMBER mixing methods just for the sake of it can produce disjointed and unfocussed research

32
Q

Decisions to make when planning mixed methods research

A

Implementation sequence (concurrent or sequential)
Priority/weighting (more qual, quant, or equal weight)
When to integrate components (data collection, data analysis, data interpretation, KT)

33
Q

Considerations for implementation sequence

A

Timing of components
Concurrent = data collected at the same time
- Concurrent nested designs = one method dominates while the other is embedded within it
Sequential = two distinct phases

34
Q

Considerations for priority/weighting

A

More qualitative, more quantitative, or equal weighting to both

35
Q

Considerations for Integration

A

When we actually mix the two together (data collection, data analysis, data interpretation, KT)
Depends on research question and initial research planning
Side-by-side comparison: merging/converging data during analysis
Independent analysis: may need findings from one phase to plan the next, qual and quant data analyzed separately/independently
Data transformation: converting one form of data to another, either qualitizing (using numerical data to describe the sample) or quantizing (counting codes/themes)
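A minimal sketch of quantizing (counting how often each qualitative code/theme appears so the counts can sit alongside quantitative data); the transcripts and codes here are hypothetical:

```python
from collections import Counter

# Hypothetical qualitative coding output: each transcript tagged with the
# themes assigned during analysis.
coded_transcripts = [
    ["barriers", "motivation"],
    ["motivation", "social support"],
    ["barriers", "barriers", "social support"],
]

# Quantizing: turn the codes into counts that can be analyzed numerically.
theme_counts = Counter(code for transcript in coded_transcripts for code in transcript)
print(theme_counts)   # Counter({'barriers': 3, 'motivation': 2, 'social support': 2})
```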

36
Q

What is an explanatory sequential mixed methods design?

A

Quantitative data collected first -> qualitative data collected in the second phase
- Used more often by quantitative researchers
EXPLAINING - sciencey

37
Q

What is an exploratory sequential mixed methods design?

A

Qualitative data collected first -> quantitative data collected in the second phase
- Used when developing a measurement tool, usually uses different samples
EXPLORE - fun and abstract

38
Q

Benefits of mixed methods research

A
  • Neutralizing weaknesses and maximizing strengths of designs
  • Triangulation (using many methods to get a comprehensive understanding of the phenomenon)
  • Comprehensiveness
  • Instrument development and testing
  • Assisting sampling (quant can assist with purposeful sampling for subsequent qual study)
  • Enhancing generalization (quant methods can enhance the transferability of qual findings)
39
Q

Challenges of mixed methods research

A
  • Blending PW
  • Bringing together diverse researchers
  • Timelines and resources
  • Sampling and analysis decisions (potential of burdening participants)
  • Publication and evaluation (determining quality of study)
    More complex and difficult than qual or quant on their own!!
40
Q

What are the key defining features of pragmatism?

A

Consequences of actions: knowledge arises from actions, situations, consequences
Problem-centered: research problem takes top priority, including over methods and issues of knowledge
Pluralistic: many approaches used to derive knowledge
Real-world practice oriented
Researchers working together on mixed-methods research have to adapt to a pragmatist WV, even if it means compromising some characteristics of their own PW

41
Q

Is mixed methods research just getting qualitative and quantitative data and putting them together?

A

No! It’s more than analyzing them separately, there has to be a MIXING of some sort
Must be systematic and purposeful
The overall strength is greater than either quant or qual (ONLY if the question lends itself to mixed methods)

42
Q

Skills of a researcher doing mixed-methods study

A

Past research experience (in both qual and quant studies, and mixed methods), training, skills, education, PW (pragmatism training or at least compromise)
It is rare to see a solo researcher doing an MM study; usually research teams include qual and quant researchers (specialists) plus someone who can bridge the gap, like a pragmatist or someone with MM experience, i.e., individuals with diverse research experiences