Research Fundamentals and Research Design Flashcards

1
Q

What is Research?

A

Systematic investigation to establish facts, principles or generalizable knowledge

2
Q

What are the characteristics of Research? (3)

A

Challenges the status quo
Creative
Systematic

3
Q

What are the Characteristics of Experimental Science? (9)

A

Systematic
Public
Peer review
Empirical Testing
Experimental Control (not always included)
Probabilistic Knowledge
Replication
Objective
Neutral

4
Q

Explain the Model of Scientific Thought:

A
5
Q

What are the characteristics of GOOD theories? (6)

A
  • Account for existing data
  • Explanatory value
  • Predictive value
  • Testable
  • Parsimonious
    Efficient explanation
    Only as complicated as necessary
  • Tentative: modified as new evidence becomes available - confirmed or not confirmed
6
Q

What is the role of hypotheses in research? (2)

A

Makes a prediction
Implicitly assumes alternative relationships are possible

7
Q

Does a research question have a prediction?

A

No prediction

8
Q

What is the intent of Research vs. Practice?

A

New knowledge (unknown benefit if any) vs treatment (assumed benefit)

9
Q

What are the innovations of Research vs Practice?

A

Novel practice vs customary practice

10
Q

What are the plans of Research vs Practice?

A

Uniformity (research tends to collect and analyze data at the group level) vs individualized (the clinician tailors treatment to the individual client)

11
Q

What does it mean that ‘’a Researcher Must Act with Integrity’’? (8)

A
  • Pursue questions that are relevant and meaningful and address important issues
  • Design research well using valid and reliable practices
  • Carry out research completely
  • Report results honestly and accurately
  • Report authorship accurately
    On reports
    No plagiarism
    Academic Integrity
  • Manage conflicts of interest
  • Manage resources honestly
  • Consider consequences to society
12
Q

What is the REB?

A

Research Ethics Board

13
Q

What is the definition of the REB?

A

Groups in an institution responsible for reviewing research proposals that will involve human subjects to determine adherence to ethical principles

14
Q

What are the roles of REBs?

A

REBs approve, reject, propose modifications to, audit, or terminate research involving human subjects within an institution

15
Q

Research involving human subjects includes: (3)

A
  • human participants,
  • human biological material
  • human data
16
Q

What doesn’t need an REB Review? (4)

A
  • Research that does not involve human subjects
  • Research on living individuals based on publicly available information (e.g., newspaper articles, biographical accounts)
  • Naturalistic observations (special case)
  • Quality assurance studies, performance reviews or testing within normal clinical or education requirements (i.e., use internal to organization)
17
Q

What are Naturalistic observations? (4)

A

No personal identifying information
Behaviour naturally occurring
Environment not staged
Behaviours are innocuous and neither revealing nor embarrassing

18
Q

What are the two rights that need to be balanced in your research by REB reviews? (2)

A
  • Rights and welfare of participant
  • Right of an experimenter to seek knowledge
19
Q

REB reviews focus upon: (6)

A
  • Attainment of ethical principles
  • Scientific merit
  • Risk – Minimizing it & Risk-benefit ratio analysis
  • Recruitment of participants
  • Informed consent processes and documents
    Disclosure, comprehension, voluntariness, competence
    If deception, well justified & plans for debriefing
  • Data storage & management - Confidentiality
20
Q

What are the important concepts to remember? (4)

A

Respect human rights and dignity
Respect for person / community
Risk-Benefit Analysis
Just

20
Q

What does it mean to Respect human rights and dignity? (2)

A

Morally acceptable ends
Morally acceptable means (methods) to those ends.

20
Q

What does it mean to show Respect for person / community? (4)

A
  • Free and informed consent (autonomy)
  • Respect and protect the vulnerable
  • Recognition of traditionally exploited groups
  • Respect for privacy and confidentiality and anonymity where possible
21
Q

What does the important concept of Risk-Benefit Analysis mean? (2)

A

Beneficence
Non-maleficence

22
Q

What does the important concept of being Just mean? (2)

A

Fair
Inclusive

22
Q

What are 3 research paradigms (Models/Examples)?

A

Quantitative
Qualitative
Single-subject

23
Q

What are the two research types?

A

Experimental vs Observational/Descriptive/Non-experimental

24
Q

What are the differences between Quasi-experimental and Experimental designs?

A

Quasi-experimental: participants are not randomly assigned to groups
Experimental: participants are randomly assigned to the treatments (see the sketch below)
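
As a rough illustration of that distinction, here is a minimal sketch (Python, with made-up participant IDs and group sizes) of the random-assignment step that defines a true experiment; a quasi-experimental design would instead take pre-existing groups (e.g., two intact classrooms) as it finds them.

```python
# A minimal sketch of random assignment (hypothetical participant IDs).
import random

participants = [f"P{i:02d}" for i in range(1, 21)]  # 20 hypothetical participants

random.seed(42)                # fixed seed so the split is reproducible
random.shuffle(participants)   # the random-assignment step
treatment_group = participants[:10]
control_group = participants[10:]

print("Treatment group:", treatment_group)
print("Control group:  ", control_group)
```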

24
Q

What is Descriptive research?

A

The researcher just looks at the data and does not try to assign any treatment to subjects

24
Q

What are examples of Descriptive Research? (5)

A

Developmental
Normative
Correlational & Predictive
Qualitative
Case study

24
Q

What is the difference between Basic and Clinical/Applied research?

A
  • Basic research
    Used to develop, define, and test theory
    Not motivated by practical application
    Knowledge for knowledge’s sake
    May have clinical/applied implications; not directly tested
  • Clinical or applied research
    Directed toward an immediate practical problem
25
Q

What are the three types of variables?

A

Independent
Dependent
Confounding/Extraneous

26
Q

Describe Independent Variables? (2)

A

Experimenter manipulates
Active vs assigned

27
Q

Describe Dependent variables?

A

Influenced by or co-varies with independent variable

28
Q

Describe Confounding/Extraneous variables? (3)

A

Things that vary but are not the focus of the study
Can affect results, so try to control or minimize them
May be beyond the control of the experimenter; try to account for them

29
Q

Variables are _________________, _____________________, ___________

A

Operationally Defined
Reliable
Valid

30
Q

What are the two main types of Experimental Validity?

A

Internal Validity
- Construct Validity
- Statistical Conclusion Validity
External Validity

31
Q

Describe Internal Validity. (2)

A
  • Degree to which the causal relationship is properly demonstrated
  • Higher internal validity means greater control over extraneous variables, eliminating or lessening their effect on the DV, so one can state more confidently that changes in the DV are the result of manipulation of the IV
32
Q

What are threats to internal validity? (11)

A
  • History
    External influences over time
    Interaction between Different Treatments (in text under construct validity)
  • Maturation
    Internal changes over time
    Between sessions
    Within session
  • Testing Effects
    Interaction between Testing and Treatment (in text under construct validity)
  • Instrumentation
    Are changes in the measuring tools influencing findings?
    Appropriate calibration of instruments
    Humans as instruments - who collects the data
    Training, evidence of reliability between testers
    Biases – e.g., experimenter/interventionist collecting or scoring data, testers blinded?
  • Statistical Regression to the Mean (see the sketch after this list)
  • Participant Selection & Assignment
    Inclusion criteria
    Exclusion criteria
    Homogeneous groups
    Random assignment
    Matching
    Match then random
  • Subject Attrition (Mortality)
  • Diffusion or Imitation of Treatment
  • Compensatory Equalization of Treatment
  • Compensatory Rivalry or Resentful Demoralization
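
The regression-to-the-mean threat above lends itself to a quick numeric illustration. Below is a minimal, hypothetical sketch (Python standard library only; the population parameters and cutoff are made up) showing that participants selected for extreme pretest scores drift back toward the population mean on retest even when no treatment is given.

```python
# A minimal sketch of statistical regression to the mean (hypothetical data).
import random
import statistics

random.seed(1)

POP_MEAN = 100  # true population mean of the measured trait

def observed_score(true_score):
    """One noisy measurement of a participant's stable true score."""
    return true_score + random.gauss(0, 15)  # measurement error

# Simulate a population in which each person has a stable true score
true_scores = [random.gauss(POP_MEAN, 10) for _ in range(10_000)]
pretest = [observed_score(t) for t in true_scores]

# "Select" the participants who scored worst on the pretest
selected = [i for i, score in enumerate(pretest) if score < 80]

# Retest the same people -- no intervention has occurred
posttest = [observed_score(true_scores[i]) for i in selected]

print("Pretest mean of selected group:",
      round(statistics.mean(pretest[i] for i in selected), 1))
print("Posttest mean of selected group:",
      round(statistics.mean(posttest), 1))
# The posttest mean sits noticeably closer to 100: an apparent "improvement"
# that, without a comparison group, could be mistaken for a treatment effect.
```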
33
Q

What measurement/instrument considerations affect internal validity? (2)

A
  • Reliability and validity of measures (see the agreement sketch below)
  • Meaning of the variables within the study
    Does label match how operationalized?
    Construct Underrepresentation
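
As one deliberately simple illustration of checking the reliability of a measure, the hypothetical sketch below computes percent agreement between two independent raters scoring the same ten responses; the ratings are invented for illustration, and real studies would typically also report a chance-corrected statistic.

```python
# A minimal sketch of inter-rater reliability as simple percent agreement
# (hypothetical ratings: 1 = response scored correct, 0 = scored incorrect).
rater_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 1, 1, 1, 1]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = 100 * agreements / len(rater_a)
print(f"Inter-rater agreement: {percent_agreement:.0f}%")  # 80% here
```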
34
Q

What is the meaning of experimenter expectancies in internal validity?

A
  • Participants/observers/testers ‘guess’ what experimenter wants & act accordingly
    Participants knowing purpose of study, group they are in changes behaviour
    Tester knowing group assignment
  • Rosenthal effect
  • Placebo effect
  • Hawthorne effect
35
Q

External Validity describes:

A

How well you can extend findings from your study to a population of interest in real life
How generalizable the findings are

36
Q

What are threats to External validity? (5)

A

Selection of Participants (Heterogeneous group favored)
Setting (Experimental Arrangements)
Time
Reactive Testing
Multiple Treatment Interactions

37
Q

What is the Rosenthal Effect?

A

High expectations lead to improved performance and vice versa

38
Q

What is the Placebo Effect?

A

Person’s physical or mental health appears to improve after taking a placebo or ‘dummy’ treatment

39
Q

What is the Hawthorne effect?

A

Modification of behavior by study participants in response to their knowledge that they are being observed or singled out for special treatment

40
Q

What is Maturation?

A

Threat that is internal to the individual participant.

It is the possibility that mental or physical changes occur within the participants themselves that could account for the evaluation results.

41
Q

What is Testing Effect?

A

Threat to Internal Validity

When scores on the post-test are influenced by simple exposure to the pre-test.