Final Deck 1 Flashcards

1
Q

major types of bias:

A
  • confirmation bias
  • response bias
  • selection/sampling bias
  • recall bias
  • misinterpretation/confounding bias
  • publication bias
2
Q

confirmation bias

A
  • the tendency to gather evidence, or weight consideration toward evidence, that CONFIRMS preexisting or preferred expectations, while dismissing or failing to find contradictory evidence.
  • it is often easier for people to discount new information than to seek it out.
  • EXAMPLE: being more likely to consider positive reviews about an item than negative reviews.
  • EXAMPLE: being more likely to accept positive information about a preferred presidential candidate than negative information.
3
Q

recall bias

A
  • a retrospective reporting error that occurs when people recall past information inaccurately.
  • EXAMPLE: parents of children diagnosed with ASD may recall events from before the diagnosis differently, especially if they are not ready to face the diagnosis.
4
Q

response bias

A
  • when responses of the participants are influenced by variables other than the construct being measured.
  • EXAMPLE: people responding to a survey inaccurately because their responses are influenced by other variables, such as environmental or social pressure to respond a certain way.
5
Q

selection/sampling bias

A
  • systematic and directional error in the choice of units/cases/participants from the larger group under study
  • affects external validity
  • EXAMPLE: sending a survey only to your classmates, friends, and family; this is not a diverse sample, and these people are likely to think the way you do, which creates bias.
6
Q

confounding/misinterpretation bias

A
  • incorrectly attributing an association to two variables when a third factor, independently associated with both the independent and dependent variables, actually accounts for it
  • affects internal validity.
  • EXAMPLE: a study that looks for a relationship between shoe size and height but does not account for age.
7
Q

publication bias

A
  • tendency for study results that are published in journals or other outlets to more likely show positive or statistically significant findings
  • EXAMPLE: journals only post studies that are statistically significant
8
Q

post hoc ergo propter hoc

A
  • Latin: after this, therefore because of this.
  • the fallacy of assuming that because one thing happened before another, it caused it.
  • correlation does not equal causation.
  • EXAMPLE: got in a wreck because it was raining; failed the class because of the teacher.
9
Q

scientific method

A
  • a set of procedures, guidelines, assumptions, and attitudes required for the organized and systematic collection, interpretation, and verification of data and the discovery of reproducible evidence.
  1. question/observation
  2. research
  3. hypothesis
  4. experiment
  5. analysis
  6. report
10
Q

Where does the literature review fall in the scientific method?

A
  • the research phase.
  • the literature must be reviewed to know what is out there already so that a hypothesis can be formed based on the gaps.
11
Q

What is the difference between theory vs. hypothesis?

A
  • A hypothesis is more specific; a theory is broader.
  • hypothesis: a prediction of results specific to one study.
  • theory: an explanation of a phenomenon based on multiple studies.
  • theory and hypothesis feed into one another.
12
Q

Experimental vs. Non-experimental Design

A
  • experimental has control over the independent variable
  • non-experimental does not
13
Q

Experimental Design

A
  • Manipulates the independent variable to see changes in the dependent variable.
  • Has control and experimental groups.
  • Has random sampling and assignment.
  • Can be blind/double blind.
14
Q

Non-experimental Design

A
  • Less variable control
  • More descriptive/applied
  • Correlational studies
  • Types: surveys, polls, interviews, case studies.
15
Q

Quantitative vs Qualitative

A
  • Quantitative uses numerical data, usually deductive
  • Qualitative uses data collected in words and is used to make observations, analyze narratives, and identify themes; usually inductive
16
Q

Quantitative pros and cons

A
  • pros: standardization, reliability, easy to analyze statistically, larger samples collected quickly.
  • cons: less ability to capture rich, descriptive characteristics.
17
Q

Qualitative pros and cons

A
  • pros: greater depth and exploration
  • cons: more time consuming, more intensive work
18
Q

translational research

A
  • a bridge between basic research (research without a specific problem in mind) and applied research (research conducted to address a specific problem in society, such as testing a specific intervention or medication).
19
Q

Ethics

A
  • the principles of morally correct conduct accepted by a person or group considered appropriate to a specific field.
  • EXAMPLE: in psychological research, proper ethics requires that participants be treated fairly and without harm and that investigators report results and findings honestly.
20
Q

research ethics

A
  • the values, principles, and standards that guide the conduct of individual researchers in several areas, including the design and implementation of studies and the reporting of findings.
  • EXAMPLE: research ethics stipulate that studies involving data collection from human participants must be evaluated by institutional review boards.
21
Q

code of ethics

A
  • each organization has its own provisions and code of ethics to follow.
  • the goal is to outline behaviors that take place in the field.
  • it takes the four principles of ethics and translates them into specific behaviors that can be good or bad.
22
Q

Two principles from ASHAs Code of Ethics

A
  1. “Individuals shall honor their responsibility to hold paramount the welfare of persons they serve professionally or who are participants in research and scholarly activities, and they shall treat animals involved in research in a humane manner.”
    -Summary: Prioritize the welfare of the participants of the study.
  2. “Individuals shall honor their responsibility to achieve and maintain the highest level of professional competence and performance.”
    - Summary: Maintain professional competence.
23
Q

Tuskegee

A
  • A study that followed people (sharecroppers) who already had syphilis and withheld treatment from them, even after penicillin was shown to be effective.
  • Because treatment was withheld, many people got sick and died.
  • It led to enormous public outrage and caused Congress to pass the National Research Act (which created the commission that developed the Belmont Report).
24
Q

The Belmont Report

A
  • The Belmont Report led to the eventual development of 4 ethical principles:
  • beneficence, nonmaleficence, autonomy (informed consent), and justice (fairness; the equitable distribution of the benefits and risks of the study).
25
Q

The Common Rule

A
  • The standardization of participant protections.
  • Came about in the 90s.
  • Used by almost all federal departments with regard to research.
26
Q

Vulnerable Populations

A

Additional protections for research with:
- prisoners
- children
- pregnant women

27
Q

List/Describe 4 ethical principles

A
  1. Beneficence: Doing good, being kind, improving well-being
  2. Nonmaleficence: Avoid doing harm, focus on NOT reducing wellbeing
  3. Autonomy: A person’s right to make his/her own decisions; competence; the autonomous rights of one person should not infringe on the rights of another (respect for persons; informed consent)
  4. Justice: Fair distribution of risks/benefits of study across the population
28
Q

Ethical Dilemma

A

when two or more of the ethical principles come in conflict

29
Q

Give an ethical dilemma example

A
  • My school district is in a rural part of the state and has a difficult time recruiting and retaining ASHA certified SLPs. Therefore, by necessity, the district is hiring less qualified clinicians and support personnel and delegating supervisory responsibilities to me.
  • beneficence and nonmaleficence are in conflict
30
Q

What is an IRB?

A
  • Their scope is to protect the participants of the planned study.
  • They review every aspect of research to make sure it complies with the requirements of the government and of the university
31
Q

How does the IRB work?

A

Ensures that:
-Risks are minimized
-Risks are reasonable compared to benefits
-Selection of participants is equitable
-Informed consent will be obtained
-Confidentiality is adequately maintained

Allows for exemptions if participants cannot be identified, such as:
-Surveys, interviews, questionnaires
-Studies of existing records
-Research on normal educational processes

Also allows for expedited review if the study poses only minimal risk

32
Q

Describe a literature review and 3 reasons to conduct one

A
  • A literature review goes through all of the previous work in an area of interest.
  • A standalone literature review summarizes and updates the current research.

Reasons to conduct one include:
- understand previous literature and your place in it
- identify useful or flawed measures
- avoid “dead end” topics
- get ideas on reporting structure

33
Q

How is a literature review different from a systematic review vs. meta analysis?

A
  • A literature review aims to create an overall summary of the literature and identify trends/gaps.
  • A systematic review is narrower: it uses a structured, predefined protocol and focuses on a specific research question.
  • A meta-analysis is a type of study that uses quantitative techniques to take previous studies and re-analyze their statistical results in terms of effect size.
  • It does this to pool comparable studies and compute average effects.
34
Q

EBSCO vs. Database vs. Journal.

A
  • EBSCO is like a search engine/search service for articles (like Google).
  • A database is a curated collection of journals (MEDLINE, PsycINFO, CINAHL, etc.)
  • A journal is an individual publication that publishes studies (ex: New England Journal of Medicine)
35
Q

Define a predatory journal

A
  • Predatory journals are publications that claim to be legitimate scholarly journals, but misrepresent their publishing practices.
  • Some common forms of predatory publishing practices include falsely claiming to provide peer review, hiding information about Article Processing Charges (APCs), misrepresenting members of the journal’s editorial board, and other violations of copyright or scholarly ethics.
36
Q

List/describe 3 characteristics of a predatory journal

A
  • False claims about peer review
  • misrepresenting numbers/members of the editorial board
  • violations of copyright
37
Q

List 3 factors that contributed to the development of predatory journals

A
  • Proliferation of research (open access journals- ex: journals that do not require a subscription)
  • Groups launched very large, totally free open access journals on the internet and began to misrepresent their publishing practices
  • rising journal subscription costs to libraries
    Prices of journals and databases have gone up, so it’s harder for libraries to maintain the databases
  • tenure pressure
  • certain researcher-paid open access journals with lax or no peer review

*The internet allowed predatory journals to really take off

38
Q

Predatory journals vs. online journals

A
  • These are not the same thing
  • Journals can be online and open access and still be legitimate
39
Q

Impact Factor

A
  • A calculation based on citations to a journal’s published articles, as indexed in a citation database such as Scopus (Elsevier).
  • An impact factor is given to a journal, and it is a measure of the number of times people cite that journal’s articles.
  • More citations lead to a higher impact factor, which makes the journal seem more reliable
  • impact metrics like this are often used to help identify and combat predatory journals
40
Q

Research interest –> research topic –> research question

A
  • Identify a research interest (ex: Alzheimer’s Disease) and search it in a database
  • To get to a research topic, add specific details to narrow it down (who, what, where, when, why)
  • A research question is the most specific

*Library resources help by providing the existing research; reading abstracts can give you ideas about which variables to choose

41
Q

What does a research question specify?

A
  • Who you are going to be evaluating
  • What specific construct are you researching (operational definition)
  • Where will you conduct the study
  • When will you conduct the study and how long will it take?
  • A minimum of 2 variables
42
Q

What is the FINER criteria for a research question?

A
  • Feasible (ample participants, time, $)
  • Interesting
  • Novel (confirms/refutes/extends previous findings, or finds something new)
  • Ethical
  • Relevant (research, clinical)
43
Q

How does operationalizing variables impact the statistical design?

A

When you operationalize the constructs, it feeds directly into the design (see the sketch after this list) because it determines
- what type of measurements to use (interviews, assessments, surveys)
- the type of data you will need (ordinal, nominal, etc.)
- how you will compare changes in the construct (pre-test/post-test, correlation, etc.)

44
Q

Define Unit of Analysis

A
  • the unit that you are collecting your data on → this is usually individual people (ex: parent perceptions of child progress in speech-language services).
  • The unit of analysis can also be a group (ex: school district report cards- the individual district is rated).
  • For this class’s research project, the unit of analysis is the individuals taking the survey.
  • *it helps with design because once you know the unit and the construct, you can operationalize the construct.
  • Ex: students are the unit of analysis; their anxiety is the construct, operationalized by scores on an anxiety assessment
45
Q

Operational definition vs. variable

A
  • Operational definitions need to define the observable traits that can be measured to gain information about a construct
  • a variable is something that can change in and between participants (ex: level of anxiety)
46
Q

Conceptual vs. Operational Definitions

A
  • Conceptual definitions explain the background, understanding, and theoretical framework of an idea
  • An operational definition explains how the idea will be observed and measured in the study
  • background vs measure
47
Q

constant vs variable

A
  • A constant is a characteristic or value that does not change throughout the experiment.
  • A variable is something that can change or is manipulated to change.
48
Q

List 2 types of variables

A

Discrete, continuous

49
Q

Define discrete variable

A

Has a finite range/set of categories (ex: Likert scale)
-discrete variables can be further broken down into dichotomous and polytomous variables
-dichotomous: having 2 categories
-polytomous: having 3 or more categories

50
Q

Define continuous variable

A
  • Has an infinite range of possible values (ex: temperature, which can fall between whole degrees)
51
Q

Contrast independent and dependent variable

A
  • Independent variable is the thing you test or change
  • Dependent variable is what happens as a result of the independent variable manipulation
52
Q

What is the key difference between mediator and moderator variables?

A
  • A mediator variable sits in the causal chain between the independent and dependent variables (IV → mediator → DV)
  • a moderator is not part of the chain; it influences the strength of the relationship between the independent and dependent variables.
53
Q

Define/give an example of a mediator variable

A
  • A mediator variable has a direct line of relationship that affects the dependent variable.
  • The independent variable causes the mediator, which in turn causes the dependent variable.
  • Ex: temperament. A certain personality may make you more likely to have a specific temperament, and that temperament then directly affects emotional regulation.
  • Ex: study time mediating the effect of motivation on test scores (more motivation → more study time → higher score).
54
Q

Define/give an example of a moderator variable

A
  • Can affect the relationship between the independent, mediator, and dependent variables.
  • There is no direct cause from the independent variable to the moderator (it is not part of the chain, but it can affect the relationship).
  • Ex: In the relationship between emotional reactivity and emotional dysfunction, studies have demonstrated a relationship between the independent variable (reactivity) and the dependent variable (dysfunction).
  • Emotional coping mechanisms → there is no relationship between emotional reactivity and coping mechanisms.
  • But coping mechanisms influence the strength of the relationship between reactivity and dysfunction (people who are taught coping mechanisms are less likely to become dysfunctional when reactive). – The coping mechanisms are a MODERATOR because they are not part of the chain
55
Q

List the 4 scales of measurement

A

Nominal, Ordinal, Interval, Ratio

56
Q

Nominal Scale- describe/give example

A
  • Nominal Scale- the most basic scale of measurement.
  • The categories are arbitrary- the only requirement is that two different points on the scale cannot have the same value
  • EXAMPLE: diagnostic categories or participant group labels
  • property: identity
  • mathematical operation: count
  • descriptive stats: mode
57
Q

Ordinal Scale-describe/give example

A
  • Follows the rule of magnitude (order): one value is higher than another.
  • The points on the scale have a magnitude relationship between the different values.
  • The values have a relationship of magnitude, but there is no set/equal amount between them.
  • Example- Likert scale: usually a 5-7 point ordinal scale (strongly agree/agree/neutral- there is no equal/set distance between the points)
  • property: identity, magnitude
  • mathematical operations: rank order
  • descriptive stats: median, mode
58
Q

Interval Scale-describe/give example

A
  • has identity, magnitude (order), and equal intervals between the points. An example is an IQ score (which has a mean of 100 and an SD of 15).
59
Q

What is the difference between interval and ordinal scales?

A
  • An ordinal scale has an order, but there is no equal distance between the values (agree/strongly agree etc)
  • An interval scale also has an order, and the intervals between values are equal and meaningful (ex: IQ scores)
60
Q

Ratio Scales-describe/give example

A
  • Has a starting point at absolute zero
  • ex: the number of times someone stutters (a count that starts at a true zero)
61
Q

What is the difference between an interval and ratio scale?

A

The interval scale does not start at absolute zero, whereas the ratio scale does