SOCW-427 MIDTERM STUDY DECK Flashcards
Uncritical documentation
Assuming that because something is described in the literature it must be true; literature is cited, but no information is given about how the cited author arrived at a conclusion
Procedural fidelity
The match between how a method should be implemented for maximal effect and how it is actually implemented
Evidence-based practice
A process in which practitioners consider the best scientific evidence available pertinent to a particular practice decision as an important part of their decision making.
Attributes of Evidence-Based Practice
-Critical thinking
-Career-long learning
-Flexibility
-Integrating scientific knowledge with practice expertise and knowledge of client attributes
Evidence-based practitioners will:
-Think for themselves
-Consider whether beliefs or assertions of knowledge are based on sound evidence and logic
-Think open-mindedly, recognizing and questioning unstated assumptions underlying beliefs and assertions
-Be willing to test their own beliefs or conclusions and then alter them on the basis of new experiences and evidence
-Formulate appropriate questions and then gather and appraise evidence as a basis for making decisions
Steps in Evidence-Based Practice
Step 1: Formulate a Question to Answer Practice Needs
Step 2: Search for the Evidence
Step 3: Critically Appraise the Relevant Studies You Find
Step 4: Determine Which Evidence-Based Intervention Is Most Appropriate for Your Particular Client(s)
Step 5: Apply the Evidence-Based Intervention
Step 6: Evaluation and Feedback
Four common types of EBP questions
1. What intervention, program, or policy has the best effects?
2. What factors best predict desirable or undesirable consequences?
3. What’s it like to have had my client’s experiences?
4. What assessment tool should be used?
Reasons for Studying Research
-To increase your practice effectiveness by critically appraising research studies that can inform practice decisions (publication does not guarantee quality)
-The NASW Code of Ethics requires research utilization
-Compassion for clients
How do social workers know things?
-Agreement reality
-Experiential reality
-Science
-Tradition (such as accumulated practice wisdom that has not been scientifically verified)
-Authority (relying on “experts”)
-Common sense
-Popular media
The Scientific Method
-All knowledge is provisional and subject to refutation (everything is open to question)
-Knowledge is based on observations that are:
—Orderly and comprehensive (avoidance of overgeneralization)
—As objective as possible
—Replicated in different studies
Flaws in Unscientific Sources
-Inaccurate Observation
-Overgeneralization
-Selective Observation
-Ex Post Facto Hypothesizing
-Ego Involvement in Understanding
-Premature Closure of Inquiry
Critical Thinking
Careful appraisal of beliefs and actions to arrive at well-reasoned ones that maximize the likelihood of helping clients and avoiding harm.
What is required for critical thinking?
1) Problem Solving 2) Clarity of Expression 3) Critical appraisal of evidence and reasons 4) Consideration of alternative points of view
Pseudoscience
Makes science-like claims with no evidence
Quackery
Promotion of something known to be false or untested.
Fundamental Attribution Error
The tendency to attribute the cause of behaviors to personal characteristics instead of the environment
Behavioral Confirmation Bias
The tendency to search for data that support favored positions and to ignore data that do not
Criteria of evidence-informed client choice
1) The decision involves which intervention to use 2) The person is given research-based information about effectiveness of at least two alternatives, which may include doing nothing 3) The person provides input in the decision-making
Questions that address Social Validity concerns
1) Are the goals important and relevant to desired change? 2) Are the methods acceptable, or too costly? 3) Are clients happy with expected and unexpected outcomes?
Cultural Competence
being aware of and appropriately responding to the ways in which cultural factors and cultural differences should influence what we investigate, how we investigate, and how we interpret our findings
Steps to improve cultural competence
-Cultural immersion: cultural and scientific literature; cultural events, travel, etc.
-Participant observation (Chap 18)
-Advice from colleagues who are members of the culture of interest
-Input from community members/leaders
-Focus groups
Three main threats to culturally competent measurement include:
1. The use of interviewers whose personal characteristics or interviewing styles offend or intimidate minority respondents or make them reluctant to divulge relevant and valid information
2. The use of language that minority respondents do not understand
3. Cultural bias
Quantitative research methods
Research methods that seek to produce precise and generalizable findings. Studies using quantitative methods typically attempt to formulate all or most of their research procedures in advance and then try to adhere precisely to those procedures with maximum objectivity as data are collected.
Qualitative research methods
Research methods that are more flexible than quantitative methods, that allow research procedures to evolve as more observations are gathered, and that typically permit the use of subjectivity to generate deeper understandings of the meanings of human experiences.
Mixed methods research
A stand-alone research design in which a single study not only collects both qualitative and quantitative data, but also integrates both sources of data at one or more stages of the research process so as to improve the understanding of the phenomenon being investigated.
Quantitative Methods Emphasize:
-Precision
-Generalizability
-Testing hypotheses
Qualitative Methods Emphasize:
-Deeper understandings
-Describing contexts
-Generating hypotheses
-Discovery
Which method specifies research procedures in advance?
Quantitative
Which method is flexible, allowing research procedures to evolve as data are gathered?
Qualitative
Quantitative Collection
Data collected in an office, agency, mail, or internet setting
Qualitative Collection
Data collected in natural environment of research participants
Quantitative Emphases
-Deductive
-Larger samples
-Objectivity
-Numbers/statistics
-Less contextual detail
-Close-ended questions
-Less time-consuming
-Easier to replicate
Qualitative Emphases
-Inductive
-Smaller samples
-Subjectivity
-Words/patterns
-Rich descriptions
-Open-ended questions
-More time-consuming
-Harder to replicate
What makes a good research question?
-Is narrow and specific
-Has more than one possible answer
-Is posed in a way that can be answered by observable evidence
-Addresses the decision-making needs of agencies or practical problems in social welfare
-Has clear significance for guiding social welfare policy or social work practice
-Is feasible to answer
What are some feasibility issues with research?
-Scope of study
-Time required
-Fiscal costs
-Ethical considerations
-Cooperation required from others
-Obtaining advance authorization
Hypothesis
Tentative and testable statement about a presumed relationship between variables
Independent Variable
The variable in a hypothesis that is postulated to explain or cause another variable
Dependent Variable
The variable in a hypothesis that is thought to be explained or caused by the independent variable
Hypotheses should be:
- clear and specific
- have more than one possible outcome
- value free
- testable
Nominal Level of Measurement
Describes a variable in terms of the number of cases in each category of that variable.
Examples:
-gender
-ethnicity
-religious affiliation
Ordinal level of measurement
Describes a variable whose categories can be rank-ordered according to how much of that variable they are. We know only whether one case has more or less of something than another case, but we don’t know precisely how much more.
Examples:
-level of client satisfaction
-brief rating scale
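As a rough illustration (in Python, with hypothetical data), a nominal variable supports only counting cases per category, while an ordinal variable additionally supports rank comparisons, without meaningful distances between ranks:

```python
from collections import Counter

# Nominal: categories have no inherent order; we can only count
# the number of cases in each category (hypothetical data).
ethnicity = ["Latino", "Black", "White", "Latino", "Asian"]
nominal_counts = Counter(ethnicity)

# Ordinal: categories can be rank-ordered (e.g., a client-satisfaction
# rating scale), but the distance between ranks is unknown.
satisfaction_levels = ["low", "medium", "high"]  # rank order only
rank = {level: i for i, level in enumerate(satisfaction_levels)}

# We can say one response ranks higher than another...
assert rank["high"] > rank["low"]
# ...but "high minus low" has no meaningful numeric magnitude.
```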
Reliability
-A particular measurement technique, when applied repeatedly to the same object, would yield the same result each time
-The more reliable the measure, the less random error
Validity
Are you measuring what you are supposed to be measuring?
Face Validity
A crude and subjective judgment by the researcher that a measure merely appears to measure what it is supposed to measure
Content Validity
-The degree to which a measure covers the range of meanings included within the concept -Established based on judgments as well
Bias
A distortion in measurement based on personal preferences and beliefs.
Random error
A measurement error that has no consistent pattern of effects.
Element
The unit selected in a sample about which information is collected.
Population
The theoretically specified aggregation of study elements.
Study population
The aggregation of elements from which the sample is actually selected.
Random selection
A sampling method in which each element has an equal chance of selection independent of any other event in the selection process.
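The definition above can be sketched in a few lines of Python; the population here is a hypothetical list of 100 case IDs, and `random.sample` draws without replacement so each element has an equal chance of selection:

```python
import random

# Fixed seed so the example is reproducible.
random.seed(42)

# Hypothetical study population: 100 case IDs.
study_population = list(range(1, 101))

# Simple random selection of 10 elements without replacement:
# every element has an equal chance of being chosen.
sample = random.sample(study_population, k=10)
```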
Overgeneralization
Assuming that a few similar events are evidence of a general pattern.
Selective Observation
After concluding that a pattern exists, paying attention to only the data that supports the pattern that was identified.
Cross-sectional study
A snapshot in time: just one measurement, with no follow-up.
Longitudinal study
Studies that conduct observations at different points in time.
Paradigm
A set of philosophical assumptions about the nature of reality; a fundamental model or scheme that organizes our view of some things.
Contemporary positivism
A paradigm that emphasizes the pursuit of objectivity in our quest to observe and understand reality.
Social constructivism
A paradigm that emphasizes multiple subjective realities and the difficulty of being objective.
Interpretivism
A research paradigm that focuses on gaining an empathic understanding of how people feel inside, seeking to interpret individuals’ everyday experiences, their deeper meanings and feelings, and the idiosyncratic reasons for their behaviors. Associated with qualitative methods.
Critical social science
A research paradigm distinguished by its focus on oppression and its commitment to using research procedures to empower oppressed groups.
Feminist paradigm
A research paradigm, like the critical social science paradigm, distinguished by its commitment to using research procedures to address issues of concern to women and to empower women.
Theory
A systematic set of interrelated statements intended to explain some aspect of social life or enrich our sense of how people conduct and find meaning in their daily lives.
Culturally competent research
Research that is sensitive and responsive to the ways in which cultural factors and cultural differences influence what we investigate, how we investigate, and how we interpret our findings.
Three Ethical Controversies
Observing Human Obedience
Trouble in the Tearoom
Social Worker Submits Bogus Article to Test Journal Bias
Systematic Error
When the information we collect consistently reflects a false picture.
Biases are the most common way our measures systematically capture something other than what we think they do, e.g.:
-Acquiescent response set
-Social desirability bias
Random error
Random errors have no consistent pattern of effects. They do not bias the measures.
Examples:
-Cumbersome, complex, boring measurement procedures
-The measure uses professional jargon that respondents are not familiar with
Stratification
The grouping of units making up a population into homogeneous groups (or strata) before sampling.
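A sketch of proportionate stratified sampling in Python (the population, stratum labels, and sampling fraction are all hypothetical): group the population into homogeneous strata first, then draw a simple random sample from each stratum.

```python
import random
from collections import defaultdict

random.seed(0)  # fixed seed for reproducibility

# Hypothetical population of 40 cases, each tagged with a stratum
# (here, region): 30 urban and 10 rural.
population = [("case%d" % i, "urban" if i % 4 else "rural")
              for i in range(40)]

# Step 1: group units into homogeneous strata before sampling.
strata = defaultdict(list)
for case_id, region in population:
    strata[region].append(case_id)

# Step 2: draw a simple random sample from each stratum,
# proportionate to the stratum's size.
sampling_fraction = 0.25  # sample 25% of each stratum
sample = []
for region, members in strata.items():
    k = max(1, round(len(members) * sampling_fraction))
    sample.extend(random.sample(members, k))
```

Because each stratum is sampled separately, subgroups that simple random selection might under-represent by chance are guaranteed proportionate representation.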
Purposive sampling
Selecting a sample based on your own judgment about which units are most representative or useful.
Criteria for Inferring Causality
1) Cause (independent variable) must precede the effect (dependent variable) in time
2) The two variables are empirically correlated with one another
3) The observed empirical correlation between the two variables cannot be due to the influence of a third variable that causes both of the variables under consideration
Quasi-experimental Designs
Designs that attempt to control for threats to internal validity, and thus permit causal inferences, but are distinguished from true experiments primarily by the lack of random assignment of subjects.