SOCW-427 MIDTERM STUDY DECK Flashcards

1
Q

Uncritical documentation

A

Assuming that because something is described in the literature it must be true; literature is cited, but no information is given about how the cited author arrived at a conclusion

2
Q

Procedural fidelity

A

the match between how a method should be implemented for maximal effect and how it is implemented

3
Q

Evidence-based practice

A

A process in which practitioners consider the best scientific evidence available pertinent to a particular practice decision as an important part of their decision making.

4
Q

Evidence-Based Practice

A

a process in which the best scientific evidence pertinent to a practice decision is an important part of the information practitioners consider when making that practice decision.

5
Q

Attributes of Evidence-Based Practice

A

-Critical thinking
-Career-long learning
-Flexibility
-Integrating scientific knowledge with practice expertise and knowledge of client attributes

6
Q

Evidence-based practitioners will:

A

-Think for themselves
-Consider whether beliefs or assertions of knowledge are based on sound evidence and logic
-Think open-mindedly, recognizing and questioning unstated assumptions underlying beliefs and assertions
-Be willing to test their own beliefs or conclusions and then alter them on the basis of new experiences and evidence
-Formulate appropriate questions and then gather and appraise evidence as a basis for making decisions

7
Q

Steps in Evidence-Based Practice

A

Step 1: Formulate a Question to Answer Practice Needs
Step 2: Search for the Evidence
Step 3: Critically Appraise the Relevant Studies You Find
Step 4: Determine Which Evidence-Based Intervention Is Most Appropriate for Your Particular Client(s)
Step 5: Apply the Evidence-Based Intervention
Step 6: Evaluation and Feedback

8
Q

Four common types of EBP questions

A
1. What intervention, program, or policy has the best effects?
2. What factors best predict desirable or undesirable consequences?
3. What’s it like to have had my client’s experiences?
4. What assessment tool should be used?

9
Q

Reasons for Studying Research

A

-To increase your practice effectiveness by critically appraising research studies that can inform practice decisions (publication does not guarantee quality)
-The NASW Code of Ethics requires research utilization
-Compassion for clients

10
Q

How do social workers know things?

A

-Agreement reality
-Experiential reality
-Science
-Tradition (such as accumulated practice wisdom that has not been scientifically verified)
-Authority (relying on “experts”)
-Common sense
-Popular media

11
Q

The Scientific Method

A

-All knowledge is provisional and subject to refutation (everything is open to question)
-Knowledge is based on observations that are:
—Orderly and comprehensive (avoidance of overgeneralization)
—As objective as possible
—Replicated in different studies

12
Q

Flaws in Unscientific Sources

A

-Inaccurate Observation -Overgeneralization -Selective Observation -Ex Post Facto Hypothesizing -Ego Involvement in Understanding -Premature Closure of Inquiry

13
Q

Critical Thinking

A

Careful appraisal of beliefs and actions to arrive at well-reasoned ones that maximize the likelihood of helping clients and avoiding harm.

14
Q

What is required for critical thinking?

A

1) Problem Solving 2) Clarity of Expression 3) Critical appraisal of evidence and reasons 4) Consideration of alternative points of view

15
Q

Pseudoscience

A

Makes science-like claims with no evidence

16
Q

Quackery

A

Promotion of something known to be false or untested.

17
Q

Fundamental Attribution Error

A

The tendency to attribute the cause of behaviors to personal characteristics instead of the environment

18
Q

Behavioral Confirmation Bias

A

The tendency to search for data that support favored positions and to ignore data that do not

19
Q

Criteria of evidence-informed client choice

A

1) The decision involves which intervention to use 2) The person is given research-based information about effectiveness of at least two alternatives, which may include doing nothing 3) The person provides input in the decision-making

20
Q

Questions that address Social Validity concerns

A

1) Are the goals important and relevant to desired change? 2) Are the methods acceptable or too costly? 3) Are clients happy with expected or unexpected outcomes?

21
Q

Cultural Competence

A

being aware of and appropriately responding to the ways in which cultural factors and cultural differences should influence what we investigate, how we investigate, and how we interpret our findings

22
Q

Steps to improve cultural competence

A

Cultural immersion: cultural and scientific literature; cultural events, travel, etc.
Participant observation (Chap 18)
Advice from colleagues who are members of the culture of interest
Input from community members/leaders
Focus groups

23
Q

Three main threats to culturally competent measurement include:

A
1. The use of interviewers whose personal characteristics or interviewing styles offend or intimidate minority respondents or make them reluctant to divulge relevant and valid information
2. The use of language that minority respondents do not understand, and
3. Cultural bias

24
Q

Quantitative research methods

A

Research methods that seek to produce precise and generalizable findings. Studies using quantitative methods typically attempt to formulate all or most of their research procedures in advance and then try to adhere precisely to those procedures with maximum objectivity as data are collected.

25
Q

Qualitative research methods

A

Research methods that are more flexible than quantitative methods, that allow research procedures to evolve as more observations are gathered, and that typically permit the use of subjectivity to generate deeper understandings of the meanings of human experiences.

26
Q

Mixed methods research

A

A stand-alone research design in which a single study not only collects both qualitative and quantitative data, but also integrates both sources of data at one or more stages of the research process so as to improve the understanding of the phenomenon being investigated.

27
Q

Quantitative Methods Emphasize:

A

-Precision -Generalizability -Testing hypotheses

28
Q

Qualitative Methods Emphasize:

A

-Deeper understandings -Describing contexts -Generating hypotheses -Discovery

29
Q

Which method specifies research procedures in advance?

A

Quantitative

30
Q

Which method flexibly allows research procedures to evolve as data are gathered?

A

Qualitative

31
Q

Quantitative Collection

A

Office, agency, mail, or internet data collection setting

32
Q

Qualitative Collection

A

Data collected in natural environment of research participants

33
Q

Quantitative Emphases

A

-Deductive -Larger samples -Objectivity -Numbers/statistics -Less contextual detail -Close-ended questions -Less time-consuming -Easier to replicate

34
Q

Qualitative Emphases

A

-Inductive -Smaller samples -Subjectivity -Words/patterns -Rich descriptions -Open-ended questions -More time-consuming -Harder to replicate

35
Q

What makes a good research question?

A

-Is narrow and specific
-Has more than one possible answer
-Is posed in a way that can be answered by observable evidence
-Addresses the decision-making needs of agencies or practical problems in social welfare
-Has clear significance for guiding social welfare policy or social work practice
-Is feasible to answer

36
Q

What are some feasibility issues with research?

A

-Scope of study -Time required -Fiscal costs -Ethical considerations -Cooperation required from others -Obtaining advance authorization

37
Q

Hypothesis

A

Tentative and testable statement about a presumed relationship between variables

38
Q

Independent Variable

A

The variable in a hypothesis that is postulated to explain or cause another variable

39
Q

Dependent Variable

A

The variable in a hypothesis that is thought to be explained or caused by the independent variable

40
Q

Hypotheses should be:

A
  • clear and specific
  • open to more than one possible outcome
  • value-free
  • testable
41
Q

Nominal Level of Measurement

A

Describes a variable in terms of the number of cases in each category of that variable. Examples: gender, ethnicity, religious affiliation

42
Q

Ordinal level of measurement

A

Describes a variable whose categories can be rank-ordered according to how much of that variable they represent. We know only whether one case has more or less of something than another case, but we don’t know precisely how much more. Example: level of client satisfaction measured on a brief rating scale

43
Q

Reliability

A

-A particular measurement technique, when applied repeatedly to the same object, would yield the same result each time -The more reliable the measure, the less random error

44
Q

Validity

A

Are you measuring what you are supposed to be measuring?

45
Q

Face Validity

A

A crude and subjective judgment by the researcher that a measure merely appears to measure what it is supposed to measure

46
Q

Content Validity

A

-The degree to which a measure covers the range of meanings included within the concept -Established based on judgments as well

47
Q

Bias

A

A distortion in measurement based on personal preferences and beliefs.

48
Q

Random error

A

A measurement error that has no consistent pattern of effects.

49
Q

Element

A

The unit selected in a sample about which information is collected.

50
Q

Population

A

The theoretically specified aggregation of study elements.

51
Q

Study population

A

The aggregation of elements from which the sample is actually selected.

52
Q

Random selection

A

A sampling method in which each element has an equal chance of selection, independent of any other event in the selection process.
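
To make the "equal chance of selection" idea concrete, here is a minimal sketch (my own illustration, not from the course materials) of drawing a simple random sample from a hypothetical client list; the names and sample size are invented.

```python
import random

# Hypothetical study population (the sampling frame); names are invented for illustration.
study_population = ["Ana", "Ben", "Carla", "Dev", "Erin", "Felix", "Gia", "Hoang"]

# random.sample() gives every element in the frame the same probability of
# ending up in the sample (equal-probability selection without replacement).
sample = random.sample(study_population, k=3)
print(sample)  # e.g. ['Erin', 'Ana', 'Gia'] -- varies from run to run
```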

53
Q

Overgeneralization

A

Assuming that a few similar events are evidence of a general pattern.

54
Q

Selective Observation

A

After concluding that a pattern exists, paying attention to only the data that supports the pattern that was identified.

55
Q

Cross-sectional study

A

A snapshot in time: just one measurement with no follow-up.

56
Q

Longitudinal study

A

Studies that conduct observations at different points in time.

57
Q

Paradigm

A

A set of philosophical assumptions about the nature of reality; a fundamental model or scheme that organizes our view of something.

58
Q

Contemporary positivism

A

A paradigm that emphasizes the pursuit of objectivity in our quest to observe and understand reality.

59
Q

Social constructivism

A

A paradigm that emphasizes multiple subjective realities and the difficulty of being objective.

60
Q

Interpretivism

A

A research paradigm that focuses on gaining an empathic understanding of how people feel inside, seeking to interpret individuals’ everyday experiences, their deeper meanings and feelings, and the idiosyncratic reasons for their behaviors. Associated with qualitative methods.

61
Q

Critical social science

A

A research paradigm distinguished by its focus on oppression and its commitment to using research procedures to empower oppressed groups.

62
Q

Feminist paradigm

A

A research paradigm, like the critical social science paradigm, distinguished by its commitment to using research procedures to address issues of concern to women and to empower women.

63
Q

Theory

A

A systematic set of interrelated statements intended to explain some aspect of social life or enrich our sense of how people conduct and find meaning in their daily lives.

64
Q

Culturally competent research

A

Research that is sensitive and responsive to the ways in which cultural factors and cultural differences influence what we investigate, how we investigate, and how we interpret our findings.

66
Q

Three Ethical Controversies

A

Observing Human Obedience
Trouble in the Tearoom
Social Worker Submits Bogus Article to Test Journal Bias

68
Q

Systematic Error

A

When the information we collect consistently reflects a false picture
-Biases: The most common way our measures systematically measure something other than what we think they do is when biases are involved, e.g.:
Acquiescent response set
Social desirability bias

69
Q

Random error

A

Random errors have no consistent pattern of effects. They do not bias the measures.
Examples:
-Cumbersome, complex, boring measurement procedures
-Measure uses professional jargon which respondents are not familiar with
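
As a rough numeric illustration (a sketch added here, not part of the deck), the snippet below assumes a made-up true score and adds zero-mean noise to it; individual readings scatter, but the average is not pushed in any particular direction, which is the sense in which random error adds noise without biasing the measure.

```python
import random

random.seed(1)

true_score = 50.0  # hypothetical "true" value we are trying to measure

# Each observed measurement = true score + random error drawn from a zero-mean distribution.
measurements = [true_score + random.gauss(0, 5) for _ in range(1000)]

mean = sum(measurements) / len(measurements)
print(round(mean, 2))  # close to 50: readings vary, but the errors do not bias the measure
```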

70
Q

Stratification

A

The grouping of the units making up a population into homogeneous groups (or strata) before sampling.
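
As a minimal sketch of the idea (my own illustration, with invented stratum labels and a 10% sampling fraction), proportionate stratified sampling first groups the population into homogeneous strata and then draws a random sample from each stratum:

```python
import random
from collections import defaultdict

# Hypothetical population: (id, stratum) pairs; stratum labels are made up for illustration.
population = [(i, "urban" if i % 3 else "rural") for i in range(1, 91)]

# Step 1: group the units making up the population into homogeneous strata.
strata = defaultdict(list)
for unit_id, stratum in population:
    strata[stratum].append(unit_id)

# Step 2: draw a proportionate random sample (10%) from within each stratum.
sample = []
for stratum, units in strata.items():
    k = max(1, round(0.10 * len(units)))
    sample.extend(random.sample(units, k))

print(len(sample), sorted(sample))
```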

71
Q

Purposive sampling

A

Selecting a sample based on your own judgement about which units are most representative or useful.

72
Q

Criteria for Inferring Causality

A

1) Cause (independent variable) must precede the effect (dependent variable) in time
2) The two variables are empirically correlated with one another
3) The observed empirical correlation between the two variables cannot be due to the influence of a third variable that causes both of them
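
To see why the third criterion matters, the toy simulation below (my own sketch, not from the course) generates two variables that are correlated only because a shared third variable influences both; the correlation alone therefore cannot show that one causes the other.

```python
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(42)

# z is a confounding third variable that influences both x and y.
z = [random.gauss(0, 1) for _ in range(2000)]
x = [zi + random.gauss(0, 0.5) for zi in z]  # x does not cause y
y = [zi + random.gauss(0, 0.5) for zi in z]  # y does not cause x

# x and y are strongly correlated even though neither causes the other.
print(round(statistics.correlation(x, y), 2))  # roughly 0.8
```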

73
Q

Quasi-experimental Designs

A

Designs that attempt to control for threats to internal validity and thus permit causal inferences, but are distinguished from true experiments primarily by the lack of random assignment of subjects.