SOCW-427 Info of Importance Flashcards

1
Q

Which method specifies research procedures in advance?

A

Quantitative

2
Q

Which method is flexible, allowing research procedures to evolve as data are gathered?

A

Qualitative

3
Q

What makes a good research question?

A

-Is narrow and specific
-Has more than one possible answer
-Is posed in a way that can be answered by observable evidence
-Addresses the decision-making needs of agencies or practical problems in social welfare
-Has clear significance for guiding social welfare policy or social work practice
-Is feasible to answer

4
Q

What is required for critical thinking?

A

1) Problem Solving
2) Clarity of Expression
3) Critical appraisal of evidence and reasons
4) Consideration of alternative points of view

5
Q

What are some feasibility issues with research?

A

-Scope of study
-Time required
-Fiscal costs
-Ethical considerations
-Cooperation required from others
-Obtaining advance authorization

6
Q

Variables

A

Broader concepts that vary (include more than one attribute or level of a concept) and that researchers investigate, e.g. age, gender, level of self-esteem, number of abusive incidents, etc.

7
Q

Variable

A

A concept being investigated that is characterized by different attributes.

8
Q

Validity

A

Are you measuring what you are supposed to be measuring?

9
Q

Uncritical documentation

A

Assuming that because something is described in the literature it must be true; literature is cited, but no information is given about how the cited author arrived at a conclusion

10
Q

TROUT

A

Tentative: Everything we know today may change by tomorrow
Replication: All studies need to be replicated
Observation: Knowledge is grounded in orderly and comprehensive observations
Unbiased: Observations should be unbiased
Transparent: All details are openly specified for review and evaluation

11
Q

Triangulation

A

The use of more than one imperfect data collection alternative in which each option is vulnerable to different potential sources of error.

12
Q

Three main threats to culturally competent measurement include:

A
1. The use of interviewers whose personal characteristics or interviewing styles offend or intimidate minority respondents or make them reluctant to divulge relevant and valid information
2. The use of language that minority respondents do not understand
3. Cultural bias
13
Q

Three Ethical Controversies

A

-Observing Human Obedience
-Trouble in the Tearoom
-Social Worker Submits Bogus Article to Test Journal Bias

14
Q

Three Advanced Mixed Methods Designs

A

-Intervention Mixed Methods design
-Social Justice Mixed Methods design
-Multiphase Mixed Methods design

15
Q

Theoretical sampling

A

A sampling method associated with the grounded theory paradigm of qualitative research. New cases are selected that seem similar to those that generated previously detected concepts and hypotheses. Once the researcher perceives that no new insights are being generated from observing similar cases, a different type of case is selected, and the same process is repeated until observing different types of cases also seems to generate no new insights.

16
Q

The Scientific Method

A

-All knowledge is provisional and subject to refutation (everything is open to question)
-Knowledge is based on observations that are:
  -Orderly and comprehensive (avoidance of overgeneralization)
  -As objective as possible
  -Replicated in different studies

17
Q

Test-retest reliability

A

A method for assessing a measure’s consistency or stability.

18
Q

Systematic sampling

A

An efficient alternative to random sampling, in which every kth element in the sampling frame list, after a random start, is chosen for inclusion in the sample.
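A minimal sketch of the idea, assuming a Python list stands in for the sampling frame; the frame contents and interval are hypothetical:

```python
import random

def systematic_sample(frame, k):
    """Pick every kth element of the sampling frame after a random start."""
    start = random.randrange(k)   # random start somewhere in the first interval
    return frame[start::k]        # then every kth element to the end of the list

# Hypothetical frame of 1,000 case IDs with a sampling interval of k = 10
frame = list(range(1, 1001))
sample = systematic_sample(frame, k=10)
print(len(sample))   # roughly 100 cases
```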

19
Q

Systematic review

A

reports comprehensive searches for published and unpublished studies that address a research question

20
Q

Systematic Error

A

When the information we collect consistently reflects a false picture.
-Biases: The most common way our measures systematically measure something other than what we think they do is when biases are involved, e.g.:
  -Acquiescent response set
  -Social desirability bias

21
Q

systematic error

A

A measurement error that occurs when the information we collect consistently reflects a false picture of the concept we seek to measure.

22
Q

Study population

A

The aggregation of elements from which the sample is actually selected.

23
Q

Straw person argument

A

illogical reasoning distorting an argument in order to attack it

24
Q

Stratification

A

The grouping of units making up a population into homogeneous groups (or strata) before sampling.
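A rough sketch of how stratification precedes sampling, assuming hypothetical (client_id, region) records and an equal number of cases drawn per stratum:

```python
import random
from collections import defaultdict

def stratified_sample(frame, stratum_of, n_per_stratum):
    """Group elements into homogeneous strata, then sample randomly within each stratum."""
    strata = defaultdict(list)
    for element in frame:
        strata[stratum_of(element)].append(element)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, min(n_per_stratum, len(members))))
    return sample

# Hypothetical frame: (client_id, region) pairs, stratified by region
frame = [(i, region) for i, region in enumerate(["urban", "rural", "suburban"] * 50)]
print(stratified_sample(frame, stratum_of=lambda e: e[1], n_per_stratum=10))
```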

25
Q

Steps to improve cultural competence

A

-Cultural immersion: cultural and scientific literature; cultural events, travel, etc.
-Participant observation (Chap 18)
-Advice from colleagues who are members of the culture of interest
-Input from community members/leaders
-Focus groups

26
Q

Steps in Evidence-Based Practice

A

Step 1: Formulate a Question to Answer Practice Needs
Step 2: Search for the Evidence
Step 3: Critically Appraise the Relevant Studies You Find
Step 4: Determine Which Evidence-Based Intervention Is Most Appropriate for Your Particular Client(s)
Step 5: Apply the Evidence-Based Intervention
Step 6: Evaluation and Feedback

27
Q

Spurious relationship

A

An apparent relationship between two variables that disappears (the variables are no longer related) when a third variable is controlled for.

28
Q

sociological propaganda:

A

the penetration of an ideology by means of its sociological context via economic, political, and sociological structures

29
Q

Social Justice Mixed Methods design

A

Methods based on a social justice theory and aimed at yielding a call for action to improve the plight of vulnerable, marginalized or oppressed groups

30
Q

Social desirability bias

A

The tendency of people to say or do things that will make them or their reference group look good.

31
Q

Social constructivism

A

A paradigm that emphasizes multiple subjective realities and the difficulty of being objective.

32
Q

Snowball sampling

A

A nonprobability sampling method used when the members of a special population are difficult to locate. Each selected member of the target population whom one is able to locate is asked to provide the information needed to locate other members of that population that they happen to know.

33
Q

Simple random sampling

A

Using a table of random numbers to select sampling units after assigning a single number to each element in the sampling frame list.
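In practice this is done with a table of random numbers; a random-number generator gives the same result. A minimal sketch with a hypothetical numbered frame:

```python
import random

frame = list(range(1, 1001))        # every element assigned a single number
sample = random.sample(frame, 50)   # 50 elements drawn, each with an equal chance
print(sorted(sample))
```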

34
Q

self-reports

A

A source of data that can be used when operationally defining variables according to what people say about their own thoughts, views, or behaviors.

35
Q

Sampling unit

A

An element or set of elements considered for selection in some stage of sampling.

36
Q

Sampling ratio

A

The proportion of elements in the population that are selected in a systematic sample.

37
Q

Sampling interval

A

The standard distance between elements selected in systematic sampling.
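A small worked example tying the sampling interval to the sampling ratio from the previous card; the population and sample sizes are hypothetical:

```python
population_size = 10_000   # elements in the sampling frame
sample_size = 500          # elements to be selected

sampling_interval = population_size / sample_size   # 20.0 -> select every 20th element
sampling_ratio = sample_size / population_size      # 0.05 -> 5% of the population is sampled
print(sampling_interval, sampling_ratio)
```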

38
Q

Sampling frame

A

The list or quasi-list of elements from which a sample is selected.

39
Q

Sampling error

A

The difference between the true population parameter and the estimated population parameter.

40
Q

Research Purposes

A

-Exploration
-Description
-Explanation
-Evaluation
-Constructing Measurement Instruments
-Multiple Purposes
-Explaining and Predicting

41
Q

Relying on testimonials

A

Claiming that a method is effective based on one’s own experiences

42
Q

Relying on case examples

A

Drawing conclusions about many clients based on one or a few unrepresentative individuals

43
Q

Reliability

A

-A particular measurement technique, when applied repeatedly to the same object, would yield the same result each time
-The more reliable the measure, the less random error

44
Q

Relationships between variables can be

A

positive, negative, or curvilinear

45
Q

Relationship

A

Variables changing together in a consistent, predictable fashion.

46
Q

Relationship

A

Variables that change together in a consistent, predictable fashion, e.g., height and weight

47
Q

Reasons for Using Mixed Methods

A

-Extend main findings -Generate research questions or techniques -Corroborate findings

48
Q

Reasons for Studying Research

A

-To increase your practice effectiveness by critically appraising research studies that can inform practice decisions (publication does not guarantee quality)
-The NASW Code of Ethics requires research utilization
-Compassion for clients

49
Q

Ratio level of measurement

A

A level of measurement that describes variables (such as age or number of children) whose attributes have all the qualities of interval measures and are also based on a true zero point.

50
Q

Random selection

A

A sampling method in which each element has an equal chance of selection independent of any other event in the selection process.

51
Q

Random sampling

A

A precise, scientific procedure for selecting research population elements for a sample that guarantees an equal probability of selection of each element when substantial samples are selected from large populations.

52
Q

Random Error

A

Random errors have no consistent pattern of effects. They do not bias the measures.
Examples:
-Cumbersome, complex, boring measurement procedures
-Measure uses professional jargon with which respondents are not familiar

53
Q

Random error

A

A measurement error that has no consistent pattern of effects.

54
Q

Quota sampling

A

A type of nonprobability sampling in which units are selected into the sample on the basis of pre-specified characteristics, so that the total sample will have the same distribution of characteristics as are assumed to exist in the population being studied.

55
Q

Questions that address Social Validity concerns

A

1) Are the goals important and relevant to desired change?
2) Are methods acceptable or too costly?
3) Are clients happy with expected or unexpected outcomes?

56
Q

Quantitative research methods

A

Research methods that seek to produce precise and generalizable findings. Studies using quantitative methods typically attempt to formulate all or most of their research procedures in advance and then try to adhere precisely to those procedures with maximum objectivity as data are collected.

57
Q

Quantitative Methods Emphasize:

A

-Precision -Generalizability -Testing hypotheses

58
Q

Quantitative Emphases

A

-Deductive
-Larger samples
-Objectivity
-Numbers/statistics
-Less contextual detail
-Close-ended questions
-Less time-consuming
-Easier to replicate

59
Q

Quantitative Collection

A

Office, agency, mail, or internet data collection setting

60
Q

Qualitative research methods

A

Research methods that are more flexible than quantitative methods, that allow research procedures to evolve as more observations are gathered, and that typically permit the use of subjectivity to generate deeper understandings of the meanings of human experiences.

61
Q

Qualitative Methods Emphasize:

A

-Deeper understandings -Describing contexts -Generating hypotheses -Discovery

62
Q

Qualitative Emphases

A

-Inductive
-Smaller samples
-Subjectivity
-Words/patterns
-Rich descriptions
-Open-ended questions
-More time-consuming
-Harder to replicate

63
Q

Qualitative Collection

A

Data collected in natural environment of research participants

64
Q

Quackery

A

Promotion of something known to be false or untested.

65
Q

Purposive sampling

A

Selecting a sample based on your own judgement about which units are most representative or useful.

66
Q

Pseudoscience

A

Makes science-like claims with no evidence

67
Q

Procedural fidelity

A

The match between how a method should be implemented for maximal effect and how it is actually implemented.

68
Q

Probability sampling

A

The use of random procedures to select a sample that can allow us to estimate the expected degree of sampling error in a study and determine or control the likelihood of specific units in a population being selected for the study.

69
Q

Predictive Validity

A

The degree to which an instrument accurately predicts a criterion that will occur in the future.

70
Q

positive relationship

A

A relationship in which the dependent variable increases as the independent variable increases, or decreases as the independent variable decreases; both variables move in the same direction.

71
Q

Population

A

The theoretically specified aggregation of study elements.

72
Q

Phases in the Research Process

A

-Problem formulation
-Designing the study
-Data collection
-Data processing
-Data analysis
-Interpreting the findings
-Writing the research report

73
Q

Parameter

A

The summary description of a given variable in a population.

74
Q

Paradigm

A

A set of philosophical assumptions about the nature of reality: a fundamental model or scheme that organizes our view of some things.

75
Q

P.I.E.

A

Person in Environment. This is what makes social work different from medicine.

76
Q

Ordinal level of measurement

A

Describes a variable whose categories can be rank-ordered according to how much of that variable they represent. We know only whether one case has more or less of something than another case, but not precisely how much more. Example: a brief rating scale of level of client satisfaction.

77
Q

Operational definition

A

A definition of a variable that identifies the observable indicators that will be used to determine that variable’s attributes.

78
Q

Nonprobability sampling

A

The use of sampling procedures that do not involve random selection.

79
Q

Nominal Level of Measurement

A

Describes a variable in terms of the number of cases in each category of that variable.
Examples:
-gender
-ethnicity
-religious affiliation

80
Q

Nominal definition

A

A dictionary-like definition that uses a set of words to help us understand what a term means, but does not tell us what indicators to use in observing the term in a research study.

81
Q

Nine Types of Mixed Methods Designs

A

3 possible emphases (qualitative, quantitative, equal) by 3 possible sequences (qualitative 1st, quantitative 1st, concurrent)

82
Q

Newness

A

illogical reasoning touting something because it is novel

83
Q

negative, or inverse relationship

A

A relationship between two variables that move in opposite directions: as one increases, the other decreases, and vice versa.

84
Q

Multiphase Mixed Methods design

A

Several mixed methods projects implemented in multiple phases over time (longitudinally), all focusing on a common objective

85
Q

Moderating Variables

A

-Can influence the strength and direction of relationships between independent and dependent variables
-Sometimes called control variables
-When controlled for in a study, can show that the relationship between the independent and dependent variables is really spurious

86
Q

Mixed methods research

A

A stand-alone research design in which a single study not only collects both qualitative and quantitative data, but also integrates both sources of data at one or more stages of the research process so as to improve the understanding of the phenomenon being investigated.

87
Q

Metric equivalence

A

scores on a measure are comparable across cultures.

88
Q

Meta-analysis

A

A systematic review that pools the statistical results across studies of particular interventions and generates conclusions about which interventions have the strongest impacts on treatment outcome

89
Q

meta-analysis

A

A type of systematic review that pools the statistical results across studies of particular interventions and generates conclusions about which interventions have the strongest impact on treatment outcome

90
Q

Mediating variable (Intervening variable)

A

The mechanism by which an independent variable affects a dependent variable.

91
Q

Measurement Error

A

Data do not accurately portray the concept we attempt to measure.
-Systematic error
-Random error

92
Q

Measurement equivalence

A

a measurement procedure developed in one culture will have the same value and meaning when administered to people in another culture.

93
Q

Longitudinal studies

A

Studies that conduct observations at different points in time.

94
Q

Linguistic equivalence

A

when an instrument has been translated and back-translated successfully.

95
Q

Known groups validity

A

Whether an instrument accurately differentiates between groups known to differ with respect to the variable being measured.

96
Q

Intervention Mixed Methods Design

A

Both approaches merged to get a better handle on the meaning of the results of an evaluation of an intervention

97
Q

Interval level of measurement

A

A level of measurement that describes variables (such as IQ or Fahrenheit temperature) whose attributes are rank-ordered and have equal distances between adjacent attributes, but which do not have a true zero point.

98
Q

Interpretivism

A

A research paradigm that focuses on gaining an empathic understanding of how people feel inside, seeking to interpret individuals’ everyday experiences, their deeper meanings and feelings, and the idiosyncratic reasons for their behaviors. (Associated with qualitative methods.)

99
Q

Interobserver reliability or inter-rater reliability

A

The degree of agreement or consistency between or among observers or raters.
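One simple way to express this agreement is the percentage of cases on which two raters assign the same code; a minimal sketch with hypothetical ratings (more refined indices, such as correlations between raters' scores, are also used):

```python
# Hypothetical codes assigned to the same ten cases by two independent raters
rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)
print(percent_agreement)   # 0.8 -> the raters agreed on 8 of 10 cases
```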

100
Q

Internal consistency reliability

A

The degree to which scores among scale items, or scores among subsets of items, correlate with each other.

101
Q

Intensity sampling

A

A qualitative sampling technique similar to deviant case sampling in which cases are selected that are more or less intense than usual, but not so unusual that they would be called deviant.

102
Q

Influence by manner of presentation

A

Believing a claim because of the apparent sincerity, speaking voice, attractiveness, stage presence, likability, or other trait of a speaker

103
Q

Inductive method

A

A research process based on inductive logic, in which the researcher begins with observations, seeks patterns in those observations, and generates tentative conclusions from those patterns.

104
Q

Independent Variable

A

The variable in a hypothesis that is postulated to explain or cause another variable

105
Q

Hypothesis

A

A tentative and testable statement about how changes in one variable are expected to explain changes in another variable.

106
Q

Hypothesis

A

Tentative and testable statement about a presumed relationship between variables

107
Q

Hypotheses should be:

A

-be clear and specific
-have more than one possible outcome
-be value-free
-be testable

108
Q

How do social workers know things?

A

-Agreement reality
-Experiential Reality
-Science
-Tradition
  -Such as accumulated practice wisdom that has not been scientifically verified
-Authority
  -Relying on “experts”
-Common sense
-Popular media

109
Q

Fundamental Attribution Error

A

The tendency to attribute the cause of behaviors to personal characteristics instead of the environment

110
Q

Four common types of EBP questions

A
1. What intervention, program, or policy has the best effects?
2. What factors best predict desirable or undesirable consequences?
3. What’s it like to have had my client’s experiences?
4. What assessment tool should be used?
111
Q

Flaws in Unscientific Sources

A

-Inaccurate Observation
-Overgeneralization
-Selective Observation
-Ex Post Facto Hypothesizing
-Ego Involvement in Understanding
-Premature Closure of Inquiry

112
Q

Feminist paradigm

A

A research paradigm, like the critical social science paradigm, distinguished by its commitment to using research procedures to address issues of concern to women and to empower women.

113
Q

Face validity

A

Whether a measure merely seems to be a reasonable way to measure some variable, based only on subjective judgement.

114
Q

Face Validity

A

A crude and subjective judgment by the researcher that a measure merely appears to measure what it is supposed to measure

115
Q

Explanatory Sequential Mixed Methods Design: Quantitative Start

A

Quantitative Data Collection & Analysis → follow up with → Qualitative Data Collection & Analysis → Interpretation

116
Q

Exploratory Sequential Mixed Methods Design: Qualitative Start

A

Qualitative Data Collection & Analysis → follow up with → Quantitative Data Collection & Analysis → Interpretation

117
Q

Evidence-based practitioners will:

A

-Think for themselves
-Consider whether beliefs or assertions of knowledge are based on sound evidence and logic
-Think open-mindedly, recognizing and questioning unstated assumptions underlying beliefs and assertions
-Be willing to test their own beliefs or conclusions and then alter them on the basis of new experiences and evidence
-Formulate appropriate questions and then gather and appraise evidence as a basis for making decisions

118
Q

Evidence-Based Practice

A

a process in which the best scientific evidence pertinent to a practice decision is an important part of the information practitioners consider when making that practice decision.

119
Q

Evidence-based practice

A

A process in which practitioners consider the best scientific evidence available pertinent to a particular practice decision as an important part of their decision making.

120
Q

Element

A

The unit selected in a sample about which information is collected.

121
Q

Doing a Literature Review early helps in what?

A

-Understanding if the question has already been answered -Building on existing research

122
Q

Direct observation

A

A source of data that can be used when operationally defining variables based on observing actual behavior.

123
Q

Deviant case sampling

A

A form of purposive sampling in which cases that don’t fit into regular patterns are selected to improve understanding of regular patterns.

124
Q

Dependent Variable

A

The variable in a hypothesis that is thought to be explained or caused by the independent variable

125
Q

Deductive method

A

A research process based on deductive logic, in which the researcher begins with a theory, derives hypotheses, and ultimately collects observations to test the hypotheses.

126
Q

Curvilinear relationship

A

A relationship in which the nature of the relationship changes at certain levels of the variables.

127
Q

Cultural Competence

A

being aware of and appropriately responding to the ways in which cultural factors and cultural differences should influence what we investigate, how we investigate, and how we interpret our findings

128
Q

Cross-sectional study

A

A snapshot in time: just one measurement with no follow-up.

129
Q

Critical Thinking

A

Careful appraisal of beliefs and actions to arrive at well-reasoned ones that maximize the likelihood of helping clients and avoiding harm.

130
Q

Critical social science

A

A research paradigm distinguished by its focus on oppression and its commitment to using research procedures to empower oppressed groups.

131
Q

Criterion-related validity

A

The degree to which an instrument relates to an external criterion that is believed to be another indicator or measure of the same variable that the instrument intends to measure.

132
Q

Criteria of evidence-informed client choice

A

1) The decision involves which intervention to use
2) The person is given research-based information about effectiveness of at least two alternatives, which may include doing nothing
3) The person provides input in the decision-making

133
Q

Convergent Mixed Methods Design

A

Qualitative Data Collection & Analysis and Quantitative Data Collection & Analysis (conducted concurrently) → Merge → Interpretation

134
Q

Control variable

A

A moderating variable that we seek to control by holding it constant in our research design.

135
Q

Content validity

A

The degree to which a measure seems to cover the entire range of meanings within a concept.

136
Q

Content Validity

A

-The degree to which a measure covers the range of meanings included within the concept -Established based on judgments as well

137
Q

Contemporary positivism

A

A paradigm that emphasizes the pursuit of objectivity in our quest to observe and understand reality. (Associated with quantitative methods.)

138
Q

Construct validity

A

The degree to which a measure relates to other variables as expected within a system of theoretical relationships and as reflected by the degree of its convergent and discriminant validity.

139
Q

Constant

A

One attribute that is included in a study without including other attributes of the same variable.

140
Q

Concurrent validity

A

The degree to which an instrument corresponds to an external criterion that is known concurrently.

141
Q

Conceptual equivalence

A

instruments and observed behaviors have the same meanings across cultures.

142
Q

Concept

A

A mental image that symbolizes an idea, an object, an event, a behavior, a person, etc.

143
Q

Concept

A

A mental image that symbolizes an idea, an object, an event, a behavior, or a person.

144
Q

Coefficient alpha

A

The average of the correlations between the scores of all possible subsets of half the items on a scale.
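A literal sketch of this card's definition: split a hypothetical 4-item scale into every distinct pair of halves, correlate the half-scale scores, and average the correlations (assumes Python 3.10+ for statistics.correlation; the data are made up):

```python
from itertools import combinations
from statistics import correlation, mean   # correlation requires Python 3.10+

# Hypothetical scores of five respondents on a 4-item scale
scores = [
    [4, 5, 4, 5],
    [2, 1, 2, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [1, 2, 1, 1],
]

items = range(len(scores[0]))
half_size = len(scores[0]) // 2

split_corrs = []
seen = set()
for first in combinations(items, half_size):
    second = tuple(i for i in items if i not in first)
    if frozenset((first, second)) in seen:   # (A, B) and (B, A) are the same split
        continue
    seen.add(frozenset((first, second)))
    x = [sum(row[i] for i in first) for row in scores]    # scores on one half of the items
    y = [sum(row[i] for i in second) for row in scores]   # scores on the other half
    split_corrs.append(correlation(x, y))

print(mean(split_corrs))   # the average split-half correlation
```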

145
Q

Cluster sampling

A

A multistage sampling procedure that starts by sampling groups (clusters) of elements in the population and then subsampling individual members of each selected group afterward.

146
Q

CIAO

A

Question formulation if one or more interventions are specified in advance: CIAO
C: client characteristics
I: intervention being considered
A: alternative intervention (if any)
O: outcome

147
Q

Bias

A

A distortion in measurement based on personal preferences and beliefs.

148
Q

Behavioral Confirmation Bias

A

The tendency to search for data that support favored positions and to ignore data that do not

149
Q

Bandwagon

A

illogical reasoning that posits the “everyone else is doing it” argument

150
Q

Available records

A

A source of data for a study, in which the information of concern has already been collected by others.

151
Q

Availability sampling

A

A sampling method that selects elements simply because of their ready availability and convenience. Frequently used in social work because it is usually less expensive than other methods and because other methods may not be feasible for a particular type of study or population.

152
Q

Attributes of Evidence-Based Practice

A

-Critical thinking
-Career-long learning
-Flexibility
-Integrating scientific knowledge with practice expertise and knowledge of client attributes

153
Q

Attributes

A

Concepts that make up a broader concept are called attributes, e.g. male/female vs. gender

154
Q

Attributes

A

Characteristics of persons or things.

155
Q

Appeal to tradition:

A

Accepting a practice solely because it has been used for a long time

156
Q

Appeal to numbers or popularity:

A

Relying on the number of people who use a method or who hold a belief

157
Q

Appeal to good intentions

A

Assuming that good intentions reflect good results

158
Q

Appeal to authority:

A

Basing claims solely on a person’s status; no evidence is provided to support or refute claims made

159
Q

Appeal to anecdotal experience

A

Accepting or rejecting claims about the effectiveness of methods based on unsystematic personal experience

160
Q

After this, therefore on account of this-post hoc ergo propter hoc:

A

The incorrect belief that if Event A (a service program) precedes Event B (a positive outcome), A has caused B

161
Q

Ad-hominem appeals:

A

Attacking (or praising) the person rather than examining the person’s argument

162
Q

Ad hominem attack

A

illogical reasoning discrediting the person rather than the argument

163
Q

acquiescent response set

A

The tendency to agree or disagree with all statements regardless of their content.

164
Q

Acculturation

A

the process in which a group or individual changes after coming into contact with a majority culture, taking on the language, values, attitudes, and lifestyle preferences of the majority culture