Social Research Methods Flashcards
Paradigm
A model or frame of reference through which to observe and understand. Ways of looking at things/social life. Fundamental frameworks or viewpoints for observation and reasoning
Macrotheory
A theory aimed at understanding the “big picture” of institutions, whole societies, and the interactions among societies.
Microtheory
A theory aimed at understanding social life at the intimate level of individuals and their interactions.
Difference between paradigm and theory
A paradigm is a general framework or viewpoint (a “way of looking”). Theory aims at explaining what we see. Theories flesh out and specify paradigms
Theories
Systematic sets of interrelated statements intended to explain some aspect of social life. Logical explanations of relationships among concepts or variables. Must be universal and abstract: the main ideas can be generalized across locations and time.
Operationalization
One step beyond conceptualization. Operationalization is the process of developing operational definitions, or specifying the exact operations involved in measuring a variable. Or, how will we measure our variables?
Development of specific research procedures in which concepts are mapped to empirical observations in the real world
How do we measure a concept? With indicators
Deductive model of research
Research is used to test theories. theory->hypothesis->observation->confirmation
Research to test theories
Inductive model of research
Research is used to develop theories (based on the analysis of research data).
observation->pattern->tentative hypothesis->theory
Theory as a result of observations
Three Key Principles of The Belmont Report
1. Respect for Persons 2. Beneficence 3. Justice
IRB
Institutional Review Board–required for federally-funded research, often used by universities and colleges. Chief responsibility is to ensure that the risks faced by human participants are minimal. Reviews all research proposals involving human subjects
Informed Consent
A norm in which subjects base their voluntary participation in research projects on a full understanding of the possible risks involved. Requires a statement that subjects voluntarily participate in the research and fully understand any possible risks involved.
Anonymity
Achieved in a research project when neither the researchers nor the readers of the findings can identify a given response with a given respondent
Confidentiality
Guaranteed when the researcher can identify a given person’s response but promises not to do so publicly
Debriefing
Interviewing subjects to learn about their experience of participation in the project. This is especially important if there’s a possibility that they have been damaged by that participation
Three Purposes of Research
Exploration, Description, Explanation
Idiographic Explanation
Tries to identify all the factors contributing to one situation. Attempts to explain a single situation exhaustively.
Nomothetic Explanation
Tries to identify a few factors common to many situations. Attempts to explain a set of situations (a general law) rather than a single case
Criteria for nomothetic causality
1) correlation 2) the cause takes place before the effect 3) the relationship is nonspurious
Correlation
An empirical relationship between two variables such that (1) changes in one are associated with changes in the other or (2) particular attributes of one variable are associated with particular attributes of the other. Correlation in and of itself does not constitute a causal relationship between the two variables, but it is one criterion of causality
Spurious relationship
A coincidental statistical correlation between two variables, shown to be caused by some third variable. A is correlated with B. In reality, neither A nor B is a cause of the other. Instead, C causes A and B.
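The C-causes-both pattern can be sketched with simulated data (all numbers here are hypothetical): A and B are each driven by C, so they correlate strongly, but once C's influence is removed the correlation disappears.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# C is the common cause; A and B each depend on C plus independent noise.
C = rng.normal(size=n)
A = C + rng.normal(size=n)
B = C + rng.normal(size=n)

# A and B are substantially correlated even though neither causes the other.
r_ab = np.corrcoef(A, B)[0, 1]

# Removing C's influence (residuals from regressing each on C) makes the
# correlation essentially vanish, revealing the relationship as spurious.
A_resid = A - C * (np.cov(A, C)[0, 1] / np.var(C))
B_resid = B - C * (np.cov(B, C)[0, 1] / np.var(C))
r_partial = np.corrcoef(A_resid, B_resid)[0, 1]
```

With these assumptions, r_ab comes out near 0.5 while r_partial is near zero.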
Necessary cause
Condition that must be present for the effect to follow.
Sufficient cause
Condition that, if present, guarantees the effect, but may not be the only cause.
Units of analysis
The what or whom being studied. In social science research, the most typical units of analysis are individual people. May be individuals, groups, departments, organizations, or some phenomena such as lifestyles.
Social artifact
Any product of social beings or their behavior. Can be a unit of analysis
Ecological fallacy
Erroneously drawing conclusions about individuals solely from the observations of groups.
Cross-sectional study
Observations of a sample, or cross section, of a population or phenomenon that are made at one point in time.
Longitudinal study
A study design involving the collection of data at different points in time.
Trend study
A type of longitudinal study in which a given characteristic of some population is monitored over time.
Cohort study
A study in which some specific sub-population, or cohort, is studied over time, although data may be collected from different members in each set of observations
Panel study
A type of longitudinal study, in which data are collected from the same set of people (the sample or panel) at several points in time.
Conceptualization
The process whereby fuzzy and imprecise notions (concepts) are made more specific and precise.
- Specifies what we mean when we use particular terms
- Involves specification and refinement of abstract concepts
- Produces specific agreed-upon meanings “concepts” for research
Constructs
theoretical creations that are based on observations but that cannot be observed directly or indirectly
Concept
constructs derived by mutual agreement from mental images (conceptions). Abstract terms representing common characteristics of objects. Basic building blocks of theory.
Conceptions
summarize collections of seemingly related observations and experiences
Indicator
An observation that we choose to consider as a reflection of a variable we wish to study. Thus, for example, attending religious services might be considered an indicator of religiosity.
Concrete observable behaviors
Dimension
A specifiable aspect of a concept. “Religiosity” for example, might be specified in terms of a belief dimension, a ritual dimension, a devotional dimension, a knowledge dimension, and so forth
Specifiable aspects of a concept
Distinguishable components of a more abstract concept
Specification
The process through which concepts are made more specific
Nominal measures
Names or labels for characteristics
A nominal variable has attributes that are merely different, as distinguished from ordinal, interval, or ratio measures. All a nominal variable can tell us about two people is if they are the same or different
Ordinal measure
Logically rank-ordered attributes. No meaning for distance
A level of measurement describing a variable with attributes we can rank-order along some dimension. An example is socioeconomic status as composed of the attributes high, medium, and low.
Interval measure
Rank-ordered attributes. Meaningful distance between attributes
A level of measurement describing a variable whose attributes are rank-ordered and have equal distances between adjacent attributes. The Fahrenheit temperature scale is an example of this, because the distance between 17 and 18 is the same as that between 89 and 90.
Ratio measure
A meaningful absolute zero point. A meaningful fraction or ratio.
A level of measurement describing a variable with attributes that have all the qualities of nominal, ordinal, and interval measures and in addition are based on a “true zero” point. Age is an example of a ratio measure
Reliability
A matter of stability and consistency. The same result over and over again
That quality of measurement method that suggests that the same data would have been collected each time in repeated observations of the same phenomenon. In the context of a survey, we would expect that the question “Did you attend religious services last week?” would have higher reliability than the question “About how many times have you attended religious services in your life?” This is not to be confused with validity.
Does this measure give consistent results?
Test-Retest Method
Helps assess reliability by testing the same subjects twice: subjects take the same test again after a period of time, and the two sets of scores are compared for consistency.
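Test-retest reliability is commonly reported as the correlation between the two administrations; values near 1 indicate a stable measure. A minimal sketch with hypothetical scores for eight subjects:

```python
import numpy as np

# Hypothetical scores for 8 subjects on the same scale,
# administered twice, a few weeks apart.
time1 = np.array([7, 5, 9, 3, 6, 8, 4, 7])
time2 = np.array([6, 5, 9, 4, 6, 7, 4, 8])

# Test-retest reliability as the Pearson correlation between administrations.
reliability = np.corrcoef(time1, time2)[0, 1]
```

For these made-up scores the correlation is about 0.93, which would suggest a reliable measure.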
Validity
Concerns the extent to which indicators measure the real meaning of the concept under consideration. The extent to which any measuring instrument measures what it is intended to measure. Does the empirical indicator truly reflect what the concept means?
The extent to which an empirical measure adequately reflects the real meaning of the concept under consideration.
Face validity
That quality of an indicator that makes it seem a reasonable measure of some variable. Assesses whether, “on its face,” the empirical indicators appear to be a good translation of the construct
Criterion-related validity
The degree to which a measure relates to some external criterion. Also called predictive validity (the ability of a measure to predict reality).
Construct validity
The degree to which a measure relates to other variables as expected within a system of theoretical relationships.
Content validity
How much a measure covers the range of meanings included within a concept.
The extent to which a measure adequately represents the components of the total domain of a concept
index
a type of composite measure that summarizes and rank-orders several specific observations and represents some more-general dimension
A composite score of a variable. A variable measured by several items/indicators at an ordinal level
- Combine indicators into a single numerical score
- Simply add the scores of the items/indicators, with equal weight on each item
internal validation/item analysis
an assessment of whether each of the items included in a composite measure makes an independent contribution or merely duplicates the contribution of other items in the measure.
Examines the extent to which the index is related to the individual items
- The relationship between individual items in the index and the index
- Addresses reliability
external validation
The process of testing the validity of a measure, such as an index or scale, by examining its relationship to other, presumed indicators of the same variable. If the index really measures prejudice, for example, it should correlate with other indicators of prejudice.
- Those respondents scored as the most extreme on the index should score as the most extreme in answering other items which were not included in the index.
- Addresses validity
Social science
A systematic observation of human behavior for understanding social patterns
Major aspects of social science
Theory, not philosophy: What and why, not what should be
Social regularities/patterns
Aggregates, not individuals: The collective behavior of individuals. Social theories based on aggregate patterns
Variables
Things that are being observed, logical groupings of attributes
Independent variable
A variable that determines another variable
Dependent variable
A variable that is caused by another variable
Attributes
Characteristics that describe an object
Qualitative
non-numerical
Quantitative
numerical
Two important elements of scientific research
Logic
Observation
Voluntary participation
No one should be forced to participate
No harm to the participants
Any risk? Any physical harm, psychological distress, or social, economic, or legal risk?
Maximize possible benefits and minimize possible harms
The Belmont Report
Released in 1979 by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in response to the Tuskegee Study, The Belmont Report is the foundational document of the current system of U.S. human subjects protections. The Belmont Report outlines three key ethical principles for conducting research with human subjects: respect for persons, beneficence, and justice.
Respect for persons
Respect for persons incorporates at least two ethical convictions: first, that individuals should be treated as autonomous agents, and second, that persons with diminished autonomy are entitled to protection.
Beneficence
Persons are treated in an ethical manner not only by respecting their decisions and protecting them from harm, but also by making efforts to secure their well-being. Two general rules have been formulated as complementary expressions of beneficent actions in this sense: (1) do no harm and (2) maximize possible benefits and minimize possible harms.
Justice
Who ought to receive the benefits of research and bear its burdens? This is a question of justice, in the sense of ‘fairness in distribution’ or ‘what is deserved.’ An injustice occurs when some benefit to which a person is entitled is denied without good reason or when some burden is imposed unduly.
Deception
Must be justified by compelling scientific or administrative concerns, and followed by debriefing
Types of IRB Review
Exempt, Expedited, Full-Review
Exempt
No or minimal risk. In general, research which does not propose to disrupt or manipulate the normal life experiences of subjects, incorporate any form of intrusive procedures, or involve deception will be exempt from full Committee review
Expedited
No or minimal risk & certain categories (e.g., collecting images, voices, etc.). Does not include intentional deception, does not employ sensitive populations or topics, and includes appropriate consent procedures.
Full Review
Not qualifying for exempt or expedited review
Vulnerable populations
- Children (under the age of 18) – obtain both assent from the child and permission from parent(s) or guardian(s)
- Prisoners/ other institutionalized persons
- Individuals receiving protective services or treatment for mental illness, alcohol or drug dependency
- Individuals with impaired decision-making capacity
Exploratory study
Attempt to develop an initial, new understanding of some phenomenon, in the absence of rigorous theories or expectations
Descriptive Study
Attempt to describe “what and how things are.” Many qualitative studies
Explanatory Study
Attempt to explain “why things are”. Discovering the relationships among variables
Units of Observation
Who or what is actually observed. For example, individual men and women are observed for a study whose unit of analysis is the couple.
Conceptual definition
A verbal description of essential properties of the concept’s meaning
Operational definition/variable
Specific statement about how the concept is to be measured. Operational variable also called an indicator
Random Error
The result of temporary variations. Adds variability to the data but does not affect the average performance for the observed sample.
Inversely related to reliability
Nonrandom error
Also known as systematic error or bias. The result of systematic variations of measurement
Name 4 Reliability Assessments
- Test-retest
- Parallel-forms
- Intercoder reliability
- Internal consistency
Parallel-Forms Reliability
Test two parallel forms which measure the same construct on the same occasion
Intercoder Reliability
Test whether different observers using the same measure obtain equivalent results
Internal Consistency
Test whether a set of different indicators of the same concept are homogeneous. Looking for correlation
Cronbach’s Alpha
Most common test for internal consistency
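Cronbach's alpha can be computed directly from its standard formula: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score), where k is the number of items. A minimal sketch with hypothetical data (5 respondents, 3 items intended to tap one concept):

```python
import numpy as np

# Hypothetical responses: 5 respondents x 3 items measuring one concept.
items = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 3, 2],
    [1, 2, 1],
])

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)      # variance of each item
total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores

# Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

For this made-up data alpha is about 0.96; values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency.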
Convergent Validity
Assesses whether different measures of the same concept are highly correlated with each other. The same concept measured in different ways yields similar results.
Discriminant Validity
Assesses whether the concept measured can be differentiated from other concepts that are theoretically intended to differ
Name 4 Measures of Validity
- Face Validity
- Content Validity
- Convergent Validity
- Discriminant Validity