Week 3: Conceptualization, Generalization, and Validity Flashcards

1
Q

Process of conceptualization & operationalization

A
  • abstract concept in hypothesis –>
  • (conceptualization) concrete definition of concept –>
  • (operationalization) measure of defined concept
2
Q

Definition of conceptualization

A

the process of precisely defining ideas and turning them into variables

3
Q

Definition of Operationalization…

A

The process of linking the conceptualized variables to a set of procedures for measuring them

4
Q

Example conceptualizing poverty

A
  • defining the concept ‘poverty’
  • unit of analysis = person, household, country, etc.
  • ‘have less’, ‘have not’, ‘minimum’, ‘basic needs’
5
Q

Example operationalizing ‘poverty’

A
  • measuring the concept ‘poverty’
  • absolute standards (specific quantitative assessment) = whether the person has food, clothing, shelter, etc.
  • relative standard = whether the person falls below the average income level (many other options)
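A relative standard like this is simple arithmetic. Below is a minimal sketch with invented household incomes, using the mean as an assumed cutoff (real studies often use a fraction of the median instead):

```python
# Relative poverty sketch: classify households as 'relatively poor' if
# their income falls below the average (mean) income. Data are invented.
incomes = [12_000, 30_000, 45_000, 60_000, 153_000]

mean_income = sum(incomes) / len(incomes)  # 60_000 here
relatively_poor = [x for x in incomes if x < mean_income]
print(relatively_poor)  # → [12000, 30000, 45000]
```

Note how the one very high income pulls the mean up, which is one reason relative standards are often defined against the median instead.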
6
Q

Concepts

A
  • highly abstract ideas that summarize social phenomena & are linked together within hypotheses
  • rigorous empirical social science wants to make these concepts into something more concrete (this starts with translating concepts into variables)
7
Q

Variables…

A
  • representations that capture the presence or absence of a concept or the level of a concept
  • E.g. ‘are you in love? Yes or no’ and ‘how much would you say you love your partner?’
  • To study love, researchers have to change the concept of ‘love’ to something concrete that someone can have more or less of than someone else
  • in this way, variables break down concepts into data points that can be compared
8
Q

Units of analysis

A
  • the unit of analysis is the level of social life that we want to generalize about
  • social artifacts are a unit of analysis that encompasses aspects of social life that can be counted
  • a single concept can be studied with different units of analysis
9
Q

Dimensions…

A
  • general concepts usually encompass multiple dimensions
  • components that represent different manifestations, angles or units of the concepts
  • dimensions are about conceptualization
10
Q

4 main types of variables

1) Nominal variable

A
  • categorical variable (finite set of values)
  • categorizes states/statuses that are parallel and cannot be ranked or ordered
  • e.g. race, hair colour, school sector
11
Q

4 main types of variables

2) Ordinal variable

A
  • categorical
  • have categories that can be organized in some way
  • categories can be ranked from low-high, but the difference/distance between ranks cannot be known
  • e.g. school quality
12
Q

4 main types of variables

3) Interval variable

A
  • continuous
  • have a continuum of values with meaningful distances (or intervals) between them, but no true zero
  • categories can be compared explicitly instead of just saying one is better than the other, but we cannot think of them in terms of proportions (because there is no true 0)
  • E.g. temperature, time of day, SAT score
13
Q

4 main types of variables

4) ratio variables

A
  • continuous
  • interval variables that do have a true 0
  • so they can be compared & contrasted in the most detail
  • differences can be expressed in terms of the distance between values but also as a proportion or percentage of another
  • e.g. school size, body weight
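A tiny illustration (invented numbers) of why the true zero matters when comparing ratio and interval variables:

```python
# Ratio variable: body weight has a true zero, so ratios are meaningful.
weight_a, weight_b = 80, 40          # kg
print(weight_a / weight_b)           # → 2.0, "twice as heavy" makes sense

# Interval variable: Celsius temperature has no true zero (0 °C is arbitrary),
# so the same division is numerically valid but substantively meaningless.
temp_a, temp_b = 20, 10              # °C
print(temp_a / temp_b)               # also 2.0, but NOT "twice as hot"
```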
14
Q

Composite variables

A
  • one that averages a set of variables to measure one concept
  • internal reliability… the degree to which the various items in a composite variable lead to a consistent response (use Cronbach’s alpha)
15
Q

Indicators

A
  • the last step
  • indicators are the values assigned to a variable
  • they are part of the plan that researchers develop to sort the variable into various categories
  • Identifying an indicator is a conceptual procedure that provides the basis for measurement
  • Indicator: a household will be classified as ‘poor’ if it has the value of ___ in the variable ___
16
Q

Operationalization involves 2 steps…

A
  1. converting a conceptual definition into an operational definition that sets the parameters for measurement
  2. using the operational definition to collect data
17
Q

Operationalization in quantitative vs qualitative…

A
  • quantitative = operationalization is an end result of the conceptualization process
  • qualitative = usually starts with a more open conceptualization process & observation refines the process to lead to a conceptual definition
18
Q

Reductionism…

A
  • occurs when conclusions are made about a macro-level unit on the basis of analyses of micro-level data
  • reduces something complex into too-simple terms or simplifies societal or group phenomena into individualist explanations
19
Q

4 basic forms of measurement

1 - Reports

A
  • direct feedback, written or verbal, from people
  • 1) open ended questions
  • 2) closed ended questions
  • 3) response categories (must be exhaustive & categories must be mutually exclusive)
20
Q

4 basic forms of measurement

2 - Observation

A
  • Observation is the process of seeing, recording & assessing social phenomena
  • commonly used in qualitative studies (ethnography)
  • generally focuses on… degree & amount/frequency
21
Q

4 basic forms of measurement

3 - artifact counts/assessments

A
  • are tied to content analysis and similar materials-based methodologies
  • involves the cataloging of social artifacts & objects qualitatively or quantitatively
22
Q

4 basic forms of measurement

4 - Manipulation

A
  • usually paired w other forms of measurement during experiments
  • experimenters need to ‘do something’ to subjects & observe results
  • control group & experimental group
23
Q

Cross-sectional study designs

A
  • data are collected at only one point in time from different groups of people
24
Q

Longitudinal study designs

A
  • data are collected at multiple points in time
  • repeated cross-sectional study design = data are collected at multiple points but from different people at each time (new sample at each stage)
  • panel design = data are collected multiple times from the same people, the same sample at each stage (attrition problems)
25
Q

Assessing measurements

Reliability

A
  • refers to dependability, consistency and predictability
  • how consistently the same operation yields the same results
  • not about being correct, just being stable & predictable
26
Q

Assessing measurements

Validity

A
  • Refers to the degree to which a measure truly and accurately measures the defined concepts
  • This is highest when conceptualization is strongly linked to operationalization
  • a highly valid measure/method gets to the heart of the concept being measured
  • validity is about being correct, not stability
  • much harder to assess than reliability
27
Q

Qualitative research

credibility & dependability

A
  • credibility = Do the respondents agree with how the researcher is presenting and interpreting their words?
  • dependability = whether the findings are consistent & whether the same findings would be repeated given the same circumstances/context
28
Q

Reliability

Cronbach’s alpha

A
  • straightforward calculation that measures a specific kind of reliability for a composite variable
  • composite variable = a variable that averages a set of items (typically from a survey) to measure some specific concept
  • internal reliability = concerns the degree to which the various items lead to a consistent response & therefore tap the same concept
  • the higher the alpha (on a 0–1 scale), the more internally reliable the composite is, meaning people respond to the various items in a consistent way
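As a sketch (data invented: 5 respondents answering 3 survey items on a 1–5 scale), alpha can be computed from the item variances and the variance of respondents’ total scores: α = k/(k−1) · (1 − Σ var(itemᵢ) / var(total)).

```python
# Cronbach's alpha for a composite variable; each inner list is one item,
# each position is one respondent.
def cronbach_alpha(items):
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
]
print(round(cronbach_alpha(items), 2))  # → 0.92, highly internally reliable
```

Because these invented respondents answer all three items in a consistent pattern, the item variances are small relative to the variance of the totals, pushing alpha toward 1.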
29
Q

Reliability

Intercoder reliability

A
  • when multiple observers or coders are used in data collection, it’s important to report measures of intercoder reliability
  • reveals how much the coders/observers agree with one another when looking at the same data
  • Cohen’s kappa = statistic for assessing intercoder reliability that gauges the level of agreement between 2 raters in categorizing the same set of cases and is measured on a 0–1 scale
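Cohen’s kappa is easy to compute by hand: κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement rate and p_e the agreement expected by chance from each coder’s marginal frequencies. A sketch with two hypothetical coders classifying six cases:

```python
# Cohen's kappa: chance-corrected agreement between two coders
# (hypothetical codings of six cases as 'poor' / 'not').
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n   # observed agreement
    fa, fb = Counter(coder_a), Counter(coder_b)
    p_e = sum(fa[c] * fb[c] for c in fa) / n ** 2             # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = ["poor", "poor", "not", "not", "poor", "not"]
b = ["poor", "not",  "not", "not", "poor", "not"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

Here the coders agree on 5 of 6 cases (p_o ≈ 0.83), but half of that agreement would be expected by chance (p_e = 0.5), so kappa is a more conservative 0.67.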
30
Q

Reliability in design

A
  • the most effective method for increasing the reliability of a research method is to do a very careful job of conceptualization
  • Conceptualization that is both thorough and sound will generally point to the most reliable operationalization
31
Q

Reliability

Precision

A
  • precision is a key element of measurement that supports reliability
  • the more detailed & precise the measures are, the more reliable they tend to be
  • one tip is to use multiple indicators to measure any one construct
32
Q

Reliability

Robustness

A
  • how well the operational protocol is working
  • aims to gauge the robustness of some operational approach in the early stages of a study so that the researcher can assess any reliability problems before the study is fully underway
  • split-half method, test-retest method, pilot testing
33
Q

Reliability - robustness

1 - split-half method

A
  • appropriate when dealing with a measure consisting of multiple items
  • the researcher randomly splits the set of items for a measure into 2 sets to create two separate measures instead of one
  • these two measures are then tested in a sample of individuals
  • comparing how the same people respond to the two measures gives some idea of how reliable the overall measure would be
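The split-half idea can be sketched in a few lines. The items and data below are invented, and the comparison shown is a plain Pearson correlation of the half-scores (in practice the half-correlation is often stepped up with the Spearman-Brown formula):

```python
# Split-half sketch: randomly split a multi-item measure into two halves,
# score each half per respondent, then correlate the half-scores.
import random

def split_half_correlation(items, seed=0):
    """items: one list of respondent scores per survey item."""
    rng = random.Random(seed)
    idx = list(range(len(items)))
    rng.shuffle(idx)                          # random split of the items
    half1, half2 = idx[: len(idx) // 2], idx[len(idx) // 2 :]
    n = len(items[0])
    s1 = [sum(items[j][i] for j in half1) for i in range(n)]
    s2 = [sum(items[j][i] for j in half2) for i in range(n)]
    m1, m2 = sum(s1) / n, sum(s2) / n         # Pearson correlation of halves
    cov = sum((x - m1) * (y - m2) for x, y in zip(s1, s2))
    sd1 = sum((x - m1) ** 2 for x in s1) ** 0.5
    sd2 = sum((y - m2) ** 2 for y in s2) ** 0.5
    return cov / (sd1 * sd2)

items = [  # invented: 4 items, 5 respondents, answering consistently
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
    [4, 5, 3, 3, 2],
]
r = split_half_correlation(items)  # close to 1: the halves track each other
```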
34
Q

Reliability - robustness

2 - test-retest method

A
  • the same measure is administered to a sample and then readministered to the same sample later
  • if the scores are similar it is reliable
  • issue… other factors could change responses that have nothing to do with reliability
35
Q

Reliability - robustness

3 - pilot testing

A
  • involves administering some version of a measurement protocol to a small preliminary sample of subjects as a means of assessing how well the measure works & whether any problems need to be fixed
  • after the pilot test, researchers make necessary corrections before officially administering the measure to a larger sample
36
Q

Validity

Internal validity of a study

A
  • refers to the degree to which a study establishes a causal effect of the independent variable on the dependent variable
  • experimental designs are the most internally valid because they can establish causality
37
Q

Validity - internal validity of a measure

1 - face validity

A
  • the simplest dimension of internal validity
  • whether something looks valid on the face of it
  • the 1st and most shallow assessment of validity that a researcher could make
38
Q

Validity - internal validity of a measure

2 - criterion-related validity

A
  • assesses the validity of a measure according to how closely it is associated with some other factor
  • we gain confidence in the validity of a measure when it is related to things to which it should be related and unrelated to things to which it should be unrelated
  • concurrent validity = it will correlate strongly with some pre-existing measure of the construct that has already been deemed valid
  • predictive validity = it will correlate strongly with a measure that it should predict
  • why not just use the best pre-existing measure? feasibility
39
Q

Validity - internal validity of a measure

3 - content validity

A
  • all about coverage
  • how well a measure is encompassing the many different meanings of a single concept & all possible answers
40
Q

Validity - internal validity of a measure

4 - construct validity

A
  • refers to how well multiple indicators are connected to some underlying factor
  • important because some phenomena are not directly observable and can only be identified by cataloging their observable symptoms
  • convergent validity = a dimension of validity gauging whether concepts that should be associated with each other are
  • discriminant validity = a dimension of validity gauging whether concepts that should not be associated with each other are not
41
Q

construct validity example

A

Example: Happiness
A. In the past week, how often have you felt at peace?
B. In the past week, how often have you felt angry?
C. In the past week, how often do you take public transport?

42
Q

Factors threatening internal validity…

A
  • Participant attrition
  • Experimenter effects
  • Selection bias (control & treatment groups differ)
43
Q

External validity…

A
  • the degree to which the results of a study can generalize beyond the study
  • involves 2 basic questions
  • 1) how representative is the group being studied? (e.g. if only men are included in a study about parenting, that doesn’t really make sense)
  • 2) how ‘real’ is the study?