12 - Analysing Qualitative Data Flashcards
Analytic induction
method of moving from particular to general via instances: theory is modified in the light of features of new instances.
Example: Analytic induction is like learning from examples: you start with specific cases, build a provisional theory, and revise it as each new instance is examined, gradually moving from the particular to the general.
Analytic procedure
the methodological procedure used to analyse data and its epistemological justification; usually located in the methods section of qualitative reports.
Coding unit
item categories identified in qualitative data using content analysis.
Content analysis
search of qualitative materials (especially text) to find “coding units” (usually words, phrases or themes); analysis often concentrates on quantitative treatment of frequencies but can be a purely qualitative approach.
Example: It can involve counting how often these coding units appear, which is a quantitative approach, or it can be a purely qualitative analysis, focusing on understanding the content’s meaning and context.
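A minimal sketch of the frequency-counting (quantitative) side of content analysis, assuming transcripts are plain strings; the coding units and the example transcript below are invented for illustration, not taken from the textbook:

```python
from collections import Counter
import re

# Hypothetical coding units (words/phrases) the researcher has decided to look for.
CODING_UNITS = ["stress", "support", "coping", "work-life balance"]

def count_coding_units(transcript: str, units=CODING_UNITS) -> Counter:
    """Count how often each coding unit appears in a transcript (case-insensitive)."""
    text = transcript.lower()
    counts = Counter()
    for unit in units:
        # Word boundaries stop "support" from matching inside "supporters".
        pattern = r"\b" + re.escape(unit.lower()) + r"\b"
        counts[unit] = len(re.findall(pattern, text))
    return counts

if __name__ == "__main__":
    transcript = ("I felt a lot of stress at first, but the support from my team "
                  "helped me find new ways of coping with the workload.")
    for unit, n in count_coding_units(transcript).items():
        print(f"{unit}: {n}")
```

A purely qualitative content analysis would instead read the passages around each coding unit and interpret their meaning in context rather than tallying them.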
Contextual constructionist
theory of knowledge (epistemological position), which sees knowledge and truth as relative; different versions are possible depending on the context in which knowledge claims are made.
In other words, contextual constructionists hold that what counts as true or as knowledge can change depending on the context in which it’s discussed.
Epistemology
Theory of knowledge and of how knowledge is constructed.
Basically: the study of knowledge and how we come to know things.
Idiographic
Approach that emphasizes unique characteristics and experiences of the individual, not common traits.
In a nutshell, an idiographic approach focuses on the distinct, individual characteristics and experiences of a person, rather than on common traits shared by a group.
Inductive analysis
Work with qualitative data, which permits theory and hypotheses to evolve from the data rather than hypothetico-deductive testing of hypotheses set before data are obtained.
Instead of starting with preconceived hypotheses, it allows theories and hypotheses to emerge from the data itself. It’s a more flexible and open-ended approach where your understanding and ideas develop as you delve into the data, rather than testing hypotheses established before data collection.
Nomothetic
Approach that looks for common and usually measurable factors on which all individuals differ.
It aims to find general patterns and principles that can be applied to a broader population, focusing on what people have in common rather than their unique characteristics or experiences.
Paralinguistics
body movements and vocal sounds that accompany speech and modify its meaning.
Example: pauses, “uhm”, “err”, etc. (Ref.: Dad mentioned it in his book.)
Radical constructionist
theory of knowledge (epistemological position) that sees knowledge and truth as semantic constructions.
This perspective questions the notion of objective reality and emphasizes the role of human perception and language in shaping what we consider to be knowledge and truth.
Respondent validation/ member checking
attempts to validate findings and interpretations by presenting these to original participants for comments and verification.
This helps ensure that the participants’ perspectives align with the way you’ve interpreted the data, adding credibility to your research outcomes.
Selective coding
Higher order treatment of initial themes and categories where superordinate themes may emerge that bind lower categories together.
It involves taking the initial themes and categories identified in your data and further refining them. Selective coding is like organizing your data into a more structured and comprehensive framework, with the superordinate themes serving as the glue that binds together the lower-level categories.
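A toy illustration of that idea (all theme names and quotes below are invented): superordinate themes from selective coding binding together lower-level categories identified in earlier coding.

```python
# Invented example: lower-level categories from initial coding, bound together
# under superordinate themes that emerged during selective coding.
selective_codes = {
    "Loss of control": {  # superordinate theme
        "unpredictable workload": "I never know what the day will bring",
        "dependence on others": "I had to wait for their decision",
    },
    "Rebuilding identity": {
        "new routines": "I started running every morning",
        "redefining success": "Getting through the week felt like a win",
    },
}

# Each superordinate theme groups several lower categories and a supporting quote.
for theme, categories in selective_codes.items():
    print(theme)
    for category, quote in categories.items():
        print(f'  - {category}: "{quote}"')
```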
Triangulation
Comparison of at least two views/explanations of the same thing(s): events, actions, etc.
It’s like looking at something from multiple angles to get a more comprehensive and accurate understanding.