Chapter 17 - 18 Flashcards
Qualitative Data Analysis + Conclusions
Qualitative data analysis
is aimed at making valid inferences from the often overwhelming amount of collected data
Steps of qualitative data analysis:
- data reduction: the process of selecting, coding and categorizing the data
- data display: ways of presenting the data
- drawing and verifying conclusions: based on patterns in the reduced set of data
coding
the analytic process through which the qualitative data that you have gathered are reduced, rearranged, and integrated to form a theory
codes:
are labels given to units of text which are later grouped and turned into categories
coding unit:
choose the unit of analysis (e.g., words, sentences, paragraphs, or themes)
categorization
the process of organizing, arranging, and classifying coding units
grounded theory
where no theory is available, you must generate codes and categories inductively from the data
data display
taking your reduced data and displaying them in an organized, condensed manner
category reliability
the extent to which judges are able to use category definitions to classify the qualitative data
interjudge reliability
% of coding agreements out of the total number of coding decisions
- agreement rates at or above 80% are considered to be satisfactory
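The agreement rate above can be computed directly; a minimal sketch with hypothetical coding decisions from two judges (the category labels are made up for illustration):

```python
# Hypothetical coding decisions by two judges over the same ten text units.
judge_a = ["cost", "quality", "cost", "service", "quality",
           "cost", "service", "service", "quality", "cost"]
judge_b = ["cost", "quality", "cost", "quality", "quality",
           "cost", "service", "service", "quality", "service"]

# Interjudge reliability: coding agreements divided by total coding decisions.
agreements = sum(a == b for a, b in zip(judge_a, judge_b))
reliability = agreements / len(judge_a)

print(f"{reliability:.0%}")  # 8 of 10 decisions agree -> 80%, just satisfactory
```

Note this is simple percent agreement; it does not correct for chance agreement the way coefficients such as Cohen's kappa do.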
validity in qualitative research
- accurately represent the collected data (internal)
- can be generalized or transferred to other contexts or settings (external)
how to know if it is valid
- supporting generalizations by counts of events
- ensuring the representativeness of cases and the inclusion of deviant cases
- the selection of deviant cases provides a strong test of your theory!
content analysis
an observational research method that is used to systematically evaluate the symbolic content of all forms of recorded communication
- establishes the existence and frequency of concepts in text
relational analysis
builds on conceptual analysis by examining the relationships among concepts in the text
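Both ideas can be sketched in a few lines: conceptual analysis counts how often concepts occur, relational analysis counts how often pairs of concepts co-occur within the same text unit. The coded fragments and concept labels below are hypothetical:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded interview fragments; each inner list holds the
# concepts assigned to one text unit.
units = [
    ["workload", "stress"],
    ["stress", "turnover"],
    ["workload", "stress", "turnover"],
    ["pay"],
]

# Content (conceptual) analysis: existence and frequency of concepts.
frequency = Counter(concept for unit in units for concept in unit)

# Relational analysis: how often pairs of concepts co-occur in one unit.
cooccurrence = Counter(
    pair
    for unit in units
    for pair in combinations(sorted(set(unit)), 2)
)

print(frequency)      # e.g. "stress" appears in 3 of 4 units
print(cooccurrence)   # e.g. ("stress", "workload") co-occur twice
```

Sorting each unit's concepts before pairing ensures ("stress", "workload") and ("workload", "stress") are counted as the same relationship.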
narrative analysis
an approach that aims to elicit and scrutinize the stories we tell ourselves and their implications for our lives
analytic induction
an approach in which universal explanations of phenomena are sought by collecting data until no cases that are inconsistent with a hypothetical explanation of a phenomenon are found
big data
a term used to describe the exponential growth and availability of data from digital sources inside and outside the organization
- defined by its volume, variety, and velocity
conclusions
conclusions represent your informed judgement with regard to how the organizational problem can best be solved
- culmination of research project
- can be applied to inform evidence based decision making
argumentation
a conclusion (or claim): the main point the argument is trying to establish
premises (the evidence): the statements that support the conclusion
if the premises are true…
then the conclusion must be true
- in such cases the argument is said to be valid
Ampliative (or inductive) arguments
- in real life, most arguments are not deductive
- the relationship between the premises and conclusion is less than perfect
- could be either good or bad
Toulmin’s Model
takes the conclusion as a starting point and emphasizes that conclusions are often based on probability (strong relationships) rather than on certainty (perfect relationships)
when is Toulmin’s model useful?
when arguments are ampliative (inductive)
- the relationship between the premises and the conclusion is less than perfect
6 constituent parts of the Toulmin model
- the conclusion
- the premise
- the warrant
- a backing
- qualifiers
- counterarguments (rebuttals)