Analysing Qualitative Data Flashcards
What is content analysis? (2m)
. a method of analysing qualitative data
. converts large amounts of qualitative data
. into quantitative data
. done by identifying meaningful codes
. which can be counted
. so the data can be presented in a graph
Why is it appropriate to use a content analysis? (1m)
. data being analysed is qualitative
What is meant by coding? (1m)
. initial process of content analysis
. qualitative data is placed into meaningful categories
How is a content analysis carried out?
Explain how you would analyse qualitative data. (4m)
. read the transcript or watch/listen to the recording (CONTEXT)
. identify/create coding categories (GIVE E.G.)
. re-read/re-listen
. tally every time each code appears
. present quantitative data in graph/table
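Illustration (not part of the flashcards): a minimal Python sketch of the tallying step. The transcript and the coding categories ("insult", "compliment", "question") are hypothetical examples, not from a real study.

```python
# hypothetical transcript of qualitative data
transcript = (
    "you idiot. thanks, that was kind. why did you do that? "
    "you are useless. that was kind of you."
)

# hypothetical coding categories and the words/symbols that count towards each
coding_categories = {
    "insult": ["idiot", "useless"],
    "compliment": ["kind"],
    "question": ["why", "?"],
}

# tally every time each code appears in the transcript
tallies = {
    code: sum(transcript.lower().count(term) for term in terms)
    for code, terms in coding_categories.items()
}

print(tallies)  # quantitative data that could be presented in a table or graph
```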
What is a thematic analysis? (2m)
. method of analysing qualitative data
. identifying emergent themes
. present data in qualitative format
. e.g. a presentation, written report or newspaper article
How is a thematic analysis carried out? (2-4m)
. watch/listen to the video/recording and create a transcript (CONTEXT) - IF NOT ALREADY A TRANSCRIPT
. read and re-read transcript
. identify coding categories
. look for words which appear repeatedly
. combine codes to reduce the number
. into 3/4 themes that are linked (CONTEXT) (GIVE E.G.)
. present data in qualitative format
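Illustration (not part of the flashcards): a minimal Python sketch of the "look for words which appear repeatedly" step, using a hypothetical transcript. Combining the repeated words into 3-4 linked themes remains a qualitative judgement made by the researcher.

```python
from collections import Counter
import re

# hypothetical transcript; not taken from a real study
transcript = (
    "I felt anxious before the exam. The anxiety got worse at night. "
    "I could not sleep, and the lack of sleep made the anxiety worse."
)

words = re.findall(r"[a-z]+", transcript.lower())
stop_words = {"i", "the", "a", "and", "of", "at", "not", "could", "got", "made", "before"}

# words that appear repeatedly are candidate codes, which the researcher
# would then combine into a small number of linked themes
repeated = [(word, n) for word, n in Counter(words).most_common()
            if n > 1 and word not in stop_words]
print(repeated)  # e.g. [('anxiety', 2), ('worse', 2), ('sleep', 2)]
```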
Strength of content/thematic analysis - AO3
. easy to assess the reliability of findings and conclusions
. as other researchers can access the same materials
. and use the same coding system
. to check that findings are consistent
Limitation of content/thematic analysis - AO3
. researcher bias
. content which confirms the researcher's hypothesis is more likely to be identified
. than content that contradicts aims and expectations
. lowers internal validity
COUNTER ARGUMENT:
. many modern researchers are aware of their own biases
. and often make reference to these in the report
Definition of reliability
. ability to repeat a study
. in similar conditions
. to gain consistent results
Reliability of content analysis using test re-test
. complete the content analysis by creating a series of coding categories (GIVE E.G.)
. tally every time each category occurs within the data
. the same researcher repeats the content analysis (e.g. after a time interval)
. on the same qualitative data and tallies every occurrence again
. compare results from each content analysis
. correlate results using stats test
. strong positive correlation of +0.8 or above = high reliability
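Illustration (not part of the flashcards): a minimal Python sketch of the comparison step, assuming purely hypothetical tallies. The same correlation check applies to inter-rater reliability below, with the two lists coming from the two raters instead.

```python
from statistics import correlation  # Pearson's r, Python 3.10+

# hypothetical tallies per coding category from the first and second analysis
first_analysis = [12, 5, 9, 3, 7]
second_analysis = [11, 6, 9, 2, 8]

r = correlation(first_analysis, second_analysis)
print(f"r = {r:.2f}")

# a strong positive correlation of +0.8 or above suggests high reliability
print("high reliability" if r >= 0.8 else "reliability in doubt")
```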
Reliability of content analysis using inter-rater reliability
. two raters (psychologists)
. read through qualitative data separately
. create coding categories together (GIVE E.G.)
. the two raters read exactly the same content (CONTEXT)
. and separately tally each time the categories occur
. compare tallies from both raters
. correlate using appropriate stats test
. strong positive correlation of +0.8 = high reliability
Define operationalising (2m)
- to be specific and clear
- when defining coding categories
- to make the codes more measurable
Importance of improving reliability of content analysis
- if coding categories are not operationalised, it is not possible to repeat the analysis to check for consistent results
Assessing validity of content analysis using face validity
- the quickest, most superficial way
- an independent psychologist in the same field
- checks whether the coding categories (CONTEXT)
- look like they measure what they claim to measure (CONTEXT)
- at face value / first sight
- if YES, the content analysis is valid
Assessing validity of content analysis using concurrent validity
- compare the results of the new content analysis (CONTEXT)
- with results from a similar, pre-existing content analysis
- whose validity is already established
- if the results from both are similar
- the new content analysis is assumed to be valid
- the correlation between the two sets of coding results
- gained from an appropriate stats test, should exceed +0.8