Topic 3 Research Design: An Overview Flashcards
What is a Research Design
the overall strategy that you choose to integrate the different components of the study in a coherent and logical way
- plan for answering a research question using empirical data
Method-> design
Quantitative + Qualitative
Focus
Quan = specific beh that can be easily quantified (questionnaires)
Qual = people's beh in natural settings, describing their world in their own words (discussions)
Objective
Quan= quantify data and generalise results from a sample to understand the population of interest -> track data over time
Qual= understanding underlying reasons or motivations -> provide insight into setting of a problem
Sample size and data collection for quan and qual studies
Quan = large and broad, statistically projectable
- standardised instruments, operationalisation of variables
Qual = small and narrow, not statistically projectable
- adapted to the situation, variables not defined in advance
types of quantitative research
types of qualitative design
Quan
- experimental
-> true-experimental
— IV manipulated
— random assignment
— control group or multiple measures
-> quasi-experimental
— IV manipulated
— NOT random assignment
— control group or multiple measures
-> single-subject
— IV manipulated
- non-experimental
-> analytical/correlational (with comparison)
-> descriptive (without comparison)
survey studies, naturalistic/ecology, meta-analysis study, case-study
qual
- ethnography, case study, historical, in-depth interview, document review
observational methods
objectives, methods, issues
naturalistic objectives, methods, issues (qualitative)
-> complete, accurate picture of what happened in the setting
-> keep detailed field notes, write or dictate on a regular basis
-> subjectivity, alteration of results if observation is not concealed, time consuming
Systematic objectives, methods, issues (quantitative)
-> obs. of a specific beh in a setting, develop hypotheses
-> coding systems: decide which behaviors are of interest, choose a setting, observe and codify
-> equipment, reactivity, reliability
quan - Experimental -> true experiment
goal: psych aims to understand human beh - accurately describe its causal underpinnings - predict beh
-> cause-effect relationship
-> using predictive analytics
is the use of statistical algorithms to identify the likelihood of future outcomes based on data
-> having high internal validity
establish a trustworthy cause-effect relationship (eliminate alternative explanations)
quan - Experimental -> true experiment 2
Maximise independent-variable (systematic) variance
Minimise error variance
Control external variables
-> in large samples, outcomes predicted by chance follow a normal (Gaussian) distribution (bell-shaped curve)
mean = average outcome
other outcomes are distributed around the mean
-> outcomes become less frequent the further they fall from the mean
variance
variance is the amount that scores vary around the mean score (SHOULD BE SIMILAR BETWEEN THE GROUPS)
= HOMOgeneity (sameness) of variance
= heterogeneity (non-homogeneity) of variance => if the variances are not similar, this might affect the validity of the outcome. Statistical significance is often based on how much the scores vary
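A minimal sketch of comparing variance between two groups with Python's `statistics` module; the score lists and the ratio rule of thumb are illustrative assumptions, not from the source.

```python
import statistics

# Hypothetical scores for two groups (illustrative data only)
group_a = [12, 14, 15, 13, 16, 14, 15]
group_b = [10, 15, 18, 9, 19, 13, 14]

var_a = statistics.variance(group_a)  # sample variance around the mean
var_b = statistics.variance(group_b)

# Crude homogeneity check: ratio of larger to smaller variance.
# Rules of thumb vary; a ratio well above ~2 suggests heterogeneity.
ratio = max(var_a, var_b) / min(var_a, var_b)
```

In practice a formal test such as Levene's test is used for this check; the ratio above is only a quick intuition aid.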
How do we minimise error variance / control external variables?
- large group of participants
- suitable measuring instruments
- rigorous research planning
- balancing the design
- removal of outliers
- assumption of normality
controlling external variables - balancing the design
balanced designs have equal numbers of observations for all possible level combinations, compared to unbalanced designs
balance 30 - 30 - 30
unbalanced 30 - 28 - 30
controlling external variables - removal of outliers
def - data points that significantly differ from other observations. They can arise from data variability or measurement errors, potentially skewing results and leading to inaccurate conclusions. Identifying outliers:
-> visualise :
Box Plots - Show the spread of data points to identify outliers.
Scatter Plots - Visualize relationships between variables to spot outliers.
-> statistical methods
Z-Scores - Calculate the z-score for each data point; values beyond ±3 are typically outliers.
Interquartile Range (IQR) - Calculate IQR (Q3 - Q1). Points below Q1 - 1.5IQR or above Q3 + 1.5IQR are considered outliers.
-> automated detection
Automated Detection - Use algorithms like DBSCAN to detect outliers based on data distribution.
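The z-score and IQR rules above can be sketched in Python; the score list is a hypothetical example, and the thresholds follow the ±3 z-score and 1.5×IQR conventions just stated.

```python
import statistics

def zscore_outliers(data, threshold=3.0):
    """Flag points whose |z-score| exceeds the threshold."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    return [x for x in data if abs((x - mean) / sd) > threshold]

def iqr_outliers(data):
    """Flag points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1, _, q3 = statistics.quantiles(data, n=4)  # quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [x for x in data if x < low or x > high]

scores = [12, 13, 14, 14, 15, 15, 16, 17, 48]  # 48 is a planted outlier
iqr_outliers(scores)     # flags 48
zscore_outliers(scores)  # may miss it: in a small sample the outlier
                         # inflates the SD, keeping its own z below 3
```

The last two lines illustrate why the IQR rule is often preferred for small samples.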
Controlling external variable - Assumption of normality
When the data do not have a normal distribution, a possible fix is to apply a statistical transformation. Transforming data is a method of changing the distribution by applying a mathematical function to each participant's data value:
- log transformation moves a positively skewed distribution towards normal
- exponential transformation moves a negatively skewed residual towards normal
DEFINING THE EXPERIMENTAL METHOD
- cross-sectional studies
different groups of people are tested at the same time and their results are compared. Quick to carry out, easily replicated to test the reliability of the findings
- longitudinal studies
participants are studied over a long period of time -> track development, monitor changes over time
- cohort studies
several groups are studied over a long period of time
example: an exposed group and a not-exposed group are followed as time passes, then compared
Within-subject
each participant takes part in all experimental conditions
between-subject
each participant takes part in one experimental condition
Random assignment to experimental conditions
benefit:
- eliminates selection bias
- balances the groups with respect to many known and unknown confounding or prognostic variables
-> randomising an experiment is essential for testing the efficacy of a treatment
in clinical research, if treatment groups are systematically different, research results will be biased
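The assignment procedure can be sketched as follows; `randomly_assign` is a hypothetical helper built on Python's `random` module, not a named method from the source.

```python
import random

def randomly_assign(participants, groups=("treatment", "control"), seed=None):
    """Shuffle participants, then deal them round-robin into groups:
    balanced group sizes, and no systematic selection into either group."""
    rng = random.Random(seed)  # seed only for reproducible demos
    pool = list(participants)
    rng.shuffle(pool)
    assignment = {g: [] for g in groups}
    for i, p in enumerate(pool):
        assignment[groups[i % len(groups)]].append(p)
    return assignment

assignment = randomly_assign(range(30), seed=42)  # 15 per group
```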
Quasi-experimental design
Anything less is quasi-experimental (‘‘almost’’ experimental). In general, a quasi-experiment exists whenever causal conclusions cannot be drawn because there is less than complete control over the variables in the study, usually because random assignment is not feasible.
-> Impossibility of creating experimental conditions: single-group studies.
aim:
― Analytical-predictive objectives
― Impossibility to control confounding variables
― Reduced internal validity (because extraneous variables are not held constant)
The quasi-experimental, sometimes called the pre-post intervention design is often used to evaluate the benefits of specific interventions.
Single-subject
Individual “case” is the unit of intervention and unit of data analysis; The case provides its own control for purposes of comparison.
example
-> Management at an electric fry pan company is testing a flextime schedule to boost productivity by improving morale. Pittsburgh plant adopts flextime, Cleveland plant maintains the regular schedule. Productivity in both plants will be measured and compared to evaluate the effect.
Correlational vs experimental
C: concerned with investigating the relationships between naturally occurring variables and with studying individual differences
E: interested in minimising or controlling these differences in order to show that some stimulus factor influences beh
Analytical/ correlations study
Positive correlations
both variables increase or decrease at the same time
negative correlations
as the amount of one variable increases the other decreases
no correlation
there is no relationship between the two variables
correlation coefficient (r)
A correlation coefficient is a statistic that describes how strongly variables are related to one another
Pearson's r provides info about both the strength and the direction of the relationship
0.8 < |r| < 1 strong
0.5 < |r| < 0.8 moderate
|r| < 0.5 weak
The Pearson correlation coefficient (r) is designed to detect only linear relationships -> it can miss a curvilinear relationship
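A sketch of Pearson's r computed from its definition, also demonstrating the curvilinear limitation just noted; the study-hours and exam-score data are made up for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

hours = [1, 2, 3, 4, 5]       # hypothetical hours studied
score = [52, 55, 61, 64, 70]  # hypothetical exam scores
pearson_r(hours, score)       # close to 1: strong positive correlation

# A perfect curvilinear (quadratic) relationship yields r = 0,
# even though the variables are clearly related:
pearson_r([-2, -1, 0, 1, 2], [4, 1, 0, 1, 4])
```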
Common approaches in correlational research
- survey study -> method for collecting information or data as reported by individuals. Administered to research participants who answer the questions themselves
- archival data -> data that has already been collected for another purpose: newspapers, census data, institutional records, hospital records
- content analysis -> analyze spoken or written records
Common approaches in correlational research
- meta-analysis
-> statistical procedure combining data from multiple studies
systematic and quantitative, combines data in order to provide information about effect sizes
— identify studies; determine eligibility of studies; get the data from these studies; statistically analyze the data
- case study
-> the case study research design is useful for testing whether scientific theories and models actually work in the real world (CREd library - clinical research education)
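The "statistically analyze the data" step of a meta-analysis can be sketched with inverse-variance (fixed-effect) pooling, one common way to combine effect sizes; the effect sizes and variances below are hypothetical.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled effect size.
    Each study's weight is 1 / variance of its effect estimate, so
    more precise studies count for more."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))  # standard error of pooled estimate
    return pooled, se

# Hypothetical effect sizes (Cohen's d) and variances from three studies
effects = [0.30, 0.45, 0.25]
variances = [0.02, 0.05, 0.01]
pooled, se = fixed_effect_pool(effects, variances)
```

Random-effects models (which allow true effects to differ across studies) are also common; this fixed-effect version is the simplest instance of the pooling idea.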
single-subject design vs. case studies
Common approaches in correlational research
- Systematic observation
make observations in a natural environment; utilizes coding: target behaviors are specified and then watched
Rank of internal validity
In order of lowest internal validity to highest: correlational (low), quasi-experimental (moderate), experimental (high)
DESCRIPTIVE STUDY
-> frequency distribution: (when analysing data) indicates the number of individuals who receive each possible score on a variable
-> descriptive statistics: allow scientists to make precise statements about the data
=> central tendency - mean, median, mode (most freq score)
=> variability - standard deviation (ave. deviation of scores from mean)
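The frequency distribution, central-tendency, and variability measures above can be sketched with Python's `statistics` module; the score list is hypothetical.

```python
import statistics

scores = [3, 5, 5, 6, 7, 7, 7, 8, 9]  # hypothetical test scores

# Frequency distribution: how many individuals got each possible score
freq = {s: scores.count(s) for s in sorted(set(scores))}

# Central tendency
mean = statistics.mean(scores)
median = statistics.median(scores)
mode = statistics.mode(scores)   # most frequent score

# Variability
sd = statistics.stdev(scores)    # spread of scores around the mean
```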
QUALITATIVE RESEARCH - ethnography
-> prolonged study of an intact culture in its normal setting; the researcher gathers primarily observational data
not a static design: the categories [or themes described by the participants] develop during the study
QUALITATIVE RESEARCH - case study
-> The researcher explores a single entity or phenomenon (‘the case’) bounded by time and activity (a program, event, process, institution, or social group)
collects detailed information by using a variety of data collection procedures
-> Phineas Gage
an iron rod driven through his skull; the damage to his frontal lobes changed his personality entirely
QUALITATIVE RESEARCH
- Historical
The use of historical data to answer a question
-> systematic collection and objective evaluation of data related to past occurrences in order to test hypotheses concerning causes, effects, or trends of these events that may help to explain present events and anticipate future events
QUALITATIVE RESEARCH
- In-depth interview
- document review
-> Conducting intensive individual interviews with a small number of respondents to explore their perspectives on a particular idea, program, or situation.
-> Documents are interpreted by the researcher to give voice and meaning around an assessment topic. Analyzing documents incorporates coding content into themes similar to how focus group or interview transcripts are analyzed.
Literature Review, Systematic Review, Meta-analysis, Umbrella review
- literature review: comprehensive summary of all the knowledge available on a specific topic. It involves reading and synthesizing the findings from various sources like books, articles, and other research papers.
- systematic review: detailed and comprehensive review of existing literature using a structured and predefined method. It aims to minimize bias by using a systematic approach to search, appraise, and synthesize research evidence.
- meta-analysis: statistical technique used within a systematic review to combine the results of multiple studies. It provides a more precise estimate of the effect size by pooling data.
- umbrella review (overview of reviews): synthesizes data from multiple systematic reviews and meta-analyses on a broad topic.