Week 4 Flashcards
Strategies associated with quantitative research
–>Surveys
*(often) theory testing
!! establish correlation, not causation!!
*primary data: quant or qual
–>Experiments
*theory testing
!!!Establish causation!!!
*primary data: quant or qual
–>Decision science
*to solve problems, improve the process, forecast
*mathematical modeling
–>Secondary data analysis (regressions run etc.)
*usually quant but also qual. data
*(often) theory testing but also to explore a new topic
*very prominent in some fields (finance etc)
Theory testing: Hypotheses
More than 1 hypothesis developed from existing literature (theory).
Conceptual model: representation of all the hypotheses
*A good formulation of a hypothesis always includes the direction of effect (not just ‘A influences B’, but ‘A increases B’, or ‘the more A, the more B’)
*Measure with data and test with stats: reject or not.
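The "measure with data and test with stats" step can be sketched with a toy directional check: the constructs A and B and the scores below are invented, and a real analysis would also test statistical significance, not just the sign.

```python
# Hypothetical illustration: checking the directional hypothesis
# "the more A, the more B" via the sign of the sample correlation.
# All data and variable names are made up.

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up survey scores for constructs A and B
a = [1, 2, 3, 4, 5, 6]
b = [2, 1, 4, 5, 4, 7]

r = pearson_r(a, b)
# "A increases B" is only supported if r is positive;
# a real test would also check significance (p-value).
print(round(r, 2))  # 0.88
```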
Types of variables in hypotheses
- IV (independent or explanatory variable)
Explains change in the DV.
- DV (dependent variable)
Changed, or influenced, by the IV.
- Moderator
Changes the strength of a relationship between two variables.
- Mediator
‘Transmits’ the effect of one variable on another.
Design of survey: important components
- Define variables to measure
–> also control variables!!
- Operationalisation
- Questionnaire design
–> attention checks
–> self- or interviewer-administered
- Pilot test
- Population / sampling technique
- Which methods to combine
!!!It is important to correctly identify types of variables upfront – otherwise you cannot design your study!!!
Quality of surveys: risks
- Quality of questionnaire
–> Risk to CONSTRUCT VALIDITY
<do questions actually cover what we intend to cover?>
Fix:
*reliability check (Cronbach’s alpha)
*use previously validated scales
- Quality of sampling
–>Risk to EXTERNAL VALIDITY, sampling bias
Fix:
*sampling frame
*justify why a random sample was not used
- Response rate and threat of non-response bias
–>Risk to EXTERNAL VALIDITY (non-response bias limits generalizability)
Fix:
*Length of questionnaire
*Make efforts to maximize the response rate
- Response quality
–>Risk to CONSTRUCT VALIDITY
Fix:
*clarity of instructions
*check knowledgeability of respondent
*analyze missing values
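The Cronbach's alpha reliability check mentioned above can be sketched in a few lines; the item scores below are invented for illustration.

```python
# Sketch of a reliability check: Cronbach's alpha for a
# multi-item scale. The data is made up.
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per question (same respondents in order)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    item_vars = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Three 5-point items answered by five respondents (invented)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
]
alpha = cronbach_alpha(items)
# Common rule of thumb: alpha >= 0.7 suggests acceptable reliability
print(round(alpha, 2))  # 0.92
```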
Internal validity issues for surveys
Originate from poor analysis (e.g. wrong type of analysis or poor execution of the chosen method)
Confounding variables
Distort the “true” relationship between two variables!!
They may be difficult to observe, but if they affect the relationship between IV and DV, causality cannot be established!!!
Dealing with confounding variables: Statistical control (control variables), Randomization, Restriction, Matching
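Of the four strategies, randomization is the simplest to sketch: random assignment spreads unobserved confounders evenly across groups in expectation. The participant IDs and group count below are assumptions.

```python
# Minimal sketch of randomization as a way to deal with confounders.
import random

def randomize(participants, n_groups=2, seed=42):
    """Shuffle participants, then deal them into groups round-robin."""
    pool = list(participants)
    random.Random(seed).shuffle(pool)
    return [pool[i::n_groups] for i in range(n_groups)]

# Twenty hypothetical participant IDs split into two groups
control, treatment = randomize(range(1, 21))
print(len(control), len(treatment))  # 10 10
```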
Design of experiment
*Define your variables
(confounding variables—>control variables)
*Decide on operationalization of variables
*Environment
Lab (online or offline), field, quasi-experiment
*Design
setting, treatment, instructions, measurements, records…
*Participants
why these? How to recruit?
*Multiple relationships—> multiple experiments
*Need to compare the effects of more than one intervention –> different experimental groups
*Pre-test (pilot test)!
Experiment procedure
Assign participants to groups –>
Measure DV –>
Apply the treatment (manipulate the IV) in the experimental group only –>
Remeasure DV
*Maximise “stability” of environment
*Maximise “consistency”
*Control of participants and experimenters (researchers)
*Document consistently
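The pre/post procedure above can be sketched as a simple difference-in-differences: compare how much the DV changed in each group. All measurements below are made up.

```python
# Toy pre/post comparison for a two-group experiment.
from statistics import mean

# Made-up DV measurements, before and after the intervention
control_pre,   control_post   = [5.0, 5.2, 4.8, 5.1], [5.1, 5.3, 4.9, 5.0]
treatment_pre, treatment_post = [5.0, 4.9, 5.2, 5.1], [6.0, 5.8, 6.3, 6.1]

# Change within each group, then the difference between those changes
control_change   = mean(control_post)   - mean(control_pre)
treatment_change = mean(treatment_post) - mean(treatment_pre)
effect = treatment_change - control_change
print(round(effect, 2))  # 0.95
```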
Validity and reliability components in the experiments
(methodological quality)
- Construct validity
–>quality of measurement of variables
Improve: pilot test as you improve the design
- Internal validity
Strong in lab experiments if designed and executed well
Improve: good control
- External validity
Low in lab experiments –> “ecological validity”: does not represent real life
Improve: different groups to generalize
- Reliability
Documentation transparency maximised (replication)
Decision science: decision problem 3 elements
- Decisions
- Objectives
- Constraints
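The three elements can be illustrated with a toy production problem; the profit and capacity numbers are invented, and brute-force enumeration stands in for a proper optimization method.

```python
# Toy decision problem with the three elements above:
# decision   – how many units to produce
# objective  – maximize profit
# constraint – limited machine hours
PROFIT_PER_UNIT = 40   # invented
HOURS_PER_UNIT = 3     # invented
HOURS_AVAILABLE = 100  # invented

best = max(
    (units for units in range(0, 101)
     if units * HOURS_PER_UNIT <= HOURS_AVAILABLE),   # constraint
    key=lambda units: units * PROFIT_PER_UNIT,        # objective
)
print(best, best * PROFIT_PER_UNIT)  # 33 1320
```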
Advantages of quantitative methods
*Save costs: Evaluate different decisions before implementing them (cheaper)
*Handle a large decision space: quickly identify the best decision among many alternatives
*Make systematic trade-offs
*Deal with uncertainty: optimal for uncertain future
*Avoid biases
*Automated and hybrid decision making
*Deal with risk (what if analysis)
Typical process for a decision science project
real-world problem—>
assumptions—>
“Modeled problem”—>
Decisions—>
real-world problem
Never a 100% image of the real-world problem!
1. Problem definition
2. Mathematical model formulation
3. Selecting/ developing a solution method
4. Solve the problem (feedback if continuous)
5. Present your conclusions/ deliverables
!!No distinct step in data collection
Data in decision science
To test/ use quantitative methods:
* primary or secondary
*empirical or simulated
simulated data help with:
–>‘fix’ poor quality of empirical data or compensate for the lack of it
–>needed to create multiple scenarios
–>testing a solution method on a large set of cases
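Generating simulated data for multiple scenarios can be sketched as below; the demand distribution and its parameters are assumptions, not from the source.

```python
# Sketch of simulating demand scenarios when empirical data is
# scarce. Distribution choice and parameters are invented.
import random

def simulate_demand(n_scenarios, mean_demand=100, sd=15, seed=7):
    """Draw non-negative demand values from an assumed normal distribution."""
    rng = random.Random(seed)
    return [max(0, rng.gauss(mean_demand, sd)) for _ in range(n_scenarios)]

scenarios = simulate_demand(1000)
print(len(scenarios))  # 1000
```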
Model formulation: the most important step!!!!
3 types of questions in decision science
- Methodological (solving)
Finding ways to model and solve problems.
Developing rules or methods to get good solutions.
<How to solve it?>
- Theoretical (understanding)
Understanding the properties of problems and solution methods.
Checking how good or fast a method is in different situations.
<How does it work?>
- Practical (applying)
Applying methods to real-world problems.
Seeing how solutions change with different conditions.
<What works best in real life?>
Key Risks in Decision Science Methodology
*Misunderstanding the problem leads to a wrong model—verify understanding (Validity).
*A model that’s too simple or complex must be justified (Practicality & Validity).
*A mismatched solution method may be ineffective—align it with purpose (Practicality).
*A far-from-optimal solution needs analysis, e.g., case study (Practicality).
*A non-robust solution is too sensitive to input changes—use sensitivity analysis (Validity).
Good decision science balances validity (accuracy) and practicality (real-world use).
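The sensitivity analysis mentioned above can be sketched by re-solving a toy decision while varying an uncertain input and watching how the recommendation moves; all numbers are invented.

```python
# Minimal what-if / sensitivity analysis on a toy production decision.
def best_units(hours_available, hours_per_unit=3):
    """Constraint-limited optimum: produce as many whole units as hours allow."""
    return hours_available // hours_per_unit

# What-if scenarios for the uncertain capacity input
for hours in (90, 100, 110):
    print(hours, best_units(hours))  # 90 30, 100 33, 110 36
```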
Advantages & disadvantages of secondary data
(+) Convenience (can be obtained faster)
(+) Can be cheap or even free
(+) Size and range of datasets available
(+) Easier opportunity for study replication
(+) You can explain change and evolution
(+) More objectivity to data
(+) Potentially high quality (if collected by professionals)
(+) Unobtrusive nature
(-) Possible mismatch of existing data with your research objective/RQ
(-) Risk of non-compliance, biased sample
(-) Temptation to start research from available data rather than from the RQ/objective – an ethical problem
(-) Time cost of learning about the data
(-) High-quality data can be expensive
(-) Can be time-consuming
What to do if secondary data is not ‘perfect’ (or sufficient) for the initial research idea?
- Combine several data sources (merge existing sources or add your own through desk research)
- Adjust (slightly) your research focus
- Investigate which research problems can be studied with the available data
- You can think about what important and interesting questions can be studied using available data, but…
—>Be careful about this approach: data first, hypothesis second is NOT acceptable in research!!!!
Quality assessment of secondary data
- Always look at Coverage and Measurements to assess whether data is suitable for you!
- If the data seems suitable, look for risks to validity and reliability
*Risk to construct validity
–>Unclear or poor variable measurement
*Risk to external validity
–>Wrong sample (age, region, etc.)
*Risk to reliability
–>Unclear secondary data process or lack of documentation
Steps in quantitative data analysis
- Collect
- Structure
- CLEAN (distinct from qualitative analysis)
- Analyse
- Interpret analysis
- Draw conclusions
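The CLEAN step can be sketched as dropping incomplete records and coercing text to numbers before analysis; the field names and records below are made up.

```python
# Sketch of the CLEAN step: keep complete cases only and
# convert string fields to numbers. All data is invented.
raw = [
    {"respondent": 1, "age": "34", "score": "5"},
    {"respondent": 2, "age": "",   "score": "4"},   # missing age
    {"respondent": 3, "age": "29", "score": "3"},
]

clean = [
    {**row, "age": int(row["age"]), "score": int(row["score"])}
    for row in raw
    if row["age"] and row["score"]   # drop records with missing values
]
print(len(clean))  # 2
```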
When would you use secondary analysis as research method?
*Use secondary analysis when data is suitable, access is difficult, or the topic is sensitive.
*Useful for historical or long-term studies.
*Common in finance due to abundant data.
*In management research, mainly used for theory testing.
*Theory testing follows a structured approach like surveys.
*In finance and accounting, hypotheses may be implicit.