Week 4 Flashcards

1
Q

Strategies associated with quantitative research

A

–>Surveys
*(often) theory testing
!! establish correlation, not causation!!
*primary data: quant or qual

–>Experiments
*theory testing
!!!Establish causation!!!
*primary data: quant or qual

–>Decision science
*to solve problems, improve the process, forecast
*mathematical modeling

–>Secondary data analysis (e.g. running regressions)
*usually quant but also qual. data
*(often) theory testing but also to explore a new topic
*very prominent in some fields (finance, etc.)

2
Q

Theory testing: Hypotheses

A

Usually more than one hypothesis is developed from existing literature (theory).
Conceptual model: representation of all the hypotheses

*A good formulation of a hypothesis always includes the direction of effect (not just ‘A influences B’, but ‘A increases B’, or ‘the more A, the more B’)
*Measure with data and test with stats: reject or not.
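
A minimal sketch of the "measure with data and test with stats" step, using simulated data and made-up variable names (a, b); it tests the directional hypothesis "the more A, the more B" with a one-sided correlation test:

```python
# Hypothetical sketch: simulated data, made-up variable names a and b.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(size=200)                # hypothetical measurements of A
b = 0.4 * a + rng.normal(size=200)      # hypothetical measurements of B

r, p_two_sided = stats.pearsonr(a, b)
p_one_sided = p_two_sided / 2 if r > 0 else 1 - p_two_sided / 2   # direction of effect matters
print(f"r = {r:.2f}, one-sided p = {p_one_sided:.4f}")            # reject H0 if p < alpha (e.g. 0.05)
```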

3
Q

Types of variables in hypotheses

A
  1. IV (independent or explanatory variable)
    Explains change in the DV.
  2. DV (dependent variable)
    Changed, or influenced, by the IV.
  3. Moderator
    Changes the strength of the relationship between two variables (see the regression sketch after this list).
  4. Mediator
    ‘Transmits’ the effect of one variable on another.
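
A minimal sketch (simulated data, hypothetical variable names) of how a moderator is commonly analysed, namely as an interaction term in a regression:

```python
# Hypothetical sketch: a moderator typically enters the model as an interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({"iv": rng.normal(size=n), "moderator": rng.normal(size=n)})
# Simulated DV: the effect of iv on dv grows with the moderator.
df["dv"] = 0.5 * df["iv"] + 0.3 * df["moderator"] + 0.4 * df["iv"] * df["moderator"] + rng.normal(size=n)

model = smf.ols("dv ~ iv * moderator", data=df).fit()    # adds iv, moderator and their interaction
print(model.params[["iv", "moderator", "iv:moderator"]]) # a sizeable interaction term indicates moderation
```
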
4
Q

Design of survey: important components

A
  1. Define variables to measure
    –> also control variables!!
  2. Operationalisation
  3. Questionnaire design
    –>attention checks
    –> self-administered or interviewer-administered
  4. Pilot test
  5. Population / sampling technique
  6. Which methods to combine

!!!It is important to correctly identify the types of variables upfront – otherwise you cannot design your study!!!

5
Q

Quality of surveys: risks

A
  1. Quality of questionnaire
    –> Risk to CONSTRUCT VALIDITY
    <do questions actually cover what we intend to cover?>
    Fix:
    *reliability check (Cronbach’s alpha; see the sketch after this list)
    *use previously validated scales
  2. Quality of sampling
    –> Risk to EXTERNAL VALIDITY, sampling bias
    Fix:
    *sampling frame
    *justify why a random sample was not used
  3. Response rate and threat of non-response bias
    –> Risk to CONSTRUCT VALIDITY
    Fix:
    *Length of questionnaire
    *Make efforts to maximize the response rate
  4. Response quality
    –> Risk to CONSTRUCT VALIDITY
    Fix:
    *clarity of instructions
    *check knowledgeability of respondent
    *analyze missing values
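
A minimal sketch of the reliability check named under point 1, computing Cronbach's alpha on simulated Likert items (the scale and the item names q1–q4 are made up):

```python
# Hypothetical sketch: Cronbach's alpha for a multi-item scale stored as DataFrame columns.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated 5-point Likert responses to a made-up 4-item scale (q1..q4).
rng = np.random.default_rng(1)
latent = rng.normal(size=100)
items = pd.DataFrame({
    f"q{i}": np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=100)), 1, 5)
    for i in range(1, 5)
})
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")   # ~0.7 or higher is the usual rule of thumb
```
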
6
Q

Internal validity issues for surveys

A

Originate from poor analysis (e.g. wrong type of analysis or poor execution of the chosen method)

7
Q

Confounding variables

A

Distort the “true” relationship between two variables!!
They can be difficult to observe, but if they affect the relationship between the IV and the DV, causality cannot be established!!!

Dealing with confounding variables: Statistical control (control variables), Randomization, Restriction, Matching
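
A minimal sketch of the first remedy, statistical control: the suspected confounder is added as a control variable to the regression (all data and names are simulated):

```python
# Hypothetical sketch of statistical control: add the suspected confounder as a control variable.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
conf = rng.normal(size=n)                         # confounder drives both IV and DV
iv = 0.8 * conf + rng.normal(size=n)
dv = 1.0 * conf + rng.normal(size=n)              # true effect of iv on dv is zero
df = pd.DataFrame({"iv": iv, "dv": dv, "conf": conf})

naive = smf.ols("dv ~ iv", data=df).fit()              # omits the confounder
controlled = smf.ols("dv ~ iv + conf", data=df).fit()  # statistical control
print(f"naive iv coefficient:      {naive.params['iv']:.2f}")       # spuriously far from zero
print(f"controlled iv coefficient: {controlled.params['iv']:.2f}")  # close to the true value of 0
```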

8
Q

Design of experiment

A

*Define your variables
(confounding variables—>control variables)
*Decide on operationalization of variables
*Environment
Lab (online or offline), field, quasi-experiment
*Design
setting, treatment, instructions, measurements, records…
*Participants
why these? How to recruit?
*Multiple relationships—> multiple experiments
*Need to compare more than one intervention –> different experimental groups
*Pre-test (pilot test)!

9
Q

Experiment procedure

A

Assign participants to groups –>
Measure DV –>
Administer the treatment (IV) in the experimental group only –>
Remeasure DV (sketched in code below)

*Maximise “stability” of environment
*Maximise “consistency”
*Control of participants and experimenters (researchers)
*Document consistently
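
A minimal sketch of the procedure above on simulated data; comparing the change in the DV between groups is one simple way to analyse such a design:

```python
# Hypothetical sketch: random assignment, pre-measurement, treatment in the experimental
# group only, post-measurement, then compare the change in DV between groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 120
group = rng.permutation(np.repeat(["control", "treatment"], n // 2))   # assign participants to groups

dv_pre = rng.normal(loc=5.0, scale=1.0, size=n)                        # measure DV
effect = np.where(group == "treatment", 0.6, 0.0)                      # intervention in treatment group only
dv_post = dv_pre + effect + rng.normal(scale=0.5, size=n)              # remeasure DV

change = dv_post - dv_pre
t, p = stats.ttest_ind(change[group == "treatment"], change[group == "control"])
print(f"difference in DV change between groups: t = {t:.2f}, p = {p:.4f}")
```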

10
Q

Validity and reliability components in the experiments
(methodological quality)

A
  1. Construct validity
    –>quality of measurement of variables
    Improve: pilot test as you improve design
  2. Internal validity
    Strong in lab experiments if executed and designed well
    Improve: good control
  3. External validity
    Low in lab experiments –> weak “ecological validity”: the setting does not represent real life
    Improve: use different groups/settings to generalize
  4. Reliability
    Maximise documentation and transparency (enables replication)
11
Q

Decision science: decision problem 3 elements

A
  1. Decisions
  2. Objectives
  3. Constraints
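
A minimal sketch of how the three elements can map onto a mathematical model, here a small, entirely hypothetical production-planning linear programme solved with scipy:

```python
# Entirely hypothetical production-planning example mapping the three elements:
#   Decisions:   how many units x1, x2 to produce
#   Objective:   maximise profit 3*x1 + 5*x2 (linprog minimises, so the coefficients are negated)
#   Constraints: 2*x1 + x2 <= 100 machine hours, x1 + 3*x2 <= 90 kg material, x1, x2 >= 0
from scipy.optimize import linprog

res = linprog(c=[-3, -5],
              A_ub=[[2, 1], [1, 3]],
              b_ub=[100, 90],
              bounds=[(0, None), (0, None)])
print("optimal decisions:", res.x)        # production quantities
print("optimal profit:", -res.fun)
```
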
12
Q

Advantages of quantitative methods

A

*Save costs: Evaluate different decisions before implementing them (cheaper)
*Handle a large decision space: quickly identify the best decision among many alternatives
*Make systematic trade-offs
*Deal with uncertainty: find decisions that perform well under an uncertain future
*Avoid biases
*Automated and hybrid decision making
*Deal with risk (what-if analysis)

13
Q

Typical process for a decision science project

A

real-world problem—>
assumptions—>
“Modeled problem”—>
Decisions—>
real-world problem

The model is never a 100% image of the real-world problem!
1. Problem definition
2. Mathematical model formulation
3. Selecting/ developing a solution method
4. Solve the problem (feedback if continuous)
5. Present your conclusions/ deliverables

!!No distinct data collection step

14
Q

Data in decision science

A

To test or use quantitative methods, data can be:
* primary or secondary
* empirical or simulated
Simulated data (sketched below) help to:
–> ‘fix’ poor quality of empirical data or compensate for the lack of it
–> create multiple scenarios
–> test a solution method on a large set of cases

Model formulation: the most important step!!!!
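
Returning to the simulated-data point above, a minimal sketch (all numbers assumed) of generating scenarios so that a candidate decision can be tested on many cases rather than on a single empirical history:

```python
# Hypothetical sketch: simulate many demand scenarios to stress-test a candidate decision.
import numpy as np

rng = np.random.default_rng(4)
n_scenarios, horizon = 1000, 12
demand = rng.normal(loc=200, scale=30, size=(n_scenarios, horizon)).clip(min=0)   # assumed demand model

order = 210                                    # a candidate monthly order quantity
shortage_rate = (demand > order).mean()        # how often would this decision fall short?
print(f"share of scenario-months with a shortage: {shortage_rate:.2%}")
```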

15
Q

3 types of questions in decision science

A
  1. Methodological (solving)
    Finding ways to model and solve problems.
    Developing rules or methods to get good solutions.

<How to solve it?>

  2. Theoretical (understanding)
    Understanding the properties of problems and solution methods.
    Checking how good or fast a method is in different situations.

<How does it work?>

  3. Practical (applying)
    Applying methods to real-world problems.
    Seeing how solutions change with different conditions.

<What works best in real life?>

16
Q

Key Risks in Decision Science Methodology

A

*Misunderstanding the problem leads to a wrong model—verify understanding (Validity).
*A model that’s too simple or complex must be justified (Practicality & Validity).
*A mismatched solution method may be ineffective—align it with purpose (Practicality).
*A far-from-optimal solution needs analysis, e.g., case study (Practicality).
*A non-robust solution is too sensitive to input changes—use sensitivity analysis (Validity; see the sketch below).

Good decision science balances validity (accuracy) and practicality (real-world use).
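
A minimal sketch of the sensitivity-analysis remedy, reusing the hypothetical production-planning model from the decision-problem card above: rerun the model while perturbing one input and check how much the solution moves.

```python
# Hypothetical sketch of sensitivity analysis: vary one input and observe the solution.
from scipy.optimize import linprog

for hours in (90, 100, 110):                   # vary the machine-hours constraint
    res = linprog(c=[-3, -5], A_ub=[[2, 1], [1, 3]], b_ub=[hours, 90],
                  bounds=[(0, None), (0, None)])
    print(f"machine hours = {hours}: decisions = {res.x.round(1)}, profit = {-res.fun:.0f}")
```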

17
Q

Advantages & disadvantages of secondary data

A

(+) Convenience (can be obtained faster)
(+) Can be cheap or even free
(+) Size and range of datasets available
(+) Easier opportunity for study replication
(+) You can explain change and evolution
(+) More objectivity of the data
(+) Potentially high quality (if collected by professionals)
(+) Unobtrusive nature

(-) Possible mismatch of existing data with your research objective/RQ
(-) Risk of non-compliance, biased sample
(-) Temptation to start research from available data rather than from RQ/objective – ethical problem*
(-) Time cost of learning about the data
(-) High-quality data can be expensive
(-) Can be time-consuming

18
Q

What to do if secondary data is not ‘perfect’ (or sufficient) for the initial research idea?

A
  • Combine several data sources (merge existing sources or add your own through desk research; see the merge sketch below)
  • Adjust (slightly) your research focus
  • Investigate which research problems can be studied with the available data
  • You can think about what important and interesting questions can be studied using available data, but…

—>Be careful about this approach: data first, hypothesis second is NOT acceptable in research!!!!
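
A minimal sketch of the first option above, merging two hypothetical secondary datasets on a shared key:

```python
# Hypothetical sketch: merge two secondary datasets on a shared key with pandas.
import pandas as pd

firms = pd.DataFrame({"firm_id": [1, 2, 3], "industry": ["retail", "tech", "tech"]})
financials = pd.DataFrame({"firm_id": [1, 2, 4], "revenue": [120.0, 310.0, 95.0]})

merged = firms.merge(financials, on="firm_id", how="inner")   # keep firms present in both sources
print(merged)
```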

19
Q

Quality assessment of secondary data

A
  1. Always look at Coverage and Measurements to assess whether data is suitable for you!
  2. If you think the data is suitable, look for risks to validity and reliability
    *Risk to construct validity
    –>Unclear or poor variable measurement
    *Risk to external validity
    –>Wrong sample (age, region, etc.)
    *Risk to reliability
    –>Unclear secondary data process or lack of documentation
20
Q

Steps in quantitative data analysis

A
  1. Collect
  2. Structure
  3. CLEAN (distinct from qualitative analysis)
  4. Analyse
  5. Interpret analysis
  6. Draw conclusions
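
A minimal sketch of steps 2–4 (structure, clean, analyse) on a tiny, made-up survey export; the raw data, the "99 = no answer" code and the column names are all hypothetical:

```python
# Hypothetical sketch: structure, clean and analyse a small fictional survey export with pandas.
import pandas as pd

raw = pd.DataFrame({
    "q1": [4, 5, None, 3],
    "q2": [4, 4, 2, 99],        # 99 used as a "no answer" code in this fictional export
    "q3": [5, 5, 1, 4],
    "age": [34, 29, 41, 27],
})

clean = (raw.replace({"q2": {99: None}})    # recode missing-value codes
            .dropna()                       # drop incomplete responses (or impute instead)
            .astype({"q1": int, "q2": int, "q3": int}))

clean["scale_mean"] = clean[["q1", "q2", "q3"]].mean(axis=1)   # analyse: a simple composite score
print(clean.describe())
```
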
21
Q

When would you use secondary analysis as research method?

A

*Use secondary analysis when suitable data exist, when access for primary data collection is difficult, or when the topic is sensitive.
*Useful for historical or long-term studies.
*Common in finance due to abundant data.
*In management research, mainly used for theory testing.
*Theory testing follows a structured approach like surveys.
*In finance and accounting, hypotheses may be implicit.