Lectures 11 & 12, Chapters 10 & 11 Flashcards

1
Q

Methodology or methodologism

A

Tendency to see methodological rigour as the only requirement for scientific research, at the expense of theory formation

2
Q

Pseudoscience

A

Branch of knowledge that pretends to be scientific but that violates the scientific method on essential aspects, such as lack of openness to testing by others and reliance on confirmation rather than falsification

3
Q

Hermeneutics

A

approach in psychology according to which the task of the psychologist is to interpret and understand persons on the basis of their personal and socio-cultural history

4
Q

Humanistic psychology

A

psychological movement promoted by Rogers and Maslow as a reaction against psychoanalysis and behaviourism; stressed that people are human, inherently positive, endowed with free will and living within a socio-cultural context

5
Q

Feminist psychology

A

movement in psychology aimed at understanding women; is particularly concerned with the way in which women are treated in mainstream psychology

6
Q

Postcolonial psychology

A

movement in psychology addressing the issues of racism and the ways in which dominant groups treat other groups

7
Q

Unconscious plagiarism

A

indicates how the scientific and the hermeneutic approach in psychology have influenced each other without the proponents being aware of it

8
Q

Replicability

A

The probability of obtaining the same finding when a scientific study is rerun (in the same way)

9
Q

Replication crisis

A

a crisis of confidence in scientific research, because many published findings cannot be repeated if studies are rerun, questioning the reliability of scientific findings

10
Q

File drawer problem

A

issue that the scientific literature badly represents the research done because experiments that do not find significant differences are less likely to get published

11
Q

Conceptual replication

A

replication in which an effect is investigated differently from the original study; is good to examine the generality of a finding, but can magnify biases in the scientific literature if combined with the file drawer problem

12
Q

Questionable research practices

A

research practices undermining the statistical conclusions that can be drawn from a study; usually increase the chances of finding a predicted effect

13
Q

P-hacking

A

Manipulating data in order to obtain a desired (significant) p-value

14
Q

HARKing

A

unexpected significant finding in a statistical analysis is presented as an effect that was the focus of the research and, therefore, addresses an important theoretical question (hypothesising after the results are known)

15
Q

Registered report

A

a type of research article that is evaluated by scientific journals before the data are collected; goal is to make the evaluation independent of the obtained results and solely dependent on the research question, the research design, and the proposed analyses

16
Q

Bayesian statistics

A

data analysis that deviates from the traditional hypothesis testing with p-values; estimates the relative probabilities of the null hypothesis and the alternative hypothesis; is hoped to correct existing misunderstandings of statistics
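Not from the chapter, but a minimal numerical sketch of the idea with made-up data: instead of computing a p-value, Bayesian analysis asks how much better one hypothesis predicts the observed data than the other (a Bayes factor).

```python
# Toy illustration (my own example, not from the book): is a coin fair
# (H0: p = .5) or biased (H1: p = .75), given 8 heads in 10 tosses?
from math import comb

def binom_likelihood(k, n, p):
    """Probability of exactly k successes in n trials with success rate p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

k, n = 8, 10
# Bayes factor: how much more likely the data are under H1 than under H0
bayes_factor = binom_likelihood(k, n, 0.75) / binom_likelihood(k, n, 0.5)
print(f"BF10 = {bayes_factor:.2f}")  # about 6.4: the data favour H1

# Posterior odds = Bayes factor x prior odds (equal prior odds here)
posterior_odds = bayes_factor * 1.0
```

With equal prior odds, a Bayes factor around 6 means H1 is about six times more probable than H0 after seeing the data; a p-value offers no such direct statement about the relative probability of the hypotheses.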

17
Q

Pottery barn rule

A

In science is the moral obligation of a scientific journal to publish a failure to replicate a finding previously published in the journal

18
Q

Open science

A

science practice where all relevant information is made easily available, so that other researchers can check the findings and integrate them in their own research

19
Q

Repository

A

in science is a location where data and analysis programs are stored, so that others can retrieve them (typically on the internet)

20
Q

Transparency and Openness Promotion (TOP) guidelines

A

list of criteria written by advocates of open science describing the extent to which journals adhere to the standards of open and reproducible science

21
Q

Secondary data analysis

A

Reanalysis of existing data to address new research questions

22
Q

Big data

A

Collection and use of large datasets for secondary data analysis

23
Q

Publish or perish

A

refers to the practice in academia that a person will not be appointed or promoted unless they have a strong portfolio of scientific publications

24
Q

Peer review

A

in science is the evaluation of scientific work by research colleagues (peers) to decide whether the work is good enough to be published (or financed in case of grant applications)

25
Q

Journal impact factor

A

number that estimates the impact a journal has on a research area; based on the average number of citations to articles in the journal in subsequent years
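The arithmetic behind the standard two-year impact factor can be sketched as follows (the numbers are invented for illustration):

```python
# Two-year journal impact factor (made-up numbers):
# citations received in 2024 to articles published in 2022-2023,
# divided by the number of citable articles published in 2022-2023.
citations_2024_to_recent_articles = 450
citable_articles_2022_2023 = 150

jif_2024 = citations_2024_to_recent_articles / citable_articles_2022_2023
print(f"JIF(2024) = {jif_2024:.1f}")  # 3.0: recent articles cited 3 times on average
```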

26
Q

Declaration on Research Assessment (DORA)

A

declaration that asks science funders and evaluators to look at the quality of the research itself (proposed and completed) rather than at the prestige of the outlets in which the research is published

27
Q

Open access journal

A

Journal that can be consulted without paying a subscription or fees for reading articles (usually via the internet)

28
Q

Article processing charge (APC)

A

Price asked by open access scientific journals to process a manuscript and publish it in the journal

29
Q

Mega-journal

A

large-volume, peer-reviewed academic open access journal designed to make more scientific findings available by focusing on methodological rigour rather than theoretical contribution; has a large remit, so that articles on many topics can be included

30
Q

Double dipping

A

Practice in science in which journals make money both by journal subscriptions and by article fees for open access

31
Q

Predatory journal

A

scientific journal that gives the impression of being genuine (peer-reviewed, with mechanisms of error control, promise of longevity) without adhering to the standards; tries to lure scientists to pay APCs for very limited service

32
Q

Quantitative research methods

A

research methods based on quantifiable data; are associated with the natural-science approach based on the hypothetico-deductive model

33
Q

Illusory correlation

A

Perception of a correlation between events for which no independent evidence can be found

34
Q

Confounding variable

A

Variable that was not taken into account in the study and that may be the origin of the effect observed

35
Q

Quantitative imperative

A

A bias to find only measurable topics interesting, because quantitative research methods require numerical data

36
Q

Qualitative research methods

A

research methods based on understanding phenomena in their historical and socio-cultural context; are associated with the hermeneutic approach based on understanding the meaning of a situation

37
Q

Idiographic approach

A

The conclusions of a study stay limited to the phenomenon under study

38
Q

Nomothetic approach

A

A study is run in search of universal principles that exceed the confines of the study

39
Q

Bracketing

A

Requirement in qualitative research to look at a phenomenon with an open mind and to free oneself from preconceptions

40
Q

Semi-structured interview

A

Interview in which each interviewee gets a small set of core questions, but for the rest of the time is encouraged to speak freely; achieved by making use of open-ended, non-directive questions

41
Q

Focus group

A

Technique in which a group of participants freely discusses a limited set of questions; usually makes use of the same questions as a semi-structured interview

42
Q

Grounded theory

A

qualitative research method that tries to understand what is going on in a particular situation and which, on the basis of a qualitative analysis and induction, tries to come to a theoretical insight grounded in the data

43
Q

Interpretative phenomenological analysis

A

Qualitative research method that tries to understand how a phenomenon is experienced by the people involved

44
Q

Discourse analysis

A

Qualitative research method that aims to discover how social relations between people are determined by the language they use

45
Q

What are strengths of quantitative research as a method

A
  1. Lends itself well for statistical analyses of large datasets
  2. Can produce precise predictions that can be tested
  3. Makes comparison (between groups or subjects) possible/easier
  4. Easier to investigate confounds and validity threats
46
Q

What are weaknesses of quantitative research as a method

A
  1. Little interest in the perception of participants
  2. Research limited by what is measurable
  3. Better suited to testing general theories than to finding solutions for specific problems
  4. If you don’t have a well-developed theory yet, they aren’t very helpful; not suitable for generating theories
  5. Assumes that psychological characteristics can be measured
47
Q

What are 3 approaches to qualitative research and explain them

A
  1. Grounded theory; analyses to build a specific theory
  2. IPA (interpretative phenomenological analysis); puts much emphasis on the subjective experience of the people being studied
  3. Discourse analysis; assumes that social reality is constructed by the way we communicate and tries to understand this by looking at language
48
Q

What are 3 limitations of grounded theory and what was the solution to these limitations

A
  1. Assumes there is an objective reality to be discovered
  2. Importance of verification and induction
  3. Did not take into account that data consists of participants’ perceptions/interpretations
    —> IPA was the solution
49
Q

What are strengths of qualitative research as a method

A
  1. Direct involvement with situation
  2. Generates theories and enables exploration
  3. Responsive to the needs of participants
50
Q

What are weaknesses of qualitative research as a method

A
  1. Less suitable for demonstrating general laws
  2. Little room for precise predictions/falsification
  3. Less suitable for deciding between theories
  4. Largely based on introspection/subjective evaluation
  5. Involvement of researcher may be a disadvantage in high-stakes situations
51
Q

What are 6 criteria that are important when doing qualitative research

A
  1. Representativeness. Researchers must provide the criteria they used for their data analysis, so that the representativeness of the reported instances can be gauged.
  2. Confirmability. Would someone else come to the same conclusions on the basis of an equivalent data analysis?
  3. Credibility. Do the conclusions sound credible to the participants involved in the study?
  4. Comparison of situations that differ on one critical aspect. Another way to check the validity of conclusions is to compare the data with those of similar cases that deviate in a critical aspect.
  5. Alternative explanations. Could the data be explained differently?
  6. Refutability. Is there evidence that refutes the conclusions? This principle very much resembles the falsification test in quantitative psychology.
52
Q

What are 4 misconceptions about qualitative research according to Marecek

A
  1. Qualitative and quantitative provide the same kind of understanding
  2. Qualitative research is a first exploration
  3. Qualitative research is purely inductive
  4. Qualitative research is the same as quantitative psychology but without numbers
53
Q

Who coined the term social constructionism and was he happy with how it progressed

A

Latour, no

54
Q

What did postmodernism entail

A

It rejects the assumptions and principles of modernity and rests on social constructionism, which states that our social reality is a construction of our minds. Postmodernism took this very far and extended it to biology, physics, etc. as well: it says that everything is relative and that there is no such thing as objective reality.

55
Q

What did Alan Sokal do and why

A

He was a physicist who published an article in a postmodernist journal arguing that physics proved there is no objective reality. The article was a deliberate hoax built on fake evidence; with it he tried to show the political consequences of a relativist view, and that the journal would publish anything that looked good and supported its view

56
Q

Explain the cargo cult science and who tells the story

A

Richard Feynman; in WWII the Americans built bases on Pacific islands to refuel their planes. The islanders had never seen such a thing, so when the Americans left, a variety of religious practices arose among them aimed at bringing the planes back —> a metaphor for bad science: even though the rituals imitate all the outward forms, the planes don't land —> it looks like science but it isn't

57
Q

What are Merton’s 4 core values and what do they mean

A
  1. Communalism - scientific products belong to no one
  2. Universalism - truth claims are judged the same, no matter what makes them
  3. Disinterestedness - scientists have no interest in the outcome of research
  4. Organized skepticism - ideas are critically scrutinised and rigorously tested, regardless of who proposes them
58
Q

What is the problem with Merton's norms and what are these norms or ideas called

A

They are not always actualized, but that does not make them useless; they are regulative ideas = things we strive for

59
Q

What are 4 examples of pseudo-science cases and what happened

A
  1. Vul’s voodoo correlations - many correlations between psychological constructs and brain areas were reported that were so high that they were basically impossible
  2. Bargh’s fantastic effects - priming people with words like old and elderly led to slower walking etc.; the effects later failed to replicate
  3. Bem’s Feeling the Future - “proved” that precognition was real; it turned out the results relied on QRPs
  4. Stapel - published really amazing results, turned out that data for 55 papers was entirely made up
60
Q

What is p-hacking

A

Deleting outliers, adding covariates, splitting the sample, selecting variables until you find a significant effect, etc.
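A small simulation (my own sketch, not from the lectures) makes the danger concrete. Here the only "hack" is optional stopping: checking the p-value after every batch of participants and stopping as soon as it is significant, even though the null hypothesis is true.

```python
# Simulate p-hacking by optional stopping under a true null effect.
import math
import random
import statistics

def t_test_p(sample):
    """Two-sided one-sample test against mean 0 (normal approximation)."""
    n = len(sample)
    t = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(n))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

def one_study(peek, batches=10, batch_size=10):
    """Return True if a study of a null effect ends up 'significant'."""
    data = []
    for _ in range(batches):
        data += [random.gauss(0, 1) for _ in range(batch_size)]
        if peek and t_test_p(data) < .05:
            return True            # stop early and report the "effect"
    return t_test_p(data) < .05    # honest: one test at the planned end

random.seed(1)
honest = sum(one_study(peek=False) for _ in range(2000)) / 2000
hacked = sum(one_study(peek=True) for _ in range(2000)) / 2000
print(f"false positives without peeking: {honest:.3f}")  # close to .05
print(f"false positives with peeking:    {hacked:.3f}")  # far above .05
```

Both kinds of study collect at most 100 participants and test a true null effect; merely peeking after every batch multiplies the false-positive rate severalfold.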

61
Q

What is publication bias

A

Only publishing positive results

62
Q

What is HARKing

A

Hypothesizing After the Results are Known

63
Q

What are funnel plots and how do they help us

A

They plot the effect sizes of studies against their sample sizes; larger samples should deviate less from the true effect size, producing a funnel shape —> with publication bias, data points are missing on one side (small studies with small or negative effects), so small studies look more accurate, and the effect looks larger, than they really are
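A minimal simulation (my own sketch) of the pattern a funnel plot exposes: small studies scatter widely around the true effect, and if only significant results get "published", the published record overestimates that effect.

```python
# Simulate many studies of the same true effect and censor the
# non-significant ones, as publication bias would.
import random
import statistics

random.seed(0)
TRUE_EFFECT = 0.2  # the effect size every simulated study estimates

studies = []
for _ in range(500):
    n = random.choice([20, 50, 100, 200, 400])
    se = 1 / n**0.5                       # standard error shrinks with n
    estimate = random.gauss(TRUE_EFFECT, se)
    significant = abs(estimate / se) > 1.96
    studies.append((n, estimate, significant))

# Plotting estimate against n would give the funnel; here we compare means.
all_mean = statistics.mean(e for _, e, _ in studies)
published_mean = statistics.mean(e for _, e, sig in studies if sig)
print(f"mean estimate, all studies:      {all_mean:.2f}")       # near 0.2
print(f"mean estimate, 'published' only: {published_mean:.2f}")  # inflated
```

In a real funnel plot the missing non-significant small studies leave a visibly asymmetric, hollowed-out funnel.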

64
Q

What are the 3 pillars of open science

A
  • open data
  • open materials
  • preregistration
65
Q

What is the open science framework

A

It is a repository in which people can upload their data, materials and preregistrations to make them available to everyone

66
Q

What is Psychological Science Accelerator (PSA)

A

A large distributed network of labs that runs huge multi-site studies to evaluate research findings and replicate them quickly

67
Q

What are 6 criticisms of open science

A
  1. Increased bureaucracy and more work for researchers
  2. Most data are never downloaded, many preregistrations aren’t adhered to, badges aren’t checked
  3. Too much focus on replication, not enough on creativity
  4. Open science is expensive and can compound inequalities
  5. Open science is not always feasible or desirable
  6. Some approaches may decline (e.g. field studies, qualitative research)
68
Q

What were the 4 elements of Dilthey's approach

A
  1. Psych should be content based, focus on what the mind comprises and not how it functions
  2. Subject matter of psych is the human experience in its totality
  3. A person's life is embedded in context and cannot be studied in isolation
  4. Natural-scientific research method cannot grasp the totality of mental life within its context, appropriate method for psych is understanding
69
Q

What are the 3 levels of understanding according to Dilthey

A
  1. Elementary forms of understanding used to solve simple problems of life
  2. Empathy through which an observer can re-experience someone else’s experience
  3. The hermeneutic level of understanding, by which an observed person can be better understood than they understand themselves
70
Q

Who founded humanistic psychology

A

Rogers and Maslow

71
Q

What are 3 basics of critical psychology

A
  • idealism instead of realism
  • science is a social construction
  • psychologists have a moral responsibility because psych affects reality
72
Q

What is a problem with conceptual replications

A

Failure does not say much about the original study because the replication wasn't exactly the same, but success adds weight to the finding of the study you replicated

73
Q

What were 5 negative consequences of the journal impact factor

A
  1. Increased power of the Web of Science company
  2. Further cemented the dominance of research published in English
  3. Books became less important than journal articles
  4. Some practices emerged to artificially increase the number of citations
  5. The JIF was sometimes misinterpreted
74
Q

What are the 8 levels of the TOP guidelines

A
  1. Citation standards (for data and materials)
  2. Data transparency
  3. Analytic methods (code) transparency
  4. Research materials transparency
  5. Design and analysis transparency
  6. Preregistration of studies
  7. Preregistration of analysis plans
  8. Replication
75
Q

What are 4 reasons psych is not seen as a science

A
  1. Little overlap between stereotypical view of scientist vs psychologist
  2. Difference between psychology researchers and practitioners
  3. Unlike scientific results, psych findings are easy to understand
  4. Not all psychologists are convinced of the added value of the scientific method
76
Q

What are 6 ways in which we can check the validity of qualitative research

A
  1. Representativeness; must provide criteria used for data analysis
  2. Confirmability; would someone else come to the same conclusions
  3. Credibility; do the conclusions sound credible to participants
  4. Comparison of situations that differ in one critical aspect
  5. Alternative explanations
  6. Refutability (falsification)
77
Q

What is the phenomenological perspective

A

An extension of the qualitative approach that tries to develop an interpretative methodology; focuses on intentionality, consciousness and qualia instead of behaviour —> understanding instead of explanation

78
Q

What was Gergen’s most important contribution

A

He argues that psych transforms reality, instead of only passively describing it; theories should therefore be evaluated based on their ability to generate new openings for action instead of their truth

79
Q

What are the 6 phases of thematic analysis (TA)

A
  1. Becoming familiar with the data
  2. Generating initial codes
  3. Searching for themes
  4. Reviewing themes (quality control)
  5. Naming and defining the themes
  6. Writing the report