Exam 1: Ch. 1-5 Flashcards
What makes psychology a science?
The invention of computers led to the study of cognition (the computer as a model for mental processes)
Mental processes and behavior are intertwined
What defines science?
Knowledge in the form of testable predictions and explanations
What differentiates science from pseudoscience? Ex?
Pseudoscience lacks reliance on empiricism and skepticism
Ex: phrenology
Empiricism
Claims based on evidence/data
Skepticism
Not accepting a claim without evidence
Confirmation bias
Selectively accepting evidence that confirms a belief and discounting evidence that contradicts it
What are the 4 goals of the scientific method?
- Description
- Prediction
- Explanation
- Application
SM: Description
Describes the events and relationships between variables
SM: Prediction
Make a prediction
SM: explanation
Why does it occur?
SM: Application
Apply knowledge to improve lives
Difference between correlation and causation?
Correlation shows that two variables are related but does not tell you WHY they are related. Causation means changes in one variable directly produce changes in the other
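As a minimal sketch (pure Python, made-up toy data), Pearson's r quantifies how strongly two variables are related, but a high r by itself never establishes that one variable causes the other:

```python
# Minimal Pearson correlation, pure Python (toy illustration only).
def pearson_r(x, y):
    """Return Pearson's r for two equal-length sequences."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical data: ice cream sales and drownings both rise in summer.
ice_cream = [10, 20, 30, 40, 50]
drownings = [1, 2, 3, 4, 5]
r = pearson_r(ice_cream, drownings)  # r = 1.0, yet neither causes the other
```

Even a perfect r of 1.0 is consistent with a third variable (here, summer heat) driving both.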
Empirical Approach
using a collection of data to base a theory or conclusion
General Research Process Steps (7)
- Develop question
- Generate hypothesis
- Form operational definitions
- Choose a design
- Evaluate ethical issues
- Analyze and interpret data
- Report results
Why do you need literature review during the hypothesis development process?
To learn what is already known, avoid duplicating prior work, and ground the hypothesis in existing findings
What is a construct? Give an example
An abstract concept being studied; it must be clearly defined so it can be tested
Ex: emotion, memory, mood
IV
Altered or manipulated
2 levels
Experimental/control
What is an operational definition? Example?
How the construct will be measured
Ex: by their reading ability
What is the difference between basic and applied research?
Basic = research to gain fundamental knowledge (often in a lab setting)
Applied = research to solve real-world problems (often in real-world settings)
Selecting a sample: inclusion
Criteria participants must meet to be eligible for the study
Selecting a sample: exclusion
Criteria that disqualify otherwise eligible people from participating
Selecting a sample: power
Having a large enough sample to detect an effect if one exists
Selecting a sample: representative
The sample reflects the key characteristics of the population it is drawn from
Reliability
What is it?
What are the three types?
How consistent a measure is; if you measure many times, will the results be the same?
- Internal consistency
- Test-retest
- Inter-rater
Validity
What is it?
4 types?
Whether it measures what it’s supposed to measure
- Face
- Convergent
- Discriminant
- Criterion-prediction
Difference between reliability and validity?
Reliability has to do with consistency while validity has to do with whether or not it measures what it is supposed to measure
What is external (or ecological) validity?
Results applicable to the real world
What is quantitative data?
Data that is translated into and analyzed as numbers
What is qualitative data? Ex?
Non-numerical, descriptive data. Ex: a case study or interview transcripts
What are confounds?
How do experiments try to eliminate them? (2ways)
Other variables that could be responsible for the observed effect
- Manipulate only one factor at a time
- Hold all other conditions constant
Converging evidence
Best method to confirm evidence
Evidence from various sources that lead to the same conclusion
Replication
Doing the study over again the exact same way to support theories further
Multi-method approach
Using several different methods to study the same question; the strengths of one method offset the weaknesses of another
Components of informed consent
What does IC ensure?
What does it cover?
- Competence, knowledge and volition
- Who you are, what you’re doing, why, benefits/risks, what they’ll be asked to do and for how long, voluntary participation, no penalty for withdrawal
When is informed consent required?
Whenever research involves interaction with participants or more than minimal risk
When is informed consent not required?
Research will not cause any distress
Observational research
Minimal risk
No more risk than what daily life involves
Deception and its concerns
Importance of the study
Availability of alternatives
How harmful is it?
IRB- what is it; what is its purpose?
Institutional Review Board
It is a governing board that approves all research and protects research participants
IACUC: what is it? What is its purpose?
Institutional Animal Care and Use Committees
Protects the rights of animals and gives the ok to use animals in research
Observational Designs
Sampling behavior that represents the population
OD: Naturalistic
Is this with or without intervention?
Without intervention
Natural setting
Observer is a passive recorder
Observation with intervention
Three methods?
Most psych research
- Participant observations
- Structured observations
- Field experiments
Open participant observation
Observer actively participates in the situation, and those being observed know the observer is present
Structured Observation
Examples?
Observer intervenes to cause/set up event
Ex/ observing mother/child in a lab
Piaget’s observing children problem solving
Field Experiment
What is it?
Where is it done?
Example?
Researcher manipulates one or more IVs in a natural setting
Outside the lab
Bystander effect
Time sampling
What is it?
3 types?
Choose time intervals for making observations
Systematic, random, event sampling
Event sampling
Type of?
What is it?
Type of time sampling
Observer records each occurrence of a predefined event of interest
Situation sampling
What is it?
What does it increase?
Observe behavior in different locations and conditions
Increases external validity
Subject sampling
What is it?
Ex?
Observe some set of people
Ex/ every 10th member of an audience
Coding of observational data
Process of converting observed behavior into quantitative data
Ex: coding children’s behaviors into ratings
Inter-rater reliability
Do different people rate the same behaviors in the same way?
Nominal scale
Ex?
Names or mutually exclusive categories; no mathematical meaning
Ex: blood type, eye color
Ordinal scale
Ex?
Rank, order, greater/less than
Ex: letter grades, rank from best to worst
Interval scale
Ex?
Rank order with equal distances between values; can compute differences but not ratios (no true zero)
Ex: temperature in F or C
Ratio scale
Ex?
Rank order, equidistant, meaningful zero
Ex: response time, age, weight
How to control/prevent bias
Recognize its presence
Have uninformed or blind observer
Advantages of unobtrusive/nonreactive data
People cannot react to the presence of an observer
Disadvantages of non reactive/unobtrusive data
Validity harder to obtain
Bias may be present
Disadvantages of reactive/obtrusive design
Individuals react to observer presence
Behavior may not be typical of them
Threatens external validity of findings
Physical trace
Remnants, fragments, products of past behavior
Archival data
Public records or private documents describing the activities of groups, individuals, etc.
Why is a multi-method approach important?
Every method has weaknesses; combining methods lets evidence converge and strengthens conclusions
Content analysis
Coding archival records to allow researchers to make inferences
Selective deposit
Problem with what?
What is it?
Archives
Some info is saved, some is not. May be incomplete or inaccurate
Selective survival
A problem with what?
What is it?
A problem with archival records
Some archives/traces have survived while others have not.
Simple probability
Every member of the population has an equal chance of being selected
Stratified
Population split into groups (strata); random samples drawn from each group
Selection bias
Specific group within population is under or over represented
Response rate bias
Some people are more likely to respond to surveys than others
Advantages/disadvantages of convenience sampling
Advantages: quick, inexpensive, easy to obtain. Disadvantages: sample may not be representative; prone to selection bias
Advantages/disadvantages of probability sampling
Advantages: representative samples that allow generalization to the population. Disadvantages: more time-consuming and costly; requires access to the whole population
Cross-sectional survey design
Done all at once; a snapshot in time
Longitudinal survey design
What type of sample?
Good for what?
Problems?
Same sample measured at multiple time points; tracks changes in individuals
Problem: sample attrition
Successive independent samples
When is it done?
What is it good for?
What type of samples used?
Consistency?
Done over multiple time points
Good for describing changes in public opinion
Uses different samples of same pop
Questions/sampling consistent
What is attrition? Why is it a problem with longitudinal designs?
Loss of participants over time; those who drop out may differ systematically from those who remain, biasing the remaining sample
Internal consistency
Type of what?
What does it mean?
Type of reliability
Do all questions/items measure the same thing
Test-retest
Type of what?
What does it mean?
Reliability
Do the items measure the same thing each time?
Face validity
Is it obvious as to what the items are intended to measure?
Discriminant validity
Does it distinguish between groups?
Criterion-prediction
Is the measure associated with real world examples of the construct?
Ex: are people who score high on an empathy measure interested in helping careers?
Do all experiments need a control?
No
DV
Affected by the manipulation of the IV
Does the DV depend on the level of the IV?
Yes
Reliability: internal consistency
Do all the questions/items measure the same thing?
What is used to measure reliability?
What is considered “good”?
Cronbach’s alpha
>.70
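A sketch of how Cronbach's alpha could be computed by hand (pure Python, hypothetical item scores; the formula is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)):

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """scores: one list of item scores per respondent."""
    k = len(scores[0])                       # number of items
    items = list(zip(*scores))               # transpose to per-item columns
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 3-item questionnaire answered by 4 respondents
responses = [[4, 5, 4],
             [2, 3, 2],
             [5, 5, 4],
             [3, 4, 3]]
alpha = cronbach_alpha(responses)  # 0.975 here, above the .70 rule of thumb
```

Items that all track the same underlying construct yield a high alpha, because respondents who score high on one item tend to score high on the others.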
Reliability: test-retest
Do items measure the same thing each time?
Reliability: Inter-rater
Do different people rate the same behaviors in the same way?
Steps to informed consent (9)
- Explain purpose
- Right to decline/stop at any time
- Potential consequences of stopping mid-stream
- Potential risks
- Potential benefits
- Limits of confidentiality
- Incentives
- Contact info
- Answer questions
Who is unable to give informed consent?
Children
Adults with mental disabilities
External validity concerning observation
Extent to which the study’s findings may be used to describe people, settings, or conditions beyond those used in the study
Types of probability sampling? (2)
Simple random
Stratified
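The two types above can be sketched in Python (hypothetical helper names, toy population):

```python
import random

def simple_random_sample(population, k, seed=0):
    """Every member has an equal chance of being chosen."""
    return random.Random(seed).sample(population, k)

def stratified_sample(population, stratum_of, fraction, seed=0):
    """Split the population into strata, then randomly sample each in proportion."""
    rng = random.Random(seed)
    strata = {}
    for member in population:
        strata.setdefault(stratum_of(member), []).append(member)
    sample = []
    for members in strata.values():
        k = round(len(members) * fraction)
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical population: 10 first-years ("F") and 5 seniors ("S")
people = [("F", i) for i in range(10)] + [("S", i) for i in range(5)]
chosen = stratified_sample(people, stratum_of=lambda p: p[0], fraction=0.2)
# 20% of each stratum: 2 first-years and 1 senior
```

Stratifying guarantees each group appears in the sample in proportion to its size, which a single simple random draw does not.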