Exam 1: Ch. 1-5 Flashcards

0
Q

What makes psychology a science?

A

The invention of computers led to the scientific study of cognition

Mental processes and behavior are intertwined

1
Q

What defines science?

A

Knowledge in the form of testable predictions and explanations

2
Q

What differentiates science from pseudoscience? Ex?

A

Pseudoscience lacks reliance on empiricism and skepticism

Ex: phrenology

3
Q

Empiricism

A

Claims based on evidence/data

4
Q

Skepticism

A

Not accepting a claim without evidence

5
Q

Confirmation bias

A

Selectively accepting evidence that confirms a belief and rejecting evidence that contradicts it

6
Q

What are the 4 goals of the scientific method?

A
  1. Description
  2. Prediction
  3. Explanation
  4. Application
7
Q

SM: Description

A

Describes the events and relationships between variables

8
Q

SM: Prediction

A

Predicting when or under what conditions an event or relationship will occur

9
Q

SM: explanation

A

Why does it occur?

10
Q

SM: Application

A

Apply knowledge to improve lives

11
Q

Difference between correlation and causation?

A

Correlation shows a relationship between two variables but does not tell you WHY they are related. Causation means that changes in one variable actually produce changes in the other
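
A minimal sketch of the idea (not from the course materials), using hypothetical data and Python 3.10+ (statistics.correlation): a hidden third variable drives both measures, so they correlate strongly even though neither causes the other.

    # Hypothetical data: temperature drives both ice cream sales and
    # swimming accidents, so the two correlate without causing each other.
    import random
    from statistics import correlation  # Python 3.10+

    temperature = [random.uniform(10, 35) for _ in range(100)]
    ice_cream = [2.0 * t + random.uniform(-3, 3) for t in temperature]
    accidents = [0.5 * t + random.uniform(-2, 2) for t in temperature]

    print(round(correlation(ice_cream, accidents), 2))  # high r, no causation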

12
Q

Empirical Approach

A

Basing a theory or conclusion on collected data

13
Q

General Research Process Steps (7)

A
  1. Develop question
  2. Generate hypothesis
  3. Form operational definitions
  4. Choose a design
  5. Evaluate ethical issues
  6. Analyze and interpret data
  7. Report results
14
Q

Why do you need literature review during the hypothesis development process?

A

To learn what is already known about the topic, avoid duplicating prior work, and refine the hypothesis

15
Q

What is a construct? Give an example

A

The concept that is being tested; it must be clearly defined

Ex: emotion, memory, mood

16
Q

IV

A

The variable that is altered or manipulated
Has at least 2 levels
Ex: experimental vs. control

17
Q

What is an operational definition? Example?

A

How the construct will be measured

Ex: by their reading ability

18
Q

What is the difference between basic and applied research?

A

Basic = research for fundamental knowledge, typically in a lab setting

Applied = research aimed at solving problems in real-world settings

19
Q

Selecting a sample: inclusion

A

Criteria a person must meet to be included in the sample

20
Q

Selecting a sample: exclusion

A

Criteria that disqualify a person from being included in the sample

21
Q

Selecting a sample: power

A

The ability of a study to detect an effect if one exists; larger samples give greater power
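
As a rough illustration (not course material), power for a simple one-sided z-test with a known SD can be computed directly with statistics.NormalDist (Python 3.8+); the effect size and sample sizes below are hypothetical.

    # Power = probability of detecting a true effect; it grows with sample size.
    from statistics import NormalDist

    def power(effect_size, n, alpha=0.05):
        z_crit = NormalDist().inv_cdf(1 - alpha)   # critical value for the test
        return 1 - NormalDist().cdf(z_crit - effect_size * n ** 0.5)

    print(round(power(0.3, 20), 2))    # ~0.38 with a small sample
    print(round(power(0.3, 100), 2))   # ~0.91 with a larger sample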

22
Q

Selecting a sample: representative

A

The sample reflects the characteristics of the population it is drawn from

23
Q

Reliability
What is it?
What are the three types?

A

How consistent a measure is; if you measure many times, will the results be the same?

  1. Internal consistency
  2. Test-retest
  3. Inter-rater
24
Q

Validity
What is it?
4 types?

A

Whether it measures what it’s supposed to measure

  1. Face
  2. Convergent
  3. Discriminant
  4. Criterion-prediction
25
Q

Difference between reliability and validity?

A

Reliability has to do with consistency while validity has to do with whether or not it measures what it is supposed to measure

26
Q

What is external (or ecological) validity?

A

Results applicable to the real world

27
Q

What is quantitative data?

A

Data that is translated into and analyzed as numbers

28
Q

What is qualitative data? Ex?

A

Subjective, non-numerical data, such as a case study (ex: a case study of memory)

29
Q

What are confounds?

How do experiments try to eliminate them? (2ways)

A

Other variables that may be causing the observed effect

  1. Manipulate only 1 factor at a time
  2. Measure outcome variable
30
Q

Converging evidence

A

The best way to confirm evidence

Evidence from various sources that leads to the same conclusion

31
Q

Replication

A

Repeating the study in exactly the same way to further support a theory

32
Q

Multi-method approach

A

Using multiple different methods to study the same question

33
Q

Components of informed consent
What does IC ensure?
What does it cover?

A
  1. Competence, knowledge and volition
  2. Who you are, what you’re doing, why, benefits/risks, what they’ll be asked to do and for how long, voluntary participation, no penalty for withdrawal
34
Q

When is informed consent required?

A

Whenever the research involves more than minimal risk or could cause distress (i.e., most research other than purely observational studies)

35
Q

When is informed consent not required?

A

Research will not cause any distress

Observational research

36
Q

Minimal risk

A

No more risk than what daily life involves

37
Q

Deception and its concerns

A

Importance of the study
Availability of alternatives
How harmful is it?

38
Q

IRB- what is it; what is its purpose?

A

Institutional Review Board

A governing board that reviews and approves research involving human participants and protects participants' rights and welfare

39
Q

IACUC: what is it? What is its purpose?

A

Institutional Animal Care and Use Committees

Protects the welfare of animals and approves the use of animals in research

40
Q

Observational Designs

A

Sampling behavior that represents the population

41
Q

OD: Naturalistic

Is this with or without intervention?

A

Without intervention
Natural setting
Observer is a passive recorder

42
Q

Observation with intervention

Three methods?

A

Most psychological research involves intervention

  1. Participant observations
  2. Structured observations
  3. Field experiments
43
Q

Open participant observation

A

Participant observation in which those being observed know the observer's identity and purpose (undisguised)

44
Q

Structured Observation

Examples?

A

Observer intervenes to cause/set up event

Ex: observing mother/child interaction in a lab;
Piaget observing children's problem solving

45
Q

Field experiment
What is it?
Where is it done?
Example?

A

Researcher manipulates one or more IVs in a natural setting
Outside the lab
Bystander effect

46
Q

Time sampling
What is it?
3 types?

A

Choose time intervals for making observations

Systematic, random, event sampling
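
A minimal sketch of systematic vs. random time sampling, using a hypothetical 60-minute observation session:

    import random

    session_minutes = 60
    systematic = list(range(0, session_minutes, 10))                 # every 10 minutes
    random_times = sorted(random.sample(range(session_minutes), 6))  # 6 random minutes

    print(systematic)     # [0, 10, 20, 30, 40, 50]
    print(random_times)   # e.g. [3, 17, 22, 38, 41, 59]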

47
Q

Event sampling
Type of?
What is it?

A

Type of time sampling

Observer records each event that meets a predefined definition whenever it occurs

48
Q

Situation sampling
What is it?
What does it increase?

A

Observe behavior in different locations and conditions

Increases external validity

49
Q

Subject sampling
What is it?
Ex?

A

Observe some set of people

Ex: every 10th member of an audience

50
Q

Coding of observational data

A

Process of converting observed behavior into quantitative data
Ex: coding children’s behaviors into ratings

51
Q

Inter-rater reliability

A

Do different people rate the same behaviors in the same way?
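
A minimal sketch using hypothetical ratings: simple percent agreement between two raters (Cohen's kappa, which corrects for chance agreement, is the more common statistic).

    # Two raters code the same six observations; agreement = proportion identical.
    rater_a = ["play", "cry", "play", "idle", "cry", "play"]
    rater_b = ["play", "cry", "idle", "idle", "cry", "play"]

    agree = sum(a == b for a, b in zip(rater_a, rater_b))
    print(round(agree / len(rater_a), 2))   # 0.83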

52
Q

Nominal scale

Ex?

A

Names for mutually exclusive categories; no mathematical meaning

Ex: blood group

53
Q

Ordinal scale

Ex?

A

Rank, order, greater/less than

Ex: letter grades, rank from best to worst

54
Q

Interval scale

Ex?

A

Rank order with equidistant intervals between values; can calculate differences but not ratios (no true zero)

Ex: temperature in F or C

55
Q

Ratio scale

Ex?

A

Rank order, equidistant, meaningful zero

Ex: response time, age, weight
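
A quick worked example (hypothetical values) of why ratios are meaningful on a ratio scale but not on an interval scale:

    # Interval: temperature has no true zero, so "twice as hot" is unit-dependent.
    def c_to_f(c):
        return c * 9 / 5 + 32

    print(20 / 10)                    # 2.0 in Celsius...
    print(c_to_f(20) / c_to_f(10))    # ...but ~1.36 in Fahrenheit

    # Ratio: response time has a true zero, so the ratio holds in any unit.
    rt_fast_ms, rt_slow_ms = 250, 500
    print(rt_slow_ms / rt_fast_ms)    # 2.0 (same ratio in ms, seconds, or minutes)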

56
Q

How to control/prevent bias

A

Recognize its presence

Have uninformed or blind observer

57
Q

Advantages of unobtrusive/nonreactive data

A

People cannot react to the presence of an observer, so their behavior stays natural

58
Q

Disadvantages of non reactive/unobtrusive data

A

Validity harder to obtain

Bias may be present

59
Q

Disadvantages of reactive/obtrusive design

A

Individuals react to observer presence
Behavior may not be typical of them
Threatens external validity of findings

60
Q

Physical trace

A

Remnants, fragments, products of past behavior

61
Q

Archival data

A

Public records or private documents describing the activities of individuals, groups, institutions, etc.

62
Q

Why is a multi-method approach important?

A

Each method has strengths and weaknesses; when different methods lead to the same conclusion (converging evidence), confidence in the findings increases

63
Q

Content analysis

A

Coding archival records to allow researchers to make inferences

64
Q

Selective deposit
Problem with what?
What is it?

A

Archives

Some info is saved, some is not. May be incomplete or inaccurate

65
Q

Selective survival
A problem with what?
What is it?

A

A problem with archival records

Some archives/traces have survived while others have not.

66
Q

Simple probability

A

All members of the population have an equal chance of being selected

67
Q

Stratified

A

Population is split into subgroups (strata), and a random sample is drawn from each
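
A minimal sketch contrasting simple random and stratified sampling; the population and strata below are hypothetical.

    import random

    years = ["freshman", "sophomore", "junior", "senior"]
    population = [{"id": i, "year": random.choice(years)} for i in range(1000)]

    # Simple random: every member has an equal chance of selection.
    simple = random.sample(population, 100)

    # Stratified: split into groups (strata), then sample each in proportion
    # to its size (rounding may shift the total slightly).
    stratified = []
    for year in years:
        stratum = [p for p in population if p["year"] == year]
        n = round(100 * len(stratum) / len(population))
        stratified.extend(random.sample(stratum, n))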

68
Q

Selection bias

A

Specific group within population is under or over represented

69
Q

Response rate bias

A

Some people are more likely to respond to surveys than others

70
Q

Advantages/disadvantages of convenience sampling

A

Advantages: quick, easy, inexpensive. Disadvantages: the sample may not be representative, so results may not generalize

71
Q

Advantages/disadvantages of probability sampling

A

Advantages: representative of the population, so results generalize. Disadvantages: more time-consuming and costly

72
Q

Cross-sectional survey design

A

Done all at once; a snapshot in time

73
Q

Longitudinal survey design
What type of sample?
Good for what?
Problems?

A

Same sample measured at multiple time points; good for tracking changes within individuals

Problem with sample attrition

74
Q
Successive independent sample
When is it done?
What is it good for?
What type of samples used?
Consistency?
A

Done over multiple time points
Good for describing changes in public opinion
Uses different samples of the same population
Questions and sampling procedures are kept consistent

75
Q

What is attrition? Why is it a problem with longitudinal designs?

A

Attrition is the loss of participants over time. It is a problem for longitudinal designs because the sample that remains may no longer be representative

76
Q

Internal consistency
Type of what?
What does it mean?

A

Type of reliability

Do all questions/items measure the same thing?

77
Q

Test-retest
Type of what?
What does it mean?

A

Reliability

Do the items measure the same thing each time?

78
Q

Face validity

A

Is it obvious as to what the items are intended to measure?

79
Q

Discriminant validity

A

Does it distinguish between groups?

80
Q

Criterion-prediction

A

Is the measure associated with real world examples of the construct?
Ex: are people who score high on an emotional-helping measure interested in helping careers?

81
Q

Do all experiments need a control?

A

No

82
Q

DV

A

Affected by the manipulation of the IV

83
Q

Does the DV depend on the level of the IV?

A

Yes

84
Q

Reliability: internal consistency

A

Do all the questions/items measure the same thing?

85
Q

What is used to measure reliability?

What is considered “good”?

A

Cronbach’s alpha

>.70
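
A minimal sketch of the computation, with hypothetical item scores (rows = respondents, columns = items on the same scale):

    from statistics import pvariance

    scores = [
        [4, 5, 4, 4],
        [2, 3, 2, 3],
        [5, 5, 4, 5],
        [3, 2, 3, 3],
        [4, 4, 5, 4],
    ]

    k = len(scores[0])                                        # number of items
    item_var = sum(pvariance(item) for item in zip(*scores))  # sum of item variances
    total_var = pvariance([sum(row) for row in scores])       # variance of total scores

    alpha = (k / (k - 1)) * (1 - item_var / total_var)
    print(round(alpha, 2))   # values above .70 are conventionally "good"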

86
Q

Reliability: test-retest

A

Do items measure the same thing each time?

87
Q

Reliability: Inter-rater

A

Do different people rate the same behaviors in the same way?

88
Q

Steps to informed consent (9)

A
  1. Explain purpose
  2. Right to decline/stop at any time
  3. Potential consequences of stopping midstream
  4. Potential risks
  5. Potential benefits
  6. Limits of confidentiality
  7. Incentives
  8. Contact info
  9. Answer questions
89
Q

Who is unable to give informed consent?

A

Children

Adults with mental disabilities

90
Q

External validity concerning observation

A

Extent to which the study's findings may be used to describe people, settings, or conditions beyond those used in the study

122
Q

Types of probability sampling? (2)

A

Simple random

Stratified