Midterm (Ch 1, 2, 3, 5) Flashcards

1
Q

Holding onto facts/beliefs just because they have been known for a long time

A

Method of Tenacity

2
Q

Information is accepted as true because it feels right

A

Method of Intuition

3
Q

Relying on an expert’s expertise to answer questions

A

Method of Authority

4
Q

Method of Faith

A

Blindly following an authority without verifying information

5
Q

Rational Method

A

Seeking answers with logical reasoning

6
Q

What is logical reasoning

A

Argument -> Premises -> Conclusions

7
Q

Empirical method

A

Answering questions with direct observations/experiences (using the 5 senses or other means)

8
Q

Problems with authority method of acquiring knowledge

A

Authorities can be biased, their answers may be opinions rather than facts, and expertise in one domain is often wrongly assumed to apply to other domains

9
Q

Problems with rational method of acquiring knowledge

A

Conclusions cannot be true unless the premises are true, and people are often poor at constructing valid arguments

10
Q

What are variables

A

Conditions that have different values for different individuals

11
Q

What is deduction, when is it used

A

Going from a general statement to specific conclusions; used to derive a testable prediction from a hypothesis

12
Q

What does it mean that the scientific method is empirical

A

answers are obtained by making observations through structured testing

13
Q

What does it mean that the scientific method is public

A

Makes its findings available to others to consult and replicate

14
Q

What does it mean that the scientific method is objective

A

It prevents researchers’ bias from affecting the results (for example by doing blind experiments)

15
Q

What is pseudoscience

A

A field that relies on subjective evidence, makes claims that cannot be refuted, ignores failures of its theories, and is rarely tested or updated

16
Q

Quantitative Research

A

Produces measurable, numerical data (numbers, values)

17
Q

Qualitative Research

A

Produces narrative reports (notes from observations)

18
Q

What are the 10 steps of the research process

A
  1. Find a research idea (select topic + review literature)
  2. Form a hypothesis (select the answer most likely to happen)
  3. Determine how to define/measure variables
  4. Identify the subjects, their selection process, and their ethical treatment
  5. Select the research strategy (what is the question asked, are there any ethical constraints)
  6. Select the research design (methods/procedures to conduct experiment)
  7. Conduct the study (collect data)
  8. Evaluate the data (stats analysis)
  9. Report the results
  10. Refine/reformulate the research idea (test boundaries of results, refine the original research question)
19
Q

What is the fundamental assumption of research?

A

That the world is governed by orderly (natural) laws, and that the scientific method allows us to uncover these laws

20
Q

What are the 4 goals of research?

A

Observe, Describe, Explain, Predict

21
Q

What is the basic approach of research

A

To understand a particular phenomenon (ex: how drinking alcohol affects coordination)

GENERAL SITUATION

22
Q

What is the applied approach of research

A

To solve a particular problem (ex: treating alcoholism)

PRECISE PROBLEM

23
Q

What is inductive reasoning and when is it used

A

Making generalizations based on a few observations, used to find a research idea

24
Q

What are the 3 conditions that a theory has to fulfill

A

Parsimony, precision, testability

25
What is parsimony
Explaining many results with a few concepts
26
What is an independent variable
The "cause" manipulated by the experimenter
27
What is a dependent variable
The "result" or effect from the manipulation of the independent variable
28
What are common sources of research topics (5)
Personal interests, practical problems/solutions, casual observations, reports of others' observations, behavioural theories
29
What is the goal of doing a lit search
To take existing research further and make sure what we do is new and useful
30
What is a primary source
1st hand report: authors describing their own observations
31
What is a secondary source
2nd hand reports: the authors are discussing someone else's observations
32
What are the characteristics of a good hypothesis
Logical, testable, refutable, and positive (stating the existence of something)
33
What are constructs
Hypothetical attributes, assumed to exist, that explain behaviours (ex: self-esteem). Constructs can be influenced by external stimuli and can influence external behaviours (that's how we observe them)
34
What is an operational definition
It's used to measure a construct indirectly through its causes and effects (turns the abstract into something concrete). AN OPERATIONAL DEFINITION IS NOT A CONSTRUCT
35
True or false: Constructs can be tested directly
False
36
What are 2 problems that can occur with operational definitions
1. They might leave out important components of a construct; 2. They often include extra components that are not part of the construct
37
What is validity
When a measurement is actually measuring what it claims to measure
38
What is face validity (method to measure validity)
A subjective assessment of validity (does it make sense?)
39
What is concurrent validity (method to measure validity)
Comparing the results of 2 measuring techniques
40
What is predictive validity (method to measure validity)
When measurements of a construct accurately predict a behaviour according to a theory
41
What is construct validity? (method to measure validity)
Being able to show that measurements of a variable behave in the same way as the underlying variable (construct) is expected to behave
42
What is convergent validity? (method to measure validity)
Creating 2 methods of measurement for 1 construct and show that they are correlated
43
What is divergent validity? (method to measure validity)
Measuring 2 different constructs with the same method and showing that there is no correlation between the measures
44
What is reliability?
A method of measuring that produces identical results when used repeatedly is reliable
45
What does this equation mean: Measured Score = True Score + Error
There is always some degree of error when making a measurement
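A rough illustrative sketch (not from the course material): the Python snippet below simulates this equation with a made-up true score of 50 and random error, showing that any single measurement is off by some amount while the average of many measurements lands near the true score.

```python
import random
import statistics

random.seed(1)

TRUE_SCORE = 50  # hypothetical true score, chosen only for illustration

def measure():
    """One observation: the true score plus random error."""
    error = random.gauss(0, 3)  # error with mean 0 and SD 3 (arbitrary)
    return TRUE_SCORE + error

scores = [measure() for _ in range(100)]

print(round(scores[0], 1))                # a single noisy measurement
print(round(statistics.mean(scores), 1))  # the mean of 100 measurements is close to 50
```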
46
What are the 3 sources of error when measuring
Observer error, environmental change, participants' change
47
What is observer error
The human making or recording the measurement makes an error
48
What is an environmental change (source of error)
A change in time, temperature, climate, etc. between measurements
49
What is a participants' change (source of error)
A change in the participants' focus, mood, etc. between measurements
50
What is test-retest (measures of reliability)
The same individuals are tested twice, close together in time, and the two sets of scores are compared
51
What is parallel-forms (measures of reliability)
Test-retest with different measuring methods
52
What is inter-rater reliability (measures of reliability)
The degree of agreement between the observations of 2 researchers
53
What is split-half reliability (measures of reliability)
Dividing the questions into 2 halves and comparing the results from the two halves
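For illustration only (the item scores below are invented), a split-half computation could look like this sketch: even-numbered and odd-numbered items form the two halves, each half is totalled per participant, and the two halves are correlated. Note that `statistics.correlation` requires Python 3.10+.

```python
import statistics

# Hypothetical item scores: rows = participants, columns = 6 test items.
responses = [
    [4, 5, 4, 5, 4, 5],
    [2, 1, 2, 2, 1, 2],
    [3, 3, 4, 3, 3, 4],
    [5, 4, 5, 5, 5, 4],
    [1, 2, 1, 2, 1, 1],
]

# Split the items into two halves and total each half for every participant.
half_a = [sum(items[0::2]) for items in responses]  # items 1, 3, 5
half_b = [sum(items[1::2]) for items in responses]  # items 2, 4, 6

# A high correlation between the two halves suggests the test is reliable.
print(round(statistics.correlation(half_a, half_b), 2))
```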
54
True or false: reliability is a prerequisite for validity
true
55
True or false: something not valid cannot be reliable
false
56
What is a self-report measure
A way of letting participants assess themselves with questionnaires, scales from 1 to 10, etc. (a way to measure a construct)
57
What are physiological measures of constructs
Physiological manifestations of a construct (heart rate, temperature, etc.)
58
What are behavioural measures of constructs
Tasks, natural or structured, to define/measure constructs
59
What is a ceiling effect
When measurements of a construct are restricted at the upper end of the scale (ex: a test is too easy)
60
What is a floor effect
When measurements of a construct are restricted at the lower end of the scale (ex: a test is too hard)
61
What are artifacts
Nonnatural features accidentally introduced into something being observed
62
What is experimenter bias
When the experimenter knows/expects the outcome and can influence the results
63
What is the solution to experimenter bias?
Standardize the experiment (ex: do a blind experiment)
64
What is a single-blind experiment
When the experimenter doesn't know what the procedures and expected results are
65
What is a double-blind experiment
When the participants and the experimenter don't know what the procedures and expected results are
66
What are demand characteristics
Cues that tell the participants what the expected outcomes are or how they are expected to behave
67
What is participant reactivity
When a natural behaviour is modified to satisfy the requirements of a study because subjects know that they are being watched
68
Subject role: good
Tries to behave in a way that supports the hypothesis
69
Subject role: Negativistic
Tries to act contrary to the hypothesis
70
Subject role: Apprehensive
Overly concerned about having their personal characteristics evaluated
71
Subject role: Faithful
Attempts to follow the instructions faithfully and not act on their suspicions about the study
72
What is internal validity
When we can safely say that changes in X have caused the observed change in Y
73
What is external validity
The extent to which results can be generalized to other settings/populations
74
What are the 4 sources of measurement error
1. Participants; 2. Instrumentation; 3. Testing environment; 4. Scoring guidelines
75
What is the solution to measurement error
standardizing each problematic aspect of the experiment
76
What is a target population
The group defined by the researcher's specific interests (usually not easily accessible)
77
What is an accessible population
population accessible to the researcher (from which the sample will be selected)
78
What is a biased sample
A sample with characteristics noticeably different from those of the population
79
What is sampling bias
A sampling procedure that favors the selection of some individuals over others
80
What does the law of large numbers say
a bigger sample is more reliable than a smaller sample (to a certain extent)
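A quick demonstration (the population values are simulated, not real data) of why larger samples are more dependable: means computed from larger samples cluster more tightly around the population mean.

```python
import random
import statistics

random.seed(0)

# Simulated population with mean ~100 and SD ~15.
population = [random.gauss(100, 15) for _ in range(10_000)]

def sample_mean(n):
    return statistics.mean(random.sample(population, n))

# The spread of sample means shrinks as the sample size grows.
for n in (5, 50, 500):
    estimates = [sample_mean(n) for _ in range(200)]
    print(f"n={n:3d}  SD of sample means: {statistics.stdev(estimates):.2f}")
```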
81
What is a power analysis
An estimation of the sample size needed to adequately represent the population
82
What are the characteristics of a probability sample
Each participant has the same odds of being selected, the exact population size is known, and selection must be a random process
83
What are the characteristics of a nonprobability sample
Researchers do not know the exact size of the population, individuals do not have an equal chance of being selected, and the selection method is not unbiased
84
What is simple random sample (P)
Define the population, list all members, and use a random process to select from the list (with or without replacement)
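A minimal sketch of the procedure (the population list and sample size are hypothetical): Python's `random.sample` draws without replacement, while `random.choices` draws with replacement.

```python
import random

# Hypothetical accessible population, fully listed by ID.
population = [f"participant_{i}" for i in range(1, 501)]

# Without replacement: each individual can be selected at most once.
sample_without = random.sample(population, k=20)

# With replacement: the same individual can be drawn more than once.
sample_with = random.choices(population, k=20)

print(sample_without[:5])
print(sample_with[:5])
```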
85
What is systematic sampling(P)
List all individuals, then go down the list from a random starting point and choose every nth individual
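A minimal sketch with an invented list and an arbitrary interval of 25: pick a random starting point, then take every 25th individual down the list.

```python
import random

population = [f"participant_{i}" for i in range(1, 501)]  # hypothetical list

n = 25                       # sampling interval (arbitrary choice)
start = random.randrange(n)  # random starting point within the first interval
systematic_sample = population[start::n]

print(len(systematic_sample))  # 500 / 25 = 20 individuals
print(systematic_sample[:3])
```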
86
What is stratified random sampling(P)
Identify subgroups (strata), then use simple random sampling (SRS) within each stratum
87
What is proportionate stratified random sampling(P)
Identify subgroups, determine each subgroup's percentage in the population, and make sure its percentage in the sample equals its percentage in the population
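A rough sketch using made-up strata sizes (300/150/50) and a sample of 50: each subgroup contributes to the sample in proportion to its share of the population (here 30, 15, and 5 individuals).

```python
import random

# Hypothetical population tagged by subgroup (stratum); sizes are invented.
population = (
    [("first_year", i) for i in range(300)]
    + [("second_year", i) for i in range(150)]
    + [("third_year", i) for i in range(50)]
)
sample_size = 50

# Group the population by stratum.
strata = {}
for group, member in population:
    strata.setdefault(group, []).append((group, member))

# Sample each stratum in proportion to its share of the population.
sample = []
for group, members in strata.items():
    k = round(sample_size * len(members) / len(population))
    sample.extend(random.sample(members, k))

for group in strata:
    print(group, sum(1 for g, _ in sample if g == group))
```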
88
What is cluster sampling(P)
Randomly selecting whole groups (clusters) instead of individuals
89
What is combined-strategy sampling(P)
Combining 2 or more sampling strategies
90
What is Quota sampling (NP)
Establishing quotas for each subgroup, then using convenience sampling within each group
91
What is convenience sampling (NP)
using individuals available to us
92
Purposive sampling (NP)
Deliberately approaching individuals judged to fit the study's criteria
93
Snowball sampling (NP)
Find 1 person who fits the criteria and ask them to refer more people who also fit