Thinking and reasoning Flashcards
what does thinking include?
deductive reasoning
solving logical/mathematical problems that have right answers (from premises, generate valid conclusion)
or evaluate validity of conclusion
inductive reasoning (predicting future from past data)
stat generalisations, probabilistic judgements, predictions
hyp-testing, rule induction
problem-solving
working out how to get from state A to B (numerous solutions, varying degrees of constraint)
judgement and decision-making: Choosing among options
creative thinking, daydreaming, imagining, etc
what is the research on thinking?
focused primarily on cases where
there is a right answer, and/or
a way of evaluating the rationality of an answer, and/or
a way of assessing the efficiency with which one gets there
as well as asking
how do people think (what are processes and representations)?
strong emphasis on human imperfection
why are people apparently irrational in thinking?
what limits efficiency of thinking (relative to ideal thinker)?
practical motivation for focus:
practical importance of fallibility of medical, legal, military, etc. decision-making and
possible remediation with training and/or IT support
allow for it in system design, and attempts to change behaviour
what is the general dual-process theory of reasoning, problem-solving and decision making?
use 2 kinds of process:
‘system 2’ – slow, chunky, sequential, effortful – but rational, logical, general-purpose – conscious reasoning system
constrained by limited WM capacity and other basic limitations of cog machinery
‘system 1’ – intuitive, automatic, largely unconscious, fast-and-frugal, quick and dirty, approx. – but domain-specific – procedures, schemas, rules of thumb/heuristics, that
are adaptive and mostly effective when applied in appropriate domain, but
only approx. – with some built in biases
may lead to error if applied to inappropriate domain
judgements of prob/freq
some facts about freq are told to us/can be looked up:
E.g. Lifetime morbid risk of schizophrenia = 0.7%
but many judgements of prob/freq we make based on experience, e.g.
will it rain today?
if I get a train to Paddington, how likely is it to be late?
availability in memory
availability heuristic: Judge as more probable/freq those events/objects of which examples are more readily ‘available’ – in memory/env (Tversky and Kahneman, 1973)
works because generally easier to retrieve from memory examples of events/objects that are more freq
unfortunately, retrievability also determined by other factors:
recency
salience
similarity to current state
hence tend to over-estimate prob of events of which we know examples that are easily retrievable – e.g. because recent, personally salient/similar to present instance – availability bias
examples of availability bias
screening of ‘Jaws’ caused a drop in the number of people swimming off the coast of California
drivers tend to slow down – for a while – after seeing accident/police car
people tend to
overestimate risks of dying of rarer causes but underestimate risks of dying of common causes (Slovic et al., 1980)
are unreasonably fearful about children being murdered in modern Britain, compared to their being run over
neglect of base rate and representativeness bias
when evaluate particular cases:
tend to ignore an important source of info:
knowledge of ‘base rates’
overall freqs of particular classes of event
if something/someone has features representative of being X, tend to think they have standard properties of X
may be basis for best guess about category member in absence of other info, but biases us to attribute prototypical properties even when have other info
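A small worked example of why neglecting the base rate misleads (the screening-test numbers here are hypothetical, chosen only for illustration): even a fairly accurate test for a rare condition yields mostly false positives.

```python
# Hypothetical numbers for illustration: a screening test for a condition
# with a 1% base rate, 90% sensitivity, and a 10% false-positive rate.
base_rate = 0.01         # P(condition)
sensitivity = 0.90       # P(positive test | condition)
false_positive = 0.10    # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive
print(round(posterior, 3))  # → 0.083
```

People judging only by how ‘representative’ a positive result is of the condition tend to report something near 0.9, ignoring the 1% base rate.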
representativeness bias and sequential events
have difficulty ignoring the representativeness (/unusualness) of a sequence and focusing on what we know about the probabilities of individual events
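The arithmetic behind this: for a fair coin, a ‘representative-looking’ mixed run and an ‘unusual-looking’ uniform run of the same length are exactly equally probable.

```python
# Each specific sequence of 6 fair-coin flips has probability (1/2)^6,
# regardless of how 'random' or 'unusual' it looks.
seqs = ["HTTHTH", "HHHHHH"]
probs = {s: 0.5 ** len(s) for s in seqs}
print(probs)  # both 1/64 = 0.015625
```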
functional fixedness in problem-solving
classic exps of the Gestalt psychologists
E.g. Duncker (1945) asked subjects to find a way of supporting a lighted candle on a vertical wooden wall, given a candle, matches and a box of drawing pins
subjects given the pins inside the box (seen as a container, not a potential platform) were less successful than subjects given the same problem but with the drawing pins tipped out of the box
Conservatism and confirmation bias in inductive reasoning
in ordinary life and scientific research, we try to come up with a rule/principle to describe instances we have experienced, and test the hypothesised rule against further observations
Wason’s (1960) 2 4 6 exp:
‘this sequence – 2, 4, 6 – generated by rule – have to try to guess rule, by trying out other sequences’
P then has to generate further sequences of 3 numbers, receiving feedback: ‘yes: fits the rule’/’no: doesn’t fit’
and declare his/her hyps about what rule is
Ps tended to offer over-specific hyps, e.g. ‘The numbers increase by steps of 2’ and
(a) are reluctant to abandon hyps (conservative)
(b) tend to seek confirmatory rather than disconfirmatory evidence
scientists just as prone
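The logic of the task can be sketched in code (the hidden rule here, ‘any ascending sequence’, matches Wason’s original; the over-specific hypothesis is the one participants typically offer). Confirmatory tests can never separate the hypothesis from the true rule; a single disconfirmatory test does.

```python
# Sketch of Wason's 2-4-6 task.
def fits_rule(a, b, c):
    return a < b < c                      # experimenter's hidden rule: ascending

def hypothesis(a, b, c):
    return b - a == 2 and c - b == 2      # 'numbers increase by steps of 2'

# confirmatory tests: instances of the hypothesis — all get 'yes',
# so they cannot distinguish the hypothesis from the true rule
confirmatory = [(1, 3, 5), (10, 12, 14), (20, 22, 24)]
assert all(fits_rule(*t) and hypothesis(*t) for t in confirmatory)

# a disconfirmatory test: violates the hypothesis yet still gets 'yes',
# which falsifies the over-specific hypothesis
probe = (1, 2, 3)
assert fits_rule(*probe) and not hypothesis(*probe)
```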
problem solving
problem solving research studies situations where there is a start state and a goal state and one has to get to the goal as quickly as possible, using a set of available operators and subject to certain constraints
missionaries and cannibals
Luchins’ water-jug problems
starting with full 8-pint jug, an empty 5-pint jug, and an empty 3-pint jug, end up with exactly 4 pints of water in largest jug
tower of Hanoi
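The water-jug problem above can be cast exactly in these terms: states are the jug contents, operators are pours, and a search through the state space finds a path to the goal. A minimal breadth-first-search sketch (the state encoding and function names are my own):

```python
from collections import deque

CAPS = (8, 5, 3)       # jug capacities in pints
START = (8, 0, 0)      # 8-pint jug full, others empty
GOAL = 4               # 4 pints in the largest jug

def moves(state):
    """All states reachable by pouring one jug into another."""
    for i in range(3):
        for j in range(3):
            if i == j:
                continue
            pour = min(state[i], CAPS[j] - state[j])
            if pour:
                nxt = list(state)
                nxt[i] -= pour
                nxt[j] += pour
                yield tuple(nxt)

def solve():
    """Breadth-first search: returns a shortest path of states to the goal."""
    seen, queue = {START}, deque([[START]])
    while queue:
        path = queue.popleft()
        if path[-1][0] == GOAL:
            return path
        for nxt in moves(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

path = solve()
print(path)
```

Breadth-first search guarantees the shortest pour sequence, but note it does so precisely by the exhaustive enumeration the next section says human working memory cannot support.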
the problem space (Newell and Simon, 1972)
if problem is soluble, at least one path through state space between start and goal states
problem-solver must search for operators that will:
move him/herself through intermediate states on path approaching goal
avoid need for backing up from dead-ends/going around in circles
minimise path length
w/o knowing in advance what the optimal path is/what intermediate states will be traversed
WM capacity limits and heuristics in problem-solving
given huge ‘workspace’ and time could exhaustively enumerate all possible legal ‘moves’ and pick shortest path
but don’t have WM capacity for this: Hence must
recognise familiar patterns and retrieve previously effective moves from LTM
hunt for a way between initial and goal states in small steps, using heuristics such as means-end analysis and don’t-repeat-a-move-if-possible
means-end analysis: Pick general means for reaching goal: If that means not yet available, create sub-goal of achieving means until one generated sub-goal that can be satisfied by available operator
requires maintenance of ‘goal stack’ in WM
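A sketch of means-end analysis with an explicit goal stack, using the Tower of Hanoi as the worked example (the goal encoding is my own; the point is that a big goal is repeatedly decomposed into sub-goals until one can be satisfied by an available operator):

```python
def hanoi(n):
    """Solve Tower of Hanoi by maintaining a stack of goals and sub-goals."""
    moves = []
    # a goal is either ("move", k, src, dst, aux) or a primitive ("step", src, dst)
    stack = [("move", n, "A", "C", "B")]
    while stack:
        goal = stack.pop()
        if goal[0] == "step":
            moves.append((goal[1], goal[2]))      # primitive operator: one disc move
        else:
            _, k, src, dst, aux = goal
            if k == 1:
                moves.append((src, dst))          # directly satisfiable goal
            else:
                # decompose into sub-goals, pushed in reverse order of execution
                stack.append(("move", k - 1, aux, dst, src))
                stack.append(("step", src, dst))
                stack.append(("move", k - 1, src, aux, dst))
    return moves

print(len(hanoi(3)))  # → 7
```

The depth of the stack at any moment corresponds to the number of pending sub-goals a human solver would have to hold in WM.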
reprise: Design limitations intrinsic to cog machinery
‘design limitations’ in cog capacities (properties of memory retrieval, limited WM, difficulty in attending to relevant info, difficulty in shifting cog ‘set’, and general effortfulness of sequential reasoning):
lead to reliance on heuristics (approx. rules of thumb)
result in intrinsic biases when heuristics are applied
mental models and syllogistic reasoning
do we reason with mental version of formal logic?
no! – given set of premises, imagine one/more possible concrete worlds in which premises true – mental models
then generate conclusion/determine whether conclusion offered valid, by examining mental model(s)
errors arise through:
failure to generate all possible mental models for premises
lack of WM capacity for maintaining multiple models
if we construct only the first mental model, and find it matches the conclusion, we think the inference is valid
but it isn’t: a second model may also describe a state of affairs consistent with the premises, in which the conclusion is false
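This can be made concrete by brute-force enumeration of small ‘worlds’ (the particular premises and conclusion below are a hypothetical example, not from the source): one model of the premises matches the tempting conclusion, but another model satisfies the premises while the conclusion is false, so the inference is invalid.

```python
from itertools import product

# Hypothetical syllogism: premises 'All A are B' and 'Some B are C';
# tempting (but invalid) conclusion: 'Some A are C'.
def premises_hold(world):
    # world = tuple of (is_A, is_B, is_C) triples, one per individual
    has_a = any(a for a, b, c in world)
    all_a_are_b = all(b for a, b, c in world if a)
    some_b_are_c = any(b and c for a, b, c in world)
    return has_a and all_a_are_b and some_b_are_c

def conclusion_holds(world):
    return any(a and c for a, b, c in world)

# exhaustively enumerate every 'world' of 3 individuals
worlds = list(product(product([False, True], repeat=3), repeat=3))
supporting = [w for w in worlds if premises_hold(w) and conclusion_holds(w)]
counterexamples = [w for w in worlds if premises_hold(w) and not conclusion_holds(w)]

# a first model can match the conclusion, but a second model falsifies it
assert supporting and counterexamples
```

Stopping at the first element of `supporting` is the mental-model analogue of declaring the inference valid without searching for a counterexample.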