PHIL 250 Flashcards
Necessary and Sufficient Conditions
If x then y: in this case, x is sufficient for y and y is necessary for x. Logically, y is guaranteed whenever x occurs, but if x does not occur, y may still occur without x. Conversely, x cannot occur without y also occurring, so if y is not present, then x must not be present either.
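A quick worked check of this relationship, as a minimal Python sketch (the implies helper is purely illustrative, not part of the course material):

def implies(x: bool, y: bool) -> bool:
    # "If x then y" is false only in the single case where x is true and y is false.
    return (not x) or y

# x is sufficient for y: y is guaranteed whenever x holds.
# y is necessary for x: "if x then y" is equivalent to "if not y then not x".
for x in (True, False):
    for y in (True, False):
        assert implies(x, y) == implies(not y, not x)  # contraposition
        print(f"x={x}, y={y}, (if x then y) = {implies(x, y)}")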
Validity vs Soundness
Validity concerns the truth-preserving structure of an argument: if the premises are true, the conclusion must be true. Soundness additionally requires that the premises actually are true (a sound argument is a valid argument with true premises).
Ampliative vs Non-Ampliative
Ampliative arguments (inductive/abductive) expand the base of knowledge, but at the cost of a guaranteed conclusion, since they have a lesser degree of logical strength. Non-ampliative (deductive) arguments guarantee their conclusion but do not expand the base of knowledge, i.e., they make explicit what is already implicit in the premises.
Commitments of Descartes
- Substance - the world is composed of substances, each with unique properties and one attribute that makes the substance what it is
- There are two kinds of substances (dualism): mind and body. Mind possesses the attribute of thought; body possesses the attribute of extension.
- Interactionism - Mind and body interact with each other
Princess Elisabeth of Bohemia and Descartes' response
“I beseech you tell me how the soul of man (since it is but a thinking
substance) can determine the spirits of the body to produce voluntary
actions. For it seems every determination of movement happens from
an impulsion of the thing moved, according to the manner in which it is
pushed by that which moves it, or else, depends on the qualification and
figure of the superficies of the latter. Contact is required for the first
two conditions, and extension for the third. You entirely exclude
extension from your notion of the soul, and contact seems to me
incompatible with an immaterial thing.”
Descartes' response:
Thoughts are discrete and singular.
If the mind/body conduit were non-singular
(e.g. a pair of brain hemispheres), then thoughts
would not be singular.
So, the mind/body conduit is not non-singular
(i.e. it is singular).
The pineal gland is the only singular
component of the body/nervous system.
(C) So, the pineal gland is the mind/body conduit.
Two objections to Descartes' response:
Empirical: The pineal gland performs no such function.
Conceptual: No matter what part of the brain or
body Descartes picks, he provides no answer to
the question, how can an immaterial substance
causally interact with a material substance?
Descartes' argument from qualitative distinctness
Bodies vs Minds:
- Spatial vs non-spatial
- Mathematically quantifiable vs non-quantifiable ‘qualitative’ properties (e.g., feel)
- Epistemically public (/objective) vs epistemically private (/subjective)
- Purposeless/normatively ‘inert’ vs purposeful and normatively evaluable
If x and y have different properties, then x and y are not identical (from
Leibniz’s law).
My mind and my body differ in their properties (see the table above).
Therefore, my mind and my body are distinct.
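A minimal formalization of this argument in first-order notation (a sketch; m and b stand for my mind and my body, and the symbols are mine rather than the course's):

\[ x = y \;\rightarrow\; \forall F\,(Fx \leftrightarrow Fy) \qquad \text{(Leibniz's law)} \]
\[ \exists F\,(Fm \wedge \neg Fb) \;\rightarrow\; m \neq b \qquad \text{(contrapositive, instantiated)} \]
\[ \exists F\,(Fm \wedge \neg Fb), \;\text{so}\; m \neq b \]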
Descartes' argument from conceivability
I can conceive, even under ideal conditions (i.e.,
without any restrictions on time, intelligence,
attention, background knowledge, etc.) of a
situation where my mind exists just as it
actually does, but my body does not exist.
Whatever I can conceive of under ideal
conditions, is a possible way the world might
have been (in principle, even if not in practice).
If my mind could (in principle) have existed just
as it actually does without my body, then my
mind is not the same thing as my body.
So, my mind is not the same thing as my body.
Descartes' points on animal mindedness
Animal behavior can be explained entirely mechanistically. By the principle of parsimony (Occam's Razor), we
have rational grounds for adopting the Mechanistic
Alternative and rejecting the Non-mechanistic
Alternative: the two accounts are matched in their
explanatory and predictive power, but the Non-
mechanistic alternative is less parsimonious.
La Mettrie's response to Descartes' parsimony argument
La Mettrie noted that, by the same reasoning, we
should be able to infer that humans lack minds. But
that conclusion seems obviously false in our case:
we know that we have minds.
Since we have more confidence in the claim that
we, ourselves, have minds than we do in any
premise in Descartes’ argument, we must reject
the reasoning that led us to this obviously false
conclusion.
The language response + slippery slope
If something has a mind, then it has
the potential to acquire language.
No nonhuman animal has the
potential to acquire language.
Therefore,
No nonhuman animal has a mind.
Slippery slope (Descartes): “They would have an immortal soul like us. This
is unlikely, because there is no reason to believe it
of some animals without believing it of all, and
many of them such as oysters and sponges are too
imperfect for this to be credible.”
The Turing Test (minimal vs maximal)
Minimal Turing Test: fool a human being, at least once, into thinking you're human, after a brief
and relaxed text conversation.
Maximal Turing Test: reliably succeed (around 70% of the time) at convincing multiple judges that you're
human, no matter what searching or tricky questions they ask.
The Turing Machine and the toy example
Turing machines are abstract computational devices,
used to formally analyze what can be mechanically
computed.
* The “state machine” includes a set of instructions that
determine how, given any input, the machine changes its
internal states and outputs. This instruction program is
called a machine table.
Suppose a pop machine accepts only nickels (N) and dimes
(D) as input, and the pop costs 15 cents.
* The internal states consist only of (0), (5), (10).
* The machine outputs a pop when it is in state (5) and
receives (D) as input or when it is in state (10) and
receives either (D) or (N) as input; otherwise, the machine
waits.
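A minimal Python sketch of this machine table (the dictionary layout and the assumption that the machine resets to state (0) after dispensing a pop are mine, not stated on the slide):

# Machine table: (current state, input coin) -> (next state, output).
MACHINE_TABLE = {
    (0, "N"): (5, None),
    (0, "D"): (10, None),
    (5, "N"): (10, None),
    (5, "D"): (0, "pop"),    # 5 + 10 = 15 cents: dispense a pop
    (10, "N"): (0, "pop"),   # 10 + 5 = 15 cents: dispense a pop
    (10, "D"): (0, "pop"),   # 10 + 10 = 20 cents: dispense a pop
}

def run(coins, state=0):
    # Feed a sequence of coins to the machine and report each transition.
    for coin in coins:
        state, output = MACHINE_TABLE[(state, coin)]
        print(f"input {coin}: now in state ({state}), output: {output or 'wait'}")

run(["N", "N", "D"])  # nickel, nickel, dime -> pop on the third input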
Gilbert Ryle’s commitments
Consider the property of being fragile. This (perfectly
real) property characterizes how an object is disposed
to behave under various nonactual conditions (e.g., it
will tend to shatter if dropped onto a hard surface).
Likewise, to say that S believes it’s raining is to say that
S is behaviourally disposed to (among other things)
bring an umbrella if they’re going out, to reschedule
their plans to go sit on the beach with friends, to say
“it’s raining!” when asked what it’s like out, etc.
Logical behaviourists claim that we can analytically define
each mental state concept exhaustively in terms of
actual and possible behaviours.
Dogma of the ghost in the machine
Ryle argues that the entire Cartesian project is
premised on a fundamental error: the assumption
that ‘mind’ refers to a certain kind of entity, when
actually it belongs to a different logical category.
Ryle calls this a category mistake, like a visitor who,
having toured Oxford's colleges and libraries, asks
where the university is.
Putnam's response to behaviourism
Super Actors are a counterexample to the sufficiency of a given behavioural
disposition for being in pain.
Super Spartans are a counterexample to the necessity of a given behavioural
disposition for being in pain.
Chisholm's response to behaviourism
An early objection to behaviourism concerned an apparent
vicious circularity in its account of dispositional mental states
(Chisholm 1957).
Take the mental state of believing that it’s raining. Earlier, we
suggested that we could analyze this as the disposition to
(among other things) take an umbrella with one when leaving
the house, to cancel one’s beach plans, etc.
But someone who believes that it’s raining outside is disposed to
behave in these ways only if they also want to stay dry and believe
various other things to be true: e.g., that taking the umbrella with
them provides a way to stay dry.
More generally, which behavioral dispositions a given mental
state is associated with crucially depends on what other mental
states the subject possesses.
If so, then it seems we cannot identify how a type of mental
state will manifest behaviourally without presupposing countless
other mental states. We will have analyzed mental properties in
terms of … mental properties!
Commitments of Smart
Identity theory (brain state theory): For any mental state type, M,
there is a brain state type, B, such that M = B. Identity theory asserts a relation of identity between types. In the
case of pain, identity theory asserts that the mental type, PAIN,
just is a certain type of physical state: say, C-fiber activation.
Smart's argument from the unification of science
Behavioral observation plus knowledge of physiology: At
least to date, any creature to whom we attribute
mental life has a complex neural structure: a brain.
* Scientifically observed mind-brain correlations: We are
learning more and more every day about the
nervous system and discovering more and more
systematic correlations between certain types of
mental state and certain types of brain state.
Observation: the observational phenomenon we
call ‘water’ perfectly correlates with a certain chemical
compound, H2O.
Inference: Water and H2O aren’t merely correlated.
Rather, water just is the chemical compound, H2O.
As Smart sees it, we must choose between two rival
hypotheses: either the observed correlations
between types of brain state and types of mental
state reflect a causal relationship, or the correlated
types of brain state and types of mental state are
not merely correlated types but the same type.
On the latter hypothesis, what initially appeared to be
two (perfectly correlated) phenomena turn out to be
one and the same phenomenon.
Smart’s argument from parsimony
Smart says he “wishes to resist” the
dualist hypothesis because of
“Occam’s razor” – i.e., because the
physicalist hypothesis offers the
“simpler”, more parsimonious
explanation, and (other things being
equal) we should prefer theories that
do so.
Multiple realisability
Conceptual worry: Agents with the same types of mental states but
differing brains are easily conceived.
* Empirical worry: It seems “overwhelmingly likely” that we will
discover at least one creature that shares a mental state type with us but realizes
it in a distinct neural structure. So identity theory looks doomed as a science of the
mind.
Computational functionalism
The mind is the ‘software’
running on the ‘hardware’ of the
brain.
* Mental states are nothing more
than certain causal-functional
roles within a system, i.e., roles
defined by the input-output relations
specified by a machine table.
Challenges Comp Func avoids
Avoids human chauvinism
Behavioural equivalence doesn’t entail mental equivalence (super
actor/super spartan thought experiment)
* Functionalists: we agree! Behavioural equivalence ≠ functional equivalence. E.g.,
the super actor’s internal state is functionally distinct from an ordinary person
experiencing pain.
* Mental states explain behaviour holistically, such that no mental state in
isolation has an associated behavioural disposition (Chisholm).
* Functionalists: since functional-causal roles are inter-defined, we can allow that
mental states explain behaviour holistically.
Extension of Comp Func
Don’t chauvinistically privilege the physical states inside
an agent’s skull over physical states outside the agent’s
skull if the states are functionally equivalent.
Examples:
* working through one's thoughts by getting them down on paper
* ‘thinking out loud’ with a friend
* gesturing while speaking
* physically re-arranging one's Scrabble pieces (rather than re-arranging them
in one's imagination) to more easily see what words one can form
Otto and Inga
Otto and Inga going to the MoMA: Inga
remembers the address in the normal way
(internally). Otto, an Alzheimer's patient, has
the address written down in his notebook
(which he automatically consults in the
course of going about his daily activities).
By the parity reasoning above, the notebook plays the same
functional role for Otto that biological memory plays for Inga,
so it counts as part of Otto's mind.