Critique of computationalism Flashcards

1
Q

T2-3-4 indistinguishability

A

The Turing indistinguishability hierarchy:
T2: Symbolic indistinguishability
T3: Symbolic and sensorimotor indistinguishability
T4: Symbolic, sensorimotor and neuromolecular indistinguishability

2
Q

Strong AI

A

The claim that an appropriately programmed computer does not merely simulate a mind but literally has one: running the right program is by itself sufficient for understanding and other cognitive states

3
Q

Weak AI

A

The claim that computers are a useful tool for studying the mind: programs can simulate human cognitive abilities without literally having mental states

4
Q

Symbol grounding problem

A

Merely manipulating symbols is not by itself enough to guarantee cognition, perception, understanding, thinking, and so forth.
Insisting on implementation independence means that you can only design T2-indistinguishable systems, which will not have intrinsic semantics and thus will not capture cognition, which seems to be about something. To get semantics, you need at least T3-indistinguishability, giving up implementation independence

5
Q

Chinese room

A

Searle's thought experiment: a person who knows no Chinese sits in a room and, by following a rule book for manipulating Chinese symbols, produces replies indistinguishable from a native speaker's, yet never knows what the Chinese symbols mean. Hence executing a program is not sufficient for understanding
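As a caricature only (the rule book below is invented for illustration, not Searle's actual example), the room can be sketched as a pure lookup table: the program pairs input strings with output strings, and nothing in it represents what any symbol means.

```python
# A caricature of the Chinese room: a pure symbol-to-symbol lookup.
# The rule book's contents are invented for illustration; nothing in
# the program represents the meaning of any symbol.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",
    "你会说中文吗？": "会。",
}

def room(symbols: str) -> str:
    """Return whatever output the rule book dictates for the input shape."""
    return RULE_BOOK.get(symbols, "请再说一遍。")

print(room("你好吗？"))  # fluent-looking output, zero understanding
```

The point of the sketch: the mapping from inputs to outputs can look competent from outside while the mechanism manipulates only symbol shapes.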

6
Q

Semantics ≠ syntax

A

Formal symbol manipulation (syntax) is not by itself sufficient to produce meaning (semantics)

7
Q

Serial processing

A

Operations are carried out one at a time, each step starting only after the previous one has finished (in contrast to parallel processing)
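A minimal sketch of serial execution (the steps themselves are arbitrary, chosen only for illustration): each step runs strictly after the one before it, and the result depends on that order.

```python
# Serial processing: steps execute one at a time, in a fixed order.
def step1(x): return x + 1
def step2(x): return x * 2
def step3(x): return x - 3

value = 5
for step in (step1, step2, step3):  # strictly sequential
    value = step(value)

print(value)  # ((5 + 1) * 2) - 3 = 9
```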

8
Q

Hardware

A

The physical machine itself, on which software runs

9
Q

Software

A

The program: the set of instructions the hardware executes, specifiable independently of any particular physical implementation

10
Q

Searle’s 3 axioms

A

Axiom 1: Computer programs are formal (syntactic)
Axiom 2: Human minds have mental contents (semantics)
Axiom 3: Syntax by itself is neither constitutive of nor sufficient for semantics

11
Q

Searle’s conclusion

A

Programs are neither constitutive of nor sufficient for minds

12
Q

Robot argument/reply

A

If the computer program were causally hooked up with the rest of the world, it would acquire semantics (it would be able to ground its symbols by causally interacting with the world)
Searle's reply: Imagining the Chinese room inside the robot does not add semantics, even if the robot interacts with the world

13
Q

Semantics

A

The meaning of a word, phrase, or text.

14
Q

Syntax

A

The arrangement of words and phrases to create well-formed sentences in a language.
The structure of statements in a computer language.

15
Q

Intentionalism

A

Mental states are about something: they are directed at objects or states of affairs (they have "aboutness", i.e., intentionality)

16
Q

Ground (verb)

A

To provide a basis for; to connect a symbol to what it refers to

17
Q

The system reply

A

The room as a whole understands Chinese, even if the person inside does not
Compare a single (language) neuron relative to the whole brain
Searle's reply: Saying that the room understands Chinese is as implausible as claiming that he (Searle) would understand Chinese if he memorized all the symbols and all the rules

18
Q

The brain simulator reply

A

If the operations of the brain of a Chinese speaker were simulated, such a system would understand Chinese, just as any Chinese speaker does. We build something very similar to the brain's structure; if we can do this, there must be understanding
Searle's reply: (Lau is not sure what Searle means) "Computer simulations of brain processes provide models of the formal aspects of these processes. But […] [t]he computational model is no more real than the computational model of any other natural phenomenon"

19
Q

Church/Turing Thesis (CTT)

A

All independent attempts to formalize what mathematicians mean by a “computation” or an “effective procedure,” even when they have looked different on the surface, have turned out to be equivalent

20
Q

Semantic operations

A

Operations that are based on the meanings of symbols, rather than just their shapes
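The shape/meaning contrast can be sketched with a toy interpretation (the word-to-number mapping below is invented for illustration): a syntactic operation looks only at symbol shapes, while a semantic one consults what the symbols stand for.

```python
# Syntactic vs. semantic operations on the same symbols.

def syntactic_swap(expr: str) -> str:
    """Purely shape-based: reverse the characters, blind to meaning."""
    return expr[::-1]

# A toy interpretation: assign meanings (numbers) to symbol shapes.
MEANING = {"two": 2, "three": 3}

def semantic_add(a: str, b: str) -> int:
    """Meaning-based: operates on what the symbols stand for."""
    return MEANING[a] + MEANING[b]

print(syntactic_swap("two"))         # 'owt' — shape manipulation only
print(semantic_add("two", "three"))  # 5 — uses the interpretation
```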

21
Q

Trivial computation

A

Computation that merely generates uninterpretable symbol systems (ones that cannot be given a systematic interpretation)

22
Q

Nontrivial computation

A

Computation whose symbol systems can be given a systematic interpretation. These are the computations we are interested in

23
Q

Weak CTTP

A

Claims that every physical system is formally equivalent to a Turing Machine

24
Q

Strong CTTP

A

The claim that every physical process literally is a computation, so that even a plane is a computer
This claim is either wrong or uninteresting:
If every physical process is in fact computation, then the cognitive scientist might as well have ignored the computationalist's message and just kept on hunting for what the right physical process(es) might turn out to be.