The Chinese Room Argument [4]
Instantiation of a formal program isn't enough to produce semantic understanding or intentionality. A man who doesn't understand Chinese can answer written Chinese questions using an English rulebook that tells him how to manipulate Chinese symbols.
Robert Horn Map 4: Can Chinese Rooms Think?

The Chinese Room Argument was proposed by John Searle (1980, 1990):


Imagine that a man who does not speak Chinese sits in a room and is passed Chinese symbols through a slot in the door. To him, the symbols are just so many squiggles and squoggles. But he reads an English-language rulebook that tells him how to manipulate the symbols and which ones to send back out.

To the Chinese speakers outside, whoever (or whatever) is in the room is carrying on an intelligent conversation. But the man in the Chinese room does not understand Chinese; he is merely manipulating symbols according to a rulebook. He is instantiating a formal program, which passes the Turing test for intelligence, but nevertheless he does not understand Chinese. This shows that the instantiation of a formal program is not enough to produce semantic understanding or intentionality.
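
To make the "instantiating a formal program" point concrete, here is a minimal sketch in Python of a program that answers Chinese questions by symbol lookup alone. The two rulebook entries and the fallback reply are invented for illustration; they are not part of Searle's argument, and a real conversational rulebook would be vastly larger.

```python
# A toy "Chinese room": the rulebook is a lookup table pairing input
# symbol strings with output symbol strings. The program manipulates
# the symbols purely by shape (string equality); nothing in it
# represents what any symbol means.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "会的。",        # "Do you speak Chinese?" -> "Yes."
}

def room(symbols: str) -> str:
    """Return whatever output the rulebook pairs with the input symbols."""
    return RULEBOOK.get(symbols, "请再说一遍。")  # fallback: "Please say that again."

if __name__ == "__main__":
    print(room("你好吗？"))  # emits the paired reply; no understanding involved
```

The lookup table stands in for Searle's rulebook. The program's only operation is matching symbol shapes, which is exactly the sense in which the man in the room "manipulates symbols according to a rulebook": however sophisticated the rules, the steps are defined over syntax alone.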

Note: for more on Turing tests, see Map 2. For more on formal programs and instantiation, see the "Is the brain a computer?" arguments on Map 1, the "Can functional states generate consciousness?" arguments on Map 6, and "Formal systems: an overview" on Map 7.

Intentionality: The property (in reference to a mental state) of being directed at a state of affairs in the world. For example, the belief that Sally is in front of me is directed at a person, Sally, in the world.

Although there are important and subtle distinctions among these terms, intentionality is sometimes treated in this debate as synonymous with representation, understanding, consciousness, meaning, and semantics.
Immediately related elements
Artificial Intelligence
Can computers think? [1]
Yes: physical symbol systems can think [3]
The Chinese Room Argument [4]
The Syntax-Semantics Barrier
Only minds are intrinsically intentional
Understanding arises from right causal powers
Can't process symbols predicationally or oppositionally
Chinese Room refutes strong AI not weak AI
The Combination Reply
The Systems Reply
Robot reply: Robots can think
The Brain Simulator Reply
The Many Mansions Reply
The Pseudorealisation Fallacy
Searle's Chinese Room is trapped in a dilemma
Chinese Room more than a simulation
Man in Chinese Room doesn't instantiate a program
Chinese-speaking too limited a counterexample
The Chinese Room makes a modularity assumption
Man in Room understands some Chinese questions
The Chinese Room argument is circular
There are questions the Chinese Room can't answer
Yes: because a brain is a computer
Conscious agents can instantiate computer programs
John Searle
Can Chinese Rooms think? [4]
Overt behavior doesn't demonstrate understanding