The Chinese Room Argument [4]
Instantiation of a formal program isn't enough to produce semantic understanding or intentionality. A man who doesn't understand Chinese can answer written Chinese questions using an English rulebook that tells him how to manipulate Chinese symbols.
Robert Horn Map 4: Can Chinese Rooms Think?

The Chinese Room Argument was proposed by John Searle (1980, 1990):

Imagine that a man who does not speak Chinese sits in a room and is passed Chinese symbols through a slot in the door. To him, the symbols are just so many squiggles and squoggles. But he reads an English-language rulebook that tells him how to manipulate the symbols and which ones to send back out.

To the Chinese speakers outside, whoever (or whatever) is in the room is carrying on an intelligent conversation. But the man in the Chinese room does not understand Chinese; he is merely manipulating symbols according to a rulebook. He is instantiating a formal program, which passes the Turing test for intelligence, but nevertheless he does not understand Chinese. This shows that the instantiation of a formal program is not enough to produce semantic understanding or intentionality.
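
A minimal sketch in Python can make "instantiating a formal program" concrete. Everything in it is invented for illustration (the rule table, the canned replies, the name chinese_room); the point is only that such a program pairs input symbol strings with output symbol strings by lookup, never touching what the symbols mean.

RULEBOOK = {
    # Input symbols -> output symbols. The English glosses are for the
    # reader's benefit only; the program never uses them.
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "会，一点点。",  # "Do you speak Chinese?" -> "Yes, a little."
}

def chinese_room(symbols_in: str) -> str:
    # Pure syntax: look up the input string and return the paired output
    # string. Nothing here parses, translates, or represents what the
    # symbols are about.
    return RULEBOOK.get(symbols_in, "请再说一遍。")  # "Please say that again."

print(chinese_room("你好吗？"))  # prints: 我很好，谢谢。

On Searle's view, no enlargement of this table, not even one large enough to pass the Turing test, would add understanding; it would only add more syntax.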

Note: for more on Turing tests, see Map 2. For more on formal programs and instantiation, see the "Is the brain a computer?" arguments on Map 1, the "Can functional states generate consciousness?" arguments on Map 6, and "Formal systems: an overview" on Map 7.

Intentionality: the property of a mental state of being directed at a state of affairs in the world. For example, the belief that Sally is in front of me is directed at a person, Sally, in the world.

Although there are important and subtle distinctions among these terms, intentionality is sometimes taken in this debate to be synonymous with representation, understanding, consciousness, meaning, and semantics.
Context:
Artificial Intelligence
Can computers think? [1]
Yes: physical symbol systems can think [3]
The Chinese Room Argument [4]
The Syntax-Semantics Barrier
Only minds are intrinsically intentional
Understanding arises from right causal powers
Can't process symbols predicationally or oppositionally
Chinese Room refutes strong AI not weak AI
The Combination Reply
The Systems Reply
Robot reply: Robots can think
The Brain Simulator Reply
The Many Mansions Reply
The Pseudorealisation Fallacy
Searle's Chinese Room is trapped in a dilemma
Chinese Room more than a simulation
Man in Chinese Room doesn't instantiate a program
Chinese-speaking too limited a counterexample
The Chinese Room makes a modularity assumption
Man in Room understands some Chinese questions
The Chinese Room argument is circular
There are questions the Chinese Room can't answer
Yes: because a brain is a computer
Conscious agents can instantiate computer programs
John Searle
Can Chinese Rooms think? [4]
Overt behavior doesn't demonstrate understanding