Man in Chinese Room doesn't instantiate a program

A human being (or a homunculus) shuffling symbols in a room is not a proper instantiation of a computer program, and so the Chinese Room argument does not refute AI.



Note:
For more multiple-realisability arguments, see the "Is the brain a computer?" arguments on Map 1, the "Can functional states generate consciousness?" arguments on Map 6, and the sidebar "Formal systems: an overview" on Map 7.
RELATED ARTICLES
Artificial Intelligence
Can computers think? [1]
Yes: physical symbol systems can think [3]
The Chinese Room Argument [4]
Man in Chinese Room doesn't instantiate a program
Implementations of programs must perform reliably
Computers and humans run programs differently
Computers embody programs; they don't obey them
Proper instantiations require the right causal connections
Simulation requires duplication of functional interconnections
The proper algorithm is constitutive of thought
A properly designed Chinese room is Turing complete
The Syntax-Semantics Barrier
Only minds are intrinsically intentional
Understanding arises from right causal powers
Can't process symbols predicationally or oppositionally
Chinese Room refutes strong AI not weak AI
The Combination Reply
The Systems Reply
Robot reply: Robots can think
The Brain Simulator Reply
The Many Mansions Reply
The Pseudorealisation Fallacy
Searle's Chinese Room is trapped in a dilemma
Chinese Room more than a simulation
Chinese-speaking too limited a counterexample
The Chinese Room makes a modularity assumption
Man in Room understands some Chinese questions
The Chinese Room argument is circular
There are questions the Chinese Room can't answer