Chinese-speaking too limited a counterexample

The Chinese Room displays only a subset of intentional human behaviours: Chinese-speaking behaviour. It doesn't model a system that possesses all the essentials of human intentionality, and so isn't a real counterexample to computerised intentionality.

For example, the Chinese Room doesn't model the ability to abstract universals from particulars, to recognise the same melody played in different keys, to perceive resemblances without matching common properties, and so on.

A system that had these abilities would be able to think, but the Chinese Room lacks them, and so fails as a counterexample to computerised intentionality.

R.J. Nelson, 1989.

Note: see also "The test is too narrow", Map 2, Box 57.