Chinese-speaking too limited a counterexample
The Chinese Room displays only a subset of intentional human behaviours: Chinese-speaking behaviour. It doesn't model a system that possesses all the essentials of human intentionality, and so it isn't a real counterexample to computerised intentionality.
For example, the Chinese Room doesn't model the ability to abstract universals from particulars, to recognise the same melody played in different keys, to perceive resemblances without matching common properties, and so on.

A system that had these abilities would be able to think; because the Chinese Room lacks them, it isn't a genuine counterexample to computerised intentionality.

R.J. Nelson, 1989.

Note: also, see "The test is too narrow", Map 2, Box 57.