Chinese-speaking too limited a counterexample
The Chinese Room only displays a subset of intentional human behaviours: Chinese-speaking behaviour. It doesn't model a system that possesses all the essentials of human intentionality, and so isn't a genuine counterexample to computerised intentionality.
For example, the Chinese Room doesn't model the ability to abstract universals from particulars, to recognise the same melody played in different keys, to perceive resemblances without matching common properties, and so on.

A system that had these abilities would be able to think, but the Chinese Room lacks them and so fails as a counterexample to computerised intentionality.

R.J. Nelson, 1989.

Note: see also "The test is too narrow", Map 2, Box 57.