For example, the Chinese Room doesn't model the ability to abstract universals from particulars, to recognise the same melody played in different keys, to perceive resemblances without matching common properties, and so on.
A system with these abilities could think, but because the Chinese Room lacks them it is not a genuine counterexample to computerised intentionality.
R.J. Nelson, 1989.
Note: Also, see "The test is too narrow," Map 2, Box 57.