The levels of conscious involvement dilemma
Once the man internalizes the system, the Chinese Room argument fails regardless of what level of conscious involvement we imagine the man to have with it.
Either the man unconsciously follows the rules in the rule book, in which case strong AI would not expect the man to understand Chinese any more than it would expect any other component of the Chinese-speaking system to understand Chinese.

Or the man consciously follows the rules in the rule book, in which case, for all we know, the man may understand Chinese after all.

In either case, the predictions of strong AI are not refuted.

John Fisher, 1988.