The levels of conscious involvement dilemma

Once the man internalizes the system, the Chinese Room argument fails regardless of the level of conscious involvement we imagine the man to have with it.

Either the man unconsciously follows the rules in the rule book, in which case strong AI would not expect the man to understand Chinese any more than it would expect any other element of the Chinese-speaking system to understand Chinese.

Or the man consciously follows the rules in the rule book, in which case, for all we know, the man may understand Chinese after all.

In either case, the predictions of strong AI are not refuted.

John Fisher, 1988.