A dilemma about cognition and intentionality

The internalization reply relies on the notion that cognition and intentionality are necessarily connected. But this assumption leads to the following dilemma.

Either the man engages in cognitive activity without Chinese-speaking intentionality: when he operates the internalized Chinese Room he actively thinks, even though he doesn't understand what the squiggles and squoggles mean. In that case, we can't justifiably deny the presence of some kind of cognition in the internalized Chinese Room.

Or the man engages in cognitive activity with Chinese-speaking intentionality (e.g. he may know that "squiggle-squoggle" is generally followed by "squoggle-squoggle"). In that case, that intentionality may carry over to the internalized Chinese Room as well.

In either case, the internalization reply doesn't conclusively refute the idea that machines can think.

Philip Cam, 1990.