Human understanding isn't reducible to internal semantics
No program, even one that can learn, can possibly have understanding in the human sense. Primitive categories are necessary for understanding, and a program cannot generate these categories on its own.
Neil Jahren, 1990.
CONTEXT
Artificial Intelligence
Can computers think? [1]
Yes: physical symbol systems can think [3]
The Chinese Room Argument [4]
The Syntax-Semantics Barrier
Programs that learn can overcome the barrier
Internal semantics in syntactic networks that learn
Human understanding isn't reducible to internal semantics
The Chinese Record Book thought experiment