Past disappointments
AI workers have little more than past disappointments to their credit, because the number of facts needed to capture the entirety of commonsense knowledge is insurmountably large.
James Lighthill, 1973.

Machine agents can cope successfully only in limited domains, such as the game of checkers, where a small number of facts exhaustively describe the agent's world. Scaling up from such artificial worlds to the real world simply by adding more facts is not possible, because doing so leads to a combinatorial explosion in the number of ways the elements of a knowledge base can be grouped.
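To make the scale of that explosion concrete, here is a small illustrative calculation (not part of the original argument): if a knowledge base contains n independent facts, the number of distinct groupings (subsets) of those facts doubles with every fact added, so exhaustive consideration of groupings quickly becomes infeasible. The sketch below simply tabulates that growth for a few values of n.

```python
from math import comb

# Illustrative only: count the groupings available in a knowledge base of n facts.
# "subsets" is every possible grouping of facts (2**n); "pairs" is just the
# two-fact combinations (n choose 2). The doubling of subsets with each added
# fact is the combinatorial explosion the argument points to.
for n in (10, 20, 40, 80):
    subsets = 2 ** n
    pairs = comb(n, 2)
    print(f"n={n:>3}  pairs={pairs:>6}  subsets={subsets:.3e}")
```

Even at n = 80, a tiny fraction of the facts commonsense reasoning would require, the number of possible groupings already exceeds 10^24.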

Supported by "The Lighthill Report" (Box 74).

Note: This argument summarizes a number of Lighthill's opinions. It represents, in early form, the problems of toy worlds, brittleness, combinatorial explosion, and commonsense knowledge, all of which are well-known problems today.