Understanding arises from right causal powers

Systems capable of semantic understanding and intentionality must have at least the same causal powers as brains. Brains have sufficient causal powers to produce understanding; it is an open empirical question whether other materials (e.g., silicon) do.

John Searle, 1980a, 1980b, 1990b.



RELATED ARTICLES
Artificial Intelligence
Can computers think? [1]
Yes: physical symbol systems can think [3]
The Chinese Room Argument [4]
Understanding arises from right causal powers
Biological Naturalism
Brain's causal powers reproduced by a computer
Chinese Room style argument shows causal powers insufficient
Programming may be necessary to understanding
Searle commits fallacy of denying the antecedent
Sufficiency doesn't imply necessary powers
John Searle
The Syntax-Semantics Barrier
Only minds are intrinsically intentional
Can't process symbols predicationally or oppositionally
Chinese Room refutes strong AI not weak AI
The Combination Reply
The Systems Reply
Robot reply: Robots can think
The Brain Simulator Reply
The Many Mansions Reply
The Pseudorealisation Fallacy
Searle's Chinese Room is trapped in a dilemma
Chinese Room more than a simulation
Man in Chinese Room doesn't instantiate a program
Chinese-speaking too limited a counterexample
The Chinese Room makes a modularity assumption
Man in Room understands some Chinese questions
The Chinese Room argument is circular
There are questions the Chinese Room can't answer