The Syntax-Semantics Barrier

Axiom 1: Programs are formal (syntactic).
Axiom 2: Human minds have mental contents (semantics).
Axiom 3: Syntax by itself is neither constitutive of nor sufficient for semantics.
Conclusion: Programs are neither constitutive of nor sufficient for minds.

John Searle, 1990b, p. 27.

Syntax & Semantics

Syntax and semantics are widespread notions in linguistics, philosophy, and cognitive science. Traditionally, both are branches of linguistics: syntax is the study of how words are arranged into sentences, and semantics is the study of the meaning of language.

In philosophy, semantics generally involves questions of reference (how terms or names correspond to objects in the world) and truth (whether combinations of terms, as in statements or sentences, correspond to facts in the world).
 
Syntax generally concerns the abstract structure of formal systems (e.g., rules regarding the ordering & manipulation of terms in predicate logic).
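
As a purely illustrative sketch (not drawn from Searle or any particular logic text), the following Python function applies modus ponens as a string-matching rule. The formula format "<antecedent> -> <consequent>" is an assumption made for this example; the point is that the inference operates on the shape of the formulas alone.

```python
# Illustrative sketch: a syntactic inference rule applied by shape alone.
# The "<antecedent> -> <consequent>" string format is an assumption made
# for this example, not notation from any standard logic library.

def modus_ponens(premise, conditional):
    """Return the consequent if `conditional` has the form '<premise> -> <consequent>'.

    The match is done purely on string form, which is what makes the rule
    syntactic: the function never consults what the symbols mean.
    """
    antecedent, arrow, consequent = conditional.partition(" -> ")
    if arrow and antecedent == premise:
        return consequent
    return None


print(modus_ponens("P", "P -> Q"))                                # prints: Q
print(modus_ponens("it rains", "it rains -> the street is wet"))  # prints: the street is wet
```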

In Searle's Chinese Room argument, syntax is used to refer to formal structure or shape—the abstract ordering of the squiggles and squoggles and the relations the rulebook sets up between them.

Semantics is used to refer to an actual understanding of what the squiggles and squoggles mean in Chinese.
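
To make the distinction concrete, here is a minimal sketch (an editorial illustration, not Searle's own example): a "rulebook" implemented as a lookup table that pairs Chinese input strings with Chinese output strings purely by their form. The table entries are invented for this example; the point is that the procedure produces sensible-looking replies without anything in it representing what the symbols mean.

```python
# Minimal sketch of a purely syntactic "rulebook": input symbols are matched
# by their form (string identity) and paired with canned output symbols.
# The entries below are invented for illustration.

RULEBOOK = {
    "你好吗？": "我很好。",        # "How are you?" -> "I am fine." (matched by shape, not meaning)
    "你会说中文吗？": "当然会。",  # "Do you speak Chinese?" -> "Of course."
}

def chinese_room(input_symbols):
    """Return the output the rulebook pairs with this exact symbol string.

    Nothing in the lookup represents what the squiggles mean; the default
    reply ("Please say that again.") is likewise produced by rule, not by
    understanding.
    """
    return RULEBOOK.get(input_symbols, "请再说一遍。")


print(chinese_room("你好吗？"))          # prints: 我很好。
print(chinese_room("今天天气怎么样？"))  # unrecognised shape -> prints: 请再说一遍。
```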

This is not, however, to say that everyone who addresses Searle understands these terms in the same way.

It may be helpful to note that in the context of the Chinese Room debate, syntax is often taken to be synonymous with such phrases as formal structure and computational form, and semantics is often taken to be synonymous with such terms as understanding, meaning, intentionality, and phenomenology.