Simulation duplicates if inputs/outputs are same
If a simulation uses the same kinds of inputs and outputs as the phenomenon it simulates, then it is also a duplication.
The Chinese Room, for example, models speaking using the same kinds of inputs and outputs (i.e., symbol strings) that humans use when they speak; hence it is a duplication as well as a simulation.

Computer simulations of fire, digestion, etc., are different: because they do not use the same kinds of inputs and outputs, they are mere simulations.

The Carleton argument

"Here Searle's line of thought derails.  The input to a fire simulation is not oxygen, tinder, and catalyst.  That is, it is not the same input as goes into a natural system which produces fire.  Similarly, the input into our imagined thirst-simulator is anything but the dried-out state of tissues, etc., which goes into a natural system producing thirst.  Paradigmatically, the input into these two examples is only typed messages.  But typed messages are one kind of input that goes into a natural language-understanding system, e.g., a human being.  The fire simulator and the thirsty machine do not receive the right input, and that is why they produce only simulations, not the genuine article.  But typed messages are (one sort of) the right input.  Here the language-understanding simulation differs in a crucial way from the other simulations"   (L. Carleton, 1984, p. 222).

"I have claimed that not just any simulation is claimed to achieve the actual state or phenomena being simulated.  The inputs and outputs have to be actual inputs and outputs.  Typed messages are not oxygen or tinder, nor dried-up cells.  But typed messages are typed messages.  Now, why should this distinction be intuitively acceptable?  It should, I suggest, because it is by virtue of dealing in the right kinds of input  and out put that one system can play the role in a situation that is played by a system we acknowledge to literally undergo the cognition our system simulates"   (L. Carleton, 1984, p. 222-223).

Source: Carleton, Lawrence R. 1984. Programs, language understanding, and Searle. Synthese 59: 219-230.

Note: Supported by "The Chinese Room is More than a Simulation" Box 55, Map 4.