Computationalism contradicts itself
Maudlin argues that there is a potential inconsistency between computationalism's nontriviality condition and its supervenience thesis (see detailed text).
Imagine two machines engaged in the same physical activity, both running the same consciousness program.

One of these machines supports counterfactual states, whereas the other doesn't.

The computationalist must claim, based on the nontriviality condition, that the machine capable of supporting counterfactual states is conscious and the other isn't.

But this contradicts the supervenience thesis: both machines exhibit the same physical activity, yet according to computationalism only one is conscious.

Tim Maudlin (1989).

Notes:

The machines Maudlin describes are actually complex systems of water troughs, hoses, chains, and pipes.

Counterfactual: Counterfactuals are conditional (if-then) statements whose "if" clause runs counter to the facts of reality. For example, the statement "if pigs had wings then they would fly" is a counterfactual because its "if" clause (that pigs have wings) is false.

Three Premises Of Computationalism

Any computational theory of consciousness assumes the following three premises:

1) Computational condition: A physical system running an appropriately programmed machine table is sufficient for supporting consciousness.

2) Nontriviality condition: It is necessary that the system supports counterfactual states, that is, states the system would have entered had the input been different.

3) Supervenience thesis: Two physical systems engaged in the same physical activity will possess identical mentality (assuming they have any at all). See the definition of physicalism in the "Does physicalism show that computers can be conscious?" arguments on this map.

From Tim Maudlin (1989).
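The tension between premises 2 and 3 can be sketched in code. The following is a hypothetical toy model (not Maudlin's water-trough construction): two machines whose state transitions on the actual input are identical, but only one of which defines transitions for inputs it never receives, i.e., only one supports counterfactual states.

```python
# Toy illustration (an assumption for exposition, not Maudlin's machine):
# two machine tables that behave identically on the actual input "aa",
# but only full_machine defines transitions for the unused symbol "b".

def full_machine(state, symbol):
    """Machine table with a transition for every possible input symbol."""
    table = {
        ("s0", "a"): "s1",
        ("s0", "b"): "s2",  # counterfactual branch: never taken on input "aa"
        ("s1", "a"): "s1",
        ("s1", "b"): "s0",  # counterfactual branch
    }
    return table[(state, symbol)]

def pruned_machine(state, symbol):
    """Machine table pruned to only the transitions the actual input exercises."""
    table = {
        ("s0", "a"): "s1",
        ("s1", "a"): "s1",
    }
    return table[(state, symbol)]

def run(machine, inputs, state="s0"):
    """Return the sequence of states the machine passes through."""
    trace = [state]
    for symbol in inputs:
        state = machine(state, symbol)
        trace.append(state)
    return trace

# On the actual input, both machines exhibit identical activity...
assert run(full_machine, "aa") == run(pruned_machine, "aa")

# ...but only full_machine supports the counterfactual input "b":
full_machine("s0", "b")  # defined
try:
    pruned_machine("s0", "b")  # undefined: raises KeyError
except KeyError:
    pass
```

On the nontriviality condition only `full_machine` could be conscious, yet on the actual input the two traces are physically indistinguishable, which is the supervenience thesis's point of pressure.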

Running a program is not enough to create a conscious mental state; that is, computationalism is false. We can see that it is false by showing it to be self-contradictory through an imagined counterexample. Imagine two machines with identical physical activity running a "consciousness program," one of which supports counterfactual states while the other doesn't. The computationalist must claim, based on the nontriviality condition, that one is conscious and the other isn't. But this contradicts the supervenience thesis.

The Maudlin argument

Maudlin writes:

"The burden of the remainder of this essay will be to demonstrate that the sufficiency condition, the necessity condition, and the supervenience thesis form an inconsistent triad, and hence that an acceptable computational theory of consciousness is not possible" (T. Maudlin, 1989, p. 413).

Maudlin describes the contradiction as follows:

"In short, the computationalist is committed to the claims that the armature moving without the extra machinery hooked up cannot be conscious and that the system composed of the armature moving with the machinery hooked up must be conscious. But the physical activities that occur with and without the idle machinery connected are exactly identical, so these two claims contradict the supervenience thesis" (T. Maudlin, 1989, p. 423).

References

Maudlin, Tim. 1989. "Computation and Consciousness." The Journal of Philosophy, vol. LXXXVI, no. 8, pp. 407-432.