Rejection of supervenience is unacceptable

Rejection of supervenience (the thesis that identical physical activity gives rise to identical conscious states) leads to the unacceptable consequence that a brain-o-scope could reveal identical neuronal activity associated with two different mental states.

It could even reveal that, of two people identical down to the last atom, one is conscious whereas the other isn't.

Tim Maudlin (1989).

The Maudlin argument

Maudlin says:

"Even if mental states supervene on more than just the physical activity of a system, the crucial role of the entirely isolated block remains inexplicable. And, in countenancing the possibility of such effects, the computationalist would cut himself off from the research tradition from which the tradition grew. To see this, let us apply the point directly to brain activity...."

Maudlin supports this with:

"Let us suppose that some time in the future the electro-encephalograph is so perfected that it is capable of recording the firing of every single neuron in the brain. Suppose that researchers take two different surveys of a brain which match exactly: the very same neurons fire at exactly the same rate and in exactly the same pattern through a given period. They infer (as surely they should!) that the brain supported the same occurrent conscious state through the two periods. But the computationalist now must raise a doubt. Perhaps some synaptic connection has been severed in the interim. Not a synaptic connection of any of the neurons which actually fired during either period, or which was in any way involved in the activity recorded by the encephalograph. Still, such a change in connection will affect the counterfactuals true of the brain, and so can affect the subjective state of awareness. Indeed, the computationalist will have to maintain that perhaps the person in question was conscious through the first episode but no conscious at all through the second. I admit to a great degree of mystification about the connection between mind and body, but I see no reason to endorse such possibilities that directly contradict all that we do know about brain process and experience" (T. Maudlin, 1989, p. 426).

References

Maudlin, Tim. 1989. "Computation and Consciousness." The Journal of Philosophy, vol. LXXXVI, no. 8, pp. 407–432.