Computationalism
Computationalism locates the mental in abstract computational states embedded in a complex network of inputs, outputs, and other mental states; machine-state functionalism, by contrast, locates it in the various possible machine states that could implement those computational states.
A given computation (2 x 2, for example) can be performed by many different machine-table operations (1 + 1 + 1 + 1, 3 + 1, etc.).
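A minimal sketch of this multiple realizability (in Python; not drawn from the source, and the function names and lookup-table contents are illustrative choices only): three different procedures, each a stand-in for a distinct machine table, compute the same product.

```python
# Illustrative sketch (not from the source): three different "machine tables"
# that all realize the same computation, multiplying 2 by 2.

def multiply_by_repeated_addition(a, b):
    """Add a to an accumulator b times (one possible machine table)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def multiply_by_lookup(a, b):
    """Read the answer out of a stored times table (a different machine table)."""
    times_table = {(2, 2): 4, (2, 3): 6, (3, 3): 9}
    return times_table[(a, b)]

def multiply_builtin(a, b):
    """Use the language's own multiplication (yet another realization)."""
    return a * b

# All three agree on the result despite performing different internal operations.
assert multiply_by_repeated_addition(2, 2) == multiply_by_lookup(2, 2) == multiply_builtin(2, 2) == 4
```

What the three procedures share is the computation they perform, not the internal operations by which they perform it.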

Jerry Fodor (1975)

Note: Computationalism is also referred to as psychofunctionalism.

A mental state is a computational state. Many different machine tables can perform the same computation (multiplying 2 x 2, for example), so a mental state is not identified with any particular machine state but with a computational state that can be instantiated over a variety of machine states and machine tables. As such, any mental state is still understood as part of a complex network of stimulus inputs, other mental states, and behavioral outputs.
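To make the "network" picture concrete, here is a toy machine table (again Python, and again only an illustrative sketch): each state is characterized entirely by what it does given a stimulus input, which state it transitions to, and what behavioral output it yields. The state and stimulus labels are placeholders invented for the example; on the computationalist view, the same abstract roles could be realized by many different such tables.

```python
# Illustrative sketch (not from the source): a toy machine table in which each
# state is individuated purely by its role -- the stimulus it receives, the
# state it moves to, and the output it produces. The labels are placeholders.

MACHINE_TABLE = {
    # (current state, stimulus input): (next state, behavioral output)
    ("calm", "tissue_damage"): ("pain", "wince"),
    ("calm", "no_stimulus"): ("calm", "none"),
    ("pain", "tissue_damage"): ("pain", "groan"),
    ("pain", "aspirin"): ("calm", "relax"),
}

def step(state, stimulus):
    """Advance the system one step according to the machine table."""
    return MACHINE_TABLE[(state, stimulus)]

state = "calm"
for stimulus in ["tissue_damage", "tissue_damage", "aspirin"]:
    state, output = step(state, stimulus)
    print(stimulus, "->", state, "/", output)
```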

The Fodor argument

Jerry Fodor's own summary:

"The main argument of this book runs as follows:

1. The only psychological models of cognitive processes that seem even remotely plausible represent such processes as computational.

2. Computation presupposes a medium of computation: a representational system.

3. Remotely plausible theories are better than no theories at all.

4. We are thus provisionally committed to attributing a representation system to organisms. 'Provisionally committed' means: committed insofar as we attribute cognitive processes to organisms and insofar as we take seriously such theories of these processes as are currently available.

5. It is a reasonable research goal to try to characterize the representational system to which we thus find ourselves provisionally committed.

6. It is a reasonable research strategy to try to infer this characterization from the details of such psychological theories as seem likely to prove true.

7. This strategy may actually work: It is possible to exhibit specimen inferences along the lines of item 6 which, if not precisely apodictic, have at least an air of prima facie plausibility"
(J. Fodor, 1975, p. 27)

"What one tries to do in cognitive psychology is to explain the propositional attitudes of the organism by reference to its (hypothetical) computational operations, and that the notion of a computational operation is being taken literally here; viz., as an operation defined for (internal) formulae" (J. Fodor, 1975, p. 76).

References

Fodor, Jerry. 1975. The Language of Thought. Cambridge: Harvard University Press.

Three Premises of Computationalism

Any computational theory of consciousness assumes the following three premises:

1) Computational condition: A physical system running an appropriately programmed machine table is sufficient for supporting consciousness.

2) Nontriviality condition: It is necessary that the system support counterfactual states, that is, states the system would have gone into had the input been different (see the sketch following this list).

3) Supervenience thesis: Two physical systems engaged in the same physical activity will possess identical mentality (assuming they have any at all). See the definition of physicalism in the "Does physicalism show that computers can be conscious?" arguments on this map.

From Tim Maudlin (1989). 
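The nontriviality condition is the least intuitive of the three, so here is a hedged sketch (Python; not from Maudlin's paper, with invented state labels) of the contrast it draws: a branching system whose trajectory depends on its input versus a playback device that merely reproduces one recorded run. Both produce the same trace on the actual input, but only the former supports counterfactual states.

```python
# Illustrative sketch (not from the source): the nontriviality condition.
# The branching system would have entered different states had its input been
# different; the replay reproduces one recorded run and supports no such
# counterfactuals.

def branching_system(inputs):
    """A genuine machine table: the state trajectory depends on each input."""
    state = "S0"
    trace = []
    for i in inputs:
        state = "S1" if (state == "S0" and i == 1) else "S0"
        trace.append(state)
    return trace

RECORDED_RUN = ["S1", "S0", "S1"]   # one fixed trajectory, captured earlier

def replay_system(inputs):
    """Ignores its inputs entirely and plays back the recorded trajectory."""
    return RECORDED_RUN[:len(inputs)]

# On the recorded input the two systems agree...
print(branching_system([1, 1, 1]))  # ['S1', 'S0', 'S1']
print(replay_system([1, 1, 1]))     # ['S1', 'S0', 'S1']
# ...but only the branching system responds to a counterfactual input.
print(branching_system([0, 1, 0]))  # ['S0', 'S1', 'S0']
print(replay_system([0, 1, 0]))     # ['S1', 'S0', 'S1'] -- unchanged
```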


Immediately related elements

Artificial Intelligence
Can computers think? [1]
No: computers can't be conscious [6]
Implementable in functional system
Computationalism contradicts itself
Jerry Fodor