Can never have a conscious experience

A robot can behave as if it were having a certain conscious experience. But it can never actually have a conscious experience, because experience and behavior fall into two separate logical categories.

Stanley Jaki (1969).

A robot can display behavior; it can even display behavior as if it were having a certain conscious experience. But it can never actually have a conscious experience, because experience and behavior fall into two separate logical categories, and only behavior can be built into a computer; conscious experience cannot.

The Jaki argument

Jaki supports his claim as follows:

"What is crucial is this respect is to keep in mind the essential difference that exists between behavior and experience. All that we observe in others, whether men, animals, or machines, is behavior. That another being perceived a beam of light we know by inference from its behavior. But for us it is the personal experience that tells about a particular perception. While behavior is essentially to perform something, to do something, subjective experience is not a derivative of one's externally observed performance or behavior. we do not have to observe our behavior of the highly activated state of the optic nerves in our brains before we can conclude that we have seen a light."

"However, in a machine which is to have "visual experience," the "experience" should first be verified by a separate monitoring system indicating that the machine's photoreceptors were properly activated. It is precisely this difference in procedure between a man and a machine that precludes the designing of any machine which could have an experience as we understand it."

"Experience and behavior fall into two different logical classes, and it is this difference that makes us say that man has mental experience or mind, while a machine can at most behave as if it had such an experience. One may add any number of components to such a machine; the act of experience cannot logically be built into it" (S. Jaki, 1969, p. 225).

References

Jaki, S. L. (1969). Brain, Mind, and Computers, p. 225.
