Machines can't have emotions
Component #108: Machines can never have emotional states.

CONTEXT

▲ Artificial Intelligence
  A collaboratively editable version of Robert Horn's brilliant and pioneering debate map "Can Computers Think?", exploring 50 years of philosophical argument about the possibility of computer thought.
▲ Can computers think?
  Can a computational system possess all important elements of human thinking or understanding?
▲ No: computers can't have emotions
  Machines can never be in emotional states (they can never be angry, joyous, fearful, etc.), and emotions are necessary for thought.
■ Machines can't have emotions
  Machines can never have emotional states.
● Lack physiological components
  Machines lack human physiology that is essential to emotions; for example, the ability to secrete hormones and neuroregulators. Because machines can't reproduce such features through abstract computational processes, they can't possess emotions.
● Machines can't love or be loved
  Machines, which are mere collections of parts, can't love or be loved. Only unified wholes that govern their parts (e.g. humans) have the capacity to love what is lovable or be loved by those who love. Machines can't love and hence lack minds.
● Machines can't think dialectically
  Emotions are experienced in complicated dialectical circumstances, which require the ability to make judgements about others and gauge oppositions. Machines can't reason like this, and so can't experience emotions.
● Only living organisms feel
  The concept of feeling only applies to living organisms. Because robots are mechanistic artifacts, not organisms, they cannot have feelings.
● Emotion is a type of information processing
  We will be able to build computers with emotions once we fully understand the biochemical and cybernetic aspects of human emotion.
● Emotions are cognitive evaluations
  Emotions are determined by the structure, content and organisation of knowledge representations and the processes operating on them. A machine equipped with the correct knowledge-handling mechanisms, resulting in appropriate behaviour, has emotions.
● Emotions are cognitive schemata
  Machines can have emotions if i) they can model the complex schemata involved in concepts such as pride and shame, and ii) these concepts can be (partially) responsible for the behaviour of the system.
● Emotions are a solution to a design problem
  Emotions are the solution to the problem of how to cope intelligently with a rapidly changing environment, given established goals and limited processing resources. The problem can be solved in humans and machines via computational strategies.
● Emotions can be modelled
  Modelling emotions involves programming a system to: 1) understand emotions (a semantic task) and 2) behave emotionally through the interaction of emotional states and other cognitive states such as planning, learning and recall (a behavioural task).
● Emotions are manifestations of a concern-realisation system
  Emotional states result from a concern-realisation system that matches internal representations against actual circumstances in order to cope with an uncertain environment. Computers that can implement such a system go through emotional states.
● Feelings are information signals
  A robot could have feelings if it can process two kinds of signal: a) needs, which arise from lower-level distributed processes that monitor internal aspects of the body, and b) emotions, cognitive interpretations of external events, especially social events.
● Honest robot talk about feelings equals feelings
  If a robot can honestly talk about its feelings, it has feelings. Thus, configure a robot to 1) use English as humans do, 2) distinguish truth from falsehood, and 3) answer honestly, and then ask "Are you conscious of your feelings?" If it says yes, it is.
● Motivational processes
  Emotions are the product of emotional representations and arise from interactions between motives and other cognitive states. Motives represent world states to be achieved, prevented, etc.; a robot with proper motivational processes will have emotions.
● Robot pain is theoretically possible
  Our current understanding of pain is incoherent and self-contradictory, but once we have a coherent theory of pain, a robot could in principle be constructed to instantiate that theory and thereby feel pain.
● Turing Test provides evidence for emotions
  Because behaviour is a key part of determining whether a system has emotions, the Turing Test provides evidence for emotional capacity as well as intelligence. If a robot passes the test and has a cognitively plausible internal structure, it can have emotions.
About
Entered by: David Price
NodeID: #108
Node type: Component
Entry date (GMT): 5/30/2006 8:58:00 AM
Last edit date (GMT): 10/20/2008 5:40:00 AM
Incoming cross-relations: 0
Outgoing cross-relations: 0
Average rating: 0 (by 0 users)