Machines can't have emotions
Component #108
Machines can never have emotional states.
PAGE NAVIGATOR
Artificial Intelligence: A collaboratively editable version of Robert Horn's brilliant and pioneering debate map "Can Computers Think?", exploring 50 years of philosophical argument about the possibility of computer thought.
▲ Can computers think? [1]: Can a computational system possess all important elements of human thinking or understanding?
▲ No: computers can't have emotions: Machines can never be in emotional states (they can never be angry, joyous, fearful, etc.), and emotions are necessary for thought.
■ Machines can't have emotions: Machines can never have emotional states.
● Lack physiological components: Machines lack the human physiology that is essential to emotions, for example the ability to secrete hormones and neuroregulators. Because machines can't reproduce such features through abstract computational processes, they can't possess emotions.
● Machines can't love or be loved: Machines, which are mere collections of parts, can't love or be loved. Only unified wholes that govern their parts (e.g. humans) have the capacity to love what is lovable or be loved by those who love. Machines can't love and hence lack minds.
● Machines can't think dialectically: Emotions are experienced in complicated dialectical circumstances, which require the ability to make judgements about others and gauge oppositions. Machines can't reason like this, and so can't experience emotions.
● Only living organisms feel: The concept of feeling only applies to living organisms. Because robots are mechanistic artifacts, not organisms, they cannot have feelings.
● Emotion is a type of information processing: We will be able to build computers with emotions once we fully understand the biochemical and cybernetic aspects of human emotion.
● Emotions are cognitive evaluations: Emotions are determined by the structure, content and organisation of knowledge representations and the processes operating on them. A machine equipped with the correct knowledge-handling mechanisms, resulting in appropriate behaviour, has emotions.
● Emotions are cognitive schemata: Machines can have emotions if i) they can model the complex schemata involved in concepts such as pride and shame, and ii) these concepts can be (partially) responsible for the behaviour of the system.
● Emotions are solution to a design problem: Emotions are the solution to the problem of how to cope intelligently with a rapidly changing environment, given established goals and limited processing resources. The problem can be solved in humans and machines via computational strategies.
● Emotions can be modelled: Modelling emotions involves programming a system to 1) understand emotions (semantic task) and 2) behave emotionally through the interaction of emotional states and other cognitive states such as planning, learning and recall (behavioural task).
● Emotions manifestations of concern realisation system: Emotional states result from a concern realisation system that matches internal representations against actual circumstances in order to cope with an uncertain environment. Computers that can implement such a system go through emotional states (see the sketch after this list).
● Feelings are information signals: A robot could have feelings if it can process two kinds of signal: a) needs, which arise from lower-level distributed processes that monitor internal aspects of the body, and b) emotions, cognitive interpretations of external events, especially social events.
● Honest robot talk about feelings equals feelings: If a robot can honestly talk about its feelings, it has feelings. Thus, configure the robot to 1) use English as humans do, 2) distinguish truth from falsehood, and 3) answer honestly, and then ask it "Are you conscious of your feelings?". If it says yes, it is.
● Motivational processes: Emotions are the product of emotional representations and arise from interactions between motives and other cognitive states. Motives represent world states to be achieved, prevented, etc.; a robot with the proper motivational processes will have emotions.
● Robot pain is theoretically possible: Our current understanding of pain is incoherent and self-contradictory, but once we have a coherent theory of pain, a robot could in principle be constructed to instantiate that theory and thereby feel pain.
● Turing Test provides evidence for emotions: Because behaviour is a key part of determining whether a system has emotions, the Turing Test provides evidence for emotional capacity as well as intelligence. If a robot passes the test and has a cognitively plausible internal structure, it can have emotions.
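Several of the rebuttals above ("Emotions are cognitive evaluations", "Emotions manifestations of concern realisation system") treat emotion as an appraisal process: internal representations of what the agent cares about are matched against its actual circumstances, and the mismatch modulates behaviour. The Python sketch below is only a minimal illustration of that appraisal idea, not part of the original debate map; every name in it (Concern, appraise, the example concerns and readings) is invented for the example.

```python
# Toy appraisal model in the spirit of the "concern realisation" view of emotion.
# All names here (Concern, appraise, the example concerns/readings) are invented
# for illustration and do not come from the debate map itself.

from dataclasses import dataclass


@dataclass
class Concern:
    name: str      # variable the agent monitors, e.g. "battery"
    target: float  # the state of the world the agent wants for that variable
    weight: float  # how much the agent cares about it


def appraise(concerns: list[Concern], readings: dict[str, float]) -> dict[str, float]:
    """Match each concern against the current reading and return a signed
    'emotional' signal per concern: positive when the concern is realised
    beyond its target, negative when it is frustrated."""
    signals = {}
    for c in concerns:
        actual = readings.get(c.name, 0.0)
        # Weighted mismatch between the world and what the agent cares about.
        signals[c.name] = round(c.weight * (actual - c.target), 3)
    return signals


if __name__ == "__main__":
    concerns = [
        Concern("battery", target=0.8, weight=2.0),
        Concern("task_progress", target=1.0, weight=1.0),
    ]
    readings = {"battery": 0.3, "task_progress": 0.9}
    print(appraise(concerns, readings))
    # {'battery': -1.0, 'task_progress': -0.1}: both concerns are frustrated,
    # which an appraisal theorist might gloss as mild 'distress'.
```

Whether such a signal amounts to a genuine emotional state is, of course, exactly what the opposing arguments in the map deny.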
About
Created by: David Price
NodeID: #108
Node type: Component
Created on (GMT): 5/30/2006 8:58:00 AM
Last edited (GMT): 10/20/2008 5:40:00 AM
Incoming cross-relations: 0
Outgoing cross-relations: 0
Average rating: 0 by 0 users