John Searle
Protagonist
#2769
Arguments advanced by John Searle.
PAGE NAVIGATOR
Artificial Intelligence
A collaboratively editable version of Robert Horn's brilliant and pioneering debate map "Can Computers Think?", exploring 50 years of philosophical argument about the possibility of computer thought.
Protagonists
The contributions of over 300 protagonists can be explored via a surname search, or using the growing list developing here.
John Searle
► The Connection Principle
There's a necessary connection between consciousness and mentality. All thinking, as a form of mentality, is at least in principle accessible to consciousness. Even unconscious thoughts involve consciousness, as they can potentially become conscious.
► Mental states have aspectual shape
A thought about water differs in aspectual shape from a thought about H2O, even though the two thoughts are about the same thing. The most we can say is that brain states have the causal capacity to produce aspectually shaped experiences.
► Green Bay Packers Backup Quarterback
Two beliefs, one unconscious and one conscious, are identical if they possess the same mode and content, and are based on the same causal mechanisms.
► Consciousness-pill unhelpful
A consciousness pill doesn't act as a means of differentiating the mental from the non-mental (an essential benefit of the connection principle). Hence, the pill isn't an extreme version of the connection principle and isn't helpful.
► Water-thought must be accessible to consciousness
The purported counterexample to the connection principle requires the connection principle.
► Unconscious intentional zombies are impossible
An unconscious zombie's thoughts can't become conscious, so they can't have a perspective (aspectual shape) and hence lack intentionality. So, there can't be an unconscious zombie that is also capable of having a thought about something (an intentional state).
► Normal ontological reductionism doesn't work
In explanations of consciousness, the subjective, first-person components can't be reduced away as nothing but physiological activity, because then the essential feature of consciousness would be left out.
► Causal reductionism
Mental states can at the same time be physical states. Physical brain states cause mental states in the same way that a lattice structure of molecules causes liquidity. Mental states are causally, but not ontologically, reducible to physical states.
► Nothing is intrinsically a digital computer
The syntactic structures that define computers are realisable in any physical system. Hence, the question "Is the brain a digital computer?" is ill-defined, because the relevant syntax can be ascribed to any sufficiently complex system.
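This observer-relativity can be illustrated with a minimal sketch, assuming a hypothetical two-voltage device (not from the source): the same physical behaviour counts as an AND gate under one symbol ascription and as an OR gate under another.

```python
# One physical device, two syntactic descriptions. The device (hypothetical)
# outputs HI voltage iff both inputs are HI; which Boolean function it
# "computes" depends entirely on how an observer labels the voltages.

def gate(v1, v2):
    """Physical behaviour: HI out iff both inputs are HI."""
    return "HI" if v1 == "HI" and v2 == "HI" else "LO"

# Observer A reads HI as 1; observer B reads HI as 0.
ascriptions = {
    "A (HI=1): sees AND": ({0: "LO", 1: "HI"}, {"LO": 0, "HI": 1}),
    "B (HI=0): sees OR ": ({0: "HI", 1: "LO"}, {"HI": 0, "LO": 1}),
}

for name, (encode, decode) in ascriptions.items():
    rows = [(x, y, decode[gate(encode[x], encode[y])])
            for x in (0, 1) for y in (0, 1)]
    print(name, rows)  # same device, different truth table per ascription
```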
► Connectionist networks are formal systems
Arguments used against the formal character of symbol manipulators apply equally well to connectionist networks [CNs]. Functions computed on a CN can also be computed on a serial machine, and CNs can implement classical serial processing.
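A minimal sketch of this serial-computability point, assuming nothing beyond hand-chosen threshold weights (hypothetical, not from the source): a two-layer connectionist network that computes XOR, evaluated one unit at a time by an ordinary serial program.

```python
# A tiny feedforward "connectionist network" evaluated serially: every
# unit's activation is computed one step at a time, showing that a function
# computed on a network is also computable on a serial machine.
# The weights are hand-chosen (hypothetical) so the network computes XOR.

def step(x):
    """Threshold activation: fire (1) iff the weighted input is non-negative."""
    return 1 if x >= 0 else 0

W_hidden = [[1, 1], [1, 1]]  # weights into the two hidden units
b_hidden = [-0.5, -1.5]      # unit 0 fires on OR of inputs, unit 1 on AND
W_out = [1, -2]              # output fires on OR unless vetoed by AND
b_out = -0.5

def network(inputs):
    # Serial evaluation: one unit after another, no parallelism required.
    hidden = [step(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(W_hidden, b_hidden)]
    return step(sum(w * h for w, h in zip(W_out, hidden)) + b_out)

for pair in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pair, "->", network(pair))  # prints the XOR truth table
```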
► The Chinese Gym Argument
The Chinese Gym Argument (see detailed text) shows that instantiating a connectionist network is not enough to produce an understanding of Chinese.
► The Chinese Room Argument [4]
Instantiation of a formal program isn't enough to produce semantic understanding or intentionality. A man who doesn't understand Chinese can answer written Chinese questions using an English rulebook telling him how to manipulate Chinese symbols.
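The purely syntactic rule-following at issue admits a minimal sketch (the rulebook entries below are invented placeholders, not from the source): the program matches input strings by shape alone and emits canned replies, with nothing in it that represents what any symbol means.

```python
# A minimal "Chinese Room" rule-follower: pure shape-matching over strings.
# The rulebook pairs are hypothetical stand-ins; the point is only that the
# program consults the form of the symbols, never their meaning.

RULEBOOK = {
    "你好吗": "我很好",    # hypothetical rule: this squiggle -> that squiggle
    "你会说中文吗": "会",  # hypothetical rule
}

def chinese_room(symbols: str) -> str:
    # Lookup by literal string identity: no semantics anywhere in the system.
    return RULEBOOK.get(symbols, "请再说一遍")  # default reply if no rule matches

if __name__ == "__main__":
    print(chinese_room("你好吗"))  # the room answers, yet nothing understands
```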
► Chinese Room refutes strong AI not weak AI
The Chinese Room argument refutes strong AI by showing that running a program is not enough to generate any real understanding or intentionality. But it doesn't refute weak AI, which simply uses computers as tools to study the mind.
► Read what they've written
There are lots of cognitive scientists who are adherents of strong AI: just read what people have written. They have identified themselves in their published commentaries on the Chinese Room argument.
► Only minds are intrinsically intentional
Mental states alone are intrinsically intentional. Computational systems are only intentional relative to some observer who treats them as if they had intentional states.
► 3rd axiom is a logical, not an empirical, truth
That syntax is not sufficient for semantics is a logical truth, not an empirical question. To see this, notice that its converse raises inconsistencies (described in the detailed text).
► Luminous Room isn't the same as Chinese Room
The luminous room argument exploits a problematic analogy between syntax and electromagnetism. The problem with this analogy is set out in the detailed text.
► Understanding arises from right causal powers
Systems capable of semantic understanding and intentionality must have at least the same causal powers as brains. Brains have sufficient causal powers to produce understanding; it's an open empirical question whether other materials (e.g. silicon) do.
► Biological Naturalism
Consciousness and intentionality are caused by and realised in the brain. The brain has the right causal powers to produce intentionality.
► Intentionality is both abstract and biological
Jacquette fails to recognise that intentionality can be both biological and abstract. Jacquette's position is a result of a lingering dualism that disembodies the abstract features of intentionality.
► Chinese Room doesn't assume locus of control
Searle gives examples of programs (beer-can systems, water pipes, etc.) that don't need a central locus of control. His only Chinese Room assumptions are that programs are syntactical, that syntax is not sufficient for semantics, and that minds have semantics.
► No level of formal program can produce semantics
Searle accepts that a program could be a mind if implemented in a medium with the right causal powers. The key point, though, is that no program, no matter how low-level, could ever produce semantics by virtue of syntactic symbol manipulations alone.
► Chinese Water Pipe Brain Simulator
The Chinese Room man is given a new rulebook noting which valves to switch on/off in a vast system of waterpipes in response to Chinese inputs. The pipes simulate the neural properties of a Chinese speaker's brain, but the man doesn't understand Chinese.
► Causal powers maintained if simulation low-level
The person whose neurotransmitters have been replaced still understands Chinese because she still has the right causal powers; she just needs some help from the demon.
► Knowing how the robot works
If a robot looked and behaved like a human, it would be rational to explain its actions in terms of intentionality. But if we knew it acted on the basis of formal symbol manipulations, we'd no longer appeal to intentionality to explain its behaviour.
► Conscious agents can instantiate computer programs
Fodor's strengthened notion of instantiation doesn't rule out the possibility of a conscious agent performing the symbol manipulations. The causal connections Fodor asks for are supplied by the man in the room.
► The many mansions reply trivialises strong AI
It's entirely possible that someday some device will possess intentionality, but that's irrelevant, because the Chinese Room argument is meant to refute the more specific thesis that formal computational processes can produce intentionality.
► Chinese Room can be placed inside the robot
Merely putting a computer program in a robot is inadequate, because we can put the Chinese Room inside a robot and apply the same thought experiment as before.
► Symbols and causal connections insufficient
Symbols and causal connections aren't sufficient to produce meaning (or intentionality). If there's meaning in a system, it can only be because there's more to the system than just the fact of a symbol and the fact of a causal connection.
► The Internalisation Reply
Suppose the man in the Room memorizes the rulebook and all the symbols, and does the matching in his head rather than on paper. He incorporates the entire Room system, but still doesn't understand Chinese. So the whole system doesn't understand Chinese.
► The Chinese Room passes the test
The Chinese Room thought experiment involves a system that passes the Turing test by speaking Chinese but fails to understand Chinese.
► Simulations are not duplications
A simulation of a fire will not burn. A simulation of metabolism in digestion will not nourish. A simulation of an automobile engine will not get anywhere. Similarly, a simulation of understanding is not actual understanding.
► Overt behavior doesn't demonstrate understanding
The Chinese Room argument shows that a system can engage in Chinese-speaking behavior yet fail to understand Chinese. The Chinese Room is a system that could pass the Turing Test without thinking in Chinese.
About
Created by: David Price
NodeID: #2769
Node type: Protagonist
Created on (GMT): 7/20/2007 6:03:00 PM
Last edited (GMT): 7/20/2007 6:03:00 PM
Incoming cross-relations: 0
Outgoing cross-relations: 34
Average rating: 0 by 0 users