TRANSCRIPT
Computing & Information Sciences, Kansas State University
CIS 530 / 730: Artificial Intelligence
Lecture 41 of 42
Wednesday, 10 December 2008
William H. Hsu
Department of Computing and Information Sciences, KSU
KSOL course page: http://snipurl.com/v9v3
Course web site: http://www.kddresearch.org/Courses/Fall-2008/CIS730
Instructor home page: http://www.cis.ksu.edu/~bhsu
Reading for Next Class:
Chapters 1–14, 22–23, 26, Russell & Norvig, 2nd edition
Philosophy of Mind; Discussion: Final Exam Review
PHILOSOPHY OF MIND
© 2006 Hilary Greaves, http://www.rci.rutgers.edu/~hgreaves/teaching/phil103/lectures.htm
The Two Central Problems in the Philosophy of Mind
The Problem of Other Minds: How can I know that other minds exist (with thoughts, feelings, sensory experiences, etc.)?
The Problem of Other Minds (cont'd)
How can I know what is going on in other minds? Is the other person afraid, or looking forward to next summer's holiday?
The Mind–Body Problem
How are minds and their contents related to the physical, chemical, & biological world?
Two (apparently unique) aspects of mental phenomena
Consciousness: Your mind is conscious. But as far as we know, ordinary bits of physical matter are not conscious. "What is consciousness?" You know!
Intentionality (or aboutness)
Mental states can be about other things in the world. Your thought that Bush is a jerk is about George Bush.
Intentionality/aboutness
Ordinary physical objects aren't about anything. The desk is not about the chair.
Intentionality/aboutness
(A painting can be about something...
Intentionality/aboutness
... but that's only because it was painted by someone who was thinking about the thing he was painting.)
Terminological note
'Intentionality' has no special connection to intentions. Bush's knowledge that Saddam Hussein is alive is about Saddam. Tom's fear of the dentist is about dentists. My desire that Australia will be fun is about Australia. Your intention to eat dinner this evening is about your dinner. All of these are mental states that have 'aboutness' (= 'intentionality').
Intentionality/aboutness
The challenge: an adequate explanation of what minds are should explain how mental goings-on can be about things, since ordinary physical goings-on do not have this feature.
Why explaining 'aboutness' is difficult
Even for thoughts about things that exist: there's no such thing as 'intentional string' connecting a thought to its object.
The asymmetry problem: aboutness can't be resemblance (resemblance is symmetric; a thought is about Bush, but Bush is not about the thought).
An attempt to explain what 'aboutness' is
My thought is about Bush = my thought was 'caused in the right sort of way' by Bush (via a causal chain running from Bush to the thought).
Why the causal account doesn't work
We can also think about things that don't exist: "It would be fun to ride a unicorn." What causal chain could run from unicorns to that thought?
Cartesian Dualism
Minds and Matter are two fundamentally different kinds of "substances":
Mental stuff (minds): thoughts, feelings, sensory experiences, etc.; consciousness, intentionality, thought.
Physical stuff (matter).
Problems with Cartesian Dualism (I)
It's mysterious what the "mind substance" is. We know only what it's not:
not made of matter, not located in space, not 'extended', ...
Problems with Cartesian Dualism (II)
It's a mystery how minds can interact causally with things made of matter.
Mental causing physical: the thought "I'm going to raise my arm" (mental substance) causes the arm's movement (matter).
Problems with Cartesian Dualism (II), cont'd
Physical causing mental: sound waves cause a sound experience.
Problems with Cartesian Dualism (III)
Are dualism and gradual evolution consistent?
Problems with Cartesian Dualism (IV)
It makes the Other Minds problem very hard to solve.
"Materialist" solutions to the Mind–Body Problem
Materialism: the claim that everything in the universe is made up of matter.
The advance of materialism
The advance of science has made materialism look very plausible to many philosophers & scientists.
Astronomy: the heavenly bodies are made of matter and obey the laws of physics; the "heavens" are not different in kind from the earth.
The advance of materialism
Evolutionary biology: no non-material processes or forces (e.g. God) are needed to explain the design in the biological world.
The advance of materialism
Advances in physiology & understanding of the genetic code: there is no need for a "life force" ("élan vital") to explain the difference between a live tiger and a dead tiger.
The advance of materialism
Advances in understanding the way the brain works, helped by imaging technology, make it increasingly plausible that mental phenomena like thought and consciousness might be explained in terms of brain processes.
The "Reductive Materialism" hypothesis
The social sciences can be reduced to (i.e., explained by appeal to) psychology; psychology can be reduced to biology; biology can be reduced to chemistry; chemistry can be reduced to physics.
Behaviorism
The way we tell that (e.g.) you're in pain is by observing your behavior. The inspiration for behaviorism: maybe what it means to say you're in pain also involves your behavior.
'Behaviorism' in psychology
Methodological (or: psychological) behaviorists included B. F. Skinner.
Methodological behaviorism: if there is anything more to mental states than dispositions to behave in certain ways, that 'extra bit' has nothing to do with science.
'Behaviorism' in philosophy
Analytical (or: philosophical) behaviorists included Ludwig Wittgenstein and Gilbert Ryle: there is nothing more. "I am feeling angry" just equals the angry behavior.
Analytical behaviorism
Analytical behaviorism: claims about mental states and processes can be "translated" into claims about patterns of behavior.
Example: "Tom has a toothache" = "Tom moans; Tom says his tooth hurts; if someone touches Tom's tooth, Tom screams; etc."
Analytical behaviorism
"Jenny is hungry" = ?? "Jason wishes he could quit school" = ??
Behaviorism and the Problem of Other Minds
If "I am feeling angry" just equals the angry behavior, then other minds are no longer hidden: we can observe the behavior directly.
Problems for behaviorism (I): Undefinable mental states
Some mental states don't seem to be definable in this way: listening to Bob Dylan; thinking about how big the universe is.
Problems for behaviorism (II): Circular definitions
Behaviorist "definitions" of mental state terms all turned out to be circular (or just plain wrong).
'Tom believes it will rain today' = 'Tom will either stay at home or drive to school today' IF Tom wants to stay dry, and Tom doesn't believe there's a shelter at the bus stop, and ... (the definition appeals to further mental states).
Problems for behaviorism (II): Circular definitions, cont'd
"James believes the exam will be hard" = ...??
Problems for behaviorism (III): The 'inverted spectrum' problem
It seems possible for one person to have their spectrum of color experiences 'inverted' relative to another's. But according to behaviorism, this is not possible: both people say "The flower looks yellow."
Problems for behaviorism (IV): Some wrong predictions
For behaviorists, two people who behaved in just the same way would have the same mental states. But there are cases in which this is clearly crazy.
Dennett's thought experiment: curare plus "amnestic".
Case 1: administer general anaesthetic, perform the operation, then ask: "How did it feel?"
Problems for behaviorism (IV): Some wrong predictions, cont'd
"Curare": paralyses all voluntary muscles.
"Amnestic": has no effect until 2 hours after ingestion, whereupon it wipes out the memory of those two hours.
Case 2: administer curare + amnestic, perform the operation, then ask: "How did it feel?"
... The problem (an argument against behaviorism):
(P1) Behaviorism predicts that the patient who is given general anaesthetic has the same experiences as the patient who is given curare + amnestic.
(P2) But that’s wrong! One is unconscious, and the other is in excruciating pain.
Therefore,
(C) Behaviorism is false.
The (Type-Type) Identity Theory
The type–token distinction: Greaves's belief that snow is white & your belief that snow is white are two token beliefs of one and the same type (the belief that snow is white).
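The type–token distinction maps onto a familiar programming distinction. As a rough, hypothetical illustration (the class and names below are mine, not the lecture's): a class plays the role of a type, and each instance is a distinct token of it.

```python
# Hypothetical sketch: a class as a "type" of belief, instances as "tokens".
class Belief:
    """A token mental state with a believer and a content."""
    def __init__(self, believer, content):
        self.believer = believer
        self.content = content

greaves_belief = Belief("Greaves", "snow is white")
your_belief = Belief("you", "snow is white")

# Two distinct tokens...
two_tokens = greaves_belief is not your_belief
# ...of one and the same type (same content).
same_type = greaves_belief.content == your_belief.content
```

Here `two_tokens` and `same_type` are both true: the beliefs are numerically distinct objects, yet share their content, which is what the slide's two "Snow is white" thought bubbles depict.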
The (Type-Type) Identity Theory
The type-type identity theory: mental state types are identical with brain state types.
Examples of mental state types: the belief that snow is white; a burning pain in the index finger; the thought that 17 is a prime number.
Example of a type-type identification: "Pain is c-fibers firing" (pain and c-fiber firing are the same event type).
How the theory deals with the Other Minds Problem
..??
Problem for the (Type) Identity Theory: "Chauvinism" about the mental
According to the type-type identity theory:
Animals with brains significantly different from ours can't feel pain or have other mental states.
If there are organisms in the universe whose chemical composition is different from ours, they can't feel pain or pleasure, and they can't think. Extra-terrestrials can't even think about math (e.g., that 7 × 5 = 35).
Functionalism (aka "machine functionalism")
The emergence of computers and the computer model of the mind. Computers are symbol manipulators; programs specify how the symbols are to be manipulated (e.g., an ADDING PROGRAM that takes the inputs 2 and 3 and outputs 5).
Programs and physical devices
Many different sorts of physical devices can run the same program...
Minds and programs
Minds are to brains as programs are to computers: "The mind is the brain's program."
Computer: inputs (mouse clicks, keyboard strokes) → PROGRAM → outputs (screen displays, printouts).
Mind: inputs (light rays, hammers hitting thumb) → PROGRAM → outputs (body movements, screams, sentences spoken).
'Functional states'
Mental state concepts are functional concepts; mental states are functional states (not physical states).
'Functional states' for the adding program:
'ready to do an add calculation' → Input 2 → 'remember that a 2 has been input, and get ready to add a 2nd number' → Input 3 → Output 5.
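The slide's two computer states can be written out as an explicit finite-state machine. This is a minimal, hypothetical sketch: the state descriptions follow the slide, but the function and variable names are illustrative.

```python
# A tiny finite-state sketch of the slide's adding program. Each "functional
# state" is identified purely by its role in mediating inputs and outputs.
def adding_machine(outputs):
    state = "ready"              # 'ready to do an add calculation'
    first = None
    while True:
        n = yield                # receive the next input number
        if state == "ready":
            first = n            # 'remember that a number has been input,
            state = "have_first" #  and get ready to add a 2nd number'
        else:                    # state == "have_first"
            outputs.append(first + n)  # emit the sum as output
            state = "ready"      # back to 'ready to do an add calculation'

results = []
machine = adding_machine(results)
next(machine)                    # advance the generator to its first yield
machine.send(2)                  # input 2
machine.send(3)                  # input 3; the machine outputs 5
```

After the two `send` calls, `results` holds `[5]` and the machine is back in its 'ready' state, so further pairs of inputs would be summed the same way. The point of the sketch: nothing about the state names depends on what the machine is physically made of.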
'Functional states', cont'd
Input: see that it's raining → belief that it's raining.
The belief that it's raining, together with the desire to stay dry and the belief that there is no shelter at the bus stop, yields the output: look for car keys.
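The same functional-role idea applies to the belief/desire diagram: what makes a state the belief that it's raining is how it mediates between input, other states, and output, not what it is made of. A hypothetical sketch (all names below are illustrative, not the lecture's):

```python
# Functionalist sketch of the slide's diagram: the output is fixed by the
# joint causal roles of the internal states.
def respond(perceptual_inputs):
    # Background states from the slide.
    states = {"desire: stay dry",
              "belief: no shelter at the bus stop"}
    # Input causes a belief.
    if "see that it's raining" in perceptual_inputs:
        states.add("belief: it's raining")
    # The three states jointly cause the output behavior.
    if {"belief: it's raining", "desire: stay dry",
        "belief: no shelter at the bus stop"} <= states:
        return "look for car keys"
    return "no action"
```

Calling `respond({"see that it's raining"})` yields `"look for car keys"`; without that input the rain belief never forms, so the result is `"no action"`. Any system with the same input-state-output organization would, on this view, count as having the same mental states.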
What is...
... the functionalist solution to the Other Minds Problem?
... the functionalist solution to the chauvinism problem? (A human's pain and a differently-built creature's pain can be one and the same functional state.)
![Page 51: Computing & Information Sciences Kansas State University Wednesday, 10 Dec 2008CIS 530 / 730: Artificial Intelligence Lecture 41 of 42 Wednesday, 10 December](https://reader036.vdocuments.mx/reader036/viewer/2022081603/56649e855503460f94b884e3/html5/thumbnails/51.jpg)
Computing & Information SciencesKansas State University
Wednesday, 10 Dec 2008CIS 530 / 730: Artificial Intelligence
Strong AIStrong AI
A radical (?) implication of functionalism: “Strong AI”. It is possible to build artificial minds with real mental states.
A computer running the same program that your brain is running would have the same mental states that you have.
It would be conscious, and thus feel pains and pleasures, have emotions, etc.
It would have thoughts with real intentionality.
Can machines think?
If they can, then the fact that functionalism predicts that they can counts in favor of the functionalist theory. If not, it counts as an objection to functionalism.
The Turing test: if a machine passes the Turing test, we cannot tell that it isn’t really thinking.
It is a further step to say that if a machine passes the Turing test, then it is thinking. But perhaps (?) this extra step is very plausible.
“What do you think about Saddam’s trial?”
“I don’t normally approve of the death penalty, but this guy deserves everything he gets.”
Searle’s Critique of Functionalism and Strong AI
Roger Schank’s project: Getting a computer to understand stories and answer questions about them the way people do.
A man went into a restaurant and ordered a hamburger. When the hamburger arrived it was burnt to a crisp, and the man stormed out of the restaurant angrily, without paying for the burger or leaving a tip.
Did the man eat the hamburger?
No
What Schank’s computer is doing
It’s manipulating symbols. Does this mean that it understands what the symbols mean (i.e. understands the story, and understands its replies to the questions)?
[Diagram: a person thinking of a restaurant scene; the thought’s ‘aboutness’]
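One way to make the worry concrete (a hypothetical sketch, not Schank’s actual system): a program can answer the restaurant question by bare symbol matching, with no internal state that is ‘about’ restaurants or hamburgers at all.

```python
# Hypothetical illustration of pure symbol manipulation: a lookup table
# maps question strings to answer strings. Nothing in the program
# represents hamburgers, restaurants, or anger.
STORY_RULES = {
    "Did the man eat the hamburger?": "No",
    "Did the man pay for the hamburger?": "No",
}

def answer(question):
    # Symbols in, symbols out: no intentional state involved.
    return STORY_RULES.get(question, "I don't know")

print(answer("Did the man eat the hamburger?"))  # prints No
```

Schank’s program was far more sophisticated than a lookup table, but the philosophical question is the same: does any amount of symbol manipulation, however sophisticated, add up to understanding?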
Searle’s argument
Understanding is an ‘intentional state’ – when you understand a story, you understand what it is about.
Searle’s going to argue that the computer could be manipulating symbols in all the right ways, without understanding the story (= without having ‘intentionality’).
The "Chinese room argument": An argument that passing the Turing test is not sufficient or thinking.
The setup of the “Chinese room”
[Diagram: Chinese symbols passed in as input; Chinese symbols passed out as output]
Searle’s argument
(P1) Neither Searle nor any other part of the Chinese Room really understands Chinese.
Therefore,
(C1) The Chinese Room [i.e. the system] does not understand Chinese. (From (P1))
(P2) But the Chinese Room perfectly simulates someone who does understand Chinese.
Therefore,
(C2) Simulating understanding is not sufficient for having understanding. (From (P2), (C1))
Therefore,
(C3) Even if Schank’s computer perfectly simulates human understanding of stories, it does not follow that Schank’s computer really understands stories.
Searle’s Account of Intentionality
It is a “causal product” of the right kind of biological system.
[Diagram: ADDING PROGRAM takes inputs 2 and 3 and outputs 5; real ‘adding’ is not a “causal product” of the symbol manipulation]
Searle’s Account of Intentionality
[Diagram: pick up brick and throw it at window; the window breaks. The breaking is a causal product of the brick being thrown]
Searle’s Account of Intentionality
It cannot be created simply by symbol manipulation.
Searle makes the same claims for consciousness.
A Problem for Searle’s View
Either intentionality and consciousness are restricted to brains like ours, in which case he is committed to chauvinism.
Or brains quite different from ours can also produce intentionality & consciousness, in which case the Other Minds Problem looks to be unsolvable, since we can’t tell which brains just simulate consciousness & intentionality and which really have it.
Some questions for YOU to ponder
Can a suitably sophisticated computer which is NOT made out of “meat” like the human brain:
have real intentionality (and thus real thoughts)?
have real consciousness (feel real pain & pleasure, and know what it is like to experience colors & tastes)?
Does it matter? If so, why?
Some questions for YOU to ponder
Can an extra-terrestrial with a suitably sophisticated brain that is very different from our brain:
have real intentionality (and thus real thoughts)?
have real consciousness (feel real pain & pleasure, and know what it is like to experience colors & tastes)?
How can we KNOW whether the computer or the extra-terrestrial has REAL consciousness and REAL intentionality?