
The Symbolic vs Subsymbolic Debate

H. Bowman

(CCNCS, Kent)

Disclaimer

• Serious simplification of a complex debate.

• Emphasize extreme positions to clarify basic points of controversy.

• What I present is not necessarily what I personally believe!!

The Mind-Body Problem

Subsymbolic:
• inspired by neurobiology
• how cognition emerges from neurobiology

("I am a Cartesian Dualist!")

Symbolic:
• putative characteristics of cognition
• information processing metaphor

The Symbolic Tradition

The Computer Metaphor

• Takes inspiration from:
  – programming languages & computational logic
    • data structures & knowledge representation
    • link to programming languages such as Lisp & Prolog
  – computer architectures
    • Von Neumann architectures: centralized processing
  – computability theory
    • the Church–Turing hypothesis

Key Assumptions

• Symbols are available to the cognitive system

• Symbol processing engine, with characteristics:
  – systematic, i.e. combinatorial symbol systems and compositionality
  – recursive knowledge structures

Syntax: Grammars 1

Sentence:        S  ::= NP VP
Noun phrase:     NP ::= det AL N | N
Verb phrase:     VP ::= V NP
Adjective list:  AL ::= A AL | A    (recursion)

A set of "rules"

S, NP, VP, AL : molecules
det, N, V, A  : atoms
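To make the "set of rules" concrete, here is a minimal sketch of the same grammar written as data. The dictionary encoding, the ATOMS set and the is_molecule helper are my own illustration, not part of the slides:

```python
# The grammar rules from the slide, written as a Python dictionary.
# Each non-terminal ("molecule") maps to its alternative expansions;
# the terminals ("atoms") are det, N, V and A.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["det", "AL", "N"], ["N"]],
    "VP": [["V", "NP"]],
    "AL": [["A", "AL"], ["A"]],   # recursive rule: an adjective list may contain another adjective list
}

ATOMS = {"det", "N", "V", "A"}

def is_molecule(category: str) -> bool:
    """A category is a molecule if the grammar can expand it further."""
    return category in GRAMMAR
```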

Syntax: Grammars 2

Parse tree for "the happy boy eats ice cream":

S
├─ NP
│   ├─ det: the
│   ├─ AL
│   │   └─ A: happy
│   └─ N: boy
└─ VP
    ├─ V: eats
    └─ NP
        └─ N: ice cream

A Tree Data Structure
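The parse tree is exactly the kind of recursive, compositional data structure the symbolic tradition assumes. A minimal sketch of how it might be represented; the Node class and the way the example tree is built are hypothetical illustrations, not taken from the slides:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of a parse tree: a syntactic category plus either a word (leaf) or child nodes."""
    category: str
    word: str | None = None
    children: list[Node] = field(default_factory=list)

# Parse tree for "the happy boy eats ice cream", following the grammar above.
tree = Node("S", children=[
    Node("NP", children=[
        Node("det", word="the"),
        Node("AL", children=[Node("A", word="happy")]),
        Node("N", word="boy"),
    ]),
    Node("VP", children=[
        Node("V", word="eats"),
        Node("NP", children=[Node("N", word="ice cream")]),
    ]),
])
```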

Compositionality

• Plug constituents in according to rules
• Structure of expressions indicates how they should be interpreted

• Semantic Compositionality: "the semantic content of a (molecular) representation is a function of the semantic contents of its syntactic parts, together with its constituent structure" [Fodor & Pylyshyn, 88]

Compositionality in Semantics 1

• Meanings of items plugged in as defined by syntax

M[ John loves Jane ] = M[ John ] + M[ loves ] + M[ Jane ]
(composed together appropriately)

M[ X ] denotes the meaning of X

Compositionality in Semantics 2

• Same example in more detail

M[ John loves Jane ]
=
         M[ loves ]
        /          \
 M[ John ]      M[ Jane ]

Compositionality in Semantics 3

• Meanings of atoms constant across different compositions

M[ Jane loves John ]
=
         M[ loves ]
        /          \
 M[ Jane ]      M[ John ]

Compositionality in Semantics 4

• Also, meanings of molecules constant across different compositions

M[ Jane loves John and Jane hates James ]
=
                    M[ and ]
                   /        \
M[ Jane loves John ]      M[ Jane hates James ]

Compositionality in Semantics 5

• The same molecular meanings recombine unchanged in a different composition:

M[ Jane hates James and Jane loves John ]
=
                     M[ and ]
                    /        \
M[ Jane hates James ]      M[ Jane loves John ]
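As a toy illustration of these compositionality principles: atom meanings stay fixed, and molecular meanings recombine unchanged across compositions. The FACTS set and the particular denotations (M_name, M_verb, M_clause, M_and) are invented for this sketch, not part of the slides:

```python
# Toy compositional semantics: the meaning of a clause is built only from the
# meanings of its parts plus their structure, so M[ Jane loves John ] is the
# same wherever that clause appears.
FACTS = {("Jane", "loves", "John"), ("Jane", "hates", "James")}

def M_name(name):
    """Meaning of a name: the individual it denotes."""
    return name

def M_verb(verb):
    """Meaning of a transitive verb: a function from (subject, object) to a truth value."""
    return lambda subj, obj: (subj, verb, obj) in FACTS

def M_clause(subj, verb, obj):
    """Compose the constituent meanings according to the clause's structure."""
    return M_verb(verb)(M_name(subj), M_name(obj))

def M_and(p, q):
    """Meaning of 'and': truth-functional conjunction."""
    return p and q

# The same molecular meanings, composed in two different orders:
print(M_and(M_clause("Jane", "loves", "John"), M_clause("Jane", "hates", "James")))  # True
print(M_and(M_clause("Jane", "hates", "James"), M_clause("Jane", "loves", "John")))  # True
```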

Caveat

• Compositionality is of course not absolute, e.g. idioms: "kicked the bucket"

Compositionality: non-linguistic examples

• Not just an issue for language

• Reasoning / planning / deductive thought

• Representation of knowledge:
  – hierarchical / superordinate structures

[Figure: representation of visual objects, from Marr's theory of object recognition]

[Figure: whole-part hierarchies, a large "S" composed of smaller "S"s]

Production System Architectures of the Mind

• Most detailed and complete realisation of symbolic tradition, e.g.
  – SOAR (Unified Theories of Cognition) [Newell]
  – ACT-R [Anderson]
  – EPIC [Kieras]
• GOFAI (Good Old Fashioned Artificial Intelligence)
  – based upon expert systems technology

(a toy recognise-act cycle is sketched below)
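The production-system idea itself fits in a few lines: a working memory of facts plus condition-action rules, run in a recognise-act cycle. The rules, facts and first-match conflict resolution below are a hypothetical toy, not the syntax of SOAR, ACT-R or EPIC:

```python
# Working memory holds facts; each production is a (condition, action) pair.
working_memory = {"goal: make tea", "have kettle"}

productions = [
    (lambda wm: "goal: make tea" in wm and "have kettle" in wm and "water boiled" not in wm,
     lambda wm: wm.add("water boiled")),
    (lambda wm: "water boiled" in wm and "tea made" not in wm,
     lambda wm: wm.add("tea made")),
]

# Recognise-act cycle: fire the first rule whose condition matches, repeat until nothing fires.
fired = True
while fired:
    fired = False
    for condition, action in productions:
        if condition(working_memory):   # match
            action(working_memory)      # act
            fired = True
            break                       # select: simple first-match conflict resolution

print(working_memory)  # now also contains "water boiled" and "tea made"
```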

Symbol Systems and Nature vs Nurture

• Learning theories of symbolic architectures are rather limited
  – although chunking-based theories do exist
• Where does the symbolic processing engine come from?

THEREFORE
• Evolutionary explanations, e.g.
  – Chomsky's Universal Grammar

Symbol Systems and the Brain

• For symbolists, the algorithmic / specification levels are critical; the implementation level is insignificant (using Marr's terminology)

• “… for a [Symbolist], neurons implement all cognitive processes in precisely the same way, viz., by supporting the basic operations that are required for symbol-processing … [i.e.] … all that is required is that you use your [neural] network to implement a Turing machine” [Fodor&Pylyshyn,88]

• A sort of compilation step.

• Computers used as an analogy, where software is the interesting thing and the hardware mapping is fixed and automatic.

• “… one should be deeply suspicious of the heroic sort of brain modelling that purports to address the problems of cognition. We sympathize with the craving for biologically respectable theories that many psychologists seem to feel. But, given a choice, truth is more important than respectability.” [Fodor&Pylyshyn,88]

The Sub-symbolic Tradition

Connectionism

• Inspiration from neurobiology

• Long tradition [at least from the 1950s], e.g. Hebb, Rosenblatt, Grossberg, Rumelhart, McClelland, O'Reilly.

• Nodes, links, activation, weights, learning algorithms

Activation in Classic Artificial Neural Network Model

[Diagram: node j receives inputs x_1, ..., x_n over weights w_{1j}, ..., w_{nj}; it integrates them into a net input and passes that through a sigmoidal function to produce its activation value (output) y_j]

integrate (weighted sum):   $\eta_j = \sum_i w_{ij} x_i$

sigmoidal activation:       $y_j = \frac{1}{1 + e^{-\eta_j}}$

Sigmoidal Activation Function

[Plot: activation (y) as a function of net input (η), with η from -4 to 4 and y from 0 to 1, tracing the sigmoid $y_j = \frac{1}{1 + e^{-\eta_j}}$]

• Responsive around net input of 0
• Unresponsive at extreme net inputs
• Threshold: unresponsive at low net inputs
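A minimal sketch in Python of the node just described: integration as a weighted sum, then the sigmoid. The weights and inputs are arbitrary illustrative values:

```python
import math

def net_input(weights, inputs):
    """Weighted sum of the inputs: eta_j = sum_i w_ij * x_i."""
    return sum(w * x for w, x in zip(weights, inputs))

def sigmoid(eta):
    """Logistic activation: y_j = 1 / (1 + exp(-eta_j))."""
    return 1.0 / (1.0 + math.exp(-eta))

# Example node with three inputs.
weights = [0.5, -1.2, 0.8]
inputs  = [1.0, 0.5, 0.0]
eta = net_input(weights, inputs)     # -0.1
print(sigmoid(eta))                  # ~0.475: responsive, since the net input is near 0
print(sigmoid(-4.0), sigmoid(4.0))   # ~0.018 and ~0.982: saturated at extreme net inputs
```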

Connectionism and Nature vs Nurture

• Powerful learning algorithms (two styles are sketched below):
  – directed weight adjustment
  – extracting regularities (Hebbian learning)
  – supervised learning (back-propagation)

• Typically ascribe more to learning than to evolution
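A rough sketch of the two styles of weight adjustment mentioned above: a simple Hebbian rule, and the single-layer delta rule that back-propagation generalises to multi-layer networks. The learning rate and example vectors are illustrative only:

```python
def hebbian_update(weights, inputs, output, lr=0.1):
    """Hebbian learning: strengthen each w_i in proportion to the co-activity of input x_i and output y."""
    return [w + lr * x * output for w, x in zip(weights, inputs)]

def delta_update(weights, inputs, output, target, lr=0.1):
    """Error-driven (supervised) update: the delta rule, which back-propagation extends to hidden layers."""
    error = target - output
    return [w + lr * error * x for w, x in zip(weights, inputs)]

# Example: nudge the node's weights towards producing a target output of 1.0.
print(delta_update([0.5, -1.2, 0.8], [1.0, 0.5, 0.0], output=0.475, target=1.0))
```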

Example Connectionist Model

• Word reading as an example
  – orthography to phonology

• Words of four letters or less

• Need to represent the order of letters, otherwise e.g. slot and lots would get the same representation

• Slot coding

A (Rather Naïve) Reading Model

[Network diagram:
  ORTHOGRAPHY layer, slot-coded letter units: A.1, B.1, ..., Z.1 (SLOT 1) | A.2, ..., Z.2 | A.3, ..., Z.3 | A.4, ..., Z.4
  PHONOLOGY layer, slot-coded phoneme units: /p/.1, /b/.1, ..., /u/.1 | ... | /p/.4, /b/.4, ..., /u/.4
  connections map orthographic slots to phonological slots]
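A minimal sketch of slot coding as described above: one unit per (letter, position) pair, so anagrams such as slot and lots produce different patterns. The vector encoding below is my own illustration of the scheme, not the actual model:

```python
import string

LETTERS = string.ascii_uppercase   # A..Z
N_SLOTS = 4                        # words of four letters or less

def slot_code(word):
    """One unit per (letter, position): the 'A.1' unit is on if the first letter is A, and so on."""
    word = word.upper()
    vec = [0] * (len(LETTERS) * N_SLOTS)
    for slot, letter in enumerate(word[:N_SLOTS]):
        vec[slot * len(LETTERS) + LETTERS.index(letter)] = 1
    return vec

# 'slot' and 'lots' contain the same letters but switch on different units,
# so the network can tell them apart.
print(slot_code("slot") == slot_code("lots"))   # False
```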

Connectionism & Compositionality

• Highly non-compositional, e.g.
  – the a in ant and the a in cat get completely unrelated representations
  – there is no sense in which constituent representations are "plugged in"
  – representations are maximally affected by context

• Same argument would generalise to semantic compositionality

• Alternative connectionist models do better, e.g. activation gradient models, but it is not clear that any model is truly systematic in the sense of symbolic processing

Spectrum of Approaches

Approaches lie on a spectrum between the unsystematic / subsymbolic and the systematic / symbolic, including:

• Distributed Representations / Back-propagation, e.g. [Seidenberg]
• Localist / Competitive Learning, e.g. the IA model [McClelland]
• Localist Models with Serial Order, e.g. SOLAR [Davis]
• Centralised Production System Architectures, e.g. SOAR [Newell]
• Parallel Production Systems, e.g. EPIC [Kieras]
• Hybrid Approaches, e.g. [Gabbay]
• (Symbolic) Distributed Control, e.g. Actors [Hewitt], Agents [Kokinov], ICS [Barnard], Society of Mind [Minsky]

Possible Topics 1

• Introduction to connectionism
  – O'Reilly & Munakata, 2000
• Production system architectures
  – ACT-R [Anderson, 93]
• Connectionism: strengths and weaknesses
  – Fodor & Pylyshyn, 88
  – McClelland, 92 and 95
• Symbolic-like connectionism
  – Hinton, 90

Possible Topics 2

• Past tense debate
  – Pinker et al., 2002
• Localist vs distributed debate
  – Bowers, 2002 and Page, 2000
• Dual process theory: system 1 (neural), system 2 (symbolic)
  – Evans, 2003

References

• Anderson, J. R. (1993). Rules of the Mind. Hillsdale, NJ: Erlbaum.
• Bowers, J. S. (2002). Challenging the widespread assumption that connectionism and distributed representations go hand-in-hand. Cognitive Psychology, 45, 413-445.
• Evans, J. S. B. T. (2003). In Two Minds: Dual Process Accounts of Reasoning. Trends in Cognitive Sciences, 7(10), 454-459.
• Fodor, J. A., & Pylyshyn, Z. W. (1988). Connectionism and Cognitive Architecture: A Critical Analysis. Cognition, 28, 3-71.
• Hinton, G. E. (Ed.) (1990). Special Issue on Connectionist Symbol Processing. Artificial Intelligence, 46(1-4).
• McClelland, J. L. (1992). Can Connectionist Models Discover the Structure of Natural Language? In R. Morelli, W. Miller Brown, D. Anselmi, K. Haberlandt & D. Lloyd (Eds.), Minds, Brains and Computers: Perspectives in Cognitive Science and Artificial Intelligence (pp. 168-189). Norwood, NJ: Ablex Publishing Company.
• McClelland, J. L. (1995). A Connectionist Perspective on Knowledge and Development. In J. J. Simon & G. S. Halford (Eds.), Developing Cognitive Competence: New Approaches to Process Modelling (pp. 157-204). Mahwah, NJ: Lawrence Erlbaum.
• O'Reilly, R. C., & Munakata, Y. (2000). Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. MIT Press.
• Page, M. P. A. (2000). Connectionist Modelling in Psychology: A Localist Manifesto. Behavioral and Brain Sciences, 23, 443-512.
• Pinker, S., Ullman, M. T., McClelland, J. L., & Patterson, K. (2002). The Past-Tense Debate (series of opinion articles). Trends in Cognitive Sciences, 6(11), 456-474.
