
End of the Revolution
FEBRUARY 28, 2002

John R. Searle

New Horizons in the Study of Language and Mind by Noam Chomsky
Cambridge University Press, 230 pp., $60.00; $20.00 (paper)

1.

Almost three decades ago I reviewed in these pages a striking development in the study of language that I called “Chomsky’s Revolution in Linguistics.”1 After such a long time it would seem appropriate to assess the results of the revolution. This article is not by itself such an assessment, because to do an adequate job one would require more knowledge of what happened in linguistics in these years than I have, and certainly more than is exhibited by Chomsky’s new book. But this much at least we can say. Judged by the objectives stated in the original manifestoes, the revolution has not succeeded. Something else may have succeeded, or may eventually succeed, but the goals of the original revolution have been altered and in a sense abandoned. I think Chomsky would say that this shows not a failure of the original project but a redefinition of its goals in ways dictated by new discoveries, and that such redefinitions are typical of ongoing scientific research projects.

The research project of the revolution was to work out for each natural language a set of syntactical rules that could “generate” all the sentences of that language. The sense in which the rules could generate the infinite number of sentences of the language is that any speaker, or even a machine, that followed the rules would produce sentences of the language, and if the rules are complete, could produce the potentially infinite number of its sentences. The rules require no interpretation and they do more than just generate patterns. Applied mechanically, they are capable of generating the infinite number of sentences of the language.

Syntax was regarded as the heart of linguistics and the project was supposed to transform linguistics into a rigorous science. A “grammar,” in the technical sense used by linguists, is a theory of a language, and such theories were called “generative grammars.”

Stated informally, some rules of English are that a sentence can be composed of a noun phrase plus a verb phrase, that a verb phrase can consist of a verb plus a noun phrase and that a noun phrase can be composed of a “determiner” plus a noun, that nouns can be “woman,” “man,” “ball,” “chair”…; verbs can be “see,” “hit,” “throw”…; determiners can be “the,” “a”…. Such rules can be represented formally in the theory as a set of instructions to rewrite a symbol on the left side as the symbols on the right side. Thus,

S → NP + VP

VP → V + NP

NP → Det + N

N → man, woman, ball…

V → hit, see, throw…

Det → a, the…

This small fragment of an English grammar would be able to generate, for example, the sentence

The man hit the ball.
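
Purely by way of illustration, and no part of the theory itself, here is a minimal sketch in Python of how such a fragment can be applied mechanically: a program that follows the six rules blindly, with no interpretation, produces sentences of the fragment. The names GRAMMAR and generate are my own.

import random

# The toy grammar fragment given above. Each symbol on the left rewrites
# as one of the sequences on the right; bare words are terminals.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "VP":  [["V", "NP"]],
    "NP":  [["Det", "N"]],
    "N":   [["man"], ["woman"], ["ball"]],
    "V":   [["hit"], ["see"], ["throw"]],
    "Det": [["the"], ["a"]],
}

def generate(symbol="S"):
    # A word has no rewrite rule, so it is returned as is.
    if symbol not in GRAMMAR:
        return [symbol]
    # Otherwise pick one rewrite for the symbol and expand each part in turn.
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(generate(part))
    return words

print(" ".join(generate()))   # one possible output: "the man hit the ball"

Applied mechanically and repeatedly, the six rules generate every sentence the fragment allows, including many that a fuller grammar would have to refine, since the sketch ignores tense and agreement.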

Such rules are sometimes called “rewrite rules” or “phrase structure rules” because they determine the elementary phrase structure of the sentence. Chomsky argued that such rules are inadequate to account for the complexities of actual human languages like English, because some sentences require that a rule apply to an element not just in virtue of its form, but in virtue of how it got that form, the history of how it was derived. Thus, for example, in the sentence

The chicken is ready to eat

even though the words are not ambiguous, the sentence as a whole is syntactically ambiguous depending on whether “chicken” is the subject or the object of “eat.” The sentence can mean either the chicken is ready to eat something, or the chicken is ready for something to eat it. To account for this ambiguity it seems, Chomsky argued, that we have to suppose that the sentence is the surface expression of two different underlying sentences. The sentence is the result of applying rules that transform two different underlying, or deep, structures. Such rules are called transformational rules, and Chomsky’s version of generative grammar was often called “transformational grammar” because of the argument for the necessity of transformational rules. In the classical versions of the theory, the phrase structure rules determined the “deep structure” of the sentence, the bearer of meaning; the transformational rules converted deep structure into surface structure, something that could be uttered. In the example of the chicken above, there is one surface structure, the sentence I have quoted, and two deep structures, one active, one passive.

It was a beautiful theory. But the effort to obtain sets of such rules that could generate all and only the sentences of a natural language failed. Why? I don’t know, though I will suggest some explanations later. But seen from outside a striking feature of the failure is that in Chomsky’s later work even the apparently most well-substantiated rules, such as the rule for forming passive sentences from active sentences, have been quietly given up. The relation between “John loves Mary” and “Mary is loved by John” seemed elegantly explained by a transformational rule that would convert the first into the second. Apparently nobody thinks that anymore.

Another feature of the early days was the conviction that human beings were born with an innate brain capacity to acquire natural human languages. This traditional view—it goes back at least to the seventeenth century—seemed inescapable, given that a normal infant will acquire a remarkably complex system of rules at a very early age with no systematic teaching and on the basis of impoverished and even defective stimuli. Small children pick up a highly competent knowledge of a language even though they get no formal instruction and the utterances they hear are limited and often not even grammatical.

The traditional objection to this “innateness hypothesis” (Chomsky always objected to this term, but it seems reasonable enough) was that languages were too various to be accounted for by a single brain mechanism. Chomsky’s answer was that the surface variety of languages concealed an underlying structure common to all human languages. This common structure is determined, he wrote, by an innate set of rules of Universal Grammar (UG). The innate mechanism in the brain that enables us to learn language is so constituted that it embodies the rules of UG; and those rules, according to Chomsky, are not rules we can consciously follow when we acquire or use language. I think the official reason for the abandonment of the research program was that the sheer complexity of the different rule systems for the different languages was hard to square with the idea that they are really all variations on a single underlying set of rules of UG.

There were, as might be expected, a number of objections to Chomsky’s proposals. I, for one, argued that the innate mechanism that enables the child to acquire language could not be “constituted” by—i.e., made up of—rules. There are no rules of universal grammar of the sort that Chomsky claimed. I argued this on a number of grounds, the chief being that no sense had been given to the idea that there is a set of rules that no one could possibly consciously follow: if you can’t follow them consciously then you can’t follow them unconsciously either. I also argued that, precisely to the extent that the mechanism was innate and applied automatically, it was empty to suppose that its application consisted in rule-governed behavior. No sense, I wrote, had been given to the idea of rules so deeply buried in unconscious brain processes that they were not even the sort of things that could be consciously followed.

Just as a child does not follow a rule of “Universal Visual Grammar” that prohibits it from seeing the infrared or ultraviolet parts of the electromagnetic spectrum, so the child does not follow rules of Universal Linguistic Grammar that prohibit it from acquiring certain sorts of languages but not others. The possibilities of vision and language are already built into the structure of the brain and the rest of the nervous system. Chomsky attempted to answer my arguments in a number of places, including the book under review. But in the case of UG he has given up the idea that there are rules of universal grammar.

In his recent book, as well as in other works (most importantly, The Minimalist Program2), Chomsky advances the following, much more radical, conception of language: the infant is indeed born with an innate language faculty, but it is not made up of any set of rules; rather it is an organ in the brain that operates according to certain principles. This organ is no longer thought of as a device for acquiring language, because in an important sense it does not so much acquire as produce any possible human language in an appropriate environment. Chomsky writes,

We can think of the initial state of the faculty of language as a fixed network connected to a switch box; the network is constituted of the principles of language, while the switches are the options to be determined by experience. When the switches are set one way, we have Swahili; when they are set another way, we have Japanese. Each possible human language is identified as a particular setting of the switches—a setting of parameters, in technical terminology. If the research program succeeds, we should be able literally to deduce Swahili from one choice of settings, Japanese from another, and so on through the languages that humans can acquire. [my italics]

According to this view, the possibility of all human languages is already in the human brain before birth. The child does not learn English, French, or Chinese; rather, its experiences of English set the switches for English and out comes English. Languages are neither learned nor acquired. In an important sense they are already in the “mind/brain” at birth.
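
To illustrate the switch-box picture with a deliberately toy example of my own (the head-direction parameter is a standard textbook illustration; nothing here is Chomsky’s actual formalism), one fixed principle plus one binary switch yields two word orders:

# Principle (fixed): a verb combines with its object to form a verb phrase.
# Parameter (set by experience): does the verb precede its object, as in
# English, or follow it, as it does in Japanese? Illustration only.
def verb_phrase(verb, obj, head_initial):
    return f"{verb} {obj}" if head_initial else f"{obj} {verb}"

print(verb_phrase("read", "the book", head_initial=True))    # "read the book" (English-like order)
print(verb_phrase("read", "the book", head_initial=False))   # "the book read" (Japanese-like verb-final order)

On this picture, acquiring a language is a matter of fixing a finite number of such switches in an innately given network, not of learning rules.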

What happens, then, to the rules of grammar? Chomsky writes that

This “Principles and Parameters” approach, as it has been called, rejected the concept of rule and grammatical construction entirely: there are no rules for forming relative clauses in Hindi, verb phrases in Swahili, passives in Japanese, and so on. The familiar grammatical constructions are taken to be taxonomic artifacts, useful for informal description perhaps but with no theoretical standing. They have something like the status of “terrestrial mammal” or “household pet.”

The overall conception of language that emerges is this: a language consists of a lexicon (a list of elements such as words) and a set of computational procedures. The computational procedures map strings of lexical elements onto a sound system at one end and a meaning system at the other. But the procedures themselves don’t represent anything; they are purely formal and syntactical. As Chomsky says,

The computational procedure maps an array of lexical choices into a pair of symbolic objects…. The elements of these symbolic objects can be called “phonetic” and “semantic” features, respectively, but we should bear in mind that all of this is pure syntax and completely internalist.
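
A cartoon of the picture (my own, with invented placeholder representations standing in for the “phonetic” and “semantic” features) may help fix the idea of a lexicon plus a purely formal mapping:

# A miniature "lexicon": each item pairs a word with placeholder sound and
# meaning features. The feature values here are invented stand-ins.
LEXICON = {
    "dog":   {"sound": "dog-sound",   "meaning": "DOG"},
    "barks": {"sound": "barks-sound", "meaning": "BARKS"},
}

def derive(words):
    # Map an array of lexical choices onto a pair of symbolic objects:
    # one read off by the sound system, one by the meaning system.
    # The mapping itself just shuffles symbols; it represents nothing.
    phonetic = [LEXICON[w]["sound"] for w in words]
    semantic = [LEXICON[w]["meaning"] for w in words]
    return phonetic, semantic

print(derive(["dog", "barks"]))
# -> (['dog-sound', 'barks-sound'], ['DOG', 'BARKS'])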

Chomsky is eager to emphasize that the principles and parameters approach is a tentative research project and not an established result, but it is pretty clear that he thinks the original project of thirty-five years ago has failed. For years he has told us that the interest of the study of language was that it was a “window on the mind” and that from it we could identify a great many of the mind’s properties. One of his favorites: the mind uses “structure-dependent” rules, for example the transformational rules I earlier described.3 Now he has given all that up. Language is a specific faculty with no general mental implications; and there are no rules, hence no structure-dependent rules. In an important sense there aren’t even any languages. All each person has is what he calls an “I-language,” “I” for internal, individual, and intensional.4 A neutral scientist, a “Martian scientist” in Chomsky’s thought experiment, “might reasonably conclude that there is a single human language, with differences only at the margins.”

What about words and their meanings? Well, Chomsky speculates, maybe all possible concepts are also in the brain and what we call learning the meaning of a word is really just learning a label for a concept we have always had. “However surprising the conclusion may be that nature has provided us with an innate stock of concepts, and that the child’s task is to discover their labels, the empirical facts appear to leave open few other possibilities.” So, to take two examples discussed by Chomsky, on this view every human child that ever lived had at birth the concepts of “bureaucrat” and “carburetor”; indeed children born to the cave men twenty thousand years ago had these concepts, and all of us would still have them even if carburetors had never been invented and bureaucrats had never existed. In the face of the sheer implausibility of this claim Chomsky likes to appeal to the example of the immune system. Nature has provided us with the capacity to produce a huge stock, literally millions, of antibodies, even antibodies against antigens that have been artificially synthesized. So why not a huge stock of innate concepts, ready for any word we could conceivably invent? On this view, the only part of language that depends on stored conventions is the sounds of the words used to label the innate concepts.

2.

To people who take the study of language seriously I think all this ought to seem more disquieting than it does to Chomsky. For all those years he was telling us that we had overwhelming evidence that speakers of a language were following certain types of rules, and that we even had evidence about which rules they were following. What happened to all that evidence? If the rules are all thrown out, what was the “evidence” evidence for?

Let us start with Chomsky’s idea of a neutral Martian scientist arriving on Earth and finding our languages an object of study for “natural science.” The point of imagining a Martian, he said, is to free us of our local prejudices. The scientist will find that we all speak the same language, except “at the margins,” and that the I-language with its variations is the proper object of study for natural science. Does that sound right to you? It doesn’t to me. First, any such scientist has to have a language of her, his, or its own. No language, no science. So the scientist’s first step is to compare our languages with her own. How are they like and unlike Martian? The only way I can imagine the scientist doing this is to imagine that she learns one of our languages, say English. She does that as anyone, scientist or otherwise, would, by figuring out how to translate her expressions into English and English expressions into Martian.


Let us suppose she is so good at it that soon she is bilingual. Then she will discover an interesting fact. Knowledge of English is not much use to her when she is confronted with monolingual Finnish speakers. For example she will eventually find out that the Finnish single-word sentence, “Juoksentelisinkohan,” appropriately pronounced, translates into English as “I wonder if I should run around a little bit without a particular destination.” So to learn Finnish she has to start all over again. And the same sequence repeats itself when she tries to converse in Arabic, Swahili, or Japanese. Is there really only one language on earth? Not in her experience.

Worse yet, she will soon discover that language is not an object of “natural” science and could not be. The distinction, rough as it is, between the so-called “natural” sciences and the “social” sciences is based on a more fundamental distinction in ontology, between those features of the world that exist independently of human attitudes, like force, mass, gravitational attraction, and photosynthesis, on the one hand, and, on the other, those whose existence depends on human attitudes, like money, property, marriage, and government. There is a distinction, to put it in very simple terms, between those features of the world that are observer-independent and those that are observer-relative or observer-dependent. Natural sciences like physics, chemistry, and biology are about features of nature that exist regardless of what we think; and social sciences like economics, political science, and sociology are about features of the world that are what they are because we think that is what they are.

Where, then, do language and linguistics fit in? I think it is obvious that a group of letters or sounds can be called a word or a sentence of English or Finnish only relative to the attitudes of English and Finnish speakers. You can see this quite clearly in the case of linguistic changes. Pick up a text of Chaucer and you will find sentences that are no longer a part of English, though they once were, and you can produce English sentences that were not part of Chaucerian English. Of course, Chomsky is right to insist that “English” is not a well-defined notion, that the word has all sorts of looseness both now and historically. I am a native English speaker, yet I cannot understand some currently spoken dialects of English. All the same, the point remains: a group of letters or sounds is a sentence, or a word, or other element of a language only relative to some set of users of the language.

The point has to be stated precisely. There is indeed an object of study for natural science, the human brain with its specific language components. But the actual languages that humans learn and speak are not in that way natural objects. They are creations of human beings. Analogously humans have a natural capacity to socialize and form social groups with other humans. But the actual social organizations they create, such as governments and corporations, are not natural, observer-independent phenomena; they are human creations and have an observer-dependent existence. As their speakers develop or disappear, languages change or die out.

There is a deep reason why languages like English or Finnish must be rule-governed. The sentences and other elements only exist as part of the language because we regard them as such. Language is in an important sense a matter of convention. But if so, there must be some principles by which we regard some strings as sentences of English and others not. Being a sentence of English is not a natural fact like being a mountain or a waterfall; it is relative to the observer. Functional phenomena that are relative to an observer divide into two kinds, those like knives, chairs, and tables, which can function as such because of their physical structure, and those like money, language, and government, which function the way they do because we assign to them a certain status and with that status a function that can only be performed because of the collective acceptance of the entities as having a certain status and with that status a function.5

The second class, the status functions, require systems of rules (conventions, accepted procedures, principles, etc.). Human languages, like money, property, marriage, baseball games, and government, are constituted by certain sorts of rules that, years ago, I baptized “constitutive rules.”6 Such rules do not merely regulate existing activities, like the rules of driving, but they create the very possibility of such activities. There are no purely physical properties that are sufficient to determine all and only sentences of English (or money, baseball, US congressmen, married couples, or private property). But why not, since all these are physical phenomena? Because the physical phenomena satisfy these descriptions only relative to some set of conventions and of people’s attitudes operating within the conventions. Something is money, property, a sentence of English, etc., only relative to the attitudes people have within systems of rules. That language is constituted by rules cannot be legitimately denied, as Chomsky now tries to do, on the theoretical ground that it is hard to square with a certain conception of the innate language faculty.

But why did the attempt by linguists to get descriptively and explanatorily adequate generative grammars fail? I said I did not know, but here is one hypothesis. They wanted rules of a very unrealistic kind. They wanted rules for sentence formation that could be stated without any reference to the meanings of words or sentences and they wanted rules that generated sentences algorithmically, i.e., according to a set of precisely statable steps, without any need for interpretation and without any “other things being equal” conditions. The model was based on the formation rules for artificially created logical and mathematical systems. But human social rules are almost never like that. The history of the passive transformation is illustrative. You can formulate a transformational rule that converts sentences of the form

NP1 verbs NP2

into

NP2 is verbed by NP1.

Thus it converts

John loves Mary

into

Mary is loved by John.

But what about sentences like

John weighs one hundred and sixty pounds

or

John resembles Eisenhower.

These do not yield

One hundred and sixty pounds is weighed by John

or

Eisenhower is resembled by John.

Why not? I think any child recognizes that the passive does not work in these cases because of the meanings of the words. Resembling and weighing are not things that can be done by someone to someone or something in the way that loving, seeing, hitting, and desiring can be. So you can passivize sentences with “loves,” “sees,” “hits,” and “desires” but you can’t turn sentences into the passive voice with “weighs” and “resembles.” Perhaps in other languages sentences with verbs synonymous to these permit conversion into the passive, but not in English. The point is not that I have given a correct explanation, but rather that this sort of explanation was not permissible in generative grammar. The proponents of generative grammar required explanations using only syntactical rules—no meanings allowed—operating on syntactical entities.
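
By way of illustration only (the function and its little participle table are my own invention and carry no theoretical weight), a purely formal version of the rule can be written down, and it passivizes “weighs” and “resembles” just as happily as “loves”:

# A blind syntactic rule: NP1 verbs NP2 -> NP2 is verbed by NP1.
# It inspects only the form of the sentence, never the meaning of the verb.
PARTICIPLES = {"loves": "loved", "sees": "seen", "hits": "hit",
               "weighs": "weighed", "resembles": "resembled"}

def passivize(np1, verb, np2):
    return f"{np2} is {PARTICIPLES[verb]} by {np1}"

print(passivize("John", "loves", "Mary"))
# -> "Mary is loved by John" (fine)
print(passivize("John", "weighs", "one hundred and sixty pounds"))
# -> "one hundred and sixty pounds is weighed by John" (generated all the same, though no speaker would say it)
print(passivize("John", "resembles", "Eisenhower"))
# -> "Eisenhower is resembled by John" (likewise deviant)

The rule overgenerates precisely because nothing in it can see that weighing and resembling are not things done to anyone.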

The correct picture seems to me this. There are indeed innate mechanisms in the human brain for acquiring and using language. That is why we have languages and our close relatives, the chimpanzees, do not. The mechanisms work according to certain principles, like any other organ. But it is not a matter of rules, and learning a language is not a matter of following rules of Universal Grammar, any more than seeing something is a matter of following rules of Universal Visual Grammar.

There are indeed rules of specific languages, but the effort to find generative grammars for these languages is bound to fail, precisely because the aim was to obtain rigorous, strict, exceptionless rules of the sort that you get for constructing formal systems such as the predicate calculus, or axiomatic set theory, and such rules make no reference to what the entities were to be used for. The rules were to be stated without any reference to the meanings or the uses of the sentences generated. Natural human phenomena almost never have rules like that. There will often be exceptions to a rule; there will typically be semantic considerations in the formulation and application of the rule; and there will in general be an “other things being equal” clause understood in the application of the rule.

When Chomsky suggests that the concepts expressed by words like “carburetor” and “bureaucrat” must be innately known by every child, and that learning the meanings of the words is just a matter of applying labels to concepts the child already has, you know that something has gone radically wrong. He has a very unrealistic conception of learning. It is as if he supposed that learning the meanings of these words would have to consist in having one’s nerve endings stimulated by passing bureaucrats and carburetors, and because there is no way such passing stimuli could ever give us the meanings of these words, it looks like the meanings must be innate.

This argument is called the argument from the “poverty of the stimulus” and it occurs over and over in Chomsky’s work. But a more realistic conception is the following: in order to understand, for example, the word “bureaucrat,” a child has to be introduced to a culture, a culture that includes governments, bureaus, departments, powers, employment, and a host of other things. A child does not learn a set of discrete concepts, but learns to master a culture, and once that culture is mastered, it is not difficult for him to understand the word “bureaucrat.” Similar remarks could be made about “carburetor.” This concept only makes sense within the context of some knowledge of internal combustion engines. Once you have the basic understanding of how such engines work it is not hard to understand that a carburetor is a device for mixing air and fuel.

Furthermore, one often has a partial or imperfect knowledge of a concept. Chomsky’s analogy with the immune system thus seems grossly inadequate. Concepts are seldom all or nothing, and they are almost always systematically related to other concepts. You cannot have the concept of “carburetor” or “bureaucrat” without having a great many other logically related concepts. But chemical compounds are both all-or-nothing and discrete. Each antibody is distinct from every other antibody, and for any antibody you either have it or you don’t. For concepts you can have a partial grasp of the concept, and there is no way you can have a concept without having many other concepts.

3.

I do not wish to give the impression that Chomsky’s entire book is concerned with these issues. On the contrary, most of the book is concerned with debates about current issues in philosophy. I will discuss one of them, the question of unconscious rules of human cognition, which is related to the question of language. A standard explanatory device in Chomsky’s earlier work, and in cognitive science in general, is to claim that we are unconsciously following rules. The importance of this can hardly be overestimated. Once we have the possibility of explaining particular forms of human behavior as following rules, we have a very rich explanatory apparatus that differs dramatically from the explanatory apparatus of the natural sciences. When we say we are following rules, we are accepting the notion of mental causation and the attendant notions of rationality and existence of norms.

So, for example, if we explain my driving behavior by saying that I am following the rule “Drive on the right-hand side of the road,” even when I am following this rule unconsciously, we have a mode of explanation that is quite different from saying that the car follows the rule “Force equals mass times acceleration.” Both “rules” describe what is happening, but only the first actually is a case of following a rule. The content of the rule does not just describe what is happening but plays a part in making it happen. In order to make an explanation of behavior as following rules work, we need to be able to distinguish cases which are guided by a rule from cases which are merely described by a rule. One condition of rule-guided explanations is that the rules have to be the sorts of things that one could actually follow. If you spell out those conditions, you find that unconscious rules have to be the sort of things that at least could be conscious. So, for example, I can follow the rule “Drive on the right” unconsciously, but it is the sort of rule I could bring to consciousness. For a number of reasons rules may be unconscious, and in some cases, such as brain damage or repression, a person may be unable to bring the rule to consciousness. But an unconscious rule has to have the kind of content which could be consciously understood, interpreted, followed, or violated.

Chomsky’s rules do not meet that condition. For him the rules of language are “computational” rules, but what exactly is the definition of computation, according to which these rules are computational? On the standard definition of computation, we are to think of computations as reducing to vast sets of zeroes and ones zapping through the computer. Is that how we are to think of unconscious rule-following on Chomsky’s model? Lots of zeroes and ones in the child’s head? That can hardly be right because the zeroes and ones are in the mind of the programmer. In actual commercial computers, the only reality independent of the observer consists of rapid—millions per second—transitions in complex electrical circuits. Commercial computers don’t literally follow rules because they do not have the mental apparatus necessary for rule-following. We program the computers to behave automatically as if they were following rules, and thus we can get the same results as human rule-following behavior.

So there is a dilemma: if we are to think of computational rule-following in the technical sense of reducing to binary symbols, then there is literally no rule-following independent of an observer and we have lost the explanatory power of psychological explanation. If we are to think of computation in the common-sense meaning, according to which, when we say, for example, that the child computed the meaning of the sentence we just mean that he figured it out, then the unconscious rules do not meet the condition of being thinkable.

Chomsky has now given up on the idea that there are rules of particular languages, but the difficulty about computation remains. This is an absolutely crucial point at issue and I want to make it completely clear. Chomsky insists that the study of language is a branch of natural science and the key notion in his new conception of language is computation. On his current view, a language consists of a lexicon plus computations. But my objection to this is that computation is not a notion of natural science like force, mass, or photosynthesis. Computation is an abstract mathematical notion that we have found ways to implement in hardware. As such it is entirely relative to the observer. And so defined, in this observer-relative sense, any system whatever can be described as performing computations. The stone falling off a cliff computes the function “The distance I fall has to equal half of gravity multiplied by the square of the time I fall”: S = ½gt². The water flowing over a dam at the rate of one gallon per second computes the addition function 2 + 2 = 4 every four seconds, and so on with everything else in the world. Unlike, say, electrical charge, computation is not discovered in nature, rather it is assigned to physical processes. Natural processes can be interpreted or described computationally. In this observer-relative sense there cannot be a natural science of computation.
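
To make the point concrete (the snippet is merely my own illustration), the stone’s trajectory can be described by the formula, and a person or a machine can compute it, but nothing in the stone follows the rule; the computational description is something we assign to the process:

G = 9.8  # gravitational acceleration in meters per second squared

def fall_distance(t):
    # s = (1/2) * g * t^2: this formula describes the stone's fall; the
    # stone does not consult it. Calling the stone a "computer" of the
    # function is an interpretation assigned by an observer.
    return 0.5 * G * t ** 2

print(fall_distance(2.0))   # 19.6 meters fallen after two seconds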

In the original definition of computation, before the invention of “computing machinery” by Alan Turing and others,7 “computing” meant figuring something out in arithmetic or mathematics, and a “computer” was a person who computed. In the sense in which I solve mathematical problems I really am intrinsically computing and the attribution of computational features to my conscious thought processes is not relative to an observer. Now, when Chomsky says that language is a matter of computation, which is it? Is it the observer-relative sense of zeroes and ones? If so, the project is no longer natural science and the computations do not explain the phenomena but merely describe processes whose causal explanation has to be found in neurobiology. Is it the observer-independent sense in which human beings figure things out? If so, then the unconscious rules don’t meet the conditions necessary for rule-following explanations. In neither case do we get an account of language that is at all like natural science. Chomsky says, “John Searle and I have discussed these issues for some years.” Indeed. And I expect the discussions to continue.

In any case, as I noted above, Chomsky has now given up on the idea that Universal Grammar is a matter of unconscious rule-following. But he also dismisses the idea that real human languages are governed by rules. That, I believe, cannot be right.

I would not wish my criticisms of Chomsky to be misunderstood. At a time when various embarrassingly incompetent accounts of language are widespread in university humanities departments under such names as “literary theory,” “deconstruction,” and “postmodernism,” it is worth emphasizing that his work in linguistics is at the highest intellectual level.8

—This is the first of two articles about linguistics.

1. The New York Review, June 29, 1972.

2. MIT Press, 1995.

3. Reflections on Language (Pantheon, 1975), pp. 31–33.

4. By "intensional" he means that the language is identified not by the sentences it produces (the extension) but by the means according to which they are produced (the intension). In principle two different I-languages might produce the same sentences.

5. For more on this point see John R. Searle, The Construction of Social Reality (Free Press, 1995).

6. See "How to Derive 'Ought' from 'Is,'" Philosophical Review, Vol. 73 (January 1964), pp. 43–58.

7. Alan Turing, "Computing Machinery and Intelligence," Mind, Vol. 59 (1950), pp. 433–460.

8. I wish to thank Stephen Neale, Barry Smith, and Dagmar Searle for criticisms of an earlier version of this article.

Copyright © 1963-2010 NYREV, Inc. All rights reserved.