
Overview

Construction grammar

Adele Goldberg∗ and Laura Suttle

Construction grammar, or constructionist approaches more generally, emphasize the function of particular constructions as well as their formal properties. Constructions vary in their degree of generality, from words to idioms to more abstract patterns such as argument structure constructions, topicalization, and passive. There is also no division drawn between semantics and pragmatics, as all conventional aspects of constructions are encoded within the constructions themselves; thus constructions can include information about information structure, register, or genre. The majority of constructionist approaches are also usage based, in that they recognize that we retain a great deal of item-specific information. An important desideratum of constructionist approaches is that they interface naturally with what we know about language acquisition, language change, and language typology. In order to capture generalizations within a given language, constructions are related via an inheritance hierarchy, with more abstract, productive constructions being directly related to their more idiomatic instantiations. The functions of particular constructions, as well as domain-general cognitive abilities and social cognition, are appealed to in order to capture cross-linguistically valid typological generalizations. © 2010 John Wiley & Sons, Ltd. WIREs Cogn Sci 2010


What is the nature of our knowledge of language? How do learners acquire generalizations such that they can produce an open-ended number of novel utterances based on a finite amount of input? Why are languages the way they are? In order to address these long-standing questions, many linguists with varying backgrounds have converged on several key insights that have given rise to a family of constructionist approaches, including various versions of construction grammar. These approaches emphasize that speakers' knowledge of language consists of systematic collections of form–function pairings, or constructions, at varying levels of generality and complexity. Our knowledge of language is understood to form a network of interrelated constructions; these constructions are related to one another via (default) inheritance links. Creative, novel utterances are formed by combining constructions on the fly as long as they do not have conflicting specifications: constructions often have open slots that are filled by words or other phrasal patterns.

∗Correspondence to: [email protected]

Psychology Department, Princeton University, Princeton, NJ 08540, USA

DOI: 10.1002/wcs.022

On the constructionist approach, no innate, domain-specific principles are assumed. The null hypothesis is that constructions are learned on the basis of the input, together with domain-general processes including attentional biases, principles of cooperative communication, general processing demands, and processes of categorization.

The term constructionist is intended to evoke both the notion of 'construction' and the notion that our knowledge of language is 'constructed' on the basis of the input together with general cognitive, pragmatic, and processing constraints. It is intended to be a more inclusive term than construction grammar, as the latter is a particular instance of a constructionist approach.

Constructionist approaches have been used in a range of interdisciplinary endeavors to shed light on a variety of issues, from computer modeling1–4 to detailed accounts of language acquisition.5 The approach predicts a relatively high degree of cross-linguistic variation.6–8 At the same time, nontrivial cross-linguistic generalizations are recognized to exist; these are accounted for in terms of the functions of the constructions, general diachronic processes, and domain-general factors, instead of by appeals to a 'universal grammar'.6,7,9–12


WHAT ARE CONSTRUCTIONS?

Constructions are defined to be learned pairings of form and function, including words and idioms as well as phrasal linguistic patterns. Examples are given in Table 1.

Common patterns such as passive, topicalization, and relative clauses are understood to be learned pairings of form and function, as are words, idioms, and various minor patterns; all are constructions, although they clearly vary in size, complexity, and generality.

The constructionist approach recognizes that language is fundamentally a means of communication. This means that subtle facts about the functions of constructions are emphasized, including their semantic properties, their information structure or discourse properties, and their conditions of use (e.g., register, genre). The entire grammar is composed of these form–meaning pairings: it's constructions all the way down.

It is clear we need to posit a construction when there is something noncompositional that needs to be learned. Since the basic 'rules of composition' for a given language are somewhat unique, even simple patterns, such as the transitive, are recognized to be constructions. For example, the transitive construction can specify particulars of case-marking in a language that has case-marking, and can also specify any constraints on animacy or definiteness. More specific constructions are then posited in addition, whenever something more than a combination of regular, simple constructions is needed. In this way, any linguistic pattern is recognized as a construction as long as some aspect of its form or function is not strictly predictable from its component parts or from other constructions recognized to exist.

TABLE 1 | Examples of constructions at varying levels of size and complexity

Construction                                   Form/example
Word                                           Example: ornithology or ornery
Partially filled word (aka morpheme)           Example: anti-N, pre-N, V-ing
Complex word (filled)                          Example: daredevil, shoo-in
Idiom (filled)                                 Example: trip the light fantastic; what's up?
Idiom (partially filled)                       Example: jog someone's memory
Covariational-conditional construction         Form: the Xer the Yer (e.g., the more you think about it, the less you understand)
Ditransitive (double-object) construction      Form: Subj V Obj1 Obj2 (e.g., He baked her a muffin)
Passive construction                           Form: Subj aux VPpp (PP-by) (e.g., The hedgehog was struck by lightning)
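To make the notion of a form–function pairing concrete, the entries in Table 1 can be thought of as records pairing a form (fixed material and/or open slots) with a conventionalized function. The sketch below is purely illustrative and is not part of any constructionist formalism; the field names, the frequency counter, and the simplified glosses are our own assumptions.

```python
from dataclasses import dataclass

@dataclass
class Construction:
    """A learned pairing of form and function (illustrative only)."""
    name: str
    form: str         # fixed material and/or open slots, e.g., "Subj V Obj1 Obj2"
    function: str     # conventionalized meaning and/or discourse conditions
    frequency: int = 0  # usage-based approaches also track how often a pattern is witnessed

# Constructions vary in size and generality, but all are form-function pairings:
constructicon = [
    Construction("word",         "ornithology",            "the study of birds"),
    Construction("idiom",        "jog <someone's> memory", "cause someone to remember"),
    Construction("ditransitive", "Subj V Obj1 Obj2",       "agent causes recipient to receive patient"),
    Construction("passive",      "Subj aux VPpp (PP-by)",  "patient topicalized; agent demoted"),
]

for c in constructicon:
    print(f"{c.name:12s} | form: {c.form:24s} | function: {c.function}")
```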

In addition, highly frequent form–function pairings are also recognized as constructions even when they are compositional, by those who advocate a usage-based constructionist approach.1,5,11,13–15

While at first it may seem that the addition of fully compositional constructions unnecessarily inflates the size of the grammar, there is much evidence that knowledge about language includes such redundant information. The presence of frequency-based knowledge in language has been clearly established in phonetic research,16 in that speakers of a given language statistically reproduce the subtle phonetic distinctions as they have been witnessed in the input. We also know that phonological reduction of forms is based on how frequently the forms are used. For example, Bybee17 observes that high-frequency words such as every and family are typically pronounced with only two syllables instead of three, while less frequent words such as mammary and summary are pronounced with three syllables, and words that are of intermediate frequency such as memory, camera, and family can be pronounced with either two or three syllables (cf. also Ref 18).

There is evidence for usage-based knowledge at the level of grammatical category as well. For example, a definition of adjective would likely include the following: (1) adjectives semantically modify nouns, (2) adjectives can appear attributively (prenominally in English), (3) adjectives can appear predicatively, for example, after the copula or verbs like 'seem'. This definition works for many adjectives, such as pretty and boisterous. However, there are adjectives that violate each of these generalizations, and many of the exceptions require learning of item-specific facts. A simple case is the fact that the adjective blithering only appears in one context: attributively before the noun idiot, as in the collocation blithering idiot. Another case involves the class of English schwa-initial 'a-adjectives' including alive, awake, asleep, afloat, and afraid. This class systematically resists prenominal attributive use:

1. a. ??The alive/awake/asleep/afloat/afraid boy

b. The boy seems alive/awake/asleep/afloat.

Goldberg and Boyd19 argue that this pattern is not the result of any general semantic or phonological dispreference. There is historical motivation for the pattern, in that most a-adjectives were historically prepositional phrases, but speakers are unaware of the historical facts. The pattern needs to be learned by generalizing over the input. Goldberg and Boyd19 further show that speakers continue to generalize the pattern, extending the distribution to novel a-adjectives, particularly when the novel a-adjectives are witnessed in a preemptive context.

The usage-based model presupposes that at least some implicit memory traces of utterances or phrases exist, so that frequency information can be accrued and generalizations over the input become possible. In fact, speakers even have access to a degree of explicit long-term memory for clause-level utterances heard in naturalistic contexts.20

There is in fact a massive and growing amount of evidence that speakers are aware of statistical patterns in the input. These findings make a usage-based account of our knowledge of language virtually impossible to avoid. Knowledge about language is thus knowledge of both item-specific information and generalizations: both the items and the generalizations are represented within the interrelated network of constructions, the 'construct-i-con'.

A final note about the form of constructions is in order. It has become the norm among mainstream generative linguists to posit ever more abstract representations of language, including many phonologically null (invisible) nodes and even invisible words.21,22 Moreover, surface strings often bear little relationship to the 'underlying' syntax posited, requiring recourse to 'movement', despite the fact that the movement is acknowledged not to occur in real time. On this view, surface patterns only hint at the complex underlying formal structures. Constructionist approaches, on the other hand, adopt a 'what you see is what you get' model of language. Although most languages clearly do involve hierarchical structure, no movement or invisible elements are assumed.

COMBINING CONSTRUCTIONS

Utterances are rarely composed of a single construction. An actual expression typically involves the combination ('unification') of at least half a dozen different constructions. For example, (2) contains a verb phrase (VP), noun phrase (NP), transitive, and subject–predicate constructions, as well as the individual constructions corresponding to each of the words involved:

2. The squirrel cracked his nut.

Constructions can be combined freely to form actual expressions as long as they are not in conflict.

For example, the ditransitive construction is restricted to conveying transfer from an actor to a potential recipient; this restriction renders examples such as (3a) below ungrammatical:

3. a. * The man sent storage a box.

b. (cf. The man sent a box to storage.)
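The idea that constructions combine freely unless their specifications conflict can be pictured as a simple constraint check. The following toy sketch mimics the clash in (3a); it is our own simplification (the 'potential recipient' requirement is reduced to a single flag) and not an implementation of any particular unification formalism.

```python
# Toy sketch: combining the ditransitive construction with its fillers and
# rejecting combinations whose specifications conflict (cf. example 3a).

def combine_ditransitive(subj, verb, obj1, obj2, obj1_is_potential_recipient):
    """Return a surface string, or None when the recipient constraint is violated."""
    if not obj1_is_potential_recipient:
        return None  # conflict: the construction conveys transfer to a potential recipient
    return f"{subj} {verb} {obj1} {obj2}"

print(combine_ditransitive("The man", "sent", "his mother", "a box", True))
# -> The man sent his mother a box
print(combine_ditransitive("The man", "sent", "storage", "a box", False))
# -> None  (* The man sent storage a box)
```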

SURFACE GENERALIZATIONS

Constructionist theories do not derive one construction from another, because different surface patterns are typically associated with differences in meaning or different discourse properties. The relation between the ditransitive construction (4a) and the caused motion ('dative') construction (4b), then, is simply one of paraphrase and partial lexical overlap.

4. a. She gave him a book (ditransitive construction).

b. She gave a book to him (caused motion or 'dative' construction).

Once we eschew derivations, it becomes clear that (4b), repeated below as (5a), has more in common with (5b–e) than it does with (4a); all that it shares with (4a) is the verb give:

5. a. She gave a book to him.

b. She tossed a book to him.

c. She tossed a book toward him.

d. She tossed a book toward the fireplace.

e. She moved the book toward the fireplace.

That is, the English 'dative' construction is part of a much broader generalization: the construction is used for all kinds of caused motion, not necessarily transfer of an object from an agent to an animate recipient.23

Recognizing argument structure constructions allows us to explain how examples are interpreted with novel verbs (6) or with familiar verbs used in new ways (7):

6. She mooped him something ('moop' interpreted to imply transfer25).

7. She sneezed the foam off the cappuccino (interpreted to imply caused motion).


RELATING CONSTRUCTIONS: DEFAULT INHERITANCE HIERARCHY

Constructions are motivated by other constructions: related forms and functions are linked to each other in a default inheritance hierarchy, of the type long found useful for representing all types of relational knowledge.24,25 Linguistic generalizations within a particular language are naturally captured in this way. That is, broad generalizations are represented by constructions that are inherited by many other constructions; more limited patterns are captured by constructions at various midpoints in the network. For example, the 'What's X doing Y?' construction has a fixed form and connotes some sort of unexpectedness.26 As Kay and Fillmore26 observe, this construction is not wholly idiosyncratic: it inherits properties from several more general constructions, including the left isolation, subject–auxiliary inversion, subject–predicate, and verb phrase constructions. The resultative construction, as in (8), is likewise motivated by the caused-motion construction (e.g., 5a–e); viewing the former as a grammatical extension of the latter based on a systematic metaphor allows various distributional properties to be accounted for.25

8. She drove him crazy.

Thus, productivity is allowed for by the free combination of constructions, and language-internal generalizations are captured by the inheritance hierarchy.
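A default inheritance network of the kind described here can be sketched as a small graph in which each construction inherits its parents' properties unless it specifies overriding values of its own. The node names and properties below are illustrative choices loosely based on Kay and Fillmore's example, not a fragment of any actual grammar.

```python
# Toy default inheritance: a construction inherits properties from the more
# general constructions it is linked to, adding or overriding its own specifics.

network = {
    "subject-predicate":  {"parents": [], "props": {"has_subject": True}},
    "verb-phrase":        {"parents": [], "props": {"head": "V"}},
    "subj-aux-inversion": {"parents": ["subject-predicate"], "props": {"aux_first": True}},
    "whats-X-doing-Y":    {"parents": ["subj-aux-inversion", "verb-phrase"],
                           "props": {"fixed_material": "What's ... doing ...?",
                                     "connotation": "unexpectedness"}},
}

def properties(name):
    """Collect properties top-down; a construction's own values override inherited defaults."""
    props = {}
    for parent in network[name]["parents"]:
        props.update(properties(parent))
    props.update(network[name]["props"])
    return props

print(properties("whats-X-doing-Y"))
# {'has_subject': True, 'aux_first': True, 'head': 'V',
#  'fixed_material': "What's ... doing ...?", 'connotation': 'unexpectedness'}
```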

LANGUAGE LEARNING

Constructions are understood to be learned on the basis of positive input and general cognitive and social abilities. This is a major difference between this approach and the traditional approach advocated by Chomsky.21,27,28 Chomsky had argued that language is too complex and the input too impoverished for language to be learned, concluding that humans must come hard-wired with principles specific to a language faculty, also known as a 'universal grammar'.27 Universal grammar was thus posited in order to constrain the set of possible languages that a learner would consider when trying to acquire their own language. While Chomsky's poverty of the stimulus arguments appeared convincing at the time they were proposed (1965), most agree that it is now worth revisiting the innateness assumption. We currently have a better understanding of the nature of the input (via the availability of large corpora of child-directed speech), and it is clear that the input tends to be highly repetitive.29–31 Moreover, we now recognize powerful domain-general statistical and rational learning abilities that children are able to employ.32,33

A close look at children's early learning suggests that it begins with specific, concrete examples: early on, it is item-based.5,34 That is, there have been many demonstrations that children fail to fully generalize many grammatical patterns until around age 3.5 or older.35–47 This work suggests that because constructions are acquired so late and in such a piecemeal fashion, they must be learned. While it has been widely assumed that children's early conservatism stems from the fact that they have not witnessed a critical mass of items, recent work indicates that young children may simply be less skilled at recognizing regularities in the input48 (but see also Ref 49).

Constructionists argue that grammatical generalizations are ultimately learned on the basis of domain-general abilities. For example, Ambridge et al.50 have shown that distributed input is more facilitory than massed input, an effect that is well known from the nonlinguistic categorization literature. Another case involves skewed input: when one particular verb accounts for the lion's share of a construction's instances, the generalization of novel argument structure constructions is facilitated.51,52 Here again, this type of skewed input has been recognized as facilitory in nonlinguistic categorization tasks as well.53,54

All linguists recognize that a wide range of semi-idiosyncratic constructions exists in every language, constructions that cannot be accounted for by general, universal, or innate principles or constraints. Mainstream generative theory has taken the position that these constructions exist only on the 'periphery' of language and that, therefore, they need not be the focus of linguistic or learning theorists.28 Constructionist approaches, on the other hand, have homed in on these constructions, arguing that whatever means we use to learn these patterns can easily be extended to account for so-called 'core' phenomena. In fact, by definition, the core phenomena are more regular and also tend to occur more frequently within a given language. Therefore, if anything, they are likely to be easier to learn and less in need of language-specific innate learning rules.

The powerful domain-general learning mechanisms, the fact that the input is highly repetitive and in many ways well suited to induction, the evidence for piecemeal, gradual acquisition, and the existence of complex but incontrovertibly learned patterns all suggest that language can be learned on the basis of the input, after all.

LANGUAGE UNIVERSALS

Constructions typically do not exist sui generis; they are generally not fully arbitrary: relationships between form and meaning are often motivated, and thus we find recurrent patterns cross-linguistically. But because constructionist approaches do not rely on innate universal linguistic principles, constructions are expected to vary in their specifics cross-linguistically, and this does seem to be the case.6,8 What may be called a passive in one language may differ from a passive construction in another language in a number of subtle ways, including the presence or choice of auxiliary, the presence or choice of adposition or case that marks the agent argument, possible semantic or discourse restrictions, and overall frequency in the language. Passives in unrelated languages are generally identified by their related functions; they are constructions in which the topic and/or agentive argument is essentially 'demoted', appearing as a less prominent adjunct or not at all.55 The fact that topics and agents are typically expressed in syntactically prominent slots is motivated by their function; the fact that special constructions exist that allow them to be expressed in nonprominent slots gives speakers a degree of flexibility and is therefore also motivated. Finding two constructions in two different languages that are absolutely identical in form, function, and distribution is a rare occurrence outside of cases of shared diachronic history or language contact.56,57

Croft6, for example, notes that words that translate into English as nouns, adjectives, and adverbs, as well as verbs, are inflected for person, aspect, and mood in Makah, an American Indian language, and that no words are inflected for these categories in Vietnamese. Croft points out that tense–mood–aspect inflection cannot be taken as critical for determining the category of V cross-linguistically (unless of course one is willing to say that all words are verbs in Makah, and no words are verbs in Vietnamese). Croft goes on to point out that no syntactic test will pick out all and only the entities that one might wish to call verbs, nouns, adjectives, subjects, objects, and so on across all languages. Moreover, he observes that even within a single language, a given criterion often only applies within certain larger constructions. For example,

'If one takes passivizability as the criterion for Direct Object in English, then one's conclusions will tell us something about the Passive, not about some allegedly global category Direct Object... (C)onstructions, not categories or relations, are the basic, primitive units of syntactic representation... This is radical construction grammar.' (Ref 6, p. 46)

This is not to say that there are no strong universal tendencies or implicational universals to be found across languages. Constructionists argue that such cross-linguistic generalizations are better explained via grammar-external explanations such as universal functional pressures, iconic principles, and processing and learning constraints. Let us consider a few examples in order to illustrate the point.

Languages in which verbs appear at the end of sentences have been shown to generally have postpositions and postnominal modifiers, while languages with verbs appearing before their nonsubject complements tend to have prepositions and prenominal modifiers. This is shown in (9) below:

9. Head-initial languages: VP[V ...]  NP[N ...]  PP[P ...]

   Head-final languages:  VP[... V]  NP[... N]  PP[... P]

This 'head-direction parameter' has long been used as an example of a purely syntactic generalization that requires an innate universal grammar.28,58 Children would then only have to determine where in the sentence the verbs in their language appear and could then deduce from that where to expect all other types of modifiers.

However, this generalization is not without exceptions. Persian, for example, is a verb-final language but has prepositions instead of postpositions. In addition, it is not clear that the generalization poses any sort of acquisition problem, since children must still learn the forms and meanings of the words in their language, including verbs, adpositions, and nouns. The ordering of elements in the sentence is apparent during this learning, which calls into question the conceptual necessity of any innate parameter.

Still, the strong cross-linguistic tendency requires some sort of explanation. Diachronic processes may well provide a better account of the relationship between verbs and adpositions, since adpositions typically develop from verbs.59 It has also been hypothesized that the tendency for heads to systematically either precede or follow their complements lends a processing advantage.60,61 Reali and Christiansen62 have supported this idea by demonstrating that simple recurrent networks found consistent head–complement orderings easier to learn than mixed systems; that is, given the chance to adapt to their 'learners', the languages in their simulation came, over time, to have a consistent head–complement order similar to that found in real-world languages.

Other generalizations that have been argued to require recourse to innate principles have also found better explanations elsewhere. Lidz et al.63 proposed that children come hard-wired with specific knowledge that the number of overtly expressed complements should match the number of semantic participants. The fact that learners pay attention to the number of referring expressions in a clause as an indicator of the propositional meaning is not contested. However, the tendency for the number of nouns overtly expressed to reflect the number of semantic participants finds a natural explanation in the domain of Gricean pragmatics. Any referring expression should be assumed to be relevant to the topic at hand, and any argument that is relevant and nonrecoverable from discourse needs to be indicated in some way.10 Beyond this pragmatic generalization, no syntactic stipulation is needed. The pragmatic generalization says nothing about arguments that are irrelevant or recoverable; this is an advantage, since languages and constructions within a given language treat recoverable and irrelevant arguments differently. For example, many of the world's languages, including Chinese and Korean, readily allow recoverable arguments to be omitted.

Another generalization is that agents and undergoers are expressed in prominent syntactic positions. This is captured in the linking generalizations of Dowty64 and can be summed up as follows: in simple active clauses, if there is a subject and an object, and if there is an agent and a patient, then the agent role will be expressed by the subject and the undergoer role by the direct object. This is a modest proposal that has been taken by some to express an innate linguistic universal. In fact, the facts are even more modest: there are syntactically ergative languages in which agents are not generally expressed as subjects, there are many languages that do not have canonical subjects, and there are many constructions within a given language that violate the generalization (e.g., the passive, which expresses the agent argument as an oblique). But again, there is something to the generalization. The facts can be restated as follows: semantic actors and undergoers tend to be expressed in formally prominent slots. Once stated this way, the generalization is much less mysterious: actor and undergoer arguments are generally expressed in prominent slots cross-linguistically because human beings' attention is naturally drawn to the actors and undergoers in events.11

Other generalizations about how form and meaning tend to be linked across languages can be explained by appeal to iconic and analogical processes65–70 and to Gricean pragmatic generalizations.71 Constraints on long-distance dependency constructions (traditional 'island constraints') appear to yield to processing explanations that take into account the function of the constructions involved.72–76 This shift of perspective, from seeking explanations in terms of syntactic, innate stipulations to trying to account for generalizations by appealing to independently motivated general cognitive mechanisms, has been echoed to some extent within mainstream generative grammar as well. For example, the fact that all languages appear to have noun and verb categories may be explained by the existence of corresponding basic semantic categories.77 In a recent paper, Chomsky goes so far as to suggest that the only language-specific innate ability that is absolutely required is recursion, and the point is raised that even recursion might turn out not to be specific to language.78 In fact, the claim that recursion is domain-specific is already hotly contested.79,80

A RANGE OF CONSTRUCTIONIST APPROACHES

There are many varieties of the constructionist approach, differing primarily in emphasis and notation. These include, for example, the following: sign-based construction grammar (SBCG)26,81,82; cognitive grammar (CG)83,84; fluid construction grammar (FCxG)85; embodied construction grammar (ECxG)86,87; radical construction grammar (RCxG)6; cognitive construction grammar (CCxG)11,24,25,88; and simpler syntax (SS)89.

Charles Fillmore and Paul Kay first coined the term 'Construction Grammar'. Their early work on idioms and idiomatic phrasal patterns such as let alone, even, and What's X doing Y? laid the foundation for many of the variations of Construction Grammar that have since developed. SBCG, as developed by Paul Kay, Charles Fillmore, Ivan Sag, and Laura Michaelis, aims, in line with most traditional grammatical frameworks, to account for the generalizations in language without redundancy. Patterns or expressions that are predictable from other generalizations are assumed not to be part of a speaker's knowledge of language. The frequencies of particular grammatical patterns are also not explicitly represented within SBCG. Instead, a strict division between the grammar and the use of the grammar is made.

On the other hand, CG, CCxG, and RCxG are all usage-based frameworks, as described above. The aim of these frameworks is to represent grammatical knowledge in such a way that it can interface transparently with theories of processing, acquisition, and historical change. This desideratum has led to the recognition that even fully regular patterns may be stored if such patterns occur with sufficient frequency.

CCxG11,24,25,90 seeks to provide motivation for each construction that is posited. Motivation aims to explain why it is at least possible and at best natural that a particular form–meaning correspondence should exist in a given language. Motivation is distinct from prediction: recognizing the motivation for a construction does not entail that the construction must exist in that language or in any language. It simply explains why the construction 'makes sense' or is natural.24,25,91 Functional and historical generalizations count as explanations, but they are not predictive in the strict sense, just as parallel generalizations in biology are not predictive. That is, language, like biological evolution, is contingent, not deterministic. Just as is the case with species, particular constructions are the way they are not because they have to be that way, but because their phylogenetic and ontogenetic evolution was motivated by general forces.

Perhaps the primary difference between SBCG and CCxG is one of emphasis. Both approaches intend the grammars proposed to be psychologically valid, and both strive to be explicit and to capture relevant generalizations. Still, CCxG arguably ranks the desideratum of psychological validity higher than the goal of being formal or maximally general, whereas SBCG generally has the opposite priorities.

RCxG extends work in Construction Grammar by investigating in detail the cross-linguistic divergences among what many assume are atomic, universal syntactic categories and relations. FCxG offers computational models of how conventionalization in language can get off the ground. ECxG offers a formal notation for capturing rich semantics. SS offers broad coverage of a range of traditional constructions and a useful formalism for them. CG offers another rich formalism for semantics as well as insightful analyses of a wide range of linguistic constructions. These approaches are to a great extent complementary, as they have focused on different aspects of our knowledge of language.

CONCLUSION

Constructionist approaches, which include the various versions of Construction Grammar, represent a broad-based alternative to the mainstream generative grammar that has held sway in the field for the past 50 years. Constructionists recognize that language is primarily a means of communication and that language involves learning an interrelated network of conventional ways of pairing form with function. Most constructionist approaches have adopted a usage-based approach to grammar, recognizing that information about frequencies and particulars of usage is part of our knowledge of language. A growing body of work in language acquisition indicates that language is learnable on the basis of domain-general abilities. A growing body of work in typology indicates that purely syntactic universals are vanishingly rare, while functional pressures and cognitive constraints put a limit on possible variation and account for recurrent general patterns. Many issues remain outstanding, as the approach is relatively new. But constructionist approaches hold out the promise that we may ultimately be able to account for the complexities of language without appeal to mysterious stipulations.

REFERENCES

1. Bod R. Beyond Grammar: An Experience-Based Theory of Language. University of Chicago Press; 1998.

2. Morris WC, Cottrell GW, Elman JL. A connectionist simulation of the empirical acquisition of grammatical relations. In: Wermter S, Sun R, eds. Hybrid Neural Symbolic Integration. Springer-Verlag; 2000, 175–193.

3. Borovsky A, Elman JL. Language input and semantic categories: a relation between cognition and early word learning. J Child Lang 2006, 33:759–790. doi:10.1017/S0305000906007574.

4. Alishahi A, Stevenson S. A computational model of early argument structure acquisition. Cogn Sci 2008, 32:789–834. doi:10.1080/03640210801929287.

5. Tomasello M. Constructing a Language: A Usage-Based Theory of Language Acquisition. Harvard University Press; 2003.

6. Croft W. Radical Construction Grammar. Oxford University Press; 2001.


7. Haspelmath M. Parametric versus functional explanation of syntactic universals. In: Biberauer T, ed. The Limits of Syntactic Variation. Amsterdam: John Benjamins; 2008, 75–107.

8. Evans N, Levinson S. The myth of language universals: language diversity and its importance for cognitive science. Behav Brain Sci. In press.

9. Hawkins JA. Efficiency and Complexity in Grammars. Oxford University Press; 2004.

10. Goldberg AE. But do we need Universal Grammar? A comment on Lidz et al. (2003). Cognition 2004, 94:77–84. doi:10.1016/j.cognition.2004.03.003.

11. Goldberg AE. Constructions at Work: The Nature of Generalization in Language. Oxford University Press; 2006.

12. Newmeyer FJ. Possible and Probable Languages: A Generative Perspective on Linguistic Typology. Oxford University Press; 2005.

13. Bybee J. Morphology: A Study of the Relation between Meaning and Form. Amsterdam: John Benjamins; 1985.

14. Langacker RW. A usage-based model. In: Rudzka-Ostyn B, ed. Topics in Cognitive Linguistics. Amsterdam: John Benjamins; 1988, 127–161.

15. Kemmer S, Barlow M. Usage-Based Models of Language. Stanford: CSLI; 2000.

16. Pierrehumbert J. Exemplar dynamics: word frequency, lenition, and contrast. In: Bybee J, Hopper P, eds. Frequency and the Emergence of Linguistic Structure. Amsterdam: John Benjamins; 2001, 137–157.

17. Bybee J. The phonology of the lexicon: evidence from lexical diffusion. In: Barlow M, Kemmer S, eds. Usage-Based Models of Language. Stanford: CSLI; 2000, 65–86.

18. Gahl S, Garnsey S. Knowledge of grammar, knowledge of usage: syntactic probabilities affect pronunciation variation. Language 2004, 80:748–775.

19. Goldberg AE, Boyd J. A-adjectives, Historical Persistence and Generalization. Princeton University; Forthcoming.

20. Gurevich O, Johnson M, Goldberg AE. Incidental Verbatim Memory for Language. Princeton University; Forthcoming.

21. Chomsky N. The Minimalist Program. Cambridge: MIT Press; 1995.

22. Kayne RS. Movement and Silence. Oxford University Press; 2005.

23. Goldberg AE. Surface generalizations: an alternative to alternations. Cogn Linguist 2002, 13:327–356. doi:10.1515/cogl.2002.022.

24. Lakoff G. Women, Fire, and Dangerous Things: What Categories Reveal About the Mind. University of Chicago Press; 1987.

25. Goldberg AE. Constructions: A Construction Grammar Approach to Argument Structure. University of Chicago Press; 1995.

26. Kay P, Fillmore CJ. Grammatical constructions and linguistic generalizations: the What's X doing Y? construction. Language 1999, 75:1–34.

27. Chomsky N. Aspects of the Theory of Syntax. Cambridge: MIT Press; 1965.

28. Chomsky N. Lectures on Government and Binding. Berlin: Walter de Gruyter; 1981.

29. Mintz TH, Newport EL, Bever T. The distributional structure of grammatical categories in speech to young children. Cogn Sci 2002, 26:393–424. doi:10.1016/S0364-0213(2).

30. Dabrowska E. Language, Mind and Brain. Georgetown University Press; 2004.

31. Cameron-Faulkner T, Lieven EVM, Tomasello M. A construction based analysis of child directed speech. Cogn Sci 2003, 27:843–873. doi:10.1016/j.cogsci.2003.06.001.

32. Saffran JR, Aslin RN, Newport E. Statistical learning by 8-month-old infants. Science 1996, 274:1926–1928. doi:10.1126/science.274.5294.1926.

33. Perfors A, Tenenbaum JB, Regier T. Poverty of the stimulus? A rational approach. In: Sun R, ed. Proceedings of the Twenty-Eighth Annual Conference of the Cognitive Science Society. Mahwah, NJ: Lawrence Erlbaum Associates; 2006, 663–668.

34. Tomasello M. Do young children have adult syntactic competence? Cognition 2000, 74:209–253. doi:10.1016/S0010-0277(99)00069-4.

35. Akhtar N, Tomasello M. Young children's productivity with word order and verb morphology. Dev Psychol 1997, 33:952–965. doi:10.1037/0012-1649.33.6.952.

36. Baker CL. Syntactic theory and the projection problem. Linguist Inq 1979, 10:533–581.

37. Bates E, MacWhinney B. Competition, variation, and language learning. In: MacWhinney B, ed. Mechanisms of Language Acquisition. Hillsdale, NJ: Lawrence Erlbaum Associates; 1987, 157–193.

38. Bowerman M. Reorganizational processes in lexical and syntactic development. In: Wanner E, Gleitman LR, eds. Language Acquisition: The State of the Art. Cambridge University Press; 1982, 319–346.

39. Braine MDS. Children's first word combinations. Monographs of the Society for Research in Child Development 1976, 41. University of Chicago Press.

40. Brooks P, Tomasello M. How children constrain their argument structure constructions. Language 1999, 75:720–738.

41. Gropen J, Pinker S, Hollander M, Goldberg R, Wilson R. The learnability and acquisition of the dative alternation in English. Language 1989, 65:203–257.


42. Ingram D, Thompson W. Early syntactic acquisition in German: evidence for the modal hypothesis. Language 1996, 72:97–120.

43. Lieven EVM, Pine JM, Baldwin G. Lexically-based learning and early grammatical development. J Child Lang 1997, 24:187–219.

44. MacWhinney B. Basic syntactic processes. In: Kuczaj S, ed. Language Development, Volume 1: Syntax and Semantics. Hillsdale, NJ: Lawrence Erlbaum Publishers; 1982, 73–136.

45. Olguin R, Tomasello M. Twenty-five-month-old children do not have a grammatical category of verb. Cogn Dev 1993, 8:245–272. doi:10.1016/S0885-2014(93)80001-A.

46. Schlesinger IM. Steps to Language: Toward a Theory of Language Acquisition. Hillsdale, NJ: Lawrence Erlbaum Publishers; 1982.

47. Tomasello M. First Verbs: A Case Study of Grammatical Development. Cambridge University Press; 1992.

48. Boyd J, Goldberg AE. Children are More Conservative than Adults when Exposed to the Same Input and its Benefits for Language Learning. Princeton University; Forthcoming.

49. Hudson Kam CL, Newport EL. Regularizing unpredictable variation: the roles of adult and child learners in language formation and change. Lang Learn Dev 2005, 1:151–195. doi:10.1207/s15473341lld0102_3.

50. Ambridge B, Theakston AL, Lieven EVM, Tomasello M. The distributed learning effect for children's acquisition of an abstract syntactic construction. Cogn Dev 2006, 21:174–193. doi:10.1016/j.cogdev.2005.09.003.

51. Goldberg AE, Casenhiser D, Sethuraman N. Learning argument structure generalizations. Cogn Linguist 2004, 15:289–316. doi:10.1515/cogl.2004.011.

52. Casenhiser D, Goldberg AE. Fast mapping of a phrasal form and meaning. Dev Sci 2005, 8:500–508. doi:10.1111/j.1467-7687.2005.00441.x.

53. Elio R, Anderson JR. The effects of information order and learning mode on schema abstraction. Mem Cognit 1984, 12:20–30.

54. Goldberg AE, Casenhiser D. Learning argument structure generalizations. In: Clark EV, Kelly BF, eds. Constructions in Acquisition. Stanford: CSLI Publications; 2006, 185–204.

55. Perlmutter DM, Postal PM. Toward a universal characterization of passivization. In: Proceedings of the Third Annual Meeting of the Berkeley Linguistics Society. Berkeley: University of California; 1977.

56. Birner B, Ward G. Information Status and Noncanonical Word Order in English. Amsterdam: John Benjamins; 1998.

57. Zhang N. The interactions between construction meaning and lexical meaning. Linguistics 1998, 36:957–980.

58. Jackendoff R. Semantic Interpretation in Generative Grammar. MIT Press; 1972.

59. Nichols J. Head-marking and dependent-marking grammar. Language 1986, 62:56–119.

60. Hawkins JA. A parsing theory of word order universals. Linguist Inq 1990, 21:222–261.

61. Hawkins JA. A Performance Theory of Order and Constituency. Cambridge University Press; 1994.

62. Reali F, Christiansen MH. Sequential learning and the interaction between biological and linguistic adaptation in language evolution: a computational approach. Interact Stud. In press.

63. Lidz J, Gleitman H, Gleitman L. Understanding how input matters: verb learning and the footprint of universal grammar. Cognition 2003, 87:151–178. doi:10.1016/S0010-0277(02)00230-5.

64. Dowty D. Thematic proto-roles and argument selection. Language 1991, 67:547–619.

65. Lambrecht K. Information Structure and Sentence Form. Cambridge University Press; 1994.

66. Verhagen A. From parts to wholes and back again. Cogn Linguist 2002, 13:403–439. doi:10.1515/cogl.2002.024.

67. Haiman J. Iconicity in Syntax. Cambridge University Press; 1985.

68. Givon T. Isomorphism in the grammatical code: cognitive and biological considerations. Stud Lang 1991, 15:333–377.

69. Kemmer S, Verhagen A. The grammar of causatives and the conceptual structure of events. In: Mouton Classics: From Syntax to Cognition, from Phonology to Text; 2002, 451–491.

70. Blevins J, Blevins J. Introduction: analogy in grammar. In: Blevins JP, Blevins J, eds. Analogy in Grammar: Form and Acquisition. Oxford University Press; to appear, 1–12.

71. Levinson SC. Pragmatics. Cambridge University Press; 1983.

72. Van Valin RDJ. The acquisition of wh-questions and the mechanisms of language acquisition. In: Tomasello M, ed. The New Psychology of Language: Cognitive and Functional Approaches to Language Structure; 1998, 221–249.

73. Kluender R. On the distinction between strong and weak islands: a processing perspective. Syntax Semant 29:241–280.

74. Kluender R, Kutas M. Subjacency as a processing phenomenon. Lang Cogn Processes 1993, 8:573–633. doi:10.1080/01690969308407588.

75. Erteschik-Shir N. The Dynamics of Focus Structure. Cambridge University Press; 1997.


76. Ambridge B, Goldberg AE. The island status of clausal complements: evidence in favor of an information structure explanation. Cogn Linguist 2008, 19:349–381. doi:10.1515/COGL.2008.014.

77. Baker M. Lexical Categories: Nouns, Verbs and Adjectives. Cambridge University Press; 2003.

78. Hauser M, Chomsky N, Fitch T. The faculty of language: what is it, who has it, and how did it evolve? Science 2002, 298:1569–1579. doi:10.1126/science.298.5598.1569.

79. Gentner T, Fenn KM, Margoliash D, Nusbaum HC. Recursive syntactic pattern learning by songbirds. Nature 2006, 440:1204–1207. doi:10.1038/nature04675.

80. Pinker S, Jackendoff R. The faculty of language: what's special about it? Cognition 2005, 95:201–236. doi:10.1016/j.cognition.2004.08.004.

81. Kay P, Michaelis LA. Constructional meaning and compositionality. In: Maienborn C, von Heusinger K, Portner P, eds. Semantics: An International Handbook of Natural Language Meaning. Berlin: Mouton de Gruyter; Forthcoming.

82. Sag IA. Sign-Based Construction Grammar: An Informal Synopsis. Stanford University; 2007. Manuscript.

83. Langacker R. Foundations of Cognitive Grammar, Volume 1. Stanford University Press; 1987.

84. Langacker R. Foundations of Cognitive Grammar, Volume 2. Stanford University Press; 1991.

85. Steels L, De Beule J. A (very) brief introduction to fluid construction grammar. In: Proceedings of the 3rd International Workshop on Scalable Natural Language. Madison, WI: Omnipress Inc.; 2006, 73–80.

86. Feldman J. From Molecule to Metaphor. MIT Press; 2006.

87. Bergen BK, Chang N. Embodied construction grammar in simulation-based language understanding. In: Östman J-O, Fried M, eds. Construction Grammars: Cognitive Grounding and Theoretical Extensions. Amsterdam: John Benjamins; 2005. (Reprinted in: Evans V, Bergen B, Zinken J, eds. The Cognitive Linguistics Reader. London: Equinox; 2007.)

88. Stefanowitsch A, Gries ST. Collostructions: investigating the interaction of words and constructions. Int J Corpus Linguist 2003, 8.2:209–243.

89. Culicover P, Jackendoff R. Simpler Syntax. Oxford University Press; 2005.

90. Goldberg AE. Constructions: a new theoretical approach to language. Trends Cogn Sci 2003, 7:219–224. doi:10.1016/S1364-6613(03)00080-9.

91. Haiman J. Natural Syntax: Iconicity and Erosion. Cambridge University Press; 1985.

