
Mixed computations for mixed phrase structure

Theoretical and empirical perspectives

Diego Gabriel Krivochen University of Reading, CINN [email protected]

Structure building in the Minimalist Program

• “In its most elementary form, a generative system is based on an operation that takes structures already formed and combines them into a new structure. Call it Merge. (…) Suppose X and Y are merged. Evidently, efficient computation will leave X and Y unchanged (the No-Tampering Condition NTC).” Chomsky (2007: 3, 5)

• Merge “…takes objects X, Y already constructed and forms a new object Z.” (Chomsky, 2013: 40).

• Merge takes two syntactic objects and combines them into a single syntactic object. […] This is the basic structure building operation of syntax. (Collins & Stabler, 2016: 47)

Merge(X, Y) = {X, Y}
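As a first pass, the operation in the quotes above can be rendered as a two-place, set-forming function. A minimal Python sketch follows; representing syntactic objects as strings and frozensets is my assumption, not part of the quoted definitions.

# A minimal sketch of binary Merge as set formation, assuming syntactic
# objects are represented as strings (lexical items) or frozensets
# (previously built objects).

def merge(x, y):
    """Take two already-formed objects X, Y and return the new object
    Z = {X, Y}; X and Y themselves are left unchanged (No-Tampering)."""
    return frozenset({x, y})

# Building {read, {the, book}} bottom-up:
dp = merge("the", "book")
vp = merge("read", dp)
print(vp)   # e.g. frozenset({'read', frozenset({'the', 'book'})})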

Binarity Constraint: the Antisymmetry agenda, and Labeling

• Kayne’s (1994) LCA

• Di Sciullo’s (2014) I-Morphology

• Boeckx (2014)

• Collins & Stabler (2016)

• …and many more.

A bit of history:

The basic recursive devices in the grammar are the generalized transformations that produce a string from a pair of underlying strings.

(Chomsky and Miller 1963: 304)

In their own terms…

• ‘The crucial fact about Merge – the “almost true generalization” about Merge for language – is that it is a head plus an XP. That is virtually everything.’ (Chomsky, 2009: 52)

• ‘phrase structure (…) always completely determines linear order […]’ (Kayne, 1994: 3); ‘Linear Correspondence Axiom: d(A) is a linear ordering of T.’ [A the set of ordered pairs of non-terminals ⟨X, Y⟩ such that X asymmetrically c-commands Y; T the set of terminals; d the mapping from a non-terminal to the terminals it dominates] (Kayne, 1994: 6)

• Vehicle Requirement on Merge (VRM): If α and β merge, some feature F of α must probe F on β (Pesetsky and Torrego, 2007)

• I adopt the basic idea that Merge is licensed under Agree, but I follow Chomsky (2000) and Collins (2002) in that Merge does require feature satisfaction, which I assume is feature valuation as stated in the Merge Condition: […] Merge α and β if α can value a feature of β. (Wurmbrand, 2014: 130)

Note: recall that Agree is necessarily a relation between two syntactic objects at a time.
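To make the binarity point concrete, here is a toy Python rendering of Merge licensed by feature valuation, in the spirit of the Merge Condition quoted above. The lexical items and feature names below are invented for illustration; they are not drawn from the sources cited.

# A toy rendering of 'Merge alpha and beta if alpha can value a feature
# of beta'. Items and features are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class SO:                                        # a (simplified) syntactic object
    label: str
    valued: dict = field(default_factory=dict)   # e.g. {'phi': '3sg'}
    unvalued: set = field(default_factory=set)   # e.g. {'phi'}

def can_value(alpha, beta):
    """alpha values a feature F of beta iff beta bears an unvalued F and
    alpha carries a value for F: Agree holds of exactly two objects."""
    return any(f in alpha.valued for f in beta.unvalued)

def merge(alpha, beta):
    if not can_value(alpha, beta):
        raise ValueError("Merge Condition not met")
    return frozenset({alpha.label, beta.label})

T  = SO("T", valued={"tense": "past"}, unvalued={"phi"})
DP = SO("DP", valued={"phi": "3sg"})
print(merge(DP, T))   # e.g. frozenset({'DP', 'T'})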

Phrase Structure: Divide and…Conquer?

Major assumptions of Mainstream Generative Grammar (Jackendoff, 2011: 275):

• The organization of syntactic structure is to be characterized in terms of ordered derivations that put pieces together one after another. That is, the grammar is conceived of as derivational or proof-theoretic […].

• Semantics is strictly locally compositional (or Fregean): the meanings of sentences are built up word by word, and the combination of word and phrase meanings is dictated by syntactic configuration.

To which we can add…

• Linear order is also dictated by syntactic configuration, following asymmetric c-command paths
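For concreteness, here is a toy linearization via asymmetric c-command, roughly along the lines of Kayne's LCA. The example tree, the node names, and the simplified definitions of dominance and c-command are my own assumptions, not Kayne's exact formulation.

# A toy linearization via asymmetric c-command (simplified LCA).
from itertools import product

# Nonterminal -> immediate constituents; lowercase symbols are terminals.
TREE = {"K": ("J", "L"), "J": ("j",), "L": ("M", "N"),
        "M": ("m",), "N": ("P",), "P": ("p",)}
NONTERMINALS = set(TREE)

def children(x):
    return TREE.get(x, ())

def dominates(x, y):
    return any(c == y or dominates(c, y) for c in children(x))

def c_commands(x, y):
    """Simplified: x c-commands y iff neither dominates the other and
    every nonterminal dominating x also dominates y."""
    if x == y or dominates(x, y) or dominates(y, x):
        return False
    return all(dominates(z, y) for z in NONTERMINALS if dominates(z, x))

def d(x):
    """Set of terminals dominated by the nonterminal x."""
    out = set()
    for c in children(x):
        out |= d(c) if c in NONTERMINALS else {c}
    return out

# A: pairs <X, Y> of nonterminals with X asymmetrically c-commanding Y.
A = {(x, y) for x, y in product(NONTERMINALS, repeat=2)
     if c_commands(x, y) and not c_commands(y, x)}

# d(A): the induced precedence relation over terminals -> j < m < p.
print(sorted({(a, b) for x, y in A for a in d(x) for b in d(y)}))

In this toy tree the asymmetric pairs yield the total order j < m < p; had N dominated a terminal directly instead of P, that terminal and m would remain unordered, which is the kind of configuration the LCA is designed to exclude.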

Problems: non-monotonic dependencies

• Auxiliary Chains in Spanish (Bravo et al., 2015)

Predicted pattern of modification:

a) All auxiliaries c-command the VP (‘Va a haber sido asesinado’: ‘will have been murdered’)

b) Structure building is monotonic

But…we also get:

[Aux 1 [Aux 2 [Aux 3 [Lexical Verb]]]]

Crossing (context-sensitive) dependency (‘Ha tenido que ser ayudado’: ‘has had to be helped’)

Auxiliaries as arguments in a finite-state sequence (‘solía poder empezar a trabajar’: ‘used to be able to start working’)

Different orders would be derived via Head-to-Head movement…but scope relations are more complex!
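The patterns above line up with different levels of the Chomsky hierarchy. A rough illustration with schematic stand-in languages follows; the recognizers are my own toys, not an analysis of the Spanish strings.

# Three dependency types, with toy formal languages as stand-ins.
import re

def finite_state(s):
    """Pure iteration, e.g. aux* v: recognizable by a regular expression
    or Markov-style device with no memory of how many auxiliaries occurred."""
    return re.fullmatch(r"(aux )*v", s) is not None

def nested(s):
    """Nested (context-free) dependencies, schematically a^n b^n:
    a single stack or counter suffices."""
    t = s.split()
    n = len(t) // 2
    return len(t) == 2 * n and t == ["a"] * n + ["b"] * n

def crossing(s):
    """Crossing dependencies, schematically a^n b^m c^n d^m: the two
    counts interleave, so the language goes beyond context-free."""
    t = s.split()
    counts = {x: t.count(x) for x in ("a", "b", "c", "d")}
    blocks = (["a"] * counts["a"] + ["b"] * counts["b"]
              + ["c"] * counts["c"] + ["d"] * counts["d"])
    return t == blocks and counts["a"] == counts["c"] and counts["b"] == counts["d"]

print(finite_state("aux aux aux v"))                 # True
print(nested("a a b b"), crossing("a a b c c d"))    # True True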

Coordination and phrase structure (see what I did there?)

Two Latin examples (Krivochen and Schmerling, 2015):

• Perdiderint cum me duo crimina, carmen et error (Ov. Tristia II, 207)
  ruin.3Pl.PastPerf with me two crime.Pl, poem and error

• effodiuntur opes, inritamenta malorum,
  arise.3Pl.PastPerf.Impers wealth.Pl, incitements of-the-bad
  iamque nocens ferrum ferroque nocentius aurum
  and-now harmful iron and iron.Dat more-harmful gold
  prodierat […] (Ov. Met. I, 140–142)
  come-forth.3Sg.PastPerf

• The sudden rise and (the) equally sudden fall of the stock market have economists worried.

• The sudden rise and (the) equally sudden fall of the stock market has economists worried.

• One more can of beer and I’m leaving. (Culicover, 2013)

• A thousand cans of beer or I’m leaving. (Op. Cit.)

• That Mary will in fact agree to go out with me and that we will have a nice time and that ultimately she will be my girlfriend is/*?are a vile lie.

• His and his wife's death is terribly sad.

Can we represent this variety within a strictly binary, static model of phrase structure? (Hint: no, at least if we want to avoid stipulations over elements and operations)

…and, even if we could, would it be desirable? (Hint: also no. It’d be Procrustean at best.)

In their own terms…(again)

• ‘a constituent-structure grammar necessarily imposes too rich an analysis on sentences because of features inherent in the way P-markers are defined for such sentences.’ (Chomsky, 1963: 298)

So…any ideas?

• ‘what we need should be, as it were, ‘dynamic flatness’. But this is the sort of concept that sounds incomprehensible in a classical computational view, while making sense to those for whom syntactic computations are psychologically real’ (Lasnik and Uriagereka, 2012: 21)

Such a proposal has not, to our knowledge, actually been implemented within MGG. Can we do something about it?

Expanding on ‘dynamical flatness’

• A structure-building operation manipulates n objects of arbitrary complexity at once, and is blind to their inner characteristics (a minimal sketch is given after this list)

• An object of arbitrary complexity can remain ‘unlabeled’ prior to semantic interpretation, with no specification for category, case, or theta-role (what is √water without a context?)

• Labels arise from the dynamics of the system; they are neither created by some syntactic algorithm, nor are they substantive objects drawn from a set provided by UG (contra Adger, 2013). A label is the way of encoding diacritically how an object is to be interpreted for the purposes of further computations.

• Phrase structure dependencies are not uniform: when we change the kind of computational dependency among elements, we define cycles.
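A minimal Python sketch of such an n-ary, label-free operation, under the assumption (mine) that unlabeled objects can be modelled as unordered sets of arbitrary arity:

# A toy n-ary, label-free structure-building operation ('dynamic flatness').

def merge(*objects):
    """Combine any number of objects of arbitrary complexity into a single
    unordered, unlabeled object; the operation never inspects or alters
    the internal make-up of its inputs."""
    return frozenset(objects)

# Binary, hierarchical combination is the special case n = 2...
nested = merge("read", merge("the", "book"))

# ...but flat, n-ary combination is equally available, e.g. for an
# iterative auxiliary sequence or an n-ary coordination:
flat_aux   = merge("solia", "poder", "empezar", "a", "trabajar")
flat_coord = merge("carmen", "et", "error")

print(nested, flat_aux, flat_coord, sep="\n")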

More on ‘syntax’: what it is and what it does

• Syntax is guided by Semantics: a substring S is assigned the computationally simplest structural description that captures the semantic relations among its members, as in the Aux Chains examples.

(i.e., if pure iteration can be modelled by a Markov chain, why go up the hierarchy? A toy chain is sketched after this list.)

• Syntax is not an autonomous component, nor are Semantics and Morphophonology passive receivers of transferred trees: even if we wanted to keep all three separate for the purposes of the model, there are computational advantages in having the ‘interfaces’ access the syntactic workspace and co-opt domains (call them ‘cycles’).

(i.e., cycles are minimally interpretable, maximally generable units)

• Cycles are minimally interpretable units because nothing smaller than a cycle can be assigned an interpretation in terms of sortality, eventivity, or relations between them.

• And they are maximally generable because, once a critical value for the manifold being generated is reached, a dimensional attractor fades due to the inherent limitations of the physical system we’re dealing with, and no further structure can be built up in that derivational current (Saddy, 2016; Krivochen, 2016). A system with finite resources cannot hold on to a transferable object.

• Cycles’ sizes are variable and cannot be defined a priori (contra phases, barriers, …), because they are delimited by changes in computational dependencies.
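A toy illustration of the Markov-chain point above: a first-order chain that generates auxiliary-style sequences with no hierarchy at all. The states and transition probabilities are invented for illustration, not estimated from Spanish data.

# A first-order Markov chain generating auxiliary-style sequences: pure
# iteration, no stack, no counters, no hierarchy. States and weights are
# illustrative assumptions.

import random

TRANSITIONS = {
    "START":     [("solia", 0.5), ("podia", 0.5)],
    "solia":     [("poder", 0.6), ("empezar_a", 0.4)],
    "podia":     [("empezar_a", 1.0)],
    "poder":     [("empezar_a", 0.7), ("trabajar", 0.3)],
    "empezar_a": [("trabajar", 1.0)],
    "trabajar":  [("END", 1.0)],
}

def generate(state="START"):
    """Walk the chain from START to END, emitting one word per transition."""
    out = []
    while True:
        options, weights = zip(*TRANSITIONS[state])
        state = random.choices(options, weights=weights)[0]
        if state == "END":
            return " ".join(out)
        out.append(state)

print(generate())   # e.g. 'solia poder empezar_a trabajar'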

• All of this ‘dynamical phrase structure’ thing is much more fun when we consider structure in cognition as manifold creation and manipulation, and ‘syntax’ as a set of topological transformations over well-defined spaces with specific properties…

• …it is also necessary to acknowledge the limitations of models and the fact that models are not objects (e.g., Bohr’s model is not an atom; trees are not sentences or their structures)

• A theory based on the physical properties of the substratum and the topological properties of constructs is:

• More explicit (because we are not dealing with metaphors)

• More dynamic (because we model language as a nonlinear system in real time)

• Closer to neurocognitive research (e.g., Neural Field Theory; Neurophysics in general…)

than most of what we have now.

Physics of Language: a teaser trailer

References

Adger, David (2013) A Syntax of Substance. Cambridge, Mass.: MIT Press.

Boeckx, Cedric (2014) Elementary Syntactic Structures. Oxford: OUP

Bravo, Ana, Luis García Fernández, and Diego Krivochen (2015) On Auxiliary Chains: Auxiliaries at the Syntax-Semantics Interface. Borealis. 4(2). 71-101.

Chomsky, Noam (1963) Formal Properties of Grammars. In R. D. Luce, R. R. Bush, and E. Galanter (eds.), Handbook of Mathematical Psychology 2. New York: John Wiley & Sons. 323–418.

(2007) Approaching UG from Below. In Uli Sauerland and Hans-Martin Gärtner (eds.), Interfaces + Recursion = Language? Chomsky’s Minimalism and the View from Syntax-Semantics. Berlin: Mouton de Gruyter. 1-29.

(2009) Opening Remarks. In Massimo Piattelli-Palmarini, Juan Uriagereka, and Pello Salaburu (eds.), Of Minds and Language. Oxford: OUP. 13-43.

(2013) Problems of Projection. Lingua 130 (special issue on syntax and cognition: core ideas and results in syntax). 33–49.

Chomsky, Noam and George Miller (1963) Introduction to the Formal Analysis of Natural Languages. In R. Duncan Luce, Robert R. Bush, and Eugene Galanter (eds.), Handbook of Mathematical Psychology 2. New York: John Wiley & Sons. 269–321.

Collins, Chris and Edward Stabler (2016) A formalization of Minimalist syntax. Syntax 19(1). 43-78.

Culicover, Peter W. (2013) OM-sentences. In Explaining syntax. New York: Oxford University Press. 15–52.

Di Sciullo, Anna-Maria (2014) Minimalism and I-Morphology. In Kosta, P. et al. (eds.), Minimalism and Beyond: Radicalizing the Interfaces. Amsterdam: John Benjamins. 267-286.

Jackendoff, Ray (2011) Alternative minimalist visions of language. In R. Borsley & K. Börjars (eds.), Non-transformational syntax. Formal and explicit models of grammar. London: Blackwell. 268–296.

Kayne, Richard (1994) The Antisymmetry of Syntax. Cambridge, Mass.: MIT Press.

Krivochen, Diego (2015) On Phrase Structure building and Labeling algorithms: towards a non-uniform theory of syntactic structures. The Linguistic Review 32(3). 515-572.

(2016) Aspects of Emergent Cyclicity in Language, Physics, and Computation. PhD Thesis, University of Reading.

Krivochen, Diego and Susan Schmerling (2015) Two kinds of coordination and their theoretical implications: An Aspect-Based Approach. Ms. Submitted.

Lasnik, Howard and Juan Uriagereka (2012) Structure. In R. Kempson, T. Fernando, and N. Asher (eds.), Handbook of Philosophy of Science, Volume 14: Philosophy of Linguistics. Elsevier. 33–61.

Pesetsky, David & Esther Torrego (2007) The syntax of valuation and the interpretability of features. In S. Karimi, V. Samiian, and W. Wilkins (eds) Phrasal and Clausal Architecture. Amsterdam: John Benjamins. 262–294.

Saddy, James Douglas (2016) Syntax and Uncertainty. Ms. University of Reading.

Wurmbrand, Susi (2014) The Merge Condition: A syntactic approach to selection. In Kosta, P. et al. (eds.), Minimalism and Beyond: Radicalizing the Interfaces. Amsterdam: John Benjamins. 130-167.