
Computing Textual Inferences

Cleo Condoravdi

Palo Alto Research Center

Georgetown University

Halloween, 2008

Overview

Motivation

Local Textual Inference: textual inference initiatives and refinements

PARC’s BRIDGE system: XLE, Abstract Knowledge Representation (AKR)

Conceptual and temporal structure

Contextual structure and instantiability

Semantic relations: entailments and presuppositions, relative polarity

Entailment and Contradiction Detection (ECD)

(Interaction with external temporal reasoner)

Access to content: existential claims

What happened? Who did what to whom?

Microsoft managed to buy Powerset.

Microsoft acquired Powerset.

Shackleton failed to get to the South Pole.

Shackleton did not reach the South Pole.

The destruction of the file was not illegal.

The file was destroyed.

The destruction of the file was averted.

The file was not destroyed.

Access to content: monotonicity

What happened? Who did what to whom?

Every boy managed to buy a small toy.

Every small boy acquired a toy.

Every explorer failed to get to the South Pole.

No experienced explorer reached the South Pole.

No file was destroyed.

No sensitive file was destroyed.

The destruction of a sensitive file was averted.

A file was not destroyed.

Access to content: temporal domain

What happened when?

Ed visited us every day last week.

Ed visited us on Monday last week.

Ed has been living in Athens for 3 years.

Mary visited Athens in the last 2 years.

Mary visited Athens while Ed lived in Athens.

The deal lasted through August, until just before the government took over Freddie. (NYT, Oct. 5, 2008)

The government took over Freddie after August.

Grammatical analysis for access to content

Identify “Microsoft” as the buyer argument of the verb “buy”

Identify “Shackleton” as the traveler argument of the verb “get to”

Identify the lexical relation between “destroy” and “destruction”

Identify the syntactic relation between the verbal predication “destroy the file” and the nominal predication “destruction of the file”

Identify the infinitival clause as an argument of “manage” and “fail”

Identify the noun phrases that “every”, “a”, “no” combine with

Identify the phrases “every day”, “on Monday”, “last week” as modifiers of “visit”

Identify “has been living” as a present perfect progressive

Knowledge about words for access to content

The verb “acquire” is a hypernym of the verb “buy”

The verbs “get to” and “reach” are synonyms

Inferential properties of “manage”, “fail”, “avert”, “not”

Monotonicity properties of “every”, “a”, “no”, “not”

Restrictive behavior of adjectival modifiers “small”, “experienced”, “sensitive”

The type of temporal modifiers associated with prepositional phrases headed by “in”, “for”, “on”, or even nothing (e.g. “last week”, “every day”)

Toward NL Understanding

Local Textual Inference

A measure of understanding a text is the ability to make inferences based on the information conveyed by it. We can test understanding by asking questions about the text.

Veridicality reasoning: Did an event mentioned in the text actually occur?

Temporal reasoning: When did an event happen? How are events ordered in time?

Spatial reasoning: Where are entities located and along which paths do they move?

Causality reasoning: Enablement, causation, prevention relations between events

Local Textual Inference

PASCAL RTE Challenge (Ido Dagan, Oren Glickman) 2005, 2006

PREMISE, CONCLUSION, TRUE/FALSE judgment:

Rome is in Lazio province and Naples is in Campania.

Rome is located in Lazio province.

TRUE ( = entailed by the premise)

Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission.

George Bush is the president of the European commission.

FALSE (= not entailed by the premise)

PARC Entailment and Contradiction Detection (ECD)

Text: Kim hopped.
Hypothesis: Someone moved.
Answer: TRUE

Text: Sandy touched Kim.
Hypothesis: Sandy kissed Kim.
Answer: UNKNOWN

Text: Sandy kissed Kim.
Hypothesis: No one touched Kim.
Answer: NO

Text: Sandy didn’t wait to kiss Kim.
Hypothesis: Sandy kissed Kim.
Answer: AMBIGUOUS

Linguistic meaning vs. speaker meaning

Not a pre-theoretic but rather a theory-dependent distinction

Multiple readings: ambiguity of meaning? a single meaning plus pragmatic factors?

The diplomat talked to most victims.
The diplomat did not talk to all victims.
UNKNOWN / YES

You can have the cake or the fruit.
You can have the fruit.
YES / UNKNOWN (UNKNOWN on the reading signaled by the continuation “I don’t know which”)

World Knowledge

Romano Prodi will meet the US President George Bush in his capacity as the president of the European commission.

George Bush is the president of the European commission.

FALSE (= not entailed by the premise on the correct anaphoric resolution)

G. Karas will meet F. Rakas in his capacity as the president of the European commission.

F. Rakas is the president of the European commission.

TRUE (= entailed by the premise on one anaphoric resolution)

Recognizing textual entailments

Monotonicity calculus, polarity, semantic relations

Much of language-oriented reasoning is tied to specific words, word classes and grammatical features
Class 1: “fail”, “refuse”, “not”, …
Class 2: “manage”, “succeed”, …
Tenses, progressive -ing form, …

Representation and inferential properties of modifiers of different kinds
throughout July vs. in July
for the last three years vs. in the last three years
sleep three hours: duration
sleep three times: cardinality

XLE Pipeline

• Mostly symbolic system
• Ambiguity-enabled through packed representation of analyses
• Filtering of dispreferred/improbable analyses is possible

• OT marks
• mostly on c-/f-structure pairs, but also on c-structures
• on semantic representations for selectional preferences

• Statistical models
• PCFG-based pruning of the chart of possible c-structures
• Log-linear model that selects n-best c-/f-structure pairs

[Pipeline diagram: morphological analyses → c-structures → c-/f-structure pairs; c-structure OT marks and PCFG-based chart pruning apply at the c-structure stage, “general” OT marks and the log-linear model at the c-/f-structure stage]

Ambiguity is rampant in language

Alternatives multiply within and across layers…

[Diagram: analyses multiply at each layer: C-structure, F-structure, Linguistic Semantics, Abstract KR, KR]

What not to do

Use heuristics to prune as soon as possible

[Diagram: heuristic and statistical pruning (X) applied between the layers: C-structure, F-structure, Linguistic Semantics, Abstract KR, KR]

Oops: Strong constraints may reject the so-far-best (= only) option

Fast computation, wrong result


Manage ambiguity instead

The sheep liked the fish.
How many sheep? How many fish?

Options multiplied out:
The sheep-sg liked the fish-sg.
The sheep-pl liked the fish-sg.
The sheep-sg liked the fish-pl.
The sheep-pl liked the fish-pl.

Options packed:
The sheep {sg/pl} liked the fish {sg/pl}.

Packed representation:
– Encodes all dependencies without loss of information
– Common items represented, computed once
– Key to practical efficiency with broad-coverage grammars
(see the sketch below)
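A minimal Python sketch of the packing idea (a toy encoding of mine, not XLE’s actual data structures): each ambiguous feature is stored once as a choice, and full readings are enumerated only on demand.

from itertools import product

# Toy packed representation for "The sheep liked the fish": one choice
# per ambiguous word instead of four multiplied-out readings.
packed = {"sheep": ["sg", "pl"], "fish": ["sg", "pl"]}

def readings(packed):
    # Enumerate readings on demand; the common structure liked(sheep, fish)
    # is represented and computed only once.
    words, options = zip(*packed.items())
    for combo in product(*options):
        yield dict(zip(words, combo))

print(list(readings(packed)))  # 4 readings recovered from 2 stored choices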

System Overview

“A girl hopped.”

string → LFG parser → syntactic F-structure → rewrite rules → AKR (Abstract Knowledge Representation)

XLE Pipeline

Process → Output

Text-breaking → Delimited sentences
NE recognition → Type-marked entities (names, dates, etc.)
Morphological analysis → Word stems + features
LFG parsing → Functional representation
Semantic processing → Scope, predicate-argument structure
AKR rules → Abstract Knowledge Representation
Alignment → Aligned T-H concepts and contexts
Entailment and Contradiction Detection → YES / NO / UNKNOWN

XLE System Architecture: Text → AKR

Parse text to f-structures
Constituent structure

Represent syntactic/semantic features (e.g. tense, number)

Localize arguments (e.g. long-distance dependencies, control)

Rewrite f-structures to AKR clauses

Collapse syntactic alternations (e.g. active-passive)

Flatten embedded linguistic structure to clausal form

Map to concepts and roles in some ontology

Represent intensionality, scope, temporal relations

Capture commitments of existence/occurrence

AKR representation

[Diagram: an AKR is a collection of statements: concept terms mapped to WordNet synsets, thematic roles, instantiability facts, event times]

F-structures vs. AKR

Nested structure of f-structures vs. flat AKR

F-structures make syntactically, rather than conceptually, motivated distinctions
Syntactic distinctions are canonicalized away in AKR

Verbal predications and the corresponding nominalizations or deverbal adjectives with no essential meaning differences

Arguments and adjuncts map to roles

Distinctions of semantic importance are not encoded in f-structures
Word senses
Sentential modifiers can be scope taking (negation, modals, allegedly, predictably)
Tense vs. temporal reference

Nonfinite clauses have no tense but they do have temporal reference

Tense in embedded clauses can be past but temporal reference is to the future

F-Structure to AKR Mapping

Input: F-structures

Output: clausal, abstract KR

Mechanism: packed term rewriting
The rewriting system controls lookup of external ontologies via the Unified Lexicon
Compositionally-driven transformation to AKR

Transformations:
Map words to WordNet synsets
Canonicalize semantically equivalent but formally distinct representations
Make conceptual & intensional structure explicit
Represent the semantic contribution of particular constructions


Basic structure of AKR

Conceptual Structure
Predicate-argument structures
Sense disambiguation
Associating roles to arguments and modifiers

Contextual Structure
Clausal complements
Negation
Sentential modifiers

Temporal Structure
Representation of temporal expressions
Tense, aspect, temporal modifiers

[Diagram: ambiguity management with choice spaces, e.g. “girl with a telescope” vs. “seeing with a telescope”]

Conceptual Structure

Captures basic predicate-argument structures

Maps words to WordNet synsets

Assigns VerbNet roles

subconcept(talk:4,[talk-1,talk-2,speak-3,spill-5,spill_the_beans-1,lecture-1])
role(Actor,talk:4,Ed:1)
subconcept(Ed:1,[male-2])
alias(Ed:1,[Ed])
role(cardinality_restriction,Ed:1,sg)

Shared by “Ed talked”, “Ed did not talk” and “Bill will say that Ed talked.”
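As a toy illustration (my rendering, not the system’s output format), such statements can be held as plain tuples and queried by role:

# Conceptual-structure statements for "Ed talked" as Python tuples.
akr = [
    ("subconcept", "talk:4", ["talk-1", "talk-2", "speak-3",
                              "spill-5", "spill_the_beans-1", "lecture-1"]),
    ("role", "Actor", "talk:4", "Ed:1"),
    ("subconcept", "Ed:1", ["male-2"]),
    ("alias", "Ed:1", ["Ed"]),
    ("role", "cardinality_restriction", "Ed:1", "sg"),
]

def roles_of(term):
    # Collect the roles associated with a concept term.
    return [(fact[1], fact[3]) for fact in akr
            if fact[0] == "role" and fact[2] == term]

print(roles_of("talk:4"))  # [('Actor', 'Ed:1')]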

Temporal Structure

“Ed talked.” (shared with “Ed did not talk.”):
temporalRel(startsAfterEndingOf,Now,talk:6)

“Bill will say that Ed talked.”:
temporalRel(startsAfterEndingOf,say:6,Now)
temporalRel(startsAfterEndingOf,say:6,talk:21)

Matrix vs. embedded tense

Canonicalization in conceptual structure

“John took a tour of Europe.”

subconcept(tour:13,[tour-1])

role(Theme,tour:13,John:1)
role(Location,tour:13,Europe:21)

subconcept(Europe:21,[location-1])

alias(Europe:21,[Europe]) 

role(cardinality_restriction,Europe:21,sg)

subconcept(John:1,[male-2])

alias(John:1,[John])

role(cardinality_restriction,John:1,sg)

“John traveled around Europe.”

subconcept(travel:6,[travel-1,travel-2,travel-3,travel-4,travel-5,travel-6])

role(Theme,travel:6,John:1)

role(Location,travel:6,Europe:22)

subconcept(Europe:22,[location-1])

alias(Europe:22,[Europe])

role(cardinality_restriction,Europe:22,sg)

subconcept(John:1,[male-2])

alias(John:1,[John])

role(cardinality_restriction,John:1,sg)


Contextual Structure

context(t)

context(ctx(talk:29))

context(ctx(want:19))

top_context(t)

context_relation(t,ctx(want:19),crel(Topic,say:6))

context_relation(ctx(want:19),ctx(talk:29),crel(Theme,want:19))

Bill said that Ed wanted to talk.

Use of contexts enables flat representations

Contexts as arguments of embedding predicates
Contexts as scope markers

Concepts and Contexts

Concepts live outside of contexts.

Still we want to tie the information about concepts to the contexts they relate to.

Existential commitments

Did something happen?
e.g. Did Ed talk? Did Ed talk according to Bill?

Does something exist?
e.g. There is a cat in the yard. There is no cat in the yard.

Instantiability

An instantiability assertion of a concept-denoting term in a context implies the existence of an instance of that concept in that context.

An uninstantiability assertion of a concept-denoting term in a context implies there is no instance of that concept in that context.

If the denoted concept is of type event, then existence/nonexistence corresponds to truth or falsity.

Negation

Contextual structure:
context(t)
context(ctx(talk:12)) (new context triggered by negation)
context_relation(t, ctx(talk:12), not:8)
antiveridical(t,ctx(talk:12)) (interpretation of negation)

Local and lifted instantiability assertions:
instantiable(talk:12, ctx(talk:12))
uninstantiable(talk:12, t) (entailment of negation)

“Ed did not talk”

Relations between contexts

Generalized entailment: veridical
If c2 is veridical with respect to c1, the information in c2 is part of the information in c1
Lifting rule: instantiable(Sk, c2) => instantiable(Sk, c1)

Inconsistency: antiveridical
If c2 is antiveridical with respect to c1, the information in c2 is incompatible with the information in c1
Lifting rule: instantiable(Sk, c2) => uninstantiable(Sk, c1)

Consistency: averidical
If c2 is averidical with respect to c1, the information in c2 is compatible with the information in c1
No lifting rule between contexts
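A schematic Python rendering of the three cases (function and relation names are my own, not the BRIDGE system’s):

# Veridicality-based lifting of instantiability assertions.
# context_relations: (outer, inner, relation) triples, where relation is
# 'veridical', 'antiveridical' or 'averidical' (the last has no lifting rule).
def lift(instantiable, context_relations):
    instantiable = set(instantiable)   # (concept, context) facts
    uninstantiable = set()
    changed = True
    while changed:                     # iterate to a fixed point for nested contexts
        changed = False
        for c1, c2, rel in context_relations:
            for concept, ctx in list(instantiable):
                if ctx != c2:
                    continue
                if rel == "veridical" and (concept, c1) not in instantiable:
                    instantiable.add((concept, c1)); changed = True
                if rel == "antiveridical" and (concept, c1) not in uninstantiable:
                    uninstantiable.add((concept, c1)); changed = True
    return instantiable, uninstantiable

# "Ed did not talk": talk is instantiable in its own context, which is
# antiveridical with respect to the top context t.
inst, uninst = lift({("talk:12", "ctx(talk:12)")},
                    [("t", "ctx(talk:12)", "antiveridical")])
print(uninst)  # {('talk:12', 't')}, the entailment of negation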

Determinants of context relations

Relation depends on a complex interaction of:
Concepts
Lexical entailment class
Syntactic environment

Example:
1. He didn’t remember to close the window.
2. He doesn’t remember that he closed the window.
3. He doesn’t remember whether he closed the window.

“He closed the window.” is contradicted by 1, implied by 2, and consistent with 3.

Embedded clauses

The problem is to infer whether an embedded event is instantiable or uninstantiable at the top level.

It is surprising that there are no WMDs in Iraq.

It has been shown that there are no WMDs in Iraq.

==> There are no WMDs in Iraq.

Embedded examples in real text

From Google:

Song, Seoul's point man, did not forget to persuade the North Koreans to make a “strategic choice” of returning to the bargaining table...

Song persuaded the North Koreans…

The North Koreans made a “strategic choice”…

Semantic relations

Presupposition (factive verbs)
It is surprising that there are no WMDs in Iraq.
It is not surprising that there are no WMDs in Iraq.
Is it surprising that there are no WMDs in Iraq?
If it is surprising that there are no WMDs in Iraq, it is because we had good reasons to think otherwise.

Entailment (implicative verbs)
It has been shown that there are no WMDs in Iraq.
It has not been shown that there are no WMDs in Iraq.
Has it been shown that there are no WMDs in Iraq?
If it has been shown that there are no WMDs in Iraq, the war has turned out to be a mistake.

Factives

Class / Inference Pattern

Positive factive (++/-+): forget that
forget that X ⇝ X; not forget that X ⇝ X

Negative factive (+-/--): pretend that
pretend that X ⇝ not X; not pretend that X ⇝ not X

Implicatives

Class / Inference Pattern

Two-way implicatives:
++/-- manage to: manage to X ⇝ X; not manage to X ⇝ not X
+-/-+ fail to: fail to X ⇝ not X; not fail to X ⇝ X

One-way implicatives:
++ force to: force X to Y ⇝ Y
+- prevent from: prevent X from Ying ⇝ not Y
-- be able to: not be able to X ⇝ not X
-+ hesitate to: not hesitate to X ⇝ X

(encoded in the sketch below)

Implicatives under Factives

It is surprising that Bush dared to lie.
It is not surprising that Bush dared to lie.

Both entail: Bush lied.

Phrasal Implicatives

Verb + noun class, with implicative signature:

Have + ability noun (ability/means) = -- implicative
Have + chance noun (chance/opportunity) = -- implicative
Have + character noun (courage/nerve) = ++/-- implicative

Take + chance noun (chance/opportunity) = ++/-- implicative
Take + effort noun (trouble/initiative) = ++/-- implicative
Take + asset noun (money) = ++/-- implicative

Miss + chance noun (chance/opportunity) = +-/-+ implicative
Seize + chance noun (chance/opportunity) = ++/-- implicative

Use + chance noun (chance/opportunity) = ++/-- implicative
Use + asset noun (money) = ++/-- implicative

Waste + chance noun (chance/opportunity) = +-/-+ implicative
Waste + asset noun (money) = ++/-- implicative

Conditional verb classes

Joe had the chutzpah to steal the money. ⇝ Joe stole the money.

Two-way implicative with “character nouns” (gall, gumption, audacity, …)

Relative Polarity

Veridicality relations between contexts are determined on the basis of a recursive calculation of the relative polarity of a given “embedded” context

Globality: the polarity of any context depends on the sequence of potential polarity switches stretching back to the top context

Top-down: each complement-taking verb or other clausal modifier, based on its parent context’s polarity, either switches, preserves or simply sets the polarity of its embedded context

Example: polarity propagation

“Ed did not forget to force Dave to leave.”

“Dave left.”

[Tree diagram: “not” scopes over “forget”, which embeds “force”, which embeds “leave”. Polarity propagates top-down: top context +, “not” switches to -, “forget to” (+-/-+) switches back to + for “force”, and “force to” (++) preserves + for “leave”, so “Dave left.” is entailed]
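Under signatures like those given earlier, the propagation for this sentence is a fold over the chain of embedding operators (a toy encoding of mine, not the system’s graph-based computation):

# Top-down polarity propagation for
# "Ed did not forget to force Dave to leave."
SIG = {
    "not":       {"+": "-", "-": "+"},   # negation switches polarity
    "forget to": {"+": "-", "-": "+"},   # two-way implicative +-/-+
    "force to":  {"+": "+", "-": None},  # one-way implicative ++
}

def propagate(chain, polarity="+"):
    for operator in chain:
        polarity = SIG[operator][polarity]
        if polarity is None:   # no entailment survives this operator
            return None
    return polarity

# top(+) -> not(-) -> forget to(+) -> force to(+)
print(propagate(["not", "forget to", "force to"]))  # '+': "Dave left." entailed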

Summary of basic structure of AKR

Conceptual Structure
Terms representing types of individuals and events, linked to WordNet synonym sets by subconcept declarations.
Concepts typically have roles associated with them.
Ambiguity is encoded in a space of alternative choices.

Contextual Structure
t is the top-level context; some contexts are headed by an event term.
Clausal complements, negation and sentential modifiers also introduce contexts.
Contexts can be related in various ways, such as veridicality.
Instantiability declarations link concepts to contexts.

Temporal Structure
Locating events in time.
Temporal relations between events.

ECD

ECD operates on the AKRs of the passage and of the hypothesis

ECD operates on packed AKRs, hence no disambiguation is required for entailment and contradiction detection

If one analysis of the passage entails one analysis of the hypothesis and another analysis of the passage contradicts some other analysis of the hypothesis, the answer returned is AMBIGUOUS

Else: If one analysis of the passage entails one analysis of the hypothesis, the answer returned is YES

If one analysis of the passage contradicts one analysis of the hypothesis, the answer returned is NO

Else: The answer returned is UNKNOWN
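In outline (a Python paraphrase of the rules above; entails and contradicts stand in for the real packed-AKR checks):

from itertools import product

def ecd_verdict(passage_analyses, hypothesis_analyses, entails, contradicts):
    # Verdict over all pairs of analyses, per the rules above.
    pairs = list(product(passage_analyses, hypothesis_analyses))
    some_entailment = any(entails(p, h) for p, h in pairs)
    some_contradiction = any(contradicts(p, h) for p, h in pairs)
    if some_entailment and some_contradiction:
        return "AMBIGUOUS"
    if some_entailment:
        return "YES"
    if some_contradiction:
        return "NO"
    return "UNKNOWN"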

AKR (Abstract Knowledge Representation)

More specific entails less specific

How ECD works

Text: Kim hopped.
Hypothesis: Someone moved.

[Diagram: the text and hypothesis AKRs in context t are aligned, specificity is computed (“Kim hopped” is more specific than “someone moved”), and hypothesis facts entailed by text facts are eliminated]

Alignment and specificity computation

Text: Every boy saw a small cat.
Hypothesis: Every small boy saw a cat.

Monotonicity: Every (↓)(↑), Some (↑)(↑)

[Diagram: alignment, then specificity computation over the aligned text and hypothesis AKRs in context t]

Elimination of entailed terms

Text: Every boy saw a small cat.
Hypothesis: Every small boy saw a cat.

[Diagram: hypothesis terms entailed by aligned text terms are eliminated from context t, leaving nothing unmatched, so the hypothesis is entailed]

Contradiction: instantiable aligned with uninstantiable
(see the sketch below)
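A toy version of the monotonicity-sensitive specificity check behind the example above (“every” is downward monotone in its restrictor and upward monotone in its scope; more_specific is a stand-in for the AKR alignment machinery):

# Monotonicity-driven entailment check for a single quantified clause.
MONOTONICITY = {"every": ("down", "up"), "some": ("up", "up"), "no": ("down", "down")}

def subsumes(direction, text_term, hyp_term, more_specific):
    if text_term == hyp_term:
        return True
    if direction == "up":    # hypothesis may be LESS specific than the text
        return more_specific(text_term, hyp_term)
    else:                    # 'down': hypothesis may be MORE specific
        return more_specific(hyp_term, text_term)

def entails(quant, text_restr, text_scope, hyp_restr, hyp_scope, more_specific):
    restr_dir, scope_dir = MONOTONICITY[quant]
    return (subsumes(restr_dir, text_restr, hyp_restr, more_specific)
            and subsumes(scope_dir, text_scope, hyp_scope, more_specific))

more_specific = lambda a, b: (a, b) in {("small boy", "boy"), ("small cat", "cat")}
# T: Every boy saw a small cat.  H: Every small boy saw a cat.
print(entails("every", "boy", "small cat", "small boy", "cat", more_specific))  # True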

Stages of ECD

1. WordNet and alias alignment for (un)instantiable concepts in the conclusion
1a. Returns < = > depending on hypernym lists of terms
1b. Returns < = > depending on theory of names (assuming 1a matched)
2. Make extra top contexts for special cases, e.g. making the head of a question interrogative a top_context
3. Context alignment: any top context in the conclusion aligns with any top context in the premise; any non-top_context in the conclusion aligns with any non-top_context in the premise if their context_heads align in stage 1
4. paired_roles are saved (roles with the same role name in premise and conclusion on aligned concepts)

Stages of ECD (continued)

6. Unpaired roles in both premise and conclusion make concepts not align.
7. Cardinality restrictions on concepts are checked and modify alignment direction (including dropping inconsistent alignments).
8. Paired roles are checked to see how their value specificity affects alignment.
9. Temporal modifiers are used to modify alignment.
10. Instantiable concepts in the conclusion are removed if there is a more specific concept instantiable in an aligned context in the premise.
11. Conversely for uninstantiable concepts.
12. Contradiction is checked (instantiable in premise and uninstantiable in conclusion, and vice versa).

AKR modifications

[Diagram: AKR0 is rewritten into P-AKR and Q-AKR by three operations:
normalize, e.g. The situation improved. => The situation became better.
simplify, e.g. Kim managed to hop. => Kim hopped.
augment, e.g. Oswald killed Kennedy => Kennedy died.]

From temporal modifiers to temporal relations

Inventory of temporal relations: the Allen relations plus certain disjunctions thereof

Recognize the type of temporal modifier, e.g. bare modifiers, “in” PPs, “for” PPs
Ed visited us Monday/that week/every day.
Ed slept the last two hours.
Ed will arrive a day from/after tomorrow.

Represent the interval specified in the temporal modifier
Locate intervals designated by temporal expressions on the time axis
Determine qualitative relations among time intervals

Interpretation of temporal expressions

Compositional make-up determines qualitative relations
Relative ordering can be all a sentence specifies
Reference of calendrical expressions depends on the interpretation of tense

Two different computations:
Determine qualitative relations among time intervals
Locate intervals designated by temporal expressions on the time axis

Infer relations not explicitly mentioned in the text:
Some through simple transitive closure
Others require world/domain knowledge

Temporal modification under negation and quantification

Temporal modifiers affect monotonicity-based inferences

Everyone arrived in the first week of July 2000.
Everyone arrived in July 2000.
YES

No one arrived in July 2000.
No one arrived in the first week of July 2000.
YES

Everyone stayed throughout the concert.
Everyone stayed throughout the first part of the concert.
YES

No one stayed throughout the concert.
No one stayed throughout the first part of the concert.
UNKNOWN

Quantified modifiers and monotonicity

Many inference patterns do not depend on calendrical anchoring but on basic monotonicity properties

Monotonicity-based inferences depend on implicit dependencies being represented

Last year, in September, he visited us every day.
Last year he visited us every day.
UNKNOWN

Last year he visited us every day.
Last year he visited us every day in September.
YES

Every boy bought a toy from Ed.
Every boy bought a toy.
YES

Every boy bought a toy.
Every boy bought a toy from Ed.
UNKNOWN

Allen Interval Relations

Relation (inverse) / Interpretation

X < Y (Y > X): X takes place before Y
X m Y (Y mi X): X meets Y (i stands for inverse)
X o Y (Y oi X): X overlaps Y
X s Y (Y si X): X starts Y
X d Y (Y di X): X during Y
X f Y (Y fi X): X finishes Y
X = Y: X is equal to Y (X is cotemporal with Y)
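The base relations can be computed directly from interval endpoints; a sketch (inverses are obtained by swapping the arguments):

# Allen relation between two intervals given as (start, end) pairs,
# transcribing the table above; only the base relations are returned.
def allen(x, y):
    xs, xe = x
    ys, ye = y
    if xe < ys:                  return "<"  # X before Y
    if xe == ys:                 return "m"  # X meets Y
    if xs == ys and xe == ye:    return "="  # X equal to Y
    if xs == ys and xe < ye:     return "s"  # X starts Y
    if xe == ye and xs > ys:     return "f"  # X finishes Y
    if ys < xs and xe < ye:      return "d"  # X during Y
    if xs < ys and ys < xe < ye: return "o"  # X overlaps Y
    return "inverse"   # >, mi, oi, si, di or fi: compute allen(y, x)

print(allen((0, 2), (3, 5)))  # '<'
print(allen((0, 3), (3, 5)))  # 'm'
print(allen((1, 4), (1, 6)))  # 's'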

Qualitative relations of intervals and events

[Diagram: an interval with a left and a right boundary, relative to NOW; an event may hold throughout the interval or fall within it]

Determining the relevant interval

Determining the relation between interval and event

Taking negation and quantification into consideration

From language to qualitative relations of intervals and events

Ed has been living in Athens for 3 years.
Mary visited Athens in the last 2 years.
Mary visited Athens while Ed lived in Athens.

[Diagram: Ed’s living in Athens holds throughout an interval whose right boundary is NOW; Mary’s visit to Athens falls within that interval, after its left boundary]

From English to AKR

Ed has been living in Athens for 3 years.
trole(duration,extended_now:13,interval_size(3,year:17))
trole(when,extended_now:13,interval(finalOverlap,Now))
trole(when,live:3,interval(includes,extended_now:13))

Mary visited Athens in the last 2 years.
trole(duration,extended_now:10,interval_size(2,year:11))
trole(when,extended_now:10,interval(finalOverlap,Now))
trole(when,visit:2,interval(included_in,extended_now:10))

Mary visited Athens while Ed lived in Athens.
trole(ev_when,live:22,interval(includes,visit:6))
trole(ev_when,visit:6,interval(included_in,live:22))


Distributed modifiers

Multiple temporal modifiers are dependent on one another
Implicit dependencies are made explicit in the representation (see the closure sketch below)

Ed visited us in July, 1991.
trole(when,visit:1,interval(included_in,date:month(7):18))
trole(subinterval,date:month(7):18,date:year(1991):18)

In 1991 Ed visited us in July.
trole(when,visit:12,interval(included_in,date:month(7):26))
trole(subinterval,date:month(7):26,date:year(1991):4)

In 1991 Ed visited us in July every week.
trole(when,visit:12,interval(included_in,week:37))
trole(subinterval,week:37,date:month(7):26)
trole(subinterval,date:month(7):26,date:year(1991):4)
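The subinterval chains above license inferences such as “the visits were in 1991” by the simple transitive closure mentioned earlier; a sketch:

# Transitive closure over inclusion facts (a, b) meaning "a is included in b".
def transitive_closure(included_in):
    closure = set(included_in)
    while True:
        derived = {(a, c) for a, b1 in closure for b2, c in closure if b1 == b2}
        if derived <= closure:
            return closure
        closure |= derived

facts = {("visit:12", "week:37"),
         ("week:37", "month(7):26"),
         ("month(7):26", "year(1991):4")}
print(("visit:12", "year(1991):4") in transitive_closure(facts))  # True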

Associating time points with event descriptions

Trilobites: 540m‐251m years ago

Ammonites: 400m‐65m years ago

1. There were trilobites before there were ammonites. TRUE
2. There were ammonites before there were trilobites. FALSE
3. There were trilobites after there were ammonites. TRUE
4. There were ammonites after there were trilobites. TRUE

Associating time points with event descriptions

1. Ed felt better before every injection was administered to him. (ordering with respect to the last injection)
2. Ed felt better after every injection was administered to him. (ordering with respect to the last injection)
3. Ed felt better before most injections were administered to him. (ordering with respect to the first injection to tip the balance)
4. Ed felt better after most injections were administered to him. (ordering with respect to the first injection to tip the balance)

How “before” and “after” order

In a modifier of the form before S or after S, we need to derive from S a temporal value to pass on to the preposition.

The default operation takes the end of the earliest interval when S is true.

The temporal asymmetry of this operation produces the appearance of after and before being non-inverses.
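A toy model of this operation (my simplification: each state holds over a single interval, so the anchor is in effect the state’s inception), reproducing the trilobite/ammonite judgments given earlier:

# "before S" / "after S" anchor to the end of the earliest (minimal)
# interval at which S is true; for a state holding over (start, end),
# that anchor is effectively the start.
def anchor(s_interval):
    return s_interval[0]

def held_before(x_interval, s_interval):
    return x_interval[0] < anchor(s_interval)   # X began before S began

def held_after(x_interval, s_interval):
    return x_interval[1] > anchor(s_interval)   # X still held after S began

# Trilobites: 540m-251m years ago; ammonites: 400m-65m years ago.
trilobites, ammonites = (-540, -251), (-400, -65)
print(held_before(trilobites, ammonites))  # True  (1)
print(held_before(ammonites, trilobites))  # False (2)
print(held_after(trilobites, ammonites))   # True  (3)
print(held_after(ammonites, trilobites))   # True  (4)

Comparing against the inception of S in both directions is exactly the asymmetry that makes “before” and “after” come out as non-inverses.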

TimeBank and TimeML

A corpus of 183 news articles annotated by hand with temporal information following the TimeML specification

Events, times and temporal links between them are identified and texts are appropriately annotated

TimeML represents temporal information using four primary tag types:
TIMEX3 for temporal expressions
EVENT for temporal events
SIGNAL for temporal signals
LINK for representing relationships

The semantics of TimeML is an open issue

TimeML and AKR

“Privately, authorities say(e74) Rudolph has become(e76) a focus of their investigation(e77).”

Human-friendly TimeML:
creation_time: t92
Tlinks: l45 say(e74, ei2046) includes 19980227(t92); l46 investigation(e77, ei2048) includes become(e76, ei2047)
Slinks: l62 say(e74, ei2046) evidential become(e76, ei2047)
Alinks: none

AKR excerpt:
event(say:8)
event(become:13)
event(investigate:38)
trole(when,become:13,interval(before,Now))
trole(ev_when,become:13,interval(included_in,investigate:38))
trole(when,say:8,interval(includes,Now))

TimeML-AKR match

<EVENT eid="e76" class="OCCURRENCE">become</EVENT> <MAKEINSTANCE eventID="e76" eiid="ei2047" tense="PRESENT" aspect="PERFECTIVE" polarity="POS" pos="VERB" />

event(become:13) trole(when,become:13,interval(before,Now))

EVENT eid="e77" class="OCCURRENCE">investigation</EVENT> <MAKEINSTANCE eventID="e76" eiid="ei2047" tense="PRESENT" aspect="PERFECTIVE" polarity="POS" pos="VERB" />

event(investigate:38)

<TLINK lid="l46" relType ="INCLUDES " eventInstanceID="ei2048" relatedToEvent Instance="ei2047" />

trole(ev_when,become:13, interval(included_in,investigate:38))

<EVENT eid="e74" class="REPORTING ">say</EVENT> <MAKEINSTANCE eventID="e74" eiid="ei2046" tense="PRESENT" aspect="NONE" polarity="POS" pos="VERB" />

event(say:8) trole(when,say:8,interval(includes,Now))

Credits for the Bridge System

NLTT (Natural Language Theory and Technology) group at PARC:
Daniel Bobrow, Bob Cheslow, Cleo Condoravdi, Dick Crouch*, Ronald Kaplan*, Lauri Karttunen, Tracy King*, John Maxwell, Valeria de Paiva†, Annie Zaenen
(* = now at Powerset; † = now at Cuil)

Interns: Rowan Nairn, Matt Paden, Karl Pichotta, Lucas Champollion

Thank you
