Computational Semantics


  • Computational Semantics
    Torbjörn Lager
    Department of Linguistics
    Uppsala University

  • Computational Semantics
    Compositional, logical semantics*
    Computational lexical semantics
      Word sense disambiguation*
      Text categorization
      Information extraction
      Information retrieval

  • Logical Semantics Example
    John laughed        laughed'(j)

    Nobody laughed      ¬∃x[laughed'(x)]

    But this is just translation! What's semantic about that?

  • What Is the Name of This Business?
    Truth conditional semantics
    Model theoretic semantics
    Logical semantics
    Formal semantics
    Compositional semantics
    Syntax-driven semantic analysis

    Compositional, logical, truth conditional, model theoretic semantics ....

  • An Important Tradition
    We use language to talk about the world
    Semantics is something that relates sentences (or utterances) of language and the outside world
    There are other ideas about meaning, but in this tradition we don't believe in them!

    (diagram: Natural language ↔ The outside world)

  • Truth Conditional Semantics
    Meaning = Truth conditions
    Examples:
      "John whistles" is true iff John whistles
      "John visslar" is true iff John whistles
      "Ogul fautu seq" is true iff ...
    (diagram: Natural language ↔ The outside world)

  • Model Theoretic Semantics
    We don't know what the world is really like, so let's talk about a model of the world instead
    Such a model (usually) consists of individuals, sets of individuals, functions and relations, i.e. the sort of things set theory talks about
    Truth becomes truth relative to a model
    (diagram: Natural language → Model → The outside world)

  • Compositional Semantics
    The Compositionality Principle:
      The meaning of the whole is a function of the meaning of the parts and the mode of combining them.
      The meaning of a complex expression is uniquely determined by the meaning of its constituents and the syntactic construction used to combine them.
    (diagram: Natural language → Model → The World)

  • Truth Conditional, Model Theoretic and Compositional Semantics Combined
    A simple model M:
      Domain: {John, Mary, Paul}
      Interpretation:
        Names: "John" refers to John, "Mary" refers to Mary, etc.
        Verbs: "whistles" refers to {John, Paul}

    Example:
      "John whistles" is true in M iff the individual in M referred to as "John" is an element of the set of individuals that "whistles" refers to.

    Richard Montague (1970): "I reject the contention that an important theoretical difference exists between formal and natural languages"

  • Translational Semantics
    Account for the meanings of natural language utterances by translating them into another language.
    It could be any language, but we are only done if that language has a formal semantics.
    (diagram: Natural language → Logical Form Language → Model → The World)

  • Why Logical Semantics?
    Account for ambiguity
      "every man loves a woman"
    Allow evaluation
      e.g. by database lookup
    Allow logical inference
      Every man who whistles is happy
      John is a man
      John whistles
      Therefore: John is happy
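
    For concreteness (standard scope readings, not spelled out on the original slide), the ambiguity of "every man loves a woman" comes out as two distinct logical forms:

        ∀x[man'(x) → ∃y[woman'(y) ∧ loves'(x,y)]]    (each man loves some woman or other)
        ∃y[woman'(y) ∧ ∀x[man'(x) → loves'(x,y)]]    (one particular woman is loved by every man)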

  • Applications of Logical Semantics
    NLU systems
      Semantics + 'world knowledge' --> 'understanding'
    Information Extraction
    Machine translation
      LF as (part of an) interlingua
    Dialogue Systems

  • Grammar and Logical Form
    S  -> NP VP       [S]  = [VP]([NP])
    NP -> john        [NP] = j
    VP -> whistles    [VP] = λx[whistles'(x)]

    [john whistles] = whistles'(j)
    cf. the principle of compositionality
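
    The rule-to-rule setup above can be mimicked directly with functions of a programming language. Here is a tiny Python sketch (the variable names are ad hoc) in which Python's own lambdas stand in for the lambda terms:

        # Semantic values of the lexical rules
        NP_john = "j"                                # [NP] = j
        VP_whistles = lambda x: f"whistles'({x})"    # [VP] = λx[whistles'(x)]

        # Syntactic rule S -> NP VP with semantics [S] = [VP]([NP])
        S = VP_whistles(NP_john)
        print(S)    # whistles'(j)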

  • Beta Reduction (Lambda Conversion)
    [S]  = [VP]([NP])
    [NP] = j
    [VP] = λx[whistles'(x)]

    Beta reduction rule: λu[φ](ψ) ⇒ φ, where every occurrence of u in φ is replaced by ψ

    λx[whistles'(x)](j)     application
    whistles'(j)            reduction
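
    For readers who want to see beta reduction run, here is a small self-contained Python sketch of the rule; the term representation (Var/Const/Lam/App) is invented for illustration, and substitution is naive (it assumes all variable names are distinct, so it ignores variable capture):

        from dataclasses import dataclass

        @dataclass
        class Var:
            name: str                 # variable, e.g. x

        @dataclass
        class Const:
            name: str                 # constant, e.g. j, whistles'

        @dataclass
        class Lam:
            var: str                  # bound variable
            body: object              # λvar[body]

        @dataclass
        class App:
            fun: object               # function part
            arg: object               # argument part: fun(arg)

        def subst(t, u, psi):
            """Replace every occurrence of variable u in t by psi (naive, capture-ignoring)."""
            if isinstance(t, Var):
                return psi if t.name == u else t
            if isinstance(t, Const):
                return t
            if isinstance(t, Lam):
                return t if t.var == u else Lam(t.var, subst(t.body, u, psi))
            return App(subst(t.fun, u, psi), subst(t.arg, u, psi))

        def beta_reduce(t):
            """Apply λu[φ](ψ) ⇒ φ[u := ψ] until no redex is left."""
            if isinstance(t, App):
                fun, arg = beta_reduce(t.fun), beta_reduce(t.arg)
                if isinstance(fun, Lam):
                    return beta_reduce(subst(fun.body, fun.var, arg))
                return App(fun, arg)
            if isinstance(t, Lam):
                return Lam(t.var, beta_reduce(t.body))
            return t

        def show(t):
            """Pretty-print a term in the notation used on the slides."""
            if isinstance(t, (Var, Const)):
                return t.name
            if isinstance(t, Lam):
                return f"λ{t.var}[{show(t.body)}]"
            return f"{show(t.fun)}({show(t.arg)})"

        # λx[whistles'(x)](j)  ⇒  whistles'(j)
        term = App(Lam('x', App(Const("whistles'"), Var('x'))), Const('j'))
        print(show(beta_reduce(term)))    # whistles'(j)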

  • Grammar and Logical Form
    S  -> NP VP       [S]  = [NP]([VP])
    NP -> john        [NP] = λP[P(j)]
    VP -> whistles    [VP] = λx[whistles'(x)]

    [john whistles] = whistles'(j)
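
    The same Python sketch carries over to this type-raised version (again, ad hoc names); note that the NP now applies to the VP rather than the other way around:

        NP_john = lambda P: P("j")                   # [NP] = λP[P(j)]
        VP_whistles = lambda x: f"whistles'({x})"    # [VP] = λx[whistles'(x)]

        S = NP_john(VP_whistles)                     # [S] = [NP]([VP])
        print(S)    # whistles'(j)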

  • From Logical Form to Truth Conditions
    whistles'(j) is true iff the individual (in the model) denoted by 'j' has the property denoted by 'whistles'
    cf. "John whistles" is true iff John whistles

  • Beta Reduction (Lambda Conversion)
    [S]  = [NP]([VP])
    [NP] = λP[P(j)]
    [VP] = λx[whistles'(x)]

    Beta reduction rule: λu[φ](ψ) ⇒ φ, where every occurrence of u in φ is replaced by ψ

    λP[P(j)](λx[whistles'(x)])    application
    λx[whistles'(x)](j)           reduction
    whistles'(j)                  reduction

  • A Larger Example
    S   -> NP VP      [S]   = [NP]([VP])
    NP  -> DET N      [NP]  = [DET]([N])
    DET -> every      [DET] = λQ[λP[∀z[Q(z) → P(z)]]]
    N   -> man        [N]   = λx[man'(x)]
    VP  -> whistles   [VP]  = λx[whistles'(x)]

    [every man whistles] = ∀z[man'(z) → whistles'(z)]

  • A Larger Example (cont'd)
    [S]   = [NP]([VP])
    [NP]  = [DET]([N])
    [DET] = λQ[λP[∀z[Q(z) → P(z)]]]
    [N]   = λx[man'(x)]
    [VP]  = λx[whistles'(x)]

    λQ[λP[∀z[Q(z) → P(z)]]](λx[man'(x)])       application
    λP[∀z[λx[man'(x)](z) → P(z)]]              reduction
    λP[∀z[man'(z) → P(z)]]                     reduction
    λP[∀z[man'(z) → P(z)]](λx[whistles'(x)])   application
    ∀z[man'(z) → λx[whistles'(x)](z)]          reduction
    ∀z[man'(z) → whistles'(z)]                 reduction
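
    The same derivation can be replayed with Python lambdas standing in for the lambda terms (an illustrative sketch; the reductions happen implicitly when the functions are applied, and the output format is ad hoc):

        DET_every = lambda Q: lambda P: f"∀z[{Q('z')} → {P('z')}]"    # [DET] = λQ[λP[∀z[Q(z) → P(z)]]]
        N_man = lambda x: f"man'({x})"                                # [N]   = λx[man'(x)]
        VP_whistles = lambda x: f"whistles'({x})"                     # [VP]  = λx[whistles'(x)]

        NP_every_man = DET_every(N_man)       # [NP] = [DET]([N])
        S = NP_every_man(VP_whistles)         # [S]  = [NP]([VP])
        print(S)    # ∀z[man'(z) → whistles'(z)]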

  • Alternative Ontologies
    [john whistles] = ∃e∃x[isa(e,WhistleEvent) ∧ agent(e,x) ∧ named(x,"john")]

    [john whistles] = whistles'(j)

  • Alternative Meaning Representation Formalisms
    Discourse Representation Theory

    [every man whistles] =

    [ x | man(x) ]  ⇒  [ | whistles(x) ]    (DRS box notation, rendered linearly)

  • Semantics Research
    How to design a nice meaning representation language?
      Ontology?
    How to treat a particular NL construct?
      in a compositional way?
      and end up with correct truth conditions?
      and still be elegant and clean?
    How to deal with things like time and events, propositional attitude reports, reference to non-existent individuals, etc.?
    How to solve a particular semantic puzzle?
    How to design a nice syntax-semantics interface?
    How to design a nice semantics-pragmatics interface?
    What is the role of inference in semantic processing?
    How to account for things like presuppositions?

  • Semantic Puzzles
    Why is "Every man loves a woman" ambiguous, but not "Every man loves Mary"?
    What's wrong with the following argument: "Nothing is better than a long and prosperous life. A ham sandwich is better than nothing. Therefore, a ham sandwich is better than a long and prosperous life."
    The morning star = the evening star; still, "John believes that Venus is the morning star" may be true while, at the same time, "John believes that Venus is the evening star" is false.
    Everything written on this slide is false.

  • Word Sense Disambiguation
    Torbjörn Lager
    Department of Linguistics
    Uppsala University

  • Example: Senses of "interest"
    From the LDOCE

    1. readiness to give attention
    2. quality of causing attention to be given
    3. activity, subject, etc., which one gives time and attention to
    4. advantage, advancement, or favour
    5. a share (in a company, business, etc.)
    6. money paid for the use of money

  • WSD Examples
    At the same time, the drop in interest rates since the spring has failed to revive the residential construction industry.
    Cray Research will retain a 10% interest in the new company, which will be based in Colorado Springs.
    Although that may sound like an arcane maneuver of little interest outside Washington, it would set off a political earthquake.

  • Why Word Sense Disambiguation?
    Well, compositional logical semantics doesn't deal with word meaning, so...
    Machine translation
      A non-disambiguated Russian translation of "The spirit is willing but the flesh is weak" gave "The vodka is good but the meat is rotten"
    Information Retrieval
      When searching the web for info about companies buying shares in other companies, you don't want to retrieve information about interest rates.
    Provide clues to pronunciation
      "banan" -> BAnan or baNAN

  • Approaches to WSD
    Deep (but brittle) WSD
      'Selectional restriction'-based approaches
      Approaches based on general reasoning with 'world knowledge'
    Shallow and robust WSD
      Machine learning approaches
        Supervised learning
        Unsupervised learning
        Bootstrapping approaches
      Dictionary-based approaches
    Various combinations of methods

  • Machine Learning Approaches
    Training data in the form of annotated corpora
    Decide on features on which to condition
    Preprocessing steps
      Context trimming
      Stemming/lemmatizing
      Part-of-speech tagging
      Partial parsing
    Use a machine learning algorithm
    Enter the training-test cycle
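
    As a sketch of the supervised route, assuming scikit-learn is available and reusing the "interest" sentences from the WSD Examples slide as toy training data (the sense labels here are made up), one can condition simply on the bag of context words:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        # Toy training data: contexts of "interest" (from the WSD Examples slide)
        # paired with made-up labels standing in for LDOCE senses
        contexts = [
            "the drop in interest rates since the spring",
            "retain a 10% interest in the new company",
            "of little interest outside Washington",
        ]
        senses = ["money", "share", "attention"]

        # Bag-of-context-words features + a Naive Bayes classifier
        clf = make_pipeline(CountVectorizer(), MultinomialNB())
        clf.fit(contexts, senses)

        print(clf.predict(["banks raised their interest rates again"]))
        # likely ['money'] on this toy data, driven by the context word "rates"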

  • Dictionary-Based Approaches
    Lesk (1986)

    Find the dictionary definition that overlaps most with the definitions of the words in the ambiguous word's context.
    Problem 1: A lot of computation.
    Problem 2: Definitions are usually too short.

  • Lesk Example
    Lexicon entries:
      pine  1  kinds of evergreen tree with needle-shaped leaves
            2  waste away through sorrow or illness
      cone  1  solid body which narrows to a point
            2  something of this shape whether solid or hollow
            3  fruit of certain evergreen tree

    Example: .... pine cone ...

    The third sense of "cone" is selected here, since two of the (content) words in its entry, "evergreen" and "tree", also occur in a definition of "pine".
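
    A minimal Python sketch of the overlap count behind this choice, using the glosses from the slide (the stop-word list is ad hoc, and, unlike full Lesk, only the first sense of "pine" is consulted):

        # Ad hoc stop-word list for this toy example
        STOP = {"of", "with", "to", "a", "or", "this", "which", "through", "kinds"}

        def content_words(gloss):
            """Content words of a dictionary definition (very crudely: non-stop words)."""
            return {w for w in gloss.lower().split() if w not in STOP}

        cone_senses = {
            1: "solid body which narrows to a point",
            2: "something of this shape whether solid or hollow",
            3: "fruit of certain evergreen tree",
        }
        pine_1 = content_words("kinds of evergreen tree with needle-shaped leaves")

        # Pick the sense of "cone" whose definition overlaps most with pine's first sense
        best = max(cone_senses, key=lambda s: len(content_words(cone_senses[s]) & pine_1))
        print(best)    # 3  ("evergreen" and "tree" overlap)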
