
Page 1:

Lexical Semantics
Natural Language Processing: Lecture 8

26.10.2017

Kairit Sirts

Page 2:

Lexical semantics

• What is the meaning of a lexical item?
  • Word: bank, bat
  • Multiword expression: pay a visit, rock the boat

Page 3:

Topics

• Word senses

• WordNet - a lexical database/ontology

• Word sense disambiguation

• Slides largely based on:
  • https://web.stanford.edu/~jurafsky/slp3/slides/Chapter18_introandsimilarity.pdf
  • https://web.stanford.edu/~jurafsky/slp3/slides/Chapter18.wsd.pdf

Page 4:

Word senses

Page 5:

Lemma and word form

• A lemma or citation form
  • Same stem, part of speech, rough semantics

• A wordform
  • The inflected word as it appears in text

Wordform   Lemma
banks      bank
sung       sing
laulis     laulma   (Estonian: 'sang' / 'to sing')
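A quick sketch of how lemmas are recovered in practice, assuming NLTK and its WordNet data are installed; the Estonian comment is a pointer, not working code:

    from nltk.stem import WordNetLemmatizer  # requires: nltk.download('wordnet')

    wnl = WordNetLemmatizer()
    print(wnl.lemmatize("banks", pos="n"))  # -> bank
    print(wnl.lemmatize("sung", pos="v"))   # -> sing (irregular forms come from WordNet's exception lists)
    # Estonian wordforms such as laulis -> laulma need an Estonian
    # morphological analyzer (e.g. Vabamorf via Estnltk), not WordNet.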

Page 6:

Word Senses

• One lemma "bank" can have many meanings:
  • Sense 1: "…a bank can hold the investments in a custodial account…"
  • Sense 2: "…as agriculture burgeons on the east bank the river will shrink even more"

• Sense (or word sense)
  • A discrete representation of an aspect of a word's meaning

• The lemma bank here has two senses

Page 7:

Homonymy

Homonyms: words that share a form but have unrelated, distinct meanings:

• bank1: financial institution, bank2: sloping land
• bat1: club for hitting a ball, bat2: nocturnal flying mammal

1. Homographs
  • In English: bank/bank, bat/bat
  • In Estonian: tee/tee ('road' / 'tea')

2. Homophones
  • In English: write and right, piece and peace
  • In Estonian: baar and paar ('bar' / 'pair'), ball and pall ('ball' as a dance / 'ball' as an object)

Page 8:

Problems with homonymy in NLP applications

• Information retrieval
  • The query "bat care" is ambiguous between the animal and the sports equipment

• Machine Translation
  • bat: nahkhiir (the animal) or kurikas (for baseball)

• Text-to-Speech
  • bass (stringed instrument, /beɪs/) vs. bass (fish, /bæs/)

Page 9:

Polysemy

• 1. The bank was constructed in 1875 out of local red brick.
• 2. I withdrew the money from the bank.

• Are those the same sense?
  • Sense 1: "The building belonging to a financial institution"
  • Sense 2: "A financial institution"

• A polysemous word has related meanings
  • Most non-rare words have multiple meanings

Page 10:

Metonymy

Systematic polysemy: a systematic relationship between senses

• Lots of types of polysemy are systematic
  • school, university, hospital: all can mean the institution or the building

• A systematic relationship:
  • Building ↔ Organization

• Other such kinds of systematic polysemy:
  • Author (Jane Austen wrote Emma) ↔ Works of Author (I love Jane Austen)
  • Tree (Plums have beautiful blossoms) ↔ Fruit (I ate a preserved plum)

Page 11:

"Zeugma" test

• How do we know if a word has more than one sense?

• Two senses of serve?
  • Which flights serve breakfast?
  • Does Lufthansa serve Philadelphia?
  • ?Does Lufthansa serve breakfast and Philadelphia?

• Since this conjunction sounds weird, we say that these are two different senses of "serve"

Page 12:

Synonyms

• Synonyms are words that can be substituted for one another without changing the meaning of the sentence

• But there are few (or no) examples of perfect synonymy
  • Even if many aspects of meaning are identical
  • Substitution may still not preserve acceptability, due to differences in politeness, slang, register, genre, etc.

• Examples:
  • water/H2O
  • big/large
  • brave/courageous

Page 13:

Synonymy as a relation between senses

• Consider the words big and large

• Are they synonyms?
  • How big is that plane?
  • Would I be flying on a large or small plane?

• How about here:
  • Miss Nelson became a kind of big sister to Benjamin.
  • ?Miss Nelson became a kind of large sister to Benjamin.

• Why?
  • big has a sense that means being older, or grown up
  • large lacks this sense

Page 14:

Antonyms

• Senses that are opposites with respect to one feature of meaning

• Otherwise, they are very similar!
  • dark/light, short/long, fast/slow, rise/fall, hot/cold, up/down, in/out

• More formally, antonyms can:
  • Define a binary opposition or be at opposite ends of a scale
    • long/short, fast/slow
  • Be reversives:
    • rise/fall, up/down

Page 15:

Synonyms and antonyms in word embeddings

• We would like synonyms to be close in the embedding space and antonyms far apart

• In practice:
  • Antonyms can be close because their contexts are very similar
  • Synonyms can be very far apart because their most common senses have very different meanings
  • Examples from Muromägi et al., 2017, "Linear Ensembles of Word Embedding Models":
    • kaks/puudulik (two/insufficient)
    • ida/ost (east / a loan word from German also meaning east, also a purchase)
    • rubla/kull (rouble / slang for a bank note, also a hawk)
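As a concrete illustration, here is a minimal sketch with toy vectors (stand-ins for real pre-trained embeddings, which you might load with gensim's KeyedVectors) showing why cosine similarity cannot separate antonyms from synonyms when their contexts match:

    import numpy as np

    def cosine(u, v):
        # cosine similarity: 1.0 = same direction, 0.0 = orthogonal
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Toy 4-d vectors; real embeddings are learned from co-occurrence contexts.
    emb = {
        "fast":  np.array([0.9, 0.8, 0.1, 0.0]),
        "slow":  np.array([0.8, 0.9, 0.1, 0.0]),   # antonym, but near-identical contexts
        "quick": np.array([0.9, 0.7, 0.2, 0.1]),   # synonym
    }

    print(cosine(emb["fast"], emb["slow"]))   # high, although they are antonyms
    print(cosine(emb["fast"], emb["quick"]))  # also high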

Page 16:

Hyponymy and hypernymy

• One sense is a hyponym of another if the first sense is more specific, denoting a subclass of the other
  • car is a hyponym of vehicle
  • mango is a hyponym of fruit

• Conversely hypernym/superordinate
  • vehicle is a hypernym of car
  • fruit is a hypernym of mango

• Hyponymy is an IS-A relation: car IS-A vehicle, mango IS-A fruit

Superordinate/hypernym   vehicle   fruit   furniture
Subordinate/hyponym      car       mango   chair
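These relations can be queried directly from WordNet; a minimal sketch with NLTK (sense identifiers such as car.n.01 follow NLTK's naming):

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    car = wn.synset('car.n.01')
    print(car.hypernyms())     # direct hypernyms (more general synsets)
    print(car.hyponyms()[:5])  # a few direct hyponyms (more specific synsets)

    # Follow the IS-A chain from car up to the root of the hierarchy:
    path = car.hypernym_paths()[0]
    print(' IS-A '.join(s.name() for s in reversed(path)))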

Page 17:

Meronymy

• The part-whole relation
  • A leg is part of a chair; a wheel is part of a car.
  • Wheel is a meronym of car, and car is a holonym of wheel.

Page 18:

WordNet

Page 19:

English WordNet 3.0

• A hierarchically organized lexical database

• Online thesaurus + aspects of a dictionary
  • Versions for some other languages are available or under development (Arabic, Finnish, German, Portuguese…)

Category    Unique strings
Noun        117,798
Verb        11,529
Adjective   22,479
Adverb      4,481

Page 20:

Estonian WordNet

• Accessible from Estnltk

Category    Unique strings
Noun        83,487
Verb        11,093
Adjective   8,173
Adverb      4,790

Page 21:

Senses of "bass" in English WordNet
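The listing from this slide can be reproduced with NLTK; a sketch (the eight noun senses are spelled out on page 47, and exact senses and numbering depend on the WordNet version):

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    for synset in wn.synsets('bass'):
        print(synset.name(), '-', synset.definition())
    # e.g. bass.n.01 - the lowest part of the musical range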

Page 22:

Senses of "bass" in Estonian WordNet

http://wordties.cst.dk/wordties-estwn/

Page 23:

How is sense defined in WordNet?

• The synset (synonym set), the set of near-synonyms, instantiates a sense or concept, with a gloss

• Example: chump as a noun with the gloss: "a person who is gullible and easy to take advantage of"

• This sense of "chump" is shared by 9 words: chump1, fool2, gull1, mark9, patsy1, fall guy1, sucker1, soft touch1, mug2

• Each of these senses has this same gloss
  • (Not every sense of these words; sense 2 of gull, for instance, is the aquatic bird)
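A sketch of the same lookup in NLTK, assuming the gullible-person sense is chump.n.01 (sense numbering may differ across WordNet versions):

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    chump = wn.synset('chump.n.01')  # assumed sense number
    print(chump.lemma_names())  # the words sharing this synset
    print(chump.definition())   # the shared gloss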

Page 24:

WordNet hypernym hierarchy for "bass"

Page 25:

WordNet noun relations

Page 26:

WordNet verb relations

Page 27:

WordNet as a graph

Page 28:

Supersenses

• The top-level hypernyms in the hierarchy

• Counts from Schneider and Smith 2014's STREUSLE corpus

Page 29:

Supersenses

• A word's supersense can be a useful coarse-grained representation of word meaning for NLP tasks

SemEval 2016 Task 10: Detecting Minimal Semantic Units and their Meanings (DiMSUM)

The DiMSUM shared task at SemEval 2016 is concerned with predicting, given an English sentence, a broad-coverage representation of lexical semantics. The representation consists of two closely connected facets: a segmentation into minimal semantic units, and a labeling of some of those units with semantic classes known as supersenses.

For example, given the POS-tagged sentence

I/PRP googled/VBD restaurants/NNS in/IN the/DT area/NN and/CC Fuji/NNP Sushi/NNP came/VBD up/RB and/CC reviews/NNS were/VBD great/JJ so/RB I/PRP made/VBD a/DT carry/VB out/RP order/NN

the goal is to predict the representation

I googled:communication restaurants:GROUP in the area:LOCATION and Fuji_Sushi:GROUP came_up:communication and reviews:COMMUNICATION were:stative great so I made_ a carry_out:possession _order:communication

where lowercase labels are verb supersenses, UPPERCASE labels are noun supersenses (labels are subscripts in the original, attached here with a colon), and _ joins tokens within a multiword expression. (carry_out:possession and made_order:communication are separate MWEs.)

The two facets of the representation are discussed in greater detail below. Systems are expected to produce both facets, though the manner in which they do this (e.g., pipeline vs. joint model) is up to you.

Gold-standard training data labeled with the combined representation will be provided in two domains: online reviews and tweets. (Rules for using other data resources are given in the data conditions.) Blind test data will be in these two domains as well as a third, surprise domain. The domain will not be indicated as part of the input at test time. The three test domains will have equal weight in the overall system scores (details of the scoring procedure will be announced at a future time).

Minimal semantic units

The word tokens of the sentence are partitioned into basic units of lexical meaning. Equivalently, where multiple tokens function together as an idiomatic whole, they are grouped together into a multiword expression (MWE). MWEs include: nominal compounds like hot dog; verbal expressions like do away with 'eliminate', make decisions 'decide', kick the bucket 'die'; PP idioms like at all and on the spot 'without planning'; multiword prepositions/connectives like in front of and due to; multiword named entities; and many other kinds.

• Input word tokens are never subdivided.
• Grouped tokens do not have to be contiguous; e.g., verb-particle constructions are annotated whether they are contiguous (make up the story) or gappy (make the story up). There are, however, formal constraints on gaps to facilitate sequence tagging.
• Combinations considered to be statistical collocations (yet compositional in meaning) are called "weak" MWEs.

Page 30:

Word similarity with WordNet

• Synonymy: a binary relation
  • Two words are either synonymous or not

• Similarity (or distance): a looser metric
  • Two words are more similar if they share more features of meaning

• Similarity is properly a relation between senses
  • The word "bank" is not similar to the word "slope"
  • bank1 is similar to fund3
  • bank2 is similar to slope5

Page 31:

Word Similarity

• A practical component in lots of NLP tasks
  • Question answering
  • Natural language generation
  • Automatic essay grading
  • Plagiarism detection

Page 32:

Word similarity and word relatedness

• We often distinguish word similarity from word relatedness
  • Similar words: near-synonyms
  • Related words: can be related in any way

• car, bicycle: similar

• car, gasoline: related, not similar

Page 33:

Two classes of similarity algorithms

• Thesaurus-based algorithms
  • Are words "nearby" in the hypernym hierarchy?
  • Do words have similar glosses (definitions)?

• Distributional algorithms
  • Do words have similar distributional contexts?
  • Cosine similarity between word embeddings

Page 34:

Path-based similarity

• Two concepts (senses/synsets) are similar if they are near each other in the thesaurus hierarchy
  • i.e., they have a short path between them
  • A concept has a path length of 1 to itself

Page 35:

Refinements to path-based similarity

• pathlen(c1, c2) = 1 + number of edges in the shortest path in the hypernym graph between sense nodes c1 and c2

• simpath(c1, c2) = 1 / pathlen(c1, c2)
  • ranges from 0 to 1 (1 = identity)

• wordsim(w1, w2) = max over c1 ∈ senses(w1), c2 ∈ senses(w2) of simpath(c1, c2)
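NLTK's path_similarity implements exactly this 1/pathlen measure; a sketch, where the sense numbers (nickel.n.02 as the coin sense) are assumptions to check against your WordNet version:

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    nickel = wn.synset('nickel.n.02')  # assumed: the coin sense
    coin = wn.synset('coin.n.01')
    print(nickel.path_similarity(coin))  # 0.5 if nickel is a direct hyponym of coin

    # Word-level similarity: maximize simpath over all sense pairs.
    def wordsim(w1, w2):
        return max((c1.path_similarity(c2) or 0.0)  # None for cross-POS pairs
                   for c1 in wn.synsets(w1)
                   for c2 in wn.synsets(w2))

    print(wordsim('nickel', 'money'))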

Page 36:

Example: path-based similarity

• simpath(c1, c2) = 1/pathlen(c1, c2)

simpath(nickel, coin) = ?
simpath(fund, budget) = ?
simpath(nickel, currency) = ?
simpath(nickel, money) = ?
simpath(coinage, Richter scale) = ?

Page 37:

Example: path-based similarity

• simpath(c1, c2) = 1/pathlen(c1, c2)

simpath(nickel, coin) = 1/2 = 0.5
simpath(fund, budget) = 1/2 = 0.5
simpath(nickel, currency) = 1/4 = 0.25
simpath(nickel, money) = 1/6 ≈ 0.17
simpath(coinage, Richter scale) = 1/6 ≈ 0.17

Page 38:

Word Sense Disambiguation

Page 39:

Word Sense Disambiguation (WSD)

• Given
  • A word in context
  • A fixed inventory of potential word senses
• Decide which sense of the word this is

• Why? Machine translation, QA, speech synthesis

• What set of senses?
  • English-to-Spanish MT: the set of Spanish translations
  • Speech synthesis: homographs like bass and bow
  • In general: the senses in a thesaurus like WordNet

Page 40:

Two variants of the WSD task

• Lexical sample task
  • Small pre-selected set of target words (line, plant)
  • An inventory of senses for each word
  • Supervised machine learning: train a classifier for each word

• All-words task
  • Every word in an entire text
  • A lexicon with senses for each word
  • Data sparseness: can't train word-specific classifiers

Page 41:

WSD methods

• Thesaurus/dictionary methods

• Supervised machine learning

• Semi-supervised learning

Page 42:

WSD with dictionary and thesaurus methods

• The simplified Lesk algorithm

• Let's disambiguate "bank" in this sentence:
  The bank can guarantee deposits will eventually cover future tuition costs because it invests in adjustable-rate mortgage securities.

• given the following two WordNet senses:

Page 43:

The simplified Lesk algorithm

• Choose the sense with the most word overlap between its gloss and the context (not counting function words)

The bank can guarantee deposits will eventually cover future tuition costs because it invests in adjustable-rate mortgage securities.
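A minimal sketch of simplified Lesk over WordNet glosses and examples; the hand-rolled stopword set and whitespace tokenization are crude simplifications (NLTK also ships a ready-made nltk.wsd.lesk):

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    STOPWORDS = {"the", "a", "an", "in", "it", "can", "will", "because", "of", "to", "and"}

    def simplified_lesk(word, sentence):
        """Return the sense whose gloss + examples overlap most with the context."""
        context = {w.lower().strip('.,') for w in sentence.split()} - STOPWORDS
        best, best_overlap = None, -1
        for sense in wn.synsets(word):
            signature = set(sense.definition().lower().split())
            for example in sense.examples():
                signature |= set(example.lower().split())
            overlap = len(context & (signature - STOPWORDS))
            if overlap > best_overlap:
                best, best_overlap = sense, overlap
        return best

    sent = ("The bank can guarantee deposits will eventually cover future "
            "tuition costs because it invests in adjustable-rate mortgage securities.")
    print(simplified_lesk("bank", sent))  # expected: the financial-institution sense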

Page 44:

The Corpus Lesk algorithm

• Assumes we have some sense-labeled data (like SemCor)

• Take all the sentences with the relevant word sense:
  "These short, 'streamlined' meetings usually are sponsored by local banks1, Chambers of Commerce, trade associations, or other civic organizations."

• Now add these to the gloss + examples for each sense, and call the result the "signature" of the sense

• Choose the sense with the most word overlap between context and signature

Page 45:

Supervised machine learning approaches

• Supervised machine learning approach:
  • a training corpus of words tagged in context with their sense
  • used to train a classifier that can tag words in new text

• Summary of what we need:
  • the tag set ("sense inventory")
  • the training corpus
  • a set of features extracted from the training corpus
  • a classifier

Page 46:

Supervised WSD 1: WSD tags

• What's a tag? A dictionary sense?

• For example, for WordNet an instance of "bass" in a text has 8 possible tags or labels (bass1 through bass8).

Page 47:

8 senses of "bass" in WordNet

1. bass - (the lowest part of the musical range)
2. bass, bass part - (the lowest part in polyphonic music)
3. bass, basso - (an adult male singer with the lowest voice)
4. sea bass, bass - (flesh of lean-fleshed saltwater fish of the family Serranidae)
5. freshwater bass, bass - (any of various North American lean-fleshed freshwater fishes especially of the genus Micropterus)
6. bass, bass voice, basso - (the lowest adult male singing voice)
7. bass - (the member with the lowest range of a family of musical instruments)
8. bass - (nontechnical name for any of numerous edible marine and freshwater spiny-finned fishes)

Page 48:

Inventory of sense tags for bass

WordNet sense   Estonian translation   Roget category   Word in context
bass4           ahven                  FISH/INSECT      … fish as Pacific salmon and striped bass and …
bass4           ahven                  FISH/INSECT      … produce filets of smoked bass or sturgeon …
bass7           bass                   MUSIC            … exciting jazz bass player since Ray Brown …
bass7           bass                   MUSIC            … play bass because he doesn't have to solo …

Page 49:

Training corpus

• Annotated with
  • sense tag
  • lemma
  • POS

• Several corpora are available for English:
  • A large corpus for supervised word sense disambiguation (such as SemCor)
  • Senseval

Page 50:

Features for the classifier

• Collocational features and bag-of-words features

• Collocational
  • Features about words at specific positions near the target word
  • Often limited to just word identity and POS

• Bag-of-words
  • Features about words that occur anywhere in the window (regardless of position)
  • Typically limited to binary indicator functions or frequency counts

Page 51:

Collocational features

• Position-specific information about the words and collocations in the window

• Example context: guitar and bass player stand

• Word 1-, 2-, and 3-grams in a window of ±3 words are common
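A sketch of position-specific feature extraction for the example above, using word identity only (a real system would add POS tags and n-grams):

    def collocational_features(tokens, i, window=3):
        """Word-identity features at fixed offsets around the target tokens[i]."""
        feats = {}
        for offset in range(-window, window + 1):
            if offset == 0:
                continue
            j = i + offset
            feats[f"w[{offset:+d}]"] = tokens[j] if 0 <= j < len(tokens) else "<PAD>"
        return feats

    tokens = "guitar and bass player stand".split()
    print(collocational_features(tokens, tokens.index("bass")))
    # {'w[-3]': '<PAD>', 'w[-2]': 'guitar', 'w[-1]': 'and', 'w[+1]': 'player', ...}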

Page 52:

Bag-of-words features

• "An unordered set of words" - position ignored

• Counts of words occurring within the window

• First choose a vocabulary

• Then count how often each of those terms occurs in a given window
  • sometimes just a binary "indicator", 1 or 0

• Assume we've settled on a vocabulary of 12 words for "bass" sentences:
  [fishing, big, sound, player, fly, rod, pound, double, runs, playing, guitar, band]

• The vector for: guitar and bass player stand
  [0,0,0,1,0,0,0,0,0,0,1,0]
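A minimal sketch reproducing the slide's indicator vector:

    VOCAB = ["fishing", "big", "sound", "player", "fly", "rod",
             "pound", "double", "runs", "playing", "guitar", "band"]

    def bow_vector(context, vocab=VOCAB):
        """Binary bag-of-words indicator vector over a fixed vocabulary."""
        words = set(context.lower().split())
        return [1 if term in words else 0 for term in vocab]

    print(bow_vector("guitar and bass player stand"))
    # [0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0]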

Page 53:

WSD evaluation and baselines

• Best evaluation: extrinsic ('end-to-end', 'task-based') evaluation
  • Embed the WSD algorithm in a task and see if you can do the task better!

• What we often do for convenience: intrinsic evaluation
  • Exact-match sense accuracy: % of words tagged identically with the human-assigned sense tags
  • Usually evaluated on held-out data from the same labeled corpus

• Baselines
  • Most frequent sense
  • The Lesk algorithm

Page 54:

Most frequent sense

• WordNet senses are ordered by frequency

• So "most frequent sense" in WordNet = "take the first sense"

• Sense frequencies come from the SemCor corpus
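In NLTK this baseline is one line, since synsets come back in WordNet's frequency-based sense order:

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    mfs = wn.synsets('bass')[0]  # most frequent sense = first listed sense
    print(mfs.name(), '-', mfs.definition())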

Page 55:

Ceiling

• Human inter-annotator agreement
  • Compare the annotations of two humans
  • On the same data
  • Given the same tagging guidelines

• Human agreement on all-words corpora with WordNet-style senses: 75%-80%

Page 56:

Semi-Supervised Learning

Problem: supervised and dictionary-based approaches require large hand-built resources

What if you don't have that much training data?

Solution: bootstrapping - generalize from a very small hand-labeled seed set.

Page 57:

Bootstrapping

• For bass:
  • Rely on the "one sense per collocation" rule: a word recurring in collocation with the same word will almost surely have the same sense
  • The word play occurs with the music sense of bass
  • The word fish occurs with the fish sense of bass

Page 58:

Sentence extraction using "fish" and "play"
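A sketch of that extraction step, labeling unannotated sentences by the seed collocates fish and play (the example sentences reuse fragments from earlier slides; anything without a seed collocate stays unlabeled):

    import re

    def extract_seeds(sentences, target="bass"):
        """Group sentences containing `target` by seed collocate."""
        seeds = {"fish": [], "music": []}
        for sent in sentences:
            words = set(re.findall(r"[a-z]+", sent.lower()))
            if target not in words:
                continue
            if "fish" in words:
                seeds["fish"].append(sent)
            elif "play" in words or "playing" in words:
                seeds["music"].append(sent)
        return seeds

    corpus = [
        "Only a half a dozen can play the free bass with ease.",
        "An electric guitar and bass player stand off to one side.",
        "The worms live in fish such as Pacific salmon and striped bass.",
    ]
    print(extract_seeds(corpus))  # sentence 2 has no seed collocate -> stays unlabeled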

Page 59:

Generating seeds

1) Hand labeling

2) "One sense per collocation":
  • A word recurring in collocation with the same word will almost surely have the same sense.

3) "One sense per discourse":
  • The sense of a word is highly consistent within a document - Yarowsky (1995)
  • (At least for non-function words, and especially topic-specific words)

Page 60:

WSD using word embeddings (based on Yuan et al., 2016, "Semi-supervised Word Sense Disambiguation with Neural Models")

• Construct features from context word embeddings: concatenate or average them
  • Use these alone or together with traditional features

• Compute a different embedding vector for each word sense

• Use nearest-neighbor methods to find the closest sense vector to the context vector

• Fine-tune pre-trained embeddings on a supervised WSD task
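A toy sketch of the nearest-neighbor step, with random vectors standing in for the language-model context embeddings and learned sense vectors:

    import numpy as np

    def disambiguate(context_vectors, sense_vectors):
        """Average the context embeddings, return the cosine-nearest sense."""
        query = np.mean(context_vectors, axis=0)
        def cos(u, v):
            return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return max(sense_vectors, key=lambda s: cos(query, sense_vectors[s]))

    rng = np.random.default_rng(0)
    senses = {"bass_music": rng.normal(size=50), "bass_fish": rng.normal(size=50)}
    # context vectors drawn near the music sense vector:
    context = [senses["bass_music"] + 0.1 * rng.normal(size=50) for _ in range(4)]
    print(disambiguate(context, senses))  # -> bass_music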

Page 61:

Semi-supervised WSD with Neural Models (Yuan et al., 2016)

(Figure: a language model computes context vectors; a nearest-neighbor classifier finds the sense vector closest to the query context vector.)

Page 62:

Semi-supervised WSD with Neural Models (Yuan et al., 2016)

Page 63:

Conclusion

• Lexical semantics deals with the meaning of words

• A word can have several meanings - senses

• Word senses can be unrelated (homonymy) or related (polysemy)

• WordNet is a lexical database that organises words into synsets and hypernymy hierarchies

• Word sense disambiguation is the task of identifying a word's sense in a particular context