
Natural Language Processing >>Introduction<<

winter / fall 2011/2012, 41.4268

Prof. Dr. Bettina Harriehausen-Mühlbauer
Univ. of Applied Science, Darmstadt, Germany
www.fbi.h-da.de/~harriehausen
[email protected]

WS 2011/2012 Natural Language Systems 2

What does Star Trek have to do with NLP ?

the past - the present - the future

WS 2011/2012 Natural Language Systems 3

What is NLP / Computational Linguistics ?

WS 2011/2012 Natural Language Systems 4

What is NLP / Computational Linguistics ?

definition:

A system is called a natural language processing system when

• a subset of the input or output of the system is coded / written in a natural language, and

• the processing of the data is performed by algorithms for the morpho-syntactic, semantic, and pragmatic analysis or generation of natural language.

WS 2011/2012 Natural Language Systems 5

Natural Language Processing is ...

an interdisciplinary field / art / science

• computer science (A.I.)

• linguistics (language independent)

• mathematics (logics, predicate logic, knowledge based systems, statistics, ...)

• psychology (cognitive science)

• physics (speech recognition, spoken language)

• ...

WS 2011/2012 Natural Language Systems 6

Natural Language Processing is ... a broad field / art / science

• phonetics / phonology (speech processing / speech recognition)
phonemes = the smallest meaning-distinguishing items

• morphology (segmentation, compounding, ...) - tokenization
morphemes = the smallest items carrying meaning

• lexicology / electronic dictionaries – tagging
lexemes, lemmas vs. full-forms (each entry needs a tag)
idiomatic expressions, neologisms / „trendy words", homonyms, …

• syntax (analysis and generation of phonemes, morphemes, lexemes, phrases, sentences, paragraphs)
grammar formalisms (from transformation to unification)

• semantics (meaning, disambiguation, anaphora resolution, ...)

• pragmatics (discourse representation)
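The morphology and lexicology items above are easy to prototype. Below is a minimal, hypothetical Python sketch (the tokenizer and the toy full-form lexicon are invented for illustration) of tokenization plus a full-form dictionary in which every inflected form carries its lemma and a tag:

```python
import re

# Toy full-form lexicon: each inflected (full) form is an entry that
# carries its lemma and a part-of-speech tag, as the slide describes.
LEXICON = {
    "eats":   {"lemma": "eat",   "tag": "VERB"},
    "eating": {"lemma": "eat",   "tag": "VERB"},
    "apples": {"lemma": "apple", "tag": "NOUN"},
    "billy":  {"lemma": "Billy", "tag": "PROPN"},
}

def tokenize(text: str) -> list[str]:
    """Very simple tokenizer: split off punctuation, lowercase the rest."""
    return re.findall(r"[A-Za-zÄÖÜäöüß]+|[.,!?;]", text.lower())

def tag(tokens: list[str]) -> list[tuple[str, str, str]]:
    """Look every token up in the full-form lexicon; unknown words get 'UNK'."""
    return [
        (tok,
         LEXICON.get(tok, {}).get("lemma", tok),
         LEXICON.get(tok, {}).get("tag", "UNK"))
        for tok in tokens
    ]

if __name__ == "__main__":
    print(tag(tokenize("Billy is eating apples.")))
    # [('billy', 'Billy', 'PROPN'), ('is', 'is', 'UNK'),
    #  ('eating', 'eat', 'VERB'), ('apples', 'apple', 'NOUN'), ('.', '.', 'UNK')]
```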

WS 2011/2012 Natural Language Systems 7

We will focus on... (1)

• intro

• morphology
• parsing / tokenization

• compounds

• lexicon / electronic dictionaries
• lemmas / inflected forms

• coding features / tagging

• idiomatic expressions

• neologisms / „trendy words“

• homonyms

WS 2011/2012 Natural Language Systems 8

We will focus on... (2)

• syntax -> semantics: from transformation to unification (RTN / ATN), case grammar (Fillmore), CD-structures

• machine translation

• data mining / text mining

• speech recognition

WS 2011/2012 Natural Language Systems 9

We will focus on... (3)

[diagram: dictionary + grammar -> parser]

WS 2011/2012 Natural Language Systems 10

We will focus on... (4)

together, we want to:

• get an overview of and understand the scope of NLP

• get an overview of (a subset of) the state-of-the-art technologies

• understand the parallels between CL, NLP, and A.I.

• reach the ability to use principles of linguistic theories in NLP programming

• develop a prototype of an NLP system

WS 2011/2012 Natural Language Systems 11

reading material (obligatory)

Daniel Jurafsky, James H. Martin. Speech and Language Processing (2nd edition).

Latest edition: Prentice Hall, 2008.
ISBN-10: 0131873210, ISBN-13: 978-0131873216

First chapter: http://www.cs.colorado.edu/~martin/SLP/Updates/1.pdf

WS 2011/2012 Natural Language Systems 12

reading material (recommended)

Christopher Manning, Hinrich Schütze. Foundations of Statistical Natural Language Processing.

MIT Press, 1999, ISBN 0262133601

Reader link: http://www.amazon.de/gp/reader/0262133601/ref=sib_dp_pt/028-2523061-0018166#reader-page

http://cognet.mit.edu/library/books/view?isbn=0262133601

WS 2011/2012 Natural Language Systems 13

reading material (recommended)

Pierre M. Nugues. An Introduction to Language Processing with Perl and Prolog: An Outline of Theories, Implementation, and Application with Special Consideration of English, French, and German (Cognitive Technologies). Springer Berlin Heidelberg. ISBN-10: 3642064051, ISBN-13: 978-3642064050. Latest edition: 2010.
Reader link: http://www.amazon.de/Introduction-Language-Processing-Perl-Prolog/dp/3642064051/ref=sr_1_2?s=books-intl-de&ie=UTF8&qid=1316528179&sr=1-2#reader_3642064051

WS 2011/2012 Natural Language Systems 14

more reading material (A.I.)

• Bobrow, D.G., Winograd, T. „An Overview of KRL, a Knowledge Representation Language". In: Cognitive Science, Vol. 1, No. 1, 3-46, 1977.

• Charniak, E. „A common representation for problem solving and natural language comprehension information". Artificial Intelligence, 1981, 225-255.

• Friedman, J. A Computer Model of Transformational Grammar. New York: Elsevier. 1971.

• Manning, Christopher D., Raghavan, Prabhakar, Schütze, Hinrich. Introduction to Information Retrieval. Cambridge University Press. 2008. ISBN-10: 0521865719, ISBN-13: 978-0521865715.

• Norvig, Peter. Unified Theory of Inference for Text Understanding. Univ. of California, Berkeley, Computer Science Division. Report No. UCB/CSD 87/339. 1987.

• Quillian, M.R. „Semantic Memory". In: M. Minsky, ed. Semantic Information Processing. MIT Press. Cambridge. 1968.

more

WS 2011/2012 Natural Language Systems 15

more reading material (A.I.)

• Russell, Stuart, Norvig, Peter. Artificial Intelligence: A Modern Approach (2nd Edition) (Prentice Hall Series in Artificial Intelligence). Prentice Hall, 2002. ISBN-10: 0137903952.

• Schank, R.C. Conceptual Information Processing. Amsterdam: North Holland. 1975.

• Schank, R.C., Abelson, R.P. Scripts, Plans, Goals and Understanding: An Inquiry into Human Knowledge Structures. Hillsdale: Lawrence Erlbaum Associates. 1977.

• Wilensky, R., Arens, Y. PHRAN: A knowledge-based approach to natural language analysis. Electronics Research Laboratory, College of Engineering, University of California, Berkeley. Memorandum No. UCB/ERL M80/34. 1980.

• Wilensky, Robert. „Some Problems for Proposals for Knowledge Representation". University of California, Berkeley, CS Dept. 1986.

• Woods, W.A. „What's in a Link: Foundations for Semantic Networks". In: Representation and Understanding: Studies in Cognitive Science. D.G. Bobrow, A. Collins, eds. New York: Academic Press, 1975.

more

WS 2011/2012 Natural Language Systems 16

more reading material (NLP)

• Bresnan, Joan, ed. The Mental Representations of Language. London: MIT Press. 1982.

• Bresnan, Joan. Lexical Functional Grammar. Stanford Linguistic Institute. 1987.

• Chomsky, Noam. Aspects of the Theory of Syntax. Cambridge: MIT Press. 1965.

• Feldman, Ronen, Sanger, James. The Text Mining Handbook: Advanced Approaches in Analyzing Unstructured Data. Cambridge University Press. 2006.

• Fillmore, Charles. The Case for Case. Ohio State University, 1968.

• Fillmore, Charles. „The Case for Case Reopened". In: P. Cole, J.M. Saddock, eds. Syntax and Semantics 8: Grammatical Relations. Academic Press, N.Y. 1977.

• Harriehausen, B. „Why grammars need to expand their scope of parsable input", Proceedings Second Conference on Arabic Computational Linguistics, Kuwait, 11/89.

• Harriehausen, B. „The PLNLP Grammar checkers - CRITIQUE", Proceedings ALLC-ACH 90 Conference „The New Medium". Siegen. 6/1990.

• Harriehausen-Mühlbauer, B. „PLNLP - a comprehensive natural language processing system for analysis and generation across languages", Proceedings: The First International Seminar on Arabic Computational Linguistics, Egyptian Computer Society, Cairo, 6/92.

more

WS 2011/2012 Natural Language Systems 17

more reading material (NLP)

• Harriehausen-Mühlbauer, B., Koop, A. „SCRIPT - a prototype for the recognition of continuous, cursive, handwritten input by means of a neural network simulator", Proceedings 1993 IEEE International Conference on Neural Networks, San Francisco, 3/1993.

• Jurafsky, Daniel, Martin, James H. Speech and Language Processing: An Introduction to Natural Language Processing, Speech Recognition, and Computational Linguistics. 2nd edition. Prentice-Hall. 2008.

• Manning, Christopher, Schütze, Hinrich. Foundations of Statistical Natural Language Processing. MIT Press. 1999.

• Levin, L., Rappaport, M., Zaenen, A., eds. Papers in Lexical Functional Grammar. Bloomington: Indiana University Linguistics Club. 1983.

• Mitkov, Ruslan, ed. The Oxford Handbook of Computational Linguistics (Oxford Handbooks in Linguistics). Oxford University Press. 2005.

• Radford, A. Transformational Syntax. Cambridge: Cambridge University Press. 1981.

• Rieger, C.J. „Conceptual Memory and Inference". In: R.C. Schank. Conceptual Information Processing. North Holland. 1975.

• Shieber, S.M. An Introduction to Unification-based Approaches to Grammar. Stanford: CSLI. 1986.

• Winograd, T. Phenomenological Foundations of AI in Language. Stanford University, Linguistic Institute, 1987.

WS 2011/2012 Natural Language Systems 18

history of NLP / CL

How did it all start ?

1949-1960: beginning of electronic language processing: machine translation, linguistic data processing

The spirit is strong but the flesh is weak.
-> The vodka is strong but the meat is rotten.

WS 2011/2012 Natural Language Systems 19

history of NLP / CL

How did it all start ?

1960-1970: first formal (transformation) grammars (Chomsky 1957), beginning of language-oriented research in A.I.: first simple question-answering systems; keyword (pattern-matching) systems

1963 Sad-Sam (Lindsay), BASEBALL (Green)
1966 DEACON (Craig), ELIZA (Weizenbaum), SYNTHEX (Simmons et al.)
1968 TLC (Quillian), SIR (Raphael), STUDENT (Bobrow), CONVERSE (Kellog)

WS 2011/2012 Natural Language Systems 20

ELIZA – pattern-matching (1)

• ELIZA is a computer program devised by Joseph Weizenbaum (1966) that simulates the role of a Rogerian psychologist.

• ELIZA was one of the first programs that explored the issues involved in using natural language as the mode of communication between humans and machines.

WS 2011/2012 Natural Language Systems 21

ELIZA – pattern-matching (2)

Why Simulate a Rogerian Psychologist?

Client-Centered Therapy (CCT) was developed by Carl Rogers in the 40's and 50's and is described as a "non-directive" approach to counselling. That is, unlike most other forms of counselling, the therapist does not offer treatment, disagree, point out contradictions, or make interpretations or diagnoses. Instead, CCT is founded on the belief that people have the capacity to figure out their own solutions, which can be facilitated by a psychologist who provides an accepting and understanding environment. As pointed out by Weizenbaum, "[this form of] psychiatric interview is one of the few examples of categorized dyadic natural language communication in which one of the participating pair is free to assume the pose of knowing almost nothing of the real world." For example, an appropriate response to a client's comment of "I went for a long walk" could be "Tell me about long walks." In this reply, the client would not assume that the therapist knew nothing about long walks, but instead had some motive for steering the conversation in this direction. Such assumptions make this an appealing domain to simulate, as a degree of realism can be obtained without the need for storing explicit information about the real world.

WS 2011/2012 Natural Language Systems 22

ELIZA – pattern-matching (3)

How successful is ELIZA ?

WS 2011/2012 Natural Language Systems 23

ELIZA – pattern-matching (4)

How does ELIZA work?

• identifying keywords or phrases that the user inputs

• using patterns associated with these phrases to generate responses

• the most basic of these output patterns respond identically to all sentences containing the keyword

WS 2011/2012 Natural Language Systems 24

ELIZA – pattern-matching (5)

How does ELIZA work?

single keywords triggering a response:

key: xnone
answer: I'm not sure I understand you fully.
answer: That is interesting. Please continue.

key: sorry
answer: Please don't apologise.
answer: Apologies are not necessary.

xnone = ELIZA responds to an input sentence that is not understood
(xnone is the default used when no other keyword is found in the sentence)

sorry = ELIZA responds to an input sentence that contains the word „sorry"
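A minimal Python sketch of this single-keyword mechanism (the script entries and the round-robin choice of answers are simplifying assumptions, not Weizenbaum's original implementation):

```python
import itertools

# Tiny ELIZA-style script: each keyword maps to a cycle of canned answers.
# "xnone" is the fallback used when no keyword matches the input.
SCRIPT = {
    "sorry": itertools.cycle([
        "Please don't apologise.",
        "Apologies are not necessary.",
    ]),
    "xnone": itertools.cycle([
        "I'm not sure I understand you fully.",
        "That is interesting. Please continue.",
    ]),
}

def respond(sentence: str) -> str:
    """Return the next canned answer for the first keyword found, else the default."""
    words = sentence.lower().split()
    for keyword in SCRIPT:
        if keyword != "xnone" and keyword in words:
            return next(SCRIPT[keyword])
    return next(SCRIPT["xnone"])

if __name__ == "__main__":
    print(respond("I am sorry for being late."))   # -> Please don't apologise.
    print(respond("The weather is nice today."))   # -> I'm not sure I understand you fully.
```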

WS 2011/2012 Natural Language Systems 25

ELIZA – pattern-matching (6)

How does ELIZA work?

keyphrases triggering a response with a conversion:

key: I like xxx. (where xxx is an arbitrary string)

answer: Why do you like xxx ?
answer: Why do you say you like xxx ?

Example:

user: I like xxx.
ELIZA: Why do you like xxx ?

WS 2011/2012 Natural Language Systems 26

ELIZA – pattern-matching (7)

How does ELIZA work?

keyphrases triggering a response with a conversion:

key: I am xxx. (where xxx is an arbitrary string)

answer: Tell me why you think you are xxx.

Example:

user: I am very unhappy at the moment.
ELIZA: Tell me why you think you are very unhappy at the moment.

WS 2011/2012 Natural Language Systems 27

ELIZA – pattern-matching (8)

How does ELIZA work?

keyphrases triggering a response with a conversion plus postprocessing of reference words:

key: remember
decomp: * I remember *

answer: Do you often think of (2) ?
answer: What else do you recollect ?

Example:

user: I remember my first boyfriend.

Decomposition: the first * = empty string, the second * = "my first boyfriend" (= (2))
Postprocessing of reference words: "my" -> "your"

ELIZA: Do you often think of your first boyfriend ?
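A hedged Python sketch of this decomposition-plus-reflection step, assuming a simple regular-expression implementation (the rule format and the reflect helper are illustrative, not ELIZA's actual source):

```python
import re

# Pronoun "reflection" applied to the captured fragment
# (the "postprocessing of reference words" from the slide).
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

# One decomposition rule: "* I remember *", reassembled with the reflected second part.
RULES = [
    (re.compile(r".*\bI remember\b\s*(.*)", re.IGNORECASE),
     "Do you often think of {0} ?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(sentence: str) -> str:
    sentence = sentence.rstrip(" .!?")
    for pattern, template in RULES:
        match = pattern.match(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."

if __name__ == "__main__":
    print(respond("I remember my first boyfriend."))
    # -> Do you often think of your first boyfriend ?
```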

WS 2011/2012 Natural Language Systems 28

ELIZA – pattern-matching (9)

Now it‘s your turn !

Try out ELIZA, make up your own mind as to ELIZA's realism. Get a first idea of man-machine communication.

WS 2011/2012 Natural Language Systems 29

ELIZA – pattern-matching (10)

to „play" with ELIZA, see the following links:

ELIZA program:

http://www.manifestation.com/neurotoys/eliza.php3

http://www-ai.ijs.si/eliza-cgi-bin/eliza_script

http://www-ai.ijs.si/eliza/eliza.html

Reading:
Joseph Weizenbaum: ELIZA - A Computer Program For the Study of Natural Language Communication Between Man And Machine. In: Communications of the ACM. New York 9.1966,1. ISSN 0001-0782

WS 2011/2012 Natural Language Systems 30

but now back to the history of NLP / CL

WS 2011/2012 Natural Language Systems 31

history of NLP / CL

How did it all start ?

1970-1980: knowledge-based expert systems and natural language database interfaces, development of formal grammars (esp. syntax analysis)

dialogue systems
1972 SHRDLU (Winograd)
1977 GUS (Bobrow et al.), PAL (Sidner et al.)

natural language interfaces
1972 LUNAR (Woods et al.)
1972-1976 RENDEZVOUS (Codd), REL (Thompson), REQUEST (Plath)
1977 LIFER (Hendrix), INTELLECT (Harris), PLANES (Waltz et al.), CO-OP (Kaplan)

WS 2011/2012 Natural Language Systems 32

SHRDLU

Person: PICK UP A BIG RED BLOCK.
Computer: OK. (does it)
Person: GRASP THE PYRAMID.
Computer: I DON'T UNDERSTAND WHICH PYRAMID YOU MEAN.
Person: FIND A BLOCK WHICH IS TALLER THAN THE ONE YOU ARE HOLDING AND PUT IT INTO THE BOX.
Computer: BY "IT", I ASSUME YOU MEAN THE BLOCK WHICH IS TALLER THAN THE ONE I AM HOLDING.
Computer: OK. (does it)
Person: WHAT DOES THE BOX CONTAIN?
Computer: THE BLUE PYRAMID AND THE BLUE BLOCK.
Person: WHAT IS THE PYRAMID SUPPORTED BY?
Computer: THE BOX.
Person: HOW MANY BLOCKS ARE NOT IN THE BOX?
Computer: FOUR OF THEM.
Person: IS AT LEAST ONE OF THEM NARROWER THAN THE ONE WHICH I TOLD YOU TO PICK UP?
Computer: YES, THE RED CUBE.
Person: IS IT SUPPORTED?
Computer: YES, BY THE TABLE.

WS 2011/2012 Natural Language Systems 33

Natural language DB interface

LanguageAccess (natural language interface to a relational database)

Sentence xy: WHICH COUNTRY EXPORTS FISH

SQL query:

SELECT DISTINCT X1.COUNTRY, X1.PRODUCT
FROM EXPORTBASE X1
WHERE X1.PCLASS = 'FMF'

WS 2011/2012 Natural Language Systems 34

history of NLP / CL

How did it all start ?

text "understanding" and text generating systems
1975 MARGIE (Schank et al.), SAM (Schank et al.)
1976-1979 TALE-SPIN (Meehan), PAM (Wilensky), FRUMP (DeJong)
1980 PHRAN (Wilensky)

• 1980-1990: focus on semantic-pragmatic analysis, natural language applications, models of complex communication patterns

- robust dialogue systems

- integration of natural language components in expert systems

- knowledge acquisition via natural language (both man and machine learn)

WS 2011/2012 Natural Language Systems 35

history of NLP / CL

How did it all start ?

• 1990-2000+: machine translation (revival), data mining / text mining, intelligent text processing systems (text critiquing), integration of computational-linguistic components in multimedia (CALL, CBT, TELL, ...) ...

boom (integration of NLP everywhere)

that's where we are today:

- growing demand

- growing size of applications

- growing user expectations

WS 2011/2012 Natural Language Systems 36

NLP / CL today

we have come very far,

but... ...there are still a lot of open questions:

• what is knowledge ?

• when do we have to consider knowledge in natural language processing ?

• how can knowledge be formalized ?

• how are the analysis of language and the understanding of language interrelated ?

• what is communication ?

• easy (?) natural language

• technical language as a „dialect“ of natural language (e.g. medical language)

• artificial language as „meta language“ (e.g. Esperanto)

• logics (a special form of representation on an abstract level)

WS 2011/2012 Natural Language Systems 37

Question : Natural language ... easy ?

…after all… we all use / speak / write it

Does this mean natural language is easy and easy to formalize ?

WS 2011/2012 Natural Language Systems 38

Natural language ... easy ?

Little Red Riding Hood

Rotkäppchen

Do you remember the story of the little girl who wore a red cape and met a wolf on the way to her grandmother's house ?

What's the problem ?

a little girl -> in German, -chen is the diminutive

Don -> Donny ; Kate -> Katie ; Bill -> Billy

WS 2011/2012 Natural Language Systems 39

Natural language ... easy ?

other application: natural language database query

LanguageAccess (natural language interface to a relational database)

Sentence xy: WHICH COUNTRY EXPORTS FISH

natural language paraphrase / disambiguation of the input:

Which interpretation did you mean ?

Which country exports the product fish (fish = object)

Which country is exported by fish (fish = subject)

in German: with a zero article, the sentence is ambiguous (disambiguation by case marking of the article)

SQL query:

SELECT DISTINCT X1.COUNTRY, X1.PRODUCT
FROM EXPORTBASE X1
WHERE X1.PCLASS = 'FMF'

WS 2011/2012 Natural Language Systems 40

Natural language ... easy ?

SENTENCE XY: Who placed as many software orders as Garzillo?

SQL query:

SELECT DISTINCT X1.NAME, X1.PURCHASERNUMBER
FROM PURCHASERS X1, ORDERS X2
WHERE X1.PURCHASERNUMBER = X2.PURCHASERNUMBER
GROUP BY X1.PURCHASERNUMBER, X1.NAME
HAVING COUNT(*) >=
    (SELECT COUNT(*)
     FROM ORDERS X3, PURCHASERS X4
     WHERE X3.PURCHASERNUMBER = X4.PURCHASERNUMBER
       AND X4.NAME = 'Garzillo'
     GROUP BY X3.PURCHASERNUMBER)

WS 2011/2012 Natural Language Systems 41

Natural language ... easy ?

language is extremely ambiguous

easy for humans ??? easy for machines ???

• lexical: The pipe was brand-new.

• structural: I saw the man with the telescope. (see the parse sketch below)

• deep structural: She got ready for the picture.

• semantic: Mary wants to get married to an Italian.

• pragmatic: While walking from the gate to the house, it collapsed.
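For the structural example, a toy grammar makes the ambiguity concrete: the PP "with the telescope" can attach either to the verb phrase or to the noun phrase. A small sketch using NLTK (assuming the nltk package is installed; the grammar is invented for this one sentence):

```python
import nltk

# Toy grammar in which a PP may attach either to the VP or to the NP.
grammar = nltk.CFG.fromstring("""
S    -> NP VP
NP   -> Pron | Det N | NP PP
VP   -> V NP | VP PP
PP   -> P NP
Pron -> 'I'
Det  -> 'the'
N    -> 'man' | 'telescope'
V    -> 'saw'
P    -> 'with'
""")

parser = nltk.ChartParser(grammar)
sentence = "I saw the man with the telescope".split()

# Prints two parse trees: one where I use the telescope to see,
# one where the man has the telescope.
for tree in parser.parse(sentence):
    print(tree)
```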

WS 2011/2012 Natural Language Systems 42

Natural language ... easy ?

language is complex... you can say a lot with a few words

Mary sold John a book.

surface structure (obvious): transfer of book

deep structure (implication of „to sell“): transfer of money

WS 2011/2012 Natural Language Systems 43

Natural language ... easy ?

language can do a lot... e.g. with conjunctions:

NP-NP     I am eating a hamburger and a pizza.
VP-VP     I will eat the hamburger and throw away the pizza.
S-S       I eat a hamburger and Bill eats a pizza.
PP-PP     I eat a pizza with ham and with salami.
ADJP-ADJP I eat a cold but delicious hamburger.
ADVP-ADVP I eat the hamburger slowly and patiently.
V-V       I bake and eat a hamburger.
AUX-AUX   I can and will eat a hamburger.

and even more...

???-???   Mary is sitting on and Bill under the table.

WS 2011/2012 Natural Language Systems 44

Natural language ... easy ?

language is analyzed on different levels

WS 2011/2012 Natural Language Systems 45

The 7 levels of language understanding

Needed knowledge -> analysis level:

features of the voice -> phonetic analysis

sound combinations of the language -> phonological analysis

dictionary -> morphological / lexical analysis

grammar rules (parser) -> syntactic analysis

knowledge representation -> semantic analysis

world knowledge -> pragmatic analysis

[diagram: the levels run from acoustic signals through sounds (æ ç Þ ð ţ ş), letters (Bill...), words (Billy...), and sentences (Billy is eating his lunch.) to knowledge (small (Billy), [Billy] is child of [mother]) and consequences]

WS 2011/2012 Natural Language Systems 46

Natural language ... easy ?

Why then natural language ?

Computers speak their own language. This language is efficient, economical, and exact. Why then would we want to „teach" the computer a natural language with all its ambiguities and difficulties ?

Back to the boom!

when your hands are busy and you still want to type (voice typing)

when you are a slow typist (voice typing)

when travelling (machine translation)

when you don't want to learn a programming language to program your computer (machine translation)

when you want to evaluate millions of lines of text (text / data mining)

when you don't want to learn a database query language to get at your data (Star Trek) (text analysis, text generation, machine translation)

when you need to make a phone call with someone in Japan, but you don't speak Japanese (voice recognition, machine translation)

WS 2011/2012 Natural Language Systems 47

applications of natural language systems

[diagram: applications grouped by modality — spoken text (speech input, speech output), dialogue systems (dialogue understanding), and written text (text analysis, generation, translation)]

Applications:

• speaker & voice recognition

• spoken commands / command & control

• automatic dictation

• text-to-speech

• telephony

• IVR (interactive voice response)

• information systems

• DB query

• expert systems

• CALL

• robot steering

• programming languages

• spell aid

• text critiquing

• text summaries

• knowledge acquisition (e.g. for expert systems)

• help functions for translations

• automatic translation

• simultaneous translation

• explanations for users

• knowledge representation

• text generation

• writing support