Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis (HLT/EMNLP 2005 )
Theresa Wilson, Janyce Wiebe, Paul Hoffmann
(University of Pittsburgh)
Acknowledgements: These slides are based on the presentation slides from http://www.cs.pitt.edu/~wiebe/
Outline
- Introduction
- Manual Annotations
- Corpus
- Prior-Polarity Subjectivity Lexicon
- Experiments
- Conclusions
Introduction (1/6)
Sentiment analysis: the task of identifying positive and negative opinions, emotions, and evaluations.
How detailed the analysis should be depends on the application:
- Flame detection, review classification: document-level analysis
- Question answering, review mining: sentence- or phrase-level analysis
Introduction (2/6)
QA example:
Q: What is the international reaction to the reelection of Robert Mugabe as President of Zimbabwe?
A: African observers generally approved of his victory while Western governments denounced it.
Introduction (3/6)
Prior polarity: use a lexicon of positive and negative words, out of context. Examples:
- beautiful: positive
- horrid: negative
Contextual polarity: a word may appear in a phrase that expresses a different polarity in context.
Example: Cheers to Timothy Whitfield for the wonderfully horrid visuals.
Introduction (4/6)
Another interesting example: Philip Clapp, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: "there is no reason at all to believe that the polluters are suddenly going to become reasonable."
Introduction (5/6)
In the same example, words like "reason" and "reasonable" have positive prior polarity, yet under "no reason at all to believe" their contextual polarity is not positive: prior polarity and contextual polarity diverge.
Introduction (6/6)
Goal: automatically recognize contextual polarity.
Approach: machine learning with a variety of features, in two steps.
[Diagram: Corpus + Lexicon -> Step 1 (all instances): Neutral or Polar? -> Step 2 (polar instances): Contextual Polarity?]
Manual Annotation (1/3)
Need: sentiment expressions (positive and negative expressions of emotions, evaluations, stances) annotated with their contextual polarity.
Had: subjective expression annotations (words/phrases expressing emotions, evaluations, stances, speculations, etc.) in the MPQA Opinion Corpus.
Decision: annotate the subjective expressions in the MPQA Corpus with their contextual polarity.
Manual Annotation (2/3)
Mark the polarity of subjective expressions as positive, negative, both, or neutral:
- African observers generally approved (positive) of his victory while Western governments denounced (negative) it.
- Besides, politicians refer to good and evil (both) ...
- Jerome says the hospital feels (neutral) no different than a hospital in the states.
Judge the contextual polarity of the sentiment ultimately being conveyed:
- They have not succeeded, and will never succeed (positive), in breaking the will of this valiant people.
Manual Annotation (3/3)
Agreement study: 2 annotators, 10 documents with 447 subjective expressions.
- Kappa: 0.72 (82% agreement)
- Removing the cases that at least one annotator marked uncertain (18% of the data): Kappa 0.84 (90% agreement)
- All data, including the uncertain cases, are included in the experiments.
Corpus
425 documents from the MPQA Opinion Corpus: 15,991 subjective expressions in 8,984 sentences.
Divided into two sets:
- Development set: 66 docs / 2,808 subjective expressions
- Experiment set: 359 docs / 13,183 subjective expressions, divided into 10 folds for cross-validation
Prior-Polarity Subjectivity Lexicon
Over 8,000 words from a variety of sources, both manually and automatically identified, including positive/negative words from the General Inquirer and from Hatzivassiloglou and McKeown (1997).
All words in the lexicon are tagged with:
- Prior polarity: positive, negative, both, or neutral
- Reliability: strongly subjective (strongsubj) or weakly subjective (weaksubj)
Experiment
Both steps use BoosTexter (AdaBoost.MH) with 5,000 rounds of boosting and 10-fold cross-validation; each instance is given its own label.
[Diagram: Step 1 (all instances, 28 features): Neutral or Polar? -> Step 2 (polar instances, 10 features): Contextual Polarity?]
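As a rough sketch of this two-step setup (not the authors' BoosTexter code), the following uses scikit-learn's AdaBoostClassifier as a stand-in for AdaBoost.MH; the instance objects and the step1_feats / step2_feats feature extractors are hypothetical placeholders.

    # Minimal sketch of the two-step classification setup; scikit-learn's
    # AdaBoostClassifier stands in for BoosTexter's AdaBoost.MH, and the
    # feature extractors and instance attributes are assumed placeholders.
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.pipeline import make_pipeline

    def two_step_classify(train, test, step1_feats, step2_feats):
        # Step 1: neutral vs. polar, trained on all instances.
        step1 = make_pipeline(DictVectorizer(), AdaBoostClassifier(n_estimators=5000))
        step1.fit([step1_feats(x) for x in train],
                  ["neutral" if x.gold == "neutral" else "polar" for x in train])

        # Step 2: positive/negative/both/neutral, trained on polar instances only.
        polar = [x for x in train if x.gold != "neutral"]
        step2 = make_pipeline(DictVectorizer(), AdaBoostClassifier(n_estimators=5000))
        step2.fit([step2_feats(x) for x in polar], [x.gold for x in polar])

        # Only instances judged polar in Step 1 are passed on to Step 2.
        out = []
        for x in test:
            if step1.predict([step1_feats(x)])[0] == "neutral":
                out.append("neutral")
            else:
                out.append(step2.predict([step2_feats(x)])[0])
        return out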
Definition of the Gold Standard
Given an instance inst from the lexicon:
- if inst is not in a subjective expression: goldclass(inst) = neutral
- else if inst is in at least one positive and one negative subjective expression: goldclass(inst) = both
- else if inst is in a mixture of negative and neutral expressions: goldclass(inst) = negative
- else if inst is in a mixture of positive and neutral expressions: goldclass(inst) = positive
- else: goldclass(inst) = the contextual polarity of the subjective expression
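This decision list translates directly into code; a minimal sketch, assuming an instance is represented by the set of contextual polarities of the subjective expressions it appears in (cases involving the "both" label are not spelled out on the slide and fall through to the final branch):

    def goldclass(expression_polarities):
        # expression_polarities: contextual polarities of the subjective
        # expressions the instance appears in (empty if it is in none).
        labels = set(expression_polarities)
        if not labels:
            return "neutral"
        if "positive" in labels and "negative" in labels:
            return "both"
        if labels == {"negative", "neutral"}:
            return "negative"
        if labels == {"positive", "neutral"}:
            return "positive"
        return next(iter(labels))  # a single shared polarity: use it directly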
Features
Many are inspired by Polanyi & Zaenen (2004), Contextual Valence Shifters. Examples: little threat, little truth.
Others capture dependency relationships between words. Example: wonderfully -mod-> horrid.
Step 1 Features: Word Features
(The 28 Step 1 features fall into word, modification, structure, sentence, and document features.)
- Word token: terrifies
- Word part-of-speech: VB
- Context (3 word tokens): that terrifies me
- Prior polarity: negative
- Reliability: strongsubj
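A minimal sketch of how these word features might be assembled (the token representation and the lexicon lookup are assumptions, not the paper's code):

    def word_features(tokens, i, lexicon):
        # tokens: list of (word, pos) pairs; i: index of the clue instance;
        # lexicon: word -> (prior_polarity, reliability), both assumed shapes.
        word, pos = tokens[i]
        prior, reliability = lexicon.get(word.lower(), ("neutral", "weaksubj"))
        return {
            "token": word.lower(),
            "pos": pos,
            # previous, current, and next word token
            "context": " ".join(w for w, _ in tokens[max(0, i - 1):i + 2]),
            "prior_polarity": prior,
            "reliability": reliability,
        }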
Step 1 Features: Modification Features
(Binary features, read off the dependency parse tree.)
- Preceded by: an adjective; an adverb (other than not); an intensifier (e.g. deeply, entirely, ...)
- Self is an intensifier
- Modifies: a strongsubj clue; a weaksubj clue
- Modified by: a strongsubj clue; a weaksubj clue
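A sketch of the modifies / modified-by checks over a dependency parse, here using spaCy as a stand-in for the parser used in the paper; the reliability lookup (word -> strongsubj/weaksubj) is an assumed dictionary:

    import spacy

    nlp = spacy.load("en_core_web_sm")  # any English spaCy model

    def modification_features(doc, i, reliability):
        # doc: a parsed spaCy Doc; i: index of the clue token;
        # reliability: word -> "strongsubj" or "weaksubj" (assumed lookup).
        tok, feats = doc[i], {}
        r = reliability.get(tok.head.lemma_)
        if tok.head is not tok and r:
            feats["modifies_" + r] = True         # the clue modifies another clue
        for child in tok.children:
            r = reliability.get(child.lemma_)
            if r:
                feats["modified_by_" + r] = True  # another clue modifies this one
        return feats

    # e.g. modification_features(nlp("wonderfully horrid visuals"), 1, reliability)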
Step 1 Features: Structure Features
(Binary features, computed by climbing up the dependency parse tree toward the root.)
- In subject: The human rights report poses ...
- In copular: I am confident ...
- In passive voice: ... must be regarded ...
[Diagram: dependency parse tree of "The human rights report poses a substantial challenge ..." with det, adj, mod, subj, and obj arcs.]
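A sketch of the tree climb, again over a spaCy parse (the dependency label names follow spaCy's English models, not necessarily the parser used in the paper):

    def structure_features(doc, i):
        # Climb from the clue token toward the root, recording binary features.
        feats, tok = {}, doc[i]
        while True:
            if tok.dep_ in ("nsubj", "nsubjpass"):
                feats["in_subject"] = True                     # "The ... report poses"
            if tok.dep_ == "acomp":
                feats["in_copular"] = True                     # "I am confident"
            if any(c.dep_ == "auxpass" for c in tok.children):
                feats["in_passive"] = True                     # "must be regarded"
            if tok.head is tok:                                # reached the root
                return feats
            tok = tok.head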
Step 1 Features: Sentence Features
- Count of strongsubj clues in the previous, current, and next sentence
- Count of weaksubj clues in the previous, current, and next sentence
- Counts of various parts of speech: adjectives, adverbs, whether the word is a pronoun, ...
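A sketch of the clue counts over the three-sentence window (the sentence representation and the reliability lookup are assumptions):

    def sentence_features(sentences, s, reliability):
        # sentences: list of lists of word tokens; s: index of the current
        # sentence; reliability: word -> "strongsubj"/"weaksubj" (assumed).
        feats = {}
        for offset, name in [(-1, "prev"), (0, "cur"), (1, "next")]:
            if 0 <= s + offset < len(sentences):
                sent = sentences[s + offset]
                for kind in ("strongsubj", "weaksubj"):
                    feats[f"{kind}_{name}"] = sum(
                        reliability.get(w.lower()) == kind for w in sent)
        return feats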
Step 1 Features: Document Feature
- Document topic (one of 15): economics, health, Kyoto protocol, presidential election in Zimbabwe, ...
For example, a document on health may contain the word "fever" without using it to express a sentiment.
Results 1a
[Bar chart: Accuracy, Polar F, and Neutral F (y-axis 40-90) for the word-token, word + prior polarity, and all-features classifiers. With all features: Accuracy 75.9, Polar F 63.4, Neutral F 82.1.]
Results 1b
[Bar chart: Polar Recall and Polar Precision (y-axis 30-80) for the word-token, word + prior polarity, and all-features classifiers.]
Step 2: Polarity Classification
Classes: positive, negative, both, neutral.
[Diagram: of the 19,506 instances entering Step 1, the 5,671 judged polar are passed to Step 2.]
Step 2 Features (10 features)
Word token, word prior polarity, negated, negated subject, modifies polarity, modified by polarity, conjunction polarity, general polarity shifter, negative polarity shifter, positive polarity shifter.
Step 2 Features: Word Token and Prior Polarity
- Word token: terrifies
- Word prior polarity: negative
Step 2 Features: Negation
(Binary features.)
- Negated: not good; does not look very good
- Negated subject: No politically prudent Israeli could support either of them.
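A rough sketch of the two negation features over a spaCy parse; the negation word list is illustrative, and the paper's actual rules (which, e.g., skip intensifiers such as "very") are only approximated:

    NEGATIONS = {"not", "no", "never", "n't"}  # illustrative, not the paper's list

    def negation_features(doc, i):
        tok, feats = doc[i], {}
        # Negated: a negation attaches to the clue's head or appears shortly before.
        if any(c.dep_ == "neg" for c in tok.head.children) or \
           any(w.lower_ in NEGATIONS for w in doc[max(0, i - 4):i]):
            feats["negated"] = True
        # Negated subject: the clue's clause has a subject containing a negation.
        subj = next((c for c in tok.head.children if c.dep_ == "nsubj"), None)
        if subj is not None and any(w.lower_ in NEGATIONS for w in subj.subtree):
            feats["negated_subject"] = True
        return feats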
Step 2 Features: Modification Polarity
- Modifies polarity (5 values: positive, negative, neutral, both, notmod): the prior polarity of the word the clue modifies. Example: substantial (pos) modifies challenge (neg), so for substantial the value is negative.
- Modified by polarity (5 values: positive, negative, neutral, both, notmod): the prior polarity of the word modifying the clue. Example: challenge is modified by substantial, so for challenge the value is positive.
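A sketch of these two 5-valued features over a spaCy parse; prior is an assumed word -> prior-polarity dictionary:

    def modification_polarity(doc, i, prior):
        tok = doc[i]
        # "Modifies polarity": prior polarity of the clue's head, if it is a clue.
        modifies = "notmod"
        if tok.head is not tok and tok.head.lemma_ in prior:
            modifies = prior[tok.head.lemma_]      # substantial -> challenge (neg)
        # "Modified by polarity": prior polarity of a modifier that is a clue.
        modified_by = "notmod"
        for child in tok.children:
            if child.lemma_ in prior:
                modified_by = prior[child.lemma_]  # challenge <- substantial (pos)
        return {"modifies_polarity": modifies, "modified_by_polarity": modified_by}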
Step 2 Features: Conjunction Polarity
- Conjunction polarity (5 values: positive, negative, neutral, both, notmod): the prior polarity of a word the clue is conjoined with. Example: good (pos) and evil (neg), so for good the value is negative.
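The analogous sketch over coordination arcs (using spaCy's Token.conjuncts; prior is the same assumed dictionary as above):

    def conjunction_polarity(doc, i, prior):
        # Return the prior polarity of a conjoined clue, e.g. good <-> evil.
        for other in doc[i].conjuncts:
            if other.lemma_ in prior:
                return {"conjunction_polarity": prior[other.lemma_]}
        return {"conjunction_polarity": "notmod"}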
Step 2 Features: Polarity Shifters
(Each looks in a window of 4 words before the clue.)
- General polarity shifter: pose little threat; contains little truth
- Negative polarity shifter: lack of understanding
- Positive polarity shifter: abate the damage
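A sketch of the window check; the shifter word lists are illustrative stand-ins for the paper's lexicons:

    GENERAL_SHIFTERS = {"little", "hardly"}   # illustrative only
    NEGATIVE_SHIFTERS = {"lack"}
    POSITIVE_SHIFTERS = {"abate"}

    def shifter_features(words, i):
        # Look at the 4 word tokens before the clue at index i.
        window = {w.lower() for w in words[max(0, i - 4):i]}
        return {
            "general_shifter": bool(window & GENERAL_SHIFTERS),
            "negative_shifter": bool(window & NEGATIVE_SHIFTERS),
            "positive_shifter": bool(window & POSITIVE_SHIFTERS),
        }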
Results 2a
[Bar chart: Accuracy, Pos F, Neg F, and Neutral F (y-axis 30-90) for the word-token, word + prior polarity, and all-features classifiers. With all features: Accuracy 65.7, Pos F 65.1, Neg F 77.2, Neutral F 46.2.]
Results 2b
[Bar chart: Pos Recall, Pos Precision, Neg Recall, and Neg Precision (y-axis 40-90) for the word-token, word + prior polarity, and all-features classifiers.]
Ablation Experiments
Feature groups removed:
- AB1: negated, negated subject
- AB2: modifies polarity, modified by polarity
- AB3: conjunction polarity
- AB4: general, negative, and positive polarity shifters
Results: the only significant difference is in neutral F-measure when the AB2 features are removed; the combination of features is needed to achieve significant improvements.
Conclusion
- Automatically identified the contextual polarity of a large subset of sentiment expressions
- Presented a two-step approach to phrase-level sentiment analysis:
  1. Determine whether an expression is neutral or polar
  2. Determine the contextual polarity of the expressions that are polar
- Achieved significant results for a large subset of sentiment expressions
Q & A