
Page 1: Using Error-Correcting Codes For Text Classification

Using Error-Correcting Codes For Text Classification

Rayid Ghani
[email protected]

This presentation can be accessed at http://www.cs.cmu.edu/~rayid/talks/

Page 2: Using Error-Correcting Codes For Text Classification

Outline

- Introduction to ECOC
- Intuition & Motivation
- Some Questions?
- Experimental Results
- Semi-Theoretical Model
- Types of Codes
- Drawbacks
- Conclusions

Page 3: Using Error-Correcting Codes For Text Classification

Introduction

Decompose a multiclass classification problem into multiple binary problems:

- One-Per-Class approach (moderately expensive)
- All-Pairs (very expensive)
- Distributed Output Code (efficient, but what about performance?)
- Error-Correcting Output Codes (?)

Page 4: Using Error-Correcting Codes For Text Classification
Page 5: Using Error-Correcting Codes For Text Classification

Is it a good idea?

- Larger margin for error, since errors can now be "corrected"
- One-per-class is a code with minimum Hamming distance (HD) = 2
- Distributed codes have low HD

But:

- The individual binary problems can be harder than before
- Useless unless the number of classes > 5

Page 6: Using Error-Correcting Codes For Text Classification

Training ECOC

Given m distinct classes:

1. Create an m x n binary matrix M.
2. Each class is assigned ONE row of M.
3. Each column of the matrix divides the classes into TWO groups.
4. Train the base classifiers to learn the n binary problems.
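The slides contain no code, but the training step is easy to sketch. A minimal Python sketch under assumed conventions (y holds integer class indices 0..m-1, and make_base_learner returns a fresh scikit-learn-style binary classifier; both names are hypothetical, not from the slides):

```python
import numpy as np

def train_ecoc(X, y, code_matrix, make_base_learner):
    """Train one binary classifier per column of the m x n code matrix.

    code_matrix[r, c] is the bit assigned to class r in the c-th binary
    problem, so each column splits the m classes into two groups.
    """
    classifiers = []
    for c in range(code_matrix.shape[1]):
        # Relabel every training example with its class's bit in column c.
        binary_labels = code_matrix[y, c]
        clf = make_base_learner()
        clf.fit(X, binary_labels)
        classifiers.append(clf)
    return classifiers
```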

Page 7: Using Error-Correcting Codes For Text Classification

Testing ECOC

To test a new instance:

1. Apply each of the n classifiers to the new instance.
2. Combine the predictions to obtain a binary string (codeword) for the new point.
3. Classify to the class with the nearest codeword (usually Hamming distance is used as the distance measure).
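Continuing the sketch above (same assumed conventions), decoding picks the row of the code matrix nearest in Hamming distance to the predicted bit string:

```python
import numpy as np

def predict_ecoc(X, classifiers, code_matrix):
    """Decode to the class whose codeword is nearest in Hamming distance."""
    # Each of the n classifiers contributes one bit of the predicted codeword.
    bits = np.column_stack([clf.predict(X) for clf in classifiers])
    # Hamming distance from each predicted codeword to each class's row.
    distances = (bits[:, None, :] != code_matrix[None, :, :]).sum(axis=2)
    return distances.argmin(axis=1)  # index of the nearest row, i.e. the class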

Page 8: Using Error-Correcting Codes For Text Classification

ECOC - Picture

       f1   f2   f3   f4   f5
  A     0    0    1    1    0
  B     1    0    1    0    0
  C     0    1    1    1    0
  D     0    1    0    0    1


Page 11: Using Error-Correcting Codes For Text Classification

ECOC - Picture

       f1   f2   f3   f4   f5
  A     0    0    1    1    0
  B     1    0    1    0    0
  C     0    1    1    1    0
  D     0    1    0    0    1

  X     1    1    1    1    0

The test instance's predicted codeword X = 1 1 1 1 0 is nearest to C's row (Hamming distance 1), so X is classified as C.

Page 12: Using Error-Correcting Codes For Text Classification

Questions?

- How well does it work?
- How long should the code be?
- Do we need a lot of training data?
- What kinds of codes can we use?
- Are there intelligent ways of creating the code?

Page 13: Using Error-Correcting Codes For Text Classification

Previous Work

- Combined with boosting: AdaBoost.OC (Schapire, 1997), (Guruswami & Sahai, 1999)
- Local learners (Ricci & Aha, 1997)
- Text classification (Berger, 1999)

Page 14: Using Error-Correcting Codes For Text Classification

Experimental Setup

- Generate the code: BCH codes
- Choose a base learner: Naive Bayes classifier, as used in text classification tasks (McCallum & Nigam 1998)
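For a rough modern reproduction of this setup, scikit-learn ships an ECOC wrapper. Note that it draws a random code matrix rather than the BCH codes used in these experiments, so this is an approximation of the setup, not a replica:

```python
from sklearn.multiclass import OutputCodeClassifier
from sklearn.naive_bayes import MultinomialNB

# code_size is the ratio n / m; 0.6 * 105 classes gives a ~63-bit code.
ecoc = OutputCodeClassifier(estimator=MultinomialNB(),
                            code_size=0.6,
                            random_state=0)
# ecoc.fit(X_train, y_train)
# predictions = ecoc.predict(X_test)
```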

Page 15: Using Error-Correcting Codes For Text Classification

Dataset

Industry Sector Dataset: company web pages classified into 105 economic sectors.

- Standard stoplist
- No stemming
- Skip all MIME headers and HTML tags
- Experimental approach similar to McCallum et al. (1998) for comparison purposes

Page 16: Using Error-Correcting Codes For Text Classification

Results

Industry Sector Data Set

Classifier         Accuracy
Naive Bayes        66.1%
Shrinkage [1]      76%
ME [2]             79%
ME w/ Prior [3]    81.1%
ECOC 63-bit        88.5%

ECOC reduces the error of the Naive Bayes classifier by 66%.

[1] (McCallum et al. 1998)   [2, 3] (Nigam et al. 1999)

Page 17: Using Error-Correcting Codes For Text Classification

The Longer the Better!

              Naive Bayes   15-bit ECOC   31-bit ECOC   63-bit ECOC
Accuracy (%)  65.3          77.4          83.6          88.1

Table 2: Average classification accuracy on 5 random 50-50 train-test splits of the Industry Sector dataset with a vocabulary size of 10000 words selected using Information Gain.

- Longer codes mean larger codeword separation.
- The minimum Hamming distance of a code C is the smallest distance between any pair of distinct codewords in C.
- If the minimum Hamming distance is h, then the code can correct up to ⌊(h - 1)/2⌋ errors.
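Both Hamming-distance facts above are cheap to check on any code matrix. A small self-contained sketch (the one-per-class example is mine, not from the slides; it also illustrates the earlier claim that one-per-class has minimum HD = 2):

```python
from itertools import combinations
import numpy as np

def min_hamming_distance(code_matrix):
    """Smallest Hamming distance between any pair of distinct codewords."""
    return min(int((a != b).sum()) for a, b in combinations(code_matrix, 2))

# One-per-class for 4 classes is the identity matrix: minimum HD = 2,
# so it can correct floor((2 - 1) / 2) = 0 errors, i.e. no margin at all.
one_per_class = np.eye(4, dtype=int)
h = min_hamming_distance(one_per_class)
print(h, (h - 1) // 2)  # -> 2 0
```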

Page 18: Using Error-Correcting Codes For Text Classification

Size Matters?

[Figure: Variation of accuracy with code length and training size: accuracy (%) vs. training size per class, one curve each for SBC and the 15-bit, 31-bit, and 63-bit codes]

Page 19: Using Error-Correcting Codes For Text Classification

Size does NOT matter!

[Figure: Percent decrease in error with training size and length of code: % decrease in error vs. training size, one curve each for the 15-bit, 31-bit, and 63-bit codes]

Page 20: Using Error-Correcting Codes For Text Classification

Semi-Theoretical Model

Model ECOC by a binomial distribution B(n, p):

- n = length of the code
- p_ave = average probability of each bit being classified incorrectly

A test instance is classified correctly whenever at most $E_{\max} = \lfloor (H_{\min} - 1)/2 \rfloor$ of its n bits are in error, so the predicted accuracy is

$$\mathrm{Accuracy} = \sum_{i=0}^{E_{\max}} \binom{n}{i} \, p_{ave}^{\,i} \, (1 - p_{ave})^{\,n-i}$$

Page 21: Using Error-Correcting Codes For Text Classification

Semi-Theoretical Model

Predicted accuracies from the binomial model for several code lengths. (Note: the Pave column lists the average per-bit accuracy, so the error probability in the formula is 1 - Pave.)

# of Bits   Hmin   Emax   Pave   Accuracy
15          5      2      .85    .59
15          5      2      .89    .80
15          5      2      .91    .84
31          11     5      .85    .67
31          11     5      .89    .91
31          11     5      .91    .94
63          31     15     .89    .99
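The model is easy to check numerically. A small sketch that reproduces the table's predictions, under the assumption noted above that Pave is the per-bit accuracy (so each bit errs independently with probability 1 - Pave); the predictions land within a point or two of the table:

```python
from math import comb

def predicted_accuracy(n, h_min, p_ave):
    """P(at most E_max bit errors) when errors are Binomial(n, 1 - p_ave)."""
    e_max = (h_min - 1) // 2
    p_err = 1.0 - p_ave
    return sum(comb(n, i) * p_err**i * (1 - p_err)**(n - i)
               for i in range(e_max + 1))

print(round(predicted_accuracy(15, 5, 0.85), 2))   # ~0.60 (table: .59)
print(round(predicted_accuracy(31, 11, 0.91), 2))  # ~0.95 (table: .94)
```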


Page 23: Using Error-Correcting Codes For Text Classification

Theoretical vs. Experimental

[Figure: Theoretical vs. experimental accuracy (%) for 15-, 31-, and 63-bit codes; vocabulary size = 10000]

Page 24: Using Error-Correcting Codes For Text Classification

Types of Codes

- Data-Independent: Algebraic, Random, Hand-Constructed
- Data-Dependent: Adaptive

Page 25: Using Error-Correcting Codes For Text Classification

What is a Good Code?

- Row separation
- Column separation (independence of errors for each binary classifier)
- Efficiency (for long codes)
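Row and column separation are both just pairwise Hamming distances, so they are cheap to measure for a candidate code. A small sketch (the 4 x 5 example matrix is the one from the ECOC picture earlier; the min/max outputs mirror the columns of the results table two slides below):

```python
from itertools import combinations
import numpy as np

def separation(code_matrix):
    """Min/max pairwise Hamming distance over rows and over columns.

    Note: a fuller column check would also compare each column against the
    complements of the others, since complementary columns induce the same
    binary problem.
    """
    def pair_dists(vectors):
        return [int((a != b).sum()) for a, b in combinations(vectors, 2)]
    rows, cols = pair_dists(code_matrix), pair_dists(code_matrix.T)
    return (min(rows), max(rows)), (min(cols), max(cols))

M = np.array([[0, 0, 1, 1, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 1, 0, 0, 1]])
print(separation(M))  # -> ((1, 4), (1, 4))
```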

Page 26: Using Error-Correcting Codes For Text Classification

Choosing Codes

             Random                        Algebraic
Row Sep      On average, for long codes    Guaranteed
Col Sep      On average, for long codes    Can be guaranteed
Efficiency   No                            Yes

Page 27: Using Error-Correcting Codes For Text Classification

Experimental Results

Code            Min Row HD   Max Row HD   Min Col HD   Max Col HD   Error Rate
15-bit BCH      5            15           49           64           20.6%
19-bit Hybrid   5            18           15           69           22.3%
15-bit Random   2 (1.5)      13           42           60           24.1%

Page 28: Using Error-Correcting Codes For Text Classification

Drawbacks

- Can be computationally expensive
- Random codes throw away the real-world nature of the data by picking random partitions to create artificial binary problems

Page 29: Using Error-Correcting Codes For Text Classification

Future Work

- Combine ECOC with Co-Training
- Automatically construct optimal / adaptive codes

Page 30: Using Error-Correcting Codes For Text Classification

Conclusion

- Improves classification accuracy considerably!
- Can be used when training data is sparse
- Algebraic codes perform better than random codes for a given code length
- Hand-constructed codes are not the answer