Machine Learning: Decision Trees Homework 4 assigned courtesy: Geoffrey Hinton, Yann LeCun, Tan, Steinbach, Kumar


Post on 23-Feb-2016



TRANSCRIPT

Page 1

Machine Learning: Decision Trees

Homework 4 assigned

courtesy: Geoffrey Hinton, Yann LeCun, Tan, Steinbach, Kumar

Page 2

What is machine learning?

It is very hard to write programs that solve problems like recognizing a face.
– We don't know what program to write because we don't know how it's done.
– Even if we had a good idea about how to do it, the program might be horrendously complicated.

Instead of writing a program by hand, we collect lots of examples that specify the correct output for a given input. A machine learning algorithm then takes these examples and produces a program that does the job.
– The program produced by the learning algorithm may look very different from a typical hand-written program. It may contain millions of numbers.
– If we do it right, the program works for new cases as well as the ones we trained it on.

Page 3

Page 4

Handwriting/face recognition/detection

Page 5

Visual object/action recognition


Aeroplanes Bicycles Birds Boats Buses Cars

Cats Trains Cows Chairs Dogs Horses

Page 6

Different types of learning

Supervised Learning: given training examples of inputs and corresponding outputs, produce the "correct" outputs for new inputs. Example: character recognition.

Unsupervised Learning: given only inputs as training, find structure in the world: discover clusters, manifolds, characterize the areas of the space to which the observed inputs belong

Reinforcement Learning: an agent takes inputs from the environment, and takes actions that affect the environment. Occasionally, the agent gets a scalar reward or punishment. The goal is to learn to produce action sequences that maximize the expected reward.

Page 7

Applications

– handwriting recognition, OCR: reading checks and zip codes, handwriting recognition for tablet PCs
– speech recognition, speaker recognition/verification
– security: face detection and recognition, event detection in videos
– text classification: indexing, web search
– computer vision: object detection and recognition
– diagnosis: medical diagnosis (e.g. pap smears processing)
– adaptive control: locomotion control for legged robots, navigation for mobile robots, minimizing pollutant emissions for chemical plants, predicting consumption for utilities...
– fraud detection: e.g. detection of "unusual" usage patterns for credit cards or calling cards
– database marketing: predicting who is more likely to respond to an ad campaign (...and the antidote: spam filtering)
– games (e.g. backgammon)
– financial prediction (many people on Wall Street use machine learning)

Page 8

Learning ≠ memorization

– Rote learning is easy: just memorize all the training examples and their corresponding outputs.
– When a new input comes in, compare it to all the memorized samples, and produce the output associated with the matching sample.
– PROBLEM: in general, new inputs are different from training samples. The ability to produce correct outputs or behavior on previously unseen inputs is called GENERALIZATION.
– Rote learning is memorization without generalization.
– The big question of Learning Theory (and practice): how to get good generalization with a limited number of examples.
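The contrast between memorization and generalization can be made concrete. A minimal sketch (the toy input/output pairs are hypothetical, not from the slides): a rote learner is just a lookup table, and it has no answer for any input it has not seen.

```python
# Rote learning as a lookup table: memorize (input -> output) pairs verbatim.
# The training pairs here are hypothetical toy data.
training = {(0, 0): "No", (1, 1): "Yes", (2, 0): "No"}

def rote_predict(x):
    # Exact-match lookup: returns None for anything not seen in training,
    # i.e. no generalization at all.
    return training.get(x)

print(rote_predict((1, 1)))  # seen during training -> Yes
print(rote_predict((1, 0)))  # unseen -> None
```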

Page 9

© Tan, Steinbach, Kumar, Introduction to Data Mining, 4/18/2004

Look ahead

Supervised learning
– Decision trees
– Linear models
– Neural networks

Unsupervised learning
– K-means clustering

http://www-stat.stanford.edu/~tibs/ElemStatLearn/ (full text in PDF)

Page 10

Learning from examples

[Diagram: Training Set → Learning algorithm → Learn Model (Induction) → Model → Apply Model (Deduction) → Test Set]

Training Set:

Tid  Attrib1  Attrib2  Attrib3  Class
1    Yes      Large    125K     No
2    No       Medium   100K     No
3    No       Small    70K      No
4    Yes      Medium   120K     No
5    No       Large    95K      Yes
6    No       Medium   60K      No
7    Yes      Large    220K     No
8    No       Small    85K      Yes
9    No       Medium   75K      No
10   No       Small    90K      Yes

Test Set:

Tid  Attrib1  Attrib2  Attrib3  Class
11   No       Small    55K      ?
12   Yes      Medium   80K      ?
13   Yes      Large    110K     ?
14   No       Small    95K      ?
15   No       Large    67K      ?

Page 11

Examples of Classification Task

– Handwriting recognition
– Face detection
– Speech recognition
– Object recognition

Page 12

Learning from examples

Page 13

Uses different biases in predicting Russell's waiting habits

[Figure: two copies of a "Russell waits" model over the attributes Wait time?, Patrons?, Friday?, with conditional probability tables; details lost in extraction]

– Naïve Bayes (Bayes net learning): examples are used to learn the topology and learn the CPTs
– Neural Nets: examples are used to learn the topology and learn the edge weights
– Decision Trees: examples are used to learn the topology and the order of questions
  Example rules: If patrons=full and day=Friday then wait (0.3/0.7); If wait>60 and Reservation=no then wait (0.4/0.9)
– Association rules: examples are used to learn the support and confidence of association rules
– SVMs
– K-nearest neighbors

Page 14

Page 15

Decision Tree Classification Task

[Diagram: Training Set → Tree Induction algorithm → Learn Model (Induction) → Decision Tree → Apply Model (Deduction) → Test Set; the training and test tables are the same as on Page 10]

Page 16

Apply Model to Test Data

Start from the root of the tree:

Refund?
– Yes → NO
– No → MarSt?
  – Married → NO
  – Single, Divorced → TaxInc?
    – < 80K → NO
    – > 80K → YES

Test record: Refund = No, Marital Status = Married, Taxable Income = 80K, Cheat = ?

Page 17 - Page 20

Apply Model to Test Data (continued)

(The same tree is shown on each of these slides while the test record Refund = No, Married, 80K is traced one step further down: root → Refund = No → MarSt = Married.)

Page 21

Apply Model to Test Data

(Same tree as before; the test record reaches the MarSt = Married leaf.)

Assign Cheat to “No”
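The traversal on the preceding slides can be written out directly. A small sketch (the function name and the handling of income exactly at the 80K boundary are assumptions; the slide shows only < 80K and > 80K branches, and the boundary case does not matter for this record, which exits at the Married leaf):

```python
# Walking the Refund -> MarSt -> TaxInc tree for one test record.
def classify(refund, marital_status, taxable_income_k):
    if refund == "Yes":
        return "No"                      # Refund = Yes leaf: Cheat = No
    if marital_status == "Married":
        return "No"                      # Married leaf: Cheat = No
    # Single or Divorced: decide on taxable income (boundary handling assumed)
    return "Yes" if taxable_income_k > 80 else "No"

print(classify("No", "Married", 80))     # -> No, matching the slide
```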

Page 22

Decision Tree Classification Task

(Same diagram and tables as on Page 15.)

Page 23

Decision Tree Induction

Many algorithms:
– Hunt's Algorithm (one of the earliest)
– CART
– ID3, C4.5: http://www2.cs.uregina.ca/~dbd/cs831/notes/ml/dtrees/c4.5/tutorial.html
– SLIQ, SPRINT

Weka toolkit: http://www.cs.waikato.ac.nz/ml/weka/
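The slides name these algorithms but give no code. Below is a minimal ID3-style sketch (greedy information-gain splitting on categorical attributes, without C4.5's gain ratio or pruning; all function names are illustrative), run on the four-example Masochistic/Anxious/Nerdy table that appears on a later slide.

```python
from math import log2
from collections import Counter

def entropy(labels):
    # Entropy of a class-label multiset.
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Entropy(parent) minus the weighted entropy of the children.
    groups = {}
    for r, y in zip(rows, labels):
        groups.setdefault(r[attr], []).append(y)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def id3(rows, labels, attrs):
    if len(set(labels)) == 1:
        return labels[0]                                  # pure leaf
    if not attrs:
        return Counter(labels).most_common(1)[0][0]       # majority leaf
    best = max(attrs, key=lambda a: info_gain(rows, labels, a))
    rest = [a for a in attrs if a != best]
    branches = {}
    for v in {r[best] for r in rows}:
        sub = [(r, y) for r, y in zip(rows, labels) if r[best] == v]
        branches[v] = id3([r for r, _ in sub], [y for _, y in sub], rest)
    return (best, branches)

rows = [{"Masochistic": "F", "Anxious": "T", "Nerdy": "F"},
        {"Masochistic": "F", "Anxious": "F", "Nerdy": "T"},
        {"Masochistic": "T", "Anxious": "F", "Nerdy": "F"},
        {"Masochistic": "T", "Anxious": "T", "Nerdy": "T"}]
labels = ["Y", "N", "N", "Y"]

tree = id3(rows, labels, ["Masochistic", "Anxious", "Nerdy"])
print(tree)  # root splits on Anxious; both branches are pure leaves
```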

Page 24

Tree Induction

Greedy strategy:
– Split the records based on an attribute test that optimizes a certain criterion.

Issues:
– Determine how to split the records
  – How to specify the attribute test condition?
  – How to determine the best split?
– Determine when to stop splitting

Page 25

Tree Induction (same outline as Page 24)

Page 26

How to Specify Test Condition?

Depends on attribute types:
– Nominal
– Ordinal
– Continuous

Depends on number of ways to split:
– 2-way split
– Multi-way split

Page 27

Splitting Based on Nominal Attributes

– Multi-way split: use as many partitions as distinct values, e.g. CarType → {Family}, {Sports}, {Luxury}.
– Binary split: divides values into two subsets; need to find the optimal partitioning, e.g. CarType → {Family, Luxury} vs {Sports}, OR {Sports, Luxury} vs {Family}.
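For a nominal attribute with k values, a binary split must choose among 2^(k-1) − 1 distinct two-subset partitions. A short sketch enumerating them for the CarType values above:

```python
from itertools import combinations

values = ["Family", "Sports", "Luxury"]

# Enumerate distinct binary partitions (each unordered pair counted once).
partitions = []
for r in range(1, len(values)):
    for left in combinations(values, r):
        right = tuple(v for v in values if v not in left)
        if (right, left) not in partitions:   # skip mirror duplicates
            partitions.append((left, right))

for left, right in partitions:
    print(set(left), "vs", set(right))
# 2^(3-1) - 1 = 3 candidate partitions for the three CarType values
```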

Page 28

Splitting Based on Continuous Attributes

Different ways of handling:
– Discretization to form an ordinal categorical attribute
  – Static: discretize once at the beginning
  – Dynamic: ranges can be found by equal-interval bucketing, equal-frequency bucketing (percentiles), or clustering
– Binary decision: (A < v) or (A ≥ v)
  – Consider all possible splits and find the best cut
  – Can be more compute-intensive
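The two static bucketing schemes can be sketched on the income column of the earlier training table (the choice of k = 2 buckets is an illustration):

```python
# Static discretization with k = 2 buckets on the income column (in K).
incomes = sorted([125, 100, 70, 120, 95, 60, 220, 85, 75, 90])

k = 2
# Equal-interval bucketing: cut points evenly spaced over the value range.
width = (max(incomes) - min(incomes)) / k
equal_width_cut = min(incomes) + width            # 60 + 80 = 140.0
# Equal-frequency bucketing (percentiles): cut at the median.
equal_freq_cut = incomes[len(incomes) // 2]       # 95

print(equal_width_cut, equal_freq_cut)
```

Note how the single 220K value drags the equal-interval cut up to 140, while the equal-frequency cut stays near the bulk of the data.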

Page 29

Tree Induction (same outline as Page 24)

Page 30

How to determine the Best Split

Greedy approach:
– Nodes with homogeneous class distribution are preferred
– Need a measure of node impurity:

  C0: 5, C1: 5 — non-homogeneous, high degree of impurity
  C0: 9, C1: 1 — homogeneous, low degree of impurity
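Any impurity measure should rank these two nodes the same way. A small sketch comparing entropy with the Gini index (Gini is named here as a standard alternative; this slide only calls for *some* impurity measure) on the slide's two nodes:

```python
from math import log2

def entropy(counts):
    # counts: class counts at a node, e.g. [C0, C1]; 0 * log 0 taken as 0.
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

def gini(counts):
    n = sum(counts)
    return 1 - sum((c / n) ** 2 for c in counts)

# Non-homogeneous node vs nearly homogeneous node, as on the slide.
for counts in ([5, 5], [9, 1]):
    print(counts, round(entropy(counts), 3), round(gini(counts), 3))
# [5, 5] is maximally impure; [9, 1] is much purer under both measures.
```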

Page 31

Entropy

The entropy of a random variable V with values v_k, each with probability P(v_k), is:

  H(V) = − Σ_k P(v_k) log₂ P(v_k)

Page 32

Splitting Criteria based on INFO

Entropy at a given node t:

  Entropy(t) = − Σ_j p(j|t) log₂ p(j|t)

(NOTE: p(j|t) is the relative frequency of class j at node t.)

– Measures homogeneity of a node.
– Maximum (log n_c) when records are equally distributed among all classes, implying least information.
– Minimum (0.0) when all records belong to one class, implying most information.

Page 33

How to Find the Best Split

Before splitting, the parent node has class counts (C0: N00, C1: N01) and impurity M0.

– Splitting on A? (Yes/No) produces Node N1 (C0: N10, C1: N11) and Node N2 (C0: N20, C1: N21), with impurities M1 and M2 combining into a weighted impurity M12.
– Splitting on B? (Yes/No) produces Node N3 (C0: N30, C1: N31) and Node N4 (C0: N40, C1: N41), with impurities M3 and M4 combining into M34.

Gain = M0 – M12 vs M0 – M34: choose the attribute whose split gives the larger gain.

Page 34

Examples for computing Entropy

  Entropy(t) = − Σ_j p(j|t) log₂ p(j|t)

C1: 0, C2: 6 → P(C1) = 0/6 = 0, P(C2) = 6/6 = 1
  Entropy = − 0 log 0 − 1 log 1 = − 0 − 0 = 0

C1: 1, C2: 5 → P(C1) = 1/6, P(C2) = 5/6
  Entropy = − (1/6) log₂ (1/6) − (5/6) log₂ (5/6) = 0.65

C1: 2, C2: 4 → P(C1) = 2/6, P(C2) = 4/6
  Entropy = − (2/6) log₂ (2/6) − (4/6) log₂ (4/6) = 0.92
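These three computations can be checked in a few lines (a sketch; `node_entropy` is a hypothetical helper name):

```python
from math import log2

def node_entropy(counts):
    # counts: class counts at a node, e.g. [C1, C2]; 0 * log 0 taken as 0.
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

print(round(node_entropy([0, 6]), 2))  # 0.0
print(round(node_entropy([1, 5]), 2))  # 0.65
print(round(node_entropy([2, 4]), 2))  # 0.92
```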

Page 35

Splitting Based on INFO...

Information Gain:

  GAIN_split = Entropy(p) − Σ_{i=1..k} (n_i / n) Entropy(i)

Parent node p is split into k partitions; n_i is the number of records in partition i.

– Measures the reduction in entropy achieved because of the split. Choose the split that achieves the most reduction (maximizes GAIN).
– Used in ID3 and C4.5.
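A direct transcription of the GAIN formula (the parent/children counts in the example are hypothetical):

```python
from math import log2

def entropy(counts):
    # Entropy of a node from its class counts; 0 * log 0 taken as 0.
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

def info_gain(parent, children):
    # GAIN_split = Entropy(parent) - sum_i (n_i / n) * Entropy(child_i)
    n = sum(parent)
    return entropy(parent) - sum(sum(c) / n * entropy(c) for c in children)

# Hypothetical split: 8 records (4 per class) into two partitions of 4.
print(round(info_gain([4, 4], [[3, 1], [1, 3]]), 2))  # 0.19
```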

Page 36

The Information Gain Computation

Given k mutually exclusive and exhaustive events E1…Ek whose probabilities are p1…pk, the "information" content (entropy) is defined as Σ_i −p_i log₂ p_i.

Splitting on feature f sends the N+ positive and N− negative examples into k branches with counts N1+/N1−, …, Nk+/Nk−.

  P+ = N+ / (N+ + N−),  P− = N− / (N+ + N−)
  I(P+, P−) = − P+ log(P+) − P− log(P−)

Residual information after the split:

  Σ_{i=1..k} [ (Ni+ + Ni−) / (N+ + N−) ] · I(Pi+, Pi−)

The difference between I(P+, P−) and the residual information is the information gain. So, pick the feature with the largest information gain, i.e. the smallest residual information. A split is good if it reduces the entropy.

Page 37

A simple example

Ex  Masochistic  Anxious  Nerdy  HATES EXAM
1   F            T        F      Y
2   F            F        T      N
3   T            F        F      N
4   T            T        T      Y

V(M) = 2/4 · I(1/2, 1/2) + 2/4 · I(1/2, 1/2) = 1
V(A) = 2/4 · I(1, 0) + 2/4 · I(0, 1) = 0
V(N) = 2/4 · I(1/2, 1/2) + 2/4 · I(1/2, 1/2) = 1

where
  I(1/2, 1/2) = − 1/2 · log₂ 1/2 − 1/2 · log₂ 1/2 = 1/2 + 1/2 = 1
  I(1, 0) = − 1 · log₂ 1 − 0 · log₂ 0 = 0   (taking 0 log 0 = 0)

So Anxious is the best attribute to split on. Once you split on Anxious, the problem is solved.
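The three V(·) values can be reproduced mechanically (function names are assumptions; V(f) here is the residual information after splitting on f, so the best attribute is the one with the smallest value):

```python
from math import log2

def I(ps):
    # Entropy of a branch's class distribution; 0 * log 0 taken as 0.
    return -sum(p * log2(p) for p in ps if p > 0)

# Four examples: (Masochistic, Anxious, Nerdy) -> HATES EXAM
data = [("F", "T", "F", "Y"), ("F", "F", "T", "N"),
        ("T", "F", "F", "N"), ("T", "T", "T", "Y")]

def residual(attr_index):
    # Weighted entropy of the two branches after splitting on one attribute.
    total = 0.0
    for v in ("T", "F"):
        branch = [y for *xs, y in data if xs[attr_index] == v]
        p_yes = branch.count("Y") / len(branch)
        total += len(branch) / len(data) * I([p_yes, 1 - p_yes])
    return total

print(residual(0), residual(1), residual(2))  # 1.0 0.0 1.0 -> split on Anxious
```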

Page 38

Decision Tree Based Classification

Advantages:
– Inexpensive to construct
– Extremely fast at classifying unknown records
– Easy to interpret for small-sized trees
– Accuracy is comparable to other classification techniques for many simple data sets

Page 39

Tree Induction (same outline as Page 24)

Page 40

Underfitting and Overfitting

[Figure: overfitting illustrated by training vs. test error]

Underfitting: when the model is too simple, both training and test errors are large

Page 41

Overfitting due to Noise

Decision boundary is distorted by noise point

Page 42

Notes on Overfitting

Overfitting results in decision trees that are more complex than necessary

Training error no longer provides a good estimate of how well the tree will perform on previously unseen records

Need new ways for estimating errors

Page 43

How to Address Overfitting

Pre-Pruning (Early Stopping Rule)– Stop the algorithm before it becomes a fully-grown tree– Typical stopping conditions for a node: Stop if all instances belong to the same class Stop if all the attribute values are the same

– More restrictive conditions: Stop if number of instances is less than some user-specified

threshold Stop if class distribution of instances are independent of the

available features (e.g., using 2 test) Stop if expanding the current node does not improve impurity

measures (e.g., Gini or information gain).

Page 44

How to Address Overfitting…

Post-pruning– Grow decision tree to its entirety– Trim the nodes of the decision tree in a

bottom-up fashion– If generalization error improves after trimming,

replace sub-tree by a leaf node.– Class label of leaf node is determined from

majority class of instances in the sub-tree
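The trimming test can be sketched as follows. Everything here is a hypothetical stand-in (a one-test "subtree" and a tiny validation set), but it shows the comparison the slide describes: replace the subtree with a majority-class leaf when the error on held-out data does not get worse (a common variant of "generalization error improves").

```python
from collections import Counter

def subtree_predict(x):
    # The subtree being considered for pruning (hypothetical one-test stub).
    return "Yes" if x > 80 else "No"

# Training instances that reach this node, and a held-out validation set.
train_labels_in_subtree = ["No", "No", "Yes"]
validation = [(85, "No"), (60, "No"), (90, "No")]   # (input, true label)

# Candidate leaf label: majority class of the instances in the sub-tree.
leaf_label = Counter(train_labels_in_subtree).most_common(1)[0][0]  # "No"

err_subtree = sum(subtree_predict(x) != y for x, y in validation)
err_leaf = sum(leaf_label != y for x, y in validation)

if err_leaf <= err_subtree:
    print("prune: replace subtree with leaf", leaf_label)
```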

Page 45

Decision Boundary

x < 0.43?
– Yes → y < 0.47?
  – Yes → (4 : 0)
  – No → (0 : 4)
– No → y < 0.33?
  – Yes → (0 : 3)
  – No → (4 : 0)

[Scatter plot over the unit square (x, y in [0, 1]), partitioned by these axis-parallel tests; the class icons were lost in extraction]

• Border line between two neighboring regions of different classes is known as decision boundary

• Decision boundary is parallel to axes because the test condition involves a single attribute at a time
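The slide's axis-parallel tree can be reconstructed as a function. Which class symbol goes with which leaf was lost in extraction, so the "+"/"−" labels below are assumptions; the structure of the tests matches the slide:

```python
# Axis-parallel decision tree over the unit square.
def region_class(x, y):
    if x < 0.43:
        return "+" if y < 0.47 else "-"   # leaves with counts 4:0 and 0:4
    return "-" if y < 0.33 else "+"       # leaves with counts 0:3 and 4:0

# Every decision boundary is a vertical or horizontal line:
# x = 0.43, y = 0.47 (left half), y = 0.33 (right half).
print(region_class(0.2, 0.3), region_class(0.2, 0.9))  # two different regions
```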

Page 46

Oblique Decision Trees

x + y < 1 → Class = +, otherwise Class = −

• Test condition may involve multiple attributes
• More expressive representation
• Finding the optimal test condition is computationally expensive
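An oblique test uses a linear combination of attributes in a single condition, in contrast to the axis-parallel tests above. A one-line sketch (the "−" label is an assumption; the minus sign was lost in extraction):

```python
# Oblique split: one linear test on two attributes at once.
def oblique_class(x, y):
    return "+" if x + y < 1 else "-"

print(oblique_class(0.2, 0.3), oblique_class(0.9, 0.8))  # + -
```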