Pattern Recognition Concepts


TRANSCRIPT

Page 1: Pattern Recognition Concepts

Stockman, CSE803, Fall 2009

Pattern Recognition Concepts
Chapter 4: Shapiro and Stockman
How should objects be represented?
Algorithms for recognition/matching:
 * nearest neighbors
 * decision trees
 * decision functions
 * artificial neural networks
How should learning/training be done?

Page 2: Pattern Recognition Concepts


Feature Vector Representation

X = [x1, x2, …, xn], where each xj is a real number

xj may be an object measurement

xj may be a count of object parts

Example: object representation [#holes, area, moments, …]
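As a concrete illustration (not from the slides; the feature choices here are hypothetical), a minimal sketch of building such a vector from a binary object image, using area, centroid, and second-order central moments:

```python
import numpy as np

def feature_vector(binary_image: np.ndarray) -> np.ndarray:
    """Build X = [area, centroid row, centroid col, mu_rr, mu_cc, mu_rc] for a
    binary image (nonzero = object pixels). Hole counting is omitted because it
    needs connected-component analysis; this is only an illustrative sketch."""
    rows, cols = np.nonzero(binary_image)
    area = float(rows.size)                       # count of object pixels
    r_bar, c_bar = rows.mean(), cols.mean()       # centroid
    mu_rr = ((rows - r_bar) ** 2).mean()          # second-order central moments
    mu_cc = ((cols - c_bar) ** 2).mean()
    mu_rc = ((rows - r_bar) * (cols - c_bar)).mean()
    return np.array([area, r_bar, c_bar, mu_rr, mu_cc, mu_rc])
```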

Page 3: Pattern Recognition Concepts


Possible features for character recognition

Page 4: Pattern Recognition Concepts


Some Terminology

Classes: a set of m known classes of objects
 (a) might have a known description for each
 (b) might have a set of samples for each
Reject class: a generic class for objects not in any of the designated known classes
Classifier: assigns an object to a class based on its features

Page 5: Pattern Recognition Concepts


Classification paradigms

Page 6: Pattern Recognition Concepts


Discriminant functions

Functions f(x, K) perform some computation on feature vector x

Knowledge K from training or programming is used

Final stage determines class
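As a minimal sketch of this idea (not the slide's own example): linear discriminant functions g_i(x) = w_i . x + b_i, one per class, with the trained weights standing in for the knowledge K and the final stage picking the class with the largest value.

```python
import numpy as np

def classify(x: np.ndarray, weights: np.ndarray, biases: np.ndarray) -> int:
    """Linear discriminant classifier: g_i(x) = w_i . x + b_i for class i.
    weights is an (m, n) array of trained weight vectors (the knowledge K),
    biases has length m; the final stage returns the argmax class index."""
    scores = weights @ x + biases
    return int(np.argmax(scores))
```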

Page 7: Pattern Recognition Concepts


Decision-Tree Classifier

Uses subsets of features in sequence

Feature extraction may be interleaved with classification decisions

Can be easy to design and efficient in execution
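A minimal sketch of such a tree (the features, tests, and characters below are illustrative assumptions, not the book's exact tree); passing the features as zero-argument callables lets each one be extracted only when the branch taken actually needs it:

```python
from typing import Callable

def classify_character(count_holes: Callable[[], int],
                       count_strokes: Callable[[], int]) -> str:
    """Hand-built decision tree over hypothetical character features.
    Each feature is computed lazily, interleaving extraction with decisions."""
    holes = count_holes()                    # root test
    if holes == 0:
        return '1' if count_strokes() == 1 else 'x'
    elif holes == 1:
        return 'A' if count_strokes() == 3 else '0'
    else:                                    # two or more holes
        return 'B' if count_strokes() == 1 else '8'

# Hypothetical usage: an object with one hole and three strokes
print(classify_character(lambda: 1, lambda: 3))   # -> 'A'
```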

Page 8: Pattern Recognition Concepts


Decision Trees

[Figure: decision tree for character recognition. The root tests #holes (branches 0, 1, 2); deeper nodes test moment of inertia, #strokes, and best axis direction; the leaves are the characters - / 1 x w 0 A 8 B.]

Page 9: Pattern Recognition Concepts


Classification using nearest class mean

Compute the Euclidean distance between feature vector X and the mean of each class.

Choose closest class, if close enough (reject otherwise)

Low error rate in the example shown at left (slide figure)
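A minimal sketch of the rule with a reject option (the means, the query points, and the distance cutoff below are made up for illustration):

```python
import numpy as np

def nearest_mean_classify(x, class_means, reject_distance=None):
    """Assign x to the class whose mean vector is closest in Euclidean
    distance; return -1 (reject class) if even the closest mean is farther
    than reject_distance. class_means is an (m, n) array of class means."""
    dists = np.linalg.norm(class_means - x, axis=1)
    best = int(np.argmin(dists))
    if reject_distance is not None and dists[best] > reject_distance:
        return -1
    return best

# Hypothetical usage: two 2-D classes with means (0, 0) and (5, 5)
means = np.array([[0.0, 0.0], [5.0, 5.0]])
print(nearest_mean_classify(np.array([0.5, 1.0]), means))           # -> 0
print(nearest_mean_classify(np.array([20.0, 20.0]), means, 3.0))    # -> -1 (reject)
```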

Page 10: Pattern Recognition Concepts


Nearest mean might yield poor results with complex structure

Class 2 has two modes

If modes are detected, two subclass mean vectors can be used

Page 11: Pattern Recognition Concepts


Scaling coordinates by standard deviation
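The slide's formula is not reproduced here; as a sketch of the idea, each feature can be standardized by subtracting its training-sample mean and dividing by its standard deviation, so that no single coordinate dominates the distance:

```python
import numpy as np

def standardizer(training_samples: np.ndarray):
    """Estimate per-feature mean and standard deviation from an (N, n) array of
    training samples and return a function that rescales any feature vector."""
    mu = training_samples.mean(axis=0)
    sigma = training_samples.std(axis=0)
    sigma[sigma == 0] = 1.0              # guard against constant features
    return lambda x: (x - mu) / sigma
```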

Page 12: Pattern Recognition Concepts


Another problem for nearest-mean classification

If unscaled, object X is equidistant from each class mean

With scaling, X is closer to the left distribution

The coordinate axes are not natural for this data

1-D discrimination is possible with PCA (principal component analysis)
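A minimal sketch of that last point: the leading eigenvector of the sample covariance gives the direction of greatest variance, and projecting onto it yields a 1-D coordinate for discrimination (a textbook PCA recipe, not code from the slides):

```python
import numpy as np

def first_principal_component(samples: np.ndarray) -> np.ndarray:
    """Unit vector along which the centered samples vary most: the eigenvector
    of the sample covariance matrix with the largest eigenvalue."""
    centered = samples - samples.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    return eigvecs[:, np.argmax(eigvals)]

def project_1d(samples: np.ndarray) -> np.ndarray:
    """Project each (centered) sample onto the first principal component."""
    direction = first_principal_component(samples)
    return (samples - samples.mean(axis=0)) @ direction
```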

Page 13: Pattern Recognition Concepts


Receiver Operating Characteristic (ROC) curve

Plots correct detection rate versus false alarm rate

Generally, the false-alarm rate rises as the system is tuned to detect a higher percentage of the known objects
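As a sketch of how such a curve can be traced (the detector scores and ground-truth labels are assumed inputs, not anything from the slides): sweep a decision threshold from strict to lenient and record both rates at each setting.

```python
import numpy as np

def roc_points(scores: np.ndarray, labels: np.ndarray):
    """Return (false_alarm_rate, detection_rate) pairs as the threshold sweeps
    over the detector scores. labels: 1 = known object present, 0 = absent.
    Lowering the threshold detects more objects but admits more false alarms."""
    n_pos = max(int((labels == 1).sum()), 1)
    n_neg = max(int((labels == 0).sum()), 1)
    points = []
    for t in np.sort(np.unique(scores))[::-1]:
        detected = scores >= t
        detection_rate = int((detected & (labels == 1)).sum()) / n_pos
        false_alarm_rate = int((detected & (labels == 0)).sum()) / n_neg
        points.append((false_alarm_rate, detection_rate))
    return points
```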

Page 14: Pattern Recognition Concepts


Confusion matrix shows empirical performance: entry (i, j) counts test objects of true class i that the classifier assigned to class j
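A minimal sketch of how the matrix is tallied (class labels here are just integer indices; the example counts are invented):

```python
import numpy as np

def confusion_matrix(true_classes, predicted_classes, m):
    """Tally an m x m matrix whose entry (i, j) counts test objects of true
    class i that the classifier assigned to class j; the diagonal holds the
    correct decisions."""
    cm = np.zeros((m, m), dtype=int)
    for t, p in zip(true_classes, predicted_classes):
        cm[t, p] += 1
    return cm

# Hypothetical usage with 3 classes
print(confusion_matrix([0, 0, 1, 2, 2], [0, 1, 1, 2, 2], m=3))
```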

Page 15: Pattern Recognition Concepts


Bayesian decision-making
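The slide's own material is not reproduced here; as a sketch of the underlying rule, an observation x is assigned to the class maximizing prior times class-conditional likelihood, illustrated below with one-dimensional normal class models whose parameters are entirely made up:

```python
import math

def normal_pdf(x, mean, std):
    """Density of a normal distribution with the given mean and std."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def bayes_classify(x, priors, means, stds):
    """Choose the class i maximizing P(i) * p(x | i); ties go to the lower index."""
    scores = [p * normal_pdf(x, m, s) for p, m, s in zip(priors, means, stds)]
    return scores.index(max(scores))

# Hypothetical two-class example
print(bayes_classify(1.2, priors=[0.7, 0.3], means=[0.0, 3.0], stds=[1.0, 1.0]))
```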

Page 16: Pattern Recognition Concepts


Normal distribution with zero mean and unit standard deviation

The table enables us to fit histograms and represent them simply

A new observation of variable x can then be translated into a probability
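A small sketch of that last step, using the error function in place of the printed table of the standard normal distribution (the mean and standard deviation in the usage line are hypothetical fit values):

```python
import math

def standard_normal_cdf(z: float) -> float:
    """P(Z <= z) for a zero-mean, unit-std normal, computed from erf
    instead of looking the value up in a printed table."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def observation_probability(x: float, mean: float, std: float) -> float:
    """Translate an observation x of a normally modeled variable into
    P(X <= x) by standardizing to a z-score first."""
    z = (x - mean) / std
    return standard_normal_cdf(z)

# Hypothetical usage: variable fit with mean 10, std 2
print(observation_probability(13.0, 10.0, 2.0))   # about 0.933
```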

Page 17: Pattern Recognition Concepts


Parametric Models can be used

Page 18: Pattern Recognition Concepts


Cherry with bruise

Intensities at about 750 nanometers wavelength

Some overlap caused by the cherry surface turning away