
Page 1:

CSE 555: Srihari

Discriminant Analysis

1. Fisher Linear Discriminant
2. Multiple Discriminant Analysis

Page 2:

Motivation

Projection that best separates the data in a least-squares sense
– PCA finds components that are useful for representing data
– However, there is no reason to assume that those components are useful for discriminating between data in different classes
– Pooling the data may discard directions that are essential for discrimination

• OCR example: discriminating between O and Q may ignore the tail


Page 3:

Fisher Linear Discriminant

Projecting data from d dimensions onto a line

A set of $n$ $d$-dimensional samples $x_1, \ldots, x_n$: $n_1$ samples in the subset $D_1$ labelled $\omega_1$ and $n_2$ samples in the subset $D_2$ labelled $\omega_2$.

We wish to form a linear combination of the components of $x$,

$$y = w^t x$$

and a corresponding set of $n$ samples $y_1, \ldots, y_n$ divided into the subsets $Y_1$ and $Y_2$.
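The projection above can be sketched in NumPy; the two classes and the direction `w` below are made-up illustration data, not values from the slides:

```python
import numpy as np

# Two made-up 2-D classes, n1 = n2 = 50 samples each (rows are samples)
rng = np.random.default_rng(0)
D1 = rng.normal([0.0, 0.0], 0.5, size=(50, 2))
D2 = rng.normal([2.0, 1.0], 0.5, size=(50, 2))

# Any nonzero direction w works for the projection itself;
# Fisher's criterion (later slides) picks the best one
w = np.array([1.0, 0.5])

# y = w^t x maps each d-dimensional sample to a scalar
Y1 = D1 @ w
Y2 = D2 @ w
print(Y1.shape, Y2.shape)  # each subset D_i becomes a subset Y_i of scalars
```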

Page 4:

Fisher Linear Discriminant: Two-Dimensional Example

Projection of the same set of two-class samples onto two different lines in the direction marked $w$. Left: classes mixed; right: better separation.

Page 5:

Finding best direction w

Sample mean in $d$-dimensional space: $m_i = \frac{1}{n_i} \sum_{x \in D_i} x$

Sample mean of the projected points: $\tilde{m}_i = \frac{1}{n_i} \sum_{y \in Y_i} y = \frac{1}{n_i} \sum_{x \in D_i} w^t x = w^t m_i$

Distance between the projected means: $|\tilde{m}_1 - \tilde{m}_2| = |w^t (m_1 - m_2)|$

Page 6:

Criterion for Fisher Linear Discriminant

Rather than forming sample variances, define the scatter for the projected samples:

$$\tilde{s}_i^2 = \sum_{y \in Y_i} (y - \tilde{m}_i)^2$$

Thus $\frac{1}{n}(\tilde{s}_1^2 + \tilde{s}_2^2)$ is an estimate of the variance of the pooled data, and $\tilde{s}_1^2 + \tilde{s}_2^2$ is the total within-class scatter of the projected samples.

Find the linear function $w^t x$ for which the criterion

$$J(w) = \frac{|\tilde{m}_1 - \tilde{m}_2|^2}{\tilde{s}_1^2 + \tilde{s}_2^2}$$

is maximum (and independent of $\|w\|$). While maximizing $J(\cdot)$ leads to the best separation between the two projected sets, a threshold criterion is still needed to obtain a true classifier.

Page 7:

Scatter Matrices

To obtain $J(\cdot)$ as an explicit function of $w$, we define the scatter matrices $S_i$ and $S_W$:

• Within-class scatter matrix
• Between-class scatter matrix

Page 8:

Scatter Matrices

To obtain $J(\cdot)$ as an explicit function of $w$, we define the scatter matrices $S_i$ and $S_W$.

Within-class scatter matrix:

$$S_i = \sum_{x \in D_i} (x - m_i)(x - m_i)^t, \qquad S_W = S_1 + S_2$$

Between-class scatter matrix:

$$S_B = (m_1 - m_2)(m_1 - m_2)^t$$
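A minimal NumPy sketch of these definitions (the two sample sets below are made up for illustration):

```python
import numpy as np

def scatter_matrices(D1, D2):
    """Within-class scatter S_W = S_1 + S_2 and between-class
    scatter S_B = (m1 - m2)(m1 - m2)^t for two labelled sample sets."""
    m1, m2 = D1.mean(axis=0), D2.mean(axis=0)
    S1 = (D1 - m1).T @ (D1 - m1)  # S_i = sum over D_i of (x - m_i)(x - m_i)^t
    S2 = (D2 - m2).T @ (D2 - m2)
    SW = S1 + S2
    SB = np.outer(m1 - m2, m1 - m2)
    return SW, SB

rng = np.random.default_rng(1)
D1 = rng.normal(0.0, 1.0, size=(30, 2))
D2 = rng.normal(2.0, 1.0, size=(30, 2))
SW, SB = scatter_matrices(D1, D2)
print(SW.shape, SB.shape)  # both are d x d; S_B has rank one
```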

Page 9:

Criterion Function in terms of Scatter Matrices
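The body of this slide did not survive extraction; a reconstruction of the standard form, consistent with the scatter-matrix definitions on the previous slide, is:

```latex
% Projected scatters expressed with the scatter matrices:
\tilde{s}_i^2 = w^t S_i w, \qquad
\tilde{s}_1^2 + \tilde{s}_2^2 = w^t S_W w, \qquad
(\tilde{m}_1 - \tilde{m}_2)^2 = w^t S_B w
% Hence the criterion function becomes a generalized Rayleigh quotient:
J(w) = \frac{w^t S_B w}{w^t S_W w}
```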

Page 10:

Final form of Fisher Discriminant
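This slide's equations were also lost in extraction; the standard result, consistent with the signature example later in the deck (where $w = S_W^{-1}(m_1 - m_2)$ is applied), is:

```latex
% Maximizing J(w) leads to the generalized eigenproblem
S_B w = \lambda S_W w
% Since S_B w always lies in the direction of (m_1 - m_2),
% the solution (the final form of the Fisher linear discriminant) is
w = S_W^{-1}(m_1 - m_2)
```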

Page 11:

Case of Multivariate Normal pdfs

The Bayes decision rule is to compute the Fisher linear discriminant and decide ω1 if it exceeds a threshold, and ω2 otherwise.

Page 12:

Fisher’s Linear Discriminant: Example

Discriminating between machine-print and handwriting

Page 13:

Cropped signature image

Page 14:

Connected components and their features. For a connected component with height $h_1$ and width $w_1$, and reference dimensions $h_m$ and $w_m$ (as labelled in the figure), the two features are:

$$x_1 = (h_1 + w_1)\,/\,(h_m + w_m) = 0.4034 \qquad x_2 = h_1 / w_1 = 0.9355$$

Page 15:

Feature Values of Labelled Components (from 20 images)

Two tables of values: Print components (Component, $x_1$, $x_2$) and Handwriting components (Component, $x_1$, $x_2$).

Page 16:

Feature distribution in 2-dimensional space

$$m_1 = (0.0843,\ 1.0782)^t \qquad m_2 = (0.2725,\ 1.2652)^t$$

$$S_W = \begin{bmatrix} 5.9554 & -3.2349 \\ -3.2349 & 73.3043 \end{bmatrix} \qquad S_W^{-1} = \begin{bmatrix} 0.1720 & 0.0076 \\ 0.0076 & 0.0140 \end{bmatrix}$$

$$w = S_W^{-1}(m_1 - m_2) = (-0.0338,\ -0.0040)^t$$

After projection the samples are 1-D; a Bayes classifier is designed for the projected samples.
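The final step can be checked numerically; the sketch below uses the slide's values for the means and the within-class scatter matrix:

```python
import numpy as np

# Class means and within-class scatter matrix from the slide
m1 = np.array([0.0843, 1.0782])
m2 = np.array([0.2725, 1.2652])
SW = np.array([[ 5.9554, -3.2349],
               [-3.2349, 73.3043]])

# w = S_W^{-1}(m1 - m2), the final form of the Fisher discriminant
w = np.linalg.solve(SW, m1 - m2)
print(np.round(w, 4))  # approximately (-0.0338, -0.0040), as on the slide
```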

Page 17:

Discrete component analysis, using the same bounding-box features as before:

$$x_1 = (h_1 + w_1)\,/\,(h_m + w_m) = 0.4034 \qquad x_2 = h_1 / w_1 = 0.9355$$

Page 18:

The printed-text component candidates obtained

Page 19:

Framework of printed text filtering

Step 1: Extract shape features of the bounding box for each discrete component. Detect candidates using Fisher’s linear discriminant and Bayesian classification.

Step 2: Remove candidates that satisfy the spatial relation defined for printed-text components.

Step 3: For candidates surviving Step 2, remove isolated and small pieces.
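Step 1 might look like the sketch below. The feature pairs, the direction `w`, and the `threshold` are hypothetical stand-ins for quantities the slides obtain from labelled training data:

```python
import numpy as np

def detect_printed_candidates(components, w, threshold):
    """Sketch of Step 1: project each component's (x1, x2) bounding-box
    features onto Fisher's discriminant direction and keep those whose
    1-D projection falls on the printed-text side of the threshold.
    (Which side is "printed" depends on the trained classifier; here we
    arbitrarily take y > threshold.)"""
    kept = []
    for feats in components:
        y = float(np.dot(w, feats))  # 1-D projection y = w^t x
        if y > threshold:
            kept.append(feats)
    return kept

# Hypothetical usage with made-up feature values
components = [(0.40, 0.94), (0.10, 2.50)]
w = np.array([1.0, 0.5])
print(detect_printed_candidates(components, w, threshold=1.0))  # → [(0.1, 2.5)]
```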

Page 20:

Processed image after (a) Step 2 and (b) Step 3 (final).

Page 21:

Sample images before and after enhancement

Page 22:

Multiple Discriminant Analysis

• c-class problem
• Natural generalization of Fisher’s linear discriminant; involves c − 1 discriminant functions
• Projection is from a d-dimensional space to a (c − 1)-dimensional space
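The generalization above can be sketched in NumPy (made-up data; solves the standard generalized eigenproblem $S_B w = \lambda S_W w$ and keeps the c − 1 leading eigenvectors):

```python
import numpy as np

def mda_directions(X, labels, c):
    """Multiple discriminant analysis sketch: find the c-1 directions
    maximizing between-class relative to within-class scatter by solving
    the generalized eigenproblem S_B w = lambda S_W w."""
    d = X.shape[1]
    m = X.mean(axis=0)                     # total mean
    SW = np.zeros((d, d))
    SB = np.zeros((d, d))
    for i in range(c):
        Xi = X[labels == i]
        mi = Xi.mean(axis=0)
        SW += (Xi - mi).T @ (Xi - mi)      # within-class scatter
        SB += len(Xi) * np.outer(mi - m, mi - m)  # between-class scatter
    # Solve S_W^{-1} S_B w = lambda w; keep the c-1 leading eigenvectors
    evals, evecs = np.linalg.eig(np.linalg.solve(SW, SB))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[: c - 1]].real   # d x (c-1) projection matrix

# Three made-up 3-D classes, 20 samples each
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(mu, 0.3, size=(20, 3))
               for mu in ([0, 0, 0], [2, 0, 0], [0, 2, 0])])
labels = np.repeat(np.arange(3), 20)
W = mda_directions(X, labels, 3)
print(W.shape)  # (3, 2): projection from d = 3 down to c - 1 = 2 dimensions
```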

Page 23:

Mapping from a d-dimensional space to a (c − 1)-dimensional space (d = 3, c = 3)