A fast nearest neighbor classifier based on self-organizing incremental neural network (SOINN)


Intelligent Database Systems Lab

N.Y.U.S.T. I.M.

A fast nearest neighbor classifier based on

self-organizing incremental neural network (SOINN)

Neural Networks (NN, 2008)

Presenter: Lin, Shu-Han
Authors: Shen Furao, Osamu Hasegawa


Outline

Introduction
Motivation
Objective
Methodology
Experiments
Conclusion
Comments


Introduction - self-organizing incremental neural network (SOINN)

Distance: if an input is too far from its nearest node, it is inserted as a new node

Node = prototype
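The distance-based insertion rule on this slide can be illustrated as follows. This is a minimal sketch: a fixed `threshold` stands in for SOINN's adaptive per-node similarity threshold, which the paper computes from each node's neighborhood.

```python
import numpy as np

def insert_if_far(nodes, x, threshold):
    """Insert x as a new prototype node when its nearest node is too far away.
    `threshold` is a fixed stand-in for SOINN's adaptive similarity threshold."""
    x = np.asarray(x, dtype=float)
    if not nodes:
        return nodes + [x]                 # the first input always becomes a node
    dists = [np.linalg.norm(x - n) for n in nodes]
    if min(dists) > threshold:             # too far from every existing prototype
        return nodes + [x]                 # x becomes a new node (prototype)
    return nodes                           # otherwise x is absorbed by the winner
```

Feeding inputs one at a time grows the node set only where the data demands it, which is what makes the network incremental.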


Introduction - self-organizing incremental neural network (SOINN)

Link age: the two nearest nodes are connected by a link (edge), and each link carries an age


Introduction - self-organizing incremental neural network (SOINN)

Age: links that grow too old are deleted


Introduction - self-organizing incremental neural network (SOINN)

Runs in two passes

Insert a node where the accumulated error is large

Cancel the insertion if it is of no use
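The two periodic steps above can be sketched as follows. This is a minimal sketch: `errors` is an assumed per-node accumulated-error array, and "no use" is tested simply as "does not reduce total quantization error", standing in for the paper's error-ratio test.

```python
import numpy as np

def try_insert(nodes, errors, data):
    """Insert a node midway between the highest-error node and its nearest
    fellow node; cancel the insertion if it does not reduce total error."""
    nodes = [np.asarray(n, dtype=float) for n in nodes]
    data = [np.asarray(x, dtype=float) for x in data]
    q = int(np.argmax(errors))                        # node with the largest error
    others = [i for i in range(len(nodes)) if i != q]
    f = min(others, key=lambda i: np.linalg.norm(nodes[i] - nodes[q]))
    candidate = (nodes[q] + nodes[f]) / 2             # new node between q and f

    def total_error(protos):
        return sum(min(np.linalg.norm(x - p) for p in protos) for x in data)

    if total_error(nodes + [candidate]) < total_error(nodes):
        return nodes + [candidate]                    # insertion helps: keep it
    return nodes                                      # insertion is no use: cancel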


Introduction - self-organizing incremental neural network (SOINN)

Runs in two passes

Delete outliers: nodes without neighbors (low-density assumption)
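The outlier-deletion step can be sketched as follows, keeping the topology as a set of undirected edges (frozensets of node indices); the data layout is an assumption for the sketch.

```python
def delete_outliers(nodes, edges):
    """Delete nodes that have no neighbors (no incident edge): under the
    low-density assumption, isolated nodes are treated as noise."""
    keep = [i for i in range(len(nodes)) if any(i in e for e in edges)]
    remap = {old: new for new, old in enumerate(keep)}   # reindex survivors
    new_nodes = [nodes[i] for i in keep]
    new_edges = {frozenset(remap[v] for v in e) for e in edges}
    return new_nodes, new_edges
```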

Motivation

Weaknesses of the SOINN classifier (their first work, 2005):

Uses six user-determined parameters

Does not address noise

Produces too many prototypes

Unsupervised learning

Their second work (2007) addresses these weaknesses


Objectives

Propose an improved version of SOINN: ASC (Adjusted SOINN Classifier)

FASTER: fewer prototypes, in both the training phase and the classification phase

CLASSIFIER: the 1-NN (nearest-prototype) rule

INCREMENTAL LEARNING

ONE LAYER: easier to understand the settings, fewer parameters

MORE STABLE: with the help of k-means
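The 1-NN (prototype) rule listed in the objectives is plain nearest-prototype classification; a minimal sketch:

```python
import numpy as np

def classify(x, prototypes, labels):
    """Assign x the label of its nearest prototype: the 1-NN rule that ASC
    applies over the learned (much smaller) prototype set."""
    x = np.asarray(x, dtype=float)
    dists = [np.linalg.norm(x - np.asarray(p, dtype=float)) for p in prototypes]
    return labels[int(np.argmin(dists))]
```

Because ASC shrinks the prototype set, this single linear scan is what makes classification fast.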

Methodology – Adjusted SOINN

Distance: if an input is too far from its nearest node, it is inserted as a new node

A node is a cluster

Methodology – Adjusted SOINN

Link age: the winner and second winner are connected by a link, and each link carries an age

Methodology – Adjusted SOINN

Winner and neighbor: the winner and its topological neighbors are adapted toward the input

Methodology – Adjusted SOINN

Age: links whose age exceeds the threshold ad are too old and are deleted
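The edge-aging rule can be sketched with an age dictionary keyed by undirected edges; `ad` is the age threshold from the slide, and the winner/second-winner bookkeeping is an assumed simplification.

```python
def update_edges(ages, winner, second, ad):
    """Age every edge incident to the winner, reset (or create) the
    winner-second edge, then delete edges older than the threshold ad."""
    for e in list(ages):
        if winner in e:
            ages[e] += 1                     # age the winner's edges
    ages[frozenset((winner, second))] = 0    # (re)connect the two winners
    for e in [e for e, age in ages.items() if age > ad]:
        del ages[e]                          # too old: remove the link
    return ages
```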

Methodology – Adjusted SOINN

Delete outliers: nodes without neighbors (low-density assumption)

Methodology – Adjusted SOINN

Lambda (λ) = the number of iterations between the periodic insertion/deletion steps

Methodology – k-means

With the help of k-means clustering (k = number of neurons), adjust the resulting prototypes: assume each node lies near the centroid of its class region
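The adjustment step can be sketched as plain k-means seeded with the learned nodes (k = number of neurons), so each prototype drifts toward the centroid of the region it represents. A minimal NumPy sketch; the fixed iteration count is an assumption.

```python
import numpy as np

def kmeans_adjust(nodes, data, iters=10):
    """Run k-means with k = number of neurons, initialized at the SOINN
    nodes, moving each prototype toward the centroid of its samples."""
    centers = np.array(nodes, dtype=float)
    X = np.asarray(data, dtype=float)
    for _ in range(iters):
        # assign every sample to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # move each center to the mean of its assigned samples
        for j in range(len(centers)):
            if (assign == j).any():
                centers[j] = X[assign == j].mean(axis=0)
    return centers
```

Seeding with the SOINN nodes (rather than random centers) is what makes the result stable: k-means only fine-tunes positions already placed by the network.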

Methodology – noise-reduction

With the help of a k-Edit Neighbors Classifier (ENC), k = ?: delete nodes whose label differs from the majority vote of their k neighbors, assuming such nodes were generated by noise
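The editing step can be sketched as follows. This is a minimal sketch: the slide leaves k unspecified, so `k` is a free parameter here, and the vote is taken over fellow prototypes.

```python
import numpy as np

def edit_noise(protos, labels, k=3):
    """Delete prototypes whose label differs from the majority vote of
    their k nearest fellow prototypes (assumed to be generated by noise)."""
    P = np.asarray(protos, dtype=float)
    keep = []
    for i in range(len(P)):
        d = np.linalg.norm(P - P[i], axis=1)
        neighbors = [j for j in np.argsort(d) if j != i][:k]
        votes = [labels[j] for j in neighbors]
        majority = max(set(votes), key=votes.count)
        if labels[i] == majority:
            keep.append(i)                 # label agrees with its neighborhood
    return [protos[i] for i in keep], [labels[i] for i in keep]
```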

Methodology – center-cleaning

Delete a neuron if it has never been the nearest neuron to a sample of another class: such neurons are assumed to lie in the central part of a class, away from the decision boundary
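This cleaning rule can be sketched as follows. Note the assumption: the paper checks nearness against training samples of other classes, while this self-contained sketch lets the other-class prototypes stand in for those samples.

```python
import numpy as np

def clean_centers(protos, labels):
    """Keep only prototypes that are the nearest prototype to at least one
    point of another class; the rest lie in a class's interior, never affect
    the 1-NN decision boundary, and are deleted."""
    P = np.asarray(protos, dtype=float)
    useful = set()
    for i in range(len(P)):
        rivals = [j for j in range(len(P)) if labels[j] != labels[i]]
        if rivals:
            nearest = min(rivals, key=lambda j: np.linalg.norm(P[i] - P[j]))
            useful.add(nearest)   # nearest has been "nearest to another class"
    keep = sorted(useful)
    return [protos[i] for i in keep], [labels[i] for i in keep]
```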

Experiments: Artificial dataset

[Figure: dataset, Adjusted SOINN result, ASC result]

Error: same; speed: faster

Experiments: Artificial dataset

[Figure: dataset, Adjusted SOINN result, ASC result]

Error: same; speed: faster

Experiments: Artificial dataset

[Figure: dataset, Adjusted SOINN result, ASC result]

Error: better; speed: faster

Experiments: Artificial dataset

[Figure: dataset, Adjusted SOINN result, ASC result]

Error: better; speed: faster

Experiments: Real dataset

[Tables: compression ratio (%) and speed-up ratio (%)]

Experiments: Comparison with other prototype-based classification methods

Nearest Subclass Classifier (NSC)
k-Means Classifier (KMC)
k-NN Classifier (NNC)
Learning Vector Quantization (LVQ)


Conclusions

ASC learns the number of nodes needed to determine the decision boundary

Incremental neural network

Robust to noisy training data

Fast classification

Fewer parameters: only three


Comments

Advantage: improves many aspects, and a previous paper demonstrates the weaknesses they set out to fix

Drawback: no suggested values for the parameters

Application: a step from unsupervised learning to supervised learning
