Intelligent Database Systems Lab
Presenter : Fen-Rou, Ciou
Authors : Toh Koon Charlie Neo, Dan Ventura
2012, PRL
A direct boosting algorithm for the k-nearest neighbor classifier via local warping
of the distance metric
Outlines
• Motivation
• Objectives
• Methodology
• Experiments
• Conclusions
• Comments
Motivation
• The k-nearest neighbor pattern classifier is an effective learning algorithm, but it can result in large model sizes.
Objectives
• The paper presents a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting, increasing accuracy while condensing the model size.
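The idea of locally modified distance weighting can be sketched as a k-NN classifier whose metric is warped by a per-prototype weight; the function and variable names below are illustrative, not the paper's, and uniform weights reduce it to plain Euclidean k-NN:

```python
import numpy as np

# Hypothetical sketch of a k-NN classifier whose distance metric is
# locally warped by per-prototype weights (all names are illustrative).
def weighted_knn_predict(X_train, y_train, weights, x, k=3):
    # Scaling each prototype's distance by its learned weight warps the
    # metric locally: lowering a weight pulls the decision surface
    # toward that prototype, enlarging its region of influence.
    dists = weights * np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = y_train[nearest]
    classes, counts = np.unique(votes, return_counts=True)
    return classes[np.argmax(counts)]

X = np.array([[0.0, 0.0], [1.0, 1.0], [0.2, 0.1], [0.9, 0.8]])
y = np.array([0, 1, 0, 1])
w = np.ones(len(X))  # uniform weights = plain Euclidean k-NN
print(weighted_knn_predict(X, y, w, np.array([0.1, 0.1]), k=3))  # → 0
```

Boosting would then adjust individual entries of `w` on misclassified regions rather than retraining a new base learner from scratch.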
Methodology - Framework
• Based on the AdaBoost framework.
Methodology
• Sensitivity to data order
– Randomized order
– Batch update
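Why update order matters can be illustrated with a toy example (the update rule and names below are assumed for illustration, not taken from the paper): an online pass applies each weight update immediately, so later updates see earlier ones, while a batch pass computes every update against the same starting weights and applies them together.

```python
# Hedged sketch of order sensitivity (all names illustrative).
def online_pass(weights, deltas):
    w = dict(weights)
    for i, d in deltas:
        w[i] = w[i] * (1 + d * w[i])  # update depends on current value
    return w

def batch_pass(weights, deltas):
    w0 = dict(weights)
    # Every update is computed from the unmodified starting weights,
    # so the visiting order no longer affects the result.
    updates = {i: w0[i] * (1 + d * w0[i]) for i, d in deltas}
    w = dict(w0)
    w.update(updates)
    return w

# Two updates to the same weight diverge between the two schemes:
print(round(online_pass({0: 1.0}, [(0, 0.1), (0, 0.1)])[0], 3))  # → 1.221
print(batch_pass({0: 1.0}, [(0, 0.1), (0, 0.1)])[0])             # → 1.1
```

Randomizing the visiting order averages out this effect across iterations; batch updating removes it entirely.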
Methodology
• Voting mechanism
– Simple voting
– Error-weighted voting
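The two voting schemes can be contrasted in a short sketch (the exact weighting rule is assumed here; the AdaBoost-style coefficient alpha = 0.5·ln((1−err)/err) is used as a plausible form, since the method builds on AdaBoost):

```python
import math
from collections import defaultdict

# Simple voting: every ensemble member counts equally.
def simple_vote(predictions):
    tally = defaultdict(int)
    for p in predictions:
        tally[p] += 1
    return max(tally, key=tally.get)

# Error-weighted voting (assumed form): each member votes with an
# AdaBoost-style weight, so accurate members dominate the tally.
def error_weighted_vote(predictions, errors):
    tally = defaultdict(float)
    for p, err in zip(predictions, errors):
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        tally[p] += alpha
    return max(tally, key=tally.get)

preds = [1, 0, 0]
errs = [0.05, 0.40, 0.45]  # the first member is far more accurate
print(simple_vote(preds))                  # → 0 (majority)
print(error_weighted_vote(preds, errs))    # → 1 (accuracy-weighted)
```

The example shows how error weighting can overturn a raw majority when the minority voter has a much lower training error.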
Methodology
• Condensing the model size
– Optimal weights
– Averaged weights
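A minimal sketch of the averaging option, assuming each boosting iteration produces one per-prototype weight vector (the condensation rule shown is the averaging variant only; names are illustrative): instead of storing the whole ensemble, the weight vectors collapse into a single k-NN model.

```python
import numpy as np

# Hedged sketch: condense an ensemble of per-prototype weight vectors
# into one weight vector by averaging (one row per boosting iteration,
# one column per training prototype).
def condense_by_average(weight_vectors):
    return np.mean(np.asarray(weight_vectors), axis=0)

W = [[1.0, 0.8, 1.2],
     [0.9, 0.7, 1.1],
     [1.1, 0.9, 1.3]]
print(condense_by_average(W))
```

The condensed model needs only one weight per prototype, which is how the ensemble's accuracy gain can come without the usual ensemble storage cost.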
Experiments
Fig. 8. Boosted k-NN with randomized data order.
Fig. 9. Boosted k-NN with batch update.
Fig. 10. Boosted k-NN with error-weighted voting.
Fig. 11. Boosted k-NN with optimal weights.
Fig. 12. Boosted k-NN with average weights.
Conclusions
• The Boosted k-NN algorithm can boost the generalization accuracy of the k-nearest neighbor algorithm.
• The Boosted k-NN algorithm modifies the decision surface, producing a better solution.
Comments
• Advantages
– The paper describes rich experiments.
• Applications
– Classification