
PARTICLE SWARM OPTIMIZATION

03 26 2008

Particle Swarm Optimization (PSO)

Kennedy, J., and Eberhart, R. C. (1995). "Particle swarm optimization," Proc. IEEE International Conference on Neural Networks (Perth, Australia), IEEE Service Center, Piscataway, NJ, pp. IV: 1942–1948.

Behavior of a Flock of Birds

Each bird adjusts its movement based on two sources of information:

Self-Experience

Success of Others

PSO Equation

$v_{id} = w \, v_{id} + c_1 \, \mathrm{rand}() \, (p_{id} - x_{id}) + c_2 \, \mathrm{Rand}() \, (p_{gd} - x_{id})$
$x_{id} = x_{id} + v_{id}$

The three velocity terms correspond to Inertia ($w \, v_{id}$), Self-Experience ($c_1 \, \mathrm{rand}() \, (p_{id} - x_{id})$), and Success of Others ($c_2 \, \mathrm{Rand}() \, (p_{gd} - x_{id})$).

Notation for the $i$th particle:
Position: $x_i$
Velocity: $v_i$
Previous best position: $p_i$
Global best position: $p_g$
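The update rule can be transcribed almost literally into code. Below is a minimal Python/NumPy sketch of one particle's velocity and position update; the function name update_particle, the default values of w, c1, and c2, and the per-dimension random draws are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def update_particle(x, v, p_best, g_best, w=0.7, c1=2.0, c2=2.0, rng=None):
    """One PSO update of a single particle's velocity and position (d-dimensional arrays)."""
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)   # rand(): randomness on the self-experience (cognitive) term
    r2 = rng.random(x.shape)   # Rand(): randomness on the success-of-others (social) term
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    x_new = x + v_new
    return x_new, v_new

# Example: one update step in 2 dimensions
x, v = np.array([0.0, 0.0]), np.array([0.1, -0.1])
p_best, g_best = np.array([1.0, 1.0]), np.array([2.0, -1.0])
x, v = update_particle(x, v, p_best, g_best)
```

Whether rand() and Rand() are drawn once per particle or once per dimension varies between PSO descriptions; the sketch draws them per dimension.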

Optimization Problem

[Figure: block diagram of Input → System → Output with a Parameter Adjustment feedback loop; in PSO, n systems (System_1, System_2, System_3, …, System_n) are evaluated, one per particle.]
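To make the picture of n systems evaluated per iteration concrete, here is a minimal sketch of a complete PSO loop over a generic cost function, with personal-best and global-best bookkeeping. The function name pso_minimize, the bounds, and the hyperparameter values are assumptions for illustration only.

```python
import numpy as np

def pso_minimize(cost, dim, n_particles=30, iters=100, w=0.7, c1=2.0, c2=2.0,
                 bounds=(-10.0, 10.0), seed=0):
    """Minimal PSO loop: n particles, each a candidate parameter set for the system."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))    # positions (parameter sets)
    v = np.zeros((n_particles, dim))               # velocities
    p_best = x.copy()                              # each particle's best position so far
    p_cost = np.array([cost(xi) for xi in x])      # cost of each personal best
    g_best = p_best[np.argmin(p_cost)].copy()      # best position found by the swarm

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = x + v
        c = np.array([cost(xi) for xi in x])       # evaluate all n systems
        improved = c < p_cost
        p_best[improved], p_cost[improved] = x[improved], c[improved]
        g_best = p_best[np.argmin(p_cost)].copy()
    return g_best, p_cost.min()

# Example: minimize the sphere function in 3 dimensions
best_x, best_cost = pso_minimize(lambda z: float(np.sum(z**2)), dim=3)
```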

Particle Swarm Optimization

[Figure: Cost vs. x landscape plotted at successive iterations of the swarm.]

$v_{id} = w \, v_{id} + c_1 \, \mathrm{rand}() \, (p_{id} - x_{id}) + c_2 \, \mathrm{Rand}() \, (p_{gd} - x_{id})$
$x_{id} = x_{id} + v_{id}$

[Figure: vector diagram of one update step, combining the Inertia component with $V_p$ (pull toward the personal best) and $V_g$ (pull toward the global best) to move the particle from $x_{k-1}$ through $x_k$ to $x_{k+1}$.]

Inertia Weight

$v_{id} = w \, v_{id} + c_1 \, \mathrm{rand}() \, (p_{id} - x_{id}) + c_2 \, \mathrm{Rand}() \, (p_{gd} - x_{id})$
$x_{id} = x_{id} + v_{id}$

$w$: inertia weight

[Figure: vector diagrams ($V_p$, $V_g$, positions $x_{k-1}$, $x_k$, $x_{k+1}$) comparing the particle's step under a large inertia weight and under a small inertia weight.]

Inertia Weight

$v_{id} = w \, v_{id} + c_1 \, \mathrm{rand}() \, (p_{id} - x_{id}) + c_2 \, \mathrm{Rand}() \, (p_{gd} - x_{id})$
$x_{id} = x_{id} + v_{id}$

$w$: inertia weight

[Figure: Cost vs. x search behavior under a large inertia weight and under a small inertia weight.]

Inertia weight: Large → Global Search; Small → Local Search.
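One common way to move from global search toward local search over a run, and the baseline that the experimental results later compare against, is a linearly decreasing inertia weight. A minimal sketch follows; the start and end values 0.9 and 0.4 are typical choices from the PSO literature, not values stated on the slides.

```python
def linear_inertia(iteration, max_iterations, w_start=0.9, w_end=0.4):
    """Linearly decrease w from w_start (global search) to w_end (local search)."""
    frac = iteration / max(1, max_iterations - 1)
    return w_start - (w_start - w_end) * frac

# w is large early (broad exploration) and small late (fine-grained exploitation)
ws = [linear_inertia(k, 100) for k in range(100)]
```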

Fuzzy Adaptive PSO

Kennedy, J., and Eberhart, R. C. (2001). "Fuzzy adaptive particle swarm optimization," Proc. IEEE Int. Congr. Evolutionary Computation, vol. 1, pp. 101–106.

Inertia weight: Large → Global Search; Small → Local Search. The fuzzy adaptive scheme adjusts the inertia weight between these two regimes.

Normalized Current Best Performance Evaluation (NCBPE)

$\mathrm{NCBPE} = \dfrac{\mathrm{CBPE} - \mathrm{CBPE}_{\min}}{\mathrm{CBPE}_{\max} - \mathrm{CBPE}_{\min}}$

[Figure: Cost vs. x landscape annotated with CBPE, CBPE_max, and CBPE_min.]
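Given the current best cost CBPE and chosen (or observed) bounds CBPE_min and CBPE_max, the normalization is a one-liner. A minimal sketch, with an illustrative function name:

```python
def ncbpe(cbpe, cbpe_min, cbpe_max):
    """Normalized Current Best Performance Evaluation, mapped into [0, 1]."""
    value = (cbpe - cbpe_min) / (cbpe_max - cbpe_min)
    return min(1.0, max(0.0, value))   # clamp in case CBPE falls outside the chosen bounds
```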

Fuzzy Adaptive PSO

[Figure: membership functions with fuzzy sets L, M, H (membership values from 0 to 1) for NCBPE, the inertia Weight, and W_Change.]

A description of a fuzzy system for adapting the inertia weight of PSO.

Fuzzy Rule
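The rule base itself is not reproduced in the transcript, so the following Python sketch only illustrates the general mechanism: triangular L/M/H memberships for NCBPE and the (normalized) inertia weight, a small hand-written rule table, and a weighted-average defuzzification into a weight change. Every breakpoint and rule here is an assumption, not the fuzzy system from the original slides.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    """Memberships of x in {L, M, H} over [0, 1] (illustrative breakpoints)."""
    return {"L": tri(x, -0.5, 0.0, 0.5), "M": tri(x, 0.0, 0.5, 1.0), "H": tri(x, 0.5, 1.0, 1.5)}

# Illustrative rule table: (NCBPE set, weight set) -> representative W_Change value.
# High NCBPE (far from a good solution) pushes w up (global search);
# low NCBPE pushes w down (local search).
RULES = {("H", "L"): +0.10, ("H", "M"): +0.05, ("H", "H"): 0.00,
         ("M", "L"): +0.05, ("M", "M"): 0.00, ("M", "H"): -0.05,
         ("L", "L"): 0.00, ("L", "M"): -0.05, ("L", "H"): -0.10}

def w_change(ncbpe_val, w_norm):
    """Weighted average of rule outputs (a simple Sugeno-style defuzzification)."""
    mu_n, mu_w = fuzzify(ncbpe_val), fuzzify(w_norm)
    num = den = 0.0
    for (n_set, w_set), dw in RULES.items():
        strength = min(mu_n[n_set], mu_w[w_set])   # rule firing strength
        num += strength * dw
        den += strength
    return num / den if den > 0 else 0.0
```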

Experimental Results: Minimization

Linearly Decreasing Inertia Weight

Fuzzy Adaptive Inertia Weight

The performance of PSO is not sensitive to the population size, and the scalability of the PSO is acceptable.

Application Example 1: Feature Training for Face Detection

[Figure: training snapshots at Iteration 1, Iteration 2, …, Iteration k.]

Application Example 2: Neural Network Training

Gudise, V. G., and Venayagamoorthy, G. K. (2003). "Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks," Proc. IEEE Swarm Intelligence Symposium (SIS 2003), Indianapolis, IN, pp. 110–117.

Introduction to the Neural Network

$a_i = \sum_{j} W_{ij} X_j$ for $i = 1, \dots, 4$, $j = 1, 2$, where $X = [x \;\; 1]^T$

$d_i = \dfrac{1}{1 + e^{-a_i}}$

$y = [V_1 \; V_2 \; V_3 \; V_4] \, [d_1 \; d_2 \; d_3 \; d_4]^T$
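These three equations translate directly into a forward pass. A minimal NumPy sketch, assuming the sigmoid form of d_i reconstructed above and random placeholder weights:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2))   # input-to-hidden weights (4 hidden units, inputs [x, 1])
V = rng.standard_normal(4)        # hidden-to-output weights [V1 V2 V3 V4]

def forward(x, W, V):
    """2x4x1 network: a = W @ [x, 1]^T, d = sigmoid(a), y = V . d"""
    X = np.array([x, 1.0])            # input plus bias term, X = [x 1]^T
    a = W @ X                         # a_i = sum_j W_ij X_j
    d = 1.0 / (1.0 + np.exp(-a))      # d_i = 1 / (1 + e^{-a_i})
    return float(V @ d)               # y = [V1..V4] [d1..d4]^T

y = forward(0.5, W, V)
```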

Neural Network Training: Backpropagation vs. PSO

Parameter Set of PSO

Training Results

Training a 2x4x1 neural network to fit $y = 2x^2 + 1$
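As a rough illustration of how PSO can train this network, the sketch below encodes the 12 weights (W and V) as a particle, uses the mean square error over sampled points of y = 2x^2 + 1 as the cost, and runs a small swarm. The sample grid, bounds, and hyperparameters are assumptions, not the settings from the referenced paper.

```python
import numpy as np

rng = np.random.default_rng(1)
xs = np.linspace(-1.0, 1.0, 21)        # assumed training inputs
ys = 2.0 * xs**2 + 1.0                 # target function y = 2x^2 + 1

def predict(params, x):
    """Unpack a 12-element particle into W (4x2) and V (4) and run the 2x4x1 network."""
    W, V = params[:8].reshape(4, 2), params[8:]
    X = np.array([x, 1.0])
    d = 1.0 / (1.0 + np.exp(-(W @ X)))
    return float(V @ d)

def mse(params):
    """Cost of one particle: mean square error over the training set."""
    return float(np.mean([(predict(params, x) - y) ** 2 for x, y in zip(xs, ys)]))

# Tiny PSO over the 12 network weights (hyperparameter values are illustrative).
n, dim, iters, w, c1, c2 = 30, 12, 300, 0.7, 1.5, 1.5
pos = rng.uniform(-1, 1, (n, dim)); vel = np.zeros((n, dim))
pbest = pos.copy(); pcost = np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pcost)].copy()
for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    cost = np.array([mse(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
    gbest = pbest[np.argmin(pcost)].copy()
print("final training MSE:", pcost.min())
```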

Mean square error curve of the neural networks during training with BP and PSO for bias 1

Test curve for the trained neural networks with fixed weights obtained from the BP and PSO training algorithms with bias 1

Conclusions

The concept of PSO has been introduced. PSO is an extremely simple algorithm for global optimization problems, with low memory cost and low computational cost.

A fuzzy system is implemented to dynamically adjust the inertia weight and improve the performance of PSO.