Optimization and Particle Swarm Optimization (O & PSO)


Page 1: Optimization and particle swarm optimization (O & PSO)

By: Nosheen Memon, B.E. Electrical Engineering (QUCEST Larkana), 12th Batch

OPTIMIZATION AND PARTICLE SWARM OPTIMIZATION (PSO)

Page 2: Optimization and particle swarm optimization (O & PSO)

Optimization toolbox

Solve linear, quadratic, integer, and nonlinear optimization problems

Optimization Toolbox™ provides functions for finding parameters that minimize or maximize objectives while satisfying constraints. The toolbox includes solvers for linear programming, mixed-integer linear programming, quadratic programming, nonlinear optimization, and nonlinear least squares. You can use these solvers to find optimal solutions to continuous and discrete problems, perform tradeoff analyses, and incorporate optimization methods into algorithms and applications.
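The deck itself contains no toolbox code; as a rough Python analogue of this workflow (using SciPy rather than the MATLAB toolbox, with an objective, constraint and bounds invented purely for illustration), a small constrained minimization looks like this:

import numpy as np
from scipy.optimize import minimize

# Minimize a quadratic objective subject to a linear inequality constraint and bounds.
objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]   # x0 + x1 >= 1
bounds = [(0.0, None), (0.0, None)]                                    # x >= 0

result = minimize(objective, np.array([2.0, 0.0]), bounds=bounds, constraints=constraints)
print(result.x)   # constrained minimizer, approximately [1.0, 2.5]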

Optimum-seeking methods are also known as mathematical programming techniques and are generally studied as part of operations research.

Optimization is the mathematical discipline which is concerned with finding the maxima and minima of functions, possibly subject to constraints.

Page 3: Optimization and particle swarm optimization (O & PSO)

Cont.
The conventional design procedures aim at finding an acceptable or adequate design which merely satisfies the functional and other requirements of the problem.

In general, there will be more than one acceptable design, and the purpose of optimization is to choose the best one of the many acceptable designs available.

Thus a criterion has to be chosen for comparing the different alternative acceptable designs and for selecting the best one.

The criterion with respect to which the design is optimized, when expressed as a function of the design variables, is known as the objective function.

Page 4: Optimization and particle swarm optimization (O & PSO)

Cont.
• With multiple objectives there arises a possibility of conflict, and one simple way to handle the problem is to construct an overall objective function as a linear combination of the conflicting multiple objective functions.

• Thus, if f1(X) and f2(X) denote two objective functions, construct a new (overall) objective function for optimization as:

f(X) = α1 f1(X) + α2 f2(X)

where α1 and α2 are constants whose values indicate the relative importance of one objective function to the other.
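The weighted-sum construction can be sketched directly in code; the two objectives and weight values below are illustrative placeholders, not taken from the slides.

def f1(x):
    return (x - 1.0) ** 2          # first objective (illustrative)

def f2(x):
    return (x + 2.0) ** 2          # second, conflicting objective (illustrative)

def overall_objective(x, alpha1=0.7, alpha2=0.3):
    # f(x) = alpha1*f1(x) + alpha2*f2(x); the weights encode relative importance
    return alpha1 * f1(x) + alpha2 * f2(x)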

Page 5: Optimization and particle swarm optimization (O & PSO)

Minimum and maximum points of a function
• If a point x* corresponds to the minimum value of the function f(x), the same point also corresponds to the maximum value of the negative of the function, -f(x). Thus optimization can be taken to mean minimization, since the maximum of a function can be found by seeking the minimum of the negative of the same function.
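A quick illustration of this negation trick, using an assumed test function and a grid search (not from the slides):

import numpy as np

# f has its maximum at x = 3; minimizing -f recovers the same point.
f = lambda x: -(x - 3.0) ** 2 + 5.0
xs = np.linspace(0.0, 6.0, 601)

x_max = xs[np.argmax(f(xs))]          # direct maximization on a grid
x_min_of_neg = xs[np.argmin(-f(xs))]  # minimization of the negated function
assert np.isclose(x_max, x_min_of_neg)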

Page 6: Optimization and particle swarm optimization (O & PSO)

Constraints
• Behaviour constraints: constraints that represent limitations on the behaviour or performance of the system are termed behaviour or functional constraints.
• Side constraints: constraints that represent physical limitations on design variables, such as manufacturing limitations.

Why we optimize
• In general, there will be more than one acceptable design, and the purpose of optimization is to choose the best one of the many acceptable designs available.

• The criterion with respect to which the design is optimized, when expressed as a function of the design variables, is known as the objective function.

Page 7: Optimization and particle swarm optimization (O & PSO)

Cont.
• In civil engineering, the objective is usually taken as the minimization of the cost.

• In mechanical engineering, the maximization of the mechanical efficiency is the obvious choice of an objective function.

• In aerospace structural design problems, the objective function for minimization is generally taken as weight.

• In some situations, there may be more than one criterion to be satisfied simultaneously. An optimization problem involving multiple objective functions is known as a multiobjective programming problem.

Page 8: Optimization and particle swarm optimization (O & PSO)

Particle Swarm Optimization(PSO)

Introduction
Many difficulties such as multi-modality, dimensionality and differentiability are associated with the optimization of large-scale problems.

Traditional techniques such as steepest descent, linear programming and dynamic programming generally fail to solve such large-scale problems, especially with nonlinear objective functions.

Traditional techniques often fail to solve optimization problems that have many local optima.

To overcome these problems, there is a need to develop more powerful optimization techniques.

Page 9: Optimization and particle swarm optimization (O & PSO)

Cont.
• Some of the well-known population-based optimization techniques are:
• Genetic Algorithms (GA)
• Artificial Immune Algorithms (AIA)
• Ant Colony Optimization (ACO)
• Particle Swarm Optimization (PSO)
• Bacteria Foraging Optimization (BFO)
• Artificial Bee Colony (ABC)
• Biogeography-Based Optimization (BBO), etc.

Page 10: Optimization and particle swarm optimization (O & PSO)

Swarm Intelligence (SI)

SI is artificial intelligence, based on the collective behavior of decentralized, self-organized systems.

The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems.

SI systems are typically made up of a population of simple agents interacting locally with one another and with their environment.

Natural examples of SI include ant colonies, bird flocking, animal herding, bacterial growth, and fish schooling.

Page 11: Optimization and particle swarm optimization (O & PSO)

Some SI Applications

• The U.S. military is investigating swarm techniques for controlling unmanned vehicles.

• NASA is investigating the use of swarm technology for planetary mapping.

Page 12: Optimization and particle swarm optimization (O & PSO)

Particle Swarm Optimization (PSO)

Particle swarm optimization (PSO) is an evolutionary computation technique developed by Kennedy and Eberhart.

It exhibits common evolutionary computation attributes including initialization with a population of random solutions and searching for optima by updating generations.

It is a simulation of a simplified social system.

Particle swarm optimization or PSO is a global optimization, population-based evolutionary algorithm for dealing with problems in which a best solution can be represented as a point or surface in an n-dimensional space.

Hypotheses are plotted in this space and seeded with an initial velocity, as well as a communication channel between the particles.

Page 13: Optimization and particle swarm optimization (O & PSO)

How it works?
• PSO is initialized with a group of random particles (solutions) and then searches for optima by updating generations.
• Particles move through the solution space and are evaluated according to some fitness criterion after each time step.
• In every iteration, each particle is updated by following two "best" values. The first one is the best solution (fitness) it has achieved so far (the fitness value is also stored); this value is called pbest.
• Another "best" value that is tracked by the particle swarm optimizer is the best value obtained so far by any particle in the population.
• This second best value is a global best and is called gbest.

Page 14: Optimization and particle swarm optimization (O & PSO)

Cont.
• When a particle takes part of the population as its topological neighbors, the second best value is a local best and is called lbest.
• Neighborhood bests allow parallel exploration of the search space and reduce the susceptibility of PSO to falling into local minima, but they slow down convergence speed.

Page 15: Optimization and particle swarm optimization (O & PSO)

Particle Properties

With Particle Swarm Optimization, a swarm of particles (individuals) in an n-dimensional search space G is simulated, where each particle p has a position p.g ∈ G ⊆ Rn and a velocity p.v ∈ Rn.
The position p.g corresponds to the genotype and, in most cases, also to the solution candidate, i.e., p.x = p.g, since most often the problem space X is also Rn and X = G. However, this is not necessarily the case and, in general, we can introduce any form of genotype-phenotype mapping in Particle Swarm Optimization.

The velocity vector p.v of an individual p determines in which direction the search will continue and if it has an explorative (high velocity) or an exploitive (low velocity) character.

Page 16: Optimization and particle swarm optimization (O & PSO)

• A particle's status in the search space is characterized by two factors: its position (Xi) and its velocity (Vi), which are updated according to the following equations:

Vi[k+1] = Vi[k] + C1 Rand(.)(pbesti[k] - Xi[k]) + C2 rand(.)(gbest[k] - Xi[k])   (1)

Xi[k+1] = Xi[k] + Vi[k+1]   (2)

Page 17: Optimization and particle swarm optimization (O & PSO)

• Vi = [vi1, vi2, ..., vin] is called the velocity of particle i.
• Xi = [xi1, xi2, ..., xin] represents the position of particle i.
• pbesti represents the best previous position of particle i (i.e., its local-best position or experience).
• gbest represents the best position among all particles in the population X = [X1, X2, ..., XN] (i.e., the global-best position).
• Rand(.) and rand(.) are two random variables in [0, 1].
• C1 and C2 are positive numbers called acceleration coefficients that guide each particle toward the individual best and the swarm best positions, respectively.

Page 18: Optimization and particle swarm optimization (O & PSO)

• Eq. (1):
• The first part, Vi[k], represents the particle's previous velocity. It stores the previous flight direction and prevents the particle from drastically changing its direction, aligning it to its current direction.
• The second part, C1 Rand(.)(pbesti[k] - Xi[k]), is called the cognition part and resembles the individual memory of the position that was best for the particle. Its effect is to linearly attract the particle towards its own best experience.

Page 19: Optimization and particle swarm optimization (O & PSO)

• The third part, C2 rand(.)(gbest[k] - Xi[k]), is called the social or cooperation component. It resembles a group standard which individuals seek to attain.
• The effect of this term is to attract the particle towards the best experience of all particles in the swarm, scaled by a random weight.
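The two updates above can be sketched as a small NumPy function. The acceleration coefficients C1 = C2 = 2.0 and the basic (no inertia weight) form of Eq. (1) are assumptions made for illustration.

import numpy as np

def update_particle(x_i, v_i, pbest_i, gbest, c1=2.0, c2=2.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x_i.shape)                 # Rand(.) for the cognition part
    r2 = rng.random(x_i.shape)                 # rand(.) for the social part
    v_new = (v_i                               # previous flight direction
             + c1 * r1 * (pbest_i - x_i)       # cognition: pull toward the particle's own best
             + c2 * r2 * (gbest - x_i))        # social: pull toward the swarm best
    x_new = x_i + v_new                        # position update, Eq. (2)
    return x_new, v_new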

Page 20: Optimization and particle swarm optimization (O & PSO)

Particle Swarm Optimization Algorithm

For each particle
    Initialize particle with feasible random numbers
End

Do
    For each particle
        Calculate the fitness value
        If the fitness value is better than the best fitness value (pbest) in history
            Set current value as the new pbest
    End
    Choose the particle with the best fitness value of all the particles as the gbest
    For each particle
        Calculate particle velocity according to the velocity update equation
        Update particle position according to the position update equation
    End
While maximum iterations or minimum error criterion is not attained
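A compact, runnable rendering of this pseudocode in Python is sketched below. The sphere test function, parameter values, bounds and velocity clamp are assumptions made for illustration, not part of the slides.

import numpy as np

def pso(fitness, dim=5, n_particles=30, n_iter=200,
        lower=-5.0, upper=5.0, c1=2.0, c2=2.0, v_max=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize each particle with a feasible random position and velocity
    x = rng.uniform(lower, upper, size=(n_particles, dim))
    v = rng.uniform(-v_max, v_max, size=(n_particles, dim))
    pbest = x.copy()
    pbest_val = np.array([fitness(p) for p in x])
    g = np.argmin(pbest_val)
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity and position updates, Eqs. (1) and (2), with velocity clamping
        v = np.clip(v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x), -v_max, v_max)
        x = np.clip(x + v, lower, upper)
        # Update pbest for improved particles, then gbest
        vals = np.array([fitness(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = np.argmin(pbest_val)
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val

best_x, best_f = pso(lambda z: float(np.sum(z ** 2)))   # sphere function; optimum at the origin
print(best_x, best_f)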

Page 21: Optimization and particle swarm optimization (O & PSO)

PSO Algorithm (General)
Searches the hyperspace of the problem for an optimum.
1) Define the problem to search
• How many dimensions?
• Solution criteria?
2) Initialize the population
• Random initial positions
• Random initial velocities
3) Determine the best positions
• Global best position
• Personal best position
4) Update the velocity and position equations

Page 22: Optimization and particle swarm optimization (O & PSO)

Particle Swarm Optimization: Swarm Topology
• In PSO, two basic topologies have been used in the literature:
• Ring topology (neighborhood of 3)
• Star topology (global neighborhood)

[Diagrams: ring and star topologies over particles I0-I4]
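For the ring topology, each particle's local best (lbest) can be computed over itself and its two ring neighbors. The sketch below assumes pbest positions and values are stored in NumPy arrays.

import numpy as np

def ring_lbest(pbest, pbest_val):
    n = len(pbest_val)
    lbest = np.empty_like(pbest)
    for i in range(n):
        neighbors = [(i - 1) % n, i, (i + 1) % n]         # neighborhood of 3 on the ring
        best = min(neighbors, key=lambda j: pbest_val[j])
        lbest[i] = pbest[best]                            # each particle follows its local best
    return lbest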

Page 23: Optimization and particle swarm optimization (O & PSO)

• The step-by-step implementation

Page 24: Optimization and particle swarm optimization (O & PSO)

Step 1:
Initialize the PSO parameters which are necessary for the algorithm: the population size, which indicates the number of individuals; the number of generations necessary for the termination criterion; the cognitive constant, social constant, variation of the inertia weight, maximum velocity, number of design variables, and the respective ranges for the design variables.
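One possible way to collect these Step 1 parameters in code is a small dataclass; the names and default values are illustrative assumptions, not prescribed by the slides.

from dataclasses import dataclass, field

@dataclass
class PSOParams:
    pop_size: int = 30           # population size (number of individuals)
    n_generations: int = 200     # termination criterion
    c1: float = 2.0              # cognitive constant
    c2: float = 2.0              # social constant
    w_max: float = 0.9           # inertia weight varies from w_max ...
    w_min: float = 0.4           # ... down to w_min over the run
    v_max: float = 1.0           # maximum velocity
    n_vars: int = 5              # number of design variables
    bounds: list = field(default_factory=lambda: [(-5.0, 5.0)] * 5)   # variable ranges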

Page 25: Optimization and particle swarm optimization (O & PSO)

Step 2:
Generate a random population equal to the specified population size. Each population member contains the values of all the design variables; each design-variable value is randomly generated within the specified range for that variable.
The population means the group of birds (particles), which represents the set of solutions.
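A sketch of this random initialization, assuming the bounds are given per design variable:

import numpy as np

rng = np.random.default_rng()
bounds = np.array([(-5.0, 5.0), (0.0, 10.0), (1.0, 3.0)])   # (low, high) per design variable
pop_size = 30

low, high = bounds[:, 0], bounds[:, 1]
population = rng.uniform(low, high, size=(pop_size, len(bounds)))   # one particle per row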

Page 26: Optimization and particle swarm optimization (O & PSO)

Step 3:
Obtain the values of the objective function for all the population members. For the first iteration, the value of the objective function gives the pBest for the respective particle in the solution. Identify the particle with the best objective function value as the gBest.
If the problem is a constrained optimization problem, then a specific approach such as a static, dynamic or adaptive penalty is used to convert the constrained optimization problem into an unconstrained optimization problem.
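As one hedged example of such a conversion, a static penalty adds a large fixed multiple of the squared constraint violations to the objective; the objective, constraint and penalty factor below are illustrative.

import numpy as np

def penalized_objective(f, constraints, x, penalty=1e6):
    # constraints are functions g_j with the convention g_j(x) <= 0 when feasible
    violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f(x) + penalty * violation

# Example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1, i.e. 1 - x0 - x1 <= 0
f = lambda x: x[0] ** 2 + x[1] ** 2
g1 = lambda x: 1.0 - x[0] - x[1]
value = penalized_objective(f, [g1], np.array([0.2, 0.3]))   # infeasible point gets penalized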

Page 27: Optimization and particle swarm optimization (O & PSO)

Step 4:
Update the velocity of each particle and check it against the maximum velocity. If the velocity obtained exceeds the maximum velocity, then reduce the existing velocity to the maximum velocity.

Step 5:
Update the positions of the particles and check all the design variables against their upper and lower limits.

Step 6:
Obtain the value of the objective function for all the particles. The new solution replaces the pBest if it has a better function value. Identify the gBest from the population. Update the value of the inertia weight if required.
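Steps 4-6 map to a few one-line NumPy helpers; v_max, the bounds and the linearly decreasing inertia schedule are assumptions for illustration.

import numpy as np

def clamp_velocity(v, v_max=1.0):
    return np.clip(v, -v_max, v_max)          # Step 4: cut excess velocity back to the maximum

def enforce_limits(x, lower, upper):
    return np.clip(x, lower, upper)           # Step 5: keep design variables within their ranges

def inertia_weight(iteration, n_iter, w_max=0.9, w_min=0.4):
    # Step 6 (optional): decrease the inertia weight linearly over the run
    return w_max - (w_max - w_min) * iteration / n_iter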

Page 28: Optimization and particle swarm optimization (O & PSO)

Step 7:
The best results obtained are saved using elitism. Elite members are not modified by crossover and mutation operators, but they can be replaced if better solutions are obtained in any iteration.

Step 8:
Repeat the steps (from Step 4) until the specified number of generations or the termination criterion is reached.

Page 29: Optimization and particle swarm optimization (O & PSO)

Simplified steps
• Step 1: Initialize a population array. Initialize each particle with a random velocity and position.
• Step 2: For each particle, evaluate the desired optimization fitness function. Calculate the objective value of all particles.
• Step 3: Set the position and objective of each particle as Pi and pbest, respectively. Compare each particle's fitness evaluation with its pbest; if the current value is better than its pbest, then set pbest = current value and pi = current location xi in the D-dimensional space. Identify the particle with the best success so far, and assign its index to the variable g. Set the position and objective of the particle with the best fitness (least objective) as Pg and gbest, respectively.

Page 30: Optimization and particle swarm optimization (O & PSO)

• Yi(t+1): best position of the particle
• F: fitness function
• X(t+1): current position of the particle
• Y'(t): best position of the particle according to all particles (global best)

• Step 4: Update the particles' Vi and Xi according to Equations (1) and (2), i.e., change the velocity and position of each particle according to these equations.

• Step 5: If the criterion is met, then exit.
• Step 6: If the criteria are not met, go to Step 3.

Page 31: Optimization and particle swarm optimization (O & PSO)

Advantages
• PSO is based on intelligence. It can be applied to both scientific research and engineering use.
• PSO has no overlapping and mutation calculations. The search can be carried out by the speed of the particle.

• During the development of several generations, only the most optimistic particle can transmit information to the other particles, and the speed of the search is very fast.

• The calculation in PSO is very simple. Compared with other evolutionary computations, it has greater optimization ability and can be implemented easily.

• PSO adopts real-number coding, which is decided directly by the solution; the number of dimensions is equal to that of the solution.

Page 32: Optimization and particle swarm optimization (O & PSO)

Disadvantages
• The method easily suffers from partial optimism, which causes less exactness in the regulation of its speed and direction.

• The method cannot work out problems of scattering.
• The method cannot work out problems of non-coordinate systems, such as the solution to the energy field and the moving rules of the particles in the energy field.