Research Article

Improved Particle Swarm Optimization Algorithm Based on Last-Eliminated Principle and Enhanced Information Sharing
Xueying Lv,1 Yitian Wang,1 Junyi Deng,2 Guanyu Zhang,1,3 and Liu Zhang1,3
1College of Instrumentation & Electrical Engineering, Jilin University, Changchun 130061, China
2College of Computer Science and Technology, Jilin University, Changchun 130022, China
3National Engineering Research Center of Geophysics Exploration Instruments, Jilin University, Changchun 130061, China
Correspondence should be addressed to Guanyu Zhang (zhangguanyu@jlu.edu.cn) and Liu Zhang (zhangliu@jlu.edu.cn)
Received 18 May 2018; Revised 20 September 2018; Accepted 2 October 2018; Published 5 December 2018
Academic Editor: Cornelio Yáñez-Márquez
Copyright © 2018 Xueying Lv et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
In this study, an improved eliminate particle swarm optimization (IEPSO) is proposed on the basis of the last-eliminated principle to solve optimization problems in engineering design. During optimization, the IEPSO enhances information communication among populations and maintains population diversity to overcome the limitations of classical optimization algorithms in solving multiparameter, strongly coupled, and nonlinear engineering optimization problems. These limitations include premature convergence and the tendency to easily fall into local optima. The parameters involved in the imported "local-global information sharing" term are analyzed, and the principle of parameter selection for performance is determined. The performances of the IEPSO and classical optimization algorithms are then tested by using multiple sets of classical functions to verify the global search performance of the IEPSO. The simulation test results and those of the improved classical optimization algorithms are compared and analyzed to verify the advanced performance of the IEPSO algorithm.
1. Introduction
The development of industrial society has led to the successful application of optimal design methods to diverse engineering practices, such as path planning, structural design, control theory, and control engineering [1–10]. In 1995, the foraging behavior of bird swarms inspired Kennedy and Eberhart to propose the particle swarm optimization (PSO) algorithm. PSO requires few parameter adjustments and is easy to implement; hence, it is the most commonly used swarm intelligence algorithm [11–20]. However, in practical applications, most problems are complicated design problems with multiple parameters, strong coupling, and nonlinearity. Therefore, improving the global optimization capability of an optimization algorithm is important in solving complex engineering optimization problems. To improve the capability of traditional PSO, many scholars have proposed improvement strategies, including the adjustment of parameters and combinations of various mechanisms.
Shi and Eberhart [21] proposed an inertial weight improvement strategy (SPSO) with strong global search capability at the beginning of an iteration, strong local search capability in the later iterations, and fine search near the optimal solution. Although the SPSO improves the convergence speed of the algorithm, the "premature" phenomenon remains. Zhang [22] proposed an improved PSO algorithm with adaptive inertial weight that is based on Bayesian technology to balance the development and exploration capability of populations. Ratnaweera [23] proposed a linear adjustment method for learning factors. In the early stages of the iteration, the particle flight was mainly based on the historical information of the particle itself, and the later particle flight was mainly based on the social information between the particle and the global optimal particle. However, this method still has defects: the best fit for the initial global search is similar to the local optimum. Moreover, convergence is only limited to some optimal regions rather than being global, thereby causing the PSO algorithm to fall into local extrema. Chen and Ke [24] proposed a chaotic dynamic weight (CDW) PSO (CDW-PSO) algorithm. Chaotic maps and dynamic weights were used to modify the search process. Although CDW-PSO
Hindawi, Computational Intelligence and Neuroscience, Volume 2018, Article ID 5025672, 17 pages, https://doi.org/10.1155/2018/5025672
indicates an improved search performance relative to other natural heuristic optimization algorithms, it also easily falls into the local optimum. Chen [25] proposed a dynamic multiswarm differential learning PSO (DMSDL-PSO) algorithm, in which the differential evolution method is applied to each subgroup, combined with a differential mutation method to conduct a global search, and a quasi-Newton method is applied for local search. The DMSDL-PSO algorithm has good exploration and exploitation capabilities. Jiang [26] proposed a new binary hybrid PSO with wavelet mutation (HPSOWM), in which the motion mechanism and mutation process of particles are converted into binary elements and the problem is transformed from a continuous space problem into a discrete domain one. Although the convergence speed of the HPSOWM algorithm is stable and robust, its convergence rate is lower than those of other intelligent optimization algorithms. To solve the dynamic multiobjective optimization problem with rapid environmental changes, a study proposed a cooperative multiswarm PSO for dynamic multiobjective optimization (CMPSODMO) [27]. In comparison with other dynamic multiobjective optimization algorithms, CMPSODMO indicates a better effect in addressing uncertain rapid environmental changes. Ye [28] proposed a multiswarm PSO algorithm with dynamic learning strategies, in which the population is divided into ordinary and communication particles. The dynamic communication information of communication particles was applied to the algorithm to maintain particle population diversity. Although this method improves the capability of the algorithm to handle complex multimodal functions, it increases the computational complexity of the algorithm. Cui [29] proposed a globally optimal prediction-based adaptive mutation PSO (GPAM-PSO) to avoid the local optimal problem of traditional PSO algorithms. However, GPAM-PSO is limited to the dimensionality reduction of nonzero-mean data. Zhang [30] proposed a vector covariance PSO algorithm that divides all the dimensions of a particle into several parts randomly and optimizes each part to enhance the global and local search capabilities. However, the algorithm continues to fall into local extrema.
PSO has attracted considerable research attention due to its easy implementation, few parameter adjustments, and adaptability. Scholars use PSO to solve engineering optimization problems, and it has gradually penetrated various fields of application, such as parameter optimization, path planning, predictive control, and global optimization. Zhao [31] used PSO to optimize wavelet neural network parameters, reduce the limitations of the assessment of network security situational awareness, and thereby meet the requirements of network security in a big data environment. The parameter-related coefficients in a nonlinear regression analysis model were optimized by combining particle swarm with a genetic phase [32] to reduce the vibrations caused by mine blasting that damages the structures around the blasting area. The derived diffusion-free PSO algorithm was used to estimate the parameters of an infinite impulse response system and improve the energy utilization of an infinite sensor network [33]. Wang [34] used a multiobjective PSO algorithm to solve a path-planning problem of mobile robots in a static rough terrain environment. Wang [35] combined PSO with chaos optimization theory to establish a mathematical model of a path-planning problem in the radioactive environment of nuclear facilities to ensure personnel safety. Lopes [36] proposed a novel particle swarm-based heuristic technique to allocate electrical loads in an industrial setting throughout the day. Multiobjective PSO was used to solve the problem of service allocation in cloud computing [37]. Petrovic [38] proposed chaos PSO to achieve an optimal process dispatching plan. Zhang [39] proposed an adaptive PSO to solve problems in reservoir operation optimization with complex and dynamic nonlinear constraints.
An improved PSO algorithm was used for a time-series prediction of a grayscale model [40]. The algorithm reduces the average relative error between the recovery and measured values of the model to avoid the problems caused by the optimization of background values. Gulcu [41] used PSO to establish a power demand forecasting model.
In view of the aforementioned methods, an improved PSO (IEPSO) algorithm is proposed in the present work. In the IEPSO, the last-eliminated principle is used to update the population and maintain particle population diversity. The global search capability of the IEPSO algorithm is improved by adding a local-global information sharing term. A multigroup test function is used for comparison with the IEPSO. A classical optimization algorithm and its improved versions are used to test and verify the global optimization performance of the IEPSO algorithm.
2. IEPSO
2.1. Standard PSO. The initial population of the PSO algorithm is randomized. The PSO updates the position and speed of the particle swarm by adaptive learning, as shown in the following formulas:
v_id^(t+1) = ω·v_id^t + C_1·R_1·(p_id^t − x_id^t) + C_2·R_2·(p_gd^t − x_id^t),
x_id^(t+1) = x_id^t + v_id^(t+1),    (1)

where ω is the inertial weight; C_1 and C_2 are the acceleration terms; R_1 and R_2 are random variables uniformly distributed in the range (0, 1); p_gd^t is the global best position; p_id^t is the best position found by the particle in its history; x_id^t is the particle position at the current iteration; and v_id^(t+1) is the particle update speed at the next iteration.
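As a concrete illustration, the update rule in Formula (1) can be sketched in Python. This is a minimal sketch, not the authors' implementation; the vectorized form, the default parameter values, and the velocity clamp `v_max` are illustrative assumptions.

```python
import numpy as np

def pso_step(x, v, p_best, g_best, w=0.9, c1=2.0, c2=2.0, v_max=1.0, rng=None):
    """One standard PSO update (Formula (1)) for a whole swarm.

    x, v   : (n_particles, dim) current positions and velocities
    p_best : (n_particles, dim) best historical position of each particle
    g_best : (dim,) global best position found so far
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)  # R1, R2 ~ U(0, 1), drawn per particle and dimension
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    v_new = np.clip(v_new, -v_max, v_max)  # illustrative velocity limit
    return x + v_new, v_new
```

Here `g_best` broadcasts across the swarm, so a single call advances every particle by one iteration.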
2.2. IEPSO. The IEPSO algorithm is mainly based on the last-eliminated principle and enhances the local-global information sharing capability to improve its global optimization performance. The specific implementation of the IEPSO algorithm is shown in Figure 1.
The position and velocity of particles in a population are randomly initialized, and the fitness value of the particles is calculated. Information on the current individual and global optimal particles, including their positions and fitness values, is saved. Then, the particle swarm operation is conducted. In the IEPSO algorithm, Formula (2) is used to update the speed to balance the exploration and exploitation capabilities of the particles in the global optimization process, and Formula (3) is the local-global information sharing term.
v_id^(t+1) = ω·v_id^t + C_1·R_1·(p_id^t − x_id^t) + C_2·R_2·(p_gd^t − x_id^t) + C_3·R_3·|p_gd^t − p_id^t|,    (2)

φ_3 = C_3·R_3·|p_gd^t − p_id^t|.    (3)
Formula (2) comprises four parts, namely, the inheritance of the previous speed, particle self-cognition, local information sharing, and "local-global information sharing."
The IEPSO algorithm is not limited to one-way communication between global and individual particles. The local-global information sharing term (φ_3) is added to the
Figure 1: IEPSO algorithm flowchart: initial parameter setting (bounds, population size, dimension, C1, C2, C3, R1, R2, R3, ω, Vmax, maximum iterations, X, and V); the particle swarm operation updating speed, position, inertia weight, and acceleration terms; boundary checks on particle position and velocity, with elimination of inferior particles and supplementation of new ones; solving and saving the population fitness value and selecting the current best particle; the last-eliminated principle for updating the population; and the local and global convergence checks, with particle positions and velocities randomized when only local convergence is satisfied.
information exchange between the local optimal and global optimal particles obtained in the current iteration, and the population velocity is updated by Formula (2). In the early stage of the algorithm, the entire search space is searched at a relatively high speed to determine the approximate range of the optimal solution; this is beneficial for the global search. In the later stage, most of the particle search space is gradually reduced and concentrated in the neighborhood of the optimal value for a deep search; this is beneficial for the local search.
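The velocity update of Formula (2) differs from the standard rule only in the added φ_3 term. A sketch under the same assumptions as above (vectorized NumPy form; parameter names are illustrative):

```python
import numpy as np

def iepso_velocity(x, v, p_best, g_best, w, c1, c2, c3, rng=None):
    """IEPSO velocity update (Formula (2)): the two standard attraction terms
    plus the local-global information sharing term
    phi3 = C3 * R3 * |p_gd - p_id| (Formula (3))."""
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    r3 = rng.random(x.shape)
    phi3 = c3 * r3 * np.abs(g_best - p_best)  # local-global information sharing
    return w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x) + phi3
```

Because φ_3 depends on the gap between each particle's own best and the global best rather than on the current position, it keeps injecting search pressure even after particles have converged positionally; a decreasing C3 schedule (Section 3.1) then shifts this pressure from global toward local search.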
The particles that have not exceeded the predetermined range after the speed update continue to retain their original speed. The maximum value of the velocity is assigned to any particle that is beyond the predetermined range after the speed update. The particles that have not exceeded the predetermined range after the location update continue to retain their original positions. When particles are beyond the predetermined range, inferior particles are eliminated by adding new particles to the population within the predetermined range, thereby forming a new population. The fitness value of the new population is recalculated, and the information on each individual particle and its global optimal position and fitness value obtained in the current iteration is preserved. In all the algorithms, particles have good global search capability at the beginning of the iteration, and as individual particles move closer to the local optimal particle, the algorithms gradually lose particle diversity. On the basis of the idea of population variation in the traditional genetic algorithm (GA), the last-eliminated principle is applied in the IEPSO algorithm to maintain particle population diversity. When the PSO satisfies the local convergence condition, the optimal value obtained at this time may be only a local optimal value. Particle population diversity is maintained by using the particle fitness function as the evaluation criterion, thereby eliminating particles with poor fitness or high similarity. New particles are added to form a new population within the predetermined range, and the particle swarm operations are re-executed. If the current iteration number reaches the maximum or the predefined convergence accuracy is achieved, the iteration is stopped and the optimal solution is produced. The complexity and runtime of the algorithm increase due to the added local-global information sharing and the last-eliminated principle. Nevertheless, experimental results show that the improved method can enhance the accuracy of the algorithm.
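The last-eliminated step described above can be sketched as follows. The elimination fraction, the bounds, and the re-seeded velocity range are illustrative assumptions, not values fixed by the paper, and the similarity-based elimination criterion mentioned in the text is omitted for brevity:

```python
import numpy as np

def last_eliminated(x, v, fitness, frac=0.2, lb=-100.0, ub=100.0, rng=None):
    """Apply the last-eliminated principle once (minimization assumed):
    discard the worst `frac` of particles by fitness and replace them with
    fresh random particles inside [lb, ub] to restore population diversity."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = x.shape
    k = max(1, int(frac * n))
    worst = np.argsort(fitness)[-k:]          # indices of the worst k particles
    x[worst] = rng.uniform(lb, ub, (k, dim))  # re-seed positions in the search range
    v[worst] = rng.uniform(-1.0, 1.0, (k, dim))
    return x, v
```

After this step, the fitness of the new population is recalculated and the swarm operation resumes, as in the flowchart of Figure 1.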
3. Experimental Study
Eleven test functions are adopted in this study to test the performance of the proposed IEPSO. In this test, f1–f5 are unimodal functions, whereas f6–f11 are multimodal functions. f6 (Griewank) is a multimodal function with multiple local extrema, in which achieving the theoretical global optimum is difficult. f7 (Rastrigin) possesses several local minima, in which finding the global optimal value is difficult. f10 (Ackley) is an almost flat area modulated by a cosine wave to form holes or peaks; the surface is uneven, and entry into a local optimum during optimization is easy. f11 (Cmfun) possesses multiple local extrema around the global extremum point, and falling into the local optimum is easy. Table 1 presents the 11 test functions, where D is the space dimension, S is the search range, and CF is the theoretically optimal value.

3.1. Parameter Influence Analysis of the Local-Global Information Sharing Term. This study proposes the addition of a local-global information sharing term, which involves the parameter C3. Therefore, the following exploration is conducted on the manner in which C3 is selected, by using the 11 test functions:
(1) When C3 takes a constant value, the constant 2 is selected.
(2) The linear variation formula of C3 is as follows:

C_3 = k·(C_3_start − (C_3_start − C_3_end)·(t/t_max)),    (4)

where k is the control factor: when k = 1, C3 is a linearly decreasing function, and when k = −1, C3 is a linearly increasing function. C_3_start and C_3_end are the initial and termination values of C3, respectively; t is the iteration number; and t_max is the maximum number of iterations.
Tables 2 and 3 and Figure 2 compare the three cases in which C3 is a constant, linearly decreasing, and linearly increasing. When the parameter C3 of the local-global information sharing term is a linearly decreasing function, the average fitness value on the test functions is optimal, and the convergence speed and the capability to jump out of local extrema are higher than those in the other two cases. When C3 takes a constant value, the algorithm cannot balance the global and local search, resulting in a "premature" phenomenon. When C3 adopts the linearly decreasing form, the entire area can be quickly searched at an early stage, and close attention is paid to the local search in the latter part of the iteration to enhance the deep search ability of the algorithm. When C3 adopts a linearly increasing form, it focuses on the global-local information exchange in the latter stage of the iteration. Although this condition can increase the deep search ability of the algorithm, it causes the convergence speed to stagnate. Therefore, compared with the linearly increasing form, the linearly decreasing form shows a simulation curve that converges faster and with higher precision.
Therefore, the selection rules for the parameter C3 of local-global information sharing as a decreasing function are investigated in this study. The nonlinear variation formula of C3 is as follows:
C_3 = (C_3_start − C_3_end) × tan(0.875 × (1 − t/t_max)^k) + C_3_end,    (5)

where C_3_start and C_3_end are the initial and termination values of the acceleration term C3, respectively, and k is the control factor. When k = 0.2, C3 is a convex decreasing function; when k = 2, C3 is a concave decreasing function. t is the iteration number, and t_max is the maximum number of iterations.
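The two C3 schedules in Formulas (4) and (5) can be written down directly. In this sketch the exact placement of the control factor k in the linear formula is a reconstruction from the garbled source text, so treat it as an assumption:

```python
import math

def c3_linear(t, t_max, c_start=2.0, c_end=0.0, k=1):
    """Linear C3 schedule (Formula (4)): k = 1 decreases from c_start to c_end;
    k = -1 gives the linearly increasing variant."""
    return k * (c_start - (c_start - c_end) * (t / t_max))

def c3_nonlinear(t, t_max, c_start=2.0, c_end=0.0, k=0.2):
    """Nonlinear C3 schedule (Formula (5)): k = 0.2 yields a convex decreasing
    curve (slow early decay, fast late decay); k = 2 a concave one."""
    return (c_start - c_end) * math.tan(0.875 * (1.0 - t / t_max) ** k) + c_end
```

With the default C_3_end = 0, both schedules reach 0 at t = t_max; the convex k = 0.2 curve stays high longer, matching the observation that it favors global search early and deep search late.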
Table 4 shows that when C3 is a convex function, the precision and robustness of the algorithm obtain satisfactory results on f1–f5. Table 5 shows that when C3 is a convex function, the algorithm obtains a satisfactory solution and shows a fast convergence rate on f6, f8, f9, f10, and f11. On the unimodal test functions, the IEPSO algorithm does
Table 1: The 11 test functions.

No. | Test function | Formula | D | S | CF
f1 | Sphere | f1(x) = Σ_{i=1}^{D} x_i^2 | 10 | [−100, 100]^D | 0
f2 | Schaffer | f2(x, y) = 0.5 + (sin^2(√(x^2 + y^2)) − 0.5) / (1 + 0.001(x^2 + y^2))^2 | 2 | [−100, 100]^D | 0
f3 | Step | f3(x) = Σ_{i=1}^{D} ⌊x_i + 0.5⌋^2 | 10 | [−100, 100]^D | 0
f4 | SumSquares | f4(x) = Σ_{i=1}^{D} i·x_i^2 | 10 | [−10, 10]^D | 0
f5 | Zakharov | f5(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 | 10 | [−100, 100]^D | 0
f6 | Griewank | f6(x) = (1/4000)·Σ_{i=1}^{D} x_i^2 − Π_{i=1}^{D} cos(x_i/√i) + 1 | 10 | [−600, 600]^D | 0
f7 | Rastrigin | f7(x) = Σ_{i=1}^{D} [x_i^2 − 10·cos(2πx_i) + 10] | 10 | [−5.12, 5.12]^D | 0
f8 | Alpine | f8(x) = Σ_{i=1}^{D} (|x_i·sin(x_i)| + 0.1·x_i) | 6 | [−10, 10]^D | 0
f9 | Shubert | min f9(x, y) = {Σ_{i=1}^{5} i·cos[(i+1)x + i]} × {Σ_{i=1}^{5} i·cos[(i+1)y + i]} | 2 | [−10, 10]^D | −186.731
f10 | Ackley | f10(x) = −20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i^2)) − exp((1/D)·Σ_{i=1}^{D} cos(2πx_i)) + 20 + e | 10 | [−32, 32]^D | 0
f11 | Cmfun | f11(x, y) = x·sin(√|x|) + y·sin(√|y|) | 2 | [−500, 500] | −837.966
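A few of the benchmark functions in Table 1, written out in Python as vectorized sketches of the standard definitions (minimization; the optimum is 0 at the origin for all three):

```python
import numpy as np

def sphere(x):
    """f1 Sphere: smooth and unimodal."""
    return float(np.sum(x ** 2))

def rastrigin(x):
    """f7 Rastrigin: highly multimodal, many regularly spaced local minima."""
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    """f10 Ackley: nearly flat outer region with a deep hole at the origin."""
    d = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d) + 20.0 + np.e)
```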
Table 2: Unimodal test functions.

Function | Criteria | C3 = 2 | C3: 2→0, k = −1 | C3: 2→0, k = 1
f1 | Mean | 7.22E+02 | 1.07E−06 | 4.50E−20
f1 | SD | 3.97E+04 | 1.11E−12 | 3.75E−16
f1 | Best | 4.05E+02 | 2.41E−08 | 1.55E−25
f2 | Mean | 2.50E−06 | 2.22E−17 | 0
f2 | SD | 2.32E−12 | 2.59E−33 | 0
f2 | Best | 2.85E−07 | 0 | 0
f3 | Mean | 1.99E+02 | 8.03E−07 | 1.82E−20
f3 | SD | 2.15E+04 | 1.04E−12 | 1.05E−39
f3 | Best | 35.81 | 3.95E−08 | 3.22E−24
f4 | Mean | 71.10 | 2.45E−08 | 8.20E−20
f4 | SD | 9.57 | 7.95E−16 | 5.11E−38
f4 | Best | 1.47 | 4.08E−09 | 8.43E−26
f5 | Mean | 1.74E+03 | 3.86E−04 | 5.56E−11
f5 | SD | 2.44E+05 | 1.49E−07 | 4.88E−14
f5 | Best | 8.29E+02 | 9.78E−06 | 3.54E−11
Table 3: Multimodal test functions.

Function | Criteria | C3 = 2 | C3: 2→0, k = −1 | C3: 2→0, k = 1
f6 | Mean | 1.10 | 8.18E−02 | 4.92E−02
f6 | SD | 4.6E−03 | 8.37E−04 | 5.96E−04
f6 | Best | 0.96 | 4.33E−02 | 1.23E−02
f7 | Mean | 35.03 | 4.10 | 1.9E−04
f7 | SD | 8.44 | 254.61 | 56.49
f7 | Best | 29.67 | 20.57 | 2.25E−05
f8 | Mean | 2.93 | 1.33E−03 | 5.28E−10
f8 | SD | 0.30 | 3.10E−08 | 2.23E−12
f8 | Best | 2.02 | 1.34E−05 | 5.83E−13
f9 | Mean | −186.7295 | −186.7309 | −186.7309
f9 | SD | 1.20E−06 | 0 | 0
f9 | Best | −186.7307 | −186.7309 | −186.7309
f10 | Mean | 7.649 | 2.35E−04 | 1.84E−11
f10 | SD | 0.415 | 4.39E−09 | 2.27E−22
f10 | Best | 6.513 | 5.73E−05 | 2.50E−12
f11 | Mean | −837.9658 | −837.9658 | −837.9658
f11 | SD | 4.50E−09 | 0 | 4.40E−09
f11 | Best | −837.9658 | −837.9658 | −837.9658
Figure 2: The 11 test functions (fitness value versus iterations for C3 constant, k = 1, and k = −1): (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
Table 4: Unimodal test functions.

Function | Criteria | C3: 2→0, k = 0.2 | C3: 2→0, k = 2 | C3: 2→0, k = 1
f1 | Mean | 2.66E−20 | 5.51E−10 | 4.50E−20
f1 | SD | 2.65E−39 | 2.87E−19 | 3.75E−16
f1 | Best | 9.12E−24 | 1.38E−11 | 1.55E−25
f2 | Mean | 0 | 0 | 0
f2 | SD | 0 | 0 | 0
f2 | Best | 0 | 0 | 0
f3 | Mean | 6.21E−19 | 6.04E−10 | 1.82E−20
f3 | SD | 2.63E−36 | 7.79E−19 | 1.05E−39
f3 | Best | 1.81E−27 | 3.08E−11 | 3.22E−24
f4 | Mean | 1.70E−21 | 2.42E−11 | 8.20E−20
f4 | SD | 1.31E−41 | 4.40E−22 | 5.11E−38
f4 | Best | 2.82E−29 | 4.36E−12 | 8.43E−26
f5 | Mean | 1.65E−10 | 2.83E−11 | 5.56E−11
f5 | SD | 3.30E−20 | 3.59E−11 | 4.88E−14
f5 | Best | 2.17E−11 | 1.00E−11 | 3.54E−11
not show its advantages because of its strong deep search capability. On the complex multimodal test functions, when the convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the downward speed increases in the later stage, thus benefiting the local search. When the concave function is used for C3, the descent speed is fast in the early stage. Although the search speed is improved, the coverage area of the search is reduced, thereby leading to the convergence of the algorithm to a nonoptimal value. From the simulation diagrams (f)–(k), the convergence speed is observed to be slightly slow when C3 is a convex function, but its ability to jump out of the local extremum and the accuracy of the global search are higher than those in the other two cases. When C3 is a concave function, the convergence speed is faster than those in the other two cases, and the search accuracy is lower than that when C3 is a convex function.
3.2. Comparison of Test Results. The 11 test functions in Table 1 are used to compare the IEPSO algorithm with the classical PSO, SPSO, differential evolution (DE) algorithm, and GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms with population iterations. The evaluation criteria of algorithm performance include the speed of convergence and the size of individual population search coverage. The differential optimization algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertial weight (Figures 3 and 4).
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value among the five algorithms is shown in bold. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
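The evaluation protocol above (10 independent runs per function; mean, standard deviation, and best recorded) can be sketched as a small harness. Here `optimizer` is a hypothetical callable that performs one full run and returns the best fitness found:

```python
import numpy as np

def benchmark(optimizer, func, runs=10, **kwargs):
    """Run `optimizer` independently `runs` times on `func` and report the
    statistics used in Tables 7 and 8: mean, standard deviation (SD), and
    best (minimum) final fitness."""
    results = np.array([optimizer(func, **kwargs) for _ in range(runs)])
    return {"Mean": float(results.mean()),
            "SD": float(results.std()),
            "Best": float(results.min())}
```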
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4. The IEPSO algorithm obtains the theoretical optimal value on f2. DE can search the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms due to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the latter stage because of population differences. The
Figure 3: The change curve of C3 with the number of iterations (k = 2, k = 0.2, and k = 1).
Table 5: Multimodal test functions.

Function | Criteria | C3: 2→0, k = 0.2 | C3: 2→0, k = 2 | C3: 2→0, k = 1
f6 | Mean | 4.19E−02 | 4.79E−02 | 4.92E−02
f6 | SD | 3.43E−04 | 7.07E−04 | 5.96E−04
f6 | Best | 1.25E−02 | 5.7E−03 | 1.23E−02
f7 | Mean | 4.46E−03 | 5.00E−05 | 1.9E−04
f7 | SD | 1.73E−04 | 3.03E−06 | 56.49
f7 | Best | 2.31E−12 | 3.89E−11 | 2.25E−05
f8 | Mean | 2.42E−10 | 3.74E−10 | 5.28E−10
f8 | SD | 6.74E−20 | 2.47E−12 | 2.23E−12
f8 | Best | 3.71E−16 | 4.36E−11 | 5.83E−13
f9 | Mean | −186.7309 | −186.7309 | −186.7309
f9 | SD | 0 | 0 | 0
f9 | Best | −186.7309 | −186.7309 | −186.7309
f10 | Mean | 1.13E−11 | 2.05E−10 | 1.84E−11
f10 | SD | 2.21E−22 | 4.37E−12 | 2.27E−22
f10 | Best | 5.06E−14 | 1.75E−10 | 2.50E−12
f11 | Mean | −837.9658 | −837.9658 | −837.9658
f11 | SD | 0 | 0 | 4.40E−09
f11 | Best | −837.9658 | −837.9658 | −837.9658
Figure 4: The 11 test functions (fitness value versus iterations for k = 0.2, k = 2, and k = 1): (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
simulation diagrams (a)–(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11 and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm can obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles in the population are stochastic when the local optimal solution converges gradually. The IEPSO algorithm can jump out of local extrema points in the face of complex multimodal test functions, and the number of iterations required is correspondingly reduced.
Table 6: Parameter settings.

Algorithm | Population | Maximum iterations | Dim of each object | Others
PSO | 40 | 1000 | 10 | C1 = C2 = 2, R1 = R2 = 0.5
SPSO | 40 | 1000 | 10 | ω: 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE | 40 | 1000 | 10 | —
GA | 40 | 1000 | 10 | GGAP = 0.5, PRECI = 25
IEPSO | 40 | 1000 | 10 | ω: 0.9–0.4, C1 = C2 = 2, C3: 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Function | Criteria | PSO | SPSO | DE | IEPSO | GA
f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E−12 | 8.92E−22 | 116.96
f1 | SD | 2.53E+05 | 1.21E+06 | 2.25E−23 | 2.65E−39 | 441.92
f1 | Best | 1.14E+03 | 1.20E+03 | 2.42E−12 | 7.72E−27 | 46.60
f2 | Mean | 2.96E−02 | 8.80E−02 | 8.37E−06 | 0 | 1.79E−11
f2 | SD | 8.36E−04 | 8.96E−04 | 1.58E−10 | 0 | 0
f2 | Best | 4.55E−03 | 8.428734 | 7.55E−10 | 0 | 1.79E−11
f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E−11 | 6.21E−19 | 74.30
f3 | SD | 2.93E+05 | 1.82E+06 | 9.95E−23 | 2.63E−36 | 58.33
f3 | Best | 1.06E+03 | 2.82E−02 | 2.10E−12 | 1.81E−27 | 45.42
f4 | Mean | 82.38 | 82.10 | 3.36E−13 | 1.70E−21 | 30.31
f4 | SD | 6.86E+02 | 1.40E+03 | 9.95E−26 | 1.31E−41 | 0.835
f4 | Best | 1.15E+02 | 37.39 | 1.15E−13 | 2.82E−29 | 19.68
f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E−12 | 1.65E−10 | 3.62E+03
f5 | SD | 2.06E+07 | 2.15E+07 | 1.81E−23 | 3.30E−20 | 3.44E+05
f5 | Best | 1.04E+04 | 1.30E+02 | 2.67E−12 | 2.17E−11 | 2.53E+03
Table 8: Multimodal test functions.

Function | Criteria | PSO | SPSO | DE | IEPSO | GA
f6 | Mean | 1.548 | 1.752 | 9.44E−02 | 4.19E−02 | 1.006
f6 | SD | 0.026 | 0.093 | 4.87E−04 | 3.43E−04 | 0.018
f6 | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794
f7 | Mean | 57.737 | 43.405 | 11.945 | 4.46E−03 | 8.939
f7 | SD | 117.768 | 65.178 | 16.502 | 1.73E−04 | 3.608
f7 | Best | 35.981 | 3.17E+01 | 6.398 | 2.31E−12 | 5.040
f8 | Mean | 4.996 | 4.665 | 3.79E−02 | 2.42E−10 | 0.423
f8 | SD | 1.91E+00 | 1.056 | 5.4E−03 | 6.74E−20 | 0.051
f8 | Best | 2.933 | 3.151 | 4.6E−03 | 3.71E−16 | 0.086
f9 | Mean | −186.448 | −186.048 | −186.728 | −186.731 | −186.731
f9 | SD | 1.19E−01 | 9.83E−01 | 2.29E−08 | 0 | 9.99E−12
f9 | Best | −1.87E+02 | −186.731 | −186.7309 | −186.7309 | −186.731
f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E−11 | 2.515
f10 | SD | 14.260 | 2.163 | 0 | 2.21E−22 | 0.166
f10 | Best | 2.861 | 12.719 | 1.613 | 5.06E−14 | 1.796
f11 | Mean | −740.326 | −715.438 | −837.966 | −837.966 | −837.966
f11 | SD | 8.74E+03 | 7.23E+03 | 0 | 0 | 0
f11 | Best | −837.966 | −837.697 | −837.966 | −837.966 | −837.966
Computational Intelligence and Neuroscience 13
[Convergence curves of DE, GA, PSO, SPSO, and IEPSO; fitness value versus iterations.]

Figure 5: Unimodal functions. (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function.
[Convergence curves of DE, GA, PSO, SPSO, and IEPSO; fitness value versus iterations.]

Figure 6: Multimodal functions. (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4 Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, that is, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.

(2) The standard test functions are used to examine the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 is linearly decreasing. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.

(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity, so that the PSO avoids stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term to the IEPSO, the proposed algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows an ideal global optimization performance and a high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system
Table 9: Test results of the three improved particle swarm algorithms.

Functions | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26]
f1 | Mean | 8.92E-22 | 4.73E-10 | 42.40
f1 | SD | 2.65E-39 | 1.81E-09 | 52.11
f3 | Mean | 6.21E-19 | 2.37E+03 | 7.61
f3 | SD | 2.63E-36 | 5.71E+02 | 0.07
f6 | Mean | 4.19E-02 | 8.66E-05 | —
f6 | SD | 3.43E-04 | 2.96E-04 | —
f7 | Mean | 4.46E-03 | 9.15E+01 | 76.18
f7 | SD | 1.73E-04 | 1.80E+01 | 26.75
f8 | Mean | 2.42E-10 | 1.31E+02 | —
f8 | SD | 6.74E-20 | 5.82E+01 | —
f10 | Mean | 1.13E-11 | 1.01E+00 | 1.72
f10 | SD | 2.21E-22 | 2.71E-01 | 0
identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
indicates an improved search performance relative to other natural heuristic optimization algorithms, it also easily falls into the local optimum. Chen [25] proposed a dynamic multiswarm differential learning PSO (DMSDL-PSO) algorithm, in which the differential evolution method is applied to each subgroup, combined with a differential mutation method to conduct a global search, and a quasi-Newton method is applied for the local search. The DMSDL-PSO algorithm has good exploration and exploitation capabilities. Jiang [26] proposed a new binary hybrid PSO with wavelet mutation (HPSOWM), in which the motion mechanism and mutation process of particles are converted into binary elements, and the problem is transformed from a continuous-space problem into a discrete-domain one. Although the convergence speed of the HPSOWM algorithm is stable and robust, its convergence rate is lower than those of other intelligent optimization algorithms. To solve the dynamic multiobjective optimization problem with rapid environmental change, a study proposed a cooperative multiswarm PSO for dynamic multiobjective optimization (CMPSODMO) [27]. In comparison with other dynamic multiobjective optimization algorithms, CMPSODMO indicates a better effect in addressing uncertain, rapid environmental changes. Ye [28] proposed a multiswarm PSO algorithm with dynamic learning strategies, in which the population is divided into ordinary and communication particles. The dynamic communication information of the communication particles was applied to the algorithm to maintain particle population diversity. Although this method improves the capability of the algorithm to handle complex multimodal functions, it increases the computational complexity of the algorithm. Cui [29] proposed a globally optimal prediction-based adaptive mutation PSO (GPAM-PSO) to avoid the local optimal problem of traditional PSO algorithms. However, GPAM-PSO is limited to the dimensionality reduction of nonzero-mean data. Zhang [30] proposed a vector covariance PSO algorithm that divides all the dimensions of a particle into several parts randomly and optimizes each part to enhance the global and local search capabilities. However, the algorithm continues to fall into local extrema.
PSO has attracted considerable research attention because of its easy implementation, few parameter adjustments, and adaptability. Scholars use PSO to solve engineering optimization problems, and it has gradually penetrated various fields of application, such as parameter optimization, path planning, predictive control, and global optimization. Zhao [31] used PSO to optimize wavelet neural network parameters to reduce the limitations of the assessment of network security situational awareness and thereby meet the requirements of network security in a big data environment. The parameter-related coefficients in a nonlinear regression analysis model were optimized by combining particle swarm with a genetic phase [32] to reduce the vibrations caused by mine blasting that damage the structures around the blasting area. The derived diffusion-free PSO algorithm was used to estimate the parameters of an infinite impulse response system and improve the energy utilization of an infinite sensor network [33]. Wang [34] used a multiobjective PSO algorithm to solve a path-planning problem of mobile robots in a static rough terrain environment. Wang [35] combined PSO with chaos optimization theory to establish a mathematical model of a path-planning problem in the radioactive environment of nuclear facilities to ensure personnel safety. Lopes [36] proposed a novel particle swarm-based heuristic technique to allocate electrical loads in an industrial setting throughout the day. Multiobjective PSO was used to solve the problem of service allocation in cloud computing [37]. Petrovic [38] proposed chaos PSO to achieve an optimal process dispatching plan. Zhang [39] proposed an adaptive PSO to solve problems in reservoir operation optimization with complex and dynamic nonlinear constraints.

An improved PSO algorithm was used for the time-series prediction of a grayscale model [40]. The algorithm reduces the average relative error between the recovered and measured values of the model to avoid the problems caused by the optimization of background values. Gulcu [41] used PSO to establish a power demand forecasting model.

In view of the aforementioned methods, an improved PSO (IEPSO) algorithm is proposed in the present work. In the IEPSO, the last-eliminated principle is used to update the population and maintain particle population diversity. The global search capability of the IEPSO algorithm is improved by adding the local-global information sharing term. A multigroup test function is used for comparison with the IEPSO: a classical optimization algorithm and its improved versions are used to test and verify the global optimization performance of the IEPSO algorithm.
2 IEPSO
2.1. Standard PSO. The initial population of the PSO algorithm is randomized. The IEPSO updates the position and speed of the particle swarm by adaptive learning, as shown in the following formulas:

v_id^(t+1) = ω·v_id^t + C1·R1·(p_id^t − x_id^t) + C2·R2·(p_gd^t − x_id^t),
x_id^(t+1) = x_id^t + v_id^(t+1),    (1)

where ω is the inertial weight; C1 and C2 are the acceleration terms; R1 and R2 are random variables uniformly distributed in the range (0, 1); p_gd^t is the global best position; p_id^t is the best position that the particle has found in its history; x_id^t is the particle position at the current iteration; and v_id^(t+1) is the updated particle speed at the next iteration.
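As an illustrative sketch (not the authors' code), the update of Formula (1) can be written in a few lines of Python; the array shapes and default parameter values are assumptions:

```python
import numpy as np

def pso_step(x, v, p_best, g_best, w=0.9, c1=2.0, c2=2.0):
    """One standard PSO update, Formula (1).

    x, v   : (n, d) arrays of particle positions and velocities
    p_best : (n, d) personal best positions
    g_best : (d,)   global best position
    """
    r1 = np.random.rand(*x.shape)  # R1 ~ U(0, 1)
    r2 = np.random.rand(*x.shape)  # R2 ~ U(0, 1)
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    x_new = x + v_new              # position update
    return x_new, v_new
```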
2.2. IEPSO. The IEPSO algorithm is mainly based on the last-eliminated principle and enhances the local-global information sharing capability to improve its global optimization performance. The specific implementation of the IEPSO algorithm is shown in Figure 1.

The positions and velocities of the particles in a population are randomly initialized, and the fitness values of the particles are calculated. Information on the current individual and global optimal particles, including their positions and fitness values, is saved. Then, the particle swarm operation is conducted. In the IEPSO algorithm, Formula (2) is used to update the speed to balance the exploration and exploitation capabilities of the particles in the global optimization process, and Formula (3) is the local-global information sharing term.
v_id^(t+1) = ω·v_id^t + C1·R1·(p_id^t − x_id^t) + C2·R2·(p_gd^t − x_id^t) + C3·R3·|p_gd^t − p_id^t|,    (2)

φ3 = C3·R3·|p_gd^t − p_id^t|.    (3)
Formula (2) comprises four parts, namely, the inheritance of the previous speed, particle self-cognition, local information sharing, and "local-global information sharing". The IEPSO algorithm is thus not limited to one-way communication between the global and individual particles; the local-global information sharing term (φ3) is added to the algorithm.
[Flowchart: parameter initialization, particle swarm operation, boundary handling, last-eliminated population update, and local/global convergence checks.]

Figure 1: IEPSO algorithm flowchart.
Through φ3, the local optimal and global optimal particles obtained at the current iteration exchange information, and the population velocity is updated by Formula (2). In the early stage of the algorithm, the entire search space is searched at a relatively high speed to determine the approximate range of the optimal solution, which benefits the global search. In the latter stage, the particle search space is gradually reduced and concentrated in the neighborhood of the optimal value for a deep search, which benefits the local search.
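A minimal sketch of the velocity update of Formulas (2) and (3) follows; shapes and default coefficient values are assumptions, not prescriptions from the paper:

```python
import numpy as np

def iepso_velocity(x, v, p_best, g_best, w, c1=2.0, c2=2.0, c3=2.0):
    """IEPSO velocity update, Formula (2): the two standard PSO terms
    plus the local-global information sharing term phi3, Formula (3)."""
    r1 = np.random.rand(*x.shape)
    r2 = np.random.rand(*x.shape)
    r3 = np.random.rand(*x.shape)
    phi3 = c3 * r3 * np.abs(g_best - p_best)   # Formula (3)
    return (w * v + c1 * r1 * (p_best - x)
            + c2 * r2 * (g_best - x) + phi3)
```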
Particles that do not exceed the predetermined range after the speed update retain their original speed; a particle that moves beyond the predetermined range after the speed update is assigned the maximum velocity. Particles that do not exceed the predetermined range after the position update retain their original positions. When particles move beyond the predetermined range, inferior particles are eliminated, and new particles are added to the population within the predetermined range, thereby forming a new population. The fitness values of the new population are recalculated, and the information of the individual particles and the global optimal position and fitness value obtained at the current iteration are preserved. In all such algorithms, particles have good global search capability at the beginning of the iteration, and as individual particles move closer to the local optimal particle, the algorithms gradually lose particle diversity. On the basis of the population variation idea of the traditional genetic algorithm (GA), the last-eliminated principle is applied in the IEPSO algorithm to maintain particle population diversity. When the PSO satisfies the local convergence condition, the optimal value obtained at this time may be a local optimum. Particle population diversity is maintained by using the particle fitness function as the evaluation criterion, thereby eliminating particles with poor fitness or high similarity. New particles are added to the population within the predetermined range, and the particle swarm operations are re-executed. If the current iteration reaches the predefined convergence accuracy, the iteration stops and the optimal solution is produced. The complexity and runtime of the algorithm increase because of the added local-global information sharing and the last-eliminated principle. Nevertheless, the experimental results show that the improved method enhances the accuracy of the algorithm.
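The last-eliminated replacement described above can be sketched as follows; the eliminated fraction `elim_frac` is a hypothetical parameter, since the text does not fix how many inferior particles are replaced per update:

```python
import numpy as np

def last_eliminated(x, fitness_fn, lb, ub, elim_frac=0.2, rng=None):
    """Replace the worst-fitness particles with fresh random particles
    drawn inside [lb, ub] to maintain population diversity (minimization)."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = x.shape
    fit = np.apply_along_axis(fitness_fn, 1, x)
    n_elim = max(1, int(elim_frac * n))
    worst = np.argsort(fit)[-n_elim:]   # indices of the worst particles
    x = x.copy()
    x[worst] = rng.uniform(lb, ub, size=(n_elim, dim))
    return x
```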
3 Experimental Study
Eleven test functions are adopted in this study to test the performance of the proposed IEPSO. In this test, f1–f5 are unimodal functions, whereas f6–f11 are multimodal functions. f6 (Griewank) is a multimodal function with multiple local extrema, on which achieving the theoretical global optimum is difficult. f7 (Rastrigin) possesses several local minima, among which finding the global optimal value is difficult. f10 (Ackley) is an almost flat area modulated by a cosine wave to form holes or peaks; the surface is uneven, and entry into a local optimum during optimization is easy. f11 (Cmfun) possesses multiple local extrema around the global extremum point, into which falling is easy. Table 1 presents the 11 test functions, where D is the space dimension, S is the search range, and CF is the theoretically optimal value.

3.1. Parameter Influence Analysis of the Local-Global Information Sharing Term. This study proposes the addition of a local-global information sharing term, which involves the parameter C3. Therefore, the manner in which C3 is selected is explored by using the 11 test functions:
(1) When C3 takes a constant value, the constant 2 is selected.

(2) The linear variation formula of C3 is as follows:
C3 = k·[C3_start − (C3_start − C3_end)·(t/t_max)],    (4)

where k is the control factor: when k = 1, C3 is a linearly decreasing function, and when k = −1, C3 is a linearly increasing function. C3_start and C3_end are the initial and termination values of C3, respectively; t is the iteration number; and t_max is the maximum number of iterations.
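A sketch of the linear schedule of Formula (4); the handling of k is reconstructed from the text (k = 1 decreasing, k = −1 increasing) and is therefore an assumption:

```python
def c3_linear(t, t_max, c3_start=2.0, c3_end=0.0, k=1):
    """Linear variation of C3, Formula (4)."""
    if k == 1:   # linearly decreasing: C3_start -> C3_end
        return c3_start - (c3_start - c3_end) * t / t_max
    else:        # k = -1, linearly increasing: C3_end -> C3_start
        return c3_end + (c3_start - c3_end) * t / t_max
```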
Tables 2 and 3 and Figure 2 show the results for the three cases in which C3 is a constant, linearly decreasing, and linearly increasing. When the parameter C3 of the local-global information sharing term is a linearly decreasing function, the average fitness value on the test functions is optimal, and the convergence speed and the capability to jump out of local extrema are higher than in the other two cases. When C3 takes a constant value, the algorithm cannot balance the global and local search, resulting in a "premature" phenomenon. When C3 adopts the linearly decreasing form, the entire area can be searched quickly at an early stage, and close attention is paid to the local search in the latter part of the iteration to enhance the deep search ability of the algorithm. When C3 adopts a linearly increasing form, the algorithm focuses on global-local information exchange in the latter stage of the iteration; although this condition can increase the deep search ability of the algorithm, it causes the convergence speed to stagnate. Therefore, compared with the linearly increasing form, the linearly decreasing form yields a simulation curve that converges faster and with higher precision.
Therefore, the selection rules of the parameter C3 of local-global information sharing as a decreasing function are investigated in this study. The nonlinear variation formula of C3 is as follows:

C3 = (C3_start − C3_end)·tan(0.875·(1 − t/t_max)^k) + C3_end,    (5)

where C3_start and C3_end are the initial and termination values of the acceleration term C3, respectively, and k is the control factor: when k = 0.2, C3 is a convex decreasing function, and when k = 2, C3 is a concave decreasing function. t is the iteration number, and t_max is the maximum number of iterations.
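The nonlinear schedule of Formula (5), as reconstructed above, can be sketched as follows (note that tan(0.875) ≈ 1.20, so the schedule starts slightly above C3_start):

```python
import math

def c3_nonlinear(t, t_max, c3_start=2.0, c3_end=0.0, k=0.2):
    """Nonlinear decreasing C3, Formula (5).
    k = 0.2 gives a convex decreasing curve, k = 2 a concave one."""
    return ((c3_start - c3_end)
            * math.tan(0.875 * (1.0 - t / t_max) ** k) + c3_end)
```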
Table 4 shows that when C3 is a convex function, the precision and robustness of the algorithm obtain satisfactory results on f1–f5. Table 5 shows that when C3 is a convex function, the algorithm obtains a satisfactory solution and shows a fast convergence rate on f6, f8, f9, f10, and f11. On the unimodal test functions, the IEPSO algorithm does
Table 1: The 11 test functions.

No. | Test function | S | CF
f1 | Sphere: f1(x) = Σ_{i=1}^{D} x_i^2 [10] | [-100, 100]^D | 0
f2 | Schaffer: f2(x, y) = 0.5 + (sin^2(√(x^2 + y^2)) − 0.5) / (1 + 0.001(x^2 + y^2))^2 [33] | [-100, 100]^D | 0
f3 | Step: f3(x) = Σ_{i=1}^{D} [x_i + 0.5]^2 [10] | [-100, 100]^D | 0
f4 | SumSquares: f4(x) = Σ_{i=1}^{D} i·x_i^2 [10] | [-10, 10]^D | 0
f5 | Zakharov: f5(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 [10] | [-100, 100]^D | 0
f6 | Griewank: f6(x) = (1/4000)·Σ_{i=1}^{D} x_i^2 − Π_{i=1}^{D} cos(x_i/√i) + 1 [10] | [-600, 600]^D | 0
f7 | Rastrigin: f7(x) = Σ_{i=1}^{D} [x_i^2 − 10·cos(2πx_i) + 10] [10] | [-5.12, 5.12]^D | 0
f8 | Alpine: f8(x) = Σ_{i=1}^{D} (|x_i·sin(x_i)| + 0.1·x_i) [6] | [-10, 10]^D | 0
f9 | Shubert: min f9(x, y) = {Σ_{i=1}^{5} i·cos[(i + 1)x + i]} × {Σ_{i=1}^{5} i·cos[(i + 1)y + i]} | [-10, 10]^D | -186.731
f10 | Ackley: f10(x) = −20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i^2)) − exp((1/D)·Σ_{i=1}^{D} cos(2πx_i)) + 20 + e [10] | [-32, 32]^D | 0
f11 | Cmfun: f11(x, y) = x·sin(√|x|) + y·sin(√|y|) | [-500, 500] | -837.966
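Three of the benchmarks in Table 1, written out for concreteness (an illustrative sketch; the dimension D is taken as the length of the input vector):

```python
import numpy as np

def sphere(x):      # f1: unimodal, optimum 0 at x = 0
    return np.sum(x ** 2)

def rastrigin(x):   # f7: many local minima, optimum 0 at x = 0
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10)

def ackley(x):      # f10: nearly flat surface with a deep hole at x = 0
    d = x.size
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / d)
            + 20 + np.e)
```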
Table 2: Unimodal test functions.

Functions | Criteria | C3 = 2 | C3 = 2–0, k = -1 | C3 = 2–0, k = 1
f1 | Mean | 7.22E+02 | 1.07E-06 | 4.50E-20
f1 | SD | 3.97E+04 | 1.11E-12 | 3.75E-16
f1 | Best | 4.05E+02 | 2.41E-08 | 1.55E-25
f2 | Mean | 2.50E-06 | 2.22E-17 | 0
f2 | SD | 2.32E-12 | 2.59E-33 | 0
f2 | Best | 2.85E-07 | 0 | 0
f3 | Mean | 1.99E+02 | 8.03E-07 | 1.82E-20
f3 | SD | 2.15E+04 | 1.04E-12 | 1.05E-39
f3 | Best | 35.81 | 3.95E-08 | 3.22E-24
f4 | Mean | 71.10 | 2.45E-08 | 8.20E-20
f4 | SD | 9.57 | 7.95E-16 | 5.11E-38
f4 | Best | 1.47 | 4.08E-09 | 8.43E-26
f5 | Mean | 1.74E+03 | 3.86E-04 | 5.56E-11
f5 | SD | 2.44E+05 | 1.49E-07 | 4.88E-14
f5 | Best | 8.29E+02 | 9.78E-06 | 3.54E-11
Table 3: Multimodal test functions.

Functions | Criteria | C3 = 2 | C3 = 2–0, k = -1 | C3 = 2–0, k = 1
f6 | Mean | 1.10 | 8.18E-02 | 4.92E-02
f6 | SD | 4.6E-03 | 8.37E-04 | 5.96E-04
f6 | Best | 0.96 | 4.33E-02 | 1.23E-02
f7 | Mean | 35.03 | 4.10 | 1.9E-04
f7 | SD | 8.44 | 254.61 | 56.49
f7 | Best | 29.67 | 20.57 | 2.25E-05
f8 | Mean | 2.93 | 1.33E-03 | 5.28E-10
f8 | SD | 0.30 | 3.10E-08 | 2.23E-12
f8 | Best | 2.02 | 1.34E-05 | 5.83E-13
f9 | Mean | -186.7295 | -186.7309 | -186.7309
f9 | SD | 1.20E-06 | 0 | 0
f9 | Best | -186.7307 | -186.7309 | -186.7309
f10 | Mean | 7.649 | 2.35E-04 | 1.84E-11
f10 | SD | 0.415 | 4.39E-09 | 2.27E-22
f10 | Best | 6.513 | 5.73E-05 | 2.50E-12
f11 | Mean | -837.9658 | -837.9658 | -837.9658
f11 | SD | 4.50E-09 | 0 | 4.40E-09
f11 | Best | -837.9658 | -837.9658 | -837.9658
[Convergence curves (fitness value versus iterations) and surface plots for C3 constant, k = 1, and k = -1.]

Figure 2: The 11 test functions. (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
Table 4: Unimodal test functions.

Functions   Criteria   C3 = 2~0, k = 0.2   C3 = 2~0, k = 2   C3 = 2~0, k = 1
f1          Mean       2.66E−20            5.51E−10          4.50E−20
            SD         2.65E−39            2.87E−19          3.75E−16
            Best       9.12E−24            1.38E−11          1.55E−25
f2          Mean       0                   0                 0
            SD         0                   0                 0
            Best       0                   0                 0
f3          Mean       6.21E−19            6.04E−10          1.82E−20
            SD         2.63E−36            7.79E−19          1.05E−39
            Best       1.81E−27            3.08E−11          3.22E−24
f4          Mean       1.70E−21            2.42E−11          8.20E−20
            SD         1.31E−41            4.40E−22          5.11E−38
            Best       2.82E−29            4.36E−12          8.43E−26
f5          Mean       1.65E−10            2.83E−11          5.56E−11
            SD         3.30E−20            3.59E−11          4.88E−14
            Best       2.17E−11            1.00E−11          3.54E−11
not show its advantages because of its strong deep search capability. In the complex multimodal test functions, when the convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the downward speed increases in the later stage, thus benefiting the local search. When the concave function is used for C3, the descent speed is fast in the early stage. Although the search speed is improved, the coverage area of the search is reduced, thereby leading to the convergence of the algorithm to a nonoptimal value. From the simulation diagrams (f)–(k), the convergence speed is observed to be slightly slow when C3 is a convex function, but its ability to jump out of the local extremum and the accuracy of the global search are higher than those in the other two cases. When C3 is a concave function, the convergence speed is faster than those in the other two cases, but the search accuracy is lower than that when C3 is a convex function.
3.2. Comparison of Test Results. The 11 test functions in Table 1 are used to compare the IEPSO algorithm with the classical PSO, SPSO, differential evolution (DE), and GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms with population iterations. The evaluation criteria of algorithm performance include the speed of convergence and the size of the individual population search coverage. The differential evolution algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value of the five algorithms is shown in boldface. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4. The IEPSO algorithm obtains the theoretical optimal value on f2. DE can search the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms due to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the latter stage because of population differences. The
Figure 3: The change curve of C3 with the number of iterations for k = 2, k = 0.2, and k = 1.
Table 5: Multimodal test functions.

Functions   Criteria   C3 = 2~0, k = 0.2   C3 = 2~0, k = 2   C3 = 2~0, k = 1
f6          Mean       4.19E−02            4.79E−02          4.92E−02
            SD         3.43E−04            7.07E−04          5.96E−04
            Best       1.25E−02            5.7E−03           1.23E−02
f7          Mean       4.46E−03            5.00E−05          1.9E−04
            SD         1.73E−04            3.03E−06          56.49
            Best       2.31E−12            3.89E−11          2.25E−05
f8          Mean       2.42E−10            3.74E−10          5.28E−10
            SD         6.74E−20            2.47E−12          2.23E−12
            Best       3.71E−16            4.36E−11          5.83E−13
f9          Mean       −186.7309           −186.7309         −186.7309
            SD         0                   0                 0
            Best       −186.7309           −186.7309         −186.7309
f10         Mean       1.13E−11            2.05E−10          1.84E−11
            SD         2.21E−22            4.37E−12          2.27E−22
            Best       5.06E−14            1.75E−10          2.50E−12
f11         Mean       −837.9658           −837.9658         −837.9658
            SD         0                   0                 4.40E−09
            Best       −837.9658           −837.9658         −837.9658
Figure 4: Convergence curves (fitness value versus iterations) on the 11 test functions for C3 decreasing with k = 0.2, k = 2, and k = 1: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
simulation diagrams (a)–(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although the GA and the IEPSO algorithm can both obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles in the population are stochastic while the local optimal solution converges gradually. The IEPSO algorithm can jump out of local extrema points in the face of complex multimodal test functions, and the number of iterations required is correspondingly reduced.
Table 6: Parameter settings.

Algorithm   Population   Maximum iterations   Dim of each object   Others
PSO         40           1000                 10                   C1 = C2 = 2, R1 = R2 = 0.5
SPSO        40           1000                 10                   ω: 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE          40           1000                 10                   —
GA          40           1000                 10                   GGAP = 0.5, PRECI = 25
IEPSO       40           1000                 10                   ω: 0.9–0.4, C1 = C2 = 2, C3: 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Functions   Criteria   PSO        SPSO       DE         IEPSO      GA
f1          Mean       1.33E+03   3.08E+03   7.31E−12   8.92E−22   116.96
            SD         2.53E+05   1.21E+06   2.25E−23   2.65E−39   441.92
            Best       1.14E+03   1.20E+03   2.42E−12   7.72E−27   46.60
f2          Mean       2.96E−02   8.80E−02   8.37E−06   0          1.79E−11
            SD         8.36E−04   8.96E−04   1.58E−10   0          0
            Best       4.55E−03   8428734    7.55E−10   0          1.79E−11
f3          Mean       1.19E+03   2.51E+03   1.14E−11   6.21E−19   74.30
            SD         2.93E+05   1.82E+06   9.95E−23   2.63E−36   58.33
            Best       1.06E+03   2.82E−02   2.10E−12   1.81E−27   45.42
f4          Mean       82.38      82.10      3.36E−13   1.70E−21   30.31
            SD         6.86E+02   1.40E+03   9.95E−26   1.31E−41   0.835
            Best       1.15E+02   37.39      1.15E−13   2.82E−29   19.68
f5          Mean       1.26E+04   8.60E+03   7.02E−12   1.65E−10   3.62E+03
            SD         2.06E+07   2.15E+07   1.81E−23   3.30E−20   3.44E+05
            Best       1.04E+04   1.30E+02   2.67E−12   2.17E−11   2.53E+03
Table 8: Multimodal test functions.

Functions   Criteria   PSO         SPSO       DE          IEPSO       GA
f6          Mean       1.548       1.752      9.44E−02    4.19E−02    1.006
            SD         0.026       0.093      4.87E−04    3.43E−04    0.018
            Best       1.236       1.417      0.06        0.013       0.794
f7          Mean       57.737      43.405     11.945      4.46E−03    8.939
            SD         117.768     65.178     16.502      1.73E−04    3.608
            Best       35.981      3.17E+01   6.398       2.31E−12    5.040
f8          Mean       4.996       4.665      3.79E−02    2.42E−10    0.423
            SD         1.91E+00    1.056      5.4E−03     6.74E−20    0.051
            Best       2.933       3.151      4.6E−03     3.71E−16    0.086
f9          Mean       −186.448    −186.048   −186.728    −186.731    −186.731
            SD         1.19E−01    9.83E−01   2.29E−08    0           9.99E−12
            Best       −1.87E+02   −186.731   −186.7309   −186.7309   −186.731
f10         Mean       13.134      15.560     1.613       1.13E−11    2.515
            SD         14.260      2.163      0           2.21E−22    0.166
            Best       2.861       12.719     1.613       5.06E−14    1.796
f11         Mean       −740.326    −715.438   −837.966    −837.966    −837.966
            SD         8.74E+03    7.23E+03   0           0           0
            Best       −837.966    −837.697   −837.966    −837.966    −837.966
Figure 5: Convergence curves of the DE, GA, PSO, SPSO, and IEPSO algorithms on the unimodal functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function.
Figure 6: Convergence curves of the DE, GA, PSO, SPSO, and IEPSO algorithms on the multimodal functions: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems using conventional optimization algorithms is difficult. In this study, an improved PSO, that is, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.
(2) The standard test functions are used to examine the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 is linearly decreasing. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the PSO from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary the comparative results of the simulationanalysis reveal that with the application of the last-eliminatedprinciple and the local-global information sharing term to the
IEPSO the proposed algorithm effectively overcomes thedisadvantages of the classical algorithms including theirprecocious convergence and tendency to fall into the localoptimum e IEPSO shows an ideal global optimizationperformance and indicates a high application value for solvingpractical engineering optimization problems
Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest

The authors declare that there are no conflicts of interest.
Acknowledgments

This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics and Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system
Table 9: Test results of three improved particle swarm algorithms.

Functions   Criteria   IEPSO      DMSDL-PSO [25]   BHPSOWM [26]
f1          Mean       8.92E−22   4.73E−10         42.40
            SD         2.65E−39   1.81E−09         52.11
f3          Mean       6.21E−19   2.37E+03         7.61
            SD         2.63E−36   5.71E+02         0.07
f6          Mean       4.19E−02   8.66E−05         —
            SD         3.43E−04   2.96E−04         —
f7          Mean       4.46E−03   9.15E+01         76.18
            SD         1.73E−04   1.80E+01         26.75
f8          Mean       2.42E−10   1.31E+02         —
            SD         6.74E−20   5.82E+01         —
f10         Mean       1.13E−11   1.01E+00         1.72
            SD         2.21E−22   2.71E−01         0
identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm and Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
v_id^(t+1) = ω·v_id^t + C1·R1·(p_id^t − x_id^t) + C2·R2·(p_gd^t − x_id^t) + C3·R3·|p_gd^t − p_id^t|,    (2)

φ3 = C3·R3·|p_gd^t − p_id^t|.    (3)
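As a sketch, the velocity update of Formula (2) can be written in a few lines of Python; the function name, array layout, and per-update random numbers below are our own illustrative assumptions, not the authors' code:

```python
import numpy as np

def iepso_velocity(v, x, p_local, p_global, w, c1, c2, c3, rng):
    """One IEPSO velocity update per Formula (2) (a sketch, not the authors' code).

    v, x               : current velocity and position of one particle (1-D arrays)
    p_local, p_global  : the particle's own best and the swarm's global best positions
    The c3 term is the "local-global information sharing" term phi_3 of Formula (3).
    """
    r1, r2, r3 = rng.random(3)  # R1, R2, R3 in [0, 1); Table 6 fixes them at 0.5
    return (w * v
            + c1 * r1 * (p_local - x)      # particle self-cognition
            + c2 * r2 * (p_global - x)     # local information sharing
            + c3 * r3 * np.abs(p_global - p_local))  # local-global sharing
```

Note that when p_local, p_global, and x coincide, the three attraction terms vanish and the update reduces to the inheritance term ω·v.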
Formula (2) comprises four parts, namely, the inheritance of the previous speed, particle self-cognition, local information sharing, and "local-global information sharing."

The IEPSO algorithm is not limited to one-way communication between global and individual particles. The local-global information sharing term (φ3) is added to the
Figure 1: IEPSO algorithm flowchart. The flowchart proceeds as follows: initial parameter setting (JD1, JD2, SL1, SL2, UB, LB, PopSize, Dim, C1, C2, C3, R1, R2, R3, w, Vmax, MaxIter, X, and V); randomize particle positions and velocities; run the particle swarm operation, updating speed, position, inertia weight, and the acceleration term; if a particle's position or velocity exceeds the set boundary range, eliminate the inferior particle and supplement a new one; solve and save the population fitness value and select the current best particle; if the current particle is the best, replace the local optimum, otherwise give up the new particle; if the local convergence condition is satisfied, apply the last-eliminated principle and update the population; repeat until the global convergence condition is satisfied.
information exchange between the local optimum and global optimal particles obtained by the current iteration, and the population velocity is updated by Formula (2). In the early stage of the algorithm, the entire search space is searched at a relatively high speed to determine the approximate range of the optimal solution, which is beneficial for the global search. In the latter stage, most of the particle search space is gradually reduced and concentrated in the neighborhood of the optimal value for deep search, which is beneficial for the local search.

The particles that have not exceeded the predetermined range after the speed update continue to retain their original speed. The maximum value of the velocity is assigned to a particle that is beyond the predetermined range after the speed is updated. The particles that have not exceeded the predetermined range after the location update continue to retain their original positions. When the particles are beyond the predetermined range, inferior particles are eliminated by adding new particles to the population within the predetermined range, thereby forming a new population. The fitness value of the new population is recalculated, and the information of the individual particle and its global optimal position and fitness value obtained by the current iteration are preserved. In all the algorithms, particles have good global search capability at the beginning of the iteration, and as individual particles move closer to the local optimal particle, the algorithms gradually lose particle diversity. On the basis of the idea of population variation in the traditional genetic algorithm (GA), the last-eliminated principle is applied in the IEPSO algorithm to maintain particle population diversity. When the PSO satisfies the local convergence condition, the optimal value obtained at this time may be a local optimal value. Particle population diversity is maintained by using the particle fitness function as the evaluation criterion, thereby eliminating particles with poor fitness or high similarity. New particles are added to the new population within the predetermined range, and the particle swarm operations are reexecuted. If the current iteration reaches the predefined convergence accuracy, the iteration is stopped, and the optimal solution is produced. The complexity and runtime of the algorithm increase due to the added local-global information sharing and the last-eliminated principle. Nevertheless, experimental results show that the improved method can enhance the accuracy of the algorithm.
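The elimination step described above can be sketched as follows; the replaced fraction `frac` and the uniform re-sampling inside [lb, ub] are illustrative assumptions, since the text only states that particles with poor fitness or high similarity are eliminated and replaced by new random particles:

```python
import numpy as np

def last_eliminated(positions, fitness, lb, ub, frac=0.2, rng=None):
    """Replace the worst-fitness particles with fresh random ones inside
    [lb, ub]; a minimal sketch of the last-eliminated principle (for a
    minimization problem, the largest fitness values are the worst)."""
    rng = rng or np.random.default_rng()
    n, d = positions.shape
    k = max(1, int(frac * n))                 # how many particles to eliminate
    worst = np.argsort(fitness)[-k:]          # indices of the k worst particles
    positions[worst] = rng.uniform(lb, ub, size=(k, d))  # supplement new particles
    return positions
```

Re-sampling the eliminated particles uniformly over the search range keeps the supplementary particles stochastic, which is what maintains population diversity.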
3. Experimental Study

Eleven test functions are adopted in this study to test the performance of the proposed IEPSO. In this test, f1–f5 are unimodal functions, whereas f6–f11 are multimodal functions. f6 (Griewank) is a multimodal function with multiple local extrema in which achieving the theoretical global optimum is difficult. f7 (Rastrigin) possesses several local minima in which finding the global optimal value is difficult. f10 (Ackley) is an almost flat area modulated by a cosine wave to form holes and peaks; the surface is uneven, and entry into a local optimum during optimization is easy. f11 (Cmfun) possesses multiple local extrema around the global extremum point, and falling into the local optimum is easy. Table 1 presents the 11 test functions, where D is the space dimension, S is the search range, and CF is the theoretical optimal value.

3.1. Parameter Influence Analysis of the Local-Global Information Sharing Term. This study proposes the addition of a local-global information sharing term, which involves the parameter C3. Therefore, the following exploration of the manner in which C3 is selected is conducted by using the 11 test functions.
(1) When C3 takes a constant value, the constant 2 is selected.

(2) The linear variation formula of C3 is as follows:

C3 = k·(C3_start − (C3_start − C3_end)·(t/tmax)),    (4)

where k is the control factor: when k = 1, C3 is a linearly decreasing function; when k = −1, C3 is a linearly increasing function. C3_start and C3_end are the initial and termination values of C3, respectively, t is the iteration number, and tmax is the maximum number of iterations.
Tables 2 and 3 and Figure 2 show the results for the three cases in which C3 is a constant, linearly decreasing, and linearly increasing. When the parameter C3 of the local-global information sharing term is a linearly decreasing function, the average fitness value on the test functions is optimal, and the convergence speed and the capability to jump out of local extrema are higher than those in the other two cases. When C3 takes a constant value, the algorithm cannot balance the global and local search, resulting in a "premature" phenomenon. When C3 adopts the linearly decreasing form, the entire area can be quickly searched at an early stage, and close attention is paid to local search in the latter part of the iteration to enhance the deep search ability of the algorithm. When C3 adopts a linearly increasing form, it focuses on global-local information exchange in the latter stage of the iteration; although this can increase the deep search ability of the algorithm, it causes the convergence speed to stagnate. Therefore, compared with the linearly increasing form, the linearly decreasing form yields a simulation curve that converges faster and with higher precision.
Therefore, the selection rules for the parameter C3 of local-global information sharing as a decreasing function are investigated in this study. The nonlinear variation formula of C3 is as follows:

C3 = (C3_start − C3_end)·tan(0.875·(1 − t/tmax)^k) + C3_end,    (5)

where C3_start and C3_end are the initial and termination values of the acceleration term C3, respectively, and k is the control factor: when k = 0.2, C3 is a convex decreasing function; when k = 2, C3 is a concave decreasing function. t is the iteration number, and tmax is the maximum number of iterations.
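Formulas (4) and (5) are straightforward to implement; the sketch below uses our own variable names and the C3_start = 2, C3_end = 0 setting used in the experiments:

```python
import math

def c3_linear(t, t_max, k=1.0, c3_start=2.0, c3_end=0.0):
    """Formula (4): linearly decreasing for k = 1, linearly increasing for k = -1."""
    return k * (c3_start - (c3_start - c3_end) * t / t_max)

def c3_nonlinear(t, t_max, k=0.2, c3_start=2.0, c3_end=0.0):
    """Formula (5): convex decreasing for k = 0.2, concave decreasing for k = 2."""
    return (c3_start - c3_end) * math.tan(0.875 * (1.0 - t / t_max) ** k) + c3_end
```

Both schedules reach C3_end at t = tmax; the argument of tan stays in [0, 0.875], safely below π/2, so the nonlinear schedule decreases monotonically over the run.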
Table 4 shows that when C3 is a convex function, the precision and robustness of the algorithm obtain satisfactory results on f1–f5. Table 5 shows that when C3 is a convex function, the algorithm obtains a satisfactory solution and shows a fast convergence rate on f6, f8, f9, f10, and f11. In the unimodal test functions, the IEPSO algorithm does
Table 1: 11 test functions (S: search range; CF: theoretical optimal value).

No. | Test function | S | CF
f1 | Sphere: f1(x) = Σ_{i=1}^{D} x_i²  [10] | [−100, 100]^D | 0
f2 | Schaffer: f2(x, y) = 0.5 + (sin²√(x² + y²) − 0.5)/(1 + 0.001(x² + y²))²  [33] | [−100, 100]^D | 0
f3 | Step: f3(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)²  [10] | [−100, 100]^D | 0
f4 | SumSquares: f4(x) = Σ_{i=1}^{D} i·x_i²  [10] | [−10, 10]^D | 0
f5 | Zakharov: f5(x) = Σ_{i=1}^{D} x_i² + (Σ_{i=1}^{D} 0.5·i·x_i)² + (Σ_{i=1}^{D} 0.5·i·x_i)⁴  [10] | [−100, 100]^D | 0
f6 | Griewank: f6(x) = (1/4000)·Σ_{i=1}^{D} x_i² − Π_{i=1}^{D} cos(x_i/√i) + 1  [10] | [−600, 600]^D | 0
f7 | Rastrigin: f7(x) = Σ_{i=1}^{D} [x_i² − 10·cos(2πx_i) + 10]  [10] | [−5.12, 5.12]^D | 0
f8 | Alpine: f8(x) = Σ_{i=1}^{D} |x_i·sin(x_i) + 0.1·x_i|  [6] | [−10, 10]^D | 0
f9 | Shubert: min f9(x, y) = {Σ_{i=1}^{5} i·cos[(i+1)x + i]} × {Σ_{i=1}^{5} i·cos[(i+1)y + i]} | [−10, 10]^D | −186.731
f10 | Ackley: f10(x) = −20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i²)) − exp((1/D)·Σ_{i=1}^{D} cos(2πx_i)) + 20 + e  [10] | [−32, 32]^D | 0
f11 | Cmfun: f11(x, y) = x·sin(√|x|) + y·sin(√|y|) | [−500, 500] | −837.966
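Several entries of Table 1 can be implemented in a few lines each; the sketch below covers Sphere (f1), Griewank (f6), and Rastrigin (f7) as given in the table, so their theoretical optimum CF = 0 at the origin can be checked directly.

```python
import math

def sphere(x):
    # f1: simple quadratic bowl; global minimum 0 at the origin
    return sum(xi ** 2 for xi in x)

def griewank(x):
    # f6: quadratic term modulated by a product of cosines (many local extrema)
    quad = sum(xi ** 2 for xi in x) / 4000.0
    osc = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return quad - osc + 1.0

def rastrigin(x):
    # f7: each coordinate adds x^2 - 10*cos(2*pi*x) + 10 (grid of local minima)
    return sum(xi ** 2 - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

origin = [0.0] * 10
# Each of these functions attains its theoretical optimum CF = 0 at the origin.
```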
Table 2: Unimodal test functions.

Function  Criteria  C3 = 2     C3: 2→0, k = −1   C3: 2→0, k = 1
f1        Mean      7.22E+02   1.07E−06          4.50E−20
          SD        3.97E+04   1.11E−12          3.75E−16
          Best      4.05E+02   2.41E−08          1.55E−25
f2        Mean      2.50E−06   2.22E−17          0
          SD        2.32E−12   2.59E−33          0
          Best      2.85E−07   0                 0
f3        Mean      1.99E+02   8.03E−07          1.82E−20
          SD        2.15E+04   1.04E−12          1.05E−39
          Best      35.81      3.95E−08          3.22E−24
f4        Mean      71.10      2.45E−08          8.20E−20
          SD        9.57       7.95E−16          5.11E−38
          Best      1.47       4.08E−09          8.43E−26
f5        Mean      1.74E+03   3.86E−04          5.56E−11
          SD        2.44E+05   1.49E−07          4.88E−14
          Best      8.29E+02   9.78E−06          3.54E−11
Table 3: Multimodal test functions.

Function  Criteria  C3 = 2      C3: 2→0, k = −1   C3: 2→0, k = 1
f6        Mean      1.10        8.18E−02          4.92E−02
          SD        4.6E−03     8.37E−04          5.96E−04
          Best      0.96        4.33E−02          1.23E−02
f7        Mean      35.03       4.10              1.9E−04
          SD        8.44        254.61            56.49
          Best      29.67       20.57             2.25E−05
f8        Mean      2.93        1.33E−03          5.28E−10
          SD        0.30        3.10E−08          2.23E−12
          Best      2.02        1.34E−05          5.83E−13
f9        Mean      −186.7295   −186.7309         −186.7309
          SD        1.20E−06    0                 0
          Best      −186.7307   −186.7309         −186.7309
f10       Mean      7.649       2.35E−04          1.84E−11
          SD        0.415       4.39E−09          2.27E−22
          Best      6.513       5.73E−05          2.50E−12
f11       Mean      −837.9658   −837.9658         −837.9658
          SD        4.50E−09    0                 4.40E−09
          Best      −837.9658   −837.9658         −837.9658
Figure 2: 11 test functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function. (Each panel plots the fitness value against iterations for C3 constant, k = −1, and k = 1, with an inset surface plot of the test function.)
Table 4: Unimodal test functions.

Function  Criteria  C3: 2→0, k = 0.2   C3: 2→0, k = 2   C3: 2→0, k = 1
f1        Mean      2.66E−20           5.51E−10         4.50E−20
          SD        2.65E−39           2.87E−19         3.75E−16
          Best      9.12E−24           1.38E−11         1.55E−25
f2        Mean      0                  0                0
          SD        0                  0                0
          Best      0                  0                0
f3        Mean      6.21E−19           6.04E−10         1.82E−20
          SD        2.63E−36           7.79E−19         1.05E−39
          Best      1.81E−27           3.08E−11         3.22E−24
f4        Mean      1.70E−21           2.42E−11         8.20E−20
          SD        1.31E−41           4.40E−22         5.11E−38
          Best      2.82E−29           4.36E−12         8.43E−26
f5        Mean      1.65E−10           2.83E−11         5.56E−11
          SD        3.30E−20           3.59E−11         4.88E−14
          Best      2.17E−11           1.00E−11         3.54E−11
not show its advantages, given its strong deep search capability. In the complex multimodal test functions, when the convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the downward speed increases in the later stage, thus benefiting the local search. When the concave function is used for C3, the descent speed is fast in the early stage; although the search speed is improved, the coverage area of the search is reduced, thereby leading the algorithm to converge to a nonoptimal value. Simulation diagrams (f)–(k) show that the convergence speed is slightly slow when C3 is a convex function, but its ability to jump out of the local extremum and the accuracy of the global search are higher than in the other two cases. When C3 is a concave function, the convergence speed is faster than in the other two cases, but the search accuracy is lower than when C3 is a convex function.
3.2. Comparison of Test Results. The 11 test functions in Table 1 are used to compare the IEPSO algorithm with classical PSO, SPSO, differential evolution (DE), and the GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms with population iterations. The evaluation criteria of algorithm performance include the speed of convergence and the extent of the population's search coverage. The differential evolution algorithm has low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value among the five algorithms is shown in boldface. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
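The evaluation protocol just described (10 independent runs; mean, standard deviation, and best value reported) can be sketched as a small harness; the function name and dictionary keys below are illustrative.

```python
import statistics

def evaluate(run_once, runs=10):
    """Run an optimizer independently several times and summarize the results.

    `run_once` is any zero-argument callable that returns the best fitness
    of a single run (minimization).
    """
    results = [run_once() for _ in range(runs)]
    return {
        "Mean": statistics.fmean(results),  # average fitness over the runs
        "SD": statistics.stdev(results),    # spread across the runs
        "Best": min(results),               # best value found overall
    }
```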
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4; the IEPSO algorithm obtains the theoretical optimal value on f2. DE can search the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms owing to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search; however, the diversity of the population declines in the latter stage because of population differences. The
Figure 3: The change curve of C3 with the number of iterations (k = 2, k = 0.2, and k = 1).
Table 5: Multimodal test functions.

Function  Criteria  C3: 2→0, k = 0.2   C3: 2→0, k = 2   C3: 2→0, k = 1
f6        Mean      4.19E−02           4.79E−02         4.92E−02
          SD        3.43E−04           7.07E−04         5.96E−04
          Best      1.25E−02           5.7E−03          1.23E−02
f7        Mean      4.46E−03           5.00E−05         1.9E−04
          SD        1.73E−04           3.03E−06         56.49
          Best      2.31E−12           3.89E−11         2.25E−05
f8        Mean      2.42E−10           3.74E−10         5.28E−10
          SD        6.74E−20           2.47E−12         2.23E−12
          Best      3.71E−16           4.36E−11         5.83E−13
f9        Mean      −186.7309          −186.7309        −186.7309
          SD        0                  0                0
          Best      −186.7309          −186.7309        −186.7309
f10       Mean      1.13E−11           2.05E−10         1.84E−11
          SD        2.21E−22           4.37E−12         2.27E−22
          Best      5.06E−14           1.75E−10         2.50E−12
f11       Mean      −837.9658          −837.9658        −837.9658
          SD        0                  0                4.40E−09
          Best      −837.9658          −837.9658        −837.9658
Figure 4: 11 test functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function. (Each panel plots the fitness value against iterations for k = 0.2, k = 2, and k = 1, with an inset magnifying the early iterations.)
simulation diagrams (a)–(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution; therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11 and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm can obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles added to the population are stochastic as the swarm gradually converges toward a local optimal solution. The IEPSO algorithm can jump out of local extrema points in the face of complex multimodal test functions, and the number of iterations required is correspondingly reduced.
Table 6: Parameter settings.

Algorithm  Population  Maximum iterations  Dim of each object  Others
PSO        40          1000                10                  C1 = C2 = 2, R1 = R2 = 0.5
SPSO       40          1000                10                  ω = 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE         40          1000                10                  —
GA         40          1000                10                  GGAP = 0.5, PRECI = 25
IEPSO      40          1000                10                  ω = 0.9–0.4, C1 = C2 = 2, C3 = 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Function  Criteria  PSO        SPSO       DE         IEPSO      GA
f1        Mean      1.33E+03   3.08E+03   7.31E−12   8.92E−22   116.96
          SD        2.53E+05   1.21E+06   2.25E−23   2.65E−39   441.92
          Best      1.14E+03   1.20E+03   2.42E−12   7.72E−27   46.60
f2        Mean      2.96E−02   8.80E−02   8.37E−06   0          1.79E−11
          SD        8.36E−04   8.96E−04   1.58E−10   0          0
          Best      4.55E−03   8428734    7.55E−10   0          1.79E−11
f3        Mean      1.19E+03   2.51E+03   1.14E−11   6.21E−19   74.30
          SD        2.93E+05   1.82E+06   9.95E−23   2.63E−36   58.33
          Best      1.06E+03   2.82E−02   2.10E−12   1.81E−27   45.42
f4        Mean      82.38      82.10      3.36E−13   1.70E−21   30.31
          SD        6.86E+02   1.40E+03   9.95E−26   1.31E−41   0.835
          Best      1.15E+02   37.39      1.15E−13   2.82E−29   19.68
f5        Mean      1.26E+04   8.60E+03   7.02E−12   1.65E−10   3.62E+03
          SD        2.06E+07   2.15E+07   1.81E−23   3.30E−20   3.44E+05
          Best      1.04E+04   1.30E+02   2.67E−12   2.17E−11   2.53E+03
Table 8: Multimodal test functions.

Function  Criteria  PSO        SPSO       DE         IEPSO      GA
f6        Mean      1.548      1.752      9.44E−02   4.19E−02   1.006
          SD        0.026      0.093      4.87E−04   3.43E−04   0.018
          Best      1.236      1.417      0.06       0.013      0.794
f7        Mean      57.737     43.405     11.945     4.46E−03   8.939
          SD        117.768    65.178     16.502     1.73E−04   3.608
          Best      35.981     3.17E+01   6.398      2.31E−12   5.040
f8        Mean      4.996      4.665      3.79E−02   2.42E−10   0.423
          SD        1.91E+00   1.056      5.4E−03    6.74E−20   0.051
          Best      2.933      3.151      4.6E−03    3.71E−16   0.086
f9        Mean      −186.448   −186.048   −186.728   −186.731   −186.731
          SD        1.19E−01   9.83E−01   2.29E−08   0          9.99E−12
          Best      −1.87E+02  −186.731   −186.7309  −186.7309  −186.731
f10       Mean      13.134     15.560     1.613      1.13E−11   2.515
          SD        14.260     2.163      0          2.21E−22   0.166
          Best      2.861      12.719     1.613      5.06E−14   1.796
f11       Mean      −740.326   −715.438   −837.966   −837.966   −837.966
          SD        8.74E+03   7.23E+03   0          0          0
          Best      −837.966   −837.697   −837.966   −837.966   −837.966
Figure 5: Unimodal functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function. (Each panel plots the fitness value against iterations for DE, GA, PSO, SPSO, and IEPSO.)
Figure 6: Multimodal functions: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 Alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function. (Each panel plots the fitness value against iterations for DE, GA, PSO, SPSO, and IEPSO.)
Table 9 shows the test results for three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.
(2) The standard test functions are used to analyze the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 is linearly decreasing. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity; moreover, it prevents the PSO from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term in the IEPSO, the proposed algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into the local optimum. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.

Table 9: Three improved particle swarm algorithm test results.

Function  Criteria  IEPSO      DMSDL-PSO [25]   BHPSOWM [26]
f1        Mean      8.92E−22   4.73E−10         42.40
          SD        2.65E−39   1.81E−09         52.11
f3        Mean      6.21E−19   2.37E+03         7.61
          SD        2.63E−36   5.71E+02         0.07
f6        Mean      4.19E−02   8.66E−05         —
          SD        3.43E−04   2.96E−04         —
f7        Mean      4.46E−03   9.15E+01         76.18
          SD        1.73E−04   1.80E+01         26.75
f8        Mean      2.42E−10   1.31E+02         —
          SD        6.74E−20   5.82E+01         —
f10       Mean      1.13E−11   1.01E+00         1.72
          SD        2.21E−22   2.71E−01         0
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
information exchange between the local optimum and the global optimal particle obtained by the current iteration, and the population velocity is updated by Formula (2). In the early stage of the algorithm, the entire search space is searched at a relatively high speed to determine the approximate range of the optimal solution, which is beneficial for global search. In the latter stage, most of the particle search space is gradually reduced and concentrated in the neighborhood of the optimal value for deep search, which is beneficial for local search.
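Formula (2) itself is not reproduced in this excerpt, so the sketch below assumes the usual PSO velocity update extended with a C3-weighted local-global information sharing term as described above; the function name, the exact form of the added term, and the clamping constant are assumptions for illustration.

```python
def iepso_velocity(v, x, pbest, gbest, w, c1=2.0, c2=2.0, c3=2.0,
                   r1=0.5, r2=0.5, r3=0.5, v_max=1.0):
    """One assumed IEPSO velocity update (per dimension, minimization)."""
    new_v = []
    for vd, xd, pd, gd in zip(v, x, pbest, gbest):
        vd_new = (w * vd
                  + c1 * r1 * (pd - xd)    # cognitive term (personal best)
                  + c2 * r2 * (gd - xd)    # social term (global best)
                  + c3 * r3 * (gd - pd))   # local-global information sharing
        # Velocities that leave the predetermined range are set to the maximum
        new_v.append(max(-v_max, min(v_max, vd_new)))
    return new_v
```

With a large early inertia weight w, the stochastic attraction terms dominate and the swarm roams the whole space; as w and C3 decay, the update concentrates particles near the current optimum for deep search.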
The particles that have not exceeded the predetermined range after the speed update retain their original speed; the maximum velocity is assigned to any particle that exceeds the predetermined range after the speed update. The particles that have not exceeded the predetermined range after the location update retain their original positions. When particles move beyond the predetermined range, inferior particles are eliminated by adding new particles to the population within the predetermined range, thereby forming a new population. The fitness value of the new population is recalculated, and the information of each individual particle and its global optimal position and fitness value obtained by the current iteration are preserved. In all such algorithms, particles have good global search capability at the beginning of the iteration, and as individual particles move closer to the local optimal particle, the algorithms gradually lose particle diversity. On the basis of the population-variation idea of the traditional genetic algorithm (GA), the last-eliminated principle is applied in the IEPSO algorithm to maintain particle population diversity. When the PSO satisfies the local convergence condition, the optimal value obtained at this time may be only a local optimal value. Particle population diversity is maintained by using the particle fitness function as the evaluation criterion, thereby eliminating particles with poor fitness or high similarity. New particles are added to the new population within the predetermined range, and the particle swarm operations are reexecuted. If the current iteration reaches the predefined convergence accuracy, the iteration is stopped and the optimal solution is produced. The complexity and runtime of the algorithm increase because of the added local-global information sharing and the last-eliminated principle; nevertheless, the experimental results show that the improved method enhances the accuracy of the algorithm.
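The last-eliminated step can be sketched as follows; the replacement fraction and the uniform re-initialization within the predetermined range are illustrative assumptions (the paper eliminates particles with poor fitness or high similarity, and only the fitness criterion is shown here).

```python
import random

def apply_last_eliminated(positions, fitness_fn, bounds, frac=0.2, rng=random):
    """Replace the worst-fitness fraction of the swarm with fresh random particles.

    positions: list of particle positions (lists of floats); bounds: (low, high)
    applied to every dimension; fitness_fn is minimized.
    """
    n_replace = max(1, int(frac * len(positions)))
    ranked = sorted(positions, key=fitness_fn)          # best (lowest) first
    survivors = ranked[:len(positions) - n_replace]     # keep the fitter particles
    low, high = bounds
    dim = len(positions[0])
    # Newly added particles are drawn uniformly within the predetermined range,
    # which keeps the population size fixed while restoring diversity
    newcomers = [[rng.uniform(low, high) for _ in range(dim)]
                 for _ in range(n_replace)]
    return survivors + newcomers
```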
3. Experimental Study
Eleven test functions are adopted in this study to test the performance of the proposed IEPSO. In this test, f1–f5 are unimodal functions, whereas f6–f11 are multimodal functions. f6 (Griewank) is a multimodal function with multiple local extrema, in which achieving the theoretical global optimum is difficult. f7 (Rastrigin) possesses several local minima, in which finding the global optimal value is difficult. f10 (Ackley) is an almost flat area modulated by a cosine wave to form holes or peaks; the surface is uneven, and entry into a local optimum during optimization is easy. f11 (Cmfun) possesses multiple local extrema
around the global extremum point, and falling into the local optimum is easy. Table 1 presents the 11 test functions, where D is the space dimension, S is the search range, and CF is the theoretically optimal value.

3.1. Parameter Influence Analysis of the Local-Global Information Sharing Term. This study proposes the addition of a local-global information sharing term, which involves the parameter C3. Therefore, the following exploration is conducted on the manner in which C3 is selected by using the 11 test functions:
(1) When C3 takes a constant value, the constant 2 is selected.
(2) The linear variation formula of C3 is as follows:
C3 = k × [C3_start − (C3_start − C3_end) × (t/tmax)],  (4)

where k is the control factor: when k = 1, C3 is a linearly decreasing function; when k = −1, C3 is a linearly increasing function. C3_start and C3_end are the initial and termination values of C3, respectively, t is the iteration number, and tmax is the maximum number of iterations.
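Read literally, Equation (4) gives the linear schedule compared in Tables 2 and 3; the transcription below is a sketch, with C3_start = 2 and C3_end = 0 as in Table 6.

```python
def c3_linear(t, t_max, k=1, c3_start=2.0, c3_end=0.0):
    """Linear C3 schedule of Equation (4).

    k = 1: C3 decreases linearly from c3_start to c3_end;
    k = -1: the value increases linearly (from -c3_start to -c3_end).
    """
    return k * (c3_start - (c3_start - c3_end) * (t / t_max))
```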
Tables 2 and 3 and Figure 2 show the three cases in which C3 is a constant, linearly decreasing, and linearly increasing. When the parameter C3 of the local-global information sharing term is a linearly decreasing function, the average fitness value on the test functions is optimal, and the convergence speed and the capability to jump out of local extrema are higher than in the other two cases. When C3 takes a constant, the algorithm cannot balance the global and local search, resulting in a "premature" phenomenon. When C3 adopts the linearly decreasing form, the entire area can be searched quickly at an early stage, and close attention is paid to the local search in the latter part of the iteration to enhance the deep search ability of the algorithm. When C3 adopts a linearly increasing form, it focuses on the global-local information exchange in the latter stage of the iteration; although this condition can increase the deep search ability of the algorithm, it causes the convergence speed to stagnate. Therefore, compared with the linearly increasing form, the linearly decreasing form shows a simulation curve that converges faster and with higher precision.
Therefore, the selection rules of the parameter C3 of local-global information sharing as a decreasing function are investigated in this study. The nonlinear variation formula of C3 is as follows:
C3 = (C3_start − C3_end) · tan(0.875 · (1 − t / tmax)^k) + C3_end    (5)

where C3_start and C3_end are the initial and termination values of the acceleration term C3, respectively, and k is the control factor. When k = 0.2, C3 is a convex decreasing function; when k = 2, C3 is a concave decreasing function. t is the iteration number, and tmax is the maximum number of iterations.
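Equation (5) can be evaluated directly; a small sketch, assuming the tangent argument is in radians (the paper labels k = 0.2 convex decreasing and k = 2 concave decreasing):

```python
import math

def nonlinear_c3(t, t_max, c3_start=2.0, c3_end=0.0, k=0.2):
    """Nonlinear decreasing C3 schedule of Equation (5)."""
    return (c3_start - c3_end) * math.tan(0.875 * (1 - t / t_max) ** k) + c3_end

# The schedule always reaches c3_end at the final iteration, since
# tan(0) = 0; at t = 0 it starts near 2 * tan(0.875) for the defaults.
print(nonlinear_c3(200, 200))            # 0.0
print(round(nonlinear_c3(0, 200), 2))    # ~2.39, i.e. 2 * tan(0.875)
```

With k = 0.2 the value stays high for most of the run and drops late; with k = 2 it drops quickly at the start, matching the search-behavior discussion that follows.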
4 Computational Intelligence and Neuroscience
Table 4 shows that when C3 is a convex function, the precision and robustness of the algorithm achieve satisfactory results on f1–f5. Table 5 shows that when C3 is a convex function, the algorithm obtains a satisfactory solution and shows a fast convergence rate on f6, f8, f9, f10, and f11. In the unimodal test functions, the IEPSO algorithm does
Table 1: The 11 test functions.

No. | Test function | S | CF
f1 | Sphere: f1(x) = Σ_{i=1}^{D} x_i² [10] | [−100, 100]^D | 0
f2 | Schaffer: f2(x, y) = 0.5 + (sin²(√(x² + y²)) − 0.5) / (1 + 0.001(x² + y²))² [33] | [−100, 100]^D | 0
f3 | Step: f3(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)² [10] | [−100, 100]^D | 0
f4 | SumSquares: f4(x) = Σ_{i=1}^{D} i·x_i² [10] | [−10, 10]^D | 0
f5 | Zakharov: f5(x) = Σ_{i=1}^{D} x_i² + (Σ_{i=1}^{D} 0.5·i·x_i)² + (Σ_{i=1}^{D} 0.5·i·x_i)⁴ [10] | [−100, 100]^D | 0
f6 | Griewank: f6(x) = (1/4000) Σ_{i=1}^{D} x_i² − Π_{i=1}^{D} cos(x_i/√i) + 1 [10] | [−600, 600]^D | 0
f7 | Rastrigin: f7(x) = Σ_{i=1}^{D} [x_i² − 10 cos(2πx_i) + 10] [10] | [−5.12, 5.12]^D | 0
f8 | Alpine: f8(x) = Σ_{i=1}^{D} (|x_i sin x_i| + 0.1x_i) [6] | [−10, 10]^D | 0
f9 | Shubert: min f9(x, y) = {Σ_{i=1}^{5} i cos[(i + 1)x + i]} × {Σ_{i=1}^{5} i cos[(i + 1)y + i]} | [−10, 10]^D | −186.731
f10 | Ackley: f10(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} x_i²)) − exp((1/D) Σ_{i=1}^{D} cos 2πx_i) + 20 + e [10] | [−32, 32]^D | 0
f11 | Cmfun: f11(x, y) = x sin(√|x|) + y sin(√|y|) | [−500, 500] | −837.966
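A few of the benchmarks in Table 1 can be written directly from their standard definitions (the standard forms are assumed where the extracted formulas are garbled); a sketch for Sphere, Rastrigin, and Ackley:

```python
import numpy as np

def sphere(x):        # f1: sum of squares, CF = 0 at x = 0
    return float(np.sum(x ** 2))

def rastrigin(x):     # f7: many local minima, CF = 0 at x = 0
    return float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10))

def ackley(x):        # f10: nearly flat surface with a deep hole at 0
    d = x.size
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20 + np.e)

z = np.zeros(10)
print(sphere(z), rastrigin(z), abs(ackley(z)) < 1e-9)  # all ≈ 0 at the optimum
```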
Table 2: Unimodal test functions.

Function | Criteria | C3 = 2 | C3 = 2~0, k = −1 | C3 = 2~0, k = 1
f1 | Mean | 7.22E+02 | 1.07E−06 | 4.50E−20
f1 | SD | 3.97E+04 | 1.11E−12 | 3.75E−16
f1 | Best | 4.05E+02 | 2.41E−08 | 1.55E−25
f2 | Mean | 2.50E−06 | 2.22E−17 | 0
f2 | SD | 2.32E−12 | 2.59E−33 | 0
f2 | Best | 2.85E−07 | 0 | 0
f3 | Mean | 1.99E+02 | 8.03E−07 | 1.82E−20
f3 | SD | 2.15E+04 | 1.04E−12 | 1.05E−39
f3 | Best | 35.81 | 3.95E−08 | 3.22E−24
f4 | Mean | 71.10 | 2.45E−08 | 8.20E−20
f4 | SD | 9.57 | 7.95E−16 | 5.11E−38
f4 | Best | 1.47 | 4.08E−09 | 8.43E−26
f5 | Mean | 1.74E+03 | 3.86E−04 | 5.56E−11
f5 | SD | 2.44E+05 | 1.49E−07 | 4.88E−14
f5 | Best | 8.29E+02 | 9.78E−06 | 3.54E−11
Table 3: Multimodal test functions.

Function | Criteria | C3 = 2 | C3 = 2~0, k = −1 | C3 = 2~0, k = 1
f6 | Mean | 1.10 | 8.18E−02 | 4.92E−02
f6 | SD | 4.6E−03 | 8.37E−04 | 5.96E−04
f6 | Best | 0.96 | 4.33E−02 | 1.23E−02
f7 | Mean | 35.03 | 4.10 | 1.9E−04
f7 | SD | 8.44 | 254.61 | 56.49
f7 | Best | 29.67 | 20.57 | 2.25E−05
f8 | Mean | 2.93 | 1.33E−03 | 5.28E−10
f8 | SD | 0.30 | 3.10E−08 | 2.23E−12
f8 | Best | 2.02 | 1.34E−05 | 5.83E−13
f9 | Mean | −186.7295 | −186.7309 | −186.7309
f9 | SD | 1.20E−06 | 0 | 0
f9 | Best | −186.7307 | −186.7309 | −186.7309
f10 | Mean | 7.649 | 2.35E−04 | 1.84E−11
f10 | SD | 0.415 | 4.39E−09 | 2.27E−22
f10 | Best | 6.513 | 5.73E−05 | 2.50E−12
f11 | Mean | −837.9658 | −837.9658 | −837.9658
f11 | SD | 4.50E−09 | 0 | 4.40E−09
f11 | Best | −837.9658 | −837.9658 | −837.9658
[Figure 2: The 11 test functions, showing the convergence of the fitness value with the number of iterations for C3 constant, k = −1, and k = 1: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.]
Table 4: Unimodal test functions.

Function | Criteria | C3 = 2~0, k = 0.2 | C3 = 2~0, k = 2 | C3 = 2~0, k = 1
f1 | Mean | 2.66E−20 | 5.51E−10 | 4.50E−20
f1 | SD | 2.65E−39 | 2.87E−19 | 3.75E−16
f1 | Best | 9.12E−24 | 1.38E−11 | 1.55E−25
f2 | Mean | 0 | 0 | 0
f2 | SD | 0 | 0 | 0
f2 | Best | 0 | 0 | 0
f3 | Mean | 6.21E−19 | 6.04E−10 | 1.82E−20
f3 | SD | 2.63E−36 | 7.79E−19 | 1.05E−39
f3 | Best | 1.81E−27 | 3.08E−11 | 3.22E−24
f4 | Mean | 1.70E−21 | 2.42E−11 | 8.20E−20
f4 | SD | 1.31E−41 | 4.40E−22 | 5.11E−38
f4 | Best | 2.82E−29 | 4.36E−12 | 8.43E−26
f5 | Mean | 1.65E−10 | 2.83E−11 | 5.56E−11
f5 | SD | 3.30E−20 | 3.59E−11 | 4.88E−14
f5 | Best | 2.17E−11 | 1.00E−11 | 3.54E−11
not show its advantages because of its strong deep search capability. In the complex multimodal test functions, when the convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the downward speed increases in the later stage, thus benefiting the local search. When the concave function is used for C3, the descent speed is fast in the early stage; although the search speed is improved, the coverage area of the search is reduced, thereby leading to the convergence of the algorithm to a nonoptimal value. From the simulation diagrams (f)–(k), the convergence speed is observed to be slightly slow when C3 is a convex function, but its ability to jump out of local extrema and the accuracy of the global search are higher than those in the other two cases. When C3 is a concave function, the convergence speed is faster than in the other two cases, and the search accuracy is lower than when C3 is a convex function.
3.2. Comparison of Test Results. The 11 test functions in Figure 1 are used to compare the IEPSO algorithm with the classical PSO, SPSO, the differential evolution algorithm (DE), and the GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms with population iterations. The evaluation criteria of algorithm performance include the speed of convergence and the size of the individual population search coverage. The differential evolution algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the required convergence accuracy. The best average fitness value among the five algorithms is shown in boldface. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8, and Figures 5 and 6 plot the convergence curves of the 11 test functions.
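The ten-run protocol above is simple bookkeeping around any optimizer; a minimal Python sketch, with a random-search stand-in for the optimizer (the paper's algorithms are not reproduced here):

```python
import numpy as np

def random_search(f, dim, bounds, iters=1000, rng=None):
    """Stand-in optimizer used only to illustrate the protocol."""
    rng = rng or np.random.default_rng()
    lo, hi = bounds
    samples = rng.uniform(lo, hi, size=(iters, dim))
    return min(f(x) for x in samples)

def benchmark(f, dim, bounds, runs=10, seed=0):
    """Run `runs` independent trials and report the Mean, SD, and Best
    criteria used in Tables 7 and 8."""
    results = [random_search(f, dim, bounds, rng=np.random.default_rng(seed + r))
               for r in range(runs)]
    return {"Mean": float(np.mean(results)),
            "SD": float(np.std(results)),
            "Best": float(np.min(results))}

sphere = lambda x: float(np.sum(x ** 2))
stats = benchmark(sphere, dim=10, bounds=(-100, 100))
print(sorted(stats))  # ['Best', 'Mean', 'SD']
```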
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4, and the IEPSO algorithm obtains the theoretical optimal value on f2; DE can search the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms due to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the latter stage because of population differences. The
[Figure 3: The change curve of C3 with the number of iterations for k = 2, k = 0.2, and k = 1.]
Table 5: Multimodal test functions.

Function | Criteria | C3 = 2~0, k = 0.2 | C3 = 2~0, k = 2 | C3 = 2~0, k = 1
f6 | Mean | 4.19E−02 | 4.79E−02 | 4.92E−02
f6 | SD | 3.43E−04 | 7.07E−04 | 5.96E−04
f6 | Best | 1.25E−02 | 5.7E−03 | 1.23E−02
f7 | Mean | 4.46E−03 | 5.00E−05 | 1.9E−04
f7 | SD | 1.73E−04 | 3.03E−06 | 56.49
f7 | Best | 2.31E−12 | 3.89E−11 | 2.25E−05
f8 | Mean | 2.42E−10 | 3.74E−10 | 5.28E−10
f8 | SD | 6.74E−20 | 2.47E−12 | 2.23E−12
f8 | Best | 3.71E−16 | 4.36E−11 | 5.83E−13
f9 | Mean | −186.7309 | −186.7309 | −186.7309
f9 | SD | 0 | 0 | 0
f9 | Best | −186.7309 | −186.7309 | −186.7309
f10 | Mean | 1.13E−11 | 2.05E−10 | 1.84E−11
f10 | SD | 2.21E−22 | 4.37E−12 | 2.27E−22
f10 | Best | 5.06E−14 | 1.75E−10 | 2.50E−12
f11 | Mean | −837.9658 | −837.9658 | −837.9658
f11 | SD | 0 | 0 | 4.40E−09
f11 | Best | −837.9658 | −837.9658 | −837.9658
[Figure 4: The 11 test functions, showing the convergence of the fitness value with the number of iterations for k = 0.2, k = 2, and k = 1: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.]
simulation diagrams (a)–(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although the GA and the IEPSO algorithm can both obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles in the population are stochastic when the local optimal solution converges gradually. The IEPSO algorithm can jump out of local extrema points in the face of complex multimodal test functions, and the number of iterations required is correspondingly reduced.
Table 6: Parameter settings.

Algorithm | Population | Maximum iterations | Dimension | Others
PSO | 40 | 1000 | 10 | C1 = C2 = 2, R1 = R2 = 0.5
SPSO | 40 | 1000 | 10 | ω = 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE | 40 | 1000 | 10 | —
GA | 40 | 1000 | 10 | GGAP = 0.5, PRECI = 25
IEPSO | 40 | 1000 | 10 | ω = 0.9–0.4, C1 = C2 = 2, C3 = 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Function | Criteria | PSO | SPSO | DE | IEPSO | GA
f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E−12 | 8.92E−22 | 116.96
f1 | SD | 2.53E+05 | 1.21E+06 | 2.25E−23 | 2.65E−39 | 441.92
f1 | Best | 1.14E+03 | 1.20E+03 | 2.42E−12 | 7.72E−27 | 46.60
f2 | Mean | 2.96E−02 | 8.80E−02 | 8.37E−06 | 0 | 1.79E−11
f2 | SD | 8.36E−04 | 8.96E−04 | 1.58E−10 | 0 | 0
f2 | Best | 4.55E−03 | 8.428734 | 7.55E−10 | 0 | 1.79E−11
f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E−11 | 6.21E−19 | 74.30
f3 | SD | 2.93E+05 | 1.82E+06 | 9.95E−23 | 2.63E−36 | 58.33
f3 | Best | 1.06E+03 | 2.82E−02 | 2.10E−12 | 1.81E−27 | 45.42
f4 | Mean | 82.38 | 82.10 | 3.36E−13 | 1.70E−21 | 30.31
f4 | SD | 6.86E+02 | 1.40E+03 | 9.95E−26 | 1.31E−41 | 0.835
f4 | Best | 1.15E+02 | 37.39 | 1.15E−13 | 2.82E−29 | 19.68
f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E−12 | 1.65E−10 | 3.62E+03
f5 | SD | 2.06E+07 | 2.15E+07 | 1.81E−23 | 3.30E−20 | 3.44E+05
f5 | Best | 1.04E+04 | 1.30E+02 | 2.67E−12 | 2.17E−11 | 2.53E+03
Table 8: Multimodal test functions.

Function | Criteria | PSO | SPSO | DE | IEPSO | GA
f6 | Mean | 1.548 | 1.752 | 9.44E−02 | 4.19E−02 | 1.006
f6 | SD | 0.026 | 0.093 | 4.87E−04 | 3.43E−04 | 0.018
f6 | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794
f7 | Mean | 57.737 | 43.405 | 11.945 | 4.46E−03 | 89.39
f7 | SD | 117.768 | 65.178 | 16.502 | 1.73E−04 | 36.08
f7 | Best | 35.981 | 3.17E+01 | 6.398 | 2.31E−12 | 50.40
f8 | Mean | 4.996 | 4.665 | 3.79E−02 | 2.42E−10 | 0.423
f8 | SD | 1.91E+00 | 1.056 | 5.4E−03 | 6.74E−20 | 0.051
f8 | Best | 2.933 | 3.151 | 4.6E−03 | 3.71E−16 | 0.086
f9 | Mean | −186.448 | −186.048 | −186.728 | −186.731 | −186.731
f9 | SD | 1.19E−01 | 9.83E−01 | 2.29E−08 | 0 | 9.99E−12
f9 | Best | −1.87E+02 | −186.731 | −186.7309 | −186.7309 | −186.731
f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E−11 | 2.515
f10 | SD | 14.260 | 2.163 | 0 | 2.21E−22 | 0.166
f10 | Best | 2.861 | 12.719 | 1.613 | 5.06E−14 | 1.796
f11 | Mean | −740.326 | −715.438 | −837.966 | −837.966 | −837.966
f11 | SD | 8.74E+03 | 7.23E+03 | 0 | 0 | 0
f11 | Best | −837.966 | −837.697 | −837.966 | −837.966 | −837.966
[Figure 5: Unimodal functions, showing the convergence of the fitness value with the number of iterations for DE, GA, PSO, SPSO, and IEPSO: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function.]
[Figure 6: Multimodal functions, showing the convergence of the fitness value with the number of iterations for DE, GA, PSO, SPSO, and IEPSO: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 Alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.]
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems using conventional optimization algorithms is difficult. In this study, an improved PSO, that is, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.
(2) The standard test functions are used to simulate the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 is linearly decreasing. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity; the PSO is thereby prevented from being trapped at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term in the IEPSO, the proposed algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into the local optimum. The IEPSO shows an ideal global optimization performance and indicates a high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
Table 9: Test results of the three improved particle swarm algorithms.

Function | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26]
f1 | Mean | 8.92E−22 | 4.73E−10 | 42.40
f1 | SD | 2.65E−39 | 1.81E−09 | 52.11
f3 | Mean | 6.21E−19 | 2.37E+03 | 7.61
f3 | SD | 2.63E−36 | 5.71E+02 | 0.07
f6 | Mean | 4.19E−02 | 8.66E−05 | —
f6 | SD | 3.43E−04 | 2.96E−04 | —
f7 | Mean | 4.46E−03 | 9.15E+01 | 76.18
f7 | SD | 1.73E−04 | 1.80E+01 | 26.75
f8 | Mean | 2.42E−10 | 1.31E+02 | —
f8 | SD | 6.74E−20 | 5.82E+01 | —
f10 | Mean | 1.13E−11 | 1.01E+00 | 1.72
f10 | SD | 2.21E−22 | 2.71E−01 | 0

[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
Table 4 shows that when C3 is a convex function theprecision and robustness of the algorithm can obtain sat-isfactory results on f1ndashf5 Table 5 shows that when C3 is
a convex function the algorithm obtains a satisfactory so-lution and shows a fast convergence rate on f6 f8 f9 f10 andf11 In the unimodal test function the IEPSO algorithm does
Table 1 11 test functions
No Test function S CFf1 Sphere f1(x) 1113936
Di1x
2i [10] [minus100100]D 0
f2 Schaffer f(x y) 05 + (sin2x2 + y2
1113968minus 05)(1 + 0001(x2 + y2))2 [33] [minus100100]D 0
f3 Step f3(x) 1113936Di1[xi + 05]2 [10] [minus100100]D 0
f4 SumSquares f4(x) 1113936Di1ix
2i [10] [minus1010]D 0
f5 Zakharov f5(x) 1113936Di1x
2i + (1113936
Di105 times ix2
i )2 + (1113936Di105 times ixi)
4 [10] [minus100100]D 0
f6 Griewank f6(x) 140001113936Di1x
2i minus1113937
Di1cos(xi
i
radic+ 1) [10] [minus600600]D 0
f7 Rastrigin f7(x) 1113936Di1[x2
i minus 10 cos(2πxi + 10)] [10] [minus512512]D 0f8 Alpine f8(x) 1113936
Di1(|xi sinxi| + 01xi) [6] [minus1010]D 0
f9 Shubert minf9(x y) 11139365i i cos[(i + 1)x + i]1113966 1113967 times 1113936
5i i cos[(i + 1)y + i]1113966 1113967 [minus1010]D minus186731
f10 Ackley f10(x) minus20 exp(minus02
1D1113936Di1x
2i
1113969
)minus exp(1D1113936Di1cos 2πxi) + 20 + e [10] [minus3232]D 0
f11 Cmfun f11(x y) x sin(|x|
radic) + y sin(
|y|
1113968) [minus500500] minus837966
Table 2: Unimodal test functions.

Function  Criteria  C3 = 2     C3 = 2~0, k = -1   C3 = 2~0, k = 1
f1        Mean      7.22E+02   1.07E-06           4.50E-20
          SD        3.97E+04   1.11E-12           3.75E-16
          Best      4.05E+02   2.41E-08           1.55E-25
f2        Mean      2.50E-06   2.22E-17           0
          SD        2.32E-12   2.59E-33           0
          Best      2.85E-07   0                  0
f3        Mean      1.99E+02   8.03E-07           1.82E-20
          SD        2.15E+04   1.04E-12           1.05E-39
          Best      35.81      3.95E-08           3.22E-24
f4        Mean      71.10      2.45E-08           8.20E-20
          SD        9.57       7.95E-16           5.11E-38
          Best      1.47       4.08E-09           8.43E-26
f5        Mean      1.74E+03   3.86E-04           5.56E-11
          SD        2.44E+05   1.49E-07           4.88E-14
          Best      8.29E+02   9.78E-06           3.54E-11
Table 3: Multimodal test functions.

Function  Criteria  C3 = 2      C3 = 2~0, k = -1   C3 = 2~0, k = 1
f6        Mean      1.10        8.18E-02           4.92E-02
          SD        4.6E-03     8.37E-04           5.96E-04
          Best      0.96        4.33E-02           1.23E-02
f7        Mean      35.03       4.10               1.90E-04
          SD        8.44        254.61             56.49
          Best      29.67       20.57              2.25E-05
f8        Mean      2.93        1.33E-03           5.28E-10
          SD        0.30        3.10E-08           2.23E-12
          Best      2.02        1.34E-05           5.83E-13
f9        Mean      -186.7295   -186.7309          -186.7309
          SD        1.20E-06    0                  0
          Best      -186.7307   -186.7309          -186.7309
f10       Mean      7.649       2.35E-04           1.84E-11
          SD        0.415       4.39E-09           2.27E-22
          Best      6.513       5.73E-05           2.50E-12
f11       Mean      -837.9658   -837.9658          -837.9658
          SD        4.50E-09    0                  4.40E-09
          Best      -837.9658   -837.9658          -837.9658
Computational Intelligence and Neuroscience 5
[Figure 2, panels (a)-(k): convergence curves of the fitness value over 200 iterations for each test function under the three C3 settings (constant, k = -1, and k = 1), each shown beside a 3-D surface plot of the function; plots omitted.]
Figure 2: 11 test functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
Table 4: Unimodal test functions.

Function  Criteria  C3 = 2~0, k = 0.2   C3 = 2~0, k = 2   C3 = 2~0, k = 1
f1        Mean      2.66E-20            5.51E-10          4.50E-20
          SD        2.65E-39            2.87E-19          3.75E-16
          Best      9.12E-24            1.38E-11          1.55E-25
f2        Mean      0                   0                 0
          SD        0                   0                 0
          Best      0                   0                 0
f3        Mean      6.21E-19            6.04E-10          1.82E-20
          SD        2.63E-36            7.79E-19          1.05E-39
          Best      1.81E-27            3.08E-11          3.22E-24
f4        Mean      1.70E-21            2.42E-11          8.20E-20
          SD        1.31E-41            4.40E-22          5.11E-38
          Best      2.82E-29            4.36E-12          8.43E-26
f5        Mean      1.65E-10            2.83E-11          5.56E-11
          SD        3.30E-20            3.59E-11          4.88E-14
          Best      2.17E-11            1.00E-11          3.54E-11
not show its advantages because of its already strong deep search capability. On the complex multimodal test functions, when a convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the descent speed increases in the later stage, thus benefiting the local search. When a concave function is used for C3, the descent speed is fast in the early stage. Although the search speed is improved, the coverage area of the search is reduced, thereby causing the algorithm to converge to a nonoptimal value. Simulation diagrams (f)-(k) show that the convergence speed is slightly slow when C3 is a convex function, but the ability to jump out of local extrema and the accuracy of the global search are higher than in the other two cases. When C3 is a concave function, the convergence speed is faster than in the other two cases, but the search accuracy is lower than when C3 is a convex function.
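The three decay behaviors compared above can be sketched with a simple power-law schedule. This exact form is an assumption for illustration (the paper defines C3 earlier), but it reproduces the linear (k = 1), slow-then-fast "convex" (k < 1), and fast-then-slow "concave" (k > 1) descents discussed here:

```python
def c3_schedule(t, t_max, c3_init=2.0, k=1.0):
    # Assumed schedule: C3 decays from c3_init to 0 over t_max iterations.
    # k = 1: linear decrease; k < 1: slow early descent that steepens near
    # the end (convex-style, favoring early global search); k > 1: steep
    # early descent that flattens (concave-style, narrowing coverage early).
    return c3_init * (1.0 - t / t_max) ** k
```

At mid-run, the k = 0.2 schedule still holds C3 near its initial value, while the k = 2 schedule has already cut it to a quarter, matching the early-stage behaviors described above.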
3.2. Comparison of Test Results. The 11 test functions in Figure 1 are used to compare the IEPSO algorithm with the classical PSO, the SPSO, differential evolution (DE), and the GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms based on population iteration. The evaluation criteria of algorithm performance include the speed of convergence and the extent of the individual population search coverage. The differential evolution algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Building on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).
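The SPSO mechanism described here, with an inertia weight decreasing linearly from 0.9 to 0.4 as listed in Table 6, amounts to the standard velocity update; a minimal sketch (helper names are ours):

```python
import random

def inertia(t, t_max, w_max=0.9, w_min=0.4):
    # Linearly decreasing inertia weight (Table 6: omega from 0.9 to 0.4).
    # A large omega early favors global exploration; a small omega later
    # favors local refinement.
    return w_max - (w_max - w_min) * t / t_max

def spso_velocity(v, x, pbest, gbest, t, t_max, c1=2.0, c2=2.0):
    # Standard PSO velocity update with the time-varying inertia weight.
    w = inertia(t, t_max)
    return [w * vi
            + c1 * random.random() * (pi - xi)
            + c2 * random.random() * (gi - xi)
            for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
```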
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value among the five algorithms is shown in bold. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
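The reporting protocol above (10 independent runs; mean, standard deviation, and best value recorded) is easy to reproduce; a small helper, with names of our own choosing:

```python
import statistics

def summarize_runs(optimizer, n_runs=10):
    # Call a stochastic optimizer n_runs times independently and report
    # the criteria used in Tables 7 and 8: mean, standard deviation (SD),
    # and best (minimum) final fitness.
    results = [optimizer() for _ in range(n_runs)]
    return {"Mean": statistics.mean(results),
            "SD": statistics.stdev(results),
            "Best": min(results)}
```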
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4, and that the IEPSO algorithm obtains the theoretical optimal value on f2; DE can find the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms because of the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the later stage because of population differences. The
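The DE mechanisms named here (mutation, crossover, selection) follow the classic DE/rand/1/bin scheme; a minimal sketch, where the control parameters F and CR are conventional defaults rather than values from the paper (Table 6 leaves them unspecified):

```python
import random

def de_step(pop, fitness, f=0.5, cr=0.9):
    # One generation of DE/rand/1/bin: for each target vector, build a
    # mutant from three distinct random members (mutation), binomially
    # cross it with the target (crossover), and keep the better of trial
    # and target (selection).
    new_pop = []
    for i, target in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [ai + f * (bi - ci) for ai, bi, ci in zip(a, b, c)]
        j_rand = random.randrange(len(target))  # guarantee one mutant gene
        trial = [m if (random.random() < cr or j == j_rand) else t
                 for j, (t, m) in enumerate(zip(target, mutant))]
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop
```

Because selection is greedy per individual, the population's best fitness can never get worse from one generation to the next.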
[Figure 3 plot omitted.]
Figure 3: The change curve of C3 with the number of iterations for k = 2, k = 0.2, and k = 1.
Table 5: Multimodal test functions.

Function  Criteria  C3 = 2~0, k = 0.2   C3 = 2~0, k = 2   C3 = 2~0, k = 1
f6        Mean      4.19E-02            4.79E-02          4.92E-02
          SD        3.43E-04            7.07E-04          5.96E-04
          Best      1.25E-02            5.7E-03           1.23E-02
f7        Mean      4.46E-03            5.00E-05          1.90E-04
          SD        1.73E-04            3.03E-06          56.49
          Best      2.31E-12            3.89E-11          2.25E-05
f8        Mean      2.42E-10            3.74E-10          5.28E-10
          SD        6.74E-20            2.47E-12          2.23E-12
          Best      3.71E-16            4.36E-11          5.83E-13
f9        Mean      -186.7309           -186.7309         -186.7309
          SD        0                   0                 0
          Best      -186.7309           -186.7309         -186.7309
f10       Mean      1.13E-11            2.05E-10          1.84E-11
          SD        2.21E-22            4.37E-12          2.27E-22
          Best      5.06E-14            1.75E-10          2.50E-12
f11       Mean      -837.9658           -837.9658         -837.9658
          SD        0                   0                 4.40E-09
          Best      -837.9658           -837.9658         -837.9658
[Figure 4, panels (a)-(k): convergence curves of the fitness value over 200 iterations for each test function with C3 decaying under k = 0.2, k = 2, and k = 1, each with an inset magnifying the early iterations; plots omitted.]
Figure 4: 11 test functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 Alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
simulation diagrams (a)-(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage falls below that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into a local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to a local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm can obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, population diversity is maintained because the supplementary particles added to the population are stochastic while the swarm gradually converges toward a local optimal solution. The IEPSO algorithm can thus jump out of local extremum points on complex multimodal test functions, and the number of iterations required is correspondingly reduced.
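The last-eliminated principle with stochastic supplementary particles described above can be sketched as follows; the eliminated fraction (20%) and the uniform re-sampling are our assumptions, since this excerpt does not fix them:

```python
import random

def last_eliminated(pop, fitness, bounds, frac=0.2):
    # Rank the swarm by fitness (minimization), eliminate the worst
    # `frac` of the particles, and refill the swarm with uniformly
    # random particles so that population diversity is maintained.
    ranked = sorted(pop, key=fitness)
    n_new = int(frac * len(pop))
    survivors = ranked[:len(pop) - n_new]
    lo, hi = bounds
    dim = len(pop[0])
    fresh = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_new)]
    return survivors + fresh
```

The best particle always survives the cull, so the elimination step cannot lose the current global best while it injects fresh, randomly placed particles.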
Table 6: Parameter settings.

Algorithm  Population  Maximum iterations  Dim of each object  Others
PSO        40          1000                10                  C1 = C2 = 2, R1 = R2 = 0.5
SPSO       40          1000                10                  ω = 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE         40          1000                10                  --
GA         40          1000                10                  GGAP = 0.5, PRECI = 25
IEPSO      40          1000                10                  ω = 0.9–0.4, C1 = C2 = 2, C3 = 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Function  Criteria  PSO        SPSO       DE         IEPSO      GA
f1        Mean      1.33E+03   3.08E+03   7.31E-12   8.92E-22   116.96
          SD        2.53E+05   1.21E+06   2.25E-23   2.65E-39   441.92
          Best      1.14E+03   1.20E+03   2.42E-12   7.72E-27   46.60
f2        Mean      2.96E-02   8.80E-02   8.37E-06   0          1.79E-11
          SD        8.36E-04   8.96E-04   1.58E-10   0          0
          Best      4.55E-03   8428734    7.55E-10   0          1.79E-11
f3        Mean      1.19E+03   2.51E+03   1.14E-11   6.21E-19   74.30
          SD        2.93E+05   1.82E+06   9.95E-23   2.63E-36   58.33
          Best      1.06E+03   2.82E-02   2.10E-12   1.81E-27   45.42
f4        Mean      82.38      82.10      3.36E-13   1.70E-21   30.31
          SD        6.86E+02   1.40E+03   9.95E-26   1.31E-41   0.835
          Best      1.15E+02   37.39      1.15E-13   2.82E-29   1.968
f5        Mean      1.26E+04   8.60E+03   7.02E-12   1.65E-10   3.62E+03
          SD        2.06E+07   2.15E+07   1.81E-23   3.30E-20   3.44E+05
          Best      1.04E+04   1.30E+02   2.67E-12   2.17E-11   2.53E+03
Table 8: Multimodal test functions.

Function  Criteria  PSO        SPSO       DE         IEPSO      GA
f6        Mean      1.548      1.752      9.44E-02   4.19E-02   1.006
          SD        0.026      0.093      4.87E-04   3.43E-04   0.018
          Best      1.236      1.417      0.06       0.013      0.794
f7        Mean      57.737     43.405     11.945     4.46E-03   89.39
          SD        117.768    65.178     16.502     1.73E-04   36.08
          Best      35.981     3.17E+01   6.398      2.31E-12   50.40
f8        Mean      4.996      4.665      3.79E-02   2.42E-10   0.423
          SD        1.91E+00   1.056      5.4E-03    6.74E-20   0.051
          Best      2.933      3.151      4.6E-03    3.71E-16   0.086
f9        Mean      -186.448   -186.048   -186.728   -186.731   -186.731
          SD        1.19E-01   9.83E-01   2.29E-08   0          9.99E-12
          Best      -1.87E+02  -186.731   -186.7309  -186.7309  -186.731
f10       Mean      13.134     15.560     1.613      1.13E-11   2.515
          SD        14.260     2.163      0          2.21E-22   0.166
          Best      2.861      12.719     1.613      5.06E-14   1.796
f11       Mean      -740.326   -715.438   -837.966   -837.966   -837.966
          SD        8.74E+03   7.23E+03   0          0          0
          Best      -837.966   -837.697   -837.966   -837.966   -837.966
[Figure 5, panels (a)-(e): convergence curves of the fitness value over 600 iterations for DE, GA, PSO, SPSO, and IEPSO; plots omitted.]
Figure 5: Unimodal functions: (a) f1 Sphere function; (b) f2 Schaffer function; (c) f3 Step function; (d) f4 SumSquares function; (e) f5 Zakharov function.
[Figure 6, panels (a)-(f): convergence curves of the fitness value over 600 iterations for DE, GA, PSO, SPSO, and IEPSO; plots omitted.]
Figure 6: Multimodal functions: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 Alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion

In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, namely, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:

(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.

(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.

(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity, and the tendency of the PSO to stagnate at a local optimal value is thereby avoided. A comparison of the IEPSO algorithm with classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.

In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term, the IEPSO effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows an ideal global optimization performance and a high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (Project no. 51705187), and the Postdoctoral Science Foundation of China (Grant no. 2017M621202).
Table 9: Test results of the three improved particle swarm algorithms.

Function  Criteria  IEPSO      DMSDL-PSO [25]   BHPSOWM [26]
f1        Mean      8.92E-22   4.73E-10         42.40
          SD        2.65E-39   1.81E-09         52.11
f3        Mean      6.21E-19   2.37E+03         7.61
          SD        2.63E-36   5.71E+02         0.07
f6        Mean      4.19E-02   8.66E-05         --
          SD        3.43E-04   2.96E-04         --
f7        Mean      4.46E-03   9.15E+01         76.18
          SD        1.73E-04   1.80E+01         26.75
f8        Mean      2.42E-10   1.31E+02         --
          SD        6.74E-20   5.82E+01         --
f10       Mean      1.13E-11   1.01E+00         1.72
          SD        2.21E-22   2.71E-01         0

References

[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
Computational Intelligence and Neuroscience 17
Computer Games Technology
International Journal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom
Journal ofEngineeringVolume 2018
Advances in
FuzzySystems
Hindawiwwwhindawicom
Volume 2018
International Journal of
ReconfigurableComputing
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Applied Computational Intelligence and Soft Computing
thinspAdvancesthinspinthinsp
thinspArtificial Intelligence
Hindawiwwwhindawicom Volumethinsp2018
Hindawiwwwhindawicom Volume 2018
Civil EngineeringAdvances in
Hindawiwwwhindawicom Volume 2018
Electrical and Computer Engineering
Journal of
Journal of
Computer Networks and Communications
Hindawiwwwhindawicom Volume 2018
Hindawi
wwwhindawicom Volume 2018
Advances in
Multimedia
International Journal of
Biomedical Imaging
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Engineering Mathematics
International Journal of
RoboticsJournal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Computational Intelligence and Neuroscience
Hindawiwwwhindawicom Volume 2018
Mathematical Problems in Engineering
Modelling ampSimulationin EngineeringHindawiwwwhindawicom Volume 2018
Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom
The Scientific World Journal
Volume 2018
Hindawiwwwhindawicom Volume 2018
Human-ComputerInteraction
Advances in
Hindawiwwwhindawicom Volume 2018
Scientic Programming
Submit your manuscripts atwwwhindawicom
0 50 100 150 200Iterations
0
500
1000
1500
2000
Fitn
ess v
alue
2
15
1
05
0100
0
ndash100 ndash100ndash50
050
100
k = ndash1k = 1Constant
times104
(a)
50 100 150 200Iterations
0
002
004
006
008Fi
tnes
s val
ue
08
06
04
02
0
ndash100
0
100
ndash100ndash50
500
100
k = ndash1k = 1Constant
(b)
50 100 150 200Iterations
0
500
1000
1500
2000
2500
3000
Fitn
ess v
alue
2times104
15
1
05
0100
0
ndash100 ndash100ndash50
0 50100
k = ndash1k = 1Constant
(c)
Figure 2 Continued
6 Computational Intelligence and Neuroscience
50 100 150 200Iterations
0
10
20
30
40
50
60
Fitn
ess v
alue
600
400
200
010
0
ndash10 ndash10 ndash5
05
10
k = ndash1k = 1Constant
(d)
50 100 150 200Iterations
0
5000
10000
15000
Fitn
ess v
alue
2times108
15
1
05
0100
0
ndash100 ndash100ndash50
050
100
k = ndash1k = 1Constant
(e)
50 100 150 200Iterations
0
05
1
15
2
Fitn
ess v
alue
2
15
1
05
0200
150100
50 50100
150200
k = ndash1k = 1Constant
(f )
Figure 2 Continued
Computational Intelligence and Neuroscience 7
50 100 150 200Iterations
0
10
20
30
40
50
60
Fitn
ess v
alue
80
60
40
20
05
0
ndash5 ndash50
5
k = ndash1k = 1Constant
(g)
50 100 150 200Iterations
0
1
2
3
4
5
6
Fitn
ess v
alue
15
10
5
010
0
ndash10 ndash10ndash5 0
105
k = ndash1k = 1Constant
(h)
0 50 100 150 200Iterations
ndash1865
ndash186
ndash1855
ndash185
ndash1845
ndash184
Fitn
ess v
alue
200
100
0
ndash100
200150
10050 50
100150 200
k = ndash1k = 1Constant
(i)
Figure 2 Continued
8 Computational Intelligence and Neuroscience
50 100 150 200Iterations
0
5
10
15
Fitn
ess v
alue
15
10
5
0
200
ndash20 ndash200
20
k = ndash1k = 1Constant
(j)
0 50 100 150 200Iterations
ndash840
ndash820
ndash800
ndash780
ndash760
ndash740
ndash720
Fitn
ess v
alue
1000
500
0
ndash500
500ndash1000
0
ndash500 ndash5000
500
k = ndash1k = 1Constant
(k)
Figure 2 11 test functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquares function (e) f5 Zakharovfunction (f ) f6 Griewank function (g) f7 Rastrigin function (h) f8 alpine function (i) f9 Shubert function (j) f10 Ackley function (k) f11Cmfun function
Table 4 Unimodal test functions
Functions Criteria C3 2sim0 k 02 C3 2sim0 k 2 C3 2sim0 k 1
f1Mean 266E minus 20 551E minus 10 450E minus 20SD 265E minus 39 287E minus 19 375E minus 16Best 912E minus 24 138E minus 11 155E minus 25
f2Mean 0 0 0SD 0 0 0Best 0 0 0
f3Mean 621E minus 19 604E minus 10 182E minus 20SD 263E minus 36 779E minus 19 105E minus 39Best 181E minus 27 308E minus 11 322E minus 24
f4Mean 170E minus 21 242E minus 11 820E minus 20SD 131E minus 41 440E minus 22 511E minus 38Best 282E minus 29 436E minus 12 843E minus 26
f5Mean 165E minus 10 283E minus 11 556E minus 11SD 330E minus 20 359E minus 11 488E minus 14Best 217E minus 11 100E minus 11 354E minus 11
Computational Intelligence and Neuroscience 9
not show its advantages because of its strong deep searchcapability In the complex multimodal test function whenthe convex function is used in C3 the downward trend isslow in the early stage thus beneting the global search andthe downward speed increases in the later stage thusbeneting the local search When the concave function isused for C3 the descent speed is fast in the early stageAlthough the search speed is improved the coverage area ofthe search is reduced thereby leading to the convergence ofthe algorithm to the nonoptimal value From the simulationdiagrams (f)ndash(k) the convergence speed is observed to beslightly slow when C3 is a convex function but its ability tojump out of the local extremum and the accuracy of theglobal search are higher than those in the other two casesWhen C3 is a concave function the convergence speed isfaster than those in the other two cases and the searchaccuracy is lower than that when C3 is a convex function
32 Comparison of Test Results e 11 test functions inFigure 1 are used to compare the IEPSO algorithm withclassical PSO SPSO dierential algorithm (DE) and GAe DE GA and PSO algorithms are all stochastic in-telligent optimization algorithms with population iterationse evaluation criteria of algorithm performance includespeed of convergence and size of individual populationsearch coverage e dierential optimization algorithm hasa low space complexity and obvious advantages in dealingwith large-scale and complex optimization problems eGA has good convergence when solving discrete multipeakand noise-containing optimization problems Based on thetraditional PSO algorithm the SPSO algorithm achieves thebalance between global search and local search by adjustingthe inertial weight (Figures 3 and 4)
e experimental parameters of the ve algorithmsare set as shown in Table 6 Each test function is runindependently 10 times and the average is recorded toreduce the data error e iteration is stopped when the
convergence condition meets the convergence accuracy ebest average tness value of the ve algorithms is blackenede standard deviation average tness and optimal value ofeach algorithm are shown in Tables 7 and 8 Figures 5 and 6plot the convergence curves of the 11 test functions
Table 7 shows that the IEPSO has the best performanceon f1 f2 f3 and f4 e IEPSO algorithm obtains the the-oretical optimal value on f2 DE can search the global so-lution on f5 e deep search capability of the IEPSOalgorithm is considerably higher than that of the PSO andSPSO algorithms due to the increased global-local in-formation sharing term and the last-eliminated principlee crossover mutation and selection mechanisms makethe DE algorithm perform well in the early stage of the globalsearch However the diversity of the population declines inthe latter stage because of population dierences e
Figure 3: The change curve of C3 with the number of iterations (k = 0.2, 1, 2).
Table 5: Multimodal test functions.

Functions | Criteria | C3: 2~0, k = 0.2 | C3: 2~0, k = 2 | C3: 2~0, k = 1
f6  | Mean | 4.19E-02 | 4.79E-02 | 4.92E-02
    | SD   | 3.43E-04 | 7.07E-04 | 5.96E-04
    | Best | 1.25E-02 | 5.7E-03  | 1.23E-02
f7  | Mean | 4.46E-03 | 5.00E-05 | 1.9E-04
    | SD   | 1.73E-04 | 3.03E-06 | 5649
    | Best | 2.31E-12 | 3.89E-11 | 2.25E-05
f8  | Mean | 2.42E-10 | 3.74E-10 | 5.28E-10
    | SD   | 6.74E-20 | 2.47E-12 | 2.23E-12
    | Best | 3.71E-16 | 4.36E-11 | 5.83E-13
f9  | Mean | -186.7309 | -186.7309 | -186.7309
    | SD   | 0 | 0 | 0
    | Best | -186.7309 | -186.7309 | -186.7309
f10 | Mean | 1.13E-11 | 2.05E-10 | 1.84E-11
    | SD   | 2.21E-22 | 4.37E-12 | 2.27E-22
    | Best | 5.06E-14 | 1.75E-10 | 2.50E-12
f11 | Mean | -837.9658 | -837.9658 | -837.9658
    | SD   | 0 | 0 | 4.40E-09
    | Best | -837.9658 | -837.9658 | -837.9658
10 Computational Intelligence and Neuroscience
Figure 4: Convergence curves for the 11 test functions under the three C3 settings (k = 0.2, 1, 2): (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
The simulation diagrams (a)-(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although the GA and the IEPSO algorithm can both obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles in the population are stochastic when the local optimal solution converges gradually. The IEPSO algorithm can jump out of local extrema when faced with complex multimodal test functions, and the number of iterations required is correspondingly reduced.
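The last-eliminated principle referred to above (re-randomizing the worst particles so that stochastic supplementary particles keep the population diverse) can be sketched as follows. The replacement fraction and the trigger schedule are assumptions for illustration; the paper defines the exact mechanism in its earlier sections.

```python
import random

def last_eliminated(swarm, fitness, lower, upper, frac=0.2, rng=random):
    """Replace the worst-performing fraction of the swarm with fresh
    random particles (sketch of the last-eliminated principle).

    Assumes minimization: the particles with the highest fitness values
    are eliminated and re-initialized uniformly in [lower, upper].
    """
    n_replace = max(1, int(frac * len(swarm)))
    # Indices sorted from worst (largest fitness) to best.
    order = sorted(range(len(swarm)), key=lambda i: fitness[i], reverse=True)
    for i in order[:n_replace]:
        swarm[i] = [rng.uniform(lower, upper) for _ in range(len(swarm[i]))]
    return swarm
```

Calling this periodically during the run discards stagnant particles while leaving the current best solutions untouched.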
Table 6: Parameter settings.

Algorithm | Population | Maximum iterations | Dim of each object | Others
PSO   | 40 | 1000 | 10 | C1 = C2 = 2; R1 = R2 = 0.5
SPSO  | 40 | 1000 | 10 | ω: 0.9-0.4; C1 = C2 = 2; R1 = R2 = 0.5
DE    | 40 | 1000 | 10 | —
GA    | 40 | 1000 | 10 | GGAP = 0.5; PRECI = 25
IEPSO | 40 | 1000 | 10 | ω: 0.9-0.4; C1 = C2 = 2; C3: 2-0; R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E-12 | 8.92E-22 | 116.96
   | SD   | 2.53E+05 | 1.21E+06 | 2.25E-23 | 2.65E-39 | 441.92
   | Best | 1.14E+03 | 1.20E+03 | 2.42E-12 | 7.72E-27 | 46.60
f2 | Mean | 2.96E-02 | 8.80E-02 | 8.37E-06 | 0 | 1.79E-11
   | SD   | 8.36E-04 | 8.96E-04 | 1.58E-10 | 0 | 0
   | Best | 4.55E-03 | 8428734 | 7.55E-10 | 0 | 1.79E-11
f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E-11 | 6.21E-19 | 74.30
   | SD   | 2.93E+05 | 1.82E+06 | 9.95E-23 | 2.63E-36 | 58.33
   | Best | 1.06E+03 | 2.82E-02 | 2.10E-12 | 1.81E-27 | 45.42
f4 | Mean | 82.38 | 82.10 | 3.36E-13 | 1.70E-21 | 30.31
   | SD   | 6.86E+02 | 1.40E+03 | 9.95E-26 | 1.31E-41 | 0.835
   | Best | 1.15E+02 | 37.39 | 1.15E-13 | 2.82E-29 | 19.68
f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E-12 | 1.65E-10 | 3.62E+03
   | SD   | 2.06E+07 | 2.15E+07 | 1.81E-23 | 3.30E-20 | 3.44E+05
   | Best | 1.04E+04 | 1.30E+02 | 2.67E-12 | 2.17E-11 | 2.53E+03
Table 8: Multimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f6  | Mean | 1.548 | 1.752 | 9.44E-02 | 4.19E-02 | 1.006
    | SD   | 0.026 | 0.093 | 4.87E-04 | 3.43E-04 | 0.018
    | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794
f7  | Mean | 57.737 | 43.405 | 11.945 | 4.46E-03 | 8.939
    | SD   | 117.768 | 65.178 | 16.502 | 1.73E-04 | 3.608
    | Best | 35.981 | 3.17E+01 | 6.398 | 2.31E-12 | 5.040
f8  | Mean | 4.996 | 4.665 | 3.79E-02 | 2.42E-10 | 0.423
    | SD   | 1.91E+00 | 1.056 | 5.4E-03 | 6.74E-20 | 0.051
    | Best | 2.933 | 3.151 | 4.6E-03 | 3.71E-16 | 0.086
f9  | Mean | -186.448 | -186.048 | -186.728 | -186.731 | -186.731
    | SD   | 1.19E-01 | 9.83E-01 | 2.29E-08 | 0 | 9.99E-12
    | Best | -1.87E+02 | -186.731 | -186.7309 | -186.7309 | -186.731
f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E-11 | 2.515
    | SD   | 14.260 | 2.163 | 0 | 2.21E-22 | 0.166
    | Best | 2.861 | 12.719 | 1.613 | 5.06E-14 | 1.796
f11 | Mean | -740.326 | -715.438 | -837.966 | -837.966 | -837.966
    | SD   | 8.74E+03 | 7.23E+03 | 0 | 0 | 0
    | Best | -837.966 | -837.697 | -837.966 | -837.966 | -837.966
Figure 5: Unimodal functions, convergence curves of DE, GA, PSO, SPSO, and IEPSO: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function.
Figure 6: Multimodal functions, convergence curves of DE, GA, PSO, SPSO, and IEPSO: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value in 5 of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion

In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems using conventional optimization algorithms is difficult. In this study, an improved PSO, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.
(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the PSO from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term, the proposed IEPSO algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
Table 9: Three improved particle swarm algorithm test results.

Functions | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26]
f1  | Mean | 8.92E-22 | 4.73E-10 | 42.40
    | SD   | 2.65E-39 | 1.81E-09 | 52.11
f3  | Mean | 6.21E-19 | 2.37E+03 | 7.61
    | SD   | 2.63E-36 | 5.71E+02 | 0.07
f6  | Mean | 4.19E-02 | 8.66E-05 | —
    | SD   | 3.43E-04 | 2.96E-04 | —
f7  | Mean | 4.46E-03 | 9.15E+01 | 76.18
    | SD   | 1.73E-04 | 1.80E+01 | 26.75
f8  | Mean | 2.42E-10 | 1.31E+02 | —
    | SD   | 6.74E-20 | 5.82E+01 | —
f10 | Mean | 1.13E-11 | 1.01E+00 | 1.72
    | SD   | 2.21E-22 | 2.71E-01 | 0

[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
not show its advantages because of its strong deep searchcapability In the complex multimodal test function whenthe convex function is used in C3 the downward trend isslow in the early stage thus beneting the global search andthe downward speed increases in the later stage thusbeneting the local search When the concave function isused for C3 the descent speed is fast in the early stageAlthough the search speed is improved the coverage area ofthe search is reduced thereby leading to the convergence ofthe algorithm to the nonoptimal value From the simulationdiagrams (f)ndash(k) the convergence speed is observed to beslightly slow when C3 is a convex function but its ability tojump out of the local extremum and the accuracy of theglobal search are higher than those in the other two casesWhen C3 is a concave function the convergence speed isfaster than those in the other two cases and the searchaccuracy is lower than that when C3 is a convex function
32 Comparison of Test Results e 11 test functions inFigure 1 are used to compare the IEPSO algorithm withclassical PSO SPSO dierential algorithm (DE) and GAe DE GA and PSO algorithms are all stochastic in-telligent optimization algorithms with population iterationse evaluation criteria of algorithm performance includespeed of convergence and size of individual populationsearch coverage e dierential optimization algorithm hasa low space complexity and obvious advantages in dealingwith large-scale and complex optimization problems eGA has good convergence when solving discrete multipeakand noise-containing optimization problems Based on thetraditional PSO algorithm the SPSO algorithm achieves thebalance between global search and local search by adjustingthe inertial weight (Figures 3 and 4)
e experimental parameters of the ve algorithmsare set as shown in Table 6 Each test function is runindependently 10 times and the average is recorded toreduce the data error e iteration is stopped when the
convergence condition meets the convergence accuracy ebest average tness value of the ve algorithms is blackenede standard deviation average tness and optimal value ofeach algorithm are shown in Tables 7 and 8 Figures 5 and 6plot the convergence curves of the 11 test functions
Table 7 shows that the IEPSO has the best performanceon f1 f2 f3 and f4 e IEPSO algorithm obtains the the-oretical optimal value on f2 DE can search the global so-lution on f5 e deep search capability of the IEPSOalgorithm is considerably higher than that of the PSO andSPSO algorithms due to the increased global-local in-formation sharing term and the last-eliminated principlee crossover mutation and selection mechanisms makethe DE algorithm perform well in the early stage of the globalsearch However the diversity of the population declines inthe latter stage because of population dierences e
0 2 4 6 8 10Iterations
0
02
04
06
08
1
C3
k = 2k = 02k = 1
Figure 3 e change curve of C3 with the number of iterations
Table 5 Multimodal test functions
Functions Criteria C3 2sim0 k 02 C3 2sim0 k 2 C3 2sim0 k 1
f6Mean 419E minus 02 479E minus 02 492E minus 02SD 343E minus 04 707E minus 04 596E minus 04Best 125E minus 02 57E minus 03 123E minus 02
f7Mean 446E minus 03 500E minus 05 19Ee minus 04SD 173E minus 04 303E minus 06 5649Best 231E minus 12 389E minus 11 225E minus 05
f8Mean 242E minus 10 374E minus 10 528E minus 10SD 674E minus 20 247E minus 12 223E minus 12Best 371E minus 16 436E minus 11 583E minus 13
f9Mean minus1867309 minus1867309 minus1867309SD 0 0 0Best minus1867309 minus1867309 minus1867309
f10Mean 113E minus 11 205E minus 10 184E minus 11SD 221E minus 22 437E minus 12 227E minus 22Best 506E minus 14 175E minus 10 250E minus 12
f11Mean minus8379658 minus8379658 minus8379658SD 0 0 440E minus 09Best minus8379658 minus8379658 minus8379658
10 Computational Intelligence and Neuroscience
0 50 100 150 200Iterations
0
500
1000
1500
2000
2500
3000Fi
tnes
s val
ue
2 4 6 8 10 120
1000
2000
k = 2k = 02k = 1
(a)
50 100 150 200Iterations
0
0005
001
0015
Fitn
ess v
alue
5 10 15 200
1
2
3times10ndash3
k = 02k = 2k = 1
(b)
50 100 150 200Iterations
0
500
1000
1500
2000
2500
3000
Fitn
ess v
alue
2 4 60
1000
2000
k = 1k = 2k = 02
(c)
50 100 150 200Iterations
0
10
20
30
40
50
60
Fitn
ess v
alue
2 4 6 8 100
20
40
k = 02k = 2k = 1
(d)
50 100 150 200Iterations
0
2000
4000
6000
8000
10000
12000
14000
Fitn
ess v
alue
5 10 15 200
2000
4000
6000
k = 02k = 2k = 1
(e)
50 100 150 200Iterations
0
05
1
15
2
Fitn
ess v
alue
5 10 15 20 25 300
05
1
15
k = 1k = 2k = 02
(f )
Figure 4 Continued
Computational Intelligence and Neuroscience 11
50 100 150 200Iterations
0
10
20
30
40
50Fi
tnes
s val
ue
20 40 60 800
10
20
30
k = 02k = 2k = 1
(g)
50 100 150 200Iterations
0
1
2
3
4
Fitn
ess v
alue
2 4 6 8 10 120
1
2
3
k = 02k = 2k = 1
(h)
0 50 100 150 200Iterations
ndash1865
ndash186
ndash1855
ndash185
ndash1845
ndash184
Fitn
ess v
alue
10 20 30 40
ndash1865
ndash186
ndash1855
ndash185
k = 02k = 2k = 1
(i)
50 100 150 200Iterations
0
5
10
15
Fitn
ess v
alue
10 20 300
2
4
k = 02k = 2k = 1
(j)
0 50 100 150 200Iterations
ndash840
ndash820
ndash800
ndash780
ndash760
ndash740
ndash720
Fitn
ess v
alue
k = 02k = 2k = 1
60 65 70 75 80ndash838ndash836ndash834ndash832ndash830
(k)
Figure 4 11 test functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquare function (e) f5 Zakharovfunction (f ) f6 Griewank function (g) f7 Rastrigin function (h) f8 alpine function (i) f9 Shubert function (j) f10 Ackley function (k) f11Cmfun function
12 Computational Intelligence and Neuroscience
simulation diagrams (a)ndash(e) show that although the DEalgorithm converges rapidly in the early stage its globalsearch performance in the later stage becomes lower thanthat of the IEPSO algorithm When the GA is used to solveoptimization problems the individuals in the population fallinto the local optimum and do not continue searching forthe optimum solution erefore in Figure 5 the simulationcurve of the GA converges to the local optimum
e test results in Table 8 indicate that the IEPSO has thebest performance on f6 f7 f8 f9 f10 and f11 and that the DE
and GA can obtain the theoretical optimal value on f9 and f11Although the GA and IEPSO algorithm can obtain the globaloptimal value on f9 the IEPSO algorithm is more robust thanthe GA is As shown in the simulation curve of Figure 6 thediversity of the population is maintained because the sup-plementary particles in the population are stochastic whenthe local optimal solution converges gradually e IEPSOalgorithm can jump out of the local extrema points in theface of complex multimodal test functions and the numberof iterations required is correspondingly reduced
Table 6 Parameter settings
Algorithm Population Maximum iteration Dim of each object OthersPSO 40 1000 10 C1 C2 2 R1 R2 05SPSO 40 1000 10 ω 09ndash04 C1 C2 2 R1 R2 05DE 40 1000 10 mdashGA 40 1000 10 GGAP 05 PRECI 25IEPSO 40 1000 10 ω 09ndash04 C1 C2 2 C3 2ndash0 R1 R2 R3 05
Table 7 Unimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f1Mean 133E + 03 308E + 03 731E minus 12 892E minus 22 11696SD 253E + 05 121E + 06 225E minus 23 265E minus 39 44192Best 114E + 03 120E + 03 242E minus 12 772E minus 27 4660
f2Mean 296E minus 02 880E minus 02 837E minus 06 0 179E minus 11SD 836E minus 04 896E minus 04 158E minus 10 0 0Best 455E minus 03 8428734 755E minus 10 0 179E minus 11
f3Mean 119E + 03 251E + 03 114E minus 11 621E minus 19 7430SD 293E + 05 182E + 06 995E minus 23 263E minus 36 5833Best 106E + 03 282E minus 02 210E minus 12 181E minus 27 4542
f4Mean 8238 8210 336E minus 13 170E minus 21 3031SD 686E + 02 140E + 03 995E minus 26 131E minus 41 0835Best 115E + 02 3739 115E minus 13 282E minus 29 1968
f5Mean 126E + 04 860E + 03 702E minus 12 165E minus 10 362E + 03SD 206E + 07 215E + 07 181E minus 23 330E minus 20 344E + 05Best 104E + 04 130E + 02 267E minus 12 217E minus 11 253E + 03
Table 8 Multimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f6Mean 1548 1752 944E minus 02 419E minus 02 1006SD 0026 0093 487E minus 04 343E minus 04 0018Best 1236 1417 006 0013 0794
f7Mean 57737 43405 11945 446E minus 03 8939SD 117768 65178 16502 173E minus 04 3608Best 35981 317E + 01 6398 231E minus 12 5040
f8Mean 4996 4665 379E minus 02 242E minus 10 0423SD 191E + 00 1056 54E minus 03 674E minus 20 0051Best 2933 3151 46E minus 03 371E minus 16 0086
f9Mean minus186448 minus186048 minus186728 minus186731 minus186731SD 119E minus 01 983E minus 01 229E minus 08 0 999E minus 12Best minus187E + 02 minus186731 minus1867309 minus1867309 minus186731
f10Mean 13134 15560 1613 113E minus 11 2515SD 14260 2163 0 221E minus 22 0166Best 2861 12719 1613 506E minus 14 1796
f11Mean minus740326 minus715438 minus837966 minus837966 minus837966SD 874E + 03 723E + 03 0 0 0Best minus837966 minus837697 minus837966 minus837966 minus837966
Computational Intelligence and Neuroscience 13
0 100 200 300 400 500 600Iterations
0
2000
4000
6000
8000
10000
12000Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
DE
GASPSO
PSOIEPSO
0 100 200 300 400 500 600Iterations
0
01
02
03
04
05
Fitn
ess v
alue
(b)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
5000
10000
15000
Fitn
ess v
alue
(c)
DE
GAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
200
400
600
800
Fitn
ess v
alue
(d)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
1
2
3
4
Fitn
ess v
alue
times104
(e)
Figure 5 Unimodal functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquares function (e) f5Zakharov function
14 Computational Intelligence and Neuroscience
[Convergence plots omitted; each panel: iterations (0–600, x-axis) vs. fitness value (y-axis), with curves for DE, GA, PSO, SPSO, and IEPSO.]
Figure 6: Multimodal functions. (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the BHPSOWM algorithm in [26] is a binary PSO algorithm based on wavelet mutation. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion

In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, namely the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results support the following conclusions:
(1) The exchange of information between the global and local optimal particles enhances the deep search capability of the IEPSO algorithm.

(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 is linearly decreasing. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.

(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the algorithm from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
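Taken together, conclusions (1) and (2) describe a velocity update carrying an extra coupling term between the global and local bests. A minimal sketch of one dimension of such an update follows; the exact coefficient values and the form of the sharing term c3*r3*(gbest − pbest_i) are assumptions reconstructed from the description above, not the authors' verbatim formula.

```python
import random

def iepso_velocity(v, x, pbest_i, gbest, w=0.9, c1=2.0, c2=2.0, c3=2.0):
    """One-dimensional IEPSO-style velocity update (hedged sketch).

    Extends the classical PSO update with an assumed local-global
    information-sharing term c3 * r3 * (gbest - pbest_i).
    """
    r1, r2, r3 = random.random(), random.random(), random.random()
    return (w * v
            + c1 * r1 * (pbest_i - x)       # cognitive (local best) term
            + c2 * r2 * (gbest - x)         # social (global best) term
            + c3 * r3 * (gbest - pbest_i))  # local-global information sharing (assumed form)
```

When a particle's personal best coincides with the global best, the sharing term vanishes and the update reduces to the standard PSO rule.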
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term to the IEPSO, the proposed algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows an ideal global optimization performance and a high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate quotient-difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system
Table 9: Three improved particle swarm algorithm test results.

Functions | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26]
f1 | Mean | 8.92E−22 | 4.73E−10 | 42.40
f1 | SD | 2.65E−39 | 1.81E−09 | 52.11
f3 | Mean | 6.21E−19 | 2.37E+03 | 7.61
f3 | SD | 2.63E−36 | 5.71E+02 | 0.07
f6 | Mean | 4.19E−02 | 8.66E−05 | —
f6 | SD | 3.43E−04 | 2.96E−04 | —
f7 | Mean | 4.46E−03 | 9.15E+01 | 76.18
f7 | SD | 1.73E−04 | 1.80E+01 | 26.75
f8 | Mean | 2.42E−10 | 1.31E+02 | —
f8 | SD | 6.74E−20 | 5.82E+01 | —
f10 | Mean | 1.13E−11 | 1.01E+00 | 1.72
f10 | SD | 2.21E−22 | 2.71E−01 | 0
identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
[Plots omitted; panels (g)–(k) each pair a 3-D surface of the test function with a convergence curve (iterations, 0–200, vs. fitness value) for k = −1, k = 1, and constant.]
Figure 2: 11 test functions. (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
Table 4: Unimodal test functions.

Functions | Criteria | C3: 2→0, k = 0.2 | C3: 2→0, k = 2 | C3: 2→0, k = 1
f1 | Mean | 2.66E−20 | 5.51E−10 | 4.50E−20
f1 | SD | 2.65E−39 | 2.87E−19 | 3.75E−16
f1 | Best | 9.12E−24 | 1.38E−11 | 1.55E−25
f2 | Mean | 0 | 0 | 0
f2 | SD | 0 | 0 | 0
f2 | Best | 0 | 0 | 0
f3 | Mean | 6.21E−19 | 6.04E−10 | 1.82E−20
f3 | SD | 2.63E−36 | 7.79E−19 | 1.05E−39
f3 | Best | 1.81E−27 | 3.08E−11 | 3.22E−24
f4 | Mean | 1.70E−21 | 2.42E−11 | 8.20E−20
f4 | SD | 1.31E−41 | 4.40E−22 | 5.11E−38
f4 | Best | 2.82E−29 | 4.36E−12 | 8.43E−26
f5 | Mean | 1.65E−10 | 2.83E−11 | 5.56E−11
f5 | SD | 3.30E−20 | 3.59E−11 | 4.88E−14
f5 | Best | 2.17E−11 | 1.00E−11 | 3.54E−11
not show its advantages because of its strong deep search capability. In the complex multimodal test functions, when a convex function is used for C3, the downward trend is slow in the early stage, thus benefiting the global search, and the downward speed increases in the later stage, thus benefiting the local search. When a concave function is used for C3, the descent speed is fast in the early stage; although the search speed is improved, the coverage area of the search is reduced, thereby leading to convergence of the algorithm to a nonoptimal value. From the simulation diagrams (f)–(k), the convergence speed is observed to be slightly slow when C3 is a convex function, but its ability to jump out of local extrema and the accuracy of the global search are higher than those in the other two cases. When C3 is a concave function, the convergence speed is faster than in the other two cases, but the search accuracy is lower than when C3 is a convex function.
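The three C3 schedules compared above (and plotted in Figure 3) can be sketched with a simple power-law decay. The functional form C3(t) = C3_init · (1 − t/T)^k is an assumption consistent with a coefficient falling from 2 to 0 over the run, not the paper's exact formula:

```python
def c3_schedule(t, T, k, c3_init=2.0):
    """Information-sharing coefficient decaying from c3_init to 0 over T
    iterations; k shapes the curve (k = 1 linear, k != 1 nonlinear)."""
    return c3_init * (1.0 - t / T) ** k

T = 200
curves = {k: [c3_schedule(t, T, k) for t in range(T + 1)] for k in (0.2, 1, 2)}
```

With k = 0.2 the coefficient stays high for most of the run and drops sharply near the end (slow early decline, fast late decline), while k = 2 gives the opposite behavior; k = 1 is the linear schedule between them.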
3.2. Comparison of Test Results. The 11 test functions in Figure 1 are used to compare the IEPSO algorithm with the classical PSO, the SPSO, the differential evolution (DE) algorithm, and the GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms with population iterations. The evaluation criteria of algorithm performance include the speed of convergence and the size of the individual population search coverage. The differential evolution algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).

The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value among the five algorithms is highlighted in bold. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
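The evaluation protocol just described (10 independent runs per function; mean, standard deviation, and best value reported) can be sketched as below, where run_algorithm is a stand-in for any one of the five optimizers and is not a function from the paper:

```python
import statistics

def summarize(run_algorithm, runs=10):
    """Run an optimizer `runs` times and report the criteria used in
    Tables 7 and 8: mean final fitness, standard deviation, and best value."""
    finals = [run_algorithm() for _ in range(runs)]
    return {"Mean": statistics.mean(finals),
            "SD": statistics.stdev(finals),
            "Best": min(finals)}
```

Because the optimizers are stochastic, reporting all three criteria separates average quality (Mean), run-to-run stability (SD), and peak performance (Best).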
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4, and the IEPSO algorithm obtains the theoretical optimal value on f2; DE can search the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms owing to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the later stage because of population differences.
[Plot omitted; iterations (x-axis) vs. C3 (y-axis), with curves for k = 2, k = 0.2, and k = 1.]
Figure 3: The change curve of C3 with the number of iterations.
Table 5: Multimodal test functions.

Functions | Criteria | C3: 2→0, k = 0.2 | C3: 2→0, k = 2 | C3: 2→0, k = 1
f6 | Mean | 4.19E−02 | 4.79E−02 | 4.92E−02
f6 | SD | 3.43E−04 | 7.07E−04 | 5.96E−04
f6 | Best | 1.25E−02 | 5.7E−03 | 1.23E−02
f7 | Mean | 4.46E−03 | 5.00E−05 | 1.9E−04
f7 | SD | 1.73E−04 | 3.03E−06 | 56.49
f7 | Best | 2.31E−12 | 3.89E−11 | 2.25E−05
f8 | Mean | 2.42E−10 | 3.74E−10 | 5.28E−10
f8 | SD | 6.74E−20 | 2.47E−12 | 2.23E−12
f8 | Best | 3.71E−16 | 4.36E−11 | 5.83E−13
f9 | Mean | −186.7309 | −186.7309 | −186.7309
f9 | SD | 0 | 0 | 0
f9 | Best | −186.7309 | −186.7309 | −186.7309
f10 | Mean | 1.13E−11 | 2.05E−10 | 1.84E−11
f10 | SD | 2.21E−22 | 4.37E−12 | 2.27E−22
f10 | Best | 5.06E−14 | 1.75E−10 | 2.50E−12
f11 | Mean | −837.9658 | −837.9658 | −837.9658
f11 | SD | 0 | 0 | 4.40E−09
f11 | Best | −837.9658 | −837.9658 | −837.9658
[Convergence plots omitted; each panel: iterations (0–200, x-axis) vs. fitness value (y-axis) with a zoomed inset, and curves for k = 0.2, k = 2, and k = 1.]
Figure 4: 11 test functions. (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquare function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
The simulation diagrams (a)–(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm can obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles added to the population are stochastic as the swarm gradually converges toward a local optimal solution. The IEPSO algorithm can thus jump out of local extremum points in the face of complex multimodal test functions, and the number of iterations required is correspondingly reduced.
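The mechanism described here, re-seeding part of the swarm with stochastic supplementary particles as convergence sets in, is the last-eliminated principle. A minimal sketch follows; the eliminated fraction and the search bounds are illustrative assumptions, not values from the paper:

```python
import random

def last_eliminated(positions, fitness, elim_frac=0.2, bounds=(-10.0, 10.0)):
    """Replace the worst-scoring fraction of particles with fresh random
    positions so the swarm keeps exploring instead of stagnating."""
    n, dim = len(positions), len(positions[0])
    # Rank particles by fitness (higher = worse for minimization problems).
    worst = sorted(range(n), key=lambda i: fitness(positions[i]),
                   reverse=True)[:max(1, int(elim_frac * n))]
    for i in worst:
        positions[i] = [random.uniform(*bounds) for _ in range(dim)]
    return positions
```

Because the replacements are drawn uniformly at random from the search space, population diversity is restored each generation at the cost of re-evaluating a few particles.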
Table 6: Parameter settings.

Algorithm | Population | Maximum iterations | Dim of each object | Others
PSO | 40 | 1000 | 10 | C1 = C2 = 2, R1 = R2 = 0.5
SPSO | 40 | 1000 | 10 | ω: 0.9–0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE | 40 | 1000 | 10 | —
GA | 40 | 1000 | 10 | GGAP = 0.5, PRECI = 25
IEPSO | 40 | 1000 | 10 | ω: 0.9–0.4, C1 = C2 = 2, C3: 2–0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E−12 | 8.92E−22 | 116.96
f1 | SD | 2.53E+05 | 1.21E+06 | 2.25E−23 | 2.65E−39 | 441.92
f1 | Best | 1.14E+03 | 1.20E+03 | 2.42E−12 | 7.72E−27 | 46.60
f2 | Mean | 2.96E−02 | 8.80E−02 | 8.37E−06 | 0 | 1.79E−11
f2 | SD | 8.36E−04 | 8.96E−04 | 1.58E−10 | 0 | 0
f2 | Best | 4.55E−03 | 8428734 | 7.55E−10 | 0 | 1.79E−11
f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E−11 | 6.21E−19 | 74.30
f3 | SD | 2.93E+05 | 1.82E+06 | 9.95E−23 | 2.63E−36 | 58.33
f3 | Best | 1.06E+03 | 2.82E−02 | 2.10E−12 | 1.81E−27 | 45.42
f4 | Mean | 82.38 | 82.10 | 3.36E−13 | 1.70E−21 | 30.31
f4 | SD | 6.86E+02 | 1.40E+03 | 9.95E−26 | 1.31E−41 | 0.835
f4 | Best | 1.15E+02 | 37.39 | 1.15E−13 | 2.82E−29 | 19.68
f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E−12 | 1.65E−10 | 3.62E+03
f5 | SD | 2.06E+07 | 2.15E+07 | 1.81E−23 | 3.30E−20 | 3.44E+05
f5 | Best | 1.04E+04 | 1.30E+02 | 2.67E−12 | 2.17E−11 | 2.53E+03

Table 8: Multimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f6 | Mean | 1.548 | 1.752 | 9.44E−02 | 4.19E−02 | 1.006
f6 | SD | 0.026 | 0.093 | 4.87E−04 | 3.43E−04 | 0.018
f6 | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794
f7 | Mean | 577.37 | 434.05 | 119.45 | 4.46E−03 | 89.39
f7 | SD | 1177.68 | 651.78 | 165.02 | 1.73E−04 | 36.08
f7 | Best | 359.81 | 3.17E+01 | 63.98 | 2.31E−12 | 50.40
f8 | Mean | 4.996 | 4.665 | 3.79E−02 | 2.42E−10 | 0.423
f8 | SD | 1.91E+00 | 1.056 | 5.4E−03 | 6.74E−20 | 0.051
f8 | Best | 2.933 | 3.151 | 4.6E−03 | 3.71E−16 | 0.086
f9 | Mean | −186.448 | −186.048 | −186.728 | −186.731 | −186.731
f9 | SD | 1.19E−01 | 9.83E−01 | 2.29E−08 | 0 | 9.99E−12
f9 | Best | −1.87E+02 | −186.731 | −186.7309 | −186.7309 | −186.731
f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E−11 | 2.515
f10 | SD | 14.260 | 2.163 | 0 | 2.21E−22 | 0.166
f10 | Best | 2.861 | 12.719 | 1.613 | 5.06E−14 | 1.796
f11 | Mean | −740.326 | −715.438 | −837.966 | −837.966 | −837.966
f11 | SD | 8.74E+03 | 7.23E+03 | 0 | 0 | 0
f11 | Best | −837.966 | −837.697 | −837.966 | −837.966 | −837.966
[32] H Samareh S H Khoshrou K Shahriar M M Ebadzadehand M Eslami ldquoOptimization of a nonlinear model forpredicting the ground vibration using the combinationalparticle swarm optimization-genetic algorithmrdquo Journal ofAfrican Earth Sciences vol 133 pp 36ndash45 2017
[33] M Dash T Panigrahi and R Sharma ldquoDistributed parameterestimation of IIR system using diffusion particle swarm op-timization algorithmrdquo Journal of King Saud University-Engineering Sciences 2017 In press
[34] B Wang S Li J Guo and Q Chen ldquoCar-like mobile robotpath planning in rough terrain using multi-objective particleswarm optimization algorithmrdquo Neurocomputing vol 282pp 42ndash51 2018
[35] Z Wang and J Cai ldquoe path-planning in radioactive en-vironment of nuclear facilities using an improved particleswarm optimization algorithmrdquo Nuclear Engineering amp De-sign vol 326 pp 79ndash86 2018
[36] R F Lopes F F Costa A Oliveira et al ldquoAlgorithm based onparticle swarm applied to electrical load scheduling in anindustrial settingrdquo Energy vol 147 pp 1007ndash1015 2018
[37] F Sheikholeslami and N J Navimipour ldquoService allocation inthe cloud environments using multi-objective particle swarmoptimization algorithm based on crowding distancerdquo Swarmamp Evolutionary Computation vol 35 pp 53ndash64 2017
[38] M Petrovic N Vukovic M Mitic et al ldquoIntegration ofprocess planning and scheduling using chaotic particle swarmoptimization algorithmrdquo Expert Systems with Applicationsvol 64 pp 569ndash588 2016
[39] Z Zhang Y Jiang S Zhang S Geng H Wang and G SangldquoAn adaptive particle swarm optimization algorithm forreservoir operation optimizationrdquo Applied Soft ComputingJournal vol 18 no 4 pp 167ndash177 2014
[40] K Li L Liu J Zhai T M Khoshgoftaar and T Li ldquoeimproved grey model based on particle swarm optimizationalgorithm for time series predictionrdquo Engineering Applica-tions of Artificial Intelligence vol 55 pp 285ndash291 2016
[41] S Gulcu and H Kodaz ldquoe estimation of the electricityenergy demand using particle swarm optimization algorithma case study of Turkeyrdquo Procedia Computer Science vol 111pp 64ndash70 2017
Computational Intelligence and Neuroscience 17
Computer Games Technology
International Journal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom
Journal ofEngineeringVolume 2018
Advances in
FuzzySystems
Hindawiwwwhindawicom
Volume 2018
International Journal of
ReconfigurableComputing
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Applied Computational Intelligence and Soft Computing
thinspAdvancesthinspinthinsp
thinspArtificial Intelligence
Hindawiwwwhindawicom Volumethinsp2018
Hindawiwwwhindawicom Volume 2018
Civil EngineeringAdvances in
Hindawiwwwhindawicom Volume 2018
Electrical and Computer Engineering
Journal of
Journal of
Computer Networks and Communications
Hindawiwwwhindawicom Volume 2018
Hindawi
wwwhindawicom Volume 2018
Advances in
Multimedia
International Journal of
Biomedical Imaging
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Engineering Mathematics
International Journal of
RoboticsJournal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Computational Intelligence and Neuroscience
Hindawiwwwhindawicom Volume 2018
Mathematical Problems in Engineering
Modelling ampSimulationin EngineeringHindawiwwwhindawicom Volume 2018
Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom
The Scientific World Journal
Volume 2018
Hindawiwwwhindawicom Volume 2018
Human-ComputerInteraction
Advances in
Hindawiwwwhindawicom Volume 2018
Scientic Programming
Submit your manuscripts atwwwhindawicom
Figure 2: 11 test functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function. Panels (j) and (k) plot fitness value against iterations for k = -1, k = 1, and constant C3.
Table 4: Unimodal test functions.

Functions | Criteria | C3: 2-0, k = 0.2 | C3: 2-0, k = 2 | C3: 2-0, k = 1
f1 | Mean | 2.66E-20 | 5.51E-10 | 4.50E-20
f1 | SD | 2.65E-39 | 2.87E-19 | 3.75E-16
f1 | Best | 9.12E-24 | 1.38E-11 | 1.55E-25
f2 | Mean | 0 | 0 | 0
f2 | SD | 0 | 0 | 0
f2 | Best | 0 | 0 | 0
f3 | Mean | 6.21E-19 | 6.04E-10 | 1.82E-20
f3 | SD | 2.63E-36 | 7.79E-19 | 1.05E-39
f3 | Best | 1.81E-27 | 3.08E-11 | 3.22E-24
f4 | Mean | 1.70E-21 | 2.42E-11 | 8.20E-20
f4 | SD | 1.31E-41 | 4.40E-22 | 5.11E-38
f4 | Best | 2.82E-29 | 4.36E-12 | 8.43E-26
f5 | Mean | 1.65E-10 | 2.83E-11 | 5.56E-11
f5 | SD | 3.30E-20 | 3.59E-11 | 4.88E-14
f5 | Best | 2.17E-11 | 1.00E-11 | 3.54E-11
Computational Intelligence and Neuroscience 9
not show its advantages because of its strong deep-search capability. In the complex multimodal test functions, when a convex function is used for C3, the downward trend is slow in the early stage, which benefits the global search, and the downward speed increases in the later stage, which benefits the local search. When a concave function is used for C3, the descent speed is fast in the early stage. Although the search speed is improved, the coverage area of the search is reduced, which leads the algorithm to converge to a nonoptimal value. The simulation diagrams (f)-(k) show that the convergence speed is slightly slow when C3 is a convex function, but the ability to jump out of local extrema and the accuracy of the global search are higher than in the other two cases. When C3 is a concave function, the convergence speed is faster than in the other two cases, but the search accuracy is lower than when C3 is a convex function.
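The three decay profiles discussed above can be reproduced with a small numerical sketch. The closed form of C3 is not restated in this excerpt, so the schedule below, C3(t) = C3_max * (1 - (t/T)^k) with C3_max = 2 (matching the "C3: 2-0" setting of Tables 4 and 5), is an assumption chosen to match the described behaviour: k > 1 decays slowly early and quickly late, k < 1 decays quickly early, and k = 1 is linear.

```python
# Hedged sketch of the nonlinear C3 schedule; the closed form is an
# assumption consistent with "C3: 2 -> 0" and the decay behaviour
# described in the text (k > 1: slow early / fast late; k < 1: reverse).
def c3_schedule(t, t_max, k, c3_max=2.0):
    """Local-global information-sharing coefficient at iteration t."""
    return c3_max * (1.0 - (t / t_max) ** k)

T = 200
assert c3_schedule(0, T, k=2) == 2.0 and c3_schedule(T, T, k=2) == 0.0
# At mid-run the k = 2 curve is still high (slow early decay), while the
# k = 0.2 curve has already dropped (fast early decay).
assert c3_schedule(T // 2, T, k=0.2) < c3_schedule(T // 2, T, k=1) < c3_schedule(T // 2, T, k=2)
```

Plotting `c3_schedule` over t = 0..T for k in {0.2, 1, 2} yields curves of the same shape as Figure 3.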
3.2. Comparison of Test Results. The 11 test functions in Figure 1 are used to compare the IEPSO algorithm with the classical PSO, SPSO, differential evolution (DE), and GA. The DE, GA, and PSO algorithms are all stochastic intelligent optimization algorithms based on population iteration. The evaluation criteria of algorithm performance include the speed of convergence and the extent of the population's search coverage. The differential evolution algorithm has a low space complexity and obvious advantages in dealing with large-scale and complex optimization problems. The GA has good convergence when solving discrete, multipeak, and noise-containing optimization problems. Based on the traditional PSO algorithm, the SPSO algorithm achieves a balance between global search and local search by adjusting the inertia weight (Figures 3 and 4).
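The inertia-weight adjustment mentioned for SPSO (and listed as ω = 0.9-0.4 for SPSO and IEPSO in Table 6) is commonly the linearly decreasing scheme of Shi and Eberhart [21]. This excerpt only states the two endpoints, so the linear form below is an assumption.

```python
# Linearly decreasing inertia weight, omega: 0.9 -> 0.4 over the run.
# The linear form is an assumption; Table 6 only gives the endpoints.
def inertia_weight(t, t_max, w_start=0.9, w_end=0.4):
    return w_start - (w_start - w_end) * t / t_max

assert inertia_weight(0, 1000) == 0.9
assert abs(inertia_weight(1000, 1000) - 0.4) < 1e-12
assert abs(inertia_weight(500, 1000) - 0.65) < 1e-12  # halfway point
```

A large early ω favours exploration; the small late ω favours exploitation, which is the global/local balance described above.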
The experimental parameters of the five algorithms are set as shown in Table 6. Each test function is run independently 10 times, and the average is recorded to reduce the data error. The iteration is stopped when the convergence condition meets the convergence accuracy. The best average fitness value among the five algorithms is shown in bold. The standard deviation, average fitness, and optimal value of each algorithm are shown in Tables 7 and 8. Figures 5 and 6 plot the convergence curves of the 11 test functions.
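The Mean/SD/Best criteria reported in Tables 7 and 8 follow from a standard aggregation over the 10 independent runs. A minimal sketch, in which the per-run results are made-up placeholder numbers:

```python
import statistics

# Aggregate per-run best fitness values into the three table criteria.
# The run results below are made-up placeholders for illustration only.
runs = [8.9e-22, 1.2e-21, 7.7e-27, 3.4e-22, 5.0e-22,
        9.1e-23, 2.2e-21, 6.6e-22, 1.0e-21, 4.8e-22]

mean_val = statistics.mean(runs)   # "Mean" column
sd_val = statistics.stdev(runs)    # "SD" column (sample standard deviation)
best_val = min(runs)               # "Best" column (minimization problems)
assert best_val == 7.7e-27
```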
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4. The IEPSO algorithm obtains the theoretical optimal value on f2, and DE can search the global solution on f5. The deep-search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms owing to the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the later stage because of population differences. The
Figure 3: The change curve of C3 with the number of iterations (k = 2, k = 0.2, and k = 1).
Table 5: Multimodal test functions.

Functions | Criteria | C3: 2-0, k = 0.2 | C3: 2-0, k = 2 | C3: 2-0, k = 1
f6 | Mean | 4.19E-02 | 4.79E-02 | 4.92E-02
f6 | SD | 3.43E-04 | 7.07E-04 | 5.96E-04
f6 | Best | 1.25E-02 | 5.7E-03 | 1.23E-02
f7 | Mean | 4.46E-03 | 5.00E-05 | 1.9E-04
f7 | SD | 1.73E-04 | 3.03E-06 | 56.49
f7 | Best | 2.31E-12 | 3.89E-11 | 2.25E-05
f8 | Mean | 2.42E-10 | 3.74E-10 | 5.28E-10
f8 | SD | 6.74E-20 | 2.47E-12 | 2.23E-12
f8 | Best | 3.71E-16 | 4.36E-11 | 5.83E-13
f9 | Mean | -186.7309 | -186.7309 | -186.7309
f9 | SD | 0 | 0 | 0
f9 | Best | -186.7309 | -186.7309 | -186.7309
f10 | Mean | 1.13E-11 | 2.05E-10 | 1.84E-11
f10 | SD | 2.21E-22 | 4.37E-12 | 2.27E-22
f10 | Best | 5.06E-14 | 1.75E-10 | 2.50E-12
f11 | Mean | -837.9658 | -837.9658 | -837.9658
f11 | SD | 0 | 0 | 4.40E-09
f11 | Best | -837.9658 | -837.9658 | -837.9658
Figure 4: 11 test functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function. Each panel plots fitness value against iterations for k = 0.2, k = 2, and k = 1.
simulation diagrams (a)-(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm can obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, the diversity of the population is maintained because the supplementary particles added to the population are stochastic as the algorithm gradually converges toward a local optimal solution. The IEPSO algorithm can jump out of local extremum points when faced with complex multimodal test functions, and the number of iterations required is correspondingly reduced.
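The last-eliminated mechanism described above, in which the worst particles are removed and replaced by stochastic supplementary particles, can be sketched as follows. The elimination fraction and trigger condition are not fixed in this excerpt, so eliminating the worst 20% of the swarm per call is an illustrative assumption.

```python
import random

def last_eliminated(positions, fitness, bounds, frac=0.2):
    """Replace the worst `frac` of particles with fresh random particles.

    `positions`: list of coordinate lists; `fitness`: matching values
    (lower is better); `bounds`: one (low, high) pair per dimension.
    The 20% elimination fraction is an illustrative assumption.
    """
    n = len(positions)
    n_out = max(1, int(frac * n))
    # Indices sorted worst-first (largest fitness is worst for minimization).
    worst = sorted(range(n), key=lambda i: fitness[i], reverse=True)[:n_out]
    for i in worst:
        # Stochastic supplementary particle keeps population diversity up.
        positions[i] = [random.uniform(lo, hi) for lo, hi in bounds]
    return positions

random.seed(0)
pos = [[1.0, 1.0], [5.0, 5.0], [0.1, 0.2], [9.0, 9.0], [2.0, 2.0]]
fit = [2.0, 50.0, 0.05, 162.0, 8.0]
bounds = [(-10.0, 10.0)] * 2
last_eliminated(pos, fit, bounds)  # the worst particle (fitness 162.0) is re-seeded
```

Because the replacements are drawn uniformly from the search bounds, the swarm retains exploratory particles even after the rest has clustered around a local optimum.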
Table 6: Parameter settings.

Algorithm | Population | Maximum iteration | Dim of each object | Others
PSO | 40 | 1000 | 10 | C1 = C2 = 2, R1 = R2 = 0.5
SPSO | 40 | 1000 | 10 | ω = 0.9-0.4, C1 = C2 = 2, R1 = R2 = 0.5
DE | 40 | 1000 | 10 | —
GA | 40 | 1000 | 10 | GGAP = 0.5, PRECI = 25
IEPSO | 40 | 1000 | 10 | ω = 0.9-0.4, C1 = C2 = 2, C3 = 2-0, R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E-12 | 8.92E-22 | 11.696
f1 | SD | 2.53E+05 | 1.21E+06 | 2.25E-23 | 2.65E-39 | 44.192
f1 | Best | 1.14E+03 | 1.20E+03 | 2.42E-12 | 7.72E-27 | 4.660
f2 | Mean | 2.96E-02 | 8.80E-02 | 8.37E-06 | 0 | 1.79E-11
f2 | SD | 8.36E-04 | 8.96E-04 | 1.58E-10 | 0 | 0
f2 | Best | 4.55E-03 | 8428734 | 7.55E-10 | 0 | 1.79E-11
f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E-11 | 6.21E-19 | 7.430
f3 | SD | 2.93E+05 | 1.82E+06 | 9.95E-23 | 2.63E-36 | 5.833
f3 | Best | 1.06E+03 | 2.82E-02 | 2.10E-12 | 1.81E-27 | 4.542
f4 | Mean | 8238 | 8210 | 3.36E-13 | 1.70E-21 | 3.031
f4 | SD | 6.86E+02 | 1.40E+03 | 9.95E-26 | 1.31E-41 | 0.835
f4 | Best | 1.15E+02 | 3739 | 1.15E-13 | 2.82E-29 | 1.968
f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E-12 | 1.65E-10 | 3.62E+03
f5 | SD | 2.06E+07 | 2.15E+07 | 1.81E-23 | 3.30E-20 | 3.44E+05
f5 | Best | 1.04E+04 | 1.30E+02 | 2.67E-12 | 2.17E-11 | 2.53E+03
Table 8: Multimodal test functions.

Functions | Criteria | PSO | SPSO | DE | IEPSO | GA
f6 | Mean | 1.548 | 1.752 | 9.44E-02 | 4.19E-02 | 1.006
f6 | SD | 0.026 | 0.093 | 4.87E-04 | 3.43E-04 | 0.018
f6 | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794
f7 | Mean | 57.737 | 43.405 | 11.945 | 4.46E-03 | 8.939
f7 | SD | 117.768 | 65.178 | 16.502 | 1.73E-04 | 3.608
f7 | Best | 35.981 | 3.17E+01 | 6.398 | 2.31E-12 | 5.040
f8 | Mean | 4.996 | 4.665 | 3.79E-02 | 2.42E-10 | 0.423
f8 | SD | 1.91E+00 | 1.056 | 5.4E-03 | 6.74E-20 | 0.051
f8 | Best | 2.933 | 3.151 | 4.6E-03 | 3.71E-16 | 0.086
f9 | Mean | -186.448 | -186.048 | -186.728 | -186.731 | -186.731
f9 | SD | 1.19E-01 | 9.83E-01 | 2.29E-08 | 0 | 9.99E-12
f9 | Best | -1.87E+02 | -186.731 | -186.7309 | -186.7309 | -186.731
f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E-11 | 2.515
f10 | SD | 14.260 | 2.163 | 0 | 2.21E-22 | 0.166
f10 | Best | 2.861 | 12.719 | 1.613 | 5.06E-14 | 1.796
f11 | Mean | -740.326 | -715.438 | -837.966 | -837.966 | -837.966
f11 | SD | 8.74E+03 | 7.23E+03 | 0 | 0 | 0
f11 | Best | -837.966 | -837.697 | -837.966 | -837.966 | -837.966
Figure 5: Unimodal functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function. Each panel plots the convergence curves of DE, GA, PSO, SPSO, and IEPSO.
Figure 6: Multimodal functions: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function. Each panel plots the convergence curves of DE, GA, PSO, SPSO, and IEPSO.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the BHPSOWM algorithm in [26] is a binary hybrid PSO algorithm based on wavelet mutation. Table 9 shows that the IEPSO algorithm obtains the best value in 5 of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, namely, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:

(1) The exchange of information between global and local optimal particles enhances the deep-search capability of the IEPSO algorithm.

(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly; moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.

(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity; moreover, it prevents the PSO from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
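A one-particle sketch of the velocity update summarized in conclusions (1)-(3): the first two terms are the classical cognitive and social terms, and the third is the local-global information sharing term. Its exact form is not restated in this excerpt, so the C3·R3·(gbest - pbest) term below is an assumption built from the coefficients C1, C2, C3 and R1, R2, R3 listed in Table 6; C3 would decay from 2 to 0 over the run, and a fixed mid-range value is used here for illustration.

```python
# Hedged one-particle IEPSO velocity/position update; the third term
# (local-global information sharing) is an assumed form, and the fixed
# c3 = 1.0 stands in for the time-varying C3: 2 -> 0 schedule.
def iepso_update(x, v, pbest, gbest, w, c1=2.0, c2=2.0, c3=1.0,
                 r1=0.5, r2=0.5, r3=0.5):
    v_new = [w * vi
             + c1 * r1 * (pb - xi)   # cognitive (local) term
             + c2 * r2 * (gb - xi)   # social (global) term
             + c3 * r3 * (gb - pb)   # local-global information sharing
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]
    return x_new, v_new

x, v = [1.0, -1.0], [0.0, 0.0]
x_new, v_new = iepso_update(x, v, pbest=[0.5, -0.5], gbest=[0.0, 0.0], w=0.9)
assert v_new == [-1.75, 1.75]
```

The third term pulls the personal best toward the global best even when the particle itself is already near one of them, which is the global-local information exchange credited above for the improved deep search.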
In summary, the comparative simulation analysis reveals that, with the application of the last-eliminated principle and the local-global information sharing term, the proposed IEPSO algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows an ideal global optimization performance and a high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
Table 9: Three improved particle swarm algorithm test results.

Functions | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26]
f1 | Mean | 8.92E-22 | 4.73E-10 | 42.40
f1 | SD | 2.65E-39 | 1.81E-09 | 52.11
f3 | Mean | 6.21E-19 | 2.37E+03 | 7.61
f3 | SD | 2.63E-36 | 5.71E+02 | 0.07
f6 | Mean | 4.19E-02 | 8.66E-05 | —
f6 | SD | 3.43E-04 | 2.96E-04 | —
f7 | Mean | 4.46E-03 | 9.15E+01 | 76.18
f7 | SD | 1.73E-04 | 1.80E+01 | 26.75
f8 | Mean | 2.42E-10 | 1.31E+02 | —
f8 | SD | 6.74E-20 | 5.82E+01 | —
f10 | Mean | 1.13E-11 | 1.01E+00 | 1.72
f10 | SD | 2.21E-22 | 2.71E-01 | 0

References

[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639-651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245-271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424-433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292-305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153-167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169-181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65-74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526-535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668-14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25-32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122-134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634-654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542-552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52-72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584-596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670-678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371-383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831-841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314-330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273-298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945-1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240-255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23-40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90-101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028-1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832-843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138-149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186-217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764-775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36-45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42-51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007-1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53-64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569-588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167-177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285-291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64-70, 2017.
not show its advantages because of its strong deep searchcapability In the complex multimodal test function whenthe convex function is used in C3 the downward trend isslow in the early stage thus beneting the global search andthe downward speed increases in the later stage thusbeneting the local search When the concave function isused for C3 the descent speed is fast in the early stageAlthough the search speed is improved the coverage area ofthe search is reduced thereby leading to the convergence ofthe algorithm to the nonoptimal value From the simulationdiagrams (f)ndash(k) the convergence speed is observed to beslightly slow when C3 is a convex function but its ability tojump out of the local extremum and the accuracy of theglobal search are higher than those in the other two casesWhen C3 is a concave function the convergence speed isfaster than those in the other two cases and the searchaccuracy is lower than that when C3 is a convex function
32 Comparison of Test Results e 11 test functions inFigure 1 are used to compare the IEPSO algorithm withclassical PSO SPSO dierential algorithm (DE) and GAe DE GA and PSO algorithms are all stochastic in-telligent optimization algorithms with population iterationse evaluation criteria of algorithm performance includespeed of convergence and size of individual populationsearch coverage e dierential optimization algorithm hasa low space complexity and obvious advantages in dealingwith large-scale and complex optimization problems eGA has good convergence when solving discrete multipeakand noise-containing optimization problems Based on thetraditional PSO algorithm the SPSO algorithm achieves thebalance between global search and local search by adjustingthe inertial weight (Figures 3 and 4)
e experimental parameters of the ve algorithmsare set as shown in Table 6 Each test function is runindependently 10 times and the average is recorded toreduce the data error e iteration is stopped when the
convergence condition meets the convergence accuracy ebest average tness value of the ve algorithms is blackenede standard deviation average tness and optimal value ofeach algorithm are shown in Tables 7 and 8 Figures 5 and 6plot the convergence curves of the 11 test functions
Table 7 shows that the IEPSO has the best performance on f1, f2, f3, and f4. The IEPSO algorithm obtains the theoretical optimal value on f2, and DE can find the global solution on f5. The deep search capability of the IEPSO algorithm is considerably higher than that of the PSO and SPSO algorithms because of the added global-local information sharing term and the last-eliminated principle. The crossover, mutation, and selection mechanisms make the DE algorithm perform well in the early stage of the global search. However, the diversity of the population declines in the later stage because of population differences.
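The paper's exact velocity update is not reproduced in this excerpt. A plausible reading of the "global-local information sharing term" is a third, C3-weighted component linking the global best and the particle's personal best; the sketch below is an assumption along those lines, not the authors' verified formula:

```python
import random

def iepso_velocity(v, x, pbest, gbest, w, c1, c2, c3):
    """Sketch of a PSO velocity update with an extra information-sharing
    term. The c3 * (gbest - pbest) component is our interpretation of the
    "local-global information sharing" idea; the paper's formula may differ."""
    r1, r2, r3 = (random.random() for _ in range(3))
    return (w * v
            + c1 * r1 * (pbest - x)    # cognitive (personal-best) pull
            + c2 * r2 * (gbest - x)    # social (global-best) pull
            + c3 * r3 * (gbest - pbest))  # assumed sharing term
```

With C3 decayed over the run (as studied in Section 3.1), the extra term would strengthen early exploration and fade as the swarm converges.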
Figure 3: The change curve of C3 with the number of iterations (k = 0.2, 1, and 2).
Table 5: Multimodal test functions (C3 decreasing from 2 to 0).

Functions   Criteria   k = 0.2      k = 2        k = 1
f6          Mean       4.19E-02     4.79E-02     4.92E-02
            SD         3.43E-04     7.07E-04     5.96E-04
            Best       1.25E-02     5.7E-03      1.23E-02
f7          Mean       4.46E-03     5.00E-05     1.9E-04
            SD         1.73E-04     3.03E-06     5.649
            Best       2.31E-12     3.89E-11     2.25E-05
f8          Mean       2.42E-10     3.74E-10     5.28E-10
            SD         6.74E-20     2.47E-12     2.23E-12
            Best       3.71E-16     4.36E-11     5.83E-13
f9          Mean       -186.7309    -186.7309    -186.7309
            SD         0            0            0
            Best       -186.7309    -186.7309    -186.7309
f10         Mean       1.13E-11     2.05E-10     1.84E-11
            SD         2.21E-22     4.37E-12     2.27E-22
            Best       5.06E-14     1.75E-10     2.50E-12
f11         Mean       -837.9658    -837.9658    -837.9658
            SD         0            0            4.40E-09
            Best       -837.9658    -837.9658    -837.9658
10 Computational Intelligence and Neuroscience
Figure 4: Convergence curves (fitness value versus iterations) for the three C3 settings (k = 0.2, 1, and 2) on the 11 test functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function; (f) f6 Griewank function; (g) f7 Rastrigin function; (h) f8 alpine function; (i) f9 Shubert function; (j) f10 Ackley function; (k) f11 Cmfun function.
The simulation diagrams (a)-(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage becomes lower than that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into the local optimum and do not continue searching for the optimal solution. Therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO has the best performance on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As shown in the simulation curves of Figure 6, population diversity is maintained because the supplementary particles added to the population are stochastic as the search gradually converges toward a local optimal solution. The IEPSO algorithm can thus jump out of local extremum points on complex multimodal test functions, and the number of iterations required is correspondingly reduced.
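The last-eliminated replacement step described above can be sketched as follows; the fraction of particles replaced (`frac`) and the trigger condition are our assumptions, since the excerpt does not specify them:

```python
import random

def last_eliminated(positions, fitness, bounds, frac=0.2):
    """Sketch of the last-eliminated principle: the worst-ranked particles
    are removed and replaced by fresh random particles to preserve
    population diversity (minimization assumed: higher fitness = worse).
    `frac` is an illustrative choice, not the paper's stated value."""
    n = len(positions)
    order = sorted(range(n), key=lambda i: fitness[i], reverse=True)  # worst first
    for i in order[: max(1, int(frac * n))]:
        positions[i] = [random.uniform(lo, hi) for lo, hi in bounds]
    return positions
```

Because the replacements are drawn uniformly from the search bounds, the swarm keeps exploring even after the remaining particles have clustered near a local optimum.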
Table 6: Parameter settings.

Algorithm   Population   Max. iterations   Dimension   Others
PSO         40           1000              10          C1 = C2 = 2; R1 = R2 = 0.5
SPSO        40           1000              10          ω: 0.9-0.4; C1 = C2 = 2; R1 = R2 = 0.5
DE          40           1000              10          --
GA          40           1000              10          GGAP = 0.5; PRECI = 25
IEPSO       40           1000              10          ω: 0.9-0.4; C1 = C2 = 2; C3: 2-0; R1 = R2 = R3 = 0.5
Table 7: Unimodal test functions.

Functions   Criteria   PSO         SPSO        DE          IEPSO       GA
f1          Mean       1.33E+03    3.08E+03    7.31E-12    8.92E-22    116.96
            SD         2.53E+05    1.21E+06    2.25E-23    2.65E-39    441.92
            Best       1.14E+03    1.20E+03    2.42E-12    7.72E-27    46.60
f2          Mean       2.96E-02    8.80E-02    8.37E-06    0           1.79E-11
            SD         8.36E-04    8.96E-04    1.58E-10    0           0
            Best       4.55E-03    8428734     7.55E-10    0           1.79E-11
f3          Mean       1.19E+03    2.51E+03    1.14E-11    6.21E-19    74.30
            SD         2.93E+05    1.82E+06    9.95E-23    2.63E-36    58.33
            Best       1.06E+03    2.82E-02    2.10E-12    1.81E-27    45.42
f4          Mean       82.38       82.10       3.36E-13    1.70E-21    3.031
            SD         6.86E+02    1.40E+03    9.95E-26    1.31E-41    0.835
            Best       1.15E+02    37.39       1.15E-13    2.82E-29    1.968
f5          Mean       1.26E+04    8.60E+03    7.02E-12    1.65E-10    3.62E+03
            SD         2.06E+07    2.15E+07    1.81E-23    3.30E-20    3.44E+05
            Best       1.04E+04    1.30E+02    2.67E-12    2.17E-11    2.53E+03
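Several of the benchmark functions in Tables 7 and 8 have standard textbook definitions; for reference, minimal implementations of the sphere (f1), Rastrigin (f7), and Ackley (f10) functions, all with a global optimum of 0 at the origin, are:

```python
import math

def sphere(x):
    """f1: unimodal; sum of squares, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def rastrigin(x):
    """f7: highly multimodal; minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def ackley(x):
    """f10: multimodal with a nearly flat outer region; minimum 0 at the origin."""
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e
```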
Table 8: Multimodal test functions.

Functions   Criteria   PSO         SPSO        DE          IEPSO       GA
f6          Mean       1.548       1.752       9.44E-02    4.19E-02    1.006
            SD         0.026       0.093       4.87E-04    3.43E-04    0.018
            Best       1.236       1.417       0.06        0.013       0.794
f7          Mean       57.737      43.405      11.945      4.46E-03    8.939
            SD         117.768     65.178      16.502      1.73E-04    3.608
            Best       35.981      3.17E+01    6.398       2.31E-12    5.040
f8          Mean       4.996       4.665       3.79E-02    2.42E-10    0.423
            SD         1.91E+00    1.056       5.4E-03     6.74E-20    0.051
            Best       2.933       3.151       4.6E-03     3.71E-16    0.086
f9          Mean       -186.448    -186.048    -186.728    -186.731    -186.731
            SD         1.19E-01    9.83E-01    2.29E-08    0           9.99E-12
            Best       -1.87E+02   -186.731    -186.7309   -186.7309   -186.731
f10         Mean       13.134      15.560      1.613       1.13E-11    2.515
            SD         14.260      2.163       0           2.21E-22    0.166
            Best       2.861       12.719      1.613       5.06E-14    1.796
f11         Mean       -740.326    -715.438    -837.966    -837.966    -837.966
            SD         8.74E+03    7.23E+03    0           0           0
            Best       -837.966    -837.697    -837.966    -837.966    -837.966
Figure 5: Convergence curves (fitness value versus iterations) of the five algorithms (DE, GA, PSO, SPSO, and IEPSO) on the unimodal functions: (a) f1 sphere function; (b) f2 Schaffer function; (c) f3 step function; (d) f4 SumSquares function; (e) f5 Zakharov function.
Figure 6: Convergence curves (fitness value versus iterations) of the five algorithms (DE, GA, PSO, SPSO, and IEPSO) on the multimodal functions: (a) f6 Griewank function; (b) f7 Rastrigin function; (c) f8 alpine function; (d) f9 Shubert function; (e) f10 Ackley function; (f) f11 Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] combines PSO with differential variation and the quasi-Newton method, whereas the HPSOWM algorithm in [26] is a binary PSO algorithm based on the wavelet transform. Table 9 shows that the IEPSO algorithm obtains the best value on five of the six test functions compared, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between the global and local optimal particles enhances the deep search capability of the IEPSO algorithm.

(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly. Moreover, the proposed algorithm shows the best search performance when C3 is a nonlinear convex function.

(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the PSO from stagnating at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term to the IEPSO, the proposed algorithm effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639-651, 2018.
[2] P Du R Barrio H Jiang and L Cheng ldquoAccurate Quotient-Difference algorithm error analysis improvements and ap-plicationsrdquo Applied Mathematics and Computation vol 309pp 245ndash271 2017
[3] L Jiang Z Wang Y Ye and J Jiang ldquoFast circle detectionalgorithm based on sampling from difference areardquo Optikvol 158 pp 424ndash433 2018
[4] H Garg ldquoA hybrid PSO-GA algorithm for constrained op-timization problemsrdquo Applied Mathematics amp Computationvol 274 no 11 pp 292ndash305 2016
[5] J Zhang and P Xia ldquoAn improved PSO algorithm for pa-rameter identification of nonlinear dynamic hystereticmodelsrdquo Journal of Sound and Vibration vol 389 pp 153ndash167 2017
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169-181, 2018.
[7] P R D O D Costa S Mauceri P Carroll et al ldquoA geneticalgorithm for a vehicle routing problemrdquo Electronic Notes inDiscrete Mathematics vol 64 pp 65ndash74 2017
[8] V Jindal and P Bedi ldquoAn improved hybrid ant particleoptimization (IHAPO) algorithm for reducing travel time inVANETsrdquo Applied Soft Computing vol 64 pp 526ndash5352018
[9] Z Peng H Manier and M A Manier ldquoParticle swarmoptimization for capacitated location-routing problemrdquoIFAC-PapersOnLine vol 50 no 1 pp 14668ndash14673 2017
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25-32, 2008.
[11] J Lu W Xie and H Zhou ldquoCombined fitness functionbased particle swarm optimization algorithm for system
Table 9: Three improved particle swarm algorithm test results.

Functions   Criteria   IEPSO       DMSDL-PSO [25]   BHPSOWM [26]
f1          Mean       8.92E-22    4.73E-10         4.240
            SD         2.65E-39    1.81E-09         5.211
f3          Mean       6.21E-19    2.37E+03         7.61
            SD         2.63E-36    5.71E+02         0.07
f6          Mean       4.19E-02    8.66E-05         --
            SD         3.43E-04    2.96E-04         --
f7          Mean       4.46E-03    9.15E+01         7.618
            SD         1.73E-04    1.80E+01         2.675
f8          Mean       2.42E-10    1.31E+02         --
            SD         6.74E-20    5.82E+01         --
f10         Mean       1.13E-11    1.01E+00         1.72
            SD         2.21E-22    2.71E-01         0
identificationrdquo Computers amp Industrial Engineering vol 95pp 122ndash134 2016
[12] F Javidrad and M Nazari ldquoA new hybrid particle swarm andsimulated annealing stochastic optimization methodrdquo Ap-plied Soft Computing vol 60 pp 634ndash654 2017
[13] J Jie J Zhang H Zheng and B Hou ldquoFormalized model andanalysis of mixed swarm based cooperative particle swarmoptimizationrdquo Neurocomputing vol 174 pp 542ndash552 2016
[14] A Meng Z Li H Yin S Chen and Z Guo ldquoAcceleratingparticle swarm optimization using crisscross searchrdquo In-formation Sciences vol 329 pp 52ndash72 2016
[15] L Wang B Yang and J Orchard ldquoParticle swarm optimi-zation using dynamic tournament topologyrdquo Applied SoftComputing vol 48 pp 584ndash596 2016
[16] M S Kiran ldquoParticle swarm optimization with a new updatemechanismrdquo Applied Soft Computing vol 60 pp 670ndash6782017
[17] H C Tsai ldquoUnified particle swarm delivers high efficiency toparticle swarm optimizationrdquo Applied Soft Computingvol 55 pp 371ndash383 2017
[18] S F Li and C Y Cheng ldquoParticle swarm optimization withfitness adjustment parametersrdquo Computers amp Industrial En-gineering vol 113 pp 831ndash841 2017
[19] Y Chen L Li H Peng J Xiao Y Yang and Y Shi ldquoParticleswarm optimizer with two differential mutationrdquo Applied SoftComputing vol 61 pp 314ndash330 2017
[20] Q Zhang W Liu X Meng B Yang and A V VasilakosldquoVector coevolving particle swarm optimization algorithmrdquoInformation Sciences vol 394 pp 273ndash298 2017
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945-1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[23] A Ratnaweera S K Halgamuge and H C Watson ldquoSelf-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficientsrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 240ndash255 2004
[24] K Chen F Zhou and A Liu ldquoChaotic dynamic weightparticle swarm optimization for numerical function optimi-zationrdquo Knowledge-Based Systems vol 139 pp 23ndash40 2018
[25] Y Chen L Li H Peng J Xiao and Q Wu ldquoDynamic multi-swarm differential learning particle swarm optimizerrdquoSwarm and Evolutionary Computation vol 39 pp 209ndash2212018
[26] F Jiang H Xia Q A Tran Q M Ha N Q Tran and J HuldquoA new binary hybrid particle swarm optimization withwavelet mutationrdquo Knowledge-Based Systems vol 130pp 90ndash101 2017
[27] R Liu J Li C Mu J fan and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimiza-tion for dynamic multi-objective optimizationrdquo EuropeanJournal of Operational Research vol 261 no 3 pp 1028ndash1051 2017
[28] W Ye W Feng and S Fan ldquoA novel multi-swarm particleswarm optimization with dynamic learning strategyrdquo AppliedSoft Computing vol 61 pp 832ndash843 2017
[29] L Zhang Y Tang C Hua and X Guan ldquoA new particleswarm optimization algorithm with adaptive inertia weightbased on Bayesian techniquesrdquo Applied Soft Computingvol 28 pp 138ndash149 2015
[30] Q Cui Q Li G Li et al ldquoGlobally-optimal prediction-basedadaptive mutation particle swarm optimizationrdquo InformationSciences vol 418 pp 186ndash217 2017
[31] D Zhao and J Liu ldquoStudy on network security situationawareness based on particle swarm optimization algorithmrdquoComputers amp Industrial Engineering vol 125 pp 764ndash7752018
[32] H Samareh S H Khoshrou K Shahriar M M Ebadzadehand M Eslami ldquoOptimization of a nonlinear model forpredicting the ground vibration using the combinationalparticle swarm optimization-genetic algorithmrdquo Journal ofAfrican Earth Sciences vol 133 pp 36ndash45 2017
[33] M Dash T Panigrahi and R Sharma ldquoDistributed parameterestimation of IIR system using diffusion particle swarm op-timization algorithmrdquo Journal of King Saud University-Engineering Sciences 2017 In press
[34] B Wang S Li J Guo and Q Chen ldquoCar-like mobile robotpath planning in rough terrain using multi-objective particleswarm optimization algorithmrdquo Neurocomputing vol 282pp 42ndash51 2018
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[36] R F Lopes F F Costa A Oliveira et al ldquoAlgorithm based onparticle swarm applied to electrical load scheduling in anindustrial settingrdquo Energy vol 147 pp 1007ndash1015 2018
[37] F Sheikholeslami and N J Navimipour ldquoService allocation inthe cloud environments using multi-objective particle swarmoptimization algorithm based on crowding distancerdquo Swarmamp Evolutionary Computation vol 35 pp 53ndash64 2017
[38] M Petrovic N Vukovic M Mitic et al ldquoIntegration ofprocess planning and scheduling using chaotic particle swarmoptimization algorithmrdquo Expert Systems with Applicationsvol 64 pp 569ndash588 2016
[39] Z Zhang Y Jiang S Zhang S Geng H Wang and G SangldquoAn adaptive particle swarm optimization algorithm forreservoir operation optimizationrdquo Applied Soft ComputingJournal vol 18 no 4 pp 167ndash177 2014
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285-291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64-70, 2017.
Computer Games Technology
International Journal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom
Journal ofEngineeringVolume 2018
Advances in
FuzzySystems
Hindawiwwwhindawicom
Volume 2018
International Journal of
ReconfigurableComputing
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Applied Computational Intelligence and Soft Computing
thinspAdvancesthinspinthinsp
thinspArtificial Intelligence
Hindawiwwwhindawicom Volumethinsp2018
Hindawiwwwhindawicom Volume 2018
Civil EngineeringAdvances in
Hindawiwwwhindawicom Volume 2018
Electrical and Computer Engineering
Journal of
Journal of
Computer Networks and Communications
Hindawiwwwhindawicom Volume 2018
Hindawi
wwwhindawicom Volume 2018
Advances in
Multimedia
International Journal of
Biomedical Imaging
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Engineering Mathematics
International Journal of
RoboticsJournal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Computational Intelligence and Neuroscience
Hindawiwwwhindawicom Volume 2018
Mathematical Problems in Engineering
Modelling ampSimulationin EngineeringHindawiwwwhindawicom Volume 2018
Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom
The Scientific World Journal
Volume 2018
Hindawiwwwhindawicom Volume 2018
Human-ComputerInteraction
Advances in
Hindawiwwwhindawicom Volume 2018
Scientic Programming
Submit your manuscripts atwwwhindawicom
0 50 100 150 200Iterations
0
500
1000
1500
2000
2500
3000Fi
tnes
s val
ue
2 4 6 8 10 120
1000
2000
k = 2k = 02k = 1
(a)
50 100 150 200Iterations
0
0005
001
0015
Fitn
ess v
alue
5 10 15 200
1
2
3times10ndash3
k = 02k = 2k = 1
(b)
50 100 150 200Iterations
0
500
1000
1500
2000
2500
3000
Fitn
ess v
alue
2 4 60
1000
2000
k = 1k = 2k = 02
(c)
50 100 150 200Iterations
0
10
20
30
40
50
60
Fitn
ess v
alue
2 4 6 8 100
20
40
k = 02k = 2k = 1
(d)
50 100 150 200Iterations
0
2000
4000
6000
8000
10000
12000
14000
Fitn
ess v
alue
5 10 15 200
2000
4000
6000
k = 02k = 2k = 1
(e)
50 100 150 200Iterations
0
05
1
15
2
Fitn
ess v
alue
5 10 15 20 25 300
05
1
15
k = 1k = 2k = 02
(f )
Figure 4 Continued
Computational Intelligence and Neuroscience 11
50 100 150 200Iterations
0
10
20
30
40
50Fi
tnes
s val
ue
20 40 60 800
10
20
30
k = 02k = 2k = 1
(g)
50 100 150 200Iterations
0
1
2
3
4
Fitn
ess v
alue
2 4 6 8 10 120
1
2
3
k = 02k = 2k = 1
(h)
0 50 100 150 200Iterations
ndash1865
ndash186
ndash1855
ndash185
ndash1845
ndash184
Fitn
ess v
alue
10 20 30 40
ndash1865
ndash186
ndash1855
ndash185
k = 02k = 2k = 1
(i)
50 100 150 200Iterations
0
5
10
15
Fitn
ess v
alue
10 20 300
2
4
k = 02k = 2k = 1
(j)
0 50 100 150 200Iterations
ndash840
ndash820
ndash800
ndash780
ndash760
ndash740
ndash720
Fitn
ess v
alue
k = 02k = 2k = 1
60 65 70 75 80ndash838ndash836ndash834ndash832ndash830
(k)
Figure 4 11 test functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquare function (e) f5 Zakharovfunction (f ) f6 Griewank function (g) f7 Rastrigin function (h) f8 alpine function (i) f9 Shubert function (j) f10 Ackley function (k) f11Cmfun function
12 Computational Intelligence and Neuroscience
simulation diagrams (a)ndash(e) show that although the DEalgorithm converges rapidly in the early stage its globalsearch performance in the later stage becomes lower thanthat of the IEPSO algorithm When the GA is used to solveoptimization problems the individuals in the population fallinto the local optimum and do not continue searching forthe optimum solution erefore in Figure 5 the simulationcurve of the GA converges to the local optimum
e test results in Table 8 indicate that the IEPSO has thebest performance on f6 f7 f8 f9 f10 and f11 and that the DE
and GA can obtain the theoretical optimal value on f9 and f11Although the GA and IEPSO algorithm can obtain the globaloptimal value on f9 the IEPSO algorithm is more robust thanthe GA is As shown in the simulation curve of Figure 6 thediversity of the population is maintained because the sup-plementary particles in the population are stochastic whenthe local optimal solution converges gradually e IEPSOalgorithm can jump out of the local extrema points in theface of complex multimodal test functions and the numberof iterations required is correspondingly reduced
Table 6 Parameter settings
Algorithm Population Maximum iteration Dim of each object OthersPSO 40 1000 10 C1 C2 2 R1 R2 05SPSO 40 1000 10 ω 09ndash04 C1 C2 2 R1 R2 05DE 40 1000 10 mdashGA 40 1000 10 GGAP 05 PRECI 25IEPSO 40 1000 10 ω 09ndash04 C1 C2 2 C3 2ndash0 R1 R2 R3 05
Table 7 Unimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f1Mean 133E + 03 308E + 03 731E minus 12 892E minus 22 11696SD 253E + 05 121E + 06 225E minus 23 265E minus 39 44192Best 114E + 03 120E + 03 242E minus 12 772E minus 27 4660
f2Mean 296E minus 02 880E minus 02 837E minus 06 0 179E minus 11SD 836E minus 04 896E minus 04 158E minus 10 0 0Best 455E minus 03 8428734 755E minus 10 0 179E minus 11
f3Mean 119E + 03 251E + 03 114E minus 11 621E minus 19 7430SD 293E + 05 182E + 06 995E minus 23 263E minus 36 5833Best 106E + 03 282E minus 02 210E minus 12 181E minus 27 4542
f4Mean 8238 8210 336E minus 13 170E minus 21 3031SD 686E + 02 140E + 03 995E minus 26 131E minus 41 0835Best 115E + 02 3739 115E minus 13 282E minus 29 1968
f5Mean 126E + 04 860E + 03 702E minus 12 165E minus 10 362E + 03SD 206E + 07 215E + 07 181E minus 23 330E minus 20 344E + 05Best 104E + 04 130E + 02 267E minus 12 217E minus 11 253E + 03
Table 8 Multimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f6Mean 1548 1752 944E minus 02 419E minus 02 1006SD 0026 0093 487E minus 04 343E minus 04 0018Best 1236 1417 006 0013 0794
f7Mean 57737 43405 11945 446E minus 03 8939SD 117768 65178 16502 173E minus 04 3608Best 35981 317E + 01 6398 231E minus 12 5040
f8Mean 4996 4665 379E minus 02 242E minus 10 0423SD 191E + 00 1056 54E minus 03 674E minus 20 0051Best 2933 3151 46E minus 03 371E minus 16 0086
f9Mean minus186448 minus186048 minus186728 minus186731 minus186731SD 119E minus 01 983E minus 01 229E minus 08 0 999E minus 12Best minus187E + 02 minus186731 minus1867309 minus1867309 minus186731
f10Mean 13134 15560 1613 113E minus 11 2515SD 14260 2163 0 221E minus 22 0166Best 2861 12719 1613 506E minus 14 1796
f11Mean minus740326 minus715438 minus837966 minus837966 minus837966SD 874E + 03 723E + 03 0 0 0Best minus837966 minus837697 minus837966 minus837966 minus837966
Computational Intelligence and Neuroscience 13
0 100 200 300 400 500 600Iterations
0
2000
4000
6000
8000
10000
12000Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
DE
GASPSO
PSOIEPSO
0 100 200 300 400 500 600Iterations
0
01
02
03
04
05
Fitn
ess v
alue
(b)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
5000
10000
15000
Fitn
ess v
alue
(c)
DE
GAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
200
400
600
800
Fitn
ess v
alue
(d)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
1
2
3
4
Fitn
ess v
alue
times104
(e)
Figure 5 Unimodal functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquares function (e) f5Zakharov function
14 Computational Intelligence and Neuroscience
0 100 200 300 400 500 600Iterations
0
05
1
15
2
25
3Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
0 100 200 300 400 500 600Iterations
0
20
40
60
80
100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(b)
0 100 200 300 400 500 600Iterations
0
5
10
15
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(c)
0 100 200 300 400 500 600Iterations
ndash180
ndash160
ndash140
ndash120
ndash100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(d)
0 100 200 300 400 500 600Iterations
0
5
10
15
20
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(e)
100 200 300 400 500 600Iterations
ndash850
ndash800
ndash750
ndash700
ndash650
ndash600
ndash550
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(f )
Figure 6 Multimodal functions (a) f6 Griewank function (b) f7 Rastrigin function (c) f8 alpine function (d) f9 Shubert function (e) f10Ackley function (f ) f11 Cmfun function
Computational Intelligence and Neuroscience 15
Table 9 shows the test results for the three improved PSOalgorithms e DMSDL-PSO algorithm in [25] is a PSOalgorithm combined with differential variation and thequasi-Newton method whereas the HPSOWM algorithm in[26] is a binary PSO algorithm based on wavelet transformTable 9 shows that the IEPSO algorithm obtains the bestvalue in 5 out of the 11 test functions and the above analysisindicates that the IEPSO outperforms the other improvedPSO algorithms
4 Conclusion
In contemporary engineering design solving the globaloptimization problems of multiparameter strongly coupledand nonlinear systems using conventional optimizationalgorithms is difficult In this study an improved PSO thatis the IEPSO algorithm is proposed on the basis of the last-eliminated principle and an enhanced local-global in-formation sharing capability e comparison and analysisof the simulation results indicate the following conclusions
(1) e exchange of information between global andlocal optimal particles enhances the deep searchcapability of the IEPSO algorithm
(2) e standard test function is used to simulate theparameter C3 of the local-global information sharingterm e results show that the global optimizationcapability of the IEPSO algorithm is strong when C3is linearly decreasing Moreover the proposed al-gorithm can show the best search performance whenC3 is a nonlinear convex function
(3) e last-eliminated principle is used in the IEPSO tomaintain particle population diversity MoreoverPSO is avoided in the local optimal value A com-parison of the IEPSO algorithm with the classicaloptimization algorithm and its improved versionsverifies the global search capability of the IEPSOalgorithm
In summary the comparative results of the simulationanalysis reveal that with the application of the last-eliminatedprinciple and the local-global information sharing term to the
IEPSO the proposed algorithm effectively overcomes thedisadvantages of the classical algorithms including theirprecocious convergence and tendency to fall into the localoptimum e IEPSO shows an ideal global optimizationperformance and indicates a high application value for solvingpractical engineering optimization problems
Data Availability
e data used to support the findings of this study areavailable from the corresponding author upon request
Conflicts of Interest
e authors declare that there are no conflicts of interest
Acknowledgments
is work was supported by Shanghai Rising-Star Program(no 16QB1401000) Key Project of Shanghai Science andTechnology Committee (no 16DZ1120400) and the Na-tional Natural Science Foundation of China (Project no51705187) the Postdoctoral Science Foundation of China(Grant no 2017M621202)
References
Computational Intelligence and Neuroscience
[Figure 4, panels (g)-(k): convergence curves of fitness value versus iterations for k = 0.2, k = 1, and k = 2.]
Figure 4: The 11 test functions: (a) f1, sphere function; (b) f2, Schaffer function; (c) f3, step function; (d) f4, SumSquares function; (e) f5, Zakharov function; (f) f6, Griewank function; (g) f7, Rastrigin function; (h) f8, Alpine function; (i) f9, Shubert function; (j) f10, Ackley function; (k) f11, Cmfun function.
The simulation diagrams (a)-(e) show that although the DE algorithm converges rapidly in the early stage, its global search performance in the later stage falls below that of the IEPSO algorithm. When the GA is used to solve optimization problems, the individuals in the population fall into a local optimum and do not continue searching for the optimal solution; therefore, in Figure 5, the simulation curve of the GA converges to the local optimum.
The test results in Table 8 indicate that the IEPSO performs best on f6, f7, f8, f9, f10, and f11, and that the DE and GA can obtain the theoretical optimal value on f9 and f11. Although both the GA and the IEPSO algorithm obtain the global optimal value on f9, the IEPSO algorithm is more robust than the GA. As the simulation curves in Figure 6 show, population diversity is maintained because the supplementary particles added to the population are stochastic while the swarm gradually converges toward a local optimal solution. The IEPSO algorithm can therefore jump out of local extrema on complex multimodal test functions, and the number of iterations required is correspondingly reduced.
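The elimination-and-replacement step described above (the last-eliminated principle) can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the eliminated fraction `frac` and the uniform re-initialization within the search bounds are assumptions, since the excerpt only states that the worst particles are replaced by stochastic new ones.

```python
import numpy as np

def last_eliminated_step(positions, fitness, bounds, frac=0.2, rng=None):
    """Replace the worst-ranked fraction of the swarm with new random
    particles so that population diversity is maintained (minimization:
    higher fitness value = worse particle)."""
    rng = np.random.default_rng() if rng is None else rng
    n, dim = positions.shape
    k = max(1, int(frac * n))
    worst = np.argsort(fitness)[-k:]      # indices of the k worst particles
    low, high = bounds
    positions[worst] = rng.uniform(low, high, size=(k, dim))  # stochastic replacements
    return positions, worst
```

Because the replacements are drawn uniformly over the whole search space, the swarm keeps probing new regions even after the surviving particles have clustered near a local optimum.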
Table 6: Parameter settings.

| Algorithm | Population | Maximum iterations | Dimensions | Other parameters |
| --- | --- | --- | --- | --- |
| PSO | 40 | 1000 | 10 | C1 = C2 = 2, R1 = R2 = 0.5 |
| SPSO | 40 | 1000 | 10 | ω = 0.9-0.4, C1 = C2 = 2, R1 = R2 = 0.5 |
| DE | 40 | 1000 | 10 | — |
| GA | 40 | 1000 | 10 | GGAP = 0.5, PRECI = 25 |
| IEPSO | 40 | 1000 | 10 | ω = 0.9-0.4, C1 = C2 = 2, C3 = 2-0, R1 = R2 = R3 = 0.5 |
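Using the settings listed in Table 6, a minimal sketch of the IEPSO velocity update might look like the following. The exact form of the local-global information sharing term (written here as C3·R3·(gbest − lbest)) and the linear schedules for ω and C3 are assumptions inferred from the table and the surrounding text, not the authors' verbatim equations:

```python
import numpy as np

def iepso_velocity(v, x, pbest, gbest, lbest, t, T):
    """One velocity update for a single particle.

    v, x, pbest, gbest, lbest are arrays of equal dimension; t is the
    current iteration and T the maximum iteration count from Table 6.
    """
    w = 0.9 - 0.5 * t / T                 # inertia weight, linearly 0.9 -> 0.4
    c1 = c2 = 2.0                         # acceleration coefficients (Table 6)
    c3 = 2.0 * (1.0 - t / T)              # information-sharing weight, 2 -> 0
    r1 = r2 = r3 = 0.5                    # fixed at 0.5 per Table 6 (often random in PSO)
    return (w * v
            + c1 * r1 * (pbest - x)       # cognitive term
            + c2 * r2 * (gbest - x)       # social term
            + c3 * r3 * (gbest - lbest))  # local-global information sharing term
```

The third term injects the difference between the global and local optimal particles into the update, which is the "enhanced information sharing" the paper credits for the improved deep-search capability.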
Table 7: Unimodal test functions.

| Function | Criteria | PSO | SPSO | DE | IEPSO | GA |
| --- | --- | --- | --- | --- | --- | --- |
| f1 | Mean | 1.33E+03 | 3.08E+03 | 7.31E-12 | 8.92E-22 | 11696 |
|    | SD | 2.53E+05 | 1.21E+06 | 2.25E-23 | 2.65E-39 | 44192 |
|    | Best | 1.14E+03 | 1.20E+03 | 2.42E-12 | 7.72E-27 | 4660 |
| f2 | Mean | 2.96E-02 | 8.80E-02 | 8.37E-06 | 0 | 1.79E-11 |
|    | SD | 8.36E-04 | 8.96E-04 | 1.58E-10 | 0 | 0 |
|    | Best | 4.55E-03 | 8428734 | 7.55E-10 | 0 | 1.79E-11 |
| f3 | Mean | 1.19E+03 | 2.51E+03 | 1.14E-11 | 6.21E-19 | 7430 |
|    | SD | 2.93E+05 | 1.82E+06 | 9.95E-23 | 2.63E-36 | 5833 |
|    | Best | 1.06E+03 | 2.82E-02 | 2.10E-12 | 1.81E-27 | 4542 |
| f4 | Mean | 8238 | 8210 | 3.36E-13 | 1.70E-21 | 3031 |
|    | SD | 6.86E+02 | 1.40E+03 | 9.95E-26 | 1.31E-41 | 0.835 |
|    | Best | 1.15E+02 | 3739 | 1.15E-13 | 2.82E-29 | 1968 |
| f5 | Mean | 1.26E+04 | 8.60E+03 | 7.02E-12 | 1.65E-10 | 3.62E+03 |
|    | SD | 2.06E+07 | 2.15E+07 | 1.81E-23 | 3.30E-20 | 3.44E+05 |
|    | Best | 1.04E+04 | 1.30E+02 | 2.67E-12 | 2.17E-11 | 2.53E+03 |
Table 8: Multimodal test functions.

| Function | Criteria | PSO | SPSO | DE | IEPSO | GA |
| --- | --- | --- | --- | --- | --- | --- |
| f6 | Mean | 1.548 | 1.752 | 9.44E-02 | 4.19E-02 | 1.006 |
|    | SD | 0.026 | 0.093 | 4.87E-04 | 3.43E-04 | 0.018 |
|    | Best | 1.236 | 1.417 | 0.06 | 0.013 | 0.794 |
| f7 | Mean | 57.737 | 43.405 | 11.945 | 4.46E-03 | 8.939 |
|    | SD | 117.768 | 65.178 | 16.502 | 1.73E-04 | 3.608 |
|    | Best | 35.981 | 3.17E+01 | 6.398 | 2.31E-12 | 5.040 |
| f8 | Mean | 4.996 | 4.665 | 3.79E-02 | 2.42E-10 | 0.423 |
|    | SD | 1.91E+00 | 1.056 | 5.4E-03 | 6.74E-20 | 0.051 |
|    | Best | 2.933 | 3.151 | 4.6E-03 | 3.71E-16 | 0.086 |
| f9 | Mean | -186.448 | -186.048 | -186.728 | -186.731 | -186.731 |
|    | SD | 1.19E-01 | 9.83E-01 | 2.29E-08 | 0 | 9.99E-12 |
|    | Best | -1.87E+02 | -186.731 | -186.7309 | -186.7309 | -186.731 |
| f10 | Mean | 13.134 | 15.560 | 1.613 | 1.13E-11 | 2.515 |
|     | SD | 14.260 | 2.163 | 0 | 2.21E-22 | 0.166 |
|     | Best | 2.861 | 12.719 | 1.613 | 5.06E-14 | 1.796 |
| f11 | Mean | -740.326 | -715.438 | -837.966 | -837.966 | -837.966 |
|     | SD | 8.74E+03 | 7.23E+03 | 0 | 0 | 0 |
|     | Best | -837.966 | -837.697 | -837.966 | -837.966 | -837.966 |
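For reference, three of the multimodal benchmarks named in Table 8 have standard textbook definitions; the sketch below uses the common formulations (all with a global minimum of 0 at the origin), which the paper may parameterize differently:

```python
import numpy as np

def rastrigin(x):
    # f(x) = 10n + sum(x_i^2 - 10 cos(2 pi x_i)); highly multimodal
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def griewank(x):
    # f(x) = sum(x_i^2)/4000 - prod(cos(x_i / sqrt(i))) + 1
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

def ackley(x):
    # f(x) = -20 exp(-0.2 sqrt(mean(x^2))) - exp(mean(cos(2 pi x))) + 20 + e
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)
```

These surfaces have many regularly spaced local minima, which is exactly the situation in which a diversity-preserving mechanism such as the last-eliminated principle pays off.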
[Convergence curves of fitness value versus iterations for DE, GA, PSO, SPSO, and IEPSO.]
Figure 5: Unimodal functions: (a) f1, sphere function; (b) f2, Schaffer function; (c) f3, step function; (d) f4, SumSquares function; (e) f5, Zakharov function.
[Convergence curves of fitness value versus iterations for DE, GA, PSO, SPSO, and IEPSO.]
Figure 6: Multimodal functions: (a) f6, Griewank function; (b) f7, Rastrigin function; (c) f8, Alpine function; (d) f9, Shubert function; (e) f10, Ackley function; (f) f11, Cmfun function.
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO algorithm combined with differential variation and the quasi-Newton method, whereas the BHPSOWM algorithm in [26] is a binary PSO algorithm based on wavelet mutation. Table 9 shows that the IEPSO algorithm obtains the best value in 5 out of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information sharing capability. The comparison and analysis of the simulation results support the following conclusions:
(1) The exchange of information between global and local optimal particles enhances the deep search capability of the IEPSO algorithm.
(2) The standard test functions are used to study the parameter C3 of the local-global information sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly. Moreover, the proposed algorithm shows the best search performance when C3 follows a nonlinear convex function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to keep the PSO from being trapped at a local optimal value. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
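Conclusion (2) contrasts a linearly decreasing C3 with a nonlinear convex one. One way to realize both, assuming a power-law decay whose exponent k matches the k values shown in the Figure 4 legends (k = 0.2, 1, 2), is the schedule below; the exact functional form the authors used is not spelled out in this excerpt:

```python
def c3_schedule(t, T, k=1.0, c3_init=2.0):
    """C3 decayed from c3_init (2, per Table 6) to 0 over T iterations.

    k = 1 reproduces the linear decrease; k = 2 gives a nonlinear convex
    curve (C3 drops quickly early on); k = 0.2 gives a concave curve
    (C3 stays large for most of the run).
    """
    return c3_init * (1.0 - t / T) ** k
```

With k = 2, information sharing is strong only in the first iterations and fades early, shifting the swarm toward exploitation sooner; the paper's tests indicate that a convex profile of this kind gives the best search performance.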
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information sharing term, the proposed IEPSO effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the China Postdoctoral Science Foundation (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639–651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245–271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424–433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292–305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153–167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169–181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65–74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526–535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668–14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25–32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122–134, 2016.

Table 9: Test results of three improved particle swarm algorithms.

| Function | Criteria | IEPSO | DMSDL-PSO [25] | BHPSOWM [26] |
| --- | --- | --- | --- | --- |
| f1 | Mean | 8.92E-22 | 4.73E-10 | 42.40 |
|    | SD | 2.65E-39 | 1.81E-09 | 52.11 |
| f3 | Mean | 6.21E-19 | 2.37E+03 | 7.61 |
|    | SD | 2.63E-36 | 5.71E+02 | 0.07 |
| f6 | Mean | 4.19E-02 | 8.66E-05 | — |
|    | SD | 3.43E-04 | 2.96E-04 | — |
| f7 | Mean | 4.46E-03 | 9.15E+01 | 76.18 |
|    | SD | 1.73E-04 | 1.80E+01 | 26.75 |
| f8 | Mean | 2.42E-10 | 1.31E+02 | — |
|    | SD | 6.74E-20 | 5.82E+01 | — |
| f10 | Mean | 1.13E-11 | 1.01E+00 | 1.72 |
|     | SD | 2.21E-22 | 2.71E-01 | 0 |
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634–654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542–552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52–72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584–596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670–678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371–383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831–841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314–330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273–298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945–1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240–255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23–40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90–101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832–843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138–149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186–217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764–775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36–45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42–51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79–86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007–1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53–64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569–588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167–177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285–291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64–70, 2017.
simulation diagrams (a)ndash(e) show that although the DEalgorithm converges rapidly in the early stage its globalsearch performance in the later stage becomes lower thanthat of the IEPSO algorithm When the GA is used to solveoptimization problems the individuals in the population fallinto the local optimum and do not continue searching forthe optimum solution erefore in Figure 5 the simulationcurve of the GA converges to the local optimum
e test results in Table 8 indicate that the IEPSO has thebest performance on f6 f7 f8 f9 f10 and f11 and that the DE
and GA can obtain the theoretical optimal value on f9 and f11Although the GA and IEPSO algorithm can obtain the globaloptimal value on f9 the IEPSO algorithm is more robust thanthe GA is As shown in the simulation curve of Figure 6 thediversity of the population is maintained because the sup-plementary particles in the population are stochastic whenthe local optimal solution converges gradually e IEPSOalgorithm can jump out of the local extrema points in theface of complex multimodal test functions and the numberof iterations required is correspondingly reduced
Table 6 Parameter settings
Algorithm Population Maximum iteration Dim of each object OthersPSO 40 1000 10 C1 C2 2 R1 R2 05SPSO 40 1000 10 ω 09ndash04 C1 C2 2 R1 R2 05DE 40 1000 10 mdashGA 40 1000 10 GGAP 05 PRECI 25IEPSO 40 1000 10 ω 09ndash04 C1 C2 2 C3 2ndash0 R1 R2 R3 05
Table 7 Unimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f1Mean 133E + 03 308E + 03 731E minus 12 892E minus 22 11696SD 253E + 05 121E + 06 225E minus 23 265E minus 39 44192Best 114E + 03 120E + 03 242E minus 12 772E minus 27 4660
f2Mean 296E minus 02 880E minus 02 837E minus 06 0 179E minus 11SD 836E minus 04 896E minus 04 158E minus 10 0 0Best 455E minus 03 8428734 755E minus 10 0 179E minus 11
f3Mean 119E + 03 251E + 03 114E minus 11 621E minus 19 7430SD 293E + 05 182E + 06 995E minus 23 263E minus 36 5833Best 106E + 03 282E minus 02 210E minus 12 181E minus 27 4542
f4Mean 8238 8210 336E minus 13 170E minus 21 3031SD 686E + 02 140E + 03 995E minus 26 131E minus 41 0835Best 115E + 02 3739 115E minus 13 282E minus 29 1968
f5Mean 126E + 04 860E + 03 702E minus 12 165E minus 10 362E + 03SD 206E + 07 215E + 07 181E minus 23 330E minus 20 344E + 05Best 104E + 04 130E + 02 267E minus 12 217E minus 11 253E + 03
Table 8 Multimodal test functions
Functions Criteria PSO SPSO DE IEPSO GA
f6Mean 1548 1752 944E minus 02 419E minus 02 1006SD 0026 0093 487E minus 04 343E minus 04 0018Best 1236 1417 006 0013 0794
f7Mean 57737 43405 11945 446E minus 03 8939SD 117768 65178 16502 173E minus 04 3608Best 35981 317E + 01 6398 231E minus 12 5040
f8Mean 4996 4665 379E minus 02 242E minus 10 0423SD 191E + 00 1056 54E minus 03 674E minus 20 0051Best 2933 3151 46E minus 03 371E minus 16 0086
f9Mean minus186448 minus186048 minus186728 minus186731 minus186731SD 119E minus 01 983E minus 01 229E minus 08 0 999E minus 12Best minus187E + 02 minus186731 minus1867309 minus1867309 minus186731
f10Mean 13134 15560 1613 113E minus 11 2515SD 14260 2163 0 221E minus 22 0166Best 2861 12719 1613 506E minus 14 1796
f11Mean minus740326 minus715438 minus837966 minus837966 minus837966SD 874E + 03 723E + 03 0 0 0Best minus837966 minus837697 minus837966 minus837966 minus837966
Computational Intelligence and Neuroscience 13
0 100 200 300 400 500 600Iterations
0
2000
4000
6000
8000
10000
12000Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
DE
GASPSO
PSOIEPSO
0 100 200 300 400 500 600Iterations
0
01
02
03
04
05
Fitn
ess v
alue
(b)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
5000
10000
15000
Fitn
ess v
alue
(c)
DE
GAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
200
400
600
800
Fitn
ess v
alue
(d)
DEGAPSO
SPSOIEPSO
0 100 200 300 400 500 600Iterations
0
1
2
3
4
Fitn
ess v
alue
times104
(e)
Figure 5 Unimodal functions (a) f1 sphere function (b) f2 Schaer function (c) f3 step function (d) f4 SumSquares function (e) f5Zakharov function
14 Computational Intelligence and Neuroscience
0 100 200 300 400 500 600Iterations
0
05
1
15
2
25
3Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
0 100 200 300 400 500 600Iterations
0
20
40
60
80
100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(b)
0 100 200 300 400 500 600Iterations
0
5
10
15
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(c)
0 100 200 300 400 500 600Iterations
ndash180
ndash160
ndash140
ndash120
ndash100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(d)
0 100 200 300 400 500 600Iterations
0
5
10
15
20
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(e)
100 200 300 400 500 600Iterations
ndash850
ndash800
ndash750
ndash700
ndash650
ndash600
ndash550
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(f )
Figure 6 Multimodal functions (a) f6 Griewank function (b) f7 Rastrigin function (c) f8 alpine function (d) f9 Shubert function (e) f10Ackley function (f ) f11 Cmfun function
Computational Intelligence and Neuroscience 15
Table 9 shows the test results for the three improved PSOalgorithms e DMSDL-PSO algorithm in [25] is a PSOalgorithm combined with differential variation and thequasi-Newton method whereas the HPSOWM algorithm in[26] is a binary PSO algorithm based on wavelet transformTable 9 shows that the IEPSO algorithm obtains the bestvalue in 5 out of the 11 test functions and the above analysisindicates that the IEPSO outperforms the other improvedPSO algorithms
4 Conclusion
In contemporary engineering design solving the globaloptimization problems of multiparameter strongly coupledand nonlinear systems using conventional optimizationalgorithms is difficult In this study an improved PSO thatis the IEPSO algorithm is proposed on the basis of the last-eliminated principle and an enhanced local-global in-formation sharing capability e comparison and analysisof the simulation results indicate the following conclusions
(1) e exchange of information between global andlocal optimal particles enhances the deep searchcapability of the IEPSO algorithm
(2) e standard test function is used to simulate theparameter C3 of the local-global information sharingterm e results show that the global optimizationcapability of the IEPSO algorithm is strong when C3is linearly decreasing Moreover the proposed al-gorithm can show the best search performance whenC3 is a nonlinear convex function
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the swarm from stagnating at a local optimum. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
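Conclusions (1)-(3) name three mechanisms: a C3-weighted local-global information-sharing term in the velocity update, a decreasing (nonlinear convex) schedule for C3, and periodic elimination of the worst particles. The paper's exact update equations are not reproduced in this excerpt, so the sketch below is one plausible reading with assumed forms and illustrative constants, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x**2))

def iepso_sketch(f, dim=10, n=30, iters=300, lo=-5.0, hi=5.0,
                 w=0.7, c1=1.5, c2=1.5, c3_max=1.5,
                 elim_every=20, elim_frac=0.2):
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest = x.copy()                          # personal best positions
    pfit = np.array([f(p) for p in x])        # personal best fitness
    g = pbest[np.argmin(pfit)].copy()         # global best position
    for t in range(iters):
        # (2) Nonlinear convex decreasing schedule for C3 (assumed form)
        c3 = c3_max * (1.0 - t / iters) ** 2
        r1, r2, r3 = rng.random((3, n, dim))
        # (1) Standard velocity update plus an assumed local-global
        # information-sharing term coupling personal and global bests
        v = (w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
             + c3 * r3 * (pbest - g))
        x = np.clip(x + v, lo, hi)
        fit = np.array([f(p) for p in x])
        better = fit < pfit
        pbest[better], pfit[better] = x[better], fit[better]
        g = pbest[np.argmin(pfit)].copy()
        # (3) Last-eliminated principle: periodically reinitialize the
        # worst-ranked particles to maintain population diversity
        if (t + 1) % elim_every == 0:
            worst = np.argsort(pfit)[-int(elim_frac * n):]
            x[worst] = rng.uniform(lo, hi, (len(worst), dim))
            v[worst] = 0.0
    return f(g)

best = iepso_sketch(sphere)
```

Note that elimination replaces positions but keeps the personal-best memory, so the best-so-far fitness never worsens while diversity is restored.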
In summary, the comparative results of the simulation analysis reveal that, with the application of the last-eliminated principle and the local-global information-sharing term, the proposed IEPSO effectively overcomes the disadvantages of the classical algorithms, including their premature convergence and tendency to fall into local optima. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (Project no. 51705187), and the Postdoctoral Science Foundation of China (Grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639-651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245-271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424-433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292-305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153-167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169-181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll, et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65-74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526-535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668-14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25-32, 2008.
Table 9: Three improved particle swarm algorithm test results.

Function   Criteria   IEPSO      DMSDL-PSO [25]   BHPSOWM [26]
f1         Mean       8.92E-22   4.73E-10         42.40
           SD         2.65E-39   1.81E-09         52.11
f3         Mean       6.21E-19   2.37E+03         7.61
           SD         2.63E-36   5.71E+02         0.07
f6         Mean       4.19E-02   8.66E-05         --
           SD         3.43E-04   2.96E-04         --
f7         Mean       4.46E-03   9.15E+01         76.18
           SD         1.73E-04   1.80E+01         26.75
f8         Mean       2.42E-10   1.31E+02         --
           SD         6.74E-20   5.82E+01         --
f10        Mean       1.13E-11   1.01E+00         1.72
           SD         2.21E-22   2.71E-01         0

[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122-134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634-654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542-552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52-72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584-596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670-678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371-383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831-841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314-330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273-298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), vol. 3, pp. 1945-1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240-255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23-40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90-101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028-1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832-843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138-149, 2015.
[30] Q. Cui, Q. Li, G. Li, et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186-217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764-775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36-45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42-51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira, et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007-1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53-64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic, et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569-588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167-177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285-291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64-70, 2017.
[Figure 5: Unimodal functions. Convergence curves (fitness value versus iterations, 0-600) of DE, GA, PSO, SPSO, and IEPSO for (a) f1 sphere function, (b) f2 Schaffer function, (c) f3 step function, (d) f4 SumSquares function, and (e) f5 Zakharov function.]
Computational Intelligence and Neuroscience 15
Table 9 shows the test results for the three improved PSOalgorithms e DMSDL-PSO algorithm in [25] is a PSOalgorithm combined with differential variation and thequasi-Newton method whereas the HPSOWM algorithm in[26] is a binary PSO algorithm based on wavelet transformTable 9 shows that the IEPSO algorithm obtains the bestvalue in 5 out of the 11 test functions and the above analysisindicates that the IEPSO outperforms the other improvedPSO algorithms
4 Conclusion
In contemporary engineering design solving the globaloptimization problems of multiparameter strongly coupledand nonlinear systems using conventional optimizationalgorithms is difficult In this study an improved PSO thatis the IEPSO algorithm is proposed on the basis of the last-eliminated principle and an enhanced local-global in-formation sharing capability e comparison and analysisof the simulation results indicate the following conclusions
(1) e exchange of information between global andlocal optimal particles enhances the deep searchcapability of the IEPSO algorithm
(2) e standard test function is used to simulate theparameter C3 of the local-global information sharingterm e results show that the global optimizationcapability of the IEPSO algorithm is strong when C3is linearly decreasing Moreover the proposed al-gorithm can show the best search performance whenC3 is a nonlinear convex function
(3) e last-eliminated principle is used in the IEPSO tomaintain particle population diversity MoreoverPSO is avoided in the local optimal value A com-parison of the IEPSO algorithm with the classicaloptimization algorithm and its improved versionsverifies the global search capability of the IEPSOalgorithm
In summary the comparative results of the simulationanalysis reveal that with the application of the last-eliminatedprinciple and the local-global information sharing term to the
IEPSO the proposed algorithm effectively overcomes thedisadvantages of the classical algorithms including theirprecocious convergence and tendency to fall into the localoptimum e IEPSO shows an ideal global optimizationperformance and indicates a high application value for solvingpractical engineering optimization problems
Data Availability
e data used to support the findings of this study areavailable from the corresponding author upon request
Conflicts of Interest
e authors declare that there are no conflicts of interest
Acknowledgments
is work was supported by Shanghai Rising-Star Program(no 16QB1401000) Key Project of Shanghai Science andTechnology Committee (no 16DZ1120400) and the Na-tional Natural Science Foundation of China (Project no51705187) the Postdoctoral Science Foundation of China(Grant no 2017M621202)
References
[1] Z Zhou J Wang Z Zhu D Yang and J Wu ldquoTangentnavigated robot path planning strategy using particle swarmoptimized artificial potential fieldrdquo Optik vol 158 pp 639ndash651 2018
[2] P Du R Barrio H Jiang and L Cheng ldquoAccurate Quotient-Difference algorithm error analysis improvements and ap-plicationsrdquo Applied Mathematics and Computation vol 309pp 245ndash271 2017
[3] L Jiang Z Wang Y Ye and J Jiang ldquoFast circle detectionalgorithm based on sampling from difference areardquo Optikvol 158 pp 424ndash433 2018
[4] H Garg ldquoA hybrid PSO-GA algorithm for constrained op-timization problemsrdquo Applied Mathematics amp Computationvol 274 no 11 pp 292ndash305 2016
[5] J Zhang and P Xia ldquoAn improved PSO algorithm for pa-rameter identification of nonlinear dynamic hystereticmodelsrdquo Journal of Sound and Vibration vol 389 pp 153ndash167 2017
[6] R Saini P P Roy and D P Dogra ldquoA segmental HMMbasedtrajectory classification using genetic algorithmrdquo ExpertSystems with Applications vol 93 pp 169ndash181 2018
[7] P R D O D Costa S Mauceri P Carroll et al ldquoA geneticalgorithm for a vehicle routing problemrdquo Electronic Notes inDiscrete Mathematics vol 64 pp 65ndash74 2017
[8] V Jindal and P Bedi ldquoAn improved hybrid ant particleoptimization (IHAPO) algorithm for reducing travel time inVANETsrdquo Applied Soft Computing vol 64 pp 526ndash5352018
[9] Z Peng H Manier and M A Manier ldquoParticle swarmoptimization for capacitated location-routing problemrdquoIFAC-PapersOnLine vol 50 no 1 pp 14668ndash14673 2017
[10] G Xu and G Yu ldquoReprint of on convergence analysis ofparticle swarm optimization algorithmrdquo Journal of ShanxiNormal University vol 4 no 14 pp 25ndash32 2008
[11] J Lu W Xie and H Zhou ldquoCombined fitness functionbased particle swarm optimization algorithm for system
Table 9 ree improved particle swarm algorithm test results
Functions Criteria IEPSO DMSDL-PSO [25]
BHPSOWM[26]
f1Mean 892E minus 22 473E minus 10 4240SD 265E minus 39 181E minus 09 5211
f3Mean 621E minus 19 237E + 03 761SD 263E minus 36 571E + 02 007
f6Mean 419E minus 02 866E minus 05 mdashSD 343E minus 04 296E minus 04 mdash
f7Mean 446E minus 03 915E + 01 7618SD 173E minus 04 180E + 01 2675
f8Mean 242E minus 10 131E + 02 mdashSD 674E minus 20 582E + 01 mdash
f10Mean 113E minus 11 101E + 00 172SD 221E minus 22 271E minus 01 0
16 Computational Intelligence and Neuroscience
identificationrdquo Computers amp Industrial Engineering vol 95pp 122ndash134 2016
[12] F Javidrad and M Nazari ldquoA new hybrid particle swarm andsimulated annealing stochastic optimization methodrdquo Ap-plied Soft Computing vol 60 pp 634ndash654 2017
[13] J Jie J Zhang H Zheng and B Hou ldquoFormalized model andanalysis of mixed swarm based cooperative particle swarmoptimizationrdquo Neurocomputing vol 174 pp 542ndash552 2016
[14] A Meng Z Li H Yin S Chen and Z Guo ldquoAcceleratingparticle swarm optimization using crisscross searchrdquo In-formation Sciences vol 329 pp 52ndash72 2016
[15] L Wang B Yang and J Orchard ldquoParticle swarm optimi-zation using dynamic tournament topologyrdquo Applied SoftComputing vol 48 pp 584ndash596 2016
[16] M S Kiran ldquoParticle swarm optimization with a new updatemechanismrdquo Applied Soft Computing vol 60 pp 670ndash6782017
[17] H C Tsai ldquoUnified particle swarm delivers high efficiency toparticle swarm optimizationrdquo Applied Soft Computingvol 55 pp 371ndash383 2017
[18] S F Li and C Y Cheng ldquoParticle swarm optimization withfitness adjustment parametersrdquo Computers amp Industrial En-gineering vol 113 pp 831ndash841 2017
[19] Y Chen L Li H Peng J Xiao Y Yang and Y Shi ldquoParticleswarm optimizer with two differential mutationrdquo Applied SoftComputing vol 61 pp 314ndash330 2017
[20] Q Zhang W Liu X Meng B Yang and A V VasilakosldquoVector coevolving particle swarm optimization algorithmrdquoInformation Sciences vol 394 pp 273ndash298 2017
[21] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimization[C]Evolutionary computationrdquo in Proceedingsof the 1999 Congress on Evolutionary Computation-CEC99vol 3 pp 1945ndash1950 IEEE Washington DC USA 1999
[22] Z Wang and J Cai ldquoe path-planning in radioactive en-vironment of nuclear facilities using an improved particleswarm optimization algorithmrdquo Nuclear Engineering amp De-sign vol 326 pp 79ndash86 2018
[23] A Ratnaweera S K Halgamuge and H C Watson ldquoSelf-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficientsrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 240ndash255 2004
[24] K Chen F Zhou and A Liu ldquoChaotic dynamic weightparticle swarm optimization for numerical function optimi-zationrdquo Knowledge-Based Systems vol 139 pp 23ndash40 2018
[25] Y Chen L Li H Peng J Xiao and Q Wu ldquoDynamic multi-swarm differential learning particle swarm optimizerrdquoSwarm and Evolutionary Computation vol 39 pp 209ndash2212018
[26] F Jiang H Xia Q A Tran Q M Ha N Q Tran and J HuldquoA new binary hybrid particle swarm optimization withwavelet mutationrdquo Knowledge-Based Systems vol 130pp 90ndash101 2017
[27] R Liu J Li C Mu J fan and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimiza-tion for dynamic multi-objective optimizationrdquo EuropeanJournal of Operational Research vol 261 no 3 pp 1028ndash1051 2017
[28] W Ye W Feng and S Fan ldquoA novel multi-swarm particleswarm optimization with dynamic learning strategyrdquo AppliedSoft Computing vol 61 pp 832ndash843 2017
[29] L Zhang Y Tang C Hua and X Guan ldquoA new particleswarm optimization algorithm with adaptive inertia weightbased on Bayesian techniquesrdquo Applied Soft Computingvol 28 pp 138ndash149 2015
[30] Q Cui Q Li G Li et al ldquoGlobally-optimal prediction-basedadaptive mutation particle swarm optimizationrdquo InformationSciences vol 418 pp 186ndash217 2017
[31] D Zhao and J Liu ldquoStudy on network security situationawareness based on particle swarm optimization algorithmrdquoComputers amp Industrial Engineering vol 125 pp 764ndash7752018
[32] H Samareh S H Khoshrou K Shahriar M M Ebadzadehand M Eslami ldquoOptimization of a nonlinear model forpredicting the ground vibration using the combinationalparticle swarm optimization-genetic algorithmrdquo Journal ofAfrican Earth Sciences vol 133 pp 36ndash45 2017
[33] M Dash T Panigrahi and R Sharma ldquoDistributed parameterestimation of IIR system using diffusion particle swarm op-timization algorithmrdquo Journal of King Saud University-Engineering Sciences 2017 In press
[34] B Wang S Li J Guo and Q Chen ldquoCar-like mobile robotpath planning in rough terrain using multi-objective particleswarm optimization algorithmrdquo Neurocomputing vol 282pp 42ndash51 2018
[35] Z Wang and J Cai ldquoe path-planning in radioactive en-vironment of nuclear facilities using an improved particleswarm optimization algorithmrdquo Nuclear Engineering amp De-sign vol 326 pp 79ndash86 2018
[36] R F Lopes F F Costa A Oliveira et al ldquoAlgorithm based onparticle swarm applied to electrical load scheduling in anindustrial settingrdquo Energy vol 147 pp 1007ndash1015 2018
[37] F Sheikholeslami and N J Navimipour ldquoService allocation inthe cloud environments using multi-objective particle swarmoptimization algorithm based on crowding distancerdquo Swarmamp Evolutionary Computation vol 35 pp 53ndash64 2017
[38] M Petrovic N Vukovic M Mitic et al ldquoIntegration ofprocess planning and scheduling using chaotic particle swarmoptimization algorithmrdquo Expert Systems with Applicationsvol 64 pp 569ndash588 2016
[39] Z Zhang Y Jiang S Zhang S Geng H Wang and G SangldquoAn adaptive particle swarm optimization algorithm forreservoir operation optimizationrdquo Applied Soft ComputingJournal vol 18 no 4 pp 167ndash177 2014
[40] K Li L Liu J Zhai T M Khoshgoftaar and T Li ldquoeimproved grey model based on particle swarm optimizationalgorithm for time series predictionrdquo Engineering Applica-tions of Artificial Intelligence vol 55 pp 285ndash291 2016
[41] S Gulcu and H Kodaz ldquoe estimation of the electricityenergy demand using particle swarm optimization algorithma case study of Turkeyrdquo Procedia Computer Science vol 111pp 64ndash70 2017
Computational Intelligence and Neuroscience 17
Computer Games Technology
International Journal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom
Journal ofEngineeringVolume 2018
Advances in
FuzzySystems
Hindawiwwwhindawicom
Volume 2018
International Journal of
ReconfigurableComputing
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Applied Computational Intelligence and Soft Computing
thinspAdvancesthinspinthinsp
thinspArtificial Intelligence
Hindawiwwwhindawicom Volumethinsp2018
Hindawiwwwhindawicom Volume 2018
Civil EngineeringAdvances in
Hindawiwwwhindawicom Volume 2018
Electrical and Computer Engineering
Journal of
Journal of
Computer Networks and Communications
Hindawiwwwhindawicom Volume 2018
Hindawi
wwwhindawicom Volume 2018
Advances in
Multimedia
International Journal of
Biomedical Imaging
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Engineering Mathematics
International Journal of
RoboticsJournal of
Hindawiwwwhindawicom Volume 2018
Hindawiwwwhindawicom Volume 2018
Computational Intelligence and Neuroscience
Hindawiwwwhindawicom Volume 2018
Mathematical Problems in Engineering
Modelling ampSimulationin EngineeringHindawiwwwhindawicom Volume 2018
Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom
The Scientific World Journal
Volume 2018
Hindawiwwwhindawicom Volume 2018
Human-ComputerInteraction
Advances in
Hindawiwwwhindawicom Volume 2018
Scientic Programming
Submit your manuscripts atwwwhindawicom
0 100 200 300 400 500 600Iterations
0
05
1
15
2
25
3Fi
tnes
s val
ue
DEGAPSO
SPSOIEPSO
(a)
0 100 200 300 400 500 600Iterations
0
20
40
60
80
100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(b)
0 100 200 300 400 500 600Iterations
0
5
10
15
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(c)
0 100 200 300 400 500 600Iterations
ndash180
ndash160
ndash140
ndash120
ndash100
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(d)
0 100 200 300 400 500 600Iterations
0
5
10
15
20
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(e)
100 200 300 400 500 600Iterations
ndash850
ndash800
ndash750
ndash700
ndash650
ndash600
ndash550
Fitn
ess v
alue
DEGAPSO
SPSOIEPSO
(f )
Figure 6 Multimodal functions (a) f6 Griewank function (b) f7 Rastrigin function (c) f8 alpine function (d) f9 Shubert function (e) f10Ackley function (f ) f11 Cmfun function
Computational Intelligence and Neuroscience 15
Table 9 shows the test results for the three improved PSOalgorithms e DMSDL-PSO algorithm in [25] is a PSOalgorithm combined with differential variation and thequasi-Newton method whereas the HPSOWM algorithm in[26] is a binary PSO algorithm based on wavelet transformTable 9 shows that the IEPSO algorithm obtains the bestvalue in 5 out of the 11 test functions and the above analysisindicates that the IEPSO outperforms the other improvedPSO algorithms
4 Conclusion
In contemporary engineering design solving the globaloptimization problems of multiparameter strongly coupledand nonlinear systems using conventional optimizationalgorithms is difficult In this study an improved PSO thatis the IEPSO algorithm is proposed on the basis of the last-eliminated principle and an enhanced local-global in-formation sharing capability e comparison and analysisof the simulation results indicate the following conclusions
(1) e exchange of information between global andlocal optimal particles enhances the deep searchcapability of the IEPSO algorithm
(2) e standard test function is used to simulate theparameter C3 of the local-global information sharingterm e results show that the global optimizationcapability of the IEPSO algorithm is strong when C3is linearly decreasing Moreover the proposed al-gorithm can show the best search performance whenC3 is a nonlinear convex function
(3) e last-eliminated principle is used in the IEPSO tomaintain particle population diversity MoreoverPSO is avoided in the local optimal value A com-parison of the IEPSO algorithm with the classicaloptimization algorithm and its improved versionsverifies the global search capability of the IEPSOalgorithm
In summary the comparative results of the simulationanalysis reveal that with the application of the last-eliminatedprinciple and the local-global information sharing term to the
IEPSO the proposed algorithm effectively overcomes thedisadvantages of the classical algorithms including theirprecocious convergence and tendency to fall into the localoptimum e IEPSO shows an ideal global optimizationperformance and indicates a high application value for solvingpractical engineering optimization problems
Data Availability
e data used to support the findings of this study areavailable from the corresponding author upon request
Conflicts of Interest
e authors declare that there are no conflicts of interest
Acknowledgments
is work was supported by Shanghai Rising-Star Program(no 16QB1401000) Key Project of Shanghai Science andTechnology Committee (no 16DZ1120400) and the Na-tional Natural Science Foundation of China (Project no51705187) the Postdoctoral Science Foundation of China(Grant no 2017M621202)
References
[1] Z Zhou J Wang Z Zhu D Yang and J Wu ldquoTangentnavigated robot path planning strategy using particle swarmoptimized artificial potential fieldrdquo Optik vol 158 pp 639ndash651 2018
[2] P Du R Barrio H Jiang and L Cheng ldquoAccurate Quotient-Difference algorithm error analysis improvements and ap-plicationsrdquo Applied Mathematics and Computation vol 309pp 245ndash271 2017
[3] L Jiang Z Wang Y Ye and J Jiang ldquoFast circle detectionalgorithm based on sampling from difference areardquo Optikvol 158 pp 424ndash433 2018
[4] H Garg ldquoA hybrid PSO-GA algorithm for constrained op-timization problemsrdquo Applied Mathematics amp Computationvol 274 no 11 pp 292ndash305 2016
[5] J Zhang and P Xia ldquoAn improved PSO algorithm for pa-rameter identification of nonlinear dynamic hystereticmodelsrdquo Journal of Sound and Vibration vol 389 pp 153ndash167 2017
[6] R Saini P P Roy and D P Dogra ldquoA segmental HMMbasedtrajectory classification using genetic algorithmrdquo ExpertSystems with Applications vol 93 pp 169ndash181 2018
[7] P R D O D Costa S Mauceri P Carroll et al ldquoA geneticalgorithm for a vehicle routing problemrdquo Electronic Notes inDiscrete Mathematics vol 64 pp 65ndash74 2017
[8] V Jindal and P Bedi ldquoAn improved hybrid ant particleoptimization (IHAPO) algorithm for reducing travel time inVANETsrdquo Applied Soft Computing vol 64 pp 526ndash5352018
[9] Z Peng H Manier and M A Manier ldquoParticle swarmoptimization for capacitated location-routing problemrdquoIFAC-PapersOnLine vol 50 no 1 pp 14668ndash14673 2017
[10] G Xu and G Yu ldquoReprint of on convergence analysis ofparticle swarm optimization algorithmrdquo Journal of ShanxiNormal University vol 4 no 14 pp 25ndash32 2008
[11] J Lu W Xie and H Zhou ldquoCombined fitness functionbased particle swarm optimization algorithm for system
Table 9 ree improved particle swarm algorithm test results
Functions Criteria IEPSO DMSDL-PSO [25]
BHPSOWM[26]
f1Mean 892E minus 22 473E minus 10 4240SD 265E minus 39 181E minus 09 5211
f3Mean 621E minus 19 237E + 03 761SD 263E minus 36 571E + 02 007
f6Mean 419E minus 02 866E minus 05 mdashSD 343E minus 04 296E minus 04 mdash
f7Mean 446E minus 03 915E + 01 7618SD 173E minus 04 180E + 01 2675
f8Mean 242E minus 10 131E + 02 mdashSD 674E minus 20 582E + 01 mdash
f10Mean 113E minus 11 101E + 00 172SD 221E minus 22 271E minus 01 0
16 Computational Intelligence and Neuroscience
Table 9 shows the test results for the three improved PSO algorithms. The DMSDL-PSO algorithm in [25] is a PSO variant combined with differential mutation and the quasi-Newton method, whereas the BHPSOWM algorithm in [26] is a binary hybrid PSO based on wavelet mutation. Table 9 shows that the IEPSO algorithm obtains the best value on 5 of the 11 test functions, and the above analysis indicates that the IEPSO outperforms the other improved PSO algorithms.
4. Conclusion
In contemporary engineering design, solving the global optimization problems of multiparameter, strongly coupled, and nonlinear systems with conventional optimization algorithms is difficult. In this study, an improved PSO, the IEPSO algorithm, is proposed on the basis of the last-eliminated principle and an enhanced local-global information-sharing capability. The comparison and analysis of the simulation results indicate the following conclusions:
(1) The exchange of information between the global and local optimal particles enhances the deep-search capability of the IEPSO algorithm.
(2) The standard test functions are used to study the parameter C3 of the local-global information-sharing term. The results show that the global optimization capability of the IEPSO algorithm is strong when C3 decreases linearly; moreover, the proposed algorithm shows its best search performance when C3 follows a nonlinear convex decreasing function.
(3) The last-eliminated principle is used in the IEPSO to maintain particle population diversity and to prevent the swarm from stagnating at a local optimum. A comparison of the IEPSO algorithm with the classical optimization algorithms and their improved versions verifies the global search capability of the IEPSO algorithm.
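The mechanisms summarized above can be sketched in code. The following is a minimal illustrative sketch, not the paper's exact formulation: the precise form of the C3 information-sharing term, the convex C3 schedule, and the elimination fraction (`elim_frac`) are assumptions made here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def c3_schedule(t, T, c3_max=2.0, c3_min=0.0, shape="convex"):
    """Decreasing schedule for C3: constant-rate 'linear' decay, or a
    hypothetical convex curve that stays high early and drops near the end."""
    r = t / T
    if shape == "linear":
        return c3_max - (c3_max - c3_min) * r
    return c3_min + (c3_max - c3_min) * (1.0 - r) ** 2  # convex decreasing

def iepso_step(X, V, pbest, gbest, f, t, T,
               w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, elim_frac=0.2):
    """One IEPSO-style iteration (sketch): the classical PSO velocity update,
    an assumed local-global information-sharing term weighted by C3, and a
    last-eliminated reinitialization of the worst particles."""
    n, d = X.shape
    r1, r2, r3 = rng.random((3, n, d))
    c3 = c3_schedule(t, T)
    V = (w * V
         + c1 * r1 * (pbest - X)        # cognitive term
         + c2 * r2 * (gbest - X)        # social term
         + c3 * r3 * (gbest - pbest))   # assumed information-sharing term
    X = np.clip(X + V, lo, hi)
    # Last-eliminated principle (sketch): reinitialize the worst-fitness
    # particles each generation to preserve population diversity.
    fit = np.apply_along_axis(f, 1, X)
    k = max(1, int(elim_frac * n))
    worst = np.argsort(fit)[-k:]        # minimization: largest values are worst
    X[worst] = rng.uniform(lo, hi, (k, d))
    V[worst] = 0.0
    return X, V
```

Under this sketch, the linear and convex C3 schedules can be compared directly: at mid-run the convex curve keeps C3 lower than the linear one, shifting the search from information sharing toward exploitation earlier.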
In summary, the comparative simulation analysis reveals that, with the application of the last-eliminated principle and the local-global information-sharing term, the IEPSO effectively overcomes the disadvantages of the classical algorithms, including premature convergence and the tendency to fall into local optima. The IEPSO shows ideal global optimization performance and high application value for solving practical engineering optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shanghai Rising-Star Program (no. 16QB1401000), the Key Project of the Shanghai Science and Technology Committee (no. 16DZ1120400), the National Natural Science Foundation of China (project no. 51705187), and the Postdoctoral Science Foundation of China (grant no. 2017M621202).
References
[1] Z. Zhou, J. Wang, Z. Zhu, D. Yang, and J. Wu, "Tangent navigated robot path planning strategy using particle swarm optimized artificial potential field," Optik, vol. 158, pp. 639-651, 2018.
[2] P. Du, R. Barrio, H. Jiang, and L. Cheng, "Accurate Quotient-Difference algorithm: error analysis, improvements and applications," Applied Mathematics and Computation, vol. 309, pp. 245-271, 2017.
[3] L. Jiang, Z. Wang, Y. Ye, and J. Jiang, "Fast circle detection algorithm based on sampling from difference area," Optik, vol. 158, pp. 424-433, 2018.
[4] H. Garg, "A hybrid PSO-GA algorithm for constrained optimization problems," Applied Mathematics & Computation, vol. 274, no. 11, pp. 292-305, 2016.
[5] J. Zhang and P. Xia, "An improved PSO algorithm for parameter identification of nonlinear dynamic hysteretic models," Journal of Sound and Vibration, vol. 389, pp. 153-167, 2017.
[6] R. Saini, P. P. Roy, and D. P. Dogra, "A segmental HMM based trajectory classification using genetic algorithm," Expert Systems with Applications, vol. 93, pp. 169-181, 2018.
[7] P. R. D. O. D. Costa, S. Mauceri, P. Carroll et al., "A genetic algorithm for a vehicle routing problem," Electronic Notes in Discrete Mathematics, vol. 64, pp. 65-74, 2017.
[8] V. Jindal and P. Bedi, "An improved hybrid ant particle optimization (IHAPO) algorithm for reducing travel time in VANETs," Applied Soft Computing, vol. 64, pp. 526-535, 2018.
[9] Z. Peng, H. Manier, and M. A. Manier, "Particle swarm optimization for capacitated location-routing problem," IFAC-PapersOnLine, vol. 50, no. 1, pp. 14668-14673, 2017.
[10] G. Xu and G. Yu, "Reprint of: on convergence analysis of particle swarm optimization algorithm," Journal of Shanxi Normal University, vol. 4, no. 14, pp. 25-32, 2008.
[11] J. Lu, W. Xie, and H. Zhou, "Combined fitness function based particle swarm optimization algorithm for system identification," Computers & Industrial Engineering, vol. 95, pp. 122-134, 2016.
[12] F. Javidrad and M. Nazari, "A new hybrid particle swarm and simulated annealing stochastic optimization method," Applied Soft Computing, vol. 60, pp. 634-654, 2017.
[13] J. Jie, J. Zhang, H. Zheng, and B. Hou, "Formalized model and analysis of mixed swarm based cooperative particle swarm optimization," Neurocomputing, vol. 174, pp. 542-552, 2016.
[14] A. Meng, Z. Li, H. Yin, S. Chen, and Z. Guo, "Accelerating particle swarm optimization using crisscross search," Information Sciences, vol. 329, pp. 52-72, 2016.
[15] L. Wang, B. Yang, and J. Orchard, "Particle swarm optimization using dynamic tournament topology," Applied Soft Computing, vol. 48, pp. 584-596, 2016.
[16] M. S. Kiran, "Particle swarm optimization with a new update mechanism," Applied Soft Computing, vol. 60, pp. 670-678, 2017.
[17] H. C. Tsai, "Unified particle swarm delivers high efficiency to particle swarm optimization," Applied Soft Computing, vol. 55, pp. 371-383, 2017.
[18] S. F. Li and C. Y. Cheng, "Particle swarm optimization with fitness adjustment parameters," Computers & Industrial Engineering, vol. 113, pp. 831-841, 2017.
[19] Y. Chen, L. Li, H. Peng, J. Xiao, Y. Yang, and Y. Shi, "Particle swarm optimizer with two differential mutation," Applied Soft Computing, vol. 61, pp. 314-330, 2017.
[20] Q. Zhang, W. Liu, X. Meng, B. Yang, and A. V. Vasilakos, "Vector coevolving particle swarm optimization algorithm," Information Sciences, vol. 394, pp. 273-298, 2017.
[21] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, pp. 1945-1950, IEEE, Washington, DC, USA, 1999.
[22] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[23] A. Ratnaweera, S. K. Halgamuge, and H. C. Watson, "Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 240-255, 2004.
[24] K. Chen, F. Zhou, and A. Liu, "Chaotic dynamic weight particle swarm optimization for numerical function optimization," Knowledge-Based Systems, vol. 139, pp. 23-40, 2018.
[25] Y. Chen, L. Li, H. Peng, J. Xiao, and Q. Wu, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.
[26] F. Jiang, H. Xia, Q. A. Tran, Q. M. Ha, N. Q. Tran, and J. Hu, "A new binary hybrid particle swarm optimization with wavelet mutation," Knowledge-Based Systems, vol. 130, pp. 90-101, 2017.
[27] R. Liu, J. Li, C. Mu, J. Fan, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028-1051, 2017.
[28] W. Ye, W. Feng, and S. Fan, "A novel multi-swarm particle swarm optimization with dynamic learning strategy," Applied Soft Computing, vol. 61, pp. 832-843, 2017.
[29] L. Zhang, Y. Tang, C. Hua, and X. Guan, "A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques," Applied Soft Computing, vol. 28, pp. 138-149, 2015.
[30] Q. Cui, Q. Li, G. Li et al., "Globally-optimal prediction-based adaptive mutation particle swarm optimization," Information Sciences, vol. 418, pp. 186-217, 2017.
[31] D. Zhao and J. Liu, "Study on network security situation awareness based on particle swarm optimization algorithm," Computers & Industrial Engineering, vol. 125, pp. 764-775, 2018.
[32] H. Samareh, S. H. Khoshrou, K. Shahriar, M. M. Ebadzadeh, and M. Eslami, "Optimization of a nonlinear model for predicting the ground vibration using the combinational particle swarm optimization-genetic algorithm," Journal of African Earth Sciences, vol. 133, pp. 36-45, 2017.
[33] M. Dash, T. Panigrahi, and R. Sharma, "Distributed parameter estimation of IIR system using diffusion particle swarm optimization algorithm," Journal of King Saud University - Engineering Sciences, 2017, in press.
[34] B. Wang, S. Li, J. Guo, and Q. Chen, "Car-like mobile robot path planning in rough terrain using multi-objective particle swarm optimization algorithm," Neurocomputing, vol. 282, pp. 42-51, 2018.
[35] Z. Wang and J. Cai, "The path-planning in radioactive environment of nuclear facilities using an improved particle swarm optimization algorithm," Nuclear Engineering & Design, vol. 326, pp. 79-86, 2018.
[36] R. F. Lopes, F. F. Costa, A. Oliveira et al., "Algorithm based on particle swarm applied to electrical load scheduling in an industrial setting," Energy, vol. 147, pp. 1007-1015, 2018.
[37] F. Sheikholeslami and N. J. Navimipour, "Service allocation in the cloud environments using multi-objective particle swarm optimization algorithm based on crowding distance," Swarm & Evolutionary Computation, vol. 35, pp. 53-64, 2017.
[38] M. Petrovic, N. Vukovic, M. Mitic et al., "Integration of process planning and scheduling using chaotic particle swarm optimization algorithm," Expert Systems with Applications, vol. 64, pp. 569-588, 2016.
[39] Z. Zhang, Y. Jiang, S. Zhang, S. Geng, H. Wang, and G. Sang, "An adaptive particle swarm optimization algorithm for reservoir operation optimization," Applied Soft Computing Journal, vol. 18, no. 4, pp. 167-177, 2014.
[40] K. Li, L. Liu, J. Zhai, T. M. Khoshgoftaar, and T. Li, "The improved grey model based on particle swarm optimization algorithm for time series prediction," Engineering Applications of Artificial Intelligence, vol. 55, pp. 285-291, 2016.
[41] S. Gulcu and H. Kodaz, "The estimation of the electricity energy demand using particle swarm optimization algorithm: a case study of Turkey," Procedia Computer Science, vol. 111, pp. 64-70, 2017.
Computational Intelligence and Neuroscience 17