Hindawi Publishing Corporation, Journal of Applied Mathematics, Volume 2013, Article ID 696491, 21 pages. http://dx.doi.org/10.1155/2013/696491

Research Article
A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization

Gaige Wang 1,2 and Lihong Guo 1

1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 Graduate School of Chinese Academy of Sciences, Beijing 100039, China

Correspondence should be addressed to Lihong Guo; guolh@ciomp.ac.cn

Received 27 June 2012; Revised 27 November 2012; Accepted 15 December 2012

Academic Editor: Marco H. Terra

Copyright © 2013 G. Wang and L. Guo. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A novel, robust hybrid metaheuristic optimization approach, which can be considered an improvement of the recently developed bat algorithm, is proposed to solve global numerical optimization problems. The improvement is the addition of the pitch adjustment operation of HS, serving as a mutation operator during the bat-updating process, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of real-world applications. The detailed implementation procedure for this improved metaheuristic method is also described. Fourteen standard benchmark functions are applied to verify the effects of these improvements, and it is demonstrated that, in most situations, the performance of this hybrid metaheuristic method (HS/BA) is superior to, or at least highly competitive with, the standard BA and other population-based optimization methods, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. The effect of the HS/BA parameters is also analyzed.

1. Introduction

The process of optimization is to search for a vector in the domain of a function that produces an optimal solution. All feasible values are available solutions, and the extreme value is the optimal solution. In general, optimization algorithms are applied to solve these optimization problems. A simple way to classify optimization algorithms is by the nature of the algorithms: optimization algorithms can be divided into two main categories, deterministic algorithms and stochastic algorithms. Deterministic algorithms, such as gradient-based hill climbing, follow a rigorous procedure and will generate the same set of solutions if the iterations commence from the same initial starting point. On the other hand, stochastic algorithms, which do not use gradients, often generate different solutions even with the same initial value. However, generally speaking, the final values, though slightly different, will converge to the same optimal solutions within a given accuracy [1]. Generally, stochastic algorithms have two types: heuristic and metaheuristic. Recently, nature-inspired metaheuristic algorithms have performed powerfully and efficiently in solving modern nonlinear numerical global optimization problems. To some extent, all metaheuristic algorithms strive to balance randomization (global search) and local search [2].

Inspired by nature, these strong metaheuristic algorithms are applied to solve NP-hard problems, such as UCAV path planning [11-14], test-sheet composition [15], WSN deployment [16], and water, geotechnical, and transport engineering [17]. Optimization algorithms cover all problems of searching for extreme values. These kinds of metaheuristic algorithms operate on a population of solutions and always find the best solutions. During the 1950s and 1960s, computer scientists studied the possibility of conceptualizing evolution as an optimization tool, and this generated a subset of gradient-free approaches named genetic algorithms (GA) [18]. Since then, many other nature-inspired metaheuristic algorithms have emerged, such as differential evolution (DE) [10, 19, 20], particle swarm optimization (PSO) [21-23], biogeography-based optimization (BBO) [24, 25], harmony search (HS) [26, 27], and, more recently, the bat algorithm (BA) [28, 29], which is inspired by the echolocation behavior of bats in nature.

First presented by Yang in 2010, the bat-inspired algorithm, or bat algorithm (BA) [30, 31], is a metaheuristic search algorithm inspired by the echolocation behavior of


bats, with varying pulse rates of emission and loudness. The primary purpose of a bat's echolocation is to act as a signal system to sense distance and hunt for food/prey. A comprehensive review of swarm intelligence involving BA was performed by Parpinelli and Lopes [32]. Furthermore, Tsai et al. proposed an evolving bat algorithm (EBA) to improve the performance of BA with better efficiency [33].

First proposed by Geem et al. in 2001, harmony search (HS) [26] is a metaheuristic approach for minimizing possibly nondifferentiable and nonlinear functions in continuous space. HS is inspired by the behavior of a musician's improvisation process, where every musician attempts to improve his or her tune so as to create optimal harmony in a real-world musical performance. The HS algorithm originates in the similarity between engineering optimization and music improvisation: engineers search for a global optimal solution as determined by an objective function, just as musicians strive for aesthetic harmony as determined by an aesthetic standard. In music improvisation, each musician chooses any pitch within the feasible range, together producing one harmony vector. If all the pitches produce a good solution, that experience is reserved in each variable's memory, and the possibility of producing a better solution next time is also increased. Furthermore, this approach requires few control variables, which makes HS easy to implement, more robust, and very appropriate for parallel computation.

BA is a powerful algorithm in exploitation (i.e., local search), but at times it may become trapped in local optima, so that it cannot perform global search well. In the bat algorithm, the search depends completely on random walks, so fast convergence cannot be guaranteed. First presented here, in order to increase the diversity of the population for BA so as to avoid trapping into local optima, a main improvement of adding the pitch adjustment operation in HS, serving as a mutation operator, is made to the BA, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. That is to say, we combine the two approaches to propose a new hybrid metaheuristic algorithm according to the principles of HS and BA, and then this improved BA method is used to search for the optimal objective function value. The proposed approach is evaluated on fourteen standard benchmark functions that have been applied to verify optimization algorithms on continuous optimization problems. Experimental results show that HS/BA performs more efficiently and accurately than basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA.

The structure of this paper is organized as follows. Section 2 briefly describes the global numerical optimization problem, the HS algorithm, and the basic BA. Our proposed approach, HS/BA, is presented in detail in Section 3. Subsequently, our method is evaluated through fourteen benchmark functions in Section 4, where HS/BA is also compared with BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA. Finally, Section 5 consists of the conclusion and proposals for future work.

2. Preliminary

To begin with, in this section we provide a brief background on the optimization problem, harmony search (HS), and the bat algorithm (BA).

2.1. Optimization Problem. In computer science, mathematics, management science, and control theory, optimization (also called mathematical optimization or mathematical programming) means the selection of an optimal solution from some set of feasible alternatives. In general, an optimization problem includes minimizing or maximizing a function by systematically selecting input values from a given feasible set and calculating the value of the function. More generally, optimization consists of finding the optimal values of some objective function within a given domain, including a number of different types of domains and different types of objective functions [34].

A global optimization problem can be described as follows.

Given: a function $f : D \to \mathbb{R}$ from some set $D$ to the real numbers.
Sought: a parameter $x_0$ in $D$ such that $f(x_0) \le f(x)$ for all $x$ in $D$ ("minimization") or such that $f(x_0) \ge f(x)$ for all $x$ in $D$ ("maximization").

Such a formulation is called a numerical optimization problem. Many theoretical and practical problems may be modeled in this general framework. In general, $D$ is some subset of the Euclidean space $\mathbb{R}^n$, often specified by a group of constraints, equalities, or inequalities that the components of $D$ have to satisfy. The domain $D$ of $f$ is called the search space, while the elements of $D$ are called feasible solutions or candidate solutions. In general, the function $f$ is called an objective function, cost function (minimization), or utility function (maximization). An optimal solution is a feasible solution that is an extreme (minimum or maximum) of the objective function.

Conventionally, the standard formulation of an optimization problem is stated in terms of minimization. In general, unless both the feasible region and the objective function are convex, in a minimization problem there may be more than one local minimum. A local minimum $x^*$ is defined as a point for which the expression

$$f(x^*) \le f(x) \tag{1}$$

holds for all $x$ in some neighborhood of $x^*$. More details about the optimization problem can be found in [35].

The branch of numerical analysis and applied mathematics that investigates deterministic algorithms able to guarantee convergence in finite time to the true optimal solution of a nonconvex problem is called global numerical optimization. A variety of algorithms have been proposed to solve nonconvex problems. Among them, heuristic algorithms can evaluate approximate solutions to some optimization problems, as described in the introduction [36].


2.2. Harmony Search. First developed by Geem et al. in 2001, harmony search (HS) [26] is a relatively new metaheuristic optimization algorithm, based on the natural musical performance process that occurs when a musician searches for an optimal state of harmony. The optimization operators of the HS algorithm are specified as the harmony memory (HM), which keeps the solution vectors, all within the search space, as shown in (2); the harmony memory size HMS, which represents the number of solution vectors kept in the HM; the harmony memory consideration rate (HMCR); the pitch adjustment rate (PAR); and the pitch adjustment bandwidth (bw):

$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_D^1 & \text{fitness}(x^1) \\ x_1^2 & x_2^2 & \cdots & x_D^2 & \text{fitness}(x^2) \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} & \text{fitness}(x^{\mathrm{HMS}}) \end{bmatrix} \tag{2}$$

In more detail, we can explain the HS algorithm by discussing the improvisation process of a music player. When a player is improvising, he or she has three possible options: (1) play any well-known piece of music (a series of pitches in harmony) exactly from his or her memory, governed by the harmony memory consideration rate (HMCR); (2) play something similar to a known piece in the player's memory (thus adjusting the pitch slightly); or (3) play a totally new or random pitch from the feasible range. If these three options are formalized for optimization, we have three corresponding components: employment of harmony memory, pitch adjusting, and randomization. Similarly, when each decision variable selects one value in the HS algorithm, it can make use of one of the above three rules in the whole HS procedure. If a new harmony vector is better than the worst harmony vector in the HM, the new harmony vector takes the place of the worst harmony vector in the HM. This procedure is repeated until a stopping criterion is satisfied.

The employment of harmony memory is significant, as it is analogous to selecting the best-fit individuals in the GA (genetic algorithm). This ensures that the best harmonies are carried over to the new harmony memory. For the purpose of using this memory more effectively, we should properly set the value of the parameter HMCR $\in [0, 1]$. If this rate is very close to 1 (too high), almost all the harmonies in the harmony memory are utilized, and other harmonies are not explored well, resulting in potentially wrong solutions. If this rate is very close to 0 (extremely low), only a few best harmonies are chosen, and the algorithm may have a slow convergence rate. Therefore, generally, HMCR = 0.7-0.95.

To adjust the pitch slightly in the second component, an appropriate approach has to be applied to adjust the frequency efficiently. In theory, we can adjust the pitch linearly or nonlinearly, but in practice linear adjustment is utilized. If $x_{\mathrm{old}}$ is the current solution (or pitch), then the new solution (pitch) $x_{\mathrm{new}}$ is generated by

$$x_{\mathrm{new}} = x_{\mathrm{old}} + bw\,(2\varepsilon - 1), \tag{3}$$

where $\varepsilon$ is a random real number drawn from a uniform distribution on $[0, 1]$, and bw is the bandwidth controlling the local range of pitch adjustment. Indeed, we can see that the pitch adjustment (3) is a random walk.

Pitch adjustment is similar to the mutation operator in GA. We must also appropriately set the parameter PAR to control the degree of the adjustment. If PAR is near 1 (too high), then the solution is always changing, and the algorithm may not converge at all. If it is very close to 0 (too low), then there is very little change, and the algorithm may converge prematurely. Therefore, we use PAR = 0.1-0.5 in most simulations, as usual.

For the purpose of increasing the diversity of the solutions, randomization is needed in the third component. Although adjusting pitch has a similar role, it is confined to a certain local pitch adjustment and thus corresponds to a local search. The use of randomization can drive the system further, to explore multifarious regions of high solution diversity in order to find the global optimal solution. In real-world engineering applications, HS has been applied to solve many optimization problems, including function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, and vehicle routing.

The framework of the basic HS algorithm is described in Algorithm 1, where $D$ is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution. From Algorithm 1, it is clear that there are only two control parameters in HS, which are HMCR and PAR.

2.3. Bat Algorithm. The bat algorithm is a novel metaheuristic swarm intelligence optimization method developed for global numerical optimization, in which the search is inspired by the social behavior of bats and the phenomenon of echolocation to sense distance.

In [28], for simplicity, the bat algorithm is based on idealizing some of the echolocation characteristics of bats, expressed by the following approximate or idealized rules:

(1) all bats apply echolocation to sense distance, and they always "know" their surroundings in some magical way;

(2) bats fly randomly with velocity $v_i$ and a fixed frequency $f_{\min}$ at position $x_i$, varying wavelength $\lambda$ and loudness $A_0$ to hunt for prey; they can spontaneously adjust the wavelength (or frequency) of their emitted pulses and the rate of pulse emission $r \in [0, 1]$, depending on the proximity of their target;

(3) although the loudness can change in different ways, it is supposed that the loudness varies from a minimum constant (positive) $A_{\min}$ to a large $A_0$.

Based on these approximations and idealizations, the basic steps of the bat algorithm (BA) can be described as shown in Algorithm 2.

In BA, each bat is defined by its position $x_i^t$, velocity $v_i^t$, frequency $f_i$, loudness $A_i^t$, and emission pulse rate $r_i^t$


Begin
  Step 1: Set the parameters and initialize the HM.
  Step 2: Evaluate the fitness for each individual in HM.
  Step 3: while the halting criterion is not satisfied do
    for d = 1 to D do
      if rand < HMCR then  // memory consideration
        x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
        if rand < PAR then  // pitch adjustment
          x_new(d) = x_old(d) + bw × (2 × rand − 1)
        endif
      else  // random selection
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      endif
    endfor
    Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective).
    Update the best harmony vector found so far.
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 1: Harmony search algorithm.
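As a working illustration of the inner loop of Algorithm 1, the following Python sketch implements one improvisation step. The function and variable names are ours, the harmony memory is stored as a NumPy array rather than the matrix of (2), and the default bandwidth is an illustrative value rather than one prescribed by the paper:

```python
import numpy as np

def improvise(hm, lb, ub, hmcr=0.95, par=0.1, bw=0.2, rng=None):
    """One HS improvisation step: memory consideration, pitch adjustment
    (eq. (3)), and random selection, applied per dimension. hm is the
    (HMS, D) harmony memory; lb and ub are the per-dimension bounds."""
    if rng is None:
        rng = np.random.default_rng()
    hms, d = hm.shape
    x_new = np.empty(d)
    for j in range(d):
        if rng.random() < hmcr:                    # memory consideration
            x_new[j] = hm[rng.integers(hms), j]
            if rng.random() < par:                 # pitch adjustment, eq. (3)
                x_new[j] += bw * (2 * rng.random() - 1)
        else:                                      # random selection
            x_new[j] = lb[j] + rng.random() * (ub[j] - lb[j])
    return np.clip(x_new, lb, ub)
```

The improvised vector then replaces the worst member of the HM whenever it has a better objective value, exactly as in the update step of Algorithm 1.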

Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define loudness A_i, pulse frequency Q_i, and the initial velocities v_i (i = 1, 2, ..., NP); set pulse rate r_i.
  Step 2: While the termination criterion is not satisfied or t < MaxGeneration do
    Generate new solutions by adjusting frequency and updating velocities and locations/solutions [(4)-(6)].
    if (rand > r_i) then
      Select a solution among the best solutions.
      Generate a local solution around the selected best solution.
    end if
    Generate a new solution by flying randomly.
    if (rand < A_i and f(x_i) < f(x*)) then
      Accept the new solutions.
      Increase r_i and reduce A_i.
    end if
    Rank the bats and find the current best x*.
    t = t + 1
  Step 3: end while
  Step 4: Post-process the results and visualization.
End

Algorithm 2: Bat algorithm.

in a $D$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$f_i = f_{\min} + (f_{\max} - f_{\min})\,\beta, \tag{4}$$

$$v_i^t = v_i^{t-1} + (x_i^t - x_*)\,f_i, \tag{5}$$

$$x_i^t = x_i^{t-1} + v_i^t, \tag{6}$$

where $\beta \in [0, 1]$ is a random vector drawn from a uniform distribution. Here $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the $n$ bats. Generally speaking, depending on the domain size of the problem of interest, the frequency $f$ is assigned to $f_{\min} = 0$ and $f_{\max} = 100$ in practical implementations. Initially, each bat is randomly given a frequency drawn uniformly from $[f_{\min}, f_{\max}]$.
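In code, the frequency, velocity, and position updates of (4)-(6) amount to a few vectorized NumPy lines (a sketch under the stated $f_{\min} = 0$ and $f_{\max} = 100$; the array shapes and function name are our own):

```python
import numpy as np

def ba_move(x, v, x_star, f_min=0.0, f_max=100.0, rng=None):
    """x, v: (NP, D) positions and velocities; x_star: (D,) global best."""
    if rng is None:
        rng = np.random.default_rng()
    beta = rng.random((x.shape[0], 1))      # eq. (4): one uniform draw per bat
    f = f_min + (f_max - f_min) * beta      # per-bat frequency f_i
    v = v + (x - x_star) * f                # eq. (5)
    x = x + v                               # eq. (6)
    return x, v
```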

For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk:

$$x_{\mathrm{new}} = x_{\mathrm{old}} + \varepsilon A^t, \tag{7}$$

where $\varepsilon \in [-1, 1]$ is a random scaling factor, and $A^t = \langle A_i^t \rangle$ is the average loudness of all the bats at time step $t$.

time step 119905The update of the velocities and positions of bats is similar

to the procedure in the standard PSO [21] as 119891119894controls the

pace and range of the movement of the swarming particles inessence To some degree BA can be considered as a balancedcombination of the standard PSO and the intensive localsearch controlled by the loudness and pulse rate
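Continuing the NumPy sketch above, the random walk of (7) around a selected solution can be written as follows (the helper name is ours; loudness is the vector of all bats' current loudness values):

```python
def local_walk(x_old, loudness, rng):
    """Random-walk local search of eq. (7) around a selected solution."""
    eps = rng.uniform(-1.0, 1.0, size=x_old.shape)  # scaling factor in [-1, 1]
    return x_old + eps * loudness.mean()            # <A_i^t>: average loudness
```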

Furthermore, the loudness $A_i$ and the rate $r_i$ of pulse emission are updated as the iterations proceed, as shown in (8):

$$A_i^{t+1} = \alpha A_i^t, \qquad r_i^{t+1} = r_i^0 \left[1 - \exp(-\gamma t)\right], \tag{8}$$

where $\alpha$ and $\gamma$ are constants. In essence, $\alpha$ is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set $\alpha = \gamma = 0.9$ in this work.
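The behavior of the schedule in (8) is easy to verify numerically: with the paper's $\alpha = \gamma = 0.9$, the loudness decays geometrically toward 0 while the pulse rate rises toward its initial value $r_i^0$ (the starting values below are illustrative, not prescribed by the text):

```python
import math

alpha, gamma = 0.9, 0.9   # constants from the text
A, r0 = 0.95, 0.5         # illustrative starting loudness and initial pulse rate
for t in range(1, 6):
    A = alpha * A                         # A_i^{t+1} = alpha * A_i^t
    r = r0 * (1 - math.exp(-gamma * t))   # r_i^{t+1} -> r0 as t grows
    print(t, round(A, 4), round(r, 4))
```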

3. Our Approach: HS/BA

Based on the introduction of HS and BA in the previous section, we now explain how we combine the two approaches to form the proposed bat algorithm with harmony search (HS/BA), which modifies the solutions with poor fitness in order to add diversity to the population and improve the search efficiency.

In general, the standard BA algorithm is adept at exploiting the search space, but at times it may become trapped in local optima, so that it cannot perform global search well. For BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. First presented here, in order to increase the diversity of the population for BA so as to avoid trapping into local optima, a main improvement of adding the pitch adjustment operation in HS, serving as a mutation operator, is made to the BA, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, a hybrid metaheuristic algorithm, obtained by inducing the pitch adjustment operation of HS as a mutation operator into the bat algorithm and called the harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve the original BA by generating a new solution for each bat. In this way, the method can explore new search space through the mutation of the HS algorithm and exploit the population information with BA, and it can therefore avoid the trapping into local optima that occurs in BA. In the following, we present the algorithm HS/BA, which is an improvement of HS and BA.

The critical operator of HS/BA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. Firstly, poor solutions can take in many new useful features from good solutions. Secondly, the mutation operator can improve the exploration of the new search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of the BA can be fully developed.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. Described here for the first time, a main improvement of adding a mutation operator is made to the BA, comprising three minor improvements, which are made with the aim of speeding up convergence, thus making the method more practical for a wider range of applications without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency $f$ and loudness $A$ instead of the varying frequency $f_i$ and loudness $A_i^t$. As in BA, in HS/BA each bat is defined by its position $x_i^t$, velocity $v_i^t$, emission pulse rate $r_i^t$, and the fixed frequency $f$ and loudness $A$, in a $D$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$v_i^t = v_i^{t-1} + (x_i^t - x_*)\,f, \qquad x_i^t = x_i^{t-1} + v_i^t, \tag{9}$$

where $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the $n$ bats. In our experiments we set $f = 0.5$. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate $r$ to 0.6 and the loudness $A$ to 0.95 produced the best results.

The second improvement is the addition of the mutation operator, in an attempt to increase the diversity of the population so as to improve the search efficiency and speed up convergence to the optima. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally by the random walk of (7) when $\xi$ is larger than the pulse rate $r$, that is, $\xi > r$, where $\xi \in [0, 1]$ is a random real number drawn from a uniform distribution; when $\xi \le r$, we instead use the pitch adjustment operation of HS, serving as a mutation operator, to update the new solution and increase the diversity of the population, as shown in (3). Through testing on the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results, as sketched in code below.
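The branch between the two update rules can be written as follows, reusing NumPy as np and the improvise helper from the sketch in Section 2.2; the function name and the bandwidth default are ours:

```python
def hybrid_step(x_star, hm, lb, ub, loudness_mean,
                r=0.6, hmcr=0.95, par=0.1, bw=0.2, rng=None):
    """HS/BA's hybrid operator: BA random-walk local search (eq. (7)) when
    xi > r, HS pitch-adjustment mutation (eq. (3)) when xi <= r."""
    if rng is None:
        rng = np.random.default_rng()
    if rng.random() > r:     # xi > r: local search around the global best
        eps = rng.uniform(-1.0, 1.0, size=x_star.shape)
        return np.clip(x_star + eps * loudness_mean, lb, ub)
    # xi <= r: HS improvisation acts as the mutation operator
    return improvise(hm, lb, ub, hmcr, par, bw, rng)
```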

The last improvement is the addition of an elitism scheme to HS/BA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. Note that we use an elitism approach to save the properties of the bats that have the best solutions in the HS/BA process, so that even if the pitch adjustment operation ruins a corresponding bat, we have saved its solution and can revert to it if needed.
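In code, the elitism step boils down to restoring the stored best bats over the current worst ones at the end of each generation (a sketch assuming minimization and NumPy as np; KEEP = 2 follows the experimental setup of Section 4.1):

```python
def apply_elitism(pop, fitness, elite_pop, elite_fit, keep=2):
    """Overwrite the KEEP worst bats with the KEEP best bats (KEEPBAT)
    stored at the start of the generation, as in Algorithm 3."""
    worst = np.argsort(fitness)[-keep:]   # indices of the worst solutions
    pop[worst] = elite_pop[:keep]
    fitness[worst] = elite_fit[:keep]
    return pop, fitness
```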

Based on the above analyses, the framework of the harmony search/bat algorithm (HS/BA) is presented in Algorithm 3.


Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the maximum number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f for each bat in P, determined by the objective function f(x).
  Step 3: While the termination criterion is not satisfied or t < MaxGeneration do
    Sort the population of bats P from best to worst by order of quality f for each bat.
    Store the KEEP best bats as KEEPBAT.
    for i = 1 to NP (all bats) do
      v_i^t = v_i^{t−1} + (x_i^t − x*) × Q
      x_i^t = x_i^{t−1} + v_i^t
      if (rand > r) then
        x_u^t = x* + αε^t
      end if
      for j = 1 to D (all elements) do  // Mutate
        if (rand < HMCR) then
          r1 = ⌈NP × rand⌉
          x_v(j) = x_r1(j), where r1 ∈ (1, 2, ..., HMS)
          if (rand < PAR) then
            x_v(j) = x_v(j) + bw × (2 × rand − 1)
          endif
        else
          x_v(j) = x_min,j + rand × (x_max,j − x_min,j)
        endif
      endfor
      Evaluate the fitness of the offspring x_u^t, x_i^t, x_v^t.
      Select the offspring x_k^t with the best fitness among the offspring x_u^t, x_i^t, x_v^t.
      if (rand < A) then
        x_r1^t = x_k^t
      end if
      Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored.
    end for
    t = t + 1
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HS/BA.
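For readers who prefer executable code to pseudocode, the following compact Python sketch follows our reading of Algorithm 3 with the parameter values of Section 4.1 (NP = 50, MaxGen = 50, f = 0.5, A = 0.95, r = 0.6, HMCR = 0.95, PAR = 0.1, KEEP = 2); the bandwidth, the clipping to the search bounds, and the requirement that an accepted candidate also improve the bat's fitness are our own assumptions, not prescriptions from the paper:

```python
import numpy as np

def hsba(obj, lb, ub, np_bats=50, max_gen=50, f=0.5, A=0.95, r=0.6,
         hmcr=0.95, par=0.1, bw=0.2, keep=2, seed=0):
    """A minimal HS/BA sketch: BA moves hybridized with HS mutation."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    d = lb.size
    x = rng.uniform(lb, ub, (np_bats, d))       # bat positions / harmony memory
    v = np.zeros((np_bats, d))
    fit = np.apply_along_axis(obj, 1, x)
    for t in range(1, max_gen + 1):
        order = np.argsort(fit)                 # sort from best to worst
        x, v, fit = x[order], v[order], fit[order]
        elite_x, elite_f = x[:keep].copy(), fit[:keep].copy()
        x_star = x[0]
        for i in range(np_bats):
            v[i] = v[i] + (x[i] - x_star) * f   # eq. (9), fixed frequency f
            cands = [np.clip(x[i] + v[i], lb, ub)]
            if rng.random() > r:                # local search, eq. (7)
                step = rng.uniform(-1, 1, d) * A
                cands.append(np.clip(x_star + step, lb, ub))
            xv = np.empty(d)                    # HS pitch-adjustment mutation
            for j in range(d):
                if rng.random() < hmcr:
                    xv[j] = x[rng.integers(np_bats), j]
                    if rng.random() < par:
                        xv[j] += bw * (2 * rng.random() - 1)
                else:
                    xv[j] = lb[j] + rng.random() * (ub[j] - lb[j])
            cands.append(np.clip(xv, lb, ub))
            fs = [obj(c) for c in cands]
            k = int(np.argmin(fs))              # best offspring x_k^t
            if rng.random() < A and fs[k] < fit[i]:
                x[i], fit[i] = cands[k], fs[k]  # accept with loudness A
        worst = np.argsort(fit)[-keep:]         # elitism: restore stored bests
        x[worst], fit[worst] = elite_x, elite_f
    best = int(np.argmin(fit))
    return x[best], fit[best]
```

For example, hsba(lambda z: np.sum(z**2), [-5.12]*20, [5.12]*20) runs the sketch on the F13 Sphere benchmark of Section 4.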

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HS/BA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HS/BA. In order to explore the benefits of HS/BA, in this subsection we compare its performance on the global numerical optimization problem with nine other population-based optimization methods, which are ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems that is based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a new, powerful, and efficient evolutionary algorithm developed for global optimization, inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but excellent optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolutionary strategy) [40] is an algorithm that generally assigns equal importance to mutation and recombination and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm


intelligence algorithm, based on the swarm behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness $A = 0.95$, pulse rate $r = 0.6$, scaling factor $\varepsilon = 0.1$, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value $\tau_0 = 10^{-6}$, pheromone update constant $Q = 20$, exploration constant $q_0 = 1$, global pheromone decay rate $\rho_g = 0.9$, local pheromone decay rate $\rho_l = 0.5$, pheromone sensitivity $s = 1$, and visibility sensitivity $\beta = 5$. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor $F = 0.5$ and a crossover constant CR = 0.5. For ES: the number of offspring $\lambda = 10$ produced in each generation, and standard deviation $\sigma = 1$ for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO: an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions based on mathematical functions can be applied as objective functions to perform such tests. The properties of these benchmark functions can be easily obtained from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e., $D = 20$).
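As an example of how these objectives are evaluated, the following sketch implements F01 (Ackley) exactly as defined in Table 1, with the $D = 20$ setting used in the experiments:

```python
import numpy as np

def ackley(x):
    """F01 Ackley (Table 1); global minimum f(0) = 0 on [-32.768, 32.768]^n."""
    n = x.size
    return (20 + np.e
            - 20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n))

print(ackley(np.zeros(20)))   # ~0.0 at the global optimum
```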

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of best solutions in the search space: unimodal benchmark functions have only one optimum, the global optimum, while multimodal benchmark functions have at least two optima in their search space, indicating that they have one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size NP = 50, the elitism parameter Keep = 2, and the maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to obtain representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the minima found by each algorithm, averaged over 100 Monte Carlo runs; Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is marked by the normalized value 1.00. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.
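The normalization used in Tables 3 and 4 simply rescales each row so that its minimum becomes 1.00; as a sketch (the raw minima below are made-up placeholders, not the paper's data):

```python
import numpy as np

# One row of raw objective-function minima, one entry per algorithm.
raw = np.array([8.1, 11.7, 4.0, 7.1, 11.9, 9.5, 12.2, 3.8, 9.3, 3.5])
print(np.round(raw / raw.min(), 2))   # the best algorithm reads 1.00
```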

From Table 3, we see that, on average, HS/BA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06-F11, and F13-F14). ACO is the second most effective, performing the best on the benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on the benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HS/BA performs the best on thirteen of the fourteen benchmarks (F02-F14), while BBO performs the best at finding the objective function minimum on the only remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant ($P = 1.66 \times 10^{-17}$ and $P = 7.25 \times 10^{-17}$, resp.) at the 5% significance level.
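A sketch of that significance check with SciPy (the per-benchmark result vectors here are random placeholders, not the paper's measurements):

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
# One vector of 14 per-benchmark minima for each of the 10 algorithms.
results = [rng.random(14) * (k + 1) for k in range(10)]
stat, p = friedmanchisquare(*results)
print(stat, p)   # differences are significant at the 5% level when p < 0.05
```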

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section; the results are shown in the last row of Table 3. BA was the quickest optimization method, and HS/BA was the third fastest of the ten algorithms. However, it should be noted that, in the vast majority of real-world applications, it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in Figures 1-14, which represent the optimization process. The values shown in Figures 1-14 are the mean objective function optima achieved from 100 Monte Carlo simulations; these are the true objective function values, not normalized.

Figure 1 shows the results obtained by the ten methods on the F01 Ackley function. From Figure 1, we can clearly draw the conclusion that BBO is significantly superior to all the other algorithms, including HS/BA, during the process of optimization, while HS/BA performs the second best on this multimodal benchmark function. Here, all the other algorithms show almost the same starting point, and HS/BA has a faster convergence rate than the other algorithms, although it is outperformed by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to HS/BA's. Obviously, BBO, HS/BA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HS/BA outperforms all the other methods during the whole progress of optimization (except during generations 20-33, where BBO and HS/BA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HS/BA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HS/BA shows a faster convergence rate initially than ACO; however, it is outperformed


Table 1: Benchmark functions.

No. | Name | Definition | Source
F01 | Ackley | $f(x) = 20 + e - 20\,e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n} x_i^2}} - e^{(1/n)\sum_{i=1}^{n}\cos(2\pi x_i)}$ | [3]
F02 | Fletcher-Powell | $f(x) = \sum_{i=1}^{n} (A_i - B_i)^2$, with $A_i = \sum_{j=1}^{n} (a_{ij}\sin\alpha_j + b_{ij}\cos\alpha_j)$ and $B_i = \sum_{j=1}^{n} (a_{ij}\sin x_j + b_{ij}\cos x_j)$ | [4]
F03 | Griewank | $f(x) = \sum_{i=1}^{n} x_i^2/4000 - \prod_{i=1}^{n}\cos(x_i/\sqrt{i}) + 1$ | [5]
F04 | Penalty 1 | $f(x) = \frac{\pi}{30}\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2[1 + 10\sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, with $y_i = 1 + 0.25(x_i + 1)$ | [6]
F05 | Penalty 2 | $f(x) = 0.1\{\sin^2(3\pi x_1) + \sum_{i=1}^{n-1}(x_i - 1)^2[1 + \sin^2(3\pi x_{i+1})] + (x_n - 1)^2[1 + \sin^2(2\pi x_n)]\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | [6]
F06 | Quartic with noise | $f(x) = \sum_{i=1}^{n} (i \cdot x_i^4 + U(0, 1))$ | [7]
F07 | Rastrigin | $f(x) = 10n + \sum_{i=1}^{n} (x_i^2 - 10\cos(2\pi x_i))$ | [8]
F08 | Rosenbrock | $f(x) = \sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | [7]
F09 | Schwefel 2.26 | $f(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2})$ | [9]
F10 | Schwefel 1.2 | $f(x) = \sum_{i=1}^{n} (\sum_{j=1}^{i} x_j)^2$ | [10]
F11 | Schwefel 2.22 | $f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | [10]
F12 | Schwefel 2.21 | $f(x) = \max_i \{|x_i|,\ 1 \le i \le n\}$ | [6]
F13 | Sphere | $f(x) = \sum_{i=1}^{n} x_i^2$ | [7]
F14 | Step | $f(x) = 6n + \sum_{i=1}^{n} \lfloor x_i \rfloor$ | [7]

* In benchmark function F02, the matrix elements $a_{n \times n}, b_{n \times n} \in (-100, 100)$ and $\alpha_{n \times 1} \in (-\pi, \pi)$ are drawn from uniform distributions [4].
* In benchmark functions F04 and F05, the function $u(x_i, a, k, m)$ is given by $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$.


Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No. | Function | lb | ub | opt | Continuity | Modality
F01 | Ackley | -32.768 | 32.768 | 0 | Continuous | Multimodal
F02 | Fletcher-Powell | -π | π | 0 | Continuous | Multimodal
F03 | Griewank | -600 | 600 | 0 | Continuous | Multimodal
F04 | Penalty 1 | -50 | 50 | 0 | Continuous | Multimodal
F05 | Penalty 2 | -50 | 50 | 0 | Continuous | Multimodal
F06 | Quartic with noise | -1.28 | 1.28 | 1 | Continuous | Multimodal
F07 | Rastrigin | -5.12 | 5.12 | 0 | Continuous | Multimodal
F08 | Rosenbrock | -2.048 | 2.048 | 0 | Continuous | Unimodal
F09 | Schwefel 2.26 | -512 | 512 | 0 | Continuous | Multimodal
F10 | Schwefel 1.2 | -100 | 100 | 0 | Continuous | Unimodal
F11 | Schwefel 2.22 | -10 | 10 | 0 | Continuous | Unimodal
F12 | Schwefel 2.21 | -100 | 100 | 0 | Continuous | Unimodal
F13 | Sphere | -5.12 | 5.12 | 0 | Continuous | Unimodal
F14 | Step | -5.12 | 5.12 | 0 | Discontinuous | Unimodal

Table 3: Mean normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

| ACO | BA | BBO | DE | ES | GA | HS | HS/BA | PSO | SGA
F01 | 2.31 | 3.33 | 1.15 | 2.02 | 3.38 | 2.72 | 3.47 | 1.09 | 2.66 | 1.00
F02 | 24.58 | 25.82 | 1.58 | 8.94 | 24.35 | 5.45 | 15.69 | 1.00 | 13.96 | 1.33
F03 | 3.16 | 60.72 | 1.93 | 5.44 | 23.85 | 3.22 | 77.22 | 1.00 | 25.77 | 1.42
F04 | 1.00 | 3.0E38 | 4.0E32 | 5.6E33 | 2.7E38 | 3.1E32 | 1.4E39 | 2.3E32 | 4.1E36 | 9.6E31
F05 | 1.00 | 1.1E8 | 299.42 | 1.5E6 | 4.6E8 | 5.4E3 | 4.1E8 | 215.51 | 5.5E7 | 111.10
F06 | 489.01 | 6.8E3 | 35.32 | 308.29 | 1.8E4 | 274.83 | 1.5E4 | 1.00 | 2.5E3 | 10.09
F07 | 8.09 | 11.55 | 1.28 | 6.56 | 11.87 | 6.17 | 10.22 | 1.00 | 8.44 | 1.80
F08 | 42.25 | 29.01 | 2.29 | 7.59 | 59.99 | 9.05 | 47.85 | 1.00 | 12.04 | 2.15
F09 | 3.17 | 20.26 | 1.99 | 13.58 | 13.33 | 1.81 | 19.92 | 1.00 | 17.61 | 1.15
F10 | 1.75 | 3.73 | 1.38 | 2.95 | 4.93 | 1.25 | 4.22 | 1.00 | 2.48 | 1.48
F11 | 1.05 | 19.70 | 1.83 | 7.14 | 23.12 | 11.13 | 19.45 | 1.00 | 13.22 | 2.46
F12 | 1.86 | 4.03 | 1.00 | 2.99 | 3.91 | 1.92 | 3.74 | 1.38 | 2.38 | 1.12
F13 | 98.30 | 150.84 | 3.80 | 19.03 | 226.52 | 47.74 | 182.32 | 1.00 | 72.91 | 4.02
F14 | 7.73 | 120.48 | 3.93 | 13.31 | 102.56 | 11.53 | 146.55 | 1.00 | 63.44 | 3.28
Time | 2.74 | 1.00 | 1.32 | 1.64 | 1.67 | 1.79 | 2.33 | 1.43 | 2.03 | 1.76

* The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm but the average minima found by each algorithm.

by ACO after 5 generations. As for the other algorithms, although slower, BBO, GA, and SGA eventually find global minima close to HS/BA's, while BA, DE, ES, HS, and PSO fail to find the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4, it is obvious that HS/BA outperforms all the other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty 2 function. For this multimodal function, similar to the F04 Penalty 1 function shown in Figure 4, HS/BA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HS/BA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HS/BA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HS/BA gets closer and closer to the minimum, while PSO becomes premature and is trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage


Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

| ACO | BA | BBO | DE | ES | GA | HS | HS/BA | PSO | SGA
F01 | 1.85 | 2.31 | 1.00 | 1.43 | 2.25 | 2.03 | 2.30 | 1.05 | 1.91 | 1.04
F02 | 10.42 | 14.48 | 1.09 | 4.22 | 10.39 | 4.03 | 9.91 | 1.00 | 8.28 | 1.33
F03 | 2.73 | 46.22 | 1.87 | 4.47 | 21.38 | 8.02 | 43.24 | 1.00 | 16.75 | 1.76
F04 | 4.4E6 | 5.0E6 | 1.2E3 | 1.9E4 | 2.2E6 | 3.0E4 | 4.2E6 | 1.00 | 37.55 | 3.22
F05 | 3.0E4 | 3.2E4 | 51.01 | 386.33 | 1.6E4 | 884.46 | 2.6E4 | 1.00 | 41.13 | 4.57
F06 | 58.51 | 992.55 | 6.50 | 24.69 | 808.48 | 59.24 | 746.91 | 1.00 | 189.71 | 2.02
F07 | 5.71 | 8.53 | 1.25 | 5.13 | 7.94 | 5.23 | 7.47 | 1.00 | 6.00 | 1.68
F08 | 26.42 | 25.03 | 1.48 | 3.70 | 33.52 | 6.16 | 21.50 | 1.00 | 7.75 | 1.45
F09 | 2.43 | 8.66 | 1.28 | 4.90 | 5.93 | 2.09 | 7.28 | 1.00 | 7.40 | 1.42
F10 | 1.89 | 4.94 | 1.18 | 2.66 | 3.00 | 2.08 | 2.76 | 1.00 | 2.01 | 1.74
F11 | 7.74 | 12.61 | 1.21 | 3.36 | 12.14 | 6.01 | 9.60 | 1.00 | 7.81 | 1.62
F12 | 1.33 | 2.33 | 1.44 | 1.69 | 2.08 | 1.74 | 2.11 | 1.00 | 1.70 | 1.21
F13 | 28.04 | 56.57 | 2.17 | 5.54 | 60.57 | 19.71 | 52.86 | 1.00 | 20.98 | 2.28
F14 | 4.28 | 54.02 | 1.97 | 4.85 | 33.61 | 10.60 | 49.11 | 1.00 | 19.72 | 1.85

* The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function (benchmark function value versus number of generations; plot omitted).

to succeed on this benchmark function within the maximum number of generations. At last, BBO, DE, and SGA converge to values very close to HS/BA's.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches. Looking carefully at Figure 7, we can approximately divide all the algorithms into three groups: one group, including BBO, HS/BA, and SGA, performing the best; a second group, including ACO, DE,

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function (benchmark function value versus number of generations; plot omitted).

GA, and PSO, performing the second best; and a third group, including BA, ES, and HS, performing the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HS/BA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. We also see that, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HS/BA is outperformed by BBO between generations 10-30, it has a stable convergence rate at finding the global minimum


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function (benchmark function value versus number of generations; plot omitted).

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function (benchmark function value versus number of generations; plot omitted).

and significantly outperforms all the other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HS/BA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function (benchmark function value versus number of generations; plot omitted).

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function (benchmark function value versus number of generations; plot omitted).

divided into three groups: one group, including BBO and HS/BA, performing the best; a second group, including ACO, GA, PSO, and SGA, performing the second best; and a third group, including DE, ES, and HS, performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 40 generations. At last, HS/BA reaches the optimal solution, significantly superior to the other algorithms. BBO is only inferior

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function (benchmark function value versus number of generations; plot omitted).

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function (benchmark function value versus number of generations; plot omitted).

to HS/BA and performs the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function (benchmark function value versus number of generations; plot omitted).

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function (benchmark function value versus number of generations; plot omitted).

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HS/BA has the fastest convergence rate at finding the global minimum and outperforms all the other approaches. Looking carefully at Figure 13 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HS/BA.

Figure 14 shows the results for the F14 Step function. Apparently, HS/BA shows the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches. Looking carefully at Figure 14 for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function (benchmark function value versus number of generations; plot omitted).

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function (benchmark function value versus number of generations; plot omitted).

we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

From the above analyses of Figures 1-14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm HS/BA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are only inferior to HS/BA; ACO, BBO, and SGA perform better than HS/BA on the benchmarks F04-F05, the benchmark F12, and the benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function (benchmark function value versus number of generations; plot omitted).

Figure 14: Comparison of the performance of the different methods for the F14 Step function (benchmark function value versus number of generations; plot omitted).

convergence rate initially, while later it converges more and more slowly toward the true objective function value. At last, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem; the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HS/BA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the best results found by the HS/BA algorithm after 100 Monte Carlo simulations.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.66 | 1.33 | 1.55 | 1.40 | 1.00 | 1.33 | 1.21 | 1.22 | 1.16 | 1.13 | 1.14
F02 | 1.25E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.41 | 11.15 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F04 | 2.73E4 | 1.00 | 1.25E4 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.00 | 1.00
F05 | 5.03E5 | 5.30 | 1.33 | 1.68E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 1.00 | 2.56 | 5.47 | 4.73 | 17.81 | 9.90 | 2.89 | 6.74 | 9.90 | 2.60 | 5.57
F07 | 10.14 | 14.92 | 4.35 | 1.10 | 1.10 | 15.37 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 38.49 | 1.08 | 14.06 | 1.08 | 1.08 | 1.08 | 99.01 | 1.01 | 1.00 | 1.00 | 1.00
F09 | 285.18 | 404.58 | 1.38 | 4.74 | 1.19 | 1.19 | 1.19 | 367.91 | 1.01 | 1.00 | 1.00
F10 | 665.22 | 4.68 | 1.69E3 | 15.27 | 1.18 | 1.18 | 1.18 | 1.18 | 1.21 | 1.00 | 1.00
F11 | 20.03 | 1.00 | 9.96 | 11.54 | 39.22 | 9.96 | 9.96 | 9.96 | 9.96 | 38.97 | 8.49
F12 | 10.32 | 1.00 | 13.22 | 3.23 | 14.69 | 2.79 | 2.79 | 2.79 | 2.79 | 2.79 | 19.31
F13 | 1.46 | 4.15 | 2.84 | 4.47 | 1.15 | 1.56 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F14 | 527.52 | 13.57 | 3.95 | 1.01 | 1.15 | 12.91 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Total | 1 | 4 | 2 | 2 | 3 | 3 | 5 | 6 | 7 | 10 | 10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, the pulse rate r, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HS/BA, we carried out this experiment on the benchmark problems with different parameters. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5-12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HS/BA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HS/BA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HS/BA algorithm, respectively. In each table, the last row is the total number of functions on which HS/BA performs the best with the given parameter setting.

4.2.1. Loudness A. Tables 5 and 6 record the results for the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can obviously be seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs only slightly differently; that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for the benchmark functions F01, F06, F11, and F12, HS/BA performs better with smaller A (<0.5); (iii) however, for the functions F05, F07–F10, F13, and F14, HS/BA performs better with bigger A (>0.5). In sum, HS/BA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results for the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HS/BA performs significantly better with bigger r (>0.5) than with smaller r (<0.5), and that HS/BA performs the best when r is equal or very close to 0.6. So we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results for the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0, fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HS/BA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and that most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results for the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0, fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HS/BA vary little with increasing PAR, and that HS/BA reaches its optimum/minimum on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments, as sketched below.
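To make the tuning procedure above concrete, the following is a minimal Python sketch of such a parameter sweep; the helper run_hsba stands in for an HS/BA implementation and is not from the paper, and the fixed parameter values mirror the settings used in the experiments above.

import statistics

# Hypothetical sweep of the loudness A over 100 Monte Carlo runs per value;
# run_hsba(benchmark, **params) is assumed to return the minimum found in one run.
def sweep_loudness(run_hsba, benchmark, runs=100):
    results = {}
    for A in [i / 10 for i in range(11)]:          # A = 0, 0.1, ..., 1.0
        minima = [run_hsba(benchmark, A=A, r=0.6, HMCR=0.95, PAR=0.1)
                  for _ in range(runs)]
        results[A] = (min(minima), statistics.mean(minima))   # best, mean
    return results

The best and mean entries of the returned dictionary correspond to the quantities reported in the best-result tables (e.g., Table 5) and the mean-result tables (e.g., Table 6), respectively.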

4.3. Discussion. In the HS/BA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR),


Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

A      0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    1.52    1.00    1.46    1.17    1.08    1.37    1.88    1.65    1.67    1.79    1.48
F03    6.69    10.41   1.06    1.01    1.03    1.03    1.01    1.02    1.05    1.00    1.00
F04    201.31  1.26    4.10    4.21    3.33    2.72    1.00    5.19    2.91    1.00    3.11
F05    591.22  16.82   4.25    5.87    1.01    4.14    24.04   1.00    3.68    7.33    9.00
F06    1.00    4.90E4  637.99  258.71  4.95E4  2.18    2.17    2.17    2.18    2.17    2.17
F07    8.57    2.21E6  6.43    272.34  110.48  2.11E4  1.02    1.01    1.00    1.00    1.00
F08    49.80   1.14E4  1.72E4  161.60  224.59  91.19   1.74E4  1.03    1.11    1.00    1.00
F09    82.37   1.73E6  1.06    3.14    74.45   103.10  42.26   7.94E3  1.00    1.37    1.29
F10    90.31   1.45E3  2.45E5  2.32E3  23.34   22.34   31.38   12.89   2.34E3  1.40    1.00
F11    4.98    1.00    3.15E4  2.33    14.41   520.23  443.65  616.26  249.91  4.78E4  2.14
F12    3.69    2.10E4  4.57E4  1.00    2.12E4  266.35  233.13  198.80  276.14  112.01  2.14E4
F13    1.90    8.25    4.48E6  2.14E6  1.00    6.15    254.51  222.75  189.96  263.86  107.01
F14    66.91   1.95E5  1.17E4  1.24E3  21.04   1.86E3  29.40   23.92   1.00    18.35   24.75
Total  1       2       1       2       2       1       2       2       4       5       5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

r      0       0.1     0.2     0.3     0.4     0.5    0.6    0.7     0.8     0.9    1.0
F01    1.84    1.84    1.84    4.84    1.59    1.71   1.00   1.34    1.09    1.16   1.01
F02    9.44E3  1.00    1.00    1.00    1.00    1.00   1.00   1.00    1.00    1.00   1.00
F03    3.73    5.46    1.86    1.86    1.86    1.86   1.00   1.72    1.21    1.61   1.01
F04    11.42   1.11    24.29   1.23    1.26    1.00   1.29   1.29    1.29    1.29   1.29
F05    1.77E5  2.30    1.15    1.09E4  1.00    1.00   1.00   1.00    1.00    1.00   1.00
F06    62.44   29.96   48.58   15.60   26.34   7.80   1.82   1.00    35.20   2.39   1.55
F07    8.31    4.48    2.58    1.29    1.29    3.49   1.00   1.00    1.00    1.00   1.00
F08    9.61    1.18    4.20    1.37    1.37    1.37   6.95   1.01    1.01    1.00   1.00
F09    373.94  444.22  1.68    3.35    1.68    1.68   1.00   291.98  1.01    1.01   1.01
F10    386.93  3.16    13.97   4.63    1.58    1.58   1.00   1.58    309.31  1.58   1.01
F11    42.55   1.00    18.29   21.35   20.18   11.11  19.04  21.35   15.81   18.76  11.55
F12    114.40  1.00    141.84  44.48   125.47  44.48  44.48  44.48   44.48   44.48  69.43
F13    6.39    6.37    6.01    2.96    3.02    1.00   1.52   1.39    1.00    2.57   2.49
F14    120.52  4.04    2.33    1.00    1.16    3.40   1.00   1.00    1.16    1.16   1.16
Total  0       3       1       2       2       4      8      5       4       4      4

and the pitch adjustment rate (PAR). The appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions that have been considered, the HS/BA has been demonstrated to perform better than, or at least equal to, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HS/BA performing significantly better on some functions. The HS/BA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search via the harmony search algorithm and the global search via the bat algorithm concurrently. A similar behavior may be achieved in PSO by using multiple swarms formed from an initial particle population [44]. However, HS/BA's advantages include being simple and easy to implement and having only four parameters to regulate. The work carried out here demonstrates the HS/BA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way of verifying the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning


Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

r      0      0.1     0.2    0.3     0.4     0.5     0.6     0.7    0.8     0.9   1.0
F01    1.84   1.99    1.86   1.94    1.59    1.00    1.42    1.34   1.09    1.16  1.71
F02    3.31   5.38    4.57   3.08    3.82    1.14    1.00    2.89   2.79    4.02  3.11
F03    3.73   5.46    5.33   2.29    3.36    2.41    1.00    1.72   1.21    1.61  1.36
F04    11.42  1.11    24.29  1.49E4  1.13E3  3.94E3  2.78E4  19.66  175.22  1.14  1.00
F05    16.09  128.62  32.69  24.46   64.40   12.77   29.59   76.17  4.56    1.00  4.46
F06    62.44  26.96   48.58  15.60   26.34   7.80    1.82    1.00   35.20   2.39  1.55
F07    3.30   1.78    1.34   1.03    1.23    1.39    1.55    1.00   1.19    1.36  3.49
F08    4.88   3.93    3.94   1.94    2.19    3.83    3.53    4.33   1.00    2.27  3.14
F09    1.39   1.66    1.14   1.00    1.21    1.15    1.15    1.09   1.08    1.04  1.09
F10    7.21   9.94    8.75   3.60    8.46    6.11    4.83    4.14   5.76    1.00  2.71
F11    3.82   4.23    3.73   2.05    1.81    1.00    1.71    2.20   1.42    1.68  1.89
F12    1.64   2.17    2.04   1.47    1.80    1.86    1.00    1.21   1.13    1.34  1.56
F13    6.39   6.37    6.01   2.96    3.02    1.90    1.52    1.39   1.00    2.57  2.49
F14    5.74   15.43   7.42   6.82    6.44    3.17    1.00    4.85   2.21    1.74  2.09
Total  0      0       0      1       0       2       4       2      2       2     1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9   1.0
F01    1.64    1.64    1.64    1.64    1.64    1.64    1.64    1.51    1.28    1.00  1.63
F02    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00  1.00
F03    19.82   26.49   3.71    3.71    3.71    3.71    3.71    3.71    3.71    2.06  1.00
F04    1.26E5  1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00  1.00
F05    6.07E5  5.34    1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00  1.00
F06    108.92  246.31  365.04  345.40  338.57  234.65  143.49  45.28   23.30   1.00  25.75
F07    9.22    14.28   5.34    1.00    1.00    9.30    1.00    1.00    1.00    1.00  1.00
F08    18.10   1.08    7.72    1.08    1.08    1.08    35.94   1.00    1.00    1.00  1.00
F09    280.04  391.61  1.27    6.81    1.27    1.27    1.27    227.95  1.00    1.00  1.00
F10    551.65  8.76    1.76E3  11.70   1.64    1.64    1.64    1.64    274.48  1.00  1.00
F11    14.67   1.00    6.18    6.18    13.99   6.18    6.18    6.18    4.40    2.69  6.16
F12    7.68    1.00    11.10   2.73    10.21   2.73    2.73    2.73    2.73    2.73  4.73
F13    7.72    26.75   15.90   17.76   6.12    10.40   6.12    5.95    2.55    1.00  2.25
F14    537.16  14.28   5.34    1.00    1.00    7.13    1.00    1.00    1.00    1.00  1.00
Total  0       4       2       4       5       3       5       6       7       11    9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, or looked at how many generations it takes to reach a certain function value, or if we changed the population size. In spite of these caveats, the benchmark results shown here are promising for HS/BA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.


Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.00    1.01
F02    747.56  7.87    5.25    3.43    5.09    7.10    3.07    1.87    1.60    1.00    1.01
F03    9.13    3.12E4  1.09    1.07    1.05    1.06    1.05    1.02    1.01    1.01    1.00
F04    2.10E6  1.00E4  3.95E4  9.48E3  5.97E3  2.95E3  1.51E3  1.50E3  46.94   1.00    2.37
F05    3.24E6  3.27E4  2.09E4  4.14E4  7.13E3  1.54E3  8.93E3  1.53E3  629.48  1.00    203.73
F06    1.00    5.94E4  422.04  632.15  6.00E4  1.92    1.91    1.91    1.91    1.91    1.91
F07    10.63   2.07E6  9.00    216.72  324.55  3.07E4  1.04    1.03    1.02    1.01    1.00
F08    59.50   9.88E3  3.04E4  141.80  217.01  324.68  3.07E4  1.07    1.07    1.03    1.00
F09    226.87  4.95E6  3.08    8.91    114.22  173.60  259.41  2.45E4  1.45    1.00    1.71
F10    257.37  2.83E4  9.35E5  1.37E4  97.09   66.56   99.25   149.40  1.39E4  1.00    2.73
F11    6.07    1.00    1.83E4  2.02    16.61   390.40  262.99  403.00  603.61  5.72E4  1.84
F12    3.00    2.79E4  3.60E4  1.00    2.82E4  270.95  194.14  130.77  200.40  300.16  2.84E4
F13    2.32    9.66    5.61E6  1.89E6  1.00    8.15    267.78  191.86  129.23  198.05  296.65
F14    184.66  3.84E5  1.17E4  1.85E3  1.00    5.71E3  25.11   55.39   39.60   26.57   41.49
Total  1       1       0       1       2       0       0       0       0       6       3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

PAR    0       0.1    0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    3.91E3  1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F03    1.00    4.25   1.51    2.20    2.20    2.12    2.20    1.99    2.20    1.28    1.80
F04    1.00    2.55   65.54   2.55    2.55    2.55    2.55    2.55    2.55    2.55    2.55
F05    2.52    1.00   1.71    1.69E3  1.71    1.71    1.71    1.71    1.71    1.71    1.71
F06    1.00    59.87  34.18   22.85   46.52   23.44   43.54   44.84   36.88   18.99   8.12
F07    8.07    1.00   1.48    2.55    2.55    13.06   2.55    2.55    2.55    2.55    2.55
F08    5.43    1.00   1.92    1.00    1.00    1.00    11.45   1.00    1.00    1.00    1.00
F09    76.72   2.52   1.17    1.00    1.71    1.71    1.71    155.56  1.71    1.71    1.71
F10    929.10  1.48   1.00    4.91    2.55    2.55    2.55    2.22    2.35E3  2.55    2.55
F11    3.18E3  1.00   5.82E3  3.99E3  3.39E3  5.82E3  4.91E3  5.82E3  5.82E3  9.58E3  5.82E3
F12    305.06  1.00   537.28  97.34   187.58  97.34   97.34   97.34   97.34   97.34   398.27
F13    1.92    4.22   5.87    10.07   12.98   4.66    1.48    4.01    3.17    1.00    4.09
F14    88.12   1.00   1.48    2.55    2.55    4.91    2.55    2.55    2.55    2.55    2.55
Total  4       8      3       4       3       3       2       3       3       4       3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HS/BA method for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, in which an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only

if it has better fitness. This selection scheme is rather greedy, and it often outperforms the original HS and BA. The HS/BA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HS/BA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HS/BA makes good use of the information in past solutions to generate better-quality solutions frequently, when compared to the other population-based


Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

PAR    0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    4.07    1.00    1.00    4.52    2.77    2.57    3.41    5.36    3.69    4.89    1.03
F03    1.00    1.27E4  1.34    1.35    1.35    1.35    1.34    1.35    1.34    1.34    1.34
F04    1.15    81.27   9.39E3  1.00    1.01    1.01    1.29    1.00    1.00    1.01    1.01
F05    345.73  50.61   86.19   9.07E3  25.01   1.08    1.01    1.91    3.33    1.00    1.18
F06    1.00    5.42E6  4.27E4  4.74E4  5.48E6  577.88  577.88  577.88  577.88  577.87  577.87
F07    4.86    1.53    1.00    95.43   105.96  1.22E4  1.34    1.36    1.32    1.33    1.35
F08    12.69   76.00   8.76E3  17.03   69.13   76.72   8.85E3  1.10    1.05    1.04    1.00
F09    63.45   230.47  1.82    1.67    12.78   48.60   53.49   6.09E3  1.11    1.30    1.00
F10    133.55  12.51   1.26    1.97E3  11.48   4.64    16.45   18.93   1.99E3  1.00    1.88
F11    63.02   1.00    6.39E3  79.44   59.24   3.96E3  1.42E3  5.81E3  6.45E3  7.45E5  79.81
F12    3.90    8.81E3  8.90E3  1.00    8.90E3  44.40   47.78   17.25   70.11   77.84   8.99E3
F13    1.00    21.66   2.07E3  6.69    5.80    4.30    271.67  282.46  105.39  429.26  476.62
F14    46.14   1.00    36.10   55.93   1.22    6.40E3  42.15   32.89   34.81   12.72   51.29
Total  4       4       3       3       1       1       1       2       2       3       3

optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that the HS/BA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45], and on real-world problems. Moreover, we will compare HS/BA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules into HS/BA for constrained optimization problems, such as the constrained real-parameter optimization of the CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HS/BA to practical engineering optimization problems, and, obviously, HS/BA can become a fascinating method for real-world engineering optimization problems; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.



bats with varying pulse rates of emission and loudness. The primary purpose of a bat's echolocation is to act as a signal system to sense distance and hunt for food/prey. A comprehensive review of swarm intelligence involving BA was performed by Parpinelli and Lopes [32]. Furthermore, Tsai et al. proposed an evolving bat algorithm (EBA) to improve the performance of BA with better efficiency [33].

First proposed by Geem et al. in 2001, harmony search (HS) [26] is a new metaheuristic approach for minimizing possibly nondifferentiable and nonlinear functions in continuous space. HS is inspired by the behavior of a musician's improvisation process, where every musician attempts to improve his or her tune so as to create optimal harmony in a real-world musical performance. The HS algorithm originates in the similarity between engineering optimization and music improvisation: engineers search for a global optimal solution as determined by an objective function, just as musicians strive to find aesthetic harmony as determined by aesthetics. In music improvisation, each musician chooses any pitch within the feasible range, together producing one harmony vector. If all the pitches produce a good harmony, that experience is reserved in each variable's memory, and the possibility of producing a better solution next time is also increased. Furthermore, this new approach requires few control variables, which makes HS easy to implement, more robust, and very appropriate for parallel computation.

BA is a powerful algorithm in exploitation (i.e., local search), but at times it may get trapped in local optima, so that it cannot perform the global search well. For the bat algorithm, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA so as to avoid trapping in local optima, a main improvement of adding the pitch adjustment operation of HS, serving as a mutation operator, is made to the BA, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. That is to say, we combine the two approaches to propose a new hybrid metaheuristic algorithm according to the principles of HS and BA, and this improved BA method is then used to search for the optimal objective function value. The proposed approach is evaluated on fourteen standard benchmark functions that have previously been applied to verify optimization algorithms on continuous optimization problems. Experimental results show that the HS/BA performs more efficiently and accurately than the basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA.

The structure of this paper is organized as follows. Section 2 briefly describes the global numerical optimization problem, the HS algorithm, and the basic BA. Our proposed approach HS/BA is presented in detail in Section 3. Subsequently, our method is evaluated through fourteen benchmark functions in Section 4, in which the HS/BA is also compared with BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA. Finally, Section 5 consists of the conclusion and proposals for future work.

2. Preliminary

To begin with, in this section we provide a brief background on the optimization problem, harmony search (HS), and the bat algorithm (BA).

2.1. Optimization Problem. In computer science, mathematics, management science, and control theory, optimization (also called mathematical optimization or mathematical programming) means the selection of an optimal solution from some set of feasible alternatives. In general, an optimization problem consists of minimizing or maximizing a function by systematically selecting input values from a given feasible set and computing the value of the function. More generally, optimization consists of finding the optimal values of some objective function within a given domain, including a variety of different types of domains and different types of objective functions [34].

A global optimization problem can be described as follows.

Given: a function f : D → R from some set D to the real numbers.
Sought: a parameter x_0 in D such that f(x_0) ≤ f(x) for all x in D ("minimization"), or such that f(x_0) ≥ f(x) for all x in D ("maximization").

Such a formulation is called a numerical optimization problem. Many theoretical and practical problems can be modeled in this general framework. In general, D is some subset of the Euclidean space R^n, often specified by a group of constraints, equalities, or inequalities that the elements of D have to satisfy. The domain D of f is called the search space, while the elements of D are called feasible solutions or candidate solutions. In general, the function f is called an objective function, cost function (minimization), or utility function (maximization). An optimal solution is a feasible solution that is the extreme (minimum or maximum) of the objective function.

Conventionally, the standard formulation of an optimization problem is stated in terms of minimization. In general, unless both the feasible region and the objective function are convex in a minimization problem, there may be more than one local minimum. A local minimum x* is defined as a point for which the following expression holds:

f(x*) ≤ f(x).   (1)

More details about the optimization problem can be found in [35].

The branch of numerical analysis and applied mathematics that investigates deterministic algorithms able to guarantee convergence in finite time to the true optimal solution of a nonconvex problem is called global numerical optimization. A variety of algorithms have been proposed to solve nonconvex problems. Among them, heuristic algorithms can evaluate approximate solutions to some optimization problems, as described in the introduction [36].


2.2. Harmony Search. First developed by Geem et al. in 2001, harmony search (HS) [26] is a relatively new metaheuristic optimization algorithm, based on the natural musical performance process that occurs when a musician searches for an optimal state of harmony. The optimization operators of the HS algorithm are specified as the harmony memory (HM), which keeps the solution vectors, all of which lie within the search space, as shown in (2); the harmony memory size HMS, which represents the number of solution vectors kept in the HM; the harmony memory consideration rate (HMCR); the pitch adjustment rate (PAR); and the pitch adjustment bandwidth (bw):

       | x_1^1     x_2^1     ...  x_D^1      fitness(x^1)   |
HM =   | x_1^2     x_2^2     ...  x_D^2      fitness(x^2)   |
       | ...       ...       ...  ...        ...            |
       | x_1^HMS   x_2^HMS   ...  x_D^HMS    fitness(x^HMS) |     (2)

In more detail, we can explain the HS algorithm with the help of the improvisation process of a music player. When a player is improvising, he or she has three possible choices: (1) play any well-known piece of music (a series of pitches in harmony) exactly from his or her memory, governed by the harmony memory consideration rate (HMCR); (2) play something similar to a known piece in the player's memory (thus adjusting the pitch slightly); or (3) play a totally new or random pitch from the feasible range. If these three options are formalized for optimization, we have three corresponding components: employment of harmony memory, pitch adjusting, and randomization. Similarly, when each decision variable selects one value in the HS algorithm, it can make use of one of the above three rules anywhere in the HS procedure. If a new harmony vector is better than the worst harmony vector in the HM, the new harmony vector takes the place of the worst harmony vector in the HM. This procedure is repeated until a stopping criterion is satisfied.

The employment of harmony memory is significant, as it is analogous to selecting the best-fit individuals in the GA (genetic algorithm). This ensures that the best harmonies are carried over to the new harmony memory. For the purpose of using this memory more effectively, we should properly set the value of the parameter HMCR ∈ [0, 1]. If this rate is very close to 1 (too high), almost all the harmonies are drawn from the harmony memory; other harmonies are then not explored well, resulting in potentially wrong solutions. If this rate is very close to 0 (extremely low), only a few best harmonies are chosen, and the algorithm may have a slow convergence rate. Therefore, generally, HMCR = 0.7–0.95.

To adjust the pitch slightly in the second component, an appropriate approach has to be applied to adjust the frequency efficiently. In theory, the pitch can be adjusted linearly or nonlinearly, but in practice linear adjustment is used. If x_old is the current solution (or pitch), then the new solution (pitch) x_new is generated by

x_new = x_old + bw × (2ε − 1),   (3)

where ε is a random real number drawn from a uniform distribution over [0, 1], and bw is the bandwidth controlling the local range of the pitch adjustment. Actually, we can see that the pitch adjustment (3) is a random walk.

Pitch adjustment is similar to the mutation operator in GA. Also, we must appropriately set the parameter PAR to control the degree of the adjustment. If PAR is near 1 (too high), then the solution is always changing, and the algorithm may not converge at all. If it is very close to 0 (too low), then there is very little change, and the algorithm may converge prematurely. Therefore, we use PAR = 0.1–0.5 in most simulations, as usual.

For the purpose of increasing the diversity of the solutions, randomization is needed in the third component. Although adjusting the pitch has a similar role, it is confined to a certain local pitch adjustment and thus corresponds to a local search. The use of randomization can push the system further, to explore multifarious regions with high solution diversity, in order to find the global optimum. In real-world engineering applications, HS has been applied to solve many optimization problems, including function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, and vehicle routing.

The mainframe of the basic HS algorithm is described in Algorithm 1, where D is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution. From Algorithm 1, it is clear that there are only two control parameters in HS, which are HMCR and PAR.

2.3. Bat Algorithm. The bat algorithm is a novel metaheuristic swarm intelligence optimization method developed for global numerical optimization, in which the search is inspired by the social behavior of bats and the phenomenon of echolocation to sense distance.

In [28], for simplicity, the bat algorithm is based on idealizing some of the echolocation characteristics of bats, namely the following approximate or idealized rules:

(1) all bats apply echolocation to sense distance, and they always "know" the surroundings in some magical way;
(2) bats fly randomly with velocity v_i and a fixed frequency f_min at position x_i, varying the wavelength λ and loudness A_0 to hunt for prey. They can spontaneously accommodate the wavelength (or frequency) of their emitted pulses and adjust the rate of pulse emission r ∈ [0, 1], depending on the proximity of their target;
(3) although the loudness can change in many ways, it is supposed that the loudness varies from a minimum constant (positive) A_min to a large A_0.

Based on these approximations and idealizations, the basic steps of the bat algorithm (BA) can be described as shown in Algorithm 2.

In BA, each bat is defined by its position x_i^t, velocity v_i^t, frequency f_i, loudness A_i^t, and emission pulse rate r_i^t


Begin
  Step 1: Set the parameters and initialize the HM.
  Step 2: Evaluate the fitness for each individual in HM.
  Step 3: while the halting criteria is not satisfied do
            for d = 1 to D do
              if rand < HMCR then  // memory consideration
                x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
                if rand < PAR then  // pitch adjustment
                  x_new(d) = x_old(d) + bw × (2 × rand − 1)
                endif
              else  // random selection
                x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
              endif
            endfor d
            Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective).
            Update the best harmony vector found so far.
  Step 4: end while
  Step 5: Postprocess the results and visualization.
End

Algorithm 1: Harmony search algorithm.
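As an illustration, the following is a minimal Python sketch of Algorithm 1; the function names and the toy sphere objective are ours (not from the paper), and the parameter values follow the ranges suggested above.

import random

def improvise(hm, lb, ub, HMCR=0.9, PAR=0.3, bw=0.01):
    # Build one new harmony component by component: memory consideration,
    # pitch adjustment, or random selection, as in Algorithm 1.
    new = []
    for d in range(len(hm[0])):
        if random.random() < HMCR:                 # memory consideration
            value = random.choice(hm)[d]
            if random.random() < PAR:              # pitch adjustment
                value += bw * (2 * random.random() - 1)
        else:                                      # random selection
            value = lb[d] + random.random() * (ub[d] - lb[d])
        new.append(value)
    return new

def harmony_search(f, lb, ub, HMS=10, iters=1000):
    hm = [[l + random.random() * (u - l) for l, u in zip(lb, ub)]
          for _ in range(HMS)]
    for _ in range(iters):
        new = improvise(hm, lb, ub)
        worst = max(hm, key=f)                     # minimization objective
        if f(new) < f(worst):                      # replace the worst harmony
            hm[hm.index(worst)] = new
    return min(hm, key=f)

# Toy usage: minimize the 5-dimensional sphere function.
best = harmony_search(lambda x: sum(v * v for v in x), [-5.0] * 5, [5.0] * 5)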

Begin
  Step 1: Initialization. Set the generation counter t = 1. Initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define the loudness A_i, the pulse frequency Q_i, and the initial velocities v_i (i = 1, 2, ..., NP); set the pulse rate r_i.
  Step 2: while the termination criteria is not satisfied or t < MaxGeneration do
            Generate new solutions by adjusting frequency and updating velocities and locations/solutions [(4)-(6)].
            if (rand > r_i) then
              Select a solution among the best solutions.
              Generate a local solution around the selected best solution.
            end if
            Generate a new solution by flying randomly.
            if (rand < A_i and f(x_i) < f(x*)) then
              Accept the new solutions.
              Increase r_i and reduce A_i.
            end if
            Rank the bats and find the current best x*.
            t = t + 1
  Step 3: end while
  Step 4: Postprocess the results and visualization.
End

Algorithm 2: Bat algorithm.

in a D-dimensional search space. The new solutions x_i^t and velocities v_i^t at time step t are given by

f_i = f_min + (f_max − f_min) β,   (4)
v_i^t = v_i^{t−1} + (x_i^t − x*) f_i,   (5)
x_i^t = x_i^{t−1} + v_i^t,   (6)

where β ∈ [0, 1] is a random vector drawn from a uniform distribution. Here x* is the current global best location (solution), which is located after comparing all the solutions among all the n bats. Generally speaking, depending on the domain size of the problem of interest, the frequency f is assigned to f_min = 0 and f_max = 100 in practical implementations. Initially, each bat is randomly given a frequency drawn uniformly from [f_min, f_max].

For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk:

x_new = x_old + ε A^t,   (7)


where ε ∈ [−1, 1] is a random scaling factor, and A^t = ⟨A_i^t⟩ is the average loudness of all the bats at time step t.

The update of the velocities and positions of the bats is similar to the procedure in the standard PSO [21], as f_i essentially controls the pace and range of the movement of the swarming particles. To some degree, BA can be considered as a balanced combination of the standard PSO and an intensive local search controlled by the loudness and pulse rate.

Furthermore, the loudness A_i and the rate r_i of pulse emission are updated as the iterations proceed, as shown in (8):

A_i^{t+1} = α A_i^t,
r_i^{t+1} = r_i^0 [1 − exp(−γt)],   (8)

where α and γ are constants. In essence, α is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set α = γ = 0.9 in this work.
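For concreteness, the following is a minimal Python sketch of one bat's update according to (4)-(8); the variable names and the default values for f_min, f_max, and r_i^0 are ours, and the acceptance test of Algorithm 2 (accept when rand < A_i and the new solution improves on the current best) is left to the caller.

import math
import random

def bat_step(x, v, x_best, A, r, t, f_min=0.0, f_max=2.0,
             r0=0.5, alpha=0.9, gamma=0.9):
    beta = random.random()
    f = f_min + (f_max - f_min) * beta                             # (4)
    v = [vi + (xi - xb) * f for vi, xi, xb in zip(v, x, x_best)]   # (5)
    x = [xi + vi for xi, vi in zip(x, v)]                          # (6)
    if random.random() > r:            # local random walk near the best, per (7)
        eps = random.uniform(-1.0, 1.0)
        x = [xb + eps * A for xb in x_best]
    A = alpha * A                            # loudness decreases,      per (8)
    r = r0 * (1.0 - math.exp(-gamma * t))    # pulse rate increases,    per (8)
    return x, v, A, r

Here t is the iteration counter; calling bat_step once per bat per generation reproduces the flight, local search, and loudness/pulse-rate schedules described above.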

3. Our Approach: HS/BA

Based on the introduction to HS and BA in the previous section, in this section we explain how we combine the two approaches to form the proposed bat algorithm with harmony search (HS/BA), which modifies the solutions with poor fitness in order to add diversity to the population and improve the search efficiency.

In general, the standard BA algorithm is adept at exploiting the search space, but at times it may get trapped in local optima, so that it cannot perform the global search well. For BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA so as to avoid trapping in local optima, a main improvement of adding the pitch adjustment operation of HS, serving as a mutation operator, is made to the BA, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, a hybrid metaheuristic algorithm obtained by inducing the pitch adjustment operation of HS as a mutation operator into the bat algorithm, the so-called harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve the original BA by generating a new solution for each bat. In this way, the method can explore the new search space with the mutation of the HS algorithm and exploit the population information with BA, and it can therefore avoid trapping in local optima. In the following, we describe the algorithm HS/BA, which is an improvement of HS and BA.

The critical operator of HS/BA is the hybrid harmony search mutation operator, which composes the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. Firstly, poor solutions can take in many new useful features from good solutions. Secondly, the mutation operator can improve the exploration of the new search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of the BA can be fully developed.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. Described here for the first time, a main improvement of adding a mutation operator is made to the BA, consisting of three minor improvements that are made with the aim of speeding up convergence, thus making the method more practical for a wider range of applications, but without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency f and loudness A instead of the varying frequency f_i and loudness A_i^t. Similar to BA, in HS/BA each bat is defined by its position x_i^t, velocity v_i^t, emission pulse rate r_i^t, and the fixed frequency f and loudness A in a d-dimensional search space. The new solutions x_i^t and velocities v_i^t at time step t are given by

v_i^t = v_i^{t−1} + (x_i^t − x*) f,
x_i^t = x_i^{t−1} + v_i^t,   (9)

where x* is the current global best location (solution), which is located after comparing all the solutions among all the n bats. In our experiments, we set f = 0.5. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate r to 0.6 and the loudness A to 0.95 produced the best results.

The second improvement is the addition of the mutation operator in an attempt to increase the diversity of the population, improve the search efficiency, and speed up convergence to the optima. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk by (7) when ξ is larger than the pulse rate r, that is, ξ > r, where ξ ∈ [0, 1] is a random real number drawn from a uniform distribution; when ξ ≤ r, we use the pitch adjustment operation of HS, serving as a mutation operator, to update the new solution and increase the diversity of the population so as to improve the search efficiency, as shown in (3). Through testing the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results.

The last improvement is the addition of an elitism scheme to the HS/BA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. Note that we use an elitism approach to save the property of the bat that has the best solution in the HS/BA process, so that even if the pitch adjustment operation ruins its corresponding bat, we have saved it and can revert to it if needed.

Based on the above-mentioned analyses, the mainframe of the harmony search/bat algorithm (HS/BA) is presented in Algorithm 3.


Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the maximum number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f for each bat in P, determined by the objective function f(x).
  Step 3: while the termination criteria is not satisfied or t < MaxGeneration do
            Sort the population of bats P from best to worst by order of quality f.
            Store the KEEP best bats as KEEPBAT.
            for i = 1 to NP (all bats) do
              v_i^t = v_i^{t−1} + (x_i^t − x*) × Q
              x_i^t = x_i^{t−1} + v_i^t
              if (rand > r) then
                x_u^t = x* + αε^t   // random walk around the current best
              end if
              for j = 1 to D (all elements) do  // Mutate
                if (rand < HMCR) then
                  r1 = ⌈NP × rand⌉
                  x_v(j) = x_{r1}(j), where r1 ∈ (1, 2, ..., HMS)
                  if (rand < PAR) then
                    x_v(j) = x_v(j) + bw × (2 × rand − 1)
                  endif
                else
                  x_v(j) = x_min,j + rand × (x_max,j − x_min,j)
                endif
              endfor j
              Evaluate the fitness of the offspring x_u^t, x_i^t, x_v^t.
              Select the offspring x_k^t with the best fitness among x_u^t, x_i^t, x_v^t.
              if (rand < A) then
                x_{r1}^t = x_k^t
              end if
              Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored.
            end for i
            t = t + 1
  Step 4: end while
  Step 5: Postprocess the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HS/BA.
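The following is a condensed Python sketch of the per-bat update in Algorithm 3; it folds the three candidate solutions x_u^t, x_i^t, and x_v^t into an explicit best-of selection, and the helper names, the bounds lb/ub, and the default parameter values are ours. Elitism and the tracking of x* are left to the caller, so this is a sketch of the inner loop rather than a full implementation.

import random

def hsba_step(bats, velocities, x_best, f, lb, ub,
              freq=0.5, A=0.95, r=0.6, HMCR=0.95, PAR=0.1, bw=0.01):
    n, dim = len(bats), len(bats[0])
    for i in range(n):
        # BA flight with fixed frequency and loudness, per (9).
        velocities[i] = [v + (x - xb) * freq
                         for v, x, xb in zip(velocities[i], bats[i], x_best)]
        candidates = [[x + v for x, v in zip(bats[i], velocities[i])]]
        if random.random() > r:    # local random walk around the best, per (7)
            candidates.append([xb + A * random.uniform(-1, 1) for xb in x_best])
        else:                      # HS pitch adjustment used as mutation, per (3)
            x_mut = []
            for d in range(dim):
                if random.random() < HMCR:                 # memory consideration
                    val = bats[random.randrange(n)][d]
                    if random.random() < PAR:              # pitch adjustment
                        val += bw * (2 * random.random() - 1)
                else:                                      # random selection
                    val = lb[d] + random.random() * (ub[d] - lb[d])
                x_mut.append(val)
            candidates.append(x_mut)
        best_cand = min(candidates, key=f)
        # Greedy, loudness-gated replacement of the current bat.
        if random.random() < A and f(best_cand) < f(bats[i]):
            bats[i] = best_cand

The greedy acceptance in the last two lines reflects the selection scheme described above: a candidate replaces the bat that produced it only if it has better fitness.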

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HS/BA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a hard drive of 160 GB. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HS/BA. In order to explore the benefits of HS/BA, in this subsection we compare its performance on the global numerical optimization problem with that of nine other population-based optimization methods, which are ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems, based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a new, powerful, and efficient evolutionary algorithm developed for global optimization, inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but excellent optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolution strategy) [40] is an algorithm that generally assigns equal importance to mutation and recombination and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm


intelligence algorithm, based on the swarm behavior of fish and bird schooling in nature. The stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness A = 0.95, pulse rate r = 0.6, scaling factor ε = 0.1, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value τ_0 = 1E−6, pheromone update constant Q = 20, exploration constant q_0 = 1, global pheromone decay rate ρ_g = 0.9, local pheromone decay rate ρ_l = 0.5, pheromone sensitivity s = 1, and visibility sensitivity β = 5. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: weighting factor F = 0.5 and crossover constant CR = 0.5. For the ES: the number of offspring λ = 10 produced in each generation, and standard deviation σ = 1 for changing solutions. For the GA, we used roulette wheel selection and single-point crossover, with a crossover probability of 1 and a mutation probability of 0.01. For PSO: inertial constant = 0.3, cognitive constant = 1, and social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Based on mathematical functions, benchmark functions can be used as objective functions to perform such tests. The properties of these benchmark functions can be easily obtained from their definitions. Fourteen different benchmark functions are used to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e., D = 20).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of best solutions in the search space. Unimodal benchmark functions have only one optimum, the global optimum. Multimodal benchmark functions have at least two optima in their search space, indicating that they have local optima besides the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size NP = 50, an elitism parameter Keep = 2, and a maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to obtain representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.
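The normalization appears to divide each algorithm's result on a benchmark by the smallest result achieved on that benchmark, so that the best entry becomes 1.00; a one-line Python sketch under that assumption (our reading, not stated explicitly in the paper) follows.

def normalize(minima):
    # Assumed normalization: best value maps to 1.00; this breaks down
    # if the true minimum is 0, in which case an offset would be needed.
    best = min(minima)
    return [m / best for m in minima]

print(normalize([0.02, 0.5, 0.04]))   # -> [1.0, 25.0, 2.0]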

From Table 3, we see that, on average, HS/BA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06–F11, and F13-F14). ACO is the second most effective, performing the best on the benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on the benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HS/BA performs the best on thirteen of the fourteen benchmarks (F02–F14), while BBO performs the best on the one remaining benchmark (F01), when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant (P = 1.66 × 10^−17 and P = 7.25 × 10^−17, resp.) at the 5% significance level.

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section. The results are shown in Table 3. BA was the quickest optimization method; HS/BA was the third fastest of the ten algorithms. However, it should be noted that, in the vast majority of real-world applications, it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in Figures 1–14, which represent the optimization process. The values shown in Figures 1–14 are the mean objective function optima achieved from 100 Monte Carlo simulations; these are the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods when the F01 Ackley function is applied. From Figure 1, clearly, we can draw the conclusion that BBO is significantly superior to all the other algorithms, including HS/BA, during the process of optimization, while HS/BA performs the second best on this multimodal benchmark function. Here, all the other algorithms show almost the same starting point, and HS/BA has a faster convergence rate than the other algorithms, although it is outperformed by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to that of HS/BA. Obviously, BBO, HS/BA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HSBA outperforms all other methods during the whole course of optimization (except for generations 20–33, where BBO and HSBA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HSBA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HSBA shows a faster convergence rate initially than ACO; however, it is outperformed


Table 1: Benchmark functions.

No. | Name | Definition | Source
F01 | Ackley | $f(x)=20+e-20\cdot e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n}x_i^2}}-e^{(1/n)\sum_{i=1}^{n}\cos(2\pi x_i)}$ | [3]
F02 | Fletcher-Powell | $f(x)=\sum_{i=1}^{n}(A_i-B_i)^2$, $A_i=\sum_{j=1}^{n}(a_{ij}\sin\alpha_j+b_{ij}\cos\alpha_j)$, $B_i=\sum_{j=1}^{n}(a_{ij}\sin x_j+b_{ij}\cos x_j)$ | [4]
F03 | Griewank | $f(x)=\sum_{i=1}^{n}x_i^2/4000-\prod_{i=1}^{n}\cos(x_i/\sqrt{i})+1$ | [5]
F04 | Penalty 1 | $f(x)=(\pi/30)\{10\sin^2(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})]+(y_n-1)^2\}+\sum_{i=1}^{n}u(x_i,10,100,4)$, $y_i=1+0.25(x_i+1)$ | [6]
F05 | Penalty 2 | $f(x)=0.1\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_n-1)^2[1+\sin^2(2\pi x_n)]\}+\sum_{i=1}^{n}u(x_i,5,100,4)$ | [6]
F06 | Quartic with noise | $f(x)=\sum_{i=1}^{n}(i\cdot x_i^4+U(0,1))$ | [7]
F07 | Rastrigin | $f(x)=10n+\sum_{i=1}^{n}(x_i^2-10\cos(2\pi x_i))$ | [8]
F08 | Rosenbrock | $f(x)=\sum_{i=1}^{n-1}[100(x_{i+1}-x_i^2)^2+(x_i-1)^2]$ | [7]
F09 | Schwefel 2.26 | $f(x)=418.9829\times D-\sum_{i=1}^{D}x_i\sin(|x_i|^{1/2})$ | [9]
F10 | Schwefel 1.2 | $f(x)=\sum_{i=1}^{n}(\sum_{j=1}^{i}x_j)^2$ | [10]
F11 | Schwefel 2.22 | $f(x)=\sum_{i=1}^{n}|x_i|+\prod_{i=1}^{n}|x_i|$ | [10]
F12 | Schwefel 2.21 | $f(x)=\max_{1\le i\le n}|x_i|$ | [6]
F13 | Sphere | $f(x)=\sum_{i=1}^{n}x_i^2$ | [7]
F14 | Step | $f(x)=6n+\sum_{i=1}^{n}\lfloor x_i\rfloor$ | [7]

*In benchmark function F02, the matrix elements $a_{n\times n}, b_{n\times n}\in(-100,100)$ and $\alpha_{n\times 1}\in(-\pi,\pi)$ are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function $u(x_i,a,k,m)$ is given by $u(x_i,a,k,m)=k(x_i-a)^m$ if $x_i>a$; $0$ if $-a\le x_i\le a$; and $k(-x_i-a)^m$ if $x_i<-a$.


Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No. | Function | lb | ub | opt | Continuity | Modality
F01 | Ackley | −32.768 | 32.768 | 0 | Continuous | Multimodal
F02 | Fletcher-Powell | −π | π | 0 | Continuous | Multimodal
F03 | Griewank | −600 | 600 | 0 | Continuous | Multimodal
F04 | Penalty 1 | −50 | 50 | 0 | Continuous | Multimodal
F05 | Penalty 2 | −50 | 50 | 0 | Continuous | Multimodal
F06 | Quartic with noise | −1.28 | 1.28 | 1 | Continuous | Multimodal
F07 | Rastrigin | −5.12 | 5.12 | 0 | Continuous | Multimodal
F08 | Rosenbrock | −2.048 | 2.048 | 0 | Continuous | Unimodal
F09 | Schwefel 2.26 | −512 | 512 | 0 | Continuous | Multimodal
F10 | Schwefel 1.2 | −100 | 100 | 0 | Continuous | Unimodal
F11 | Schwefel 2.22 | −10 | 10 | 0 | Continuous | Unimodal
F12 | Schwefel 2.21 | −100 | 100 | 0 | Continuous | Unimodal
F13 | Sphere | −5.12 | 5.12 | 0 | Continuous | Unimodal
F14 | Step | −5.12 | 5.12 | 0 | Discontinuous | Unimodal

Table 3: Mean normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

Benchmark | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA
F01 | 2.31 | 3.33 | 1.15 | 2.02 | 3.38 | 2.72 | 3.47 | 1.09 | 2.66 | 1.00
F02 | 24.58 | 25.82 | 1.58 | 8.94 | 24.35 | 5.45 | 15.69 | 1.00 | 13.96 | 1.33
F03 | 3.16 | 60.72 | 1.93 | 5.44 | 23.85 | 3.22 | 77.22 | 1.00 | 25.77 | 1.42
F04 | 1.00 | 3.0E38 | 4.0E32 | 5.6E33 | 2.7E38 | 3.1E32 | 1.4E39 | 2.3E32 | 4.1E36 | 9.6E31
F05 | 1.00 | 1.1E8 | 299.42 | 1.5E6 | 4.6E8 | 5.4E3 | 4.1E8 | 215.51 | 5.5E7 | 111.10
F06 | 489.01 | 6.8E3 | 35.32 | 308.29 | 1.8E4 | 274.83 | 1.5E4 | 1.00 | 2.5E3 | 10.09
F07 | 8.09 | 11.55 | 1.28 | 6.56 | 11.87 | 6.17 | 10.22 | 1.00 | 8.44 | 1.80
F08 | 42.25 | 29.01 | 2.29 | 7.59 | 59.99 | 9.05 | 47.85 | 1.00 | 12.04 | 2.15
F09 | 3.17 | 20.26 | 1.99 | 13.58 | 13.33 | 1.81 | 19.92 | 1.00 | 17.61 | 1.15
F10 | 1.75 | 3.73 | 1.38 | 2.95 | 4.93 | 1.25 | 4.22 | 1.00 | 2.48 | 1.48
F11 | 1.05 | 19.70 | 1.83 | 7.14 | 23.12 | 11.13 | 19.45 | 1.00 | 13.22 | 2.46
F12 | 1.86 | 4.03 | 1.00 | 2.99 | 3.91 | 1.92 | 3.74 | 1.38 | 2.38 | 1.12
F13 | 98.30 | 150.84 | 3.80 | 19.03 | 226.52 | 47.74 | 182.32 | 1.00 | 72.91 | 4.02
F14 | 7.73 | 120.48 | 3.93 | 13.31 | 102.56 | 11.53 | 146.55 | 1.00 | 63.44 | 3.28
Time | 2.74 | 1.00 | 1.32 | 1.64 | 1.67 | 1.79 | 2.33 | 1.43 | 2.03 | 1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.

by ACO after 5 generations. As for the other algorithms, although slower, BBO, GA, and SGA eventually find global minima close to HSBA's, while BA, DE, ES, HS, and PSO fail to reach the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4, it is obvious that HSBA outperforms all other methods during the whole course of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved on the F05 Penalty 2 function. For this multimodal function, similar to the F04 Penalty 1 function shown in Figure 4, HSBA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HSBA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HSBA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HSBA gets closer and closer to the minimum, while PSO converges prematurely and is trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage


Table 4: Best normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

Benchmark | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA
F01 | 1.85 | 2.31 | 1.00 | 1.43 | 2.25 | 2.03 | 2.30 | 1.05 | 1.91 | 1.04
F02 | 10.42 | 14.48 | 1.09 | 4.22 | 10.39 | 4.03 | 9.91 | 1.00 | 8.28 | 1.33
F03 | 2.73 | 46.22 | 1.87 | 4.47 | 21.38 | 8.02 | 43.24 | 1.00 | 16.75 | 1.76
F04 | 4.4E6 | 5.0E6 | 1.2E3 | 1.9E4 | 2.2E6 | 3.0E4 | 4.2E6 | 1.00 | 37.55 | 3.22
F05 | 3.0E4 | 3.2E4 | 51.01 | 386.33 | 1.6E4 | 884.46 | 2.6E4 | 1.00 | 41.13 | 4.57
F06 | 58.51 | 992.55 | 6.50 | 24.69 | 808.48 | 59.24 | 746.91 | 1.00 | 189.71 | 2.02
F07 | 5.71 | 8.53 | 1.25 | 5.13 | 7.94 | 5.23 | 7.47 | 1.00 | 6.00 | 1.68
F08 | 26.42 | 25.03 | 1.48 | 3.70 | 33.52 | 6.16 | 21.50 | 1.00 | 7.75 | 1.45
F09 | 2.43 | 8.66 | 1.28 | 4.90 | 5.93 | 2.09 | 7.28 | 1.00 | 7.40 | 1.42
F10 | 1.89 | 4.94 | 1.18 | 2.66 | 3.00 | 2.08 | 2.76 | 1.00 | 2.01 | 1.74
F11 | 7.74 | 12.61 | 1.21 | 3.36 | 12.14 | 6.01 | 9.60 | 1.00 | 7.81 | 1.62
F12 | 1.33 | 2.33 | 1.44 | 1.69 | 2.08 | 1.74 | 2.11 | 1.00 | 1.70 | 1.21
F13 | 28.04 | 56.57 | 2.17 | 5.54 | 60.57 | 19.71 | 52.86 | 1.00 | 20.98 | 2.28
F14 | 4.28 | 54.02 | 1.97 | 4.85 | 33.61 | 10.60 | 49.11 | 1.00 | 19.72 | 1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function (axes: number of generations vs. benchmark function value).

to succeed on this benchmark function within the maximum number of generations. In the end, BBO, DE, and SGA converge to values very close to HSBA's.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can roughly divide the algorithms into three groups: one group including BBO, HSBA, and SGA performing the best; a second group including ACO, DE,

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function (axes: number of generations vs. benchmark function value).

GA, and PSO performing the second best; and a third group including BA, ES, and HS performing the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HSBA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. We also see that, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, although HSBA is outperformed by BBO between generations 10–30, it has a stable convergence rate at finding the global minimum


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function (axes: number of generations vs. benchmark function value).

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function (axes: number of generations vs. benchmark function value).

and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HSBA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be roughly

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function (axes: number of generations vs. benchmark function value).

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function (axes: number of generations vs. benchmark function value).

divided into three groups: one group including BBO and HSBA performing the best; a second group including ACO, GA, PSO, and SGA performing the second best; and a third group including DE, ES, and HS performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 40 generations. In the end, HSBA reaches an optimal solution significantly superior to those of the other algorithms. BBO is inferior only

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function (axes: number of generations vs. benchmark function value).

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function (axes: number of generations vs. benchmark function value).

to HSBA and performs the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function (axes: number of generations vs. benchmark function value).

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function (axes: number of generations vs. benchmark function value).

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to BBO's value, which approaches HSBA's.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14 for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function (axes: number of generations vs. benchmark function value).

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function (axes: number of generations vs. benchmark function value).

we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to BBO's value.

From the above analyses of Figures 1–14, we can conclude that our proposed hybrid metaheuristic algorithm, HSBA, significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are inferior only to HSBA; ACO, BBO, and SGA perform better than HSBA on benchmarks F04-F05, benchmark F12, and benchmark F01, respectively. Furthermore, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function (axes: number of generations vs. benchmark function value).

Figure 14: Comparison of the performance of the different methods for the F14 Step function (axes: number of generations vs. benchmark function value).

convergence rate initially, while later it converges more and more slowly toward the true objective function value. Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs on 14 benchmark functions and a real-world sensor selection problem, and the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method, HSBA, is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.66 | 1.33 | 1.55 | 1.40 | 1.00 | 1.33 | 1.21 | 1.22 | 1.16 | 1.13 | 1.14
F02 | 1.25E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.41 | 11.15 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F04 | 2.73E4 | 1.00 | 1.25E4 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.00 | 1.00
F05 | 5.03E5 | 5.30 | 1.33 | 1.68E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 1.00 | 2.56 | 5.47 | 4.73 | 17.81 | 9.90 | 2.89 | 6.74 | 9.90 | 2.60 | 5.57
F07 | 10.14 | 14.92 | 4.35 | 1.10 | 1.10 | 15.37 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 38.49 | 1.08 | 14.06 | 1.08 | 1.08 | 1.08 | 99.01 | 1.01 | 1.00 | 1.00 | 1.00
F09 | 285.18 | 404.58 | 1.38 | 4.74 | 1.19 | 1.19 | 1.19 | 367.91 | 1.01 | 1.00 | 1.00
F10 | 665.22 | 4.68 | 1.69E3 | 15.27 | 1.18 | 1.18 | 1.18 | 1.18 | 1.21 | 1.00 | 1.00
F11 | 20.03 | 1.00 | 9.96 | 11.54 | 39.22 | 9.96 | 9.96 | 9.96 | 9.96 | 38.97 | 8.49
F12 | 10.32 | 1.00 | 13.22 | 3.23 | 14.69 | 2.79 | 2.79 | 2.79 | 2.79 | 2.79 | 19.31
F13 | 1.46 | 4.15 | 2.84 | 4.47 | 1.15 | 1.56 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F14 | 527.52 | 13.57 | 3.95 | 1.01 | 1.15 | 12.91 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Best count | 1 | 4 | 2 | 2 | 3 | 3 | 5 | 6 | 7 | 10 | 10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if its parameters are adjusted properly, BA can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, the pulse rate r, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HSBA, we carried out experiments on the benchmark problems with different parameter values. All other parameter settings were kept unchanged (unless noted otherwise in the following paragraphs). The results, recorded after 100 Monte Carlo runs, are given in Tables 5–12. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HSBA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs the best with the given parameter value.

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can obviously be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (<0.5); (iii) however, for functions F05, F07–F10, F13, and F14, HSBA performs better with bigger A (>0.5). In sum, HSBA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HSBA performs significantly better with bigger r (>0.5) than with smaller r (<0.5) and that HSBA performs the best when r is equal or very close to 0.6, so we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HSBA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1, so we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HSBA vary little with increasing PAR and that HSBA reaches the optimum (minimum) on most benchmarks when PAR is equal or very close to 0.1, so we set PAR = 0.1 in the other experiments.
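All four parameter studies above follow the same one-at-a-time pattern: one control parameter is swept over a grid while the others stay at their defaults, and each setting is averaged over the Monte Carlo runs. A minimal harness sketch under that assumption (the hsba function, its keyword parameters, and the objective are placeholders, not the paper's code):

```python
import numpy as np

def sweep_parameter(hsba, benchmark, name, grid, defaults, runs=100):
    """Run `runs` Monte Carlo trials of `hsba` on `benchmark` for each value
    of parameter `name`, keeping all other parameters at `defaults`."""
    results = {}
    for value in grid:
        params = dict(defaults, **{name: value})
        minima = [hsba(benchmark, **params) for _ in range(runs)]
        results[value] = (np.mean(minima), np.min(minima))  # average and best
    return results

# Example: sweep loudness A over 0, 0.1, ..., 1.0 with the paper's defaults.
# defaults = {"A": 0.95, "r": 0.6, "HMCR": 0.95, "PAR": 0.1}
# sweep_parameter(hsba, sphere, "A", np.linspace(0, 1, 11), defaults)
```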

4.3. Discussion. In HSBA, the bats fly through the sky using echolocation to find food/prey (i.e., the best solutions). Four parameters govern this behavior: the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR),


Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 1.52 | 1.00 | 1.46 | 1.17 | 1.08 | 1.37 | 1.88 | 1.65 | 1.67 | 1.79 | 1.48
F03 | 6.69 | 10.41 | 1.06 | 1.01 | 1.03 | 1.03 | 1.01 | 1.02 | 1.05 | 1.00 | 1.00
F04 | 201.31 | 1.26 | 4.10 | 4.21 | 3.33 | 2.72 | 1.00 | 5.19 | 2.91 | 1.00 | 3.11
F05 | 591.22 | 16.82 | 4.25 | 5.87 | 1.01 | 4.14 | 24.04 | 1.00 | 3.68 | 7.33 | 9.00
F06 | 1.00 | 4.90E4 | 637.99 | 258.71 | 4.95E4 | 2.18 | 2.17 | 2.17 | 2.18 | 2.17 | 2.17
F07 | 8.57 | 2.21E6 | 6.43 | 272.34 | 110.48 | 2.11E4 | 1.02 | 1.01 | 1.00 | 1.00 | 1.00
F08 | 49.80 | 1.14E4 | 1.72E4 | 161.60 | 224.59 | 91.19 | 1.74E4 | 1.03 | 1.11 | 1.00 | 1.00
F09 | 82.37 | 1.73E6 | 1.06 | 3.14 | 74.45 | 103.10 | 42.26 | 7.94E3 | 1.00 | 1.37 | 1.29
F10 | 90.31 | 1.45E3 | 2.45E5 | 2.32E3 | 23.34 | 22.34 | 31.38 | 12.89 | 2.34E3 | 1.40 | 1.00
F11 | 4.98 | 1.00 | 3.15E4 | 2.33 | 14.41 | 520.23 | 443.65 | 616.26 | 249.91 | 4.78E4 | 2.14
F12 | 3.69 | 2.10E4 | 4.57E4 | 1.00 | 2.12E4 | 266.35 | 233.13 | 198.80 | 276.14 | 112.01 | 2.14E4
F13 | 1.90 | 8.25 | 4.48E6 | 2.14E6 | 1.00 | 6.15 | 254.51 | 222.75 | 189.96 | 263.86 | 107.01
F14 | 66.91 | 1.95E5 | 1.17E4 | 1.24E3 | 21.04 | 1.86E3 | 29.40 | 23.92 | 1.00 | 18.35 | 24.75
Best count | 1 | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 4 | 5 | 5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.84 | 1.84 | 4.84 | 1.59 | 1.71 | 1.00 | 1.34 | 1.09 | 1.16 | 1.01
F02 | 9.44E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.73 | 5.46 | 1.86 | 1.86 | 1.86 | 1.86 | 1.00 | 1.72 | 1.21 | 1.61 | 1.01
F04 | 11.42 | 1.11 | 24.29 | 1.23 | 1.26 | 1.00 | 1.29 | 1.29 | 1.29 | 1.29 | 1.29
F05 | 1.77E5 | 2.30 | 1.15 | 1.09E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 62.44 | 29.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 8.31 | 4.48 | 2.58 | 1.29 | 1.29 | 3.49 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 9.61 | 1.18 | 4.20 | 1.37 | 1.37 | 1.37 | 6.95 | 1.01 | 1.01 | 1.00 | 1.00
F09 | 373.94 | 444.22 | 1.68 | 3.35 | 1.68 | 1.68 | 1.00 | 291.98 | 1.01 | 1.01 | 1.01
F10 | 386.93 | 3.16 | 13.97 | 4.63 | 1.58 | 1.58 | 1.00 | 1.58 | 309.31 | 1.58 | 1.01
F11 | 42.55 | 1.00 | 18.29 | 21.35 | 20.18 | 11.11 | 19.04 | 21.35 | 15.81 | 18.76 | 11.55
F12 | 114.40 | 1.00 | 141.84 | 44.48 | 125.47 | 44.48 | 44.48 | 44.48 | 44.48 | 44.48 | 69.43
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.00 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 120.52 | 4.04 | 2.33 | 1.00 | 1.16 | 3.40 | 1.00 | 1.00 | 1.16 | 1.16 | 1.16
Best count | 0 | 3 | 1 | 2 | 2 | 4 | 8 | 5 | 4 | 4 | 4

and the pitch adjustment rate (PAR). The appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat: the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions that have been considered, HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with HSBA performing significantly better on some functions. HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search via the harmony search algorithm and the global search via the bat algorithm concurrently. A similar behavior may be obtained in PSO by using a multiswarm derived from an initial particle population [44]. However, HSBA's advantages include its simplicity and ease of implementation, and it has only four parameters to regulate. The work carried out here demonstrates HSBA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning


Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.99 | 1.86 | 1.94 | 1.59 | 1.00 | 1.42 | 1.34 | 1.09 | 1.16 | 1.71
F02 | 3.31 | 5.38 | 4.57 | 3.08 | 3.82 | 1.14 | 1.00 | 2.89 | 2.79 | 4.02 | 3.11
F03 | 3.73 | 5.46 | 5.33 | 2.29 | 3.36 | 2.41 | 1.00 | 1.72 | 1.21 | 1.61 | 1.36
F04 | 11.42 | 1.11 | 24.29 | 1.49E4 | 1.13E3 | 3.94E3 | 2.78E4 | 19.66 | 175.22 | 1.14 | 1.00
F05 | 16.09 | 128.62 | 32.69 | 24.46 | 64.40 | 12.77 | 29.59 | 76.17 | 4.56 | 1.00 | 4.46
F06 | 62.44 | 26.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 3.30 | 1.78 | 1.34 | 1.03 | 1.23 | 1.39 | 1.55 | 1.00 | 1.19 | 1.36 | 3.49
F08 | 4.88 | 3.93 | 3.94 | 1.94 | 2.19 | 3.83 | 3.53 | 4.33 | 1.00 | 2.27 | 3.14
F09 | 1.39 | 1.66 | 1.14 | 1.00 | 1.21 | 1.15 | 1.15 | 1.09 | 1.08 | 1.04 | 1.09
F10 | 7.21 | 9.94 | 8.75 | 3.60 | 8.46 | 6.11 | 4.83 | 4.14 | 5.76 | 1.00 | 2.71
F11 | 3.82 | 4.23 | 3.73 | 2.05 | 1.81 | 1.00 | 1.71 | 2.20 | 1.42 | 1.68 | 1.89
F12 | 1.64 | 2.17 | 2.04 | 1.47 | 1.80 | 1.86 | 1.00 | 1.21 | 1.13 | 1.34 | 1.56
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.90 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 5.74 | 15.43 | 7.42 | 6.82 | 6.44 | 3.17 | 1.00 | 4.85 | 2.21 | 1.74 | 2.09
Best count | 0 | 0 | 0 | 1 | 0 | 2 | 4 | 2 | 2 | 2 | 1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.51 | 1.28 | 1.00 | 1.63
F02 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 19.82 | 26.49 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 2.06 | 1.00
F04 | 1.26E5 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F05 | 6.07E5 | 5.34 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 108.92 | 246.31 | 365.04 | 345.40 | 338.57 | 234.65 | 143.49 | 45.28 | 23.30 | 1.00 | 25.75
F07 | 9.22 | 14.28 | 5.34 | 1.00 | 1.00 | 9.30 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 18.10 | 1.08 | 7.72 | 1.08 | 1.08 | 1.08 | 35.94 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 280.04 | 391.61 | 1.27 | 6.81 | 1.27 | 1.27 | 1.27 | 227.95 | 1.00 | 1.00 | 1.00
F10 | 551.65 | 8.76 | 1.76E3 | 11.70 | 1.64 | 1.64 | 1.64 | 1.64 | 274.48 | 1.00 | 1.00
F11 | 14.67 | 1.00 | 6.18 | 6.18 | 13.99 | 6.18 | 6.18 | 6.18 | 4.40 | 2.69 | 6.16
F12 | 7.68 | 1.00 | 11.10 | 2.73 | 10.21 | 2.73 | 2.73 | 2.73 | 2.73 | 2.73 | 4.73
F13 | 7.72 | 26.75 | 15.90 | 17.76 | 6.12 | 10.40 | 6.12 | 5.95 | 2.55 | 1.00 | 2.25
F14 | 537.16 | 14.28 | 5.34 | 1.00 | 1.00 | 7.13 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Best count | 0 | 4 | 2 | 4 | 5 | 3 | 5 | 6 | 7 | 11 | 9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck for the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.00 | 1.01
F02 | 747.56 | 7.87 | 5.25 | 3.43 | 5.09 | 7.10 | 3.07 | 1.87 | 1.60 | 1.00 | 1.01
F03 | 9.13 | 3.12E4 | 1.09 | 1.07 | 1.05 | 1.06 | 1.05 | 1.02 | 1.01 | 1.01 | 1.00
F04 | 2.10E6 | 1.00E4 | 3.95E4 | 9.48E3 | 5.97E3 | 2.95E3 | 1.51E3 | 1.50E3 | 46.94 | 1.00 | 2.37
F05 | 3.24E6 | 3.27E4 | 2.09E4 | 4.14E4 | 7.13E3 | 1.54E3 | 8.93E3 | 1.53E3 | 629.48 | 1.00 | 203.73
F06 | 1.00 | 5.94E4 | 422.04 | 632.15 | 6.00E4 | 1.92 | 1.91 | 1.91 | 1.91 | 1.91 | 1.91
F07 | 10.63 | 2.07E6 | 9.00 | 216.72 | 324.55 | 3.07E4 | 1.04 | 1.03 | 1.02 | 1.01 | 1.00
F08 | 59.50 | 9.88E3 | 3.04E4 | 141.80 | 217.01 | 324.68 | 3.07E4 | 1.07 | 1.07 | 1.03 | 1.00
F09 | 226.87 | 4.95E6 | 3.08 | 8.91 | 114.22 | 173.60 | 259.41 | 2.45E4 | 1.45 | 1.00 | 1.71
F10 | 257.37 | 2.83E4 | 9.35E5 | 1.37E4 | 97.09 | 66.56 | 99.25 | 149.40 | 1.39E4 | 1.00 | 2.73
F11 | 6.07 | 1.00 | 1.83E4 | 2.02 | 16.61 | 390.40 | 262.99 | 403.00 | 603.61 | 5.72E4 | 1.84
F12 | 3.00 | 2.79E4 | 3.60E4 | 1.00 | 2.82E4 | 270.95 | 194.14 | 130.77 | 200.40 | 300.16 | 2.84E4
F13 | 2.32 | 9.66 | 5.61E6 | 1.89E6 | 1.00 | 8.15 | 267.78 | 191.86 | 129.23 | 198.05 | 296.65
F14 | 184.66 | 3.84E5 | 1.17E4 | 1.85E3 | 1.00 | 5.71E3 | 25.11 | 55.39 | 39.60 | 26.57 | 41.49
Best count | 1 | 1 | 0 | 1 | 2 | 0 | 0 | 0 | 0 | 6 | 3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 3.91E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 1.00 | 4.25 | 1.51 | 2.20 | 2.20 | 2.12 | 2.20 | 1.99 | 2.20 | 1.28 | 1.80
F04 | 1.00 | 2.55 | 65.54 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F05 | 2.52 | 1.00 | 1.71 | 1.69E3 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71
F06 | 1.00 | 59.87 | 34.18 | 22.85 | 46.52 | 23.44 | 43.54 | 44.84 | 36.88 | 18.99 | 8.12
F07 | 8.07 | 1.00 | 1.48 | 2.55 | 2.55 | 13.06 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F08 | 5.43 | 1.00 | 1.92 | 1.00 | 1.00 | 1.00 | 11.45 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 76.72 | 2.52 | 1.17 | 1.00 | 1.71 | 1.71 | 1.71 | 155.56 | 1.71 | 1.71 | 1.71
F10 | 929.10 | 1.48 | 1.00 | 4.91 | 2.55 | 2.55 | 2.55 | 2.22 | 2.35E3 | 2.55 | 2.55
F11 | 3.18E3 | 1.00 | 5.82E3 | 3.99E3 | 3.39E3 | 5.82E3 | 4.91E3 | 5.82E3 | 5.82E3 | 9.58E3 | 5.82E3
F12 | 305.06 | 1.00 | 537.28 | 97.34 | 187.58 | 97.34 | 97.34 | 97.34 | 97.34 | 97.34 | 398.27
F13 | 1.92 | 4.22 | 5.87 | 10.07 | 12.98 | 4.66 | 1.48 | 4.01 | 3.17 | 1.00 | 4.09
F14 | 88.12 | 1.00 | 1.48 | 2.55 | 2.55 | 4.91 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
Best count | 4 | 8 | 3 | 4 | 3 | 3 | 2 | 3 | 3 | 4 | 3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA method for optimization problems. We improved BA by combining it with the original harmony search (HS) algorithm and evaluated HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the bat-updating process. Using the original configuration of the bat algorithm, we generate new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only


Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 4.07 | 1.00 | 1.00 | 4.52 | 2.77 | 2.57 | 3.41 | 5.36 | 3.69 | 4.89 | 1.03
F03 | 1.00 | 1.27E4 | 1.34 | 1.35 | 1.35 | 1.35 | 1.34 | 1.35 | 1.34 | 1.34 | 1.34
F04 | 1.15 | 81.27 | 9.39E3 | 1.00 | 1.01 | 1.01 | 1.29 | 1.00 | 1.00 | 1.01 | 1.01
F05 | 345.73 | 50.61 | 86.19 | 9.07E3 | 25.01 | 1.08 | 1.01 | 1.91 | 3.33 | 1.00 | 1.18
F06 | 1.00 | 5.42E6 | 4.27E4 | 4.74E4 | 5.48E6 | 577.88 | 577.88 | 577.88 | 577.88 | 577.87 | 577.87
F07 | 4.86 | 1.53 | 1.00 | 95.43 | 105.96 | 1.22E4 | 1.34 | 1.36 | 1.32 | 1.33 | 1.35
F08 | 12.69 | 76.00 | 8.76E3 | 17.03 | 69.13 | 76.72 | 8.85E3 | 1.10 | 1.05 | 1.04 | 1.00
F09 | 63.45 | 230.47 | 1.82 | 1.67 | 12.78 | 48.60 | 53.49 | 6.09E3 | 1.11 | 1.30 | 1.00
F10 | 133.55 | 12.51 | 1.26 | 1.97E3 | 11.48 | 4.64 | 16.45 | 18.93 | 1.99E3 | 1.00 | 1.88
F11 | 63.02 | 1.00 | 6.39E3 | 79.44 | 59.24 | 3.96E3 | 1.42E3 | 5.81E3 | 6.45E3 | 7.45E5 | 79.81
F12 | 3.90 | 8.81E3 | 8.90E3 | 1.00 | 8.90E3 | 44.40 | 47.78 | 17.25 | 70.11 | 77.84 | 8.99E3
F13 | 1.00 | 21.66 | 2.07E3 | 6.69 | 5.80 | 4.30 | 271.67 | 282.46 | 105.39 | 429.26 | 476.62
F14 | 46.14 | 1.00 | 36.10 | 55.93 | 1.22 | 6.40E3 | 42.15 | 32.89 | 34.81 | 12.72 | 51.29
Best count | 4 | 4 | 3 | 3 | 1 | 1 | 1 | 2 | 2 | 3 | 3

if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. HSBA attempts to combine the merits of BA and HS in order to prevent all the bats from getting trapped in inferior local optimal regions. HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA makes good use of the information in past solutions to generate better-quality solutions more frequently when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that HSBA significantly improves the performance of HS and BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we would apply our proposed approach HSBA to solve practical engineering optimization problems, and obviously HSBA can become a fascinating method for real-world engineering optimization problems; on the other hand, we would develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.
[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.
[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.
[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.
[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.
[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.
[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.
[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.


[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.
[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.
[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.
[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.
[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.
[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.
[19] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.
[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.
[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.
[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.
[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.
[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.
[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.
[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.
[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.
[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.
[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.
[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.
[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.
[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.
[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.
[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.
[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.
[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.


[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.
[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.
[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.



2.2. Harmony Search. First developed by Geem et al. in 2001, harmony search (HS) [26] is a relatively new metaheuristic optimization algorithm, based on the natural musical performance process that occurs when a musician searches for an optimal state of harmony. The optimization operators of the HS algorithm are specified as the harmony memory (HM), which keeps the solution vectors, all of which lie within the search space, as shown in (2); the harmony memory size HMS, which is the number of solution vectors kept in the HM; the harmony memory consideration rate (HMCR); the pitch adjustment rate (PAR); and the pitch adjustment bandwidth (bw):

$$\mathrm{HM}=\begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_D^1 & \mathrm{fitness}(x^1) \\ x_1^2 & x_2^2 & \cdots & x_D^2 & \mathrm{fitness}(x^2) \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} & \mathrm{fitness}(x^{\mathrm{HMS}}) \end{bmatrix} \quad (2)$$

In more detail, we can explain the HS algorithm by discussing the improvisation process of a music player. When a player is improvising, he or she has three possible options: (1) play any well-known piece of music (a series of pitches in harmony) exactly from his or her memory (governed by the harmony memory consideration rate, HMCR); (2) play something similar to a known piece in the player's memory (thus adjusting the pitch slightly); or (3) play a totally new or random pitch from the feasible range. If these three options are formalized for optimization, we have three corresponding components: employment of harmony memory, pitch adjusting, and randomization. Similarly, when each decision variable selects one value in the HS algorithm, it can make use of one of these three rules anywhere in the HS procedure. If a new harmony vector is better than the worst harmony vector in the HM, the new harmony vector replaces the worst harmony vector in the HM. This procedure is repeated until a stopping criterion is satisfied.

The employment of harmony memory is significant, as it is analogous to the selection of the fittest individuals in a GA (genetic algorithm). It ensures that the best harmonies are carried over to the new harmony memory. For the purpose of using this memory more effectively, we should properly set the value of the parameter HMCR ∈ [0, 1]. If this rate is very close to 1 (too high), almost all the harmonies are drawn from the harmony memory, other harmonies are not explored well, and potentially wrong solutions result. If this rate is very close to 0 (extremely low), only a few of the best harmonies are chosen, and convergence may be slow. Therefore, typically, HMCR = 0.7–0.95.

To adjust the pitch slightly in the second component, an appropriate approach is needed to adjust the frequency efficiently. In theory, the pitch can be adjusted linearly or nonlinearly, but in practice linear adjustment is used. If x_old is the current solution (or pitch), then the new solution (pitch) x_new is generated by

$$x_{\mathrm{new}} = x_{\mathrm{old}} + \mathrm{bw}\,(2\varepsilon - 1), \quad (3)$$

where ε is a random real number drawn from a uniform distribution on [0, 1], and bw is the bandwidth controlling the

local range of the pitch adjustment. In fact, we can see that the pitch adjustment in (3) is a random walk.

Pitch adjustment is similar to the mutation operator in GA. We must also set the parameter PAR appropriately to control the degree of adjustment. If PAR is near 1 (too high), then the solution is always changing, and the algorithm may not converge at all. If it is very close to 0 (too low), then there is very little change, and the algorithm may converge prematurely. Therefore, we use PAR = 0.1–0.5 in most simulations, as usual.

For the purpose of increasing the diversity of the solutions, randomization is needed in the third component. Although pitch adjusting has a similar role, it is confined to a certain local range and thus corresponds to a local search. The use of randomization can push the system further, to explore multifarious regions with high solution diversity in order to find the global optimum. In real-world engineering applications, HS has been applied to solve many optimization problems, including function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, and vehicle routing.

The mainframe of the basic HS algorithm is described in Algorithm 1, where D is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution. From Algorithm 1, it is clear that there are only two control parameters in HS, which are HMCR and PAR.

2.3. Bat Algorithm. The bat algorithm is a novel metaheuristic swarm intelligence optimization method developed for global numerical optimization, in which the search is inspired by the social behavior of bats and the phenomenon of echolocation to sense distance.

In [28], for simplicity, the bat algorithm is based on idealizing some of the echolocation characteristics of bats, expressed as the following approximate or idealized rules:

(1) all bats use echolocation to sense distance, and they always "know" their surroundings in some magical way;

(2) bats fly randomly with velocity $v_i$ and a fixed frequency $f_{\min}$ at position $x_i$, varying the wavelength $\lambda$ and loudness $A_0$ to hunt for prey; they can spontaneously adjust the wavelength (or frequency) of their emitted pulses and the rate of pulse emission $r \in [0, 1]$, depending on the proximity of their target;

(3) although the loudness can change in many ways, it is assumed that the loudness varies from a minimum constant (positive) $A_{\min}$ to a large $A_0$.

Based on these approximations and idealizations, the basic steps of the bat algorithm (BA) can be described as shown in Algorithm 2.

In BA, each bat is defined by its position $x_i^t$, velocity $v_i^t$, frequency $f_i$, loudness $A_i^t$, and emission pulse rate $r_i^t$


Begin
  Step 1: Set the parameters and initialize the HM.
  Step 2: Evaluate the fitness for each individual in HM.
  Step 3: while the halting criteria are not satisfied do
            for d = 1 to D do
              if rand < HMCR then  // memory consideration
                x_new(d) = x_a(d), where a ∈ {1, 2, ..., HMS}
                if rand < PAR then  // pitch adjustment
                  x_new(d) = x_old(d) + bw × (2 × rand − 1)
                end if
              else  // random selection
                x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
              end if
            end for d
            Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective).
            Update the best harmony vector found so far.
  Step 4: end while
  Step 5: Post-processing the results and visualization.
End

Algorithm 1: Harmony search algorithm.
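As a concrete illustration of Algorithm 1, here is a minimal Python sketch of one HS improvisation step; the function names, bounds handling, and objective are illustrative assumptions, not the paper's code:

```python
import random

def improvise(HM, fitness, lb, ub, HMCR=0.95, PAR=0.1, bw=0.2):
    """One harmony search iteration: improvise a new harmony and,
    if it beats the worst member, replace the worst member of HM."""
    D = len(HM[0])
    new = []
    for d in range(D):
        if random.random() < HMCR:                    # memory consideration
            value = random.choice(HM)[d]
            if random.random() < PAR:                 # pitch adjustment, eq. (3)
                value += bw * (2 * random.random() - 1)
        else:                                         # random selection
            value = lb[d] + random.random() * (ub[d] - lb[d])
        new.append(min(max(value, lb[d]), ub[d]))     # keep within bounds
    worst = max(range(len(HM)), key=lambda i: fitness(HM[i]))
    if fitness(new) < fitness(HM[worst]):             # minimization objective
        HM[worst] = new
    return HM
```

Note how HMCR and PAR are the only decision parameters per dimension, matching the observation above that HS has just two control parameters.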

Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, with each bat corresponding to a potential solution to the given problem; define loudness A_i, pulse frequency Q_i, and the initial velocities v_i (i = 1, 2, ..., NP); set pulse rate r_i.
  Step 2: while the termination criteria are not satisfied or t < MaxGeneration do
            Generate new solutions by adjusting frequency and updating velocities and locations/solutions [(4)–(6)].
            if (rand > r_i) then
              Select a solution among the best solutions.
              Generate a local solution around the selected best solution.
            end if
            Generate a new solution by flying randomly.
            if (rand < A_i and f(x_i) < f(x_*)) then
              Accept the new solutions.
              Increase r_i and reduce A_i.
            end if
            Rank the bats and find the current best x_*.
            t = t + 1
  Step 3: end while
  Step 4: Post-processing the results and visualization.
End

Algorithm 2: Bat algorithm.

in a $D$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$f_i = f_{\min} + (f_{\max} - f_{\min})\,\beta, \quad (4)$$

$$v_i^t = v_i^{t-1} + (x_i^t - x_*)\,f_i, \quad (5)$$

$$x_i^t = x_i^{t-1} + v_i^t, \quad (6)$$

where $\beta \in [0, 1]$ is a random vector drawn from a uniform distribution. Here, $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the $n$ bats. Generally speaking, depending on the domain size of the problem of interest, the frequency $f$ is assigned to $f_{\min} = 0$ and $f_{\max} = 100$ in practical implementations. Initially, each bat is randomly assigned a frequency drawn uniformly from $[f_{\min}, f_{\max}]$.

For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk:

$$x_{\mathrm{new}} = x_{\mathrm{old}} + \varepsilon A^t, \quad (7)$$


where $\varepsilon \in [-1, 1]$ is a random scaling factor and $A^t = \langle A_i^t \rangle$ is the average loudness of all the bats at time step $t$.

The update of the velocities and positions of the bats is similar to the procedure in the standard PSO [21], as $f_i$ essentially controls the pace and range of the movement of the swarming particles. To some degree, BA can be considered a balanced combination of the standard PSO and an intensive local search controlled by the loudness and pulse rate.

Furthermore, the loudness $A_i$ and the rate $r_i$ of pulse emission are updated as the iterations proceed, as shown in (8):

$$A_i^{t+1} = \alpha A_i^t, \qquad r_i^{t+1} = r_i^0 \left[1 - \exp(-\gamma t)\right], \quad (8)$$

where $\alpha$ and $\gamma$ are constants. In essence, $\alpha$ is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set $\alpha = \gamma = 0.9$ in this work.
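Putting (4)–(8) together, one BA generation can be sketched in Python as follows; this is a minimal sketch under the settings above, and the array names, acceptance test, and objective are illustrative assumptions rather than the paper's implementation:

```python
import numpy as np

def bat_step(X, V, A, R, R0, x_best, f, t, fmin=0.0, fmax=100.0,
             alpha=0.9, gamma=0.9):
    """One generation of the basic bat algorithm, following (4)-(8).

    X, V: (NP, D) positions and velocities; A, R, R0: per-bat loudness,
    pulse rate, and initial pulse rate; f: objective to minimize.
    """
    NP, D = X.shape
    F = fmin + (fmax - fmin) * np.random.rand(NP)     # (4): random frequencies
    V = V + (X - x_best) * F[:, None]                 # (5): velocity update
    X_new = X + V                                     # (6): position update
    for i in range(NP):
        if np.random.rand() > R[i]:                   # local walk near the best
            X_new[i] = x_best + np.random.uniform(-1, 1, D) * A.mean()  # (7)
        if np.random.rand() < A[i] and f(X_new[i]) < f(X[i]):
            X[i] = X_new[i]                           # accept the new solution
            A[i] *= alpha                             # (8): reduce loudness
            R[i] = R0[i] * (1 - np.exp(-gamma * t))   # (8): raise pulse rate
    x_best = min(X, key=f)                            # rank bats, keep the best
    return X, V, A, R, x_best
```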

3. Our Approach: HSBA

Based on the introduction of HS and BA in the previous section, we now explain how we combine the two approaches to form the proposed bat algorithm with harmony search (HSBA), which modifies the solutions with poor fitness in order to add diversity to the population and improve the search efficiency.

In general, the standard BA algorithm is adept at exploiting the search space, but at times it may get trapped in local optima, so that it cannot perform the global search well. For BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA so as to avoid trapping in local optima, a main improvement is made to BA: the pitch adjustment operation of HS is added, serving as a mutation operator, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, this hybrid metaheuristic algorithm, which introduces the pitch adjustment operation of HS as a mutation operator into the bat algorithm and is called the harmony search/bat algorithm (HSBA), is used to optimize the benchmark functions. The difference between HSBA and BA is that the mutation operator is used to improve the original BA by generating a new solution for each bat. In this way, the method can explore new search space via the mutation of the HS algorithm and exploit the population information with BA, and it can therefore avoid BA's tendency to get trapped in local optima. In the following, we describe the algorithm HSBA, which is an improvement of HS and BA.

The critical operator of HSBA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with BA. The core idea of the proposed hybrid mutation operator is based on two considerations. First, poor solutions can take in many new useful features from good solutions. Second, the mutation operator can improve the exploration of the new search space. In this way, the strong exploration ability of the original HS and the exploitation ability of BA can both be fully exercised.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. Described here for the first time, a main improvement of adding a mutation operator is made to BA, comprising three minor improvements, which are made with the aim of speeding up convergence, thus making the method more practical for a wider range of applications without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency $f$ and loudness $A$ instead of the varying frequency $f_i$ and loudness $A_i^t$. Similar to BA, in HSBA each bat is defined by its position $x_i^t$, velocity $v_i^t$, emission pulse rate $r_i^t$, and the fixed frequency $f$ and loudness $A$ in a $d$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$v_i^t = v_i^{t-1} + (x_i^t - x_*) f, \qquad x_i^t = x_i^{t-1} + v_i^t, \tag{9}$$

where $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the $n$ bats. In our experiments, we set $f = 0.5$. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate $r$ to 0.6 and the loudness $A$ to 0.95 produced the best results.
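As a minimal illustration of (9), the fixed-frequency update can be sketched as follows (a Python/NumPy rendering with our own array names, not the authors' MATLAB code):

    import numpy as np

    def ba_update(x, v, best, f=0.5):
        """Fixed-frequency bat update of (9): pull each bat toward the global best."""
        v = v + (x - best) * f    # velocity update with fixed frequency f
        return x + v, v           # new positions and velocities

    # Example: 50 bats in a 20-dimensional search space.
    rng = np.random.default_rng(0)
    x = rng.uniform(-5.12, 5.12, (50, 20))
    v = np.zeros((50, 20))
    best = x[0]                   # stand-in for the current global best bat
    x, v = ba_update(x, v, best)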

The second improvement is the addition of the mutation operator, in an attempt to increase the diversity of the population, improve the search efficiency, and speed up convergence to the optimum. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using the random walk of (7) when $\xi$ is larger than the pulse rate $r$, that is, $\xi > r$, where $\xi \in [0, 1]$ is a random real number drawn from a uniform distribution; when $\xi \le r$, we instead use the pitch adjustment operation of HS, serving as a mutation operator, to update the new solution and increase the diversity of the population, as shown in (3). Through testing on the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results.
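For concreteness, the pitch-adjustment mutation can be sketched as follows (our own Python rendering under the stated settings HMCR = 0.95 and PAR = 0.1; here the bat population itself plays the role of the harmony memory, and the bandwidth value is an assumption):

    import numpy as np

    def hs_mutate(pop, xi, lb, ub, hmcr=0.95, par=0.1, bw=0.01, rng=None):
        """HS-style mutation of one bat xi, using the population pop as memory."""
        rng = rng or np.random.default_rng()
        out = xi.copy()
        for j in range(xi.size):
            if rng.random() < hmcr:                       # memory consideration
                out[j] = pop[rng.integers(pop.shape[0]), j]
                if rng.random() < par:                    # pitch adjustment
                    out[j] += bw * (2 * rng.random() - 1)
            else:                                         # random selection
                out[j] = lb[j] + rng.random() * (ub[j] - lb[j])
        return np.clip(out, lb, ub)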

The last improvement is the addition of an elitism scheme to HSBA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. Note that we use the elitism approach to save the properties of the bats that have the best solutions in the HSBA process, so that even if the pitch adjustment operation ruins a corresponding bat, we have saved it and can revert to it if needed.
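A sketch of this elitism step (again in Python, with hypothetical array names): the KEEP best bats are saved before the update and written back over the KEEP worst afterwards.

    import numpy as np

    def apply_elitism(x, fit, elite_x, elite_f):
        """Restore the saved elites over the KEEP worst bats of the new population."""
        keep = len(elite_f)
        worst = np.argsort(fit)[-keep:]    # indices of the KEEP worst bats
        x[worst], fit[worst] = elite_x, elite_f
        return x, fit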

Based on the above-mentioned analyses, the main framework of the harmony search/bat algorithm (HSBA) is presented in Algorithm 3.


Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population P of NP bats randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the maximum number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f for each bat in P, determined by the objective function f(x).
  Step 3: While the termination criterion is not satisfied or t < MaxGeneration do
      Sort the population of bats P from best to worst by order of quality f for each bat.
      Store the KEEP best bats as KEEPBAT.
      for i = 1 : NP (all bats) do
          v_i^t = v_i^{t-1} + (x_i^t - x_*) × Q
          x_i^t = x_i^{t-1} + v_i^t
          if (rand > r) then
              x_u^t = x_* + α ε^t
          end if
          for j = 1 : D (all elements) do  // Mutate
              if (rand < HMCR) then
                  r_1 = ⌈NP × rand⌉
                  x_v(j) = x_{r_1}(j), where r_1 ∈ (1, 2, ..., HMS)
                  if (rand < PAR) then
                      x_v(j) = x_v(j) + bw × (2 × rand − 1)
                  end if
              else
                  x_v(j) = x_{min,j} + rand × (x_{max,j} − x_{min,j})
              end if
          end for j
          Evaluate the fitness of the offspring x_u^t, x_i^t, x_v^t.
          Select the offspring x_k^t with the best fitness among x_u^t, x_i^t, x_v^t.
          if (rand < A) then
              x_{r_1}^t = x_k^t
          end if
          Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored.
      end for i
      t = t + 1
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HSBA.
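To make the control flow of Algorithm 3 concrete, here is a compact, self-contained Python sketch of the HSBA main loop. Parameter defaults follow the paper (f = 0.5, r = 0.6, A = 0.95, HMCR = 0.95, PAR = 0.1, NP = 50, Keep = 2, Maxgen = 50, scaling factor 0.1); the offspring bookkeeping is simplified relative to the pseudocode, and all names are ours.

    import numpy as np

    def hsba(obj, lb, ub, n_bats=50, max_gen=50, f=0.5, r=0.6, A=0.95,
             hmcr=0.95, par=0.1, bw=0.01, keep=2, seed=None):
        """Simplified sketch of the HSBA main loop of Algorithm 3."""
        rng = np.random.default_rng(seed)
        d = len(lb)
        x = rng.uniform(lb, ub, size=(n_bats, d))   # bat positions
        v = np.zeros((n_bats, d))                   # bat velocities
        fit = np.array([obj(xi) for xi in x])
        for _ in range(max_gen):
            order = np.argsort(fit)                 # sort best -> worst
            x, v, fit = x[order], v[order], fit[order]
            elite_x, elite_f = x[:keep].copy(), fit[:keep].copy()
            best = x[0].copy()
            for i in range(n_bats):
                # bat update with fixed frequency f, eq. (9)
                v[i] = v[i] + (x[i] - best) * f
                cand = np.clip(x[i] + v[i], lb, ub)
                if rng.random() > r:                # local random walk around the best
                    cand = np.clip(best + 0.1 * rng.uniform(-1, 1, d), lb, ub)
                # HS pitch-adjustment mutation as a second offspring
                mut = cand.copy()
                for j in range(d):
                    if rng.random() < hmcr:
                        mut[j] = x[rng.integers(n_bats), j]
                        if rng.random() < par:
                            mut[j] += bw * (2 * rng.random() - 1)
                    else:
                        mut[j] = lb[j] + rng.random() * (ub[j] - lb[j])
                mut = np.clip(mut, lb, ub)
                # keep the better offspring; accept it with probability A (loudness)
                f_cand, f_mut = obj(cand), obj(mut)
                if f_mut < f_cand:
                    cand, f_cand = mut, f_mut
                if rng.random() < A and f_cand < fit[i]:
                    x[i], fit[i] = cand, f_cand
            # elitism: restore the saved best bats over the current worst
            worst = np.argsort(fit)[-keep:]
            x[worst], fit[worst] = elite_x, elite_f
        i_best = int(np.argmin(fit))
        return x[i_best], fit[i_best]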

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HSBA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a hard drive of 160 GB. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HSBA. In order to explore the benefits of HSBA, in this subsection we compare its performance on the global numerical optimization problem with nine other population-based optimization methods: ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems, based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a powerful and efficient evolutionary algorithm, developed for global optimization and inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but effective optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolution strategy) [40] is an algorithm that generally gives equal importance to mutation and recombination, and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm


intelligence algorithm, based on the swarming behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HSBA: loudness $A = 0.95$, pulse rate $r = 0.6$, scaling factor $\varepsilon = 0.1$, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value $\tau_0 = 1\mathrm{E}{-6}$, pheromone update constant $Q = 20$, exploration constant $q_0 = 1$, global pheromone decay rate $\rho_g = 0.9$, local pheromone decay rate $\rho_l = 0.5$, pheromone sensitivity $s = 1$, and visibility sensitivity $\beta = 5$. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor $F = 0.5$ and a crossover constant CR = 0.5. For the ES: the number of offspring $\lambda = 10$ produced in each generation, and standard deviation $\sigma = 1$ for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO, we set an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.

Well-defined problem sets are helpful for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions based on mathematical functions can be applied as objective functions to perform such tests; the properties of these benchmark functions are easily obtained from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HSBA. Each of the functions in this study has 20 independent variables (i.e., $D = 20$).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of best solutions in the search space: unimodal benchmark functions have only one optimum, the global optimum, while multimodal benchmark functions have at least two optima in their search space, that is, one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size NP = 50, the elitism parameter Keep = 2, and the maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to get representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs; Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. After normalization, the best value achieved for each test problem in each table is 1.00. Note that the normalizations in the two tables are based on different scales, so values cannot be compared between them.

From Table 3, we see that, on average, HSBA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06-F11, and F13-F14). ACO is the second most effective, performing the best on benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HSBA performs the best on thirteen of the fourteen benchmarks (F02-F14), while BBO performs the best at finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, a statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant ($P = 1.66 \times 10^{-17}$ and $P = 7.25 \times 10^{-17}$, resp.) at the 5% significance level.
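Such a Friedman test is easy to reproduce; a sketch using SciPy, with random placeholder data standing in for the per-benchmark results behind Tables 3 and 4:

    import numpy as np
    from scipy.stats import friedmanchisquare

    # Rows: 14 benchmarks; columns: 10 algorithms. Placeholder values only;
    # the real inputs would be the (un-normalized) minima behind Table 3.
    rng = np.random.default_rng(0)
    results = rng.random((14, 10))

    stat, p = friedmanchisquare(*results.T)   # one sample per algorithm
    print(f"Friedman chi-square = {stat:.2f}, p = {p:.2e}")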

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of each optimization method as applied to the 14 benchmarks discussed in this section; the results are shown in Table 3. BA was the quickest optimization method, and HSBA was the third fastest of the ten algorithms. However, it should be noted that, in the vast majority of real-world applications, the fitness function evaluation is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA are shown in Figures 1-14, which represent the optimization process. The values shown in Figures 1-14 are the mean objective function optima achieved from 100 Monte Carlo simulations; these are the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods on the F01 Ackley function. From Figure 1, we can clearly conclude that BBO is significantly superior to all the other algorithms, including HSBA, during the process of optimization, while HSBA performs the second best on this multimodal benchmark function. Here, all the algorithms show almost the same starting point, and HSBA has a faster convergence rate than the other algorithms, although it is overtaken by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to HSBA's. Obviously, BBO, HSBA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HSBA outperforms all other methods during the whole progress of optimization (except during generations 20-33, where BBO and HSBA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HSBA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HSBA shows a faster convergence rate initially than ACO; however, it is outperformed by ACO after 5 generations.


Table 1: Benchmark functions.

No. | Name | Definition | Source
F01 | Ackley | $f(x) = 20 + e - 20 \cdot e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n} x_i^2}} - e^{(1/n)\sum_{i=1}^{n} \cos(2\pi x_i)}$ | [3]
F02 | Fletcher-Powell | $f(x) = \sum_{i=1}^{n} (A_i - B_i)^2$, with $A_i = \sum_{j=1}^{n} (a_{ij}\sin\alpha_j + b_{ij}\cos\alpha_j)$ and $B_i = \sum_{j=1}^{n} (a_{ij}\sin x_j + b_{ij}\cos x_j)$ | [4]
F03 | Griewank | $f(x) = \sum_{i=1}^{n} x_i^2/4000 - \prod_{i=1}^{n} \cos(x_i/\sqrt{i}) + 1$ | [5]
F04 | Penalty 1 | $f(x) = (\pi/30)\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 [1 + 10\sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, with $y_i = 1 + 0.25(x_i + 1)$ | [6]
F05 | Penalty 2 | $f(x) = 0.1\{\sin^2(3\pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 [1 + \sin^2(3\pi x_{i+1})] + (x_n - 1)^2 [1 + \sin^2(2\pi x_n)]\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | [6]
F06 | Quartic (with noise) | $f(x) = \sum_{i=1}^{n} (i \cdot x_i^4 + U(0, 1))$ | [7]
F07 | Rastrigin | $f(x) = 10 \cdot n + \sum_{i=1}^{n} (x_i^2 - 10 \cdot \cos(2\pi x_i))$ | [8]
F08 | Rosenbrock | $f(x) = \sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | [7]
F09 | Schwefel 2.26 | $f(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2})$ | [9]
F10 | Schwefel 1.2 | $f(x) = \sum_{i=1}^{n} (\sum_{j=1}^{i} x_j)^2$ | [10]
F11 | Schwefel 2.22 | $f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | [10]
F12 | Schwefel 2.21 | $f(x) = \max_i |x_i|, \ 1 \le i \le n$ | [6]
F13 | Sphere | $f(x) = \sum_{i=1}^{n} x_i^2$ | [7]
F14 | Step | $f(x) = 6 \cdot n + \sum_{i=1}^{n} \lfloor x_i \rfloor$ | [7]

*In benchmark function F02, the matrix elements $a_{n \times n}, b_{n \times n} \in (-100, 100)$ and $\alpha_{n \times 1} \in (-\pi, \pi)$ are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function $u(x_i, a, k, m)$ is given by $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$.


Table 2: Properties of the benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No. | Function | lb | ub | opt | Continuity | Modality
F01 | Ackley | -32.768 | 32.768 | 0 | Continuous | Multimodal
F02 | Fletcher-Powell | $-\pi$ | $\pi$ | 0 | Continuous | Multimodal
F03 | Griewank | -600 | 600 | 0 | Continuous | Multimodal
F04 | Penalty 1 | -50 | 50 | 0 | Continuous | Multimodal
F05 | Penalty 2 | -50 | 50 | 0 | Continuous | Multimodal
F06 | Quartic (with noise) | -1.28 | 1.28 | 1 | Continuous | Multimodal
F07 | Rastrigin | -5.12 | 5.12 | 0 | Continuous | Multimodal
F08 | Rosenbrock | -2.048 | 2.048 | 0 | Continuous | Unimodal
F09 | Schwefel 2.26 | -512 | 512 | 0 | Continuous | Multimodal
F10 | Schwefel 1.2 | -100 | 100 | 0 | Continuous | Unimodal
F11 | Schwefel 2.22 | -10 | 10 | 0 | Continuous | Unimodal
F12 | Schwefel 2.21 | -100 | 100 | 0 | Continuous | Unimodal
F13 | Sphere | -5.12 | 5.12 | 0 | Continuous | Unimodal
F14 | Step | -5.12 | 5.12 | 0 | Discontinuous | Unimodal
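The definitions in Table 1 translate directly into code; as an illustration, a NumPy sketch of three of them (the function names are ours):

    import numpy as np

    def ackley(x):        # F01: minimum 0 at x = 0
        n = x.size
        return (20 + np.e
                - 20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
                - np.exp(np.sum(np.cos(2 * np.pi * x)) / n))

    def griewank(x):      # F03: minimum 0 at x = 0
        i = np.arange(1, x.size + 1)
        return np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

    def sphere(x):        # F13: minimum 0 at x = 0
        return np.sum(x**2)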

Table 3: Mean normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

No. | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA
F01 | 2.31 | 3.33 | 1.15 | 2.02 | 3.38 | 2.72 | 3.47 | 1.09 | 2.66 | 1.00
F02 | 24.58 | 25.82 | 1.58 | 8.94 | 24.35 | 5.45 | 15.69 | 1.00 | 13.96 | 1.33
F03 | 3.16 | 60.72 | 1.93 | 5.44 | 23.85 | 3.22 | 77.22 | 1.00 | 25.77 | 1.42
F04 | 1.00 | 3.0E38 | 4.0E32 | 5.6E33 | 2.7E38 | 3.1E32 | 1.4E39 | 2.3E32 | 4.1E36 | 9.6E31
F05 | 1.00 | 1.1E8 | 299.42 | 1.5E6 | 4.6E8 | 5.4E3 | 4.1E8 | 215.51 | 5.5E7 | 111.10
F06 | 489.01 | 6.8E3 | 35.32 | 308.29 | 1.8E4 | 274.83 | 1.5E4 | 1.00 | 2.5E3 | 10.09
F07 | 8.09 | 11.55 | 1.28 | 6.56 | 11.87 | 6.17 | 10.22 | 1.00 | 8.44 | 1.80
F08 | 42.25 | 29.01 | 2.29 | 7.59 | 59.99 | 9.05 | 47.85 | 1.00 | 12.04 | 2.15
F09 | 3.17 | 20.26 | 1.99 | 13.58 | 13.33 | 1.81 | 19.92 | 1.00 | 17.61 | 1.15
F10 | 1.75 | 3.73 | 1.38 | 2.95 | 4.93 | 1.25 | 4.22 | 1.00 | 2.48 | 1.48
F11 | 1.05 | 19.70 | 1.83 | 7.14 | 23.12 | 11.13 | 19.45 | 1.00 | 13.22 | 2.46
F12 | 1.86 | 4.03 | 1.00 | 2.99 | 3.91 | 1.92 | 3.74 | 1.38 | 2.38 | 1.12
F13 | 98.30 | 150.84 | 3.80 | 19.03 | 226.52 | 47.74 | 182.32 | 1.00 | 72.91 | 4.02
F14 | 7.73 | 120.48 | 3.93 | 13.31 | 102.56 | 11.53 | 146.55 | 1.00 | 63.44 | 3.28
Time | 2.74 | 1.00 | 1.32 | 1.64 | 1.67 | 1.79 | 2.33 | 1.43 | 2.03 | 1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.

As for the other algorithms, although slower, BBO, GA, and SGA eventually find global minima close to HSBA's, while BA, DE, ES, HS, and PSO fail to find the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4, it is obvious that HSBA outperforms all other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty 2 function. For this multimodal function, similar to the F04 Penalty 1 function shown in Figure 4, HSBA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HSBA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HSBA show the same initial fast convergence toward the known minimum; as the procedure proceeds, HSBA gets closer and closer to the minimum, while PSO converges prematurely and gets trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage to succeed on this benchmark function within the maximum number of generations. In the end, BBO, DE, and SGA converge to values very close to HSBA's.


Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

No. | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA
F01 | 1.85 | 2.31 | 1.00 | 1.43 | 2.25 | 2.03 | 2.30 | 1.05 | 1.91 | 1.04
F02 | 10.42 | 14.48 | 1.09 | 4.22 | 10.39 | 4.03 | 9.91 | 1.00 | 8.28 | 1.33
F03 | 2.73 | 46.22 | 1.87 | 4.47 | 21.38 | 8.02 | 43.24 | 1.00 | 16.75 | 1.76
F04 | 4.4E6 | 5.0E6 | 1.2E3 | 1.9E4 | 2.2E6 | 3.0E4 | 4.2E6 | 1.00 | 37.55 | 3.22
F05 | 3.0E4 | 3.2E4 | 51.01 | 386.33 | 1.6E4 | 884.46 | 2.6E4 | 1.00 | 41.13 | 4.57
F06 | 58.51 | 992.55 | 6.50 | 24.69 | 808.48 | 59.24 | 746.91 | 1.00 | 189.71 | 2.02
F07 | 5.71 | 8.53 | 1.25 | 5.13 | 7.94 | 5.23 | 7.47 | 1.00 | 6.00 | 1.68
F08 | 26.42 | 25.03 | 1.48 | 3.70 | 33.52 | 6.16 | 21.50 | 1.00 | 7.75 | 1.45
F09 | 2.43 | 8.66 | 1.28 | 4.90 | 5.93 | 2.09 | 7.28 | 1.00 | 7.40 | 1.42
F10 | 1.89 | 4.94 | 1.18 | 2.66 | 3.00 | 2.08 | 2.76 | 1.00 | 2.01 | 1.74
F11 | 7.74 | 12.61 | 1.21 | 3.36 | 12.14 | 6.01 | 9.60 | 1.00 | 7.81 | 1.62
F12 | 1.33 | 2.33 | 1.44 | 1.69 | 2.08 | 1.74 | 2.11 | 1.00 | 1.70 | 1.21
F13 | 28.04 | 56.57 | 2.17 | 5.54 | 60.57 | 19.71 | 52.86 | 1.00 | 20.98 | 2.28
F14 | 4.28 | 54.02 | 1.97 | 4.85 | 33.61 | 10.60 | 49.11 | 1.00 | 19.72 | 1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can approximately divide all the algorithms into three groups: one group, including BBO, HSBA, and SGA, performing the best; a second group, including ACO, DE, GA, and PSO, performing the second best; and another group, including BA, ES, and HS, performing the worst.

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HSBA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HSBA is outperformed by BBO between generations 10-30, it has a stable convergence rate at finding the global minimum and significantly outperforms all other approaches on this multimodal benchmark function.


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function.

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HSBA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately divided into three groups: one group, including BBO and HSBA, performing the best; a second group, including ACO, GA, PSO, and SGA, performing the second best; and another group, including DE, ES, and HS, performing the worst.

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function.

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 40 generations. In the end, HSBA reaches the optimal solution, significantly superior to the other algorithms. BBO is only inferior to HSBA and performs the second best on this unimodal function.

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function.

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function.

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function.

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13, comparing BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HSBA.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14, comparing BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function.

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function.

From the above analyses of Figures 1-14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm HSBA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are only inferior to HSBA; specifically, ACO, BBO, and SGA perform better than HSBA on benchmarks F04-F05, benchmark F12, and benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster convergence rate initially, while later it converges more and more slowly toward the true objective function value.

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function.

Figure 14: Comparison of the performance of the different methods for the F14 Step function.

Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem, and the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HSBA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.66 | 1.33 | 1.55 | 1.40 | 1.00 | 1.33 | 1.21 | 1.22 | 1.16 | 1.13 | 1.14
F02 | 1.25E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.41 | 11.15 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F04 | 2.73E4 | 1.00 | 1.25E4 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.00 | 1.00
F05 | 5.03E5 | 5.30 | 1.33 | 1.68E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 1.00 | 2.56 | 5.47 | 4.73 | 17.81 | 9.90 | 2.89 | 6.74 | 9.90 | 2.60 | 5.57
F07 | 10.14 | 14.92 | 4.35 | 1.10 | 1.10 | 15.37 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 38.49 | 1.08 | 14.06 | 1.08 | 1.08 | 1.08 | 99.01 | 1.01 | 1.00 | 1.00 | 1.00
F09 | 285.18 | 404.58 | 1.38 | 4.74 | 1.19 | 1.19 | 1.19 | 367.91 | 1.01 | 1.00 | 1.00
F10 | 665.22 | 4.68 | 1.69E3 | 15.27 | 1.18 | 1.18 | 1.18 | 1.18 | 1.21 | 1.00 | 1.00
F11 | 20.03 | 1.00 | 9.96 | 11.54 | 39.22 | 9.96 | 9.96 | 9.96 | 9.96 | 38.97 | 8.49
F12 | 10.32 | 1.00 | 13.22 | 3.23 | 14.69 | 2.79 | 2.79 | 2.79 | 2.79 | 2.79 | 19.31
F13 | 1.46 | 4.15 | 2.84 | 4.47 | 1.15 | 1.56 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F14 | 527.52 | 13.57 | 3.95 | 1.01 | 1.15 | 12.91 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Total | 1 | 4 | 2 | 2 | 3 | 3 | 5 | 6 | 7 | 10 | 10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, pulse rate r, harmony memory consideration rate HMCR, and pitch adjustment rate PAR on the performance of HSBA, we carried out this experiment on the benchmark problems with different parameter values. All other parameter settings were kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5-12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, while Tables 6, 8, 10, and 12 show the average minima found by the HSBA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs the best with the given parameter value. A sketch of how such a parameter sweep can be scripted is given below.
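The following sketch reuses the hypothetical hsba and ackley routines sketched earlier (run counts follow the paper; everything else is an assumption):

    import numpy as np

    # Sweep the loudness A over 0, 0.1, ..., 1.0 as in Tables 5 and 6.
    lb, ub = np.full(20, -32.768), np.full(20, 32.768)   # F01 Ackley bounds
    for A in np.linspace(0.0, 1.0, 11):
        runs = [hsba(ackley, lb, ub, A=A, max_gen=50, seed=s)[1]
                for s in range(100)]                      # 100 Monte Carlo runs
        print(f"A = {A:.1f}: best = {min(runs):.4g}, mean = {np.mean(runs):.4g}")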

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can obviously be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (< 0.5); (iii) however, for functions F05, F07-F10, F13, and F14, HSBA performs better with bigger A (> 0.5). In sum, HSBA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HSBA performs significantly better with bigger r (> 0.5) than with smaller r (< 0.5), and that HSBA performs the best when r is equal or very close to 0.6. So we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HSBA are better (smaller) with bigger HMCR (> 0.5) than with smaller values (< 0.5), and that most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HSBA vary little with increasing PAR, and that HSBA reaches the optimum/minimum on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.

4.3. Discussion. In HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four parameters govern this process: the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR).


Table 6: Mean normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 1.52 | 1.00 | 1.46 | 1.17 | 1.08 | 1.37 | 1.88 | 1.65 | 1.67 | 1.79 | 1.48
F03 | 6.69 | 10.41 | 1.06 | 1.01 | 1.03 | 1.03 | 1.01 | 1.02 | 1.05 | 1.00 | 1.00
F04 | 201.31 | 1.26 | 4.10 | 4.21 | 3.33 | 2.72 | 1.00 | 5.19 | 2.91 | 1.00 | 3.11
F05 | 591.22 | 16.82 | 4.25 | 5.87 | 1.01 | 4.14 | 24.04 | 1.00 | 3.68 | 7.33 | 9.00
F06 | 1.00 | 4.90E4 | 637.99 | 258.71 | 4.95E4 | 2.18 | 2.17 | 2.17 | 2.18 | 2.17 | 2.17
F07 | 8.57 | 2.21E6 | 6.43 | 272.34 | 110.48 | 2.11E4 | 1.02 | 1.01 | 1.00 | 1.00 | 1.00
F08 | 49.80 | 1.14E4 | 1.72E4 | 161.60 | 224.59 | 91.19 | 1.74E4 | 1.03 | 1.11 | 1.00 | 1.00
F09 | 82.37 | 1.73E6 | 1.06 | 3.14 | 74.45 | 103.10 | 42.26 | 7.94E3 | 1.00 | 1.37 | 1.29
F10 | 90.31 | 1.45E3 | 2.45E5 | 2.32E3 | 23.34 | 22.34 | 31.38 | 12.89 | 2.34E3 | 1.40 | 1.00
F11 | 4.98 | 1.00 | 3.15E4 | 2.33 | 14.41 | 520.23 | 443.65 | 616.26 | 249.91 | 4.78E4 | 2.14
F12 | 3.69 | 2.10E4 | 4.57E4 | 1.00 | 2.12E4 | 266.35 | 233.13 | 198.80 | 276.14 | 112.01 | 2.14E4
F13 | 1.90 | 8.25 | 4.48E6 | 2.14E6 | 1.00 | 6.15 | 254.51 | 222.75 | 189.96 | 263.86 | 107.01
F14 | 66.91 | 1.95E5 | 1.17E4 | 1.24E3 | 21.04 | 1.86E3 | 29.40 | 23.92 | 1.00 | 18.35 | 24.75
Total | 1 | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 4 | 5 | 5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.84 | 1.84 | 4.84 | 1.59 | 1.71 | 1.00 | 1.34 | 1.09 | 1.16 | 1.01
F02 | 9.44E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.73 | 5.46 | 1.86 | 1.86 | 1.86 | 1.86 | 1.00 | 1.72 | 1.21 | 1.61 | 1.01
F04 | 11.42 | 1.11 | 24.29 | 1.23 | 1.26 | 1.00 | 1.29 | 1.29 | 1.29 | 1.29 | 1.29
F05 | 1.77E5 | 2.30 | 1.15 | 1.09E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 62.44 | 29.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 8.31 | 4.48 | 2.58 | 1.29 | 1.29 | 3.49 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 9.61 | 1.18 | 4.20 | 1.37 | 1.37 | 1.37 | 6.95 | 1.01 | 1.01 | 1.00 | 1.00
F09 | 373.94 | 444.22 | 1.68 | 3.35 | 1.68 | 1.68 | 1.00 | 291.98 | 1.01 | 1.01 | 1.01
F10 | 386.93 | 3.16 | 13.97 | 4.63 | 1.58 | 1.58 | 1.00 | 1.58 | 309.31 | 1.58 | 1.01
F11 | 42.55 | 1.00 | 18.29 | 21.35 | 20.18 | 11.11 | 19.04 | 21.35 | 15.81 | 18.76 | 11.55
F12 | 114.40 | 1.00 | 141.84 | 44.48 | 125.47 | 44.48 | 44.48 | 44.48 | 44.48 | 44.48 | 69.43
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.00 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 120.52 | 4.04 | 2.33 | 1.00 | 1.16 | 3.40 | 1.00 | 1.00 | 1.16 | 1.16 | 1.16
Total | 0 | 3 | 1 | 2 | 2 | 4 | 8 | 5 | 4 | 4 | 4

The appropriate updating of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balances the exploration and exploitation behavior of each bat: the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions considered, HSBA has been demonstrated to perform better than, or at least equal to, the standard BA and other acclaimed state-of-the-art population-based algorithms, with HSBA performing significantly better on some functions. HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search via the harmony search algorithm and the global search via the bat algorithm concurrently. A similar behavior may be achieved in PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include simplicity and ease of implementation, with only four parameters to regulate. The work carried out here demonstrates HSBA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section; different tuning


Table 8: Mean normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.99 | 1.86 | 1.94 | 1.59 | 1.00 | 1.42 | 1.34 | 1.09 | 1.16 | 1.71
F02 | 3.31 | 5.38 | 4.57 | 3.08 | 3.82 | 1.14 | 1.00 | 2.89 | 2.79 | 4.02 | 3.11
F03 | 3.73 | 5.46 | 5.33 | 2.29 | 3.36 | 2.41 | 1.00 | 1.72 | 1.21 | 1.61 | 1.36
F04 | 11.42 | 1.11 | 24.29 | 1.49E4 | 1.13E3 | 3.94E3 | 2.78E4 | 19.66 | 175.22 | 1.14 | 1.00
F05 | 16.09 | 128.62 | 32.69 | 24.46 | 64.40 | 12.77 | 29.59 | 76.17 | 4.56 | 1.00 | 4.46
F06 | 62.44 | 26.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 3.30 | 1.78 | 1.34 | 1.03 | 1.23 | 1.39 | 1.55 | 1.00 | 1.19 | 1.36 | 3.49
F08 | 4.88 | 3.93 | 3.94 | 1.94 | 2.19 | 3.83 | 3.53 | 4.33 | 1.00 | 2.27 | 3.14
F09 | 1.39 | 1.66 | 1.14 | 1.00 | 1.21 | 1.15 | 1.15 | 1.09 | 1.08 | 1.04 | 1.09
F10 | 7.21 | 9.94 | 8.75 | 3.60 | 8.46 | 6.11 | 4.83 | 4.14 | 5.76 | 1.00 | 2.71
F11 | 3.82 | 4.23 | 3.73 | 2.05 | 1.81 | 1.00 | 1.71 | 2.20 | 1.42 | 1.68 | 1.89
F12 | 1.64 | 2.17 | 2.04 | 1.47 | 1.80 | 1.86 | 1.00 | 1.21 | 1.13 | 1.34 | 1.56
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.90 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 5.74 | 15.43 | 7.42 | 6.82 | 6.44 | 3.17 | 1.00 | 4.85 | 2.21 | 1.74 | 2.09
Total | 0 | 0 | 0 | 1 | 0 | 2 | 4 | 2 | 2 | 2 | 1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.51 | 1.28 | 1.00 | 1.63
F02 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 19.82 | 26.49 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 2.06 | 1.00
F04 | 1.26E5 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F05 | 6.07E5 | 5.34 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 108.92 | 246.31 | 365.04 | 345.40 | 338.57 | 234.65 | 143.49 | 45.28 | 23.30 | 1.00 | 25.75
F07 | 9.22 | 14.28 | 5.34 | 1.00 | 1.00 | 9.30 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 18.10 | 1.08 | 7.72 | 1.08 | 1.08 | 1.08 | 35.94 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 280.04 | 391.61 | 1.27 | 6.81 | 1.27 | 1.27 | 1.27 | 227.95 | 1.00 | 1.00 | 1.00
F10 | 551.65 | 8.76 | 1.76E3 | 11.70 | 1.64 | 1.64 | 1.64 | 1.64 | 274.48 | 1.00 | 1.00
F11 | 14.67 | 1.00 | 6.18 | 6.18 | 13.99 | 6.18 | 6.18 | 6.18 | 4.40 | 2.69 | 6.16
F12 | 7.68 | 1.00 | 11.10 | 2.73 | 10.21 | 2.73 | 2.73 | 2.73 | 2.73 | 2.73 | 4.73
F13 | 7.72 | 26.75 | 15.90 | 17.76 | 6.12 | 10.40 | 6.12 | 5.95 | 2.55 | 1.00 | 2.25
F14 | 537.16 | 14.28 | 5.34 | 1.00 | 1.00 | 7.13 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Total | 0 | 4 | 2 | 4 | 5 | 3 | 5 | 6 | 7 | 11 | 9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.00 | 1.01
F02 | 747.56 | 7.87 | 5.25 | 3.43 | 5.09 | 7.10 | 3.07 | 1.87 | 1.60 | 1.00 | 1.01
F03 | 9.13 | 3.12E4 | 1.09 | 1.07 | 1.05 | 1.06 | 1.05 | 1.02 | 1.01 | 1.01 | 1.00
F04 | 2.10E6 | 1.00E4 | 3.95E4 | 9.48E3 | 5.97E3 | 2.95E3 | 1.51E3 | 1.50E3 | 46.94 | 1.00 | 2.37
F05 | 3.24E6 | 3.27E4 | 2.09E4 | 4.14E4 | 7.13E3 | 1.54E3 | 8.93E3 | 1.53E3 | 629.48 | 1.00 | 203.73
F06 | 1.00 | 5.94E4 | 422.04 | 632.15 | 6.00E4 | 1.92 | 1.91 | 1.91 | 1.91 | 1.91 | 1.91
F07 | 10.63 | 2.07E6 | 9.00 | 216.72 | 324.55 | 3.07E4 | 1.04 | 1.03 | 1.02 | 1.01 | 1.00
F08 | 59.50 | 9.88E3 | 3.04E4 | 141.80 | 217.01 | 324.68 | 3.07E4 | 1.07 | 1.07 | 1.03 | 1.00
F09 | 226.87 | 4.95E6 | 3.08 | 8.91 | 114.22 | 173.60 | 259.41 | 2.45E4 | 1.45 | 1.00 | 1.71
F10 | 257.37 | 2.83E4 | 9.35E5 | 1.37E4 | 97.09 | 66.56 | 99.25 | 149.40 | 1.39E4 | 1.00 | 2.73
F11 | 6.07 | 1.00 | 1.83E4 | 2.02 | 16.61 | 390.40 | 262.99 | 403.00 | 603.61 | 5.72E4 | 1.84
F12 | 3.00 | 2.79E4 | 3.60E4 | 1.00 | 2.82E4 | 270.95 | 194.14 | 130.77 | 200.40 | 300.16 | 2.84E4
F13 | 2.32 | 9.66 | 5.61E6 | 1.89E6 | 1.00 | 8.15 | 267.78 | 191.86 | 129.23 | 198.05 | 296.65
F14 | 184.66 | 3.84E5 | 1.17E4 | 1.85E3 | 1.00 | 5.71E3 | 25.11 | 55.39 | 39.60 | 26.57 | 41.49
Total | 1 | 1 | 0 | 1 | 2 | 0 | 0 | 0 | 0 | 6 | 3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 3.91E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 1.00 | 4.25 | 1.51 | 2.20 | 2.20 | 2.12 | 2.20 | 1.99 | 2.20 | 1.28 | 1.80
F04 | 1.00 | 2.55 | 65.54 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F05 | 2.52 | 1.00 | 1.71 | 1.69E3 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71
F06 | 1.00 | 59.87 | 34.18 | 22.85 | 46.52 | 23.44 | 43.54 | 44.84 | 36.88 | 18.99 | 8.12
F07 | 8.07 | 1.00 | 1.48 | 2.55 | 2.55 | 13.06 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F08 | 5.43 | 1.00 | 1.92 | 1.00 | 1.00 | 1.00 | 11.45 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 76.72 | 2.52 | 1.17 | 1.00 | 1.71 | 1.71 | 1.71 | 155.56 | 1.71 | 1.71 | 1.71
F10 | 929.10 | 1.48 | 1.00 | 4.91 | 2.55 | 2.55 | 2.55 | 2.22 | 2.35E3 | 2.55 | 2.55
F11 | 3.18E3 | 1.00 | 5.82E3 | 3.99E3 | 3.39E3 | 5.82E3 | 4.91E3 | 5.82E3 | 5.82E3 | 9.58E3 | 5.82E3
F12 | 305.06 | 1.00 | 537.28 | 97.34 | 187.58 | 97.34 | 97.34 | 97.34 | 97.34 | 97.34 | 398.27
F13 | 1.92 | 4.22 | 5.87 | 10.07 | 12.98 | 4.66 | 1.48 | 4.01 | 3.17 | 1.00 | 4.09
F14 | 88.12 | 1.00 | 1.48 | 2.55 | 2.55 | 4.91 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
Total | 4 | 8 | 3 | 4 | 3 | 3 | 2 | 3 | 3 | 4 | 3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA method for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes for the newly generated bat only if it has better fitness. This selection scheme is rather greedy, and it often outperforms the original HS and BA. HSBA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA makes good use of the information in past solutions to frequently generate better-quality solutions, when compared to the other population-based


Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 4.07 | 1.00 | 1.00 | 4.52 | 2.77 | 2.57 | 3.41 | 5.36 | 3.69 | 4.89 | 1.03
F03 | 1.00 | 1.27E4 | 1.34 | 1.35 | 1.35 | 1.35 | 1.34 | 1.35 | 1.34 | 1.34 | 1.34
F04 | 1.15 | 81.27 | 9.39E3 | 1.00 | 1.01 | 1.01 | 1.29 | 1.00 | 1.00 | 1.01 | 1.01
F05 | 345.73 | 50.61 | 86.19 | 9.07E3 | 25.01 | 1.08 | 1.01 | 1.91 | 3.33 | 1.00 | 1.18
F06 | 1.00 | 5.42E6 | 4.27E4 | 4.74E4 | 5.48E6 | 577.88 | 577.88 | 577.88 | 577.88 | 577.87 | 577.87
F07 | 4.86 | 1.53 | 1.00 | 95.43 | 105.96 | 1.22E4 | 1.34 | 1.36 | 1.32 | 1.33 | 1.35
F08 | 12.69 | 76.00 | 8.76E3 | 17.03 | 69.13 | 76.72 | 8.85E3 | 1.10 | 1.05 | 1.04 | 1.00
F09 | 63.45 | 230.47 | 1.82 | 1.67 | 12.78 | 48.60 | 53.49 | 6.09E3 | 1.11 | 1.30 | 1.00
F10 | 133.55 | 12.51 | 1.26 | 1.97E3 | 11.48 | 4.64 | 16.45 | 18.93 | 1.99E3 | 1.00 | 1.88
F11 | 63.02 | 1.00 | 6.39E3 | 79.44 | 59.24 | 3.96E3 | 1.42E3 | 5.81E3 | 6.45E3 | 7.45E5 | 79.81
F12 | 3.90 | 8.81E3 | 8.90E3 | 1.00 | 8.90E3 | 44.40 | 47.78 | 17.25 | 70.11 | 77.84 | 8.99E3
F13 | 1.00 | 21.66 | 2.07E3 | 6.69 | 5.80 | 4.30 | 271.67 | 282.46 | 105.39 | 429.26 | 476.62
F14 | 46.14 | 1.00 | 36.10 | 55.93 | 1.22 | 6.40E3 | 42.15 | 32.89 | 34.81 | 12.72 | 51.29
Total | 4 | 4 | 3 | 3 | 1 | 1 | 1 | 2 | 2 | 3 | 3

optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that HSBA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional ($D \ge 20$) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work includes adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HSBA to solve practical engineering optimization problems, where HSBA can clearly become a fascinating method for real-world engineering optimization; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant No. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser fund under Grant No. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.
[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170-204, Pitman, London, UK, 1987.
[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163-168, 1963.
[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11-39, 1981.
[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.
[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.
[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431-451, 2010.
[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645-665, 2010.
[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123-144, 2012.
[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1-8, 2012.
[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550-564, 2012.
[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.
[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.
[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86-96, 2012.
[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.
[19] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4-31, 2011.
[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, Perth, Australia, December 1995.
[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327-340, 2013.
[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.
[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.
[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60-68, 2001.
[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.
[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.
[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464-483, 2012.
[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.
[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1-16, 2011.
[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134-137, 2012.
[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.
[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.
[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.
[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671-680, 1983.
[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.
[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683-691, 1998.
[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86-92, 1940.
[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859-1883, 2007.
[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large-scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.
[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101-117, 2010.
[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252-3259, 2011.
[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830-848, 2010.
[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482-500, 2012.
[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.
[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735-3742, 2011.
[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.


4 Journal of Applied Mathematics

Begin
  Step 1: Set the parameters and initialize the HM.
  Step 2: Evaluate the fitness for each individual in the HM.
  Step 3: while the halting criterion is not satisfied do
    for d = 1 to D do
      if rand < HMCR then  (memory consideration)
        x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
        if rand < PAR then  (pitch adjustment)
          x_new(d) = x_old(d) + bw × (2 × rand − 1)
        endif
      else  (random selection)
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      endif
    endfor d
    Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective)
    Update the best harmony vector found so far
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 1: Harmony search algorithm.
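To make the improvisation step above concrete, the following short Python sketch (an illustration written for this presentation, not the authors' original implementation; the names hm, f, lb, ub and the bandwidth default bw = 0.2 are our own assumptions) performs one HS improvisation followed by the greedy replacement of the worst harmony:

    import numpy as np

    def improvise(hm, f, lb, ub, hmcr=0.95, par=0.1, bw=0.2):
        # hm: harmony memory, shape (HMS, D); f: objective; lb, ub: bound arrays
        hms, dim = hm.shape
        x_new = np.empty(dim)
        for d in range(dim):
            if np.random.rand() < hmcr:                    # memory consideration
                x_new[d] = hm[np.random.randint(hms), d]
                if np.random.rand() < par:                 # pitch adjustment
                    x_new[d] += bw * (2.0 * np.random.rand() - 1.0)
            else:                                          # random selection
                x_new[d] = lb[d] + np.random.rand() * (ub[d] - lb[d])
        x_new = np.clip(x_new, lb, ub)
        # replace the worst harmony if the new one is better (minimization)
        worst = int(np.argmax([f(x) for x in hm]))
        if f(x_new) < f(hm[worst]):
            hm[worst] = x_new
        return hm

Each decision variable is inherited from the memory with probability HMCR, perturbed with probability PAR, and otherwise re-sampled at random, mirroring Algorithm 1.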

Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define the loudness A_i, the pulse frequency Q_i, and the initial velocities v_i (i = 1, 2, ..., NP); set the pulse rate r_i.
  Step 2: While the termination criterion is not satisfied or t < MaxGeneration do
    Generate new solutions by adjusting the frequency and updating the velocities and locations/solutions [(4)-(6)]
    if (rand > r_i) then
      Select a solution among the best solutions
      Generate a local solution around the selected best solution
    end if
    Generate a new solution by flying randomly
    if (rand < A_i and f(x_i) < f(x*)) then
      Accept the new solutions
      Increase r_i and reduce A_i
    end if
    Rank the bats and find the current best x*
    t = t + 1
  Step 3: end while
  Step 4: Post-process the results and visualization.
End

Algorithm 2: Bat algorithm.

in a D-dimensional search space. The new solutions x_i^t and velocities v_i^t at time step t are given by

  f_i = f_min + (f_max − f_min) β,  (4)
  v_i^t = v_i^{t−1} + (x_i^t − x*) f_i,  (5)
  x_i^t = x_i^{t−1} + v_i^t,  (6)

where β ∈ [0, 1] is a random vector drawn from a uniform distribution. Here x* is the current global best location (solution), which is located after comparing all the solutions among all the n bats. Generally speaking, depending on the domain size of the problem of interest, the frequency f is assigned to f_min = 0 and f_max = 100 in practical implementations. Initially, each bat is randomly given a frequency which is drawn uniformly from [f_min, f_max].

For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk:

  x_new = x_old + ε A^t,  (7)

where ε ∈ [−1, 1] is a random scaling factor, while A^t = ⟨A_i^t⟩ is the average loudness of all the bats at time step t.

The update of the velocities and positions of bats is similar to the procedure in the standard PSO [21], as f_i essentially controls the pace and range of the movement of the swarming particles. To some degree, BA can be considered as a balanced combination of the standard PSO and an intensive local search controlled by the loudness and pulse rate.

Furthermore, the loudness A_i and the rate r_i of pulse emission are updated as the iterations proceed, as shown in (8):

  A_i^{t+1} = α A_i^t,
  r_i^{t+1} = r_i^0 [1 − exp(−γt)],  (8)

where α and γ are constants. In essence, α is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set α = γ = 0.9 in this work.
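As a concrete illustration of the update rules (4)-(8), here is a minimal Python sketch of one BA generation. It is our own hedged reading of the equations, not the paper's implementation: the array shapes, the clipping to the bounds, and the initial pulse rate r0 = 0.5 are assumptions.

    import numpy as np

    def bat_step(x, v, x_best, f, A, r, t, lb, ub,
                 f_min=0.0, f_max=100.0, alpha=0.9, gamma=0.9, r0=0.5):
        n, dim = x.shape
        beta = np.random.rand(n, 1)                     # beta in [0, 1], Eq. (4)
        freq = f_min + (f_max - f_min) * beta           # Eq. (4)
        v = v + (x - x_best) * freq                     # Eq. (5)
        x_new = np.clip(x + v, lb, ub)                  # Eq. (6)
        for i in range(n):
            if np.random.rand() > r[i]:                 # local random walk, Eq. (7)
                eps = np.random.uniform(-1.0, 1.0, dim)
                x_new[i] = np.clip(x_best + eps * A.mean(), lb, ub)
            if np.random.rand() < A[i] and f(x_new[i]) < f(x_best):
                x[i] = x_new[i]                         # accept the new solution
                A[i] *= alpha                           # Eq. (8): loudness decreases
                r[i] = r0 * (1.0 - np.exp(-gamma * t))  # Eq. (8): pulse rate increases
        i_best = int(np.argmin([f(xi) for xi in x]))    # rank the bats, keep the best
        return x, v, x[i_best].copy(), A, r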

3. Our Approach: HS/BA

Based on the introduction of HS and BA in the previous section, this section explains how we combine the two approaches to form the proposed bat algorithm with harmony search (HS/BA), which modifies the solutions with poor fitness in order to add diversity to the population and improve the search efficiency.

In general, the standard BA is adept at exploiting the search space, but at times it may become trapped in local optima, so that it cannot perform the global search well. For BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA and thus avoid trapping into local optima, a main improvement is made to BA by adding the pitch adjustment operation of HS, serving as a mutation operator, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, this hybrid metaheuristic algorithm, obtained by inducing the pitch adjustment operation of HS as a mutation operator into the bat algorithm and hence called the harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve the original BA by generating a new solution for each bat. In this way, the method can explore new regions of the search space through the mutation of the HS algorithm while exploiting the population information with BA, and it can therefore avoid the entrapment in local optima that affects BA. In the following, we present the HS/BA algorithm, which is an improvement of HS and BA.

The critical operator of HS/BA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. Firstly, poor solutions can absorb many new useful features from good solutions. Secondly, the mutation operator can improve the exploration of the new search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of BA can be fully developed.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. The main improvement of adding a mutation operator to BA comprises three minor improvements, which are made with the aim of speeding up convergence, thus making the method more practical for a wider range of applications without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency f and loudness A instead of the varying frequency f_i and loudness A_i^t. Similarly to BA, in HS/BA each bat is defined by its position x_i^t, velocity v_i^t, and emission pulse rate r_i^t, together with the fixed frequency f and loudness A, in a d-dimensional search space. The new solutions x_i^t and velocities v_i^t at time step t are given by

  v_i^t = v_i^{t−1} + (x_i^t − x*) f,
  x_i^t = x_i^{t−1} + v_i^t,  (9)

where x* is the current global best location (solution), which is located after comparing all the solutions among all the n bats. In our experiments, we set f = 0.5. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate r to 0.6 and the loudness A to 0.95 produced the best results.

The second improvement is to add a mutation operator in an attempt to increase the diversity of the population, improve the search efficiency, and speed up the convergence to optima. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using the random walk (7) when ξ is larger than the pulse rate r, that is, ξ > r, where ξ ∈ [0, 1] is a random real number drawn from a uniform distribution; when ξ ≤ r, we instead use the pitch adjustment operation of HS, serving as a mutation operator, to update the new solution and increase the diversity of the population, as shown in (3). Through testing the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results.

The last improvement is the addition of an elitism scheme to HS/BA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. We use an elitism approach that saves the bats holding the best solutions in the HS/BA process, so that even if the pitch adjustment operation ruins a corresponding bat, we have saved it and can revert to it if needed.
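A minimal sketch of such an elitism step, in Python with our own naming (pop, fitness, and keep standing in for the bat population, its objective values, and the KEEP parameter), might look as follows:

    import numpy as np

    def apply_elitism(pop, fitness, elite_pop, elite_fit, keep=2):
        # pop: (NP, D) bat positions; fitness: objective values (minimization)
        order = np.argsort(fitness)
        if elite_pop is not None:
            # overwrite the KEEP worst bats with the elites saved before mutation
            pop[order[-keep:]] = elite_pop
            fitness[order[-keep:]] = elite_fit
            order = np.argsort(fitness)
        # save the current KEEP best bats for the next generation
        return pop[order[:keep]].copy(), fitness[order[:keep]].copy()

Calling this once per generation (with elite_pop = None on the first call) restores the saved elites after mutation may have corrupted them.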

Based on the above analyses, the main framework of the harmony search/bat algorithm (HS/BA) is presented in Algorithm 3.


Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population of NP bats P randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the maximum number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f for each bat in P, determined by the objective function f(x).
  Step 3: While the termination criterion is not satisfied or t < MaxGeneration do
    Sort the population of bats P from best to worst by order of quality f for each bat.
    Store the KEEP best bats as KEEPBAT.
    for i = 1 to NP (all bats) do
      v_i^t = v_i^{t−1} + (x_i^t − x*) × Q
      x_i^t = x_i^{t−1} + v_i^t
      if (rand > r) then
        x_u^t = x* + α ε^t
      end if
      for j = 1 to D (all elements) do  (Mutate)
        if (rand < HMCR) then
          r1 = ⌈NP × rand⌉
          x_v(j) = x_{r1}(j), where r1 ∈ (1, 2, ..., HMS)
          if (rand < PAR) then
            x_v(j) = x_v(j) + bw × (2 × rand − 1)
          endif
        else
          x_v(j) = x_min,j + rand × (x_max,j − x_min,j)
        endif
      endfor j
      Evaluate the fitness of the offspring x_u^t, x_i^t, and x_v^t.
      Select the offspring x_k^t with the best fitness among x_u^t, x_i^t, and x_v^t.
      if (rand < A) then
        x_{r1}^t = x_k^t
      end if
      Replace the KEEP worst bats with the KEEP best bats stored in KEEPBAT.
    end for i
    t = t + 1
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HS/BA.
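For readers who prefer code to pseudocode, the following compact Python sketch summarizes one possible reading of Algorithm 3 (fixed f, A, and r as described above). It is an illustrative reconstruction, not the authors' MATLAB implementation: the function name hsba, the bound handling, the local-walk scaling by the fixed loudness, the bandwidth default, and the choice to have the best offspring replace bat i directly are all our own assumptions.

    import numpy as np

    def hsba(f, lb, ub, n_bats=50, max_gen=50, freq=0.5, A=0.95, r=0.6,
             hmcr=0.95, par=0.1, bw=0.2, keep=2):
        dim = len(lb)
        x = lb + np.random.rand(n_bats, dim) * (ub - lb)   # bat positions
        v = np.zeros((n_bats, dim))                        # bat velocities
        fit = np.array([f(xi) for xi in x])
        for t in range(1, max_gen + 1):
            order = np.argsort(fit)
            elite_x, elite_f = x[order[:keep]].copy(), fit[order[:keep]].copy()
            x_best = x[order[0]].copy()
            for i in range(n_bats):
                v[i] = v[i] + (x[i] - x_best) * freq       # Eq. (9)
                x_i = np.clip(x[i] + v[i], lb, ub)
                x_u = x_i.copy()
                if np.random.rand() > r:                   # local walk around the best
                    x_u = np.clip(x_best + A * np.random.uniform(-1, 1, dim), lb, ub)
                x_v = x_i.copy()                           # HS-style mutation
                for j in range(dim):
                    if np.random.rand() < hmcr:
                        x_v[j] = x[np.random.randint(n_bats), j]
                        if np.random.rand() < par:
                            x_v[j] += bw * (2 * np.random.rand() - 1)
                    else:
                        x_v[j] = lb[j] + np.random.rand() * (ub[j] - lb[j])
                x_v = np.clip(x_v, lb, ub)
                cands = [x_u, x_i, x_v]
                k = int(np.argmin([f(c) for c in cands]))  # best offspring
                if np.random.rand() < A:                   # accept with loudness A
                    x[i], fit[i] = cands[k], f(cands[k])
            order = np.argsort(fit)                        # elitism: restore saved bats
            x[order[-keep:]], fit[order[-keep:]] = elite_x, elite_f
        return x[np.argmin(fit)], fit.min()

A call such as hsba(f, lb, ub), with f any objective mapping a vector to a scalar and lb, ub NumPy bound arrays, returns the best bat found and its objective value.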

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HS/BA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HS/BA. In order to explore the benefits of HS/BA, in this subsection we compare its performance on global numerical optimization problems with nine other population-based optimization methods: ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems which is based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a powerful and efficient evolutionary algorithm developed for global optimization, inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but excellent optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolution strategy) [40] is an algorithm that generally distributes equal importance to mutation and recombination and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm


intelligence algorithm, based on the swarm behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness A = 0.95, pulse rate r = 0.6, scaling factor ε = 0.1, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA we set the parameters as follows. For ACO: initial pheromone value τ0 = 1E−6, pheromone update constant Q = 20, exploration constant q0 = 1, global pheromone decay rate ρg = 0.9, local pheromone decay rate ρl = 0.5, pheromone sensitivity s = 1, and visibility sensitivity β = 5. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor F = 0.5 and a crossover constant CR = 0.5. For the ES: the number of offspring λ = 10 produced in each generation and standard deviation σ = 1 for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO, we set an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.
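For reference, the shared HS/BA settings above can be collected in a single configuration object; a small illustrative Python snippet (the structure and names are ours, the values are those stated in the text):

    # shared settings used for HS, BA, and HS/BA in these experiments
    HSBA_PARAMS = dict(
        loudness=0.95,    # A
        pulse_rate=0.6,   # r
        scaling=0.1,      # epsilon
        hmcr=0.95,        # harmony memory consideration rate
        par=0.1,          # pitch adjustment rate
    )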

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Based on mathematical functions, benchmark functions can be applied as objective functions to perform such tests. The properties of these benchmark functions can be obtained directly from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e., D = 20).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of best solutions in the search space: unimodal benchmark functions have only one optimum, the global optimum, whereas multimodal benchmark functions have at least two optima in their search space, that is, one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].
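As an illustration of how such benchmarks are coded, here are Python transcriptions of four of the functions in Table 1 (these follow the standard definitions and are vectorized with NumPy; they are our own illustrative code):

    import numpy as np

    def sphere(x):                                  # F13
        return np.sum(x ** 2)

    def step(x):                                    # F14
        return 6 * len(x) + np.sum(np.floor(x))

    def griewank(x):                                # F03
        i = np.arange(1, len(x) + 1)
        return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

    def ackley(x):                                  # F01
        n = len(x)
        return (20.0 + np.e
                - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                - np.exp(np.sum(np.cos(2 * np.pi * x)) / n))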

We set population size NP = 50, an elitism parameter Keep = 2, and maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to obtain representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the minima found by each algorithm, averaged over 100 Monte Carlo runs, while Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.
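The row normalization used in Tables 3 and 4 simply rescales each benchmark's row by its smallest entry, so that the best algorithm scores 1.00; a one-function Python sketch of this step (our own code):

    import numpy as np

    def normalize_rows(m):
        # scale each row so that its minimum becomes 1.00
        return m / m.min(axis=1, keepdims=True)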

From Table 3, we see that, on average, HS/BA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06-F11, and F13-F14). ACO is the second most effective, performing the best on benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HS/BA performs the best on thirteen of the fourteen benchmarks (F02-F14), while BBO performs the best at finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant (P = 1.66 × 10^−17 and P = 7.25 × 10^−17, resp.) at the confidence level of 5%.
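A Friedman test of this kind can be reproduced with standard tooling; for example, the following hedged SciPy sketch (the results file name is hypothetical) assumes a matrix with one row per benchmark and one column per algorithm:

    from scipy import stats
    import numpy as np

    # minima: (14 benchmarks) x (10 algorithms) matrix of function minima
    minima = np.loadtxt("mean_minima.txt")   # hypothetical results file
    stat, p = stats.friedmanchisquare(*minima.T)
    print(f"Friedman chi-square = {stat:.3f}, p = {p:.3e}")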

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section; the results are shown in the last row of Table 3. BA was the quickest optimization method, and HS/BA was the third fastest of the ten algorithms. However, it should be noted that in the vast majority of real-world applications, it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in Figures 1-14, which represent the optimization process. The values shown in Figures 1-14 are the mean objective function optima achieved from 100 Monte Carlo simulations, that is, the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods when the F01 Ackley function is applied. From Figure 1, we can clearly conclude that BBO is significantly superior to all the other algorithms, including HS/BA, during the process of optimization, while HS/BA performs the second best on this multimodal benchmark function. Here, all the algorithms show almost the same starting point, and HS/BA has a faster convergence rate than the other algorithms, though it is outperformed by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to that of HS/BA. Obviously, BBO, HS/BA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HS/BA outperforms all other methods during the whole progress of optimization (except during generations 20-33, where BBO and HS/BA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HS/BA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HS/BA shows a faster convergence rate initially than ACO; however, it is outperformed


Table 1: Benchmark functions.

No.   Name                 Definition                                                                                      Source
F01   Ackley               f(x) = 20 + e − 20·exp(−0.2·√((1/n) Σ_{i=1}^{n} x_i²)) − exp((1/n) Σ_{i=1}^{n} cos(2πx_i))      [3]
F02   Fletcher-Powell      f(x) = Σ_{i=1}^{n} (A_i − B_i)², A_i = Σ_{j=1}^{n} (a_ij sin α_j + b_ij cos α_j), B_i = Σ_{j=1}^{n} (a_ij sin x_j + b_ij cos x_j)   [4]
F03   Griewank             f(x) = Σ_{i=1}^{n} x_i²/4000 − Π_{i=1}^{n} cos(x_i/√i) + 1                                      [5]
F04   Penalty #1           f(x) = (π/30)·{10 sin²(πy_1) + Σ_{i=1}^{n−1} (y_i − 1)²·[1 + 10 sin²(πy_{i+1})] + (y_n − 1)²} + Σ_{i=1}^{n} u(x_i, 10, 100, 4), y_i = 1 + 0.25(x_i + 1)   [6]
F05   Penalty #2           f(x) = 0.1·{sin²(3πx_1) + Σ_{i=1}^{n−1} (x_i − 1)²·[1 + sin²(3πx_{i+1})] + (x_n − 1)²·[1 + sin²(2πx_n)]} + Σ_{i=1}^{n} u(x_i, 5, 100, 4)   [6]
F06   Quartic with noise   f(x) = Σ_{i=1}^{n} (i·x_i⁴ + U(0, 1))                                                           [7]
F07   Rastrigin            f(x) = 10·n + Σ_{i=1}^{n} (x_i² − 10·cos(2πx_i))                                                [8]
F08   Rosenbrock           f(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (x_i − 1)²]                                        [7]
F09   Schwefel 2.26        f(x) = 418.9829·D − Σ_{i=1}^{D} x_i sin(|x_i|^{1/2})                                            [9]
F10   Schwefel 1.2         f(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)²                                                           [10]
F11   Schwefel 2.22        f(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|                                                    [10]
F12   Schwefel 2.21        f(x) = max_i |x_i|, 1 ≤ i ≤ n                                                                   [6]
F13   Sphere               f(x) = Σ_{i=1}^{n} x_i²                                                                         [7]
F14   Step                 f(x) = 6·n + Σ_{i=1}^{n} ⌊x_i⌋                                                                  [7]

*In benchmark function F02, the matrix elements a_{n×n}, b_{n×n} ∈ (−100, 100) and α_{n×1} ∈ (−π, π) are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function u(x_i, a, k, m) is given by u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k(−x_i − a)^m if x_i < −a.


Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No.   Function             lb        ub       opt   Continuity      Modality
F01   Ackley               −32.768   32.768   0     Continuous      Multimodal
F02   Fletcher-Powell      −π        π        0     Continuous      Multimodal
F03   Griewank             −600      600      0     Continuous      Multimodal
F04   Penalty #1           −50       50       0     Continuous      Multimodal
F05   Penalty #2           −50       50       0     Continuous      Multimodal
F06   Quartic with noise   −1.28     1.28     1     Continuous      Multimodal
F07   Rastrigin            −5.12     5.12     0     Continuous      Multimodal
F08   Rosenbrock           −2.048    2.048    0     Continuous      Unimodal
F09   Schwefel 2.26        −512      512      0     Continuous      Multimodal
F10   Schwefel 1.2         −100      100      0     Continuous      Unimodal
F11   Schwefel 2.22        −10       10       0     Continuous      Unimodal
F12   Schwefel 2.21        −100      100      0     Continuous      Unimodal
F13   Sphere               −5.12     5.12     0     Continuous      Unimodal
F14   Step                 −5.12     5.12     0     Discontinuous   Unimodal

Table 3: Mean normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

       ACO      BA       BBO      DE       ES       GA       HS       HS/BA    PSO      SGA
F01    2.31     3.33     1.15     2.02     3.38     2.72     3.47     1.09     2.66     1.00
F02    24.58    25.82    1.58     8.94     24.35    5.45     15.69    1.00     13.96    1.33
F03    3.16     60.72    1.93     5.44     23.85    3.22     77.22    1.00     25.77    1.42
F04    1.00     3.0E38   4.0E32   5.6E33   2.7E38   3.1E32   1.4E39   2.3E32   4.1E36   9.6E31
F05    1.00     1.1E8    299.42   1.5E6    4.6E8    5.4E3    4.1E8    215.51   5.5E7    111.10
F06    489.01   6.8E3    35.32    308.29   1.8E4    274.83   1.5E4    1.00     2.5E3    10.09
F07    8.09     11.55    1.28     6.56     11.87    6.17     10.22    1.00     8.44     1.80
F08    42.25    29.01    2.29     7.59     59.99    9.05     47.85    1.00     12.04    2.15
F09    3.17     20.26    1.99     13.58    13.33    1.81     19.92    1.00     17.61    1.15
F10    1.75     3.73     1.38     2.95     4.93     1.25     4.22     1.00     2.48     1.48
F11    1.05     19.70    1.83     7.14     23.12    11.13    19.45    1.00     13.22    2.46
F12    1.86     4.03     1.00     2.99     3.91     1.92     3.74     1.38     2.38     1.12
F13    98.30    150.84   3.80     19.03    226.52   47.74    182.32   1.00     72.91    4.02
F14    7.73     120.48   3.93     13.31    102.56   11.53    146.55   1.00     63.44    3.28
Time   2.74     1.00     1.32     1.64     1.67     1.79     2.33     1.43     2.03     1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.

by ACO after 5 generations. As for the other algorithms, although slower, BBO, GA, and SGA eventually find global minima close to that of HS/BA, while BA, DE, ES, HS, and PSO fail to find the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty #1 function. From Figure 4, it is obvious that HS/BA outperforms all other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty #2 function. For this multimodal function, similarly to the F04 Penalty #1 function shown in Figure 4, HS/BA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations.

Figure 6 shows the results achieved by the ten methods for the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HS/BA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HS/BA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HS/BA gets closer and closer to the minimum, while PSO converges prematurely and becomes trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage


Table 4: Best normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

       ACO      BA       BBO      DE       ES       GA       HS       HS/BA    PSO      SGA
F01    1.85     2.31     1.00     1.43     2.25     2.03     2.30     1.05     1.91     1.04
F02    10.42    14.48    1.09     4.22     10.39    4.03     9.91     1.00     8.28     1.33
F03    2.73     46.22    1.87     4.47     21.38    8.02     43.24    1.00     16.75    1.76
F04    4.4E6    5.0E6    1.2E3    1.9E4    2.2E6    3.0E4    4.2E6    1.00     37.55    3.22
F05    3.0E4    3.2E4    51.01    386.33   1.6E4    884.46   2.6E4    1.00     41.13    4.57
F06    58.51    992.55   6.50     24.69    808.48   59.24    746.91   1.00     189.71   2.02
F07    5.71     8.53     1.25     5.13     7.94     5.23     7.47     1.00     6.00     1.68
F08    26.42    25.03    1.48     3.70     33.52    6.16     21.50    1.00     7.75     1.45
F09    2.43     8.66     1.28     4.90     5.93     2.09     7.28     1.00     7.40     1.42
F10    1.89     4.94     1.18     2.66     3.00     2.08     2.76     1.00     2.01     1.74
F11    7.74     12.61    1.21     3.36     12.14    6.01     9.60     1.00     7.81     1.62
F12    1.33     2.33     1.44     1.69     2.08     1.74     2.11     1.00     1.70     1.21
F13    28.04    56.57    2.17     5.54     60.57    19.71    52.86    1.00     20.98    2.28
F14    4.28     54.02    1.97     4.85     33.61    10.60    49.11    1.00     19.72    1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function (benchmark function value versus number of generations).

to succeed on this benchmark function within the maximum number of generations. At last, BBO, DE, and SGA converge to values very close to that of HS/BA.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can approximately divide the algorithms into three groups: one group including BBO, HS/BA, and SGA performing the best; a second group including ACO, DE,

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function (benchmark function value versus number of generations).

GA, and PSO performing the second best; and a third group including BA, ES, and HS performing the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HS/BA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, similarly to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HS/BA is outperformed by BBO between generations 10-30, it has a stable convergence rate at finding the global minimum


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function (benchmark function value versus number of generations).

Figure 4: Comparison of the performance of the different methods for the F04 Penalty #1 function (benchmark function value versus number of generations).

and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HS/BA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately

Figure 5: Comparison of the performance of the different methods for the F05 Penalty #2 function (benchmark function value versus number of generations).

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function (benchmark function value versus number of generations).

divided into three groups: one group including BBO and HS/BA performing the best; a second group including ACO, GA, PSO, and SGA performing the second best; and a third group including DE, ES, and HS performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 40 generations. At last, HS/BA reaches the optimal solution, significantly superior to the other algorithms. BBO is only inferior

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function (benchmark function value versus number of generations).

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function (benchmark function value versus number of generations).

to HS/BA and performs the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function (benchmark function value versus number of generations).

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function (benchmark function value versus number of generations).

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HS/BA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HS/BA.

Figure 14 shows the results for the F14 Step function. Apparently, HS/BA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14 for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function (benchmark function value versus number of generations).

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function (benchmark function value versus number of generations).

we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

From the above analyses of Figures 1-14, we can conclude that our proposed hybrid metaheuristic algorithm HS/BA significantly outperforms the other nine algorithms. In general, only ACO, BBO, and SGA come close to HS/BA, performing better than it on benchmarks F04-F05, benchmark F12, and benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function (benchmark function value versus number of generations).

Figure 14: Comparison of the performance of the different methods for the F14 Step function (benchmark function value versus number of generations).

convergence rate initially, while later it converges more and more slowly towards the true objective function value. Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem; the results demonstrated the good performance of BBO. It is thus also indirectly demonstrated that our proposed hybrid metaheuristic method HS/BA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

A:    0        0.1      0.2      0.3      0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01   1.66     1.33     1.55     1.40     1.00    1.33    1.21    1.22    1.16    1.13    1.14
F02   1.25E4   1.00     1.00     1.00     1.00    1.00    1.00    1.00    1.00    1.00    1.00
F03   3.41     11.15    1.00     1.00     1.00    1.00    1.00    1.00    1.00    1.00    1.00
F04   2.73E4   1.00     1.25E4   1.04     1.04    1.04    1.04    1.04    1.04    1.00    1.00
F05   5.03E5   5.30     1.33     1.68E4   1.00    1.00    1.00    1.00    1.00    1.00    1.00
F06   1.00     2.56     5.47     4.73     17.81   9.90    2.89    6.74    9.90    2.60    5.57
F07   10.14    14.92    4.35     1.10     1.10    15.37   1.01    1.00    1.00    1.00    1.00
F08   38.49    1.08     14.06    1.08     1.08    1.08    99.01   1.01    1.00    1.00    1.00
F09   285.18   404.58   1.38     4.74     1.19    1.19    1.19    367.91  1.01    1.00    1.00
F10   665.22   4.68     1.69E3   15.27    1.18    1.18    1.18    1.18    1.21    1.00    1.00
F11   20.03    1.00     9.96     11.54    39.22   9.96    9.96    9.96    9.96    38.97   8.49
F12   10.32    1.00     13.22    3.23     14.69   2.79    2.79    2.79    2.79    2.79    19.31
F13   1.46     4.15     2.84     4.47     1.15    1.56    1.00    1.00    1.00    1.00    1.00
F14   527.52   13.57    3.95     1.01     1.15    12.91   1.00    1.00    1.00    1.00    1.00
Best count:   1   4   2   2   3   3   5   6   7   10   10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, pulse rate r, harmony memory consideration rate HMCR, and pitch adjustment rate PAR on the performance of HS/BA, we carry out this experiment on the benchmark problems with different parameter values. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5-12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HS/BA algorithm over 100 Monte Carlo runs, while Tables 6, 8, 10, and 12 show the average minima found by the HS/BA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HS/BA algorithm, respectively. In each table, the last row is the total number of functions on which HS/BA performs the best with the given parameter value.

4.2.1. Loudness A. Tables 5 and 6 record the results of experiments on the benchmark problems with loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can be seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HS/BA performs better with smaller A (<0.5); (iii) however, for functions F05, F07-F10, F13, and F14, HS/BA performs better with bigger A (>0.5). In sum, HS/BA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.
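Such a parameter sweep is straightforward to script; here is a hedged Python sketch using the hsba and ackley functions sketched earlier (100 Monte Carlo runs per setting, matching the protocol above; the printed summary format is our own):

    import numpy as np

    lb, ub = -32.768 * np.ones(20), 32.768 * np.ones(20)
    for A in np.arange(0.0, 1.01, 0.1):          # A = 0, 0.1, ..., 1.0
        runs = [hsba(ackley, lb, ub, A=A)[1] for _ in range(100)]
        print(f"A = {A:.1f}: best = {min(runs):.4g}, mean = {np.mean(runs):.4g}")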

4.2.2. Pulse Rate r. Tables 7 and 8 record the results of experiments on the benchmark problems with pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HS/BA performs significantly better with bigger r (>0.5) than with smaller r (<0.5), and HS/BA performs the best when r is equal or very close to 0.6. So we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results of experiments on the benchmark problems with harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HS/BA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results of experiments on the benchmark problems with pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HS/BA vary little with increasing PAR, and HS/BA reaches the optimal minimum on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.

4.3. Discussion. In the HS/BA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR),


Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

A:    0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01   1.01     1.01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02   1.52     1.00     1.46     1.17     1.08     1.37     1.88     1.65     1.67     1.79     1.48
F03   6.69     10.41    1.06     1.01     1.03     1.03     1.01     1.02     1.05     1.00     1.00
F04   201.31   1.26     4.10     4.21     3.33     2.72     1.00     5.19     2.91     1.00     3.11
F05   591.22   16.82    4.25     5.87     1.01     4.14     24.04    1.00     3.68     7.33     9.00
F06   1.00     4.90E4   637.99   258.71   4.95E4   2.18     2.17     2.17     2.18     2.17     2.17
F07   8.57     2.21E6   6.43     272.34   110.48   2.11E4   1.02     1.01     1.00     1.00     1.00
F08   49.80    1.14E4   1.72E4   161.60   224.59   91.19    1.74E4   1.03     1.11     1.00     1.00
F09   82.37    1.73E6   1.06     3.14     74.45    103.10   42.26    7.94E3   1.00     1.37     1.29
F10   90.31    1.45E3   2.45E5   2.32E3   23.34    22.34    31.38    12.89    2.34E3   1.40     1.00
F11   4.98     1.00     3.15E4   2.33     14.41    520.23   443.65   616.26   249.91   4.78E4   2.14
F12   3.69     2.10E4   4.57E4   1.00     2.12E4   266.35   233.13   198.80   276.14   112.01   2.14E4
F13   1.90     8.25     4.48E6   2.14E6   1.00     6.15     254.51   222.75   189.96   263.86   107.01
F14   66.91    1.95E5   1.17E4   1.24E3   21.04    1.86E3   29.40    23.92    1.00     18.35    24.75
Best count:   1   2   1   2   2   1   2   2   4   5   5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

r:    0        0.1      0.2      0.3      0.4      0.5     0.6     0.7      0.8      0.9     1.0
F01   1.84     1.84     1.84     4.84     1.59     1.71    1.00    1.34     1.09     1.16    1.01
F02   9.44E3   1.00     1.00     1.00     1.00     1.00    1.00    1.00     1.00     1.00    1.00
F03   3.73     5.46     1.86     1.86     1.86     1.86    1.00    1.72     1.21     1.61    1.01
F04   11.42    1.11     24.29    1.23     1.26     1.00    1.29    1.29     1.29     1.29    1.29
F05   1.77E5   2.30     1.15     1.09E4   1.00     1.00    1.00    1.00     1.00     1.00    1.00
F06   62.44    29.96    48.58    15.60    26.34    7.80    1.82    1.00     35.20    2.39    1.55
F07   8.31     4.48     2.58     1.29     1.29     3.49    1.00    1.00     1.00     1.00    1.00
F08   9.61     1.18     4.20     1.37     1.37     1.37    6.95    1.01     1.01     1.00    1.00
F09   373.94   444.22   1.68     3.35     1.68     1.68    1.00    291.98   1.01     1.01    1.01
F10   386.93   3.16     13.97    4.63     1.58     1.58    1.00    1.58     309.31   1.58    1.01
F11   42.55    1.00     18.29    21.35    20.18    11.11   19.04   21.35    15.81    18.76   11.55
F12   114.40   1.00     141.84   44.48    125.47   44.48   44.48   44.48    44.48    44.48   69.43
F13   6.39     6.37     6.01     2.96     3.02     1.00    1.52    1.39     1.00     2.57    2.49
F14   120.52   4.04     2.33     1.00     1.16     3.40    1.00    1.00     1.16     1.16    1.16
Best count:   0   3   1   2   2   4   8   5   4   4   4

and the pitch adjustment rate (PAR). The appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions that have been considered, the HS/BA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HS/BA performing significantly better on some functions. The HS/BA performs excellently and efficiently because of its ability to simultaneously carry out a local search, while still

searching globally at the same time. It succeeds in doing this through its concurrent use of local search via the harmony search algorithm and global search via the bat algorithm. A similar behavior may be obtained in PSO by using a multiswarm derived from an initial particle population [44]. However, HS/BA's advantages include its simplicity and ease of implementation, and it has only four parameters to regulate. The work carried out here demonstrates the HS/BA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning


Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

r:    0       0.1      0.2     0.3      0.4      0.5      0.6      0.7     0.8      0.9    1.0
F01   1.84    1.99     1.86    1.94     1.59     1.00     1.42     1.34    1.09     1.16   1.71
F02   3.31    5.38     4.57    3.08     3.82     1.14     1.00     2.89    2.79     4.02   3.11
F03   3.73    5.46     5.33    2.29     3.36     2.41     1.00     1.72    1.21     1.61   1.36
F04   11.42   1.11     24.29   1.49E4   1.13E3   3.94E3   2.78E4   19.66   175.22   1.14   1.00
F05   16.09   128.62   32.69   24.46    64.40    12.77    29.59    76.17   4.56     1.00   4.46
F06   62.44   26.96    48.58   15.60    26.34    7.80     1.82     1.00    35.20    2.39   1.55
F07   3.30    1.78     1.34    1.03     1.23     1.39     1.55     1.00    1.19     1.36   3.49
F08   4.88    3.93     3.94    1.94     2.19     3.83     3.53     4.33    1.00     2.27   3.14
F09   1.39    1.66     1.14    1.00     1.21     1.15     1.15     1.09    1.08     1.04   1.09
F10   7.21    9.94     8.75    3.60     8.46     6.11     4.83     4.14    5.76     1.00   2.71
F11   3.82    4.23     3.73    2.05     1.81     1.00     1.71     2.20    1.42     1.68   1.89
F12   1.64    2.17     2.04    1.47     1.80     1.86     1.00     1.21    1.13     1.34   1.56
F13   6.39    6.37     6.01    2.96     3.02     1.90     1.52     1.39    1.00     2.57   2.49
F14   5.74    15.43    7.42    6.82     6.44     3.17     1.00     4.85    2.21     1.74   2.09
Best count:   0   0   0   1   0   2   4   2   2   2   1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

HMCR:   0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9    1.0
F01     1.64     1.64     1.64     1.64     1.64     1.64     1.64     1.51     1.28     1.00   1.63
F02     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00   1.00
F03     19.82    26.49    3.71     3.71     3.71     3.71     3.71     3.71     3.71     2.06   1.00
F04     1.26E5   1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00   1.00
F05     6.07E5   5.34     1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00   1.00
F06     108.92   246.31   365.04   345.40   338.57   234.65   143.49   45.28    23.30    1.00   25.75
F07     9.22     14.28    5.34     1.00     1.00     9.30     1.00     1.00     1.00     1.00   1.00
F08     18.10    1.08     7.72     1.08     1.08     1.08     35.94    1.00     1.00     1.00   1.00
F09     280.04   391.61   1.27     6.81     1.27     1.27     1.27     227.95   1.00     1.00   1.00
F10     551.65   8.76     1.76E3   11.70    1.64     1.64     1.64     1.64     274.48   1.00   1.00
F11     14.67    1.00     6.18     6.18     13.99    6.18     6.18     6.18     4.40     2.69   6.16
F12     7.68     1.00     11.10    2.73     10.21    2.73     2.73     2.73     2.73     2.73   4.73
F13     7.72     26.75    15.90    17.76    6.12     10.40    6.12     5.95     2.55     1.00   2.25
F14     537.16   14.28    5.34     1.00     1.00     7.13     1.00     1.00     1.00     1.00   1.00
Best count:   0   4   2   4   5   3   5   6   7   11   9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, or looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown

here are promising for HSBA and indicate that this novelmethod might be able to find a niche among the plethora ofpopulation-based optimization algorithms

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.


Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

HMCR:   0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.00     1.01
F02     747.56   7.87     5.25     3.43     5.09     7.10     3.07     1.87     1.60     1.00     1.01
F03     9.13     3.12E4   1.09     1.07     1.05     1.06     1.05     1.02     1.01     1.01     1.00
F04     2.10E6   1.00E4   3.95E4   9.48E3   5.97E3   2.95E3   1.51E3   1.50E3   46.94    1.00     2.37
F05     3.24E6   3.27E4   2.09E4   4.14E4   7.13E3   1.54E3   8.93E3   1.53E3   629.48   1.00     203.73
F06     1.00     5.94E4   422.04   632.15   6.00E4   1.92     1.91     1.91     1.91     1.91     1.91
F07     10.63    2.07E6   9.00     216.72   324.55   3.07E4   1.04     1.03     1.02     1.01     1.00
F08     59.50    9.88E3   3.04E4   141.80   217.01   324.68   3.07E4   1.07     1.07     1.03     1.00
F09     226.87   4.95E6   3.08     8.91     114.22   173.60   259.41   2.45E4   1.45     1.00     1.71
F10     257.37   2.83E4   9.35E5   1.37E4   97.09    66.56    99.25    149.40   1.39E4   1.00     2.73
F11     6.07     1.00     1.83E4   2.02     16.61    390.40   262.99   403.00   603.61   5.72E4   1.84
F12     3.00     2.79E4   3.60E4   1.00     2.82E4   270.95   194.14   130.77   200.40   300.16   2.84E4
F13     2.32     9.66     5.61E6   1.89E6   1.00     8.15     267.78   191.86   129.23   198.05   296.65
F14     184.66   3.84E5   1.17E4   1.85E3   1.00     5.71E3   25.11    55.39    39.60    26.57    41.49
Best count:   1   1   0   1   2   0   0   0   0   6   3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

PAR:   0        0.1    0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.00     1.00   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02    3.91E3   1.00   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03    1.00     4.25   1.51     2.20     2.20     2.12     2.20     1.99     2.20     1.28     1.80
F04    1.00     2.55   65.54    2.55     2.55     2.55     2.55     2.55     2.55     2.55     2.55
F05    2.52     1.00   1.71     1.69E3   1.71     1.71     1.71     1.71     1.71     1.71     1.71
F06    1.00     59.87  34.18    22.85    46.52    23.44    43.54    44.84    36.88    18.99    8.12
F07    8.07     1.00   1.48     2.55     2.55     13.06    2.55     2.55     2.55     2.55     2.55
F08    5.43     1.00   1.92     1.00     1.00     1.00     11.45    1.00     1.00     1.00     1.00
F09    76.72    2.52   1.17     1.00     1.71     1.71     1.71     155.56   1.71     1.71     1.71
F10    929.10   1.48   1.00     4.91     2.55     2.55     2.55     2.22     2.35E3   2.55     2.55
F11    3.18E3   1.00   5.82E3   3.99E3   3.39E3   5.82E3   4.91E3   5.82E3   5.82E3   9.58E3   5.82E3
F12    305.06   1.00   537.28   97.34    187.58   97.34    97.34    97.34    97.34    97.34    398.27
F13    1.92     4.22   5.87     10.07    12.98    4.66     1.48     4.01     3.17     1.00     4.09
F14    88.12    1.00   1.48     2.55     2.55     4.91     2.55     2.55     2.55     2.55     2.55
Best count:   4   8   3   4   3   3   2   3   3   4   3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HS/BA method for global optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of updating the bats. Using the original configuration of the bat algorithm, we generate new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes for the newly generated bat only

if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HS/BA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HS/BA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HS/BA makes good use of the information in past solutions to generate better quality solutions more frequently, when compared to the other population-based


Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

PAR:   0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02    4.07     1.00     1.00     4.52     2.77     2.57     3.41     5.36     3.69     4.89     1.03
F03    1.00     1.27E4   1.34     1.35     1.35     1.35     1.34     1.35     1.34     1.34     1.34
F04    1.15     81.27    9.39E3   1.00     1.01     1.01     1.29     1.00     1.00     1.01     1.01
F05    345.73   50.61    86.19    9.07E3   25.01    1.08     1.01     1.91     3.33     1.00     1.18
F06    1.00     5.42E6   4.27E4   4.74E4   5.48E6   577.88   577.88   577.88   577.88   577.87   577.87
F07    4.86     1.53     1.00     95.43    105.96   1.22E4   1.34     1.36     1.32     1.33     1.35
F08    12.69    76.00    8.76E3   17.03    69.13    76.72    8.85E3   1.10     1.05     1.04     1.00
F09    63.45    230.47   1.82     1.67     12.78    48.60    53.49    6.09E3   1.11     1.30     1.00
F10    133.55   12.51    1.26     1.97E3   11.48    4.64     16.45    18.93    1.99E3   1.00     1.88
F11    63.02    1.00     6.39E3   79.44    59.24    3.96E3   1.42E3   5.81E3   6.45E3   7.45E5   79.81
F12    3.90     8.81E3   8.90E3   1.00     8.90E3   44.40    47.78    17.25    70.11    77.84    8.99E3
F13    1.00     21.66    2.07E3   6.69     5.80     4.30     271.67   282.46   105.39   429.26   476.62
F14    46.14    1.00     36.10    55.93    1.22     6.40E3   42.15    32.89    34.81    12.72    51.29
Best count:   4   4   3   3   1   1   1   2   2   3   3

optimization algorithms such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that the HS/BA significantly improves on the performances of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HS/BA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only considered unconstrained function optimization in this work. Our future work will also consist of adding diversity rules to HS/BA for constrained optimization problems, such as the constrained real-parameter optimization of the CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HS/BA to solve practical engineering optimization problems, where HS/BA can become a fascinating method for real-world engineering optimization; on the other hand, we will develop new hybrid metaheuristic approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.
[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.
[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.
[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.
[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.
[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.
[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.
[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.
[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.
[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.
[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.
[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.
[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.
[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.
[19] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.
[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.
[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.
[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.
[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.
[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.
[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.
[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.
[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.
[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.
[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.
[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.
[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.
[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.
[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.
[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.
[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.
[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.
[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.
[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.
[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010


where $\varepsilon \in [-1, 1]$ is a random scaling factor and $A^t = \langle A_i^t \rangle$ is the average loudness of all the bats at time step $t$.

The update of the velocities and positions of bats is similar to the procedure in the standard PSO [21], as $f_i$ essentially controls the pace and range of the movement of the swarming particles. To some degree, BA can be considered as a balanced combination of the standard PSO and an intensive local search controlled by the loudness and pulse rate.

Furthermore, the loudness $A_i$ and the rate $r_i$ of pulse emission are updated as the iterations proceed, as shown in (8):

$$A_i^{t+1} = \alpha A_i^t, \qquad r_i^{t+1} = r_i^0 \left[1 - \exp(-\gamma t)\right], \tag{8}$$

where $\alpha$ and $\gamma$ are constants. In essence, $\alpha$ is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set $\alpha = \gamma = 0.9$ in this work.
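As a quick illustration of (8), a minimal Python sketch of the loudness and pulse-rate schedules might look as follows (the variable names are ours, not the paper's):

```python
import math

ALPHA = 0.9   # loudness cooling factor, as set in this work
GAMMA = 0.9   # pulse-rate growth constant, as set in this work

def update_loudness(A_t: float) -> float:
    """Eq. (8), first part: A_i^{t+1} = alpha * A_i^t."""
    return ALPHA * A_t

def update_pulse_rate(r0: float, t: int) -> float:
    """Eq. (8), second part: r_i^{t+1} = r_i^0 * (1 - exp(-gamma * t))."""
    return r0 * (1.0 - math.exp(-GAMMA * t))

# Example: the schedule over the first few time steps.
A, r0 = 0.95, 0.6
for t in range(1, 6):
    A = update_loudness(A)
    print(t, round(A, 4), round(update_pulse_rate(r0, t), 4))
```

The loudness decays geometrically once a bat closes in on its prey, while the emission rate rises toward its initial value $r_i^0$.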

3. Our Approach: HS/BA

Based on the introduction of HS and BA in the previous sections, this section explains how we combine the two approaches to form the proposed hybrid harmony search/bat algorithm (HS/BA), which modifies the solutions with poor fitness in order to add diversity to the population and improve the search efficiency.

In general, the standard BA is adept at exploiting the search space, but at times it may become trapped in a local optimum, so that it cannot perform the global search well; because the search depends completely on random walks, fast convergence cannot be guaranteed. Presented here for the first time, a main improvement, adding the pitch adjustment operation of HS to serve as a mutation operator, is made to BA in order to increase the diversity of the population, avoid entrapment in local optima, and speed up convergence, thus making the approach feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, this hybrid metaheuristic algorithm, obtained by inducing the pitch adjustment operation of HS as a mutation operator into the bat algorithm and therefore called the harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve the original BA by generating a new solution for each bat. In this way, the method can explore new regions of the search space through the mutation of the HS algorithm while exploiting the population information with BA, and it can therefore avoid the entrapment in local optima seen in BA. In the following, we present the HS/BA algorithm, which is an improvement of HS and BA.

The critical operator of HS/BA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. Firstly, poor solutions can absorb many new useful features from good solutions. Secondly, the mutation operator can improve the exploration of new regions of the search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of the BA can both be fully developed.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. The main improvement of adding a mutation operator to BA comprises three minor improvements, made with the aim of speeding up convergence and thus making the method more practical for a wider range of applications, without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency $f$ and loudness $A$ instead of a varying frequency $f_i$ and loudness $A_i^t$. As in BA, each bat in HS/BA is defined by its position $x_i^t$, velocity $v_i^t$, and pulse emission rate $r_i^t$, together with the fixed frequency $f$ and loudness $A$, in a $d$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$v_i^t = v_i^{t-1} + \left(x_i^t - x_*\right) f, \qquad x_i^t = x_i^{t-1} + v_i^t, \tag{9}$$

where $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the $n$ bats. In our experiments, we set $f = 0.5$. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate $r$ to 0.6 and the loudness $A$ to 0.95 produced the best results.

The second improvement is the addition of a mutation operator, intended to increase the diversity of the population and thereby improve the search efficiency and speed up convergence to the optima. For the local search part, once a solution is selected among the current best solutions, a new solution is generated locally for each bat using a random walk by (7) when $\xi > r$, where $\xi \in [0, 1]$ is a random real number drawn from a uniform distribution and $r$ is the pulse rate; when $\xi \le r$, we instead use the pitch adjustment operation of HS, serving as a mutation operator, to update the new solution and increase the diversity of the population, as shown in (3). Through testing on the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results.
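The branch between the BA random walk and the HS pitch adjustment described above can be sketched in Python as follows. This is a minimal illustration with our own names (for instance, `harmony_memory` stands for the current bat population used as the harmony memory, and the bandwidth `bw` is an assumed small constant), not the paper's reference code:

```python
import random

def hsba_mutate(x_best, harmony_memory, bounds,
                r=0.6, A_avg=0.95, HMCR=0.95, PAR=0.1, bw=0.01):
    """One HS/BA-style candidate generation (names are ours, not the paper's).

    With probability 1 - r: BA local random walk around the best solution.
    Otherwise: HS improvisation with memory consideration (rate HMCR)
    and pitch adjustment (rate PAR).
    """
    if random.random() > r:
        # Random walk: x_new = x_* + eps * <A>, with eps uniform in [-1, 1]
        return [min(max(xb + random.uniform(-1.0, 1.0) * A_avg, lb), ub)
                for xb, (lb, ub) in zip(x_best, bounds)]
    new = []
    for j, (lb, ub) in enumerate(bounds):
        if random.random() < HMCR:
            val = random.choice(harmony_memory)[j]         # memory consideration
            if random.random() < PAR:
                val += bw * (2.0 * random.random() - 1.0)  # pitch adjustment
        else:
            val = lb + random.random() * (ub - lb)         # random re-initialization
        new.append(min(max(val, lb), ub))
    return new
```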

The last improvement is the addition of an elitism scheme to HS/BA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population; this prevents the best solutions from being corrupted by the pitch adjustment operator. Specifically, we use an elitism approach to save the bats holding the best solutions found in the HS/BA process, so that even if the pitch adjustment operation ruins a corresponding bat, the saved copy can be restored if needed.

Based on the above analyses, the framework of the harmony search/bat algorithm (HS/BA) is presented in Algorithm 3.

Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population P of NP bats randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f of each bat in P, determined by the objective function f(x).
  Step 3: While the termination criterion is not satisfied (t < MaxGeneration) do
    Sort the population of bats P from best to worst by order of quality f.
    Store the KEEP best bats as KEEPBAT.
    for i = 1 : NP (all bats) do
      v_i^t = v_i^{t-1} + (x_i^t - x_*) * Q
      x_i^t = x_i^{t-1} + v_i^t
      if (rand > r) then
        x_u^t = x_* + alpha * epsilon^t
      end if
      for j = 1 : D (all elements) do  // mutate
        if (rand < HMCR) then
          r_1 = ceil(NP * rand)
          x_v(j) = x_{r_1}(j), where r_1 in {1, 2, ..., HMS}
          if (rand < PAR) then
            x_v(j) = x_v(j) + bw * (2 * rand - 1)
          end if
        else
          x_v(j) = x_{min,j} + rand * (x_{max,j} - x_{min,j})
        end if
      end for j
      Evaluate the fitness of the offspring x_u^t, x_i^t, and x_v^t.
      Select the offspring x_k^t with the best fitness among x_u^t, x_i^t, and x_v^t.
      if (rand < A) then
        x_{r_1}^t = x_k^t
      end if
      Replace the KEEP worst bats with the KEEP best bats stored in KEEPBAT.
    end for i
    t = t + 1
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HS/BA.
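To make Algorithm 3 concrete, the following self-contained Python sketch implements the main loop under the parameter values used in this paper (f = 0.5, r = 0.6, A = 0.95, HMCR = 0.95, PAR = 0.1, NP = 50, KEEP = 2). All function and variable names are our own, and details such as boundary clipping and the loudness-gated greedy acceptance are assumptions rather than part of the published pseudocode:

```python
import random

def clip(v, lb, ub):
    return min(max(v, lb), ub)

def improvise(memory, bounds, HMCR, PAR, bw):
    """HS improvisation: memory consideration plus pitch adjustment."""
    new = []
    for j, (lb, ub) in enumerate(bounds):
        if random.random() < HMCR:
            v = random.choice(memory)[j]
            if random.random() < PAR:
                v += bw * (2.0 * random.random() - 1.0)
        else:
            v = lb + random.random() * (ub - lb)
        new.append(clip(v, lb, ub))
    return new

def hsba_minimize(obj, bounds, NP=50, KEEP=2, max_gen=50,
                  f=0.5, r=0.6, A=0.95, HMCR=0.95, PAR=0.1, bw=0.01):
    """Sketch of the HS/BA main loop of Algorithm 3 (boundary clipping assumed)."""
    D = len(bounds)
    pop = [[lb + random.random() * (ub - lb) for lb, ub in bounds] for _ in range(NP)]
    vel = [[0.0] * D for _ in range(NP)]
    fit = [obj(x) for x in pop]

    for _ in range(max_gen):
        order = sorted(range(NP), key=lambda i: fit[i])
        elites = [pop[i][:] for i in order[:KEEP]]         # save the KEEP best bats
        elite_fit = [fit[i] for i in order[:KEEP]]
        best = pop[order[0]][:]

        for i in range(NP):
            # Velocity/position update with fixed frequency f, Eq. (9)
            vel[i] = [v + (x - xb) * f for v, x, xb in zip(vel[i], pop[i], best)]
            x_i = [clip(x + v, lb, ub)
                   for x, v, (lb, ub) in zip(pop[i], vel[i], bounds)]
            if random.random() > r:                        # local random walk
                x_m = [clip(xb + random.uniform(-1.0, 1.0) * A, lb, ub)
                       for xb, (lb, ub) in zip(best, bounds)]
            else:                                          # HS pitch-adjust mutation
                x_m = improvise(pop, bounds, HMCR, PAR, bw)
            x_k = min((x_i, x_m), key=obj)                 # best offspring
            if random.random() < A and obj(x_k) < fit[i]:  # loudness-gated acceptance
                pop[i], fit[i] = x_k, obj(x_k)

        order = sorted(range(NP), key=lambda i: fit[i])
        for e, i in enumerate(order[-KEEP:]):              # elitism: restore best bats
            pop[i], fit[i] = elites[e][:], elite_fit[e]

    best_i = min(range(NP), key=lambda i: fit[i])
    return pop[best_i], fit[best_i]

# Example: the F13 Sphere benchmark in D = 20 dimensions.
if __name__ == "__main__":
    sphere = lambda x: sum(xi * xi for xi in x)
    x_star, f_star = hsba_minimize(sphere, [(-5.12, 5.12)] * 20)
    print(f_star)
```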

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HS/BA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the experiments that follow.

4.1. General Performance of HS/BA. In order to explore the benefits of HS/BA, in this subsection we compare its performance on global numerical optimization problems with nine other population-based optimization methods: ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems which is based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a powerful and efficient evolutionary algorithm, developed for global optimization and inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but effective optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolutionary strategy) [40] is an algorithm that generally assigns equal importance to mutation and recombination and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm intelligence algorithm, based on the swarming behavior of fish and bird schools in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness $A = 0.95$, pulse rate $r = 0.6$, scaling factor $\varepsilon = 0.1$, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value $\tau_0 = 10^{-6}$, pheromone update constant $Q = 20$, exploration constant $q_0 = 1$, global pheromone decay rate $\rho_g = 0.9$, local pheromone decay rate $\rho_l = 0.5$, pheromone sensitivity $s = 1$, and visibility sensitivity $\beta = 5$. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: weighting factor $F = 0.5$ and crossover constant CR = 0.5. For ES: the number of offspring produced in each generation $\lambda = 10$ and standard deviation $\sigma = 1$ for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO: inertial constant = 0.3, cognitive constant = 1, and social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions, built from standard mathematical functions, can be applied as objective functions to perform such tests, and their properties follow directly from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e., $D = 20$).

The benchmark functions described in Table 1 are standard testing functions, and their properties are given in Table 2. The modality property gives the number of best solutions in the search space: unimodal benchmark functions have only one optimum, the global optimum, while multimodal benchmark functions have at least two optima in their search space, that is, one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size $NP = 50$, an elitism parameter $Keep = 2$, and the maximum number of generations $Maxgen = 50$ for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to get representative performances. Tables 3 and 4 illustrate the results of the simulations: Table 3 shows the minima found by each algorithm averaged over the 100 Monte Carlo runs, while Table 4 shows the absolute best minima found by each algorithm over the 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, and Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the two tables are based on different scales, so values cannot be compared between the two tables.
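For reproducibility, the row-wise normalization stated in the table footnotes can be expressed as a small Python sketch (our own formulation of the stated rule; it assumes strictly positive minima):

```python
def normalize_row(values):
    """Scale a row of raw minima so that the best (smallest) entry becomes 1.00.

    Assumes all values are strictly positive, as for the benchmarks used here.
    """
    best = min(values)
    return [v / best for v in values]

# Example: raw average minima for one benchmark across three algorithms.
print(normalize_row([0.02, 0.05, 0.004]))  # -> [5.0, 12.5, 1.0]
```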

From Table 3, we see that, on average, HS/BA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06–F11, and F13-F14). ACO is the second most effective, performing the best on the benchmarks F04 and F05, while BBO and SGA are the third most effective, performing the best on the benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HS/BA performs the best on thirteen of the fourteen benchmarks (F02–F14), while BBO performs the best at finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant ($P = 1.66 \times 10^{-17}$ and $P = 7.25 \times 10^{-17}$, resp.) at the 5% significance level.
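A Friedman test of this kind can be reproduced with SciPy's `friedmanchisquare`, treating each benchmark as a block and each algorithm as a treatment. The numbers below are placeholders for illustration only, not the paper's data:

```python
from scipy.stats import friedmanchisquare

# Each list holds one algorithm's normalized minima across a set of benchmarks
# (placeholder values for illustration only; the real study uses 10 algorithms
# over all 14 benchmarks).
aco  = [2.31, 24.58, 3.16, 8.09, 42.25, 3.17, 1.75]
bbo  = [1.15, 1.58, 1.93, 1.28, 2.29, 1.99, 1.38]
hsba = [1.09, 1.00, 1.00, 1.00, 1.00, 1.00, 1.00]

stat, p_value = friedmanchisquare(aco, bbo, hsba)
print(f"Friedman chi-square = {stat:.3f}, P = {p_value:.3g}")
```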

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of each optimization method as applied to the 14 benchmarks discussed in this section; the results are shown in the last row of Table 3. BA was the quickest optimization method, and HS/BA was the third fastest of the ten algorithms. However, it should be noted that, in the vast majority of real-world applications, it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in Figures 1–14, which illustrate the optimization process. The values shown in Figures 1–14 are the mean objective function optima achieved over the 100 Monte Carlo simulations; they are the true objective function values, not normalized.

Figure 1 shows the results obtained by the ten methods on the F01 Ackley function. From Figure 1, we can clearly conclude that BBO is significantly superior to all the other algorithms, including HS/BA, during the process of optimization, while HS/BA performs the second best on this multimodal benchmark function. All the algorithms start from almost the same point, and HS/BA converges faster than the others until it is overtaken by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to that of HS/BA. Clearly, BBO, HS/BA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO throughout the search for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, HS/BA clearly outperforms all the other methods during the whole course of optimization (except in generations 20–33, where BBO and HS/BA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HS/BA performs the best on this multimodal benchmark problem.

Table 1: Benchmark functions.

No. | Name | Definition | Source
F01 | Ackley | $f(x) = 20 + e - 20 e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n} x_i^2}} - e^{(1/n)\sum_{i=1}^{n} \cos(2\pi x_i)}$ | [3]
F02 | Fletcher-Powell | $f(x) = \sum_{i=1}^{n} (A_i - B_i)^2$, with $A_i = \sum_{j=1}^{n} (a_{ij}\sin\alpha_j + b_{ij}\cos\alpha_j)$ and $B_i = \sum_{j=1}^{n} (a_{ij}\sin x_j + b_{ij}\cos x_j)$ | [4]
F03 | Griewank | $f(x) = \sum_{i=1}^{n} x_i^2/4000 - \prod_{i=1}^{n} \cos(x_i/\sqrt{i}) + 1$ | [5]
F04 | Penalty 1 | $f(x) = (\pi/30)\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 [1 + 10\sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, with $y_i = 1 + 0.25(x_i + 1)$ | [6]
F05 | Penalty 2 | $f(x) = 0.1\{\sin^2(3\pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 [1 + \sin^2(3\pi x_{i+1})] + (x_n - 1)^2 [1 + \sin^2(2\pi x_n)]\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | [6]
F06 | Quartic with noise | $f(x) = \sum_{i=1}^{n} (i \cdot x_i^4 + U(0, 1))$ | [7]
F07 | Rastrigin | $f(x) = 10n + \sum_{i=1}^{n} (x_i^2 - 10\cos(2\pi x_i))$ | [8]
F08 | Rosenbrock | $f(x) = \sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | [7]
F09 | Schwefel 2.26 | $f(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2})$ | [9]
F10 | Schwefel 1.2 | $f(x) = \sum_{i=1}^{n} (\sum_{j=1}^{i} x_j)^2$ | [10]
F11 | Schwefel 2.22 | $f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | [10]
F12 | Schwefel 2.21 | $f(x) = \max_i |x_i|, \; 1 \le i \le n$ | [6]
F13 | Sphere | $f(x) = \sum_{i=1}^{n} x_i^2$ | [7]
F14 | Step | $f(x) = 6n + \sum_{i=1}^{n} \lfloor x_i \rfloor$ | [7]

*In benchmark function F02, the matrix elements $a_{n \times n}, b_{n \times n} \in (-100, 100)$ and $\alpha_{n \times 1} \in (-\pi, \pi)$ are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function $u(x_i, a, k, m)$ is given by $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; and $k(-x_i - a)^m$ if $x_i < -a$.
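To show how the definitions in Table 1 translate into code, here is a minimal Python sketch of four of the benchmarks (our own implementations of the stated formulas, with the problem dimension given by the length of the input vector):

```python
import math
import random

def ackley(x):  # F01
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return 20 + math.e - 20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2)

def griewank(x):  # F03
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return s - p + 1

def quartic_with_noise(x):  # F06
    return sum(i * xi**4 + random.random() for i, xi in enumerate(x, start=1))

def rastrigin(x):  # F07
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

# Quick check at the known optimum x = 0 (the noise-free functions return 0).
zeros = [0.0] * 20
print(ackley(zeros), griewank(zeros), rastrigin(zeros))
```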

Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes the optimum point.

No. | Function | lb | ub | opt | Continuity | Modality
F01 | Ackley | −32.768 | 32.768 | 0 | Continuous | Multimodal
F02 | Fletcher-Powell | −π | π | 0 | Continuous | Multimodal
F03 | Griewank | −600 | 600 | 0 | Continuous | Multimodal
F04 | Penalty 1 | −50 | 50 | 0 | Continuous | Multimodal
F05 | Penalty 2 | −50 | 50 | 0 | Continuous | Multimodal
F06 | Quartic with noise | −1.28 | 1.28 | 1 | Continuous | Multimodal
F07 | Rastrigin | −5.12 | 5.12 | 0 | Continuous | Multimodal
F08 | Rosenbrock | −2.048 | 2.048 | 0 | Continuous | Unimodal
F09 | Schwefel 2.26 | −512 | 512 | 0 | Continuous | Multimodal
F10 | Schwefel 1.2 | −100 | 100 | 0 | Continuous | Unimodal
F11 | Schwefel 2.22 | −10 | 10 | 0 | Continuous | Unimodal
F12 | Schwefel 2.21 | −100 | 100 | 0 | Continuous | Unimodal
F13 | Sphere | −5.12 | 5.12 | 0 | Continuous | Unimodal
F14 | Step | −5.12 | 5.12 | 0 | Discontinuous | Unimodal

Table 3: Mean normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

 | ACO | BA | BBO | DE | ES | GA | HS | HS/BA | PSO | SGA
F01 | 2.31 | 3.33 | 1.15 | 2.02 | 3.38 | 2.72 | 3.47 | 1.09 | 2.66 | 1.00
F02 | 24.58 | 25.82 | 1.58 | 8.94 | 24.35 | 5.45 | 15.69 | 1.00 | 13.96 | 1.33
F03 | 3.16 | 60.72 | 1.93 | 5.44 | 23.85 | 3.22 | 77.22 | 1.00 | 25.77 | 1.42
F04 | 1.00 | 3.0E38 | 4.0E32 | 5.6E33 | 2.7E38 | 3.1E32 | 1.4E39 | 2.3E32 | 4.1E36 | 9.6E31
F05 | 1.00 | 1.1E8 | 299.42 | 1.5E6 | 4.6E8 | 5.4E3 | 4.1E8 | 215.51 | 5.5E7 | 111.10
F06 | 489.01 | 6.8E3 | 35.32 | 308.29 | 1.8E4 | 274.83 | 1.5E4 | 1.00 | 2.5E3 | 10.09
F07 | 8.09 | 11.55 | 1.28 | 6.56 | 11.87 | 6.17 | 10.22 | 1.00 | 8.44 | 1.80
F08 | 42.25 | 29.01 | 2.29 | 7.59 | 59.99 | 9.05 | 47.85 | 1.00 | 12.04 | 2.15
F09 | 3.17 | 20.26 | 1.99 | 13.58 | 13.33 | 1.81 | 19.92 | 1.00 | 17.61 | 1.15
F10 | 1.75 | 3.73 | 1.38 | 2.95 | 4.93 | 1.25 | 4.22 | 1.00 | 2.48 | 1.48
F11 | 1.05 | 19.70 | 1.83 | 7.14 | 23.12 | 11.13 | 19.45 | 1.00 | 13.22 | 2.46
F12 | 1.86 | 4.03 | 1.00 | 2.99 | 3.91 | 1.92 | 3.74 | 1.38 | 2.38 | 1.12
F13 | 98.30 | 150.84 | 3.80 | 19.03 | 226.52 | 47.74 | 182.32 | 1.00 | 72.91 | 4.02
F14 | 7.73 | 120.48 | 3.93 | 13.31 | 102.56 | 11.53 | 146.55 | 1.00 | 63.44 | 3.28
Time | 2.74 | 1.00 | 1.32 | 1.64 | 1.67 | 1.79 | 2.33 | 1.43 | 2.03 | 1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm but the average minima found by each algorithm.

Looking at Figure 3 carefully, HS/BA initially shows a faster convergence rate than ACO but is overtaken by ACO after 5 generations. Among the other algorithms, BBO, GA, and SGA, although slower, eventually find global minima close to that of HS/BA, while BA, DE, ES, HS, and PSO fail to reach the global minimum within the allotted iterations.

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4, it is obvious that HS/BA outperforms all the other methods during the whole course of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved on the F05 Penalty 2 function. On this multimodal function, as with the F04 Penalty 1 function shown in Figure 4, HS/BA is significantly superior to all the other algorithms during the process of optimization. Here, PSO initially shows a faster convergence rate than HS/BA; however, it is overtaken by HS/BA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HS/BA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HS/BA show the same fast initial convergence towards the known minimum; as the procedure proceeds, HS/BA gets closer and closer to the minimum, while PSO converges prematurely and becomes trapped in a local minimum.

Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm over 100 Monte Carlo simulations.

 | ACO | BA | BBO | DE | ES | GA | HS | HS/BA | PSO | SGA
F01 | 1.85 | 2.31 | 1.00 | 1.43 | 2.25 | 2.03 | 2.30 | 1.05 | 1.91 | 1.04
F02 | 10.42 | 14.48 | 1.09 | 4.22 | 10.39 | 4.03 | 9.91 | 1.00 | 8.28 | 1.33
F03 | 2.73 | 46.22 | 1.87 | 4.47 | 21.38 | 8.02 | 43.24 | 1.00 | 16.75 | 1.76
F04 | 4.4E6 | 5.0E6 | 1.2E3 | 1.9E4 | 2.2E6 | 3.0E4 | 4.2E6 | 1.00 | 37.55 | 3.22
F05 | 3.0E4 | 3.2E4 | 51.01 | 386.33 | 1.6E4 | 884.46 | 2.6E4 | 1.00 | 41.13 | 4.57
F06 | 58.51 | 992.55 | 6.50 | 24.69 | 808.48 | 59.24 | 746.91 | 1.00 | 189.71 | 2.02
F07 | 5.71 | 8.53 | 1.25 | 5.13 | 7.94 | 5.23 | 7.47 | 1.00 | 6.00 | 1.68
F08 | 26.42 | 25.03 | 1.48 | 3.70 | 33.52 | 6.16 | 21.50 | 1.00 | 7.75 | 1.45
F09 | 2.43 | 8.66 | 1.28 | 4.90 | 5.93 | 2.09 | 7.28 | 1.00 | 7.40 | 1.42
F10 | 1.89 | 4.94 | 1.18 | 2.66 | 3.00 | 2.08 | 2.76 | 1.00 | 2.01 | 1.74
F11 | 7.74 | 12.61 | 1.21 | 3.36 | 12.14 | 6.01 | 9.60 | 1.00 | 7.81 | 1.62
F12 | 1.33 | 2.33 | 1.44 | 1.69 | 2.08 | 1.74 | 2.11 | 1.00 | 1.70 | 1.21
F13 | 28.04 | 56.57 | 2.17 | 5.54 | 60.57 | 19.71 | 52.86 | 1.00 | 20.98 | 2.28
F14 | 4.28 | 54.02 | 1.97 | 4.85 | 33.61 | 10.60 | 49.11 | 1.00 | 19.72 | 1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

BA, ES, GA, HS, and PSO fail to reach the optimum of this benchmark function within the maximum number of generations; in the end, BBO, DE, and SGA converge to values very close to that of HS/BA.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches. Looking carefully at Figure 7, the algorithms can be divided roughly into three groups.

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

The first group, comprising BBO, HS/BA, and SGA, performs the best; the second, comprising ACO, DE, GA, and PSO, performs the second best; and the third, comprising BA, ES, and HS, performs the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HS/BA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. As with the F06 Quartic (with noise) function shown in Figure 6, PSO initially shows a faster convergence rate than HS/BA; however, it is overtaken by HS/BA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, although HS/BA is overtaken by BBO between generations 10 and 30, it very clearly maintains a stable convergence rate toward the global minimum

Figure 3: Comparison of the performance of the different methods for the F03 Griewank function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

and significantly outperforms all the other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HS/BA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO initially shows a faster convergence rate than HS/BA; however, it is overtaken by HS/BA after 3 generations. Looking carefully at Figure 10, as with the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can again be approximately

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

divided into three groups: BBO and HS/BA performing the best; ACO, GA, PSO, and SGA performing the second best; and DE, ES, and HS performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO initially shows a faster convergence rate than HS/BA; however, it is overtaken by HS/BA after 40 generations. In the end, HS/BA reaches an optimal solution significantly superior to those of the other algorithms. BBO is inferior only

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

to HS/BA, performing the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HS/BA has the fastest convergence rate at finding the global minimum and outperforms all the other approaches. Looking carefully at Figure 13, BBO converges faster than SGA, but SGA finally converges to the same value as BBO, approaching that of HS/BA.

Figure 14 shows the results for the F14 Step function. Apparently, HS/BA shows the fastest convergence rate at finding the global minimum and significantly outperforms all the other approaches. Looking carefully at Figure 14, as for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

BBO again converges faster than SGA, but SGA does finally converge to the value reached by BBO.

From the above analyses of Figures 1–14, we can conclude that our proposed hybrid metaheuristic algorithm HS/BA significantly outperforms the other nine algorithms. In general, only ACO, BBO, and SGA come close to HS/BA, outperforming it on the benchmarks F04-F05, the benchmark F12, and the benchmark F01, respectively. Furthermore, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value.)

Figure 14: Comparison of the performance of the different methods for the F14 Step function. (Convergence plot omitted; x-axis: number of generations, y-axis: benchmark function value, log scale.)

initial convergence rate, but it later converges more and more slowly toward the true objective function value. Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem, and the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HS/BA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.

Table 5: Best normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.66 | 1.33 | 1.55 | 1.40 | 1.00 | 1.33 | 1.21 | 1.22 | 1.16 | 1.13 | 1.14
F02 | 1.25E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.41 | 11.15 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F04 | 2.73E4 | 1.00 | 1.25E4 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.00 | 1.00
F05 | 5.03E5 | 5.30 | 1.33 | 1.68E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 1.00 | 2.56 | 5.47 | 4.73 | 17.81 | 9.90 | 2.89 | 6.74 | 9.90 | 2.60 | 5.57
F07 | 10.14 | 14.92 | 4.35 | 1.10 | 1.10 | 15.37 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 38.49 | 1.08 | 14.06 | 1.08 | 1.08 | 1.08 | 99.01 | 1.01 | 1.00 | 1.00 | 1.00
F09 | 285.18 | 404.58 | 1.38 | 4.74 | 1.19 | 1.19 | 1.19 | 367.91 | 1.01 | 1.00 | 1.00
F10 | 665.22 | 4.68 | 1.69E3 | 15.27 | 1.18 | 1.18 | 1.18 | 1.18 | 1.21 | 1.00 | 1.00
F11 | 20.03 | 1.00 | 9.96 | 11.54 | 39.22 | 9.96 | 9.96 | 9.96 | 9.96 | 38.97 | 8.49
F12 | 10.32 | 1.00 | 13.22 | 3.23 | 14.69 | 2.79 | 2.79 | 2.79 | 2.79 | 2.79 | 19.31
F13 | 1.46 | 4.15 | 2.84 | 4.47 | 1.15 | 1.56 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F14 | 527.52 | 13.57 | 3.95 | 1.01 | 1.15 | 12.91 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Total | 1 | 4 | 2 | 2 | 3 | 3 | 5 | 6 | 7 | 10 | 10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if the parameters in BA are adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of control parameters is of vital importance for different problems. To investigate the influence of the loudness $A$, the pulse rate $r$, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HS/BA, we carried out experiments on the benchmark problems with different parameter values, keeping all other parameter settings unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5–12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HS/BA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HS/BA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and the average performance of the HS/BA algorithm, respectively. In each table, the last row is the total number of functions on which HS/BA performs the best with the given parameter setting.

4.2.1. Loudness: $A$. Tables 5 and 6 record the results of experiments on the benchmark problems with loudness $A = 0, 0.1, 0.2, \ldots, 0.9, 1.0$ and fixed pulse rate $r = 0.6$, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can be seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs only slightly differently across settings, that is to say, these three benchmark functions are insensitive to the parameter $A$; (ii) for the benchmark functions F01, F06, F11, and F12, HS/BA performs better with smaller $A$ ($< 0.5$); and (iii) for the functions F05, F07–F10, F13, and F14, HS/BA performs better with bigger $A$ ($> 0.5$). In sum, HS/BA performs the best when $A$ is equal or very close to 1.0, so we set $A = 0.95$, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate: $r$. Tables 7 and 8 record the results of experiments on the benchmark problems with pulse rate $r = 0, 0.1, 0.2, \ldots, 0.9, 1.0$ and fixed loudness $A = 0.95$, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can conclude that HS/BA performs significantly better with bigger $r$ ($> 0.5$) than with smaller $r$ ($< 0.5$) and that HS/BA performs the best when $r$ is equal or very close to 0.6. So we set $r = 0.6$ in the other experiments.

4.2.3. Harmony Memory Consideration Rate: HMCR. Tables 9 and 10 record the results of experiments on the benchmark problems with harmony memory consideration rate HMCR = 0, 0.1, 0.2, \ldots, 0.9, 1.0 and fixed loudness $A = 0.95$, pulse rate $r = 0.6$, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can see that the function values evaluated by HS/BA are better (smaller) with bigger HMCR ($> 0.5$) than with smaller HMCR ($< 0.5$), and most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate: PAR. Tables 11 and 12 record the results of experiments on the benchmark problems with pitch adjustment rate PAR = 0, 0.1, 0.2, \ldots, 0.9, 1.0 and fixed loudness $A = 0.95$, pulse rate $r = 0.6$, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can see that the function values for HS/BA vary little as PAR increases, and HS/BA reaches the optimum minimum on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.
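A one-parameter-at-a-time study of this kind can be scripted as follows. This sketch reuses the hypothetical hsba_minimize from the sketch after Algorithm 3 (assumed to be in scope) and reduces the Monte Carlo count for brevity:

```python
# Sweep one HS/BA control parameter at a time, as in Section 4.2, holding the
# others at their defaults (A = 0.95, r = 0.6, HMCR = 0.95, PAR = 0.1).
# Assumes hsba_minimize from the sketch after Algorithm 3 is in scope.
def sweep_parameter(obj, bounds, name, runs=10):
    grid = [round(0.1 * k, 1) for k in range(11)]   # 0, 0.1, ..., 1.0
    results = {}
    for value in grid:
        minima = [hsba_minimize(obj, bounds, **{name: value})[1]
                  for _ in range(runs)]
        results[value] = (min(minima), sum(minima) / runs)   # (best, mean)
    return results

# Example: influence of the pulse rate r on the F13 Sphere function.
sphere = lambda x: sum(xi * xi for xi in x)
for v, (best, mean) in sweep_parameter(sphere, [(-5.12, 5.12)] * 20, "r").items():
    print(f"r = {v}: best = {best:.4g}, mean = {mean:.4g}")
```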

4.3. Discussion. In HS/BA, the bats fly through the search space using echolocation to find food/prey (i.e., the best solutions). The four control parameters are the loudness ($A$), the rate of pulse emission ($r$), the harmony memory consideration rate (HMCR),

Table 6: Mean normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

A | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 1.52 | 1.00 | 1.46 | 1.17 | 1.08 | 1.37 | 1.88 | 1.65 | 1.67 | 1.79 | 1.48
F03 | 6.69 | 10.41 | 1.06 | 1.01 | 1.03 | 1.03 | 1.01 | 1.02 | 1.05 | 1.00 | 1.00
F04 | 201.31 | 1.26 | 4.10 | 4.21 | 3.33 | 2.72 | 1.00 | 5.19 | 2.91 | 1.00 | 3.11
F05 | 591.22 | 16.82 | 4.25 | 5.87 | 1.01 | 4.14 | 24.04 | 1.00 | 3.68 | 7.33 | 9.00
F06 | 1.00 | 4.90E4 | 637.99 | 258.71 | 4.95E4 | 2.18 | 2.17 | 2.17 | 2.18 | 2.17 | 2.17
F07 | 8.57 | 2.21E6 | 6.43 | 272.34 | 110.48 | 2.11E4 | 1.02 | 1.01 | 1.00 | 1.00 | 1.00
F08 | 49.80 | 1.14E4 | 1.72E4 | 161.60 | 224.59 | 91.19 | 1.74E4 | 1.03 | 1.11 | 1.00 | 1.00
F09 | 82.37 | 1.73E6 | 1.06 | 3.14 | 74.45 | 103.10 | 42.26 | 7.94E3 | 1.00 | 1.37 | 1.29
F10 | 90.31 | 1.45E3 | 2.45E5 | 2.32E3 | 23.34 | 22.34 | 31.38 | 12.89 | 2.34E3 | 1.40 | 1.00
F11 | 4.98 | 1.00 | 3.15E4 | 2.33 | 14.41 | 520.23 | 443.65 | 616.26 | 249.91 | 4.78E4 | 2.14
F12 | 3.69 | 2.10E4 | 4.57E4 | 1.00 | 2.12E4 | 266.35 | 233.13 | 198.80 | 276.14 | 112.01 | 2.14E4
F13 | 1.90 | 8.25 | 4.48E6 | 2.14E6 | 1.00 | 6.15 | 254.51 | 222.75 | 189.96 | 263.86 | 107.01
F14 | 66.91 | 1.95E5 | 1.17E4 | 1.24E3 | 21.04 | 1.86E3 | 29.40 | 23.92 | 1.00 | 18.35 | 24.75
Total | 1 | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 4 | 5 | 5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.84 | 1.84 | 4.84 | 1.59 | 1.71 | 1.00 | 1.34 | 1.09 | 1.16 | 1.01
F02 | 9.44E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 3.73 | 5.46 | 1.86 | 1.86 | 1.86 | 1.86 | 1.00 | 1.72 | 1.21 | 1.61 | 1.01
F04 | 11.42 | 1.11 | 24.29 | 1.23 | 1.26 | 1.00 | 1.29 | 1.29 | 1.29 | 1.29 | 1.29
F05 | 1.77E5 | 2.30 | 1.15 | 1.09E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 62.44 | 29.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 8.31 | 4.48 | 2.58 | 1.29 | 1.29 | 3.49 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 9.61 | 1.18 | 4.20 | 1.37 | 1.37 | 1.37 | 6.95 | 1.01 | 1.01 | 1.00 | 1.00
F09 | 373.94 | 444.22 | 1.68 | 3.35 | 1.68 | 1.68 | 1.00 | 291.98 | 1.01 | 1.01 | 1.01
F10 | 386.93 | 3.16 | 13.97 | 4.63 | 1.58 | 1.58 | 1.00 | 1.58 | 309.31 | 1.58 | 1.01
F11 | 42.55 | 1.00 | 18.29 | 21.35 | 20.18 | 11.11 | 19.04 | 21.35 | 15.81 | 18.76 | 11.55
F12 | 114.40 | 1.00 | 141.84 | 44.48 | 125.47 | 44.48 | 44.48 | 44.48 | 44.48 | 44.48 | 69.43
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.00 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 120.52 | 4.04 | 2.33 | 1.00 | 1.16 | 3.40 | 1.00 | 1.00 | 1.16 | 1.16 | 1.16
Total | 0 | 3 | 1 | 2 | 2 | 4 | 8 | 5 | 4 | 4 | 4

and the pitch adjustment rate (PAR). The appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission ($r$) balance the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions that have been considered, HS/BA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with HS/BA performing significantly better on some functions. HS/BA performs excellently and efficiently because of its ability to search locally and globally at the same time: the local search is carried out via the harmony search algorithm and the global search via the bat algorithm, concurrently. A similar behavior may be obtained in PSO by using multiple swarms derived from an initial particle population [44]. However, HS/BA's advantages include its simple and easy implementation and having only four parameters to regulate. The work carried out here demonstrates HS/BA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning

Table 8: Mean normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

r | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.84 | 1.99 | 1.86 | 1.94 | 1.59 | 1.00 | 1.42 | 1.34 | 1.09 | 1.16 | 1.71
F02 | 3.31 | 5.38 | 4.57 | 3.08 | 3.82 | 1.14 | 1.00 | 2.89 | 2.79 | 4.02 | 3.11
F03 | 3.73 | 5.46 | 5.33 | 2.29 | 3.36 | 2.41 | 1.00 | 1.72 | 1.21 | 1.61 | 1.36
F04 | 11.42 | 1.11 | 24.29 | 1.49E4 | 1.13E3 | 3.94E3 | 2.78E4 | 19.66 | 175.22 | 1.14 | 1.00
F05 | 16.09 | 128.62 | 32.69 | 24.46 | 64.40 | 12.77 | 29.59 | 76.17 | 4.56 | 1.00 | 4.46
F06 | 62.44 | 26.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | 1.00 | 35.20 | 2.39 | 1.55
F07 | 3.30 | 1.78 | 1.34 | 1.03 | 1.23 | 1.39 | 1.55 | 1.00 | 1.19 | 1.36 | 3.49
F08 | 4.88 | 3.93 | 3.94 | 1.94 | 2.19 | 3.83 | 3.53 | 4.33 | 1.00 | 2.27 | 3.14
F09 | 1.39 | 1.66 | 1.14 | 1.00 | 1.21 | 1.15 | 1.15 | 1.09 | 1.08 | 1.04 | 1.09
F10 | 7.21 | 9.94 | 8.75 | 3.60 | 8.46 | 6.11 | 4.83 | 4.14 | 5.76 | 1.00 | 2.71
F11 | 3.82 | 4.23 | 3.73 | 2.05 | 1.81 | 1.00 | 1.71 | 2.20 | 1.42 | 1.68 | 1.89
F12 | 1.64 | 2.17 | 2.04 | 1.47 | 1.80 | 1.86 | 1.00 | 1.21 | 1.13 | 1.34 | 1.56
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.90 | 1.52 | 1.39 | 1.00 | 2.57 | 2.49
F14 | 5.74 | 15.43 | 7.42 | 6.82 | 6.44 | 3.17 | 1.00 | 4.85 | 2.21 | 1.74 | 2.09
Total | 0 | 0 | 0 | 1 | 0 | 2 | 4 | 2 | 2 | 2 | 1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.51 | 1.28 | 1.00 | 1.63
F02 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 19.82 | 26.49 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 2.06 | 1.00
F04 | 1.26E5 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F05 | 6.07E5 | 5.34 | 1.00 | 1.87E4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F06 | 108.92 | 246.31 | 365.04 | 345.40 | 338.57 | 234.65 | 143.49 | 45.28 | 23.30 | 1.00 | 25.75
F07 | 9.22 | 14.28 | 5.34 | 1.00 | 1.00 | 9.30 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F08 | 18.10 | 1.08 | 7.72 | 1.08 | 1.08 | 1.08 | 35.94 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 280.04 | 391.61 | 1.27 | 6.81 | 1.27 | 1.27 | 1.27 | 227.95 | 1.00 | 1.00 | 1.00
F10 | 551.65 | 8.76 | 1.76E3 | 11.70 | 1.64 | 1.64 | 1.64 | 1.64 | 274.48 | 1.00 | 1.00
F11 | 14.67 | 1.00 | 6.18 | 6.18 | 13.99 | 6.18 | 6.18 | 6.18 | 4.40 | 2.69 | 6.16
F12 | 7.68 | 1.00 | 11.10 | 2.73 | 10.21 | 2.73 | 2.73 | 2.73 | 2.73 | 2.73 | 4.73
F13 | 7.72 | 26.75 | 15.90 | 17.76 | 6.12 | 10.40 | 6.12 | 5.95 | 2.55 | 1.00 | 2.25
F14 | 537.16 | 14.28 | 5.34 | 1.00 | 1.00 | 7.13 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
Total | 0 | 4 | 2 | 4 | 5 | 3 | 5 | 6 | 7 | 11 | 9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations; we might arrive at different conclusions if, for example, we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HS/BA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.

Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.00 | 1.01
F02 | 747.56 | 7.87 | 5.25 | 3.43 | 5.09 | 7.10 | 3.07 | 1.87 | 1.60 | 1.00 | 1.01
F03 | 9.13 | 3.12E4 | 1.09 | 1.07 | 1.05 | 1.06 | 1.05 | 1.02 | 1.01 | 1.01 | 1.00
F04 | 2.10E6 | 1.00E4 | 3.95E4 | 9.48E3 | 5.97E3 | 2.95E3 | 1.51E3 | 1.50E3 | 46.94 | 1.00 | 2.37
F05 | 3.24E6 | 3.27E4 | 2.09E4 | 4.14E4 | 7.13E3 | 1.54E3 | 8.93E3 | 1.53E3 | 629.48 | 1.00 | 203.73
F06 | 1.00 | 5.94E4 | 422.04 | 632.15 | 6.00E4 | 1.92 | 1.91 | 1.91 | 1.91 | 1.91 | 1.91
F07 | 10.63 | 2.07E6 | 9.00 | 216.72 | 324.55 | 3.07E4 | 1.04 | 1.03 | 1.02 | 1.01 | 1.00
F08 | 59.50 | 9.88E3 | 3.04E4 | 141.80 | 217.01 | 324.68 | 3.07E4 | 1.07 | 1.07 | 1.03 | 1.00
F09 | 226.87 | 4.95E6 | 3.08 | 8.91 | 114.22 | 173.60 | 259.41 | 2.45E4 | 1.45 | 1.00 | 1.71
F10 | 257.37 | 2.83E4 | 9.35E5 | 1.37E4 | 97.09 | 66.56 | 99.25 | 149.40 | 1.39E4 | 1.00 | 2.73
F11 | 6.07 | 1.00 | 1.83E4 | 2.02 | 16.61 | 390.40 | 262.99 | 403.00 | 603.61 | 5.72E4 | 1.84
F12 | 3.00 | 2.79E4 | 3.60E4 | 1.00 | 2.82E4 | 270.95 | 194.14 | 130.77 | 200.40 | 300.16 | 2.84E4
F13 | 2.32 | 9.66 | 5.61E6 | 1.89E6 | 1.00 | 8.15 | 267.78 | 191.86 | 129.23 | 198.05 | 296.65
F14 | 184.66 | 3.84E5 | 1.17E4 | 1.85E3 | 1.00 | 5.71E3 | 25.11 | 55.39 | 39.60 | 26.57 | 41.49
Total | 1 | 1 | 0 | 1 | 2 | 0 | 0 | 0 | 0 | 6 | 3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 3.91E3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F03 | 1.00 | 4.25 | 1.51 | 2.20 | 2.20 | 2.12 | 2.20 | 1.99 | 2.20 | 1.28 | 1.80
F04 | 1.00 | 2.55 | 65.54 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F05 | 2.52 | 1.00 | 1.71 | 1.69E3 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71
F06 | 1.00 | 59.87 | 34.18 | 22.85 | 46.52 | 23.44 | 43.54 | 44.84 | 36.88 | 18.99 | 8.12
F07 | 8.07 | 1.00 | 1.48 | 2.55 | 2.55 | 13.06 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
F08 | 5.43 | 1.00 | 1.92 | 1.00 | 1.00 | 1.00 | 11.45 | 1.00 | 1.00 | 1.00 | 1.00
F09 | 76.72 | 2.52 | 1.17 | 1.00 | 1.71 | 1.71 | 1.71 | 155.56 | 1.71 | 1.71 | 1.71
F10 | 929.10 | 1.48 | 1.00 | 4.91 | 2.55 | 2.55 | 2.55 | 2.22 | 2.35E3 | 2.55 | 2.55
F11 | 3.18E3 | 1.00 | 5.82E3 | 3.99E3 | 3.39E3 | 5.82E3 | 4.91E3 | 5.82E3 | 5.82E3 | 9.58E3 | 5.82E3
F12 | 305.06 | 1.00 | 537.28 | 97.34 | 187.58 | 97.34 | 97.34 | 97.34 | 97.34 | 97.34 | 398.27
F13 | 1.92 | 4.22 | 5.87 | 10.07 | 12.98 | 4.66 | 1.48 | 4.01 | 3.17 | 1.00 | 4.09
F14 | 88.12 | 1.00 | 1.48 | 2.55 | 2.55 | 4.91 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55
Total | 4 | 8 | 3 | 4 | 3 | 3 | 2 | 3 | 3 | 4 | 3

5. Conclusion and Future Work

This paper has proposed a hybrid metaheuristic HS/BA method for optimization problems. We improved BA by combining it with the original harmony search (HS) algorithm and evaluated HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, in which an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate a new harmony based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only

if it has better fitness This selection scheme is rather greedywhich often overtakes original HS and BA The HSBAattempts to take merits of the BA and the HS in order toavoid all bats getting trapped in inferior local optimal regionsThe HSBA enables the bats to have more diverse exemplarsto learn from as the bats are updated each iteration andalso form new harmonies to search in a larger search spaceThis new method can speed up the global convergence ratewithout losing the strong robustness of the basic BA Fromthe analysis of the experimental results we observe that theproposed HSBA makes good use of the information in pastsolutions more effectively to generate better quality solutionsfrequently when compared to the other population-based

Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0
F01 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00
F02 | 4.07 | 1.00 | 1.00 | 4.52 | 2.77 | 2.57 | 3.41 | 5.36 | 3.69 | 4.89 | 1.03
F03 | 1.00 | 1.27E4 | 1.34 | 1.35 | 1.35 | 1.35 | 1.34 | 1.35 | 1.34 | 1.34 | 1.34
F04 | 1.15 | 81.27 | 9.39E3 | 1.00 | 1.01 | 1.01 | 1.29 | 1.00 | 1.00 | 1.01 | 1.01
F05 | 345.73 | 50.61 | 86.19 | 9.07E3 | 25.01 | 1.08 | 1.01 | 1.91 | 3.33 | 1.00 | 1.18
F06 | 1.00 | 5.42E6 | 4.27E4 | 4.74E4 | 5.48E6 | 577.88 | 577.88 | 577.88 | 577.88 | 577.87 | 577.87
F07 | 4.86 | 1.53 | 1.00 | 95.43 | 105.96 | 1.22E4 | 1.34 | 1.36 | 1.32 | 1.33 | 1.35
F08 | 12.69 | 76.00 | 8.76E3 | 17.03 | 69.13 | 76.72 | 8.85E3 | 1.10 | 1.05 | 1.04 | 1.00
F09 | 63.45 | 230.47 | 1.82 | 1.67 | 12.78 | 48.60 | 53.49 | 6.09E3 | 1.11 | 1.30 | 1.00
F10 | 133.55 | 12.51 | 1.26 | 1.97E3 | 11.48 | 4.64 | 16.45 | 18.93 | 1.99E3 | 1.00 | 1.88
F11 | 63.02 | 1.00 | 6.39E3 | 79.44 | 59.24 | 3.96E3 | 1.42E3 | 5.81E3 | 6.45E3 | 7.45E5 | 79.81
F12 | 3.90 | 8.81E3 | 8.90E3 | 1.00 | 8.90E3 | 44.40 | 47.78 | 17.25 | 70.11 | 77.84 | 8.99E3
F13 | 1.00 | 21.66 | 2.07E3 | 6.69 | 5.80 | 4.30 | 271.67 | 282.46 | 105.39 | 429.26 | 476.62
F14 | 46.14 | 1.00 | 36.10 | 55.93 | 1.22 | 6.40E3 | 42.15 | 32.89 | 34.81 | 12.72 | 51.29
Total | 4 | 4 | 3 | 3 | 1 | 1 | 1 | 2 | 2 | 3 | 3

optimization algorithms such as ACO BA BBO DE ESGA HS PSO and SGA Based on the results of the tenapproaches on the test problems we can conclude that theHSBA significantly improves the performances of the HSand the BA on most multimodal and unimodal problems

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional ($D \ge 20$) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HS/BA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules to HS/BA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HS/BA to practical engineering optimization problems, where it can clearly become a fascinating method for real-world engineering optimization; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.

[35] W Y Zhang S Xu and S J Li ldquoNecessary conditions for weaksharp minima in cone-constrained optimization problemsrdquoAbstract and Applied Analysis vol 2011 Article ID 909520 11pages 2012

[36] Wikipedia ldquoGlobal optimizationrdquo httpenwikipediaorgwikiGlobal optimization

[37] S Kirkpatrick C D Gelatt Jr and M P Vecchi ldquoOptimizationby simulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983

[38] M Dorigo and T Stutzle Ant Colony Optimization The MITPress Cambridge Mass USA 2004

[39] M Dorigo Optimization Learning and Natural AlgorithmsPolitecnico di Milano Italy 1992

[40] H G Beyer The Theory of Evolution Strategies Springer NewYork NY USA 2001

[41] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquo inParallel Problem Solving From Nature pp 683ndash691 1998

[42] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[43] M Friedman ldquoA comparison of alternative tests of significancefor the problem of 119898 rankingsrdquo The Annals of MathematicalStatistics vol 11 no 1 pp 86ndash92 1940

[44] R Brits A P Engelbrecht and F van den Bergh ldquoLocatingmultiple optima using particle swarm optimizationrdquo AppliedMathematics and Computation vol 189 no 2 pp 1859ndash18832007

[45] K Tang X Li P N Suganthan Z Yang and T Weise ldquoBench-mark functions for the CECrsquo2010 special session and competi-tion on large scale global optimizationrdquo Nature Inspired Com-putation and Applications Laboratory USTC Hefei China2010

[46] Q-K Pan P N Suganthan J J Liang and M F Tasgetiren ldquoAlocal-best harmony search algorithm with dynamic subpopula-tionsrdquo Engineering Optimization vol 42 no 2 pp 101ndash117 2010

[47] Q-K Pan P N Suganthan J J Liang and M F TasgetirenldquoA local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop schedulingproblemrdquo Expert Systems with Applications vol 38 no 4 pp3252ndash3259 2011

[48] Q-K Pan P N Suganthan M F Tasgetiren and J J LiangldquoA self-adaptive global best harmony search algorithm forcontinuous optimization problemsrdquo Applied Mathematics andComputation vol 216 no 3 pp 830ndash848 2010

Journal of Applied Mathematics 21

[49] S M Islam S Das S Ghosh S Roy and P N Suganthan ldquoAnadaptive differential evolution algorithm with novel mutationand crossover strategies for global numerical optimizationrdquoIEEE Transactions on Systems Man and Cybernetics B vol 42no 2 pp 482ndash500 2012

[50] J J Liang A K Qin P N Suganthan and S Baskar ldquoCompre-hensive learning particle swarm optimizer for global optimiza-tion of multimodal functionsrdquo IEEE Transactions on Evolution-ary Computation vol 10 no 3 pp 281ndash295 2006

[51] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010


Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population P of NP bats randomly, each bat corresponding to a potential solution to the given problem; define the loudness A; set the frequency Q, the initial velocities v, and the pulse rate r; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the number of elite individuals retained, KEEP.
  Step 2: Evaluate the quality f for each bat in P, determined by the objective function f(x).
  Step 3: While the termination criterion is not satisfied or t < MaxGeneration do
      Sort the population of bats P from best to worst by order of quality f.
      Store the KEEP best bats as KEEPBAT.
      for i = 1 : NP (all bats) do
          v_i^t = v_i^(t-1) + (x_i^(t-1) - x*) × Q
          x_i^t = x_i^(t-1) + v_i^t
          if (rand > r) then
              x_u^t = x* + α ε^t
          end if
          for j = 1 : D (all elements) do   // Mutate
              if (rand < HMCR) then
                  r1 = ⌈NP × rand⌉
                  x_v(j) = x_r1(j), where r1 ∈ (1, 2, ..., HMS)
                  if (rand < PAR) then
                      x_v(j) = x_v(j) + bw × (2 × rand − 1)
                  end if
              else
                  x_v(j) = x_min,j + rand × (x_max,j − x_min,j)
              end if
          end for j
          Evaluate the fitness of the offspring x_u^t, x_i^t, and x_v^t.
          Select the offspring x_k^t with the best fitness among x_u^t, x_i^t, and x_v^t.
          if (rand < A) then
              x_r1^t = x_k^t
          end if
          Replace the KEEP worst bats with the KEEP best bats (KEEPBAT) stored.
      end for i
      t = t + 1
  Step 4: end while
  Step 5: Post-process the results and visualization.
End

Algorithm 3: The hybrid metaheuristic algorithm HSBA.
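To make the procedure in Algorithm 3 concrete, the following minimal Python sketch (our own illustration; the authors' implementation is in MATLAB and is not reproduced here) performs one HSBA generation on a NumPy population. Names such as objective, best, bw, and keep, and the clipping to the search bounds, are our assumptions rather than details taken from the paper.

import numpy as np

def hsba_generation(pop, vel, objective, lo, hi, best,
                    A=0.95, r=0.6, Q=0.5, eps=0.1,
                    HMCR=0.95, PAR=0.1, bw=0.05, keep=2):
    # One HSBA generation (a sketch of Algorithm 3, not the authors' code).
    NP, D = pop.shape
    order = np.argsort([objective(x) for x in pop])   # sort best to worst
    pop, vel = pop[order], vel[order]
    elites = pop[:keep].copy()                        # KEEPBAT
    for i in range(NP):
        # BA flight: move bat i relative to the global best solution
        vel[i] = vel[i] + (pop[i] - best) * Q
        x_i = np.clip(pop[i] + vel[i], lo, hi)
        # Local random walk around the best solution (with probability 1 - r)
        if np.random.rand() > r:
            x_u = np.clip(best + eps * np.random.uniform(-1.0, 1.0, D), lo, hi)
        else:
            x_u = x_i
        # HS-style mutation: memory consideration plus pitch adjustment
        x_v = x_i.copy()
        for j in range(D):
            if np.random.rand() < HMCR:
                x_v[j] = pop[np.random.randint(NP), j]
                if np.random.rand() < PAR:
                    x_v[j] += bw * (2.0 * np.random.rand() - 1.0)
            else:
                x_v[j] = lo + np.random.rand() * (hi - lo)
        x_v = np.clip(x_v, lo, hi)
        # Greedy selection among the three candidates, accepted with loudness A
        x_k = min((x_i, x_u, x_v), key=objective)
        if np.random.rand() < A:
            pop[i] = x_k
    pop[-keep:] = elites                              # elitism: restore KEEP best
    return pop, vel

A full run would simply initialize pop and vel randomly within [lo, hi], track the best solution found so far, and call this function once per generation.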

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HSBA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HSBA. In order to explore the benefits of HSBA, in this subsection we compare its performance on global numerical optimization problems with nine other population-based optimization methods, which


are ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems, which is based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a powerful and efficient evolutionary algorithm developed for global optimization, inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but effective optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolutionary strategy) [40] is an algorithm that generally assigns equal importance to mutation and recombination and allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm intelligence algorithm, based on the swarm behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.
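As a concrete illustration of the DE description above, the sketch below shows the classic DE/rand/1/bin trial-vector construction of Storn and Price [19]. It is our own example, not code from this paper, with F and CR matching the DE settings given below (0.5 each).

import numpy as np

def de_rand_1_bin(pop, i, F=0.5, CR=0.5):
    # DE/rand/1/bin (Storn and Price [19]): perturb one solution with the
    # difference of two others, then binomially cross with the target.
    NP, D = pop.shape
    choices = [k for k in range(NP) if k != i]
    r1, r2, r3 = np.random.choice(choices, 3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])
    mask = np.random.rand(D) < CR
    mask[np.random.randint(D)] = True   # guarantee at least one mutant gene
    return np.where(mask, mutant, pop[i])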

In all experiments, we use the same parameters for HS, BA, and HSBA: loudness A = 0.95, pulse rate r = 0.6, scaling factor ε = 0.1, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value τ0 = 1E−6, pheromone update constant Q = 20, exploration constant q0 = 1, global pheromone decay rate ρg = 0.9, local pheromone decay rate ρl = 0.5, pheromone sensitivity s = 1, and visibility sensitivity β = 5. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor F = 0.5 and a crossover constant CR = 0.5. For the ES: the number of offspring λ = 10 produced in each generation and a standard deviation σ = 1 for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO: an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.
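For reference, the shared HSBA settings above, together with the run settings given below, can be collected in one place; the dictionaries below simply restate values listed in this section (the grouping and names are ours).

HSBA_PARAMS = {
    "loudness_A": 0.95,
    "pulse_rate_r": 0.6,
    "scaling_factor_eps": 0.1,
    "HMCR": 0.95,
    "PAR": 0.1,
}
RUN_SETTINGS = {
    "population_NP": 50,
    "elitism_Keep": 2,
    "max_generations": 50,
    "monte_carlo_runs": 100,
}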

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions based on mathematical functions can be applied as objective functions to perform such tests; the properties of these benchmark functions can be obtained directly from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HSBA. Each of the functions in this study has 20 independent variables (i.e., D = 20).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property gives the number of best solutions in the search space: unimodal benchmark functions have a single optimum, the global optimum, while multimodal benchmark functions have at least two optima in their search space, that is, one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size NP = 50, an elitism parameter Keep = 2, and the maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to obtain representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.

From Table 3, we see that, on average, HSBA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06–F11, F13, and F14). ACO is the second most effective, performing the best on the benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on the benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HSBA performs the best on thirteen of the fourteen benchmarks (F02–F14), while BBO performs the best at finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant (P = 1.66 × 10^−17 and P = 7.25 × 10^−17, resp.) at the 5% significance level.
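Such a Friedman test is straightforward to reproduce. The snippet below (our illustration, using random placeholder scores rather than the paper's measurements) applies scipy.stats.friedmanchisquare to a 14-benchmark-by-10-algorithm score matrix and checks significance at the 5% level.

import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
scores = rng.random((14, 10))   # placeholder: rows = benchmarks, cols = algorithms

stat, p = friedmanchisquare(*scores.T)   # one sample per algorithm
print(f"Friedman chi-square = {stat:.3f}, p = {p:.3g}")
print("significant at the 5% level" if p < 0.05 else "not significant")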

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section; the results are shown in Table 3. BA was the quickest optimization method, and HSBA was the third fastest of the ten algorithms. However, it should be noted that in the vast majority of real-world applications it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA are shown in Figures 1–14, which represent the optimization process. The values shown in Figures 1–14 are the mean objective function optima achieved over 100 Monte Carlo simulations; they are the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods when the F01 Ackley function is applied. From Figure 1, we can clearly draw the conclusion that BBO is significantly superior to all the other algorithms, including HSBA, during the process of optimization, while HSBA performs the second best on this multimodal benchmark function. Here, all the algorithms show almost the same starting point, and HSBA has a faster convergence rate than the other algorithms, although it is overtaken by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to HSBA's. Obviously, BBO, HSBA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HSBA outperforms all other methods during the whole progress of optimization (except during generations 20–33, where BBO and HSBA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HSBA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HSBA shows a faster convergence rate initially than ACO; however, it is outperformed

Table 1: Benchmark functions.

No.   Name                 Definition                                                                 Source
F01   Ackley               $f(x)=20+e-20\,e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n}x_i^2}}-e^{(1/n)\sum_{i=1}^{n}\cos(2\pi x_i)}$   [3]
F02   Fletcher-Powell      $f(x)=\sum_{i=1}^{n}(A_i-B_i)^2$, $A_i=\sum_{j=1}^{n}(a_{ij}\sin\alpha_j+b_{ij}\cos\alpha_j)$, $B_i=\sum_{j=1}^{n}(a_{ij}\sin x_j+b_{ij}\cos x_j)$   [4]
F03   Griewank             $f(x)=\sum_{i=1}^{n}x_i^2/4000-\prod_{i=1}^{n}\cos(x_i/\sqrt{i})+1$        [5]
F04   Penalty #1           $f(x)=(\pi/30)\{10\sin^2(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})]+(y_n-1)^2\}+\sum_{i=1}^{n}u(x_i,10,100,4)$, $y_i=1+0.25(x_i+1)$   [6]
F05   Penalty #2           $f(x)=0.1\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_n-1)^2[1+\sin^2(2\pi x_n)]\}+\sum_{i=1}^{n}u(x_i,5,100,4)$   [6]
F06   Quartic with noise   $f(x)=\sum_{i=1}^{n}(i\,x_i^4+U(0,1))$                                     [7]
F07   Rastrigin            $f(x)=10n+\sum_{i=1}^{n}(x_i^2-10\cos(2\pi x_i))$                          [8]
F08   Rosenbrock           $f(x)=\sum_{i=1}^{n-1}[100(x_{i+1}-x_i^2)^2+(x_i-1)^2]$                    [7]
F09   Schwefel 2.26        $f(x)=418.9829\times D-\sum_{i=1}^{D}x_i\sin(|x_i|^{1/2})$                 [9]
F10   Schwefel 1.2         $f(x)=\sum_{i=1}^{n}(\sum_{j=1}^{i}x_j)^2$                                 [10]
F11   Schwefel 2.22        $f(x)=\sum_{i=1}^{n}|x_i|+\prod_{i=1}^{n}|x_i|$                            [10]
F12   Schwefel 2.21        $f(x)=\max_i|x_i|,\ 1\le i\le n$                                           [6]
F13   Sphere               $f(x)=\sum_{i=1}^{n}x_i^2$                                                 [7]
F14   Step                 $f(x)=6n+\sum_{i=1}^{n}\lfloor x_i\rfloor$                                 [7]

*In benchmark function F02, the matrix elements $a_{n\times n}, b_{n\times n}\in(-100,100)$ and $\alpha_{n\times 1}\in(-\pi,\pi)$ are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function $u(x_i,a,k,m)$ is given by $u(x_i,a,k,m)=k(x_i-a)^m$ if $x_i>a$; $0$ if $-a\le x_i\le a$; $k(-x_i-a)^m$ if $x_i<-a$.


Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No.   Function             lb        ub       opt   Continuity      Modality
F01   Ackley               −32.768   32.768   0     Continuous      Multimodal
F02   Fletcher-Powell      −π        π        0     Continuous      Multimodal
F03   Griewank             −600      600      0     Continuous      Multimodal
F04   Penalty #1           −50       50       0     Continuous      Multimodal
F05   Penalty #2           −50       50       0     Continuous      Multimodal
F06   Quartic with noise   −1.28     1.28     1     Continuous      Multimodal
F07   Rastrigin            −5.12     5.12     0     Continuous      Multimodal
F08   Rosenbrock           −2.048    2.048    0     Continuous      Unimodal
F09   Schwefel 2.26        −512      512      0     Continuous      Multimodal
F10   Schwefel 1.2         −100      100      0     Continuous      Unimodal
F11   Schwefel 2.22        −10       10       0     Continuous      Unimodal
F12   Schwefel 2.21        −100      100      0     Continuous      Unimodal
F13   Sphere               −5.12     5.12     0     Continuous      Unimodal
F14   Step                 −5.12     5.12     0     Discontinuous   Unimodal
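As an example of how the functions in Tables 1 and 2 translate into code, the sketch below implements two of them in Python (our illustration, not part of the paper); both attain their minimum of 0 at the origin.

import numpy as np

def ackley(x):
    # F01 Ackley (Table 1); search domain [-32.768, 32.768]^n
    n = len(x)
    return (20.0 + np.e
            - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n))

def griewank(x):
    # F03 Griewank (Table 1); search domain [-600, 600]^n
    i = np.arange(1, len(x) + 1)
    return np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

x = np.zeros(20)                  # D = 20, as in the experiments
print(ackley(x), griewank(x))     # both print values of about 0.0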

Table 3: Mean normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

       ACO      BA       BBO      DE       ES       GA       HS       HSBA     PSO      SGA
F01    2.31     3.33     1.15     2.02     3.38     2.72     3.47     1.09     2.66     1.00
F02    24.58    25.82    1.58     8.94     24.35    5.45     15.69    1.00     13.96    1.33
F03    3.16     60.72    1.93     5.44     23.85    3.22     77.22    1.00     25.77    1.42
F04    1.00     3.0E38   4.0E32   5.6E33   2.7E38   3.1E32   1.4E39   2.3E32   4.1E36   9.6E31
F05    1.00     1.1E8    299.42   1.5E6    4.6E8    5.4E3    4.1E8    215.51   5.5E7    111.10
F06    489.01   6.8E3    35.32    308.29   1.8E4    274.83   1.5E4    1.00     2.5E3    10.09
F07    8.09     11.55    1.28     6.56     11.87    6.17     10.22    1.00     8.44     1.80
F08    42.25    29.01    2.29     7.59     59.99    9.05     47.85    1.00     12.04    2.15
F09    3.17     20.26    1.99     13.58    13.33    1.81     19.92    1.00     17.61    1.15
F10    1.75     3.73     1.38     2.95     4.93     1.25     4.22     1.00     2.48     1.48
F11    1.05     19.70    1.83     7.14     23.12    11.13    19.45    1.00     13.22    2.46
F12    1.86     4.03     1.00     2.99     3.91     1.92     3.74     1.38     2.38     1.12
F13    98.30    150.84   3.80     19.03    226.52   47.74    182.32   1.00     72.91    4.02
F14    7.73     120.48   3.93     13.31    102.56   11.53    146.55   1.00     63.44    3.28
Time   2.74     1.00     1.32     1.64     1.67     1.79     2.33     1.43     2.03     1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.

by ACO after 5 generations. As for the other algorithms, although slower, BBO, GA, and SGA eventually find global minima close to HSBA's, while BA, DE, ES, HS, and PSO fail to find the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty #1 function. From Figure 4, it is obvious that HSBA outperforms all other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty #2 function. For this multimodal function, similar to the F04 Penalty #1 function shown in Figure 4, HSBA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 3 generations.

Figure 6 shows the results achieved for the ten methods when using the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HSBA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HSBA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HSBA gets closer and closer to the minimum, while PSO converges prematurely and gets trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage


Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

       ACO      BA       BBO      DE       ES       GA       HS       HSBA     PSO      SGA
F01    1.85     2.31     1.00     1.43     2.25     2.03     2.30     1.05     1.91     1.04
F02    10.42    14.48    1.09     4.22     10.39    4.03     9.91     1.00     8.28     1.33
F03    2.73     46.22    1.87     4.47     21.38    8.02     43.24    1.00     16.75    1.76
F04    4.4E6    5.0E6    1.2E3    1.9E4    2.2E6    3.0E4    4.2E6    1.00     37.55    3.22
F05    3.0E4    3.2E4    51.01    386.33   1.6E4    884.46   2.6E4    1.00     41.13    4.57
F06    58.51    992.55   6.50     24.69    808.48   59.24    746.91   1.00     189.71   2.02
F07    5.71     8.53     1.25     5.13     7.94     5.23     7.47     1.00     6.00     1.68
F08    26.42    25.03    1.48     3.70     33.52    6.16     21.50    1.00     7.75     1.45
F09    2.43     8.66     1.28     4.90     5.93     2.09     7.28     1.00     7.40     1.42
F10    1.89     4.94     1.18     2.66     3.00     2.08     2.76     1.00     2.01     1.74
F11    7.74     12.61    1.21     3.36     12.14    6.01     9.60     1.00     7.81     1.62
F12    1.33     2.33     1.44     1.69     2.08     1.74     2.11     1.00     1.70     1.21
F13    28.04    56.57    2.17     5.54     60.57    19.71    52.86    1.00     20.98    2.28
F14    4.28     54.02    1.97     4.85     33.61    10.60    49.11    1.00     19.72    1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function (mean benchmark function value versus number of generations).

to succeed on this benchmark function within the maximum number of generations. At last, BBO, DE, and SGA converge to values very close to HSBA's.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can approximately divide all the algorithms into three groups: one group, including BBO, HSBA, and SGA, performing the best; a second group, including ACO, DE,

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function.

GA, and PSO, performing the second best; and a third group, including BA, ES, and HS, performing the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HSBA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, we see that, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HSBA is outperformed by BBO between generations 10–30, it has a stable convergence rate at finding the global minimum


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function.

Figure 4: Comparison of the performance of the different methods for the F04 Penalty #1 function.

and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HSBA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately

Figure 5: Comparison of the performance of the different methods for the F05 Penalty #2 function.

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function.

divided into three groups: one group, including BBO and HSBA, performing the best; a second group, including ACO, GA, PSO, and SGA, performing the second best; and a third group, including DE, ES, and HS, performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HSBA; however, it is outperformed by HSBA after 40 generations. At last, HSBA reaches the optimal solution, significantly superior to the other algorithms. BBO is only inferior

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function.

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function.

to HSBA and performs the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function.

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function.

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13, for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HSBA.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14, for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function.

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function.

we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

From the above analyses of Figures 1–14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm HSBA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are only inferior to HSBA; ACO, BBO, and SGA perform better than HSBA on the benchmarks F04-F05, the benchmark F12, and the benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function.

Figure 14: Comparison of the performance of the different methods for the F14 Step function.

convergence rate initially, while later it converges more and more slowly towards the true objective function value. At last, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem; the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HSBA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

A      0        0.1      0.2      0.3      0.4     0.5     0.6     0.7      0.8     0.9     1.0
F01    1.66     1.33     1.55     1.40     1.00    1.33    1.21    1.22     1.16    1.13    1.14
F02    1.25E4   1.00     1.00     1.00     1.00    1.00    1.00    1.00     1.00    1.00    1.00
F03    3.41     11.15    1.00     1.00     1.00    1.00    1.00    1.00     1.00    1.00    1.00
F04    2.73E4   1.00     1.25E4   1.04     1.04    1.04    1.04    1.04     1.04    1.00    1.00
F05    5.03E5   5.30     1.33     1.68E4   1.00    1.00    1.00    1.00     1.00    1.00    1.00
F06    1.00     2.56     5.47     4.73     17.81   9.90    2.89    6.74     9.90    2.60    5.57
F07    10.14    14.92    4.35     1.10     1.10    15.37   1.01    1.00     1.00    1.00    1.00
F08    38.49    1.08     14.06    1.08     1.08    1.08    99.01   1.01     1.00    1.00    1.00
F09    285.18   404.58   1.38     4.74     1.19    1.19    1.19    367.91   1.01    1.00    1.00
F10    665.22   4.68     1.69E3   15.27    1.18    1.18    1.18    1.18     1.21    1.00    1.00
F11    20.03    1.00     9.96     11.54    39.22   9.96    9.96    9.96     9.96    38.97   8.49
F12    10.32    1.00     13.22    3.23     14.69   2.79    2.79    2.79     2.79    2.79    19.31
F13    1.46     4.15     2.84     4.47     1.15    1.56    1.00    1.00     1.00    1.00    1.00
F14    527.52   13.57    3.95     1.01     1.15    12.91   1.00    1.00     1.00    1.00    1.00
Total  1        4        2        2        3       3       5       6        7       10      10

4.2. Influence of Control Parameters. In [28], Dr. Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, the pulse rate r, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HSBA, we carried out this experiment on the benchmark problems with different parameters. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results, recorded after 100 Monte Carlo runs, are given in Tables 5–12. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, while Tables 6, 8, 10, and 12 show the average minima found by the HSBA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 show the best performance and Tables 6, 8, 10, and 12 the average performance of the HSBA algorithm. In each table, the last row is the total number of functions on which HSBA performs the best with the corresponding parameter value.

4.2.1. Loudness A. Tables 5 and 6 record the results of experiments on the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can obviously be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for the benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (< 0.5); (iii) however, for the functions F05, F07–F10, F13, and F14, HSBA performs better with bigger A (> 0.5). In sum, HSBA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results of experiments on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HSBA performs significantly better with bigger r (> 0.5) than with smaller r (< 0.5), and HSBA performs the best when r is equal or very close to 0.6. So we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results of experiments on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HSBA are better (smaller) with bigger HMCR (> 0.5) than with smaller HMCR (< 0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results of experiments on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HSBA vary little with increasing PAR, and HSBA reaches the optimum (minimum) on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.

4.3. Discussion. In the HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., best solutions). Four parameters control this behavior: the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR),


Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A      0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.01     1.01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02    1.52     1.00     1.46     1.17     1.08     1.37     1.88     1.65     1.67     1.79     1.48
F03    6.69     10.41    1.06     1.01     1.03     1.03     1.01     1.02     1.05     1.00     1.00
F04    201.31   1.26     4.10     4.21     3.33     2.72     1.00     5.19     2.91     1.00     3.11
F05    591.22   16.82    4.25     5.87     1.01     4.14     24.04    1.00     3.68     7.33     9.00
F06    1.00     4.90E4   637.99   258.71   4.95E4   2.18     2.17     2.17     2.18     2.17     2.17
F07    8.57     2.21E6   6.43     272.34   110.48   2.11E4   1.02     1.01     1.00     1.00     1.00
F08    49.80    1.14E4   1.72E4   161.60   224.59   91.19    1.74E4   1.03     1.11     1.00     1.00
F09    82.37    1.73E6   1.06     3.14     74.45    103.10   42.26    7.94E3   1.00     1.37     1.29
F10    90.31    1.45E3   2.45E5   2.32E3   23.34    22.34    31.38    12.89    2.34E3   1.40     1.00
F11    4.98     1.00     3.15E4   2.33     14.41    520.23   443.65   616.26   249.91   4.78E4   2.14
F12    3.69     2.10E4   4.57E4   1.00     2.12E4   266.35   233.13   198.80   276.14   112.01   2.14E4
F13    1.90     8.25     4.48E6   2.14E6   1.00     6.15     254.51   222.75   189.96   263.86   107.01
F14    66.91    1.95E5   1.17E4   1.24E3   21.04    1.86E3   29.40    23.92    1.00     18.35    24.75
Total  1        2        1        2        2        1        2        2        4        5        5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r      0        0.1      0.2      0.3      0.4      0.5     0.6     0.7      0.8      0.9     1.0
F01    1.84     1.84     1.84     4.84     1.59     1.71    1.00    1.34     1.09     1.16    1.01
F02    9.44E3   1.00     1.00     1.00     1.00     1.00    1.00    1.00     1.00     1.00    1.00
F03    3.73     5.46     1.86     1.86     1.86     1.86    1.00    1.72     1.21     1.61    1.01
F04    11.42    1.11     24.29    1.23     1.26     1.00    1.29    1.29     1.29     1.29    1.29
F05    1.77E5   2.30     1.15     1.09E4   1.00     1.00    1.00    1.00     1.00     1.00    1.00
F06    62.44    29.96    48.58    15.60    26.34    7.80    1.82    1.00     35.20    2.39    1.55
F07    8.31     4.48     2.58     1.29     1.29     3.49    1.00    1.00     1.00     1.00    1.00
F08    9.61     1.18     4.20     1.37     1.37     1.37    6.95    1.01     1.01     1.00    1.00
F09    373.94   444.22   1.68     3.35     1.68     1.68    1.00    291.98   1.01     1.01    1.01
F10    386.93   3.16     13.97    4.63     1.58     1.58    1.00    1.58     309.31   1.58    1.01
F11    42.55    1.00     18.29    21.35    20.18    11.11   19.04   21.35    15.81    18.76   11.55
F12    114.40   1.00     141.84   44.48    125.47   44.48   44.48   44.48    44.48    44.48   69.43
F13    6.39     6.37     6.01     2.96     3.02     1.00    1.52    1.39     1.00     2.57    2.49
F14    120.52   4.04     2.33     1.00     1.16     3.40    1.00    1.00     1.16     1.16    1.16
Total  0        3        1        2        2        4       8       5        4        4       4

and pitch adjustment rate (PAR) The appropriate update forthe pitch adjustment rate (PAR) and the rate of pulse emission(119903) balances the exploration and exploitation behavior of eachbat respectively as the loudness usually decreases once a bathas found its preysolution while the rate of pulse emissionincreases in order to raise the attack accuracy

For all of the standard benchmark functions that have been considered, the HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HSBA performing significantly better on some functions. The HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search via the harmony search algorithm and the global search via the bat algorithm concurrently. A similar behavior may be achieved in PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include being simple and easy to implement and having only four parameters to regulate. The work carried out here demonstrates the HSBA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning


Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r      0       0.1      0.2     0.3      0.4      0.5      0.6      0.7     0.8      0.9     1.0
F01    1.84    1.99     1.86    1.94     1.59     1.00     1.42     1.34    1.09     1.16    1.71
F02    3.31    5.38     4.57    3.08     3.82     1.14     1.00     2.89    2.79     4.02    3.11
F03    3.73    5.46     5.33    2.29     3.36     2.41     1.00     1.72    1.21     1.61    1.36
F04    11.42   1.11     24.29   1.49E4   1.13E3   3.94E3   2.78E4   19.66   175.22   1.14    1.00
F05    16.09   128.62   32.69   24.46    64.40    12.77    29.59    76.17   4.56     1.00    4.46
F06    62.44   26.96    48.58   15.60    26.34    7.80     1.82     1.00    35.20    2.39    1.55
F07    3.30    1.78     1.34    1.03     1.23     1.39     1.55     1.00    1.19     1.36    3.49
F08    4.88    3.93     3.94    1.94     2.19     3.83     3.53     4.33    1.00     2.27    3.14
F09    1.39    1.66     1.14    1.00     1.21     1.15     1.15     1.09    1.08     1.04    1.09
F10    7.21    9.94     8.75    3.60     8.46     6.11     4.83     4.14    5.76     1.00    2.71
F11    3.82    4.23     3.73    2.05     1.81     1.00     1.71     2.20    1.42     1.68    1.89
F12    1.64    2.17     2.04    1.47     1.80     1.86     1.00     1.21    1.13     1.34    1.56
F13    6.39    6.37     6.01    2.96     3.02     1.90     1.52     1.39    1.00     2.57    2.49
F14    5.74    15.43    7.42    6.82     6.44     3.17     1.00     4.85    2.21     1.74    2.09
Total  0       0        0       1        0        2        4        2       2        2       1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR   0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9     1.0
F01    1.64     1.64     1.64     1.64     1.64     1.64     1.64     1.51     1.28     1.00    1.63
F02    1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00    1.00
F03    19.82    26.49    3.71     3.71     3.71     3.71     3.71     3.71     3.71     2.06    1.00
F04    1.26E5   1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00    1.00
F05    6.07E5   5.34     1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00    1.00
F06    108.92   246.31   365.04   345.40   338.57   234.65   143.49   45.28    23.30    1.00    25.75
F07    9.22     14.28    5.34     1.00     1.00     9.30     1.00     1.00     1.00     1.00    1.00
F08    18.10    1.08     7.72     1.08     1.08     1.08     35.94    1.00     1.00     1.00    1.00
F09    280.04   391.61   1.27     6.81     1.27     1.27     1.27     227.95   1.00     1.00    1.00
F10    551.65   8.76     1.76E3   11.70    1.64     1.64     1.64     1.64     274.48   1.00    1.00
F11    14.67    1.00     6.18     6.18     13.99    6.18     6.18     6.18     4.40     2.69    6.16
F12    7.68     1.00     11.10    2.73     10.21    2.73     2.73     2.73     2.73     2.73    4.73
F13    7.72     26.75    15.90    17.76    6.12     10.40    6.12     5.95     2.55     1.00    2.25
F14    537.16   14.28    5.34     1.00     1.00     7.13     1.00     1.00     1.00     1.00    1.00
Total  0        4        2        4        5        3        5        6        7        11      9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if, for example, we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck to the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR   0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.00     1.01
F02    747.56   7.87     5.25     3.43     5.09     7.10     3.07     1.87     1.60     1.00     1.01
F03    9.13     3.12E4   1.09     1.07     1.05     1.06     1.05     1.02     1.01     1.01     1.00
F04    2.10E6   1.00E4   3.95E4   9.48E3   5.97E3   2.95E3   1.51E3   1.50E3   46.94    1.00     2.37
F05    3.24E6   3.27E4   2.09E4   4.14E4   7.13E3   1.54E3   8.93E3   1.53E3   629.48   1.00     203.73
F06    1.00     5.94E4   422.04   632.15   6.00E4   1.92     1.91     1.91     1.91     1.91     1.91
F07    10.63    2.07E6   9.00     216.72   324.55   3.07E4   1.04     1.03     1.02     1.01     1.00
F08    59.50    9.88E3   3.04E4   141.80   217.01   324.68   3.07E4   1.07     1.07     1.03     1.00
F09    226.87   4.95E6   3.08     8.91     114.22   173.60   259.41   2.45E4   1.45     1.00     1.71
F10    257.37   2.83E4   9.35E5   1.37E4   97.09    66.56    99.25    149.40   1.39E4   1.00     2.73
F11    6.07     1.00     1.83E4   2.02     16.61    390.40   262.99   403.00   603.61   5.72E4   1.84
F12    3.00     2.79E4   3.60E4   1.00     2.82E4   270.95   194.14   130.77   200.40   300.16   2.84E4
F13    2.32     9.66     5.61E6   1.89E6   1.00     8.15     267.78   191.86   129.23   198.05   296.65
F14    184.66   3.84E5   1.17E4   1.85E3   1.00     5.71E3   25.11    55.39    39.60    26.57    41.49
Total  1        1        0        1        2        0        0        0        0        6        3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR    0        0.1     0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.00     1.00    1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02    3.91E3   1.00    1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03    1.00     4.25    1.51     2.20     2.20     2.12     2.20     1.99     2.20     1.28     1.80
F04    1.00     2.55    65.54    2.55     2.55     2.55     2.55     2.55     2.55     2.55     2.55
F05    2.52     1.00    1.71     1.69E3   1.71     1.71     1.71     1.71     1.71     1.71     1.71
F06    1.00     59.87   34.18    22.85    46.52    23.44    43.54    44.84    36.88    18.99    8.12
F07    8.07     1.00    1.48     2.55     2.55     13.06    2.55     2.55     2.55     2.55     2.55
F08    5.43     1.00    1.92     1.00     1.00     1.00     11.45    1.00     1.00     1.00     1.00
F09    76.72    2.52    1.17     1.00     1.71     1.71     1.71     155.56   1.71     1.71     1.71
F10    929.10   1.48    1.00     4.91     2.55     2.55     2.55     2.22     2.35E3   2.55     2.55
F11    3.18E3   1.00    5.82E3   3.99E3   3.39E3   5.82E3   4.91E3   5.82E3   5.82E3   9.58E3   5.82E3
F12    305.06   1.00    537.28   97.34    187.58   97.34    97.34    97.34    97.34    97.34    398.27
F13    1.92     4.22    5.87     10.07    12.98    4.66     1.48     4.01     3.17     1.00     4.09
F14    88.12    1.00    1.48     2.55     2.55     4.91     2.55     2.55     2.55     2.55     2.55
Total  4        8       3        4        3        3        2        3        3        4        3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HSBA, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HSBA attempts to combine the merits of the BA and the HS in order to avoid all bats getting trapped in inferior locally optimal regions. The HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA uses the information in past solutions more effectively to generate better quality solutions frequently, when compared to the other population-based


Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR    0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01    1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02    4.07     1.00     1.00     4.52     2.77     2.57     3.41     5.36     3.69     4.89     1.03
F03    1.00     1.27E4   1.34     1.35     1.35     1.35     1.34     1.35     1.34     1.34     1.34
F04    1.15     81.27    9.39E3   1.00     1.01     1.01     1.29     1.00     1.00     1.01     1.01
F05    345.73   50.61    86.19    9.07E3   25.01    1.08     1.01     1.91     3.33     1.00     1.18
F06    1.00     5.42E6   4.27E4   4.74E4   5.48E6   577.88   577.88   577.88   577.88   577.87   577.87
F07    4.86     1.53     1.00     95.43    105.96   1.22E4   1.34     1.36     1.32     1.33     1.35
F08    12.69    76.00    8.76E3   17.03    69.13    76.72    8.85E3   1.10     1.05     1.04     1.00
F09    63.45    230.47   1.82     1.67     12.78    48.60    53.49    6.09E3   1.11     1.30     1.00
F10    133.55   12.51    1.26     1.97E3   11.48    4.64     16.45    18.93    1.99E3   1.00     1.88
F11    63.02    1.00     6.39E3   79.44    59.24    3.96E3   1.42E3   5.81E3   6.45E3   7.45E5   79.81
F12    3.90     8.81E3   8.90E3   1.00     8.90E3   44.40    47.78    17.25    70.11    77.84    8.99E3
F13    1.00     21.66    2.07E3   6.69     5.80     4.30     271.67   282.46   105.39   429.26   476.62
F14    46.14    1.00     36.10    55.93    1.22     6.40E3   42.15    32.89    34.81    12.72    51.29
Total  4        4        3        3        1        1        1        2        2        3        3

optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that the HSBA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45], and on real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only considered unconstrained function optimization in this work. Our future work consists of adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HSBA to solve practical engineering optimization problems, and, obviously, HSBA can become a fascinating method for real-world engineering optimization problems; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.


[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

Journal of Applied Mathematics 21

[49] S M Islam S Das S Ghosh S Roy and P N Suganthan ldquoAnadaptive differential evolution algorithm with novel mutationand crossover strategies for global numerical optimizationrdquoIEEE Transactions on Systems Man and Cybernetics B vol 42no 2 pp 482ndash500 2012

[50] J J Liang A K Qin P N Suganthan and S Baskar ldquoCompre-hensive learning particle swarm optimizer for global optimiza-tion of multimodal functionsrdquo IEEE Transactions on Evolution-ary Computation vol 10 no 3 pp 281ndash295 2006

[51] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010



intelligence algorithm which is based on the swarm behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we will use the same parameters for HS, BA, and HSBA: loudness A = 0.95, pulse rate r = 0.6, scaling factor ε = 0.1, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO: initial pheromone value τ0 = 1E-6, pheromone update constant Q = 20, exploration constant q0 = 1, global pheromone decay rate ρg = 0.9, local pheromone decay rate ρl = 0.5, pheromone sensitivity s = 1, and visibility sensitivity β = 5. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor F = 0.5 and a crossover constant CR = 0.5. For the ES: the number of offspring λ = 10 produced in each generation and standard deviation σ = 1 for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO, we set an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions, based on mathematical functions, can be applied as objective functions to perform such tests, and their properties can be easily obtained from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HSBA. Each of the functions in this study has 20 independent variables (i.e., D = 20).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of best solutions in the search space: unimodal benchmark functions have only one optimum, the global optimum, whereas multimodal benchmark functions have at least two optima in their search space, that is, one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set population size NP = 50, an elitism parameter Keep = 2, and maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to get representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.
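The evaluation protocol itself is straightforward to express; the following is a hedged sketch (the optimizer argument and its signature are hypothetical placeholders for any of the ten methods):

```python
import numpy as np

def monte_carlo_eval(optimizer, objective, runs=100):
    """Run an optimizer 'runs' times on one benchmark and report the
    average minimum (a Table 3 entry) and the best minimum (a Table 4 entry)."""
    minima = np.array([optimizer(objective, pop_size=50, keep=2, max_gen=50)
                       for _ in range(runs)])
    return minima.mean(), minima.min()
```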

From Table 3, we see that, on average, HSBA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06-F11, and F13-F14). ACO is the second most effective, performing the best on benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HSBA performs the best on thirteen of the fourteen benchmarks (F02-F14), while BBO performs the best at finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on the Friedman test [43], reveals that the differences in the obtained average and best function minima are statistically significant (P = 1.66 × 10^-17 and P = 7.25 × 10^-17, resp.) at the 5% significance level.
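As an aside for readers reproducing this kind of nonparametric comparison, the Friedman test is available in SciPy; the snippet below is a generic illustration with made-up placeholder values, not the paper's data:

```python
from scipy.stats import friedmanchisquare

# Each list holds one algorithm's minima over the same benchmarks
# (values are illustrative only).
hsba = [1.00, 1.00, 1.05, 1.00, 1.38]
bbo = [1.15, 1.58, 1.00, 1.93, 1.00]
sga = [1.00, 1.33, 1.04, 1.42, 1.12]

stat, p = friedmanchisquare(hsba, bbo, sga)
print(f"Friedman statistic = {stat:.3f}, p-value = {p:.3g}")
# p < 0.05 would indicate statistically significant differences.
```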

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of each optimization method as applied to the 14 benchmarks discussed in this section. The results are shown in Table 3. BA was the quickest optimization method; HSBA was the third fastest of the ten algorithms. However, it should be noted that, in the vast majority of real-world applications, the fitness function evaluation is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA are shown in Figures 1-14, which represent the optimization process. The values shown in Figures 1-14 are the mean objective function optima achieved over 100 Monte Carlo simulations; they are the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods when the F01 Ackley function is applied. From Figure 1, we can clearly conclude that BBO is significantly superior to all the other algorithms, including HSBA, during the process of optimization, while HSBA performs the second best on this multimodal benchmark function. Here, all the algorithms show almost the same starting point, and HSBA has a faster convergence rate than the other algorithms, although it is overtaken by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to HSBA's. Obviously, BBO, HSBA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HSBA outperforms all other methods during the whole progress of optimization (except during generations 20-33, where BBO and HSBA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HSBA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HSBA initially shows a faster convergence rate than ACO; however, it is overtaken by ACO after 5 generations. For the other algorithms: although slower, BBO, GA, and SGA eventually find global minima close to HSBA's, while BA, DE, ES, HS, and PSO fail to reach the global minimum within the limited number of iterations.

Table 1: Benchmark functions.

No. Name Definition Source
F01 Ackley $f(x) = 20 + e - 20 \cdot e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n} x_i^2}} - e^{(1/n)\sum_{i=1}^{n}\cos(2\pi x_i)}$ [3]
F02 Fletcher-Powell $f(x) = \sum_{i=1}^{n}(A_i - B_i)^2$, $A_i = \sum_{j=1}^{n}(a_{ij}\sin\alpha_j + b_{ij}\cos\alpha_j)$, $B_i = \sum_{j=1}^{n}(a_{ij}\sin x_j + b_{ij}\cos x_j)$ [4]
F03 Griewank $f(x) = \sum_{i=1}^{n} x_i^2/4000 - \prod_{i=1}^{n}\cos(x_i/\sqrt{i}) + 1$ [5]
F04 Penalty 1 $f(x) = (\pi/30)\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2[1 + 10\sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, $y_i = 1 + 0.25(x_i + 1)$ [6]
F05 Penalty 2 $f(x) = 0.1\{\sin^2(3\pi x_1) + \sum_{i=1}^{n-1}(x_i - 1)^2[1 + \sin^2(3\pi x_{i+1})] + (x_n - 1)^2[1 + \sin^2(2\pi x_n)]\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ [6]
F06 Quartic with noise $f(x) = \sum_{i=1}^{n}(i \cdot x_i^4 + U(0, 1))$ [7]
F07 Rastrigin $f(x) = 10n + \sum_{i=1}^{n}(x_i^2 - 10\cos(2\pi x_i))$ [8]
F08 Rosenbrock $f(x) = \sum_{i=1}^{n-1}[100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ [7]
F09 Schwefel 2.26 $f(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2})$ [9]
F10 Schwefel 1.2 $f(x) = \sum_{i=1}^{n}(\sum_{j=1}^{i} x_j)^2$ [10]
F11 Schwefel 2.22 $f(x) = \sum_{i=1}^{n}|x_i| + \prod_{i=1}^{n}|x_i|$ [10]
F12 Schwefel 2.21 $f(x) = \max_i |x_i|,\ 1 \le i \le n$ [6]
F13 Sphere $f(x) = \sum_{i=1}^{n} x_i^2$ [7]
F14 Step $f(x) = 6n + \sum_{i=1}^{n}\lfloor x_i \rfloor$ [7]

*In benchmark function F02, the matrix elements $a_{n \times n}, b_{n \times n} \in (-100, 100)$ and $\alpha_{n \times 1} \in (-\pi, \pi)$ are drawn from uniform distributions [4].
*In benchmark functions F04 and F05, the function $u(x_i, a, k, m)$ is given by $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$.
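To make the definitions in Table 1 concrete, here is a small Python sketch of three representative benchmarks (Ackley, Griewank, and Step) written directly from the formulas above; this is our own illustration, not code from the paper:

```python
import numpy as np

def ackley(x):
    """F01 Ackley: multimodal, global minimum 0 at x = 0."""
    return (20 + np.e
            - 20 * np.exp(-0.2 * np.sqrt(np.mean(x ** 2)))
            - np.exp(np.mean(np.cos(2 * np.pi * x))))

def griewank(x):
    """F03 Griewank: multimodal, global minimum 0 at x = 0."""
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

def step(x):
    """F14 Step: discontinuous, plateau-structured."""
    return 6 * x.size + np.sum(np.floor(x))

x = np.zeros(20)                        # D = 20, as in the experiments
print(ackley(x), griewank(x), step(x))  # approximately 0, 0, 120
```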

Table 2: Properties of benchmark functions. lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.

No. Function lb ub opt Continuity Modality
F01 Ackley -32.768 32.768 0 Continuous Multimodal
F02 Fletcher-Powell -π π 0 Continuous Multimodal
F03 Griewank -600 600 0 Continuous Multimodal
F04 Penalty 1 -50 50 0 Continuous Multimodal
F05 Penalty 2 -50 50 0 Continuous Multimodal
F06 Quartic with noise -1.28 1.28 1 Continuous Multimodal
F07 Rastrigin -5.12 5.12 0 Continuous Multimodal
F08 Rosenbrock -2.048 2.048 0 Continuous Unimodal
F09 Schwefel 2.26 -512 512 0 Continuous Multimodal
F10 Schwefel 1.2 -100 100 0 Continuous Unimodal
F11 Schwefel 2.22 -10 10 0 Continuous Unimodal
F12 Schwefel 2.21 -100 100 0 Continuous Unimodal
F13 Sphere -5.12 5.12 0 Continuous Unimodal
F14 Step -5.12 5.12 0 Discontinuous Unimodal

Table 3: Mean normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

 ACO BA BBO DE ES GA HS HSBA PSO SGA
F01 2.31 3.33 1.15 2.02 3.38 2.72 3.47 1.09 2.66 1.00
F02 24.58 25.82 1.58 8.94 24.35 5.45 15.69 1.00 13.96 1.33
F03 3.16 60.72 1.93 5.44 23.85 3.22 77.22 1.00 25.77 1.42
F04 1.00 3.0E38 4.0E32 5.6E33 2.7E38 3.1E32 1.4E39 2.3E32 4.1E36 9.6E31
F05 1.00 1.1E8 299.42 1.5E6 4.6E8 5.4E3 4.1E8 215.51 5.5E7 111.10
F06 489.01 6.8E3 35.32 308.29 1.8E4 274.83 1.5E4 1.00 2.5E3 10.09
F07 8.09 11.55 1.28 6.56 11.87 6.17 10.22 1.00 8.44 1.80
F08 42.25 29.01 2.29 7.59 59.99 9.05 47.85 1.00 12.04 2.15
F09 3.17 20.26 1.99 13.58 13.33 1.81 19.92 1.00 17.61 1.15
F10 1.75 3.73 1.38 2.95 4.93 1.25 4.22 1.00 2.48 1.48
F11 1.05 19.70 1.83 7.14 23.12 11.13 19.45 1.00 13.22 2.46
F12 1.86 4.03 1.00 2.99 3.91 1.92 3.74 1.38 2.38 1.12
F13 98.30 150.84 3.80 19.03 226.52 47.74 182.32 1.00 72.91 4.02
F14 7.73 120.48 3.93 13.31 102.56 11.53 146.55 1.00 63.44 3.28
Time 2.74 1.00 1.32 1.64 1.67 1.79 2.33 1.43 2.03 1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm but the average minima found by each algorithm.
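The normalization used in Tables 3 and 4 (and in Tables 5-12 below) is a per-row rescaling by the row minimum; a short sketch of the idea (our illustration):

```python
import numpy as np

def normalize_rows(table):
    """Divide each row by its minimum so the best entry becomes 1.00."""
    table = np.asarray(table, dtype=float)
    return table / table.min(axis=1, keepdims=True)

print(normalize_rows([[4.0, 2.0, 8.0]]))  # [[2. 1. 4.]]
```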

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4, it is obvious that HSBA outperforms all the other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best, respectively, at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty 2 function. For this multimodal function, similar to the F04 Penalty 1 function shown in Figure 4, HSBA is significantly superior to all the other algorithms during the process of optimization. Here, PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HSBA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HSBA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HSBA gets closer and closer to the minimum, while PSO converges prematurely and gets trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage to succeed on this benchmark function within the maximum number of generations. In the end, BBO, DE, and SGA converge to values very close to HSBA's.

Table 4: Best normalized optimization results on fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

 ACO BA BBO DE ES GA HS HSBA PSO SGA
F01 1.85 2.31 1.00 1.43 2.25 2.03 2.30 1.05 1.91 1.04
F02 10.42 14.48 1.09 4.22 10.39 4.03 9.91 1.00 8.28 1.33
F03 2.73 46.22 1.87 4.47 21.38 8.02 43.24 1.00 16.75 1.76
F04 4.4E6 5.0E6 1.2E3 1.9E4 2.2E6 3.0E4 4.2E6 1.00 37.55 3.22
F05 3.0E4 3.2E4 51.01 386.33 1.6E4 884.46 2.6E4 1.00 41.13 4.57
F06 58.51 992.55 6.50 24.69 808.48 59.24 746.91 1.00 189.71 2.02
F07 5.71 8.53 1.25 5.13 7.94 5.23 7.47 1.00 6.00 1.68
F08 26.42 25.03 1.48 3.70 33.52 6.16 21.50 1.00 7.75 1.45
F09 2.43 8.66 1.28 4.90 5.93 2.09 7.28 1.00 7.40 1.42
F10 1.89 4.94 1.18 2.66 3.00 2.08 2.76 1.00 2.01 1.74
F11 7.74 12.61 1.21 3.36 12.14 6.01 9.60 1.00 7.81 1.62
F12 1.33 2.33 1.44 1.69 2.08 1.74 2.11 1.00 1.70 1.21
F13 28.04 56.57 2.17 5.54 60.57 19.71 52.86 1.00 20.98 2.28
F14 4.28 54.02 1.97 4.85 33.61 10.60 49.11 1.00 19.72 1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can roughly divide the algorithms into three groups: one group, including BBO, HSBA, and SGA, performing the best; a second group, including ACO, DE, GA, and PSO, performing the second best; and a third group, including BA, ES, and HS, performing the worst.

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HSBA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, it is very clear that, though HSBA is outperformed by BBO between generations 10 and 30, it has a stable convergence rate at finding the global minimum and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 3: Comparison of the performance of the different methods for the F03 Griewank function.

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HSBA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be roughly divided into three groups: one group, including BBO and HSBA, performing the best; a second group, including ACO, GA, PSO, and SGA, performing the second best; and a third group, including DE, ES, and HS, performing the worst.

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function.

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 40 generations. In the end, HSBA reaches the optimal solution, significantly superior to the other algorithms; BBO is inferior only to HSBA and performs the second best on this unimodal function.

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function.

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function.

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function.

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to BBO's value, which approaches HSBA's.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to BBO's value.

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function.

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function.

From the above analyses of Figures 1-14, we can conclude that our proposed hybrid metaheuristic algorithm HSBA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are inferior only to HSBA: ACO, BBO, and SGA perform better than HSBA on benchmarks F04-F05, benchmark F12, and benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster convergence rate initially, while later it converges more and more slowly towards the true objective function value. Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem; the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HSBA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function.

Figure 14: Comparison of the performance of the different methods for the F14 Step function.


Table 5: Best normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

A 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.66 1.33 1.55 1.40 1.00 1.33 1.21 1.22 1.16 1.13 1.14
F02 1.25E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F03 3.41 11.15 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F04 2.73E4 1.00 1.25E4 1.04 1.04 1.04 1.04 1.04 1.04 1.00 1.00
F05 5.03E5 5.30 1.33 1.68E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F06 1.00 2.56 5.47 4.73 17.81 9.90 2.89 6.74 9.90 2.60 5.57
F07 10.14 14.92 4.35 1.10 1.10 15.37 1.01 1.00 1.00 1.00 1.00
F08 38.49 1.08 14.06 1.08 1.08 1.08 99.01 1.01 1.00 1.00 1.00
F09 285.18 404.58 1.38 4.74 1.19 1.19 1.19 367.91 1.01 1.00 1.00
F10 665.22 4.68 1.69E3 15.27 1.18 1.18 1.18 1.18 1.21 1.00 1.00
F11 20.03 1.00 9.96 11.54 39.22 9.96 9.96 9.96 9.96 38.97 8.49
F12 10.32 1.00 13.22 3.23 14.69 2.79 2.79 2.79 2.79 2.79 19.31
F13 1.46 4.15 2.84 4.47 1.15 1.56 1.00 1.00 1.00 1.00 1.00
F14 527.52 13.57 3.95 1.01 1.15 12.91 1.00 1.00 1.00 1.00 1.00
Total 1 4 2 2 3 3 5 6 7 10 10

4.2. Influence of Control Parameters. In [28], Dr. Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, pulse rate r, harmony memory consideration rate HMCR, and pitch adjustment rate PAR on the performance of HSBA, we carried out experiments on the benchmark problems with different parameter values. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5-12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HSBA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs the best with a given parameter value.

4.2.1. Loudness A. Tables 5 and 6 record the results of experiments on the benchmark problems with loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (< 0.5); (iii) however, for functions F05, F07-F10, F13, and F14, HSBA performs better with bigger A (> 0.5). In sum, HSBA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results of experiments on the benchmark problems with pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can clearly conclude that HSBA performs significantly better with bigger r (> 0.5) than with smaller r (< 0.5) and that HSBA performs the best when r is equal or very close to 0.6, so we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results of experiments on the benchmark problems with harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0, fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can see that the function values evaluated by HSBA are better (smaller) with bigger HMCR (> 0.5) than with smaller HMCR (< 0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1, so we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results of experiments on the benchmark problems with pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0, fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can see that the function values for HSBA vary little with increasing PAR and that HSBA reaches its minimum on most benchmarks when PAR is equal or very close to 0.1, so we set PAR = 0.1 in the other experiments.

4.3. Discussion. In the HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., best solutions). Four parameters govern the search: the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR). The appropriate update of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balances the exploration and exploitation behavior of each bat, respectively, since the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.
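For reference, in the standard BA formulation [28] these two dynamics are commonly written as follows (added here for clarity; α and γ are constants with 0 < α < 1 and γ > 0):

$$A_i^{t+1} = \alpha A_i^{t}, \qquad r_i^{t+1} = r_i^{0}\left[1 - e^{-\gamma t}\right],$$

so that $A_i^{t} \to 0$ and $r_i^{t} \to r_i^{0}$ as $t \to \infty$.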

Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.01 1.01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F02 1.52 1.00 1.46 1.17 1.08 1.37 1.88 1.65 1.67 1.79 1.48
F03 6.69 10.41 1.06 1.01 1.03 1.03 1.01 1.02 1.05 1.00 1.00
F04 201.31 1.26 4.10 4.21 3.33 2.72 1.00 5.19 2.91 1.00 3.11
F05 591.22 16.82 4.25 5.87 1.01 4.14 24.04 1.00 3.68 7.33 9.00
F06 1.00 4.90E4 637.99 258.71 4.95E4 2.18 2.17 2.17 2.18 2.17 2.17
F07 8.57 2.21E6 6.43 272.34 110.48 2.11E4 1.02 1.01 1.00 1.00 1.00
F08 49.80 1.14E4 1.72E4 161.60 224.59 91.19 1.74E4 1.03 1.11 1.00 1.00
F09 82.37 1.73E6 1.06 3.14 74.45 103.10 42.26 7.94E3 1.00 1.37 1.29
F10 90.31 1.45E3 2.45E5 2.32E3 23.34 22.34 31.38 12.89 2.34E3 1.40 1.00
F11 4.98 1.00 3.15E4 2.33 14.41 520.23 443.65 616.26 249.91 4.78E4 2.14
F12 3.69 2.10E4 4.57E4 1.00 2.12E4 266.35 233.13 198.80 276.14 112.01 2.14E4
F13 1.90 8.25 4.48E6 2.14E6 1.00 6.15 254.51 222.75 189.96 263.86 107.01
F14 66.91 1.95E5 1.17E4 1.24E3 21.04 1.86E3 29.40 23.92 1.00 18.35 24.75
Total 1 2 1 2 2 1 2 2 4 5 5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.84 1.84 1.84 4.84 1.59 1.71 1.00 1.34 1.09 1.16 1.01
F02 9.44E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F03 3.73 5.46 1.86 1.86 1.86 1.86 1.00 1.72 1.21 1.61 1.01
F04 11.42 1.11 24.29 1.23 1.26 1.00 1.29 1.29 1.29 1.29 1.29
F05 1.77E5 2.30 1.15 1.09E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F06 62.44 29.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55
F07 8.31 4.48 2.58 1.29 1.29 3.49 1.00 1.00 1.00 1.00 1.00
F08 9.61 1.18 4.20 1.37 1.37 1.37 6.95 1.01 1.01 1.00 1.00
F09 373.94 444.22 1.68 3.35 1.68 1.68 1.00 291.98 1.01 1.01 1.01
F10 386.93 3.16 13.97 4.63 1.58 1.58 1.00 1.58 309.31 1.58 1.01
F11 42.55 1.00 18.29 21.35 20.18 11.11 19.04 21.35 15.81 18.76 11.55
F12 114.40 1.00 141.84 44.48 125.47 44.48 44.48 44.48 44.48 44.48 69.43
F13 6.39 6.37 6.01 2.96 3.02 1.00 1.52 1.39 1.00 2.57 2.49
F14 120.52 4.04 2.33 1.00 1.16 3.40 1.00 1.00 1.16 1.16 1.16
Total 0 3 1 2 2 4 8 5 4 4 4


For all of the standard benchmark functions that have been considered, the HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HSBA performing significantly better on some functions. The HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time: it searches locally via the harmony search algorithm and globally via the bat algorithm concurrently. A similar behavior may be obtained in PSO by using a multiswarm scheme starting from an initial particle population [44]. However, HSBA's advantages include simple and easy implementation and having only four parameters to regulate. The work carried out here demonstrates the HSBA to be robust over all kinds of benchmark functions.
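Putting the pieces together, the hybrid loop described above and in Section 5 can be sketched roughly as follows; this is our hedged reconstruction based on the standard BA and HS operators (the update equations and helper structure are assumptions for illustration, not the authors' code):

```python
import random

def hsba_sketch(fitness, dim, n_bats=50, n_gen=50, lb=-5.12, ub=5.12,
                A=0.95, r=0.6, HMCR=0.95, PAR=0.1, eps=0.1):
    """Rough sketch of HS/BA: a standard BA move, then an HS-style
    harmony built from the bat population as a mutation operator,
    with greedy replacement of the bat by the fitter candidate."""
    bats = [[random.uniform(lb, ub) for _ in range(dim)]
            for _ in range(n_bats)]
    vels = [[0.0] * dim for _ in range(n_bats)]
    best = min(bats, key=fitness)
    for _ in range(n_gen):
        for i in range(n_bats):
            freq = random.random()  # assumed frequency drawn in [0, 1]
            vels[i] = [v + (x - b) * freq
                       for v, x, b in zip(vels[i], bats[i], best)]
            new = [x + v for x, v in zip(bats[i], vels[i])]
            if random.random() > r:  # local random walk around the best bat
                new = [b + eps * random.uniform(-1, 1) for b in best]
            # HS step: each dimension comes from the bat "memory" with
            # probability HMCR (pitch-adjusted with probability PAR),
            # otherwise it is re-drawn at random.
            harmony = []
            for d in range(dim):
                if random.random() < HMCR:
                    v = random.choice(bats)[d]
                    if random.random() < PAR:
                        v += eps * random.uniform(-1, 1)
                else:
                    v = random.uniform(lb, ub)
                harmony.append(v)
            # Greedy selection: the harmony replaces the new bat only if
            # it is fitter; acceptance is gated by the loudness A.
            cand = min((new, harmony), key=fitness)
            if fitness(cand) < fitness(bats[i]) and random.random() < A:
                bats[i] = cand
        best = min(bats + [best], key=fitness)
    return best

# Example: minimize the sphere function in 20 dimensions.
sphere = lambda x: sum(v * v for v in x)
print(sphere(hsba_sketch(sphere, dim=20)))
```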

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section; different tuning parameter values in the optimization algorithms might result in significant differences in their performance.

Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.84 1.99 1.86 1.94 1.59 1.00 1.42 1.34 1.09 1.16 1.71
F02 3.31 5.38 4.57 3.08 3.82 1.14 1.00 2.89 2.79 4.02 3.11
F03 3.73 5.46 5.33 2.29 3.36 2.41 1.00 1.72 1.21 1.61 1.36
F04 11.42 1.11 24.29 1.49E4 1.13E3 3.94E3 2.78E4 19.66 175.22 1.14 1.00
F05 16.09 128.62 32.69 24.46 64.40 12.77 29.59 76.17 4.56 1.00 4.46
F06 62.44 26.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55
F07 3.30 1.78 1.34 1.03 1.23 1.39 1.55 1.00 1.19 1.36 3.49
F08 4.88 3.93 3.94 1.94 2.19 3.83 3.53 4.33 1.00 2.27 3.14
F09 1.39 1.66 1.14 1.00 1.21 1.15 1.15 1.09 1.08 1.04 1.09
F10 7.21 9.94 8.75 3.60 8.46 6.11 4.83 4.14 5.76 1.00 2.71
F11 3.82 4.23 3.73 2.05 1.81 1.00 1.71 2.20 1.42 1.68 1.89
F12 1.64 2.17 2.04 1.47 1.80 1.86 1.00 1.21 1.13 1.34 1.56
F13 6.39 6.37 6.01 2.96 3.02 1.90 1.52 1.39 1.00 2.57 2.49
F14 5.74 15.43 7.42 6.82 6.44 3.17 1.00 4.85 2.21 1.74 2.09
Total 0 0 0 1 0 2 4 2 2 2 1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.64 1.64 1.64 1.64 1.64 1.64 1.64 1.51 1.28 1.00 1.63
F02 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F03 19.82 26.49 3.71 3.71 3.71 3.71 3.71 3.71 3.71 2.06 1.00
F04 1.26E5 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F05 6.07E5 5.34 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F06 108.92 246.31 365.04 345.40 338.57 234.65 143.49 45.28 23.30 1.00 25.75
F07 9.22 14.28 5.34 1.00 1.00 9.30 1.00 1.00 1.00 1.00 1.00
F08 18.10 1.08 7.72 1.08 1.08 1.08 35.94 1.00 1.00 1.00 1.00
F09 280.04 391.61 1.27 6.81 1.27 1.27 1.27 227.95 1.00 1.00 1.00
F10 551.65 8.76 1.76E3 11.70 1.64 1.64 1.64 1.64 274.48 1.00 1.00
F11 14.67 1.00 6.18 6.18 13.99 6.18 6.18 6.18 4.40 2.69 6.16
F12 7.68 1.00 11.10 2.73 10.21 2.73 2.73 2.73 2.73 2.73 4.73
F13 7.72 26.75 15.90 17.76 6.12 10.40 6.12 5.95 2.55 1.00 2.25
F14 537.16 14.28 5.34 1.00 1.00 7.13 1.00 1.00 1.00 1.00 1.00
Total 0 4 2 4 5 3 5 6 7 11 9

Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if, for example, we changed the generation limit, or looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.

Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.00 1.01
F02 747.56 7.87 5.25 3.43 5.09 7.10 3.07 1.87 1.60 1.00 1.01
F03 9.13 3.12E4 1.09 1.07 1.05 1.06 1.05 1.02 1.01 1.01 1.00
F04 2.10E6 1.00E4 3.95E4 9.48E3 5.97E3 2.95E3 1.51E3 1.50E3 46.94 1.00 2.37
F05 3.24E6 3.27E4 2.09E4 4.14E4 7.13E3 1.54E3 8.93E3 1.53E3 629.48 1.00 203.73
F06 1.00 5.94E4 422.04 632.15 6.00E4 1.92 1.91 1.91 1.91 1.91 1.91
F07 10.63 2.07E6 9.00 216.72 324.55 3.07E4 1.04 1.03 1.02 1.01 1.00
F08 59.50 9.88E3 3.04E4 141.80 217.01 324.68 3.07E4 1.07 1.07 1.03 1.00
F09 226.87 4.95E6 3.08 8.91 114.22 173.60 259.41 2.45E4 1.45 1.00 1.71
F10 257.37 2.83E4 9.35E5 1.37E4 97.09 66.56 99.25 149.40 1.39E4 1.00 2.73
F11 6.07 1.00 1.83E4 2.02 16.61 390.40 262.99 403.00 603.61 5.72E4 1.84
F12 3.00 2.79E4 3.60E4 1.00 2.82E4 270.95 194.14 130.77 200.40 300.16 2.84E4
F13 2.32 9.66 5.61E6 1.89E6 1.00 8.15 267.78 191.86 129.23 198.05 296.65
F14 184.66 3.84E5 1.17E4 1.85E3 1.00 5.71E3 25.11 55.39 39.60 26.57 41.49
Total 1 1 0 1 2 0 0 0 0 6 3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F02 3.91E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F03 1.00 4.25 1.51 2.20 2.20 2.12 2.20 1.99 2.20 1.28 1.80
F04 1.00 2.55 65.54 2.55 2.55 2.55 2.55 2.55 2.55 2.55 2.55
F05 2.52 1.00 1.71 1.69E3 1.71 1.71 1.71 1.71 1.71 1.71 1.71
F06 1.00 59.87 34.18 22.85 46.52 23.44 43.54 44.84 36.88 18.99 8.12
F07 8.07 1.00 1.48 2.55 2.55 13.06 2.55 2.55 2.55 2.55 2.55
F08 5.43 1.00 1.92 1.00 1.00 1.00 11.45 1.00 1.00 1.00 1.00
F09 76.72 2.52 1.17 1.00 1.71 1.71 1.71 155.56 1.71 1.71 1.71
F10 929.10 1.48 1.00 4.91 2.55 2.55 2.55 2.22 2.35E3 2.55 2.55
F11 3.18E3 1.00 5.82E3 3.99E3 3.39E3 5.82E3 4.91E3 5.82E3 5.82E3 9.58E3 5.82E3
F12 305.06 1.00 537.28 97.34 187.58 97.34 97.34 97.34 97.34 97.34 398.27
F13 1.92 4.22 5.87 10.07 12.98 4.66 1.48 4.01 3.17 1.00 4.09
F14 88.12 1.00 1.48 2.55 2.55 4.91 2.55 2.55 2.55 2.55 2.55
Total 4 8 3 4 3 3 2 3 3 4 3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA method for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HSBA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA uses the information in past solutions more effectively to generate better quality solutions frequently, when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that the HSBA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00
F02 4.07 1.00 1.00 4.52 2.77 2.57 3.41 5.36 3.69 4.89 1.03
F03 1.00 1.27E4 1.34 1.35 1.35 1.35 1.34 1.35 1.34 1.34 1.34
F04 1.15 81.27 9.39E3 1.00 1.01 1.01 1.29 1.00 1.00 1.01 1.01
F05 345.73 50.61 86.19 9.07E3 25.01 1.08 1.01 1.91 3.33 1.00 1.18
F06 1.00 5.42E6 4.27E4 4.74E4 5.48E6 577.88 577.88 577.88 577.88 577.87 577.87
F07 4.86 1.53 1.00 95.43 105.96 1.22E4 1.34 1.36 1.32 1.33 1.35
F08 12.69 76.00 8.76E3 17.03 69.13 76.72 8.85E3 1.10 1.05 1.04 1.00
F09 63.45 230.47 1.82 1.67 12.78 48.60 53.49 6.09E3 1.11 1.30 1.00
F10 133.55 12.51 1.26 1.97E3 11.48 4.64 16.45 18.93 1.99E3 1.00 1.88
F11 63.02 1.00 6.39E3 79.44 59.24 3.96E3 1.42E3 5.81E3 6.45E3 7.45E5 79.81
F12 3.90 8.81E3 8.90E3 1.00 8.90E3 44.40 47.78 17.25 70.11 77.84 8.99E3
F13 1.00 21.66 2.07E3 6.69 5.80 4.30 271.67 282.46 105.39 429.26 476.62
F14 46.14 1.00 36.10 55.93 1.22 6.40E3 42.15 32.89 34.81 12.72 51.29
Total 4 4 3 3 1 1 1 2 2 3 3


In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules into HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HSBA to solve practical engineering optimization problems, where HSBA can become a fascinating method for real-world engineering optimization; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.
[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170-204, Pitman, London, UK, 1987.
[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163-168, 1963.
[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11-39, 1981.
[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.
[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.
[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431-451, 2010.
[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645-665, 2010.
[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123-144, 2012.
[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1-8, 2012.
[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550-564, 2012.
[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.
[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.
[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86-96, 2012.
[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.
[19] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4-31, 2011.
[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, Perth, Australia, December 1995.
[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327-340, 2013.
[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.
[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.
[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60-68, 2001.
[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.
[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.
[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464-483, 2012.
[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.
[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1-16, 2011.
[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134-137, 2012.
[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.
[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.
[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.
[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671-680, 1983.
[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.
[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683-691, 1998.
[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86-92, 1940.
[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859-1883, 2007.
[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.
[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101-117, 2010.
[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252-3259, 2011.
[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830-848, 2010.
[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482-500, 2012.
[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.
[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735-3742, 2011.
[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.


Table 1: Benchmark functions.

No.  Name  Definition  Source
F01  Ackley  $f(x)=20+e-20\cdot e^{-0.2\sqrt{(1/n)\sum_{i=1}^{n}x_i^2}}-e^{(1/n)\sum_{i=1}^{n}\cos(2\pi x_i)}$  [3]
F02  Fletcher-Powell  $f(x)=\sum_{i=1}^{n}(A_i-B_i)^2$, where $A_i=\sum_{j=1}^{n}(a_{ij}\sin\alpha_j+b_{ij}\cos\alpha_j)$ and $B_i=\sum_{j=1}^{n}(a_{ij}\sin x_j+b_{ij}\cos x_j)$  [4]
F03  Griewank  $f(x)=\sum_{i=1}^{n}x_i^2/4000-\prod_{i=1}^{n}\cos(x_i/\sqrt{i})+1$  [5]
F04  Penalty 1  $f(x)=(\pi/30)\{10\sin^2(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})]+(y_n-1)^2\}+\sum_{i=1}^{n}u(x_i,10,100,4)$, where $y_i=1+0.25(x_i+1)$  [6]
F05  Penalty 2  $f(x)=0.1\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_n-1)^2[1+\sin^2(2\pi x_n)]\}+\sum_{i=1}^{n}u(x_i,5,100,4)$  [6]
F06  Quartic with noise  $f(x)=\sum_{i=1}^{n}(i\cdot x_i^4+U(0,1))$  [7]
F07  Rastrigin  $f(x)=10\cdot n+\sum_{i=1}^{n}(x_i^2-10\cos(2\pi x_i))$  [8]
F08  Rosenbrock  $f(x)=\sum_{i=1}^{n-1}[100(x_{i+1}-x_i^2)^2+(x_i-1)^2]$  [7]
F09  Schwefel 2.26  $f(x)=418.9829\times D-\sum_{i=1}^{D}x_i\sin(|x_i|^{1/2})$  [9]
F10  Schwefel 1.2  $f(x)=\sum_{i=1}^{n}(\sum_{j=1}^{i}x_j)^2$  [10]
F11  Schwefel 2.22  $f(x)=\sum_{i=1}^{n}|x_i|+\prod_{i=1}^{n}|x_i|$  [10]
F12  Schwefel 2.21  $f(x)=\max_{1\le i\le n}|x_i|$  [6]
F13  Sphere  $f(x)=\sum_{i=1}^{n}x_i^2$  [7]
F14  Step  $f(x)=6\cdot n+\sum_{i=1}^{n}\lfloor x_i\rfloor$  [7]

*In benchmark function F02, the matrix elements $a_{n\times n}, b_{n\times n}\in(-100,100)$ and $\alpha_{n\times 1}\in(-\pi,\pi)$ are drawn from a uniform distribution [4].
*In benchmark functions F04 and F05, the function $u(x_i,a,k,m)$ is given by $u(x_i,a,k,m)=k(x_i-a)^m$ if $x_i>a$; $0$ if $-a\le x_i\le a$; $k(-x_i-a)^m$ if $x_i<-a$.
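For concreteness, the following short Python sketch implements two of the benchmark functions from Table 1 together with the penalty helper u from the footnote; the function and variable names are our own illustration, not code from the paper.

```python
import numpy as np

def u(x, a, k, m):
    # Penalty term from the Table 1 footnote: k*(x-a)^m above a,
    # zero on [-a, a], and k*(-x-a)^m below -a (applied elementwise).
    return np.where(x > a, k * (x - a) ** m,
                    np.where(x < -a, k * (-x - a) ** m, 0.0))

def ackley(x):
    # F01 Ackley: continuous, multimodal, global minimum f(0) = 0.
    n = x.size
    return (20.0 + np.e
            - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n))

def rastrigin(x):
    # F07 Rastrigin: continuous, multimodal, global minimum f(0) = 0.
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))
```

At the optimum, for example `ackley(np.zeros(20))`, both functions return 0 up to floating-point rounding.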

Table 2: Properties of the benchmark functions; lb denotes the lower bound, ub denotes the upper bound, and opt denotes the optimum point.

No.  Function  lb  ub  opt  Continuity  Modality
F01  Ackley  -32.768  32.768  0  Continuous  Multimodal
F02  Fletcher-Powell  -π  π  0  Continuous  Multimodal
F03  Griewank  -600  600  0  Continuous  Multimodal
F04  Penalty 1  -50  50  0  Continuous  Multimodal
F05  Penalty 2  -50  50  0  Continuous  Multimodal
F06  Quartic with noise  -1.28  1.28  1  Continuous  Multimodal
F07  Rastrigin  -5.12  5.12  0  Continuous  Multimodal
F08  Rosenbrock  -2.048  2.048  0  Continuous  Unimodal
F09  Schwefel 2.26  -512  512  0  Continuous  Multimodal
F10  Schwefel 1.2  -100  100  0  Continuous  Unimodal
F11  Schwefel 2.22  -10  10  0  Continuous  Unimodal
F12  Schwefel 2.21  -100  100  0  Continuous  Unimodal
F13  Sphere  -5.12  5.12  0  Continuous  Unimodal
F14  Step  -5.12  5.12  0  Discontinuous  Unimodal

Table 3: Mean normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.

      ACO     BA      BBO     DE      ES      GA      HS      HSBA    PSO     SGA
F01   2.31    3.33    1.15    2.02    3.38    2.72    3.47    1.09    2.66    1.00
F02   24.58   25.82   1.58    8.94    24.35   5.45    15.69   1.00    13.96   1.33
F03   3.16    60.72   1.93    5.44    23.85   3.22    77.22   1.00    25.77   1.42
F04   1.00    3.0E38  4.0E32  5.6E33  2.7E38  3.1E32  1.4E39  2.3E32  4.1E36  9.6E31
F05   1.00    1.1E8   299.42  1.5E6   4.6E8   5.4E3   4.1E8   215.51  5.5E7   111.10
F06   489.01  6.8E3   35.32   308.29  1.8E4   274.83  1.5E4   1.00    2.5E3   10.09
F07   8.09    11.55   1.28    6.56    11.87   6.17    10.22   1.00    8.44    1.80
F08   42.25   29.01   2.29    7.59    59.99   9.05    47.85   1.00    12.04   2.15
F09   3.17    20.26   1.99    13.58   13.33   1.81    19.92   1.00    17.61   1.15
F10   1.75    3.73    1.38    2.95    4.93    1.25    4.22    1.00    2.48    1.48
F11   1.05    19.70   1.83    7.14    23.12   11.13   19.45   1.00    13.22   2.46
F12   1.86    4.03    1.00    2.99    3.91    1.92    3.74    1.38    2.38    1.12
F13   98.30   150.84  3.80    19.03   226.52  47.74   182.32  1.00    72.91   4.02
F14   7.73    120.48  3.93    13.31   102.56  11.53   146.55  1.00    63.44   3.28
Time  2.74    1.00    1.32    1.64    1.67    1.79    2.33    1.43    2.03    1.76

*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.
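The row normalization used in Tables 3 and 4 can be reproduced with a one-line NumPy operation; the sketch below assumes a results matrix with one row per benchmark and one column per algorithm.

```python
import numpy as np

def normalize_rows(results):
    # Divide each row by its smallest entry, so the best-performing
    # algorithm on each benchmark scores exactly 1.00.
    return results / results.min(axis=1, keepdims=True)

# Example: two benchmarks, three algorithms.
raw = np.array([[4.0, 2.0, 8.0],
                [9.0, 3.0, 6.0]])
print(normalize_rows(raw))  # [[2.0, 1.0, 4.0], [3.0, 1.0, 2.0]]
```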

by ACO after 5 generations. Although slower, BBO, GA, and SGA eventually find global minima close to that of HSBA, while BA, DE, ES, HS, and PSO fail to reach the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty 1 function. From Figure 4 it is obvious that HSBA outperforms all other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty 2 function. For this multimodal function, similar to the F04 Penalty 1 function shown in Figure 4, HSBA is significantly superior to all the other algorithms during the process of optimization. Here PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 3 generations.

Figure 6 shows the results achieved by the ten methods on the F06 Quartic (with noise) function. From Table 3 and Figure 6 we can conclude that HSBA performs best on this multimodal function. Looking carefully at Figure 6, PSO and HSBA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HSBA gets closer and closer to the minimum, while PSO converges prematurely and is trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage


Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

      ACO    BA      BBO    DE      ES      GA      HS      HSBA  PSO     SGA
F01   1.85   2.31    1.00   1.43    2.25    2.03    2.30    1.05  1.91    1.04
F02   10.42  14.48   1.09   4.22    10.39   4.03    9.91    1.00  8.28    1.33
F03   2.73   46.22   1.87   4.47    21.38   8.02    43.24   1.00  16.75   1.76
F04   4.4E6  5.0E6   1.2E3  1.9E4   2.2E6   3.0E4   4.2E6   1.00  37.55   3.22
F05   3.0E4  3.2E4   51.01  386.33  1.6E4   884.46  2.6E4   1.00  41.13   4.57
F06   58.51  992.55  6.50   24.69   808.48  59.24   746.91  1.00  189.71  2.02
F07   5.71   8.53    1.25   5.13    7.94    5.23    7.47    1.00  6.00    1.68
F08   26.42  25.03   1.48   3.70    33.52   6.16    21.50   1.00  7.75    1.45
F09   2.43   8.66    1.28   4.90    5.93    2.09    7.28    1.00  7.40    1.42
F10   1.89   4.94    1.18   2.66    3.00    2.08    2.76    1.00  2.01    1.74
F11   7.74   12.61   1.21   3.36    12.14   6.01    9.60    1.00  7.81    1.62
F12   1.33   2.33    1.44   1.69    2.08    1.74    2.11    1.00  1.70    1.21
F13   28.04  56.57   2.17   5.54    60.57   19.71   52.86   1.00  20.98   2.28
F14   4.28   54.02   1.97   4.85    33.61   10.60   49.11   1.00  19.72   1.85

*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function (benchmark function value versus number of generations, for ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA).

to succeed on this benchmark function within the maximum number of generations. At last, BBO, DE, and SGA converge to values very close to that of HSBA.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can approximately divide the algorithms into three groups: one group, including BBO, HSBA, and SGA, performing best; a second group, including ACO, DE,

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function (benchmark function value versus number of generations, same ten methods).

GA, and PSO, performing second best; and a third group, including BA, ES, and HS, performing worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8 we can see that HSBA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HSBA is outperformed by BBO between generations 10 and 30, it has a stable convergence rate at finding the global minimum


Figure 3: Comparison of the performance of the different methods for the F03 Griewank function (benchmark function value versus number of generations, same ten methods).

Figure 4: Comparison of the performance of the different methods for the F04 Penalty 1 function (benchmark function value versus number of generations, same ten methods).

and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10 we can see that HSBA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately

Figure 5: Comparison of the performance of the different methods for the F05 Penalty 2 function (benchmark function value versus number of generations, same ten methods).

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function (benchmark function value versus number of generations, same ten methods).

divided into three groups: one group, including BBO and HSBA, performing best; a second group, including ACO, GA, PSO, and SGA, performing second best; and a third group, including DE, ES, and HS, performing worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO initially shows a faster convergence rate than HSBA; however, it is overtaken by HSBA after 40 generations. At last HSBA reaches the optimal solution, significantly superior to the other algorithms. BBO is inferior only

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function (benchmark function value versus number of generations, same ten methods).

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function (benchmark function value versus number of generations, same ten methods).

to HSBA and performs second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function (benchmark function value versus number of generations, same ten methods).

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function (benchmark function value versus number of generations, same ten methods).

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13 for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HSBA.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14 for BBO and SGA,

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function (benchmark function value versus number of generations, same ten methods).

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function (benchmark function value versus number of generations, same ten methods).

we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

From the above analyses of Figures 1–14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm HSBA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are inferior only to HSBA; ACO, BBO, and SGA perform better than HSBA on the benchmarks F04-F05, the benchmark F12, and the benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function (benchmark function value versus number of generations, same ten methods).

Figure 14: Comparison of the performance of the different methods for the F14 Step function (benchmark function value versus number of generations, same ten methods).

convergence rate initially, while later it converges more and more slowly to the true objective function value. At last, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem, and the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HSBA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.


Table 5: Best normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the best results found by the HSBA algorithm after 100 Monte Carlo simulations.

A      0       0.1     0.2     0.3     0.4    0.5    0.6    0.7     0.8   0.9    1.0
F01    1.66    1.33    1.55    1.40    1.00   1.33   1.21   1.22    1.16  1.13   1.14
F02    1.25E4  1.00    1.00    1.00    1.00   1.00   1.00   1.00    1.00  1.00   1.00
F03    3.41    11.15   1.00    1.00    1.00   1.00   1.00   1.00    1.00  1.00   1.00
F04    2.73E4  1.00    1.25E4  1.04    1.04   1.04   1.04   1.04    1.04  1.00   1.00
F05    5.03E5  5.30    1.33    1.68E4  1.00   1.00   1.00   1.00    1.00  1.00   1.00
F06    1.00    2.56    5.47    4.73    17.81  9.90   2.89   6.74    9.90  2.60   5.57
F07    10.14   14.92   4.35    1.10    1.10   15.37  1.01   1.00    1.00  1.00   1.00
F08    38.49   1.08    14.06   1.08    1.08   1.08   99.01  1.01    1.00  1.00   1.00
F09    285.18  404.58  1.38    4.74    1.19   1.19   1.19   367.91  1.01  1.00   1.00
F10    665.22  4.68    1.69E3  15.27   1.18   1.18   1.18   1.18    1.21  1.00   1.00
F11    20.03   1.00    9.96    11.54   39.22  9.96   9.96   9.96    9.96  38.97  8.49
F12    10.32   1.00    13.22   3.23    14.69  2.79   2.79   2.79    2.79  2.79   19.31
F13    1.46    4.15    2.84    4.47    1.15   1.56   1.00   1.00    1.00  1.00   1.00
F14    527.52  13.57   3.95    1.01    1.15   12.91  1.00   1.00    1.00  1.00   1.00
Total  1       4       2       2       3      3      5      6       7     10     10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, pulse rate r, harmony memory consideration rate HMCR, and pitch adjustment rate PAR on the performance of HSBA, we carried out this experiment on the benchmark problems with different parameter values. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5–12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HSBA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs best with the given parameter value. A sketch of this sweep is shown below.
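The structure of this experiment can be summarized by the following sketch, where `hsba_run` is a hypothetical stand-in for one run of the HSBA algorithm returning its best objective value; everything except the parameter grid and the run count is an assumption for illustration.

```python
import numpy as np

def sweep_loudness(objective, hsba_run, runs=100):
    # Vary A over 0, 0.1, ..., 1.0 while the other control parameters
    # stay at the fixed values used in Section 4.2
    # (r = 0.6, HMCR = 0.95, PAR = 0.1).
    best, mean = {}, {}
    for A in np.round(np.arange(0.0, 1.01, 0.1), 1):
        minima = [hsba_run(objective, loudness=A, pulse_rate=0.6,
                           hmcr=0.95, par=0.1)
                  for _ in range(runs)]          # 100 Monte Carlo runs
        best[A] = min(minima)                    # Tables 5, 7, 9, 11
        mean[A] = float(np.mean(minima))         # Tables 6, 8, 10, 12
    return best, mean
```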

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6 it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (<0.5); (iii) however, for functions F05, F07–F10, F13, and F14, HSBA performs better with bigger A (>0.5). In sum, HSBA performs best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8 we can evidently conclude that HSBA performs significantly better with bigger r (>0.5) than with smaller r (<0.5), and HSBA performs best when r is equal or very close to 0.6. So we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10 we can see that the function values evaluated by HSBA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12 we can see that the function values for HSBA vary little with increasing PAR, and HSBA reaches its optimum (minimum) on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.

4.3. Discussion. In the HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR),


Table 6: Mean normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A      0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    1.52    1.00    1.46    1.17    1.08    1.37    1.88    1.65    1.67    1.79    1.48
F03    6.69    10.41   1.06    1.01    1.03    1.03    1.01    1.02    1.05    1.00    1.00
F04    201.31  1.26    4.10    4.21    3.33    2.72    1.00    5.19    2.91    1.00    3.11
F05    591.22  16.82   4.25    5.87    1.01    4.14    24.04   1.00    3.68    7.33    9.00
F06    1.00    4.90E4  637.99  258.71  4.95E4  2.18    2.17    2.17    2.18    2.17    2.17
F07    8.57    2.21E6  6.43    272.34  110.48  2.11E4  1.02    1.01    1.00    1.00    1.00
F08    49.80   1.14E4  1.72E4  161.60  224.59  91.19   1.74E4  1.03    1.11    1.00    1.00
F09    82.37   1.73E6  1.06    3.14    74.45   103.10  42.26   7.94E3  1.00    1.37    1.29
F10    90.31   1.45E3  2.45E5  2.32E3  23.34   22.34   31.38   12.89   2.34E3  1.40    1.00
F11    4.98    1.00    3.15E4  2.33    14.41   520.23  443.65  616.26  249.91  4.78E4  2.14
F12    3.69    2.10E4  4.57E4  1.00    2.12E4  266.35  233.13  198.80  276.14  112.01  2.14E4
F13    1.90    8.25    4.48E6  2.14E6  1.00    6.15    254.51  222.75  189.96  263.86  107.01
F14    66.91   1.95E5  1.17E4  1.24E3  21.04   1.86E3  29.40   23.92   1.00    18.35   24.75
Total  1       2       1       2       2       1       2       2       4       5       5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r      0       0.1     0.2     0.3     0.4     0.5    0.6    0.7     0.8     0.9    1.0
F01    1.84    1.84    1.84    4.84    1.59    1.71   1.00   1.34    1.09    1.16   1.01
F02    9.44E3  1.00    1.00    1.00    1.00    1.00   1.00   1.00    1.00    1.00   1.00
F03    3.73    5.46    1.86    1.86    1.86    1.86   1.00   1.72    1.21    1.61   1.01
F04    11.42   1.11    24.29   1.23    1.26    1.00   1.29   1.29    1.29    1.29   1.29
F05    1.77E5  2.30    1.15    1.09E4  1.00    1.00   1.00   1.00    1.00    1.00   1.00
F06    62.44   29.96   48.58   15.60   26.34   7.80   1.82   1.00    35.20   2.39   1.55
F07    8.31    4.48    2.58    1.29    1.29    3.49   1.00   1.00    1.00    1.00   1.00
F08    9.61    1.18    4.20    1.37    1.37    1.37   6.95   1.01    1.01    1.00   1.00
F09    373.94  444.22  1.68    3.35    1.68    1.68   1.00   291.98  1.01    1.01   1.01
F10    386.93  3.16    13.97   4.63    1.58    1.58   1.00   1.58    309.31  1.58   1.01
F11    42.55   1.00    18.29   21.35   20.18   11.11  19.04  21.35   15.81   18.76  11.55
F12    114.40  1.00    141.84  44.48   125.47  44.48  44.48  44.48   44.48   44.48  69.43
F13    6.39    6.37    6.01    2.96    3.02    1.00   1.52   1.39    1.00    2.57   2.49
F14    120.52  4.04    2.33    1.00    1.16    3.40   1.00   1.00    1.16    1.16   1.16
Total  0       3       1       2       2       4      8      5       4       4      4

and the pitch adjustment rate (PAR). Appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively: the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy. A sketch of these standard update rules is given below.
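As a sketch, the standard update rules of the basic BA [28] capture this behavior; the constants alpha and gamma below are the usual BA decay parameters (commonly around 0.9), chosen here for illustration.

```python
import math

def update_loudness_and_pulse_rate(A, r0, t, alpha=0.9, gamma=0.9):
    # Once a bat accepts an improved solution, its loudness decays
    # (less exploration) while its pulse emission rate grows towards
    # r0 (more exploitation around the prey/solution).
    A_next = alpha * A
    r_next = r0 * (1.0 - math.exp(-gamma * t))
    return A_next, r_next
```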

For all of the standard benchmark functions that have been considered, the HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HSBA performing significantly better on some functions. The HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search, via the harmony search algorithm, and the global search, via the bat algorithm, concurrently. A similar behavior may be obtained in PSO by using multiple swarms formed from an initial particle population [44]. However, HSBA's advantages include its simplicity, its ease of implementation, and its having only four parameters to regulate. The work carried out here demonstrates the HSBA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way of verifying the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning


Table 8: Mean normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r      0      0.1     0.2    0.3     0.4     0.5     0.6     0.7    0.8     0.9   1.0
F01    1.84   1.99    1.86   1.94    1.59    1.00    1.42    1.34   1.09    1.16  1.71
F02    3.31   5.38    4.57   3.08    3.82    1.14    1.00    2.89   2.79    4.02  3.11
F03    3.73   5.46    5.33   2.29    3.36    2.41    1.00    1.72   1.21    1.61  1.36
F04    11.42  1.11    24.29  1.49E4  1.13E3  3.94E3  2.78E4  19.66  175.22  1.14  1.00
F05    16.09  128.62  32.69  24.46   64.40   12.77   29.59   76.17  4.56    1.00  4.46
F06    62.44  26.96   48.58  15.60   26.34   7.80    1.82    1.00   35.20   2.39  1.55
F07    3.30   1.78    1.34   1.03    1.23    1.39    1.55    1.00   1.19    1.36  3.49
F08    4.88   3.93    3.94   1.94    2.19    3.83    3.53    4.33   1.00    2.27  3.14
F09    1.39   1.66    1.14   1.00    1.21    1.15    1.15    1.09   1.08    1.04  1.09
F10    7.21   9.94    8.75   3.60    8.46    6.11    4.83    4.14   5.76    1.00  2.71
F11    3.82   4.23    3.73   2.05    1.81    1.00    1.71    2.20   1.42    1.68  1.89
F12    1.64   2.17    2.04   1.47    1.80    1.86    1.00    1.21   1.13    1.34  1.56
F13    6.39   6.37    6.01   2.96    3.02    1.90    1.52    1.39   1.00    2.57  2.49
F14    5.74   15.43   7.42   6.82    6.44    3.17    1.00    4.85   2.21    1.74  2.09
Total  0      0       0      1       0       2       4       2      2       2     1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9    1.0
F01    1.64    1.64    1.64    1.64    1.64    1.64    1.64    1.51    1.28    1.00   1.63
F02    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00   1.00
F03    19.82   26.49   3.71    3.71    3.71    3.71    3.71    3.71    3.71    2.06   1.00
F04    1.26E5  1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00   1.00
F05    6.07E5  5.34    1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00   1.00
F06    108.92  246.31  365.04  345.40  338.57  234.65  143.49  45.28   23.30   1.00   25.75
F07    9.22    14.28   5.34    1.00    1.00    9.30    1.00    1.00    1.00    1.00   1.00
F08    18.10   1.08    7.72    1.08    1.08    1.08    35.94   1.00    1.00    1.00   1.00
F09    280.04  391.61  1.27    6.81    1.27    1.27    1.27    227.95  1.00    1.00   1.00
F10    551.65  8.76    1.76E3  11.70   1.64    1.64    1.64    1.64    274.48  1.00   1.00
F11    14.67   1.00    6.18    6.18    13.99   6.18    6.18    6.18    4.40    2.69   6.16
F12    7.68    1.00    11.10   2.73    10.21   2.73    2.73    2.73    2.73    2.73   4.73
F13    7.72    26.75   15.90   17.76   6.12    10.40   6.12    5.95    2.55    1.00   2.25
F14    537.16  14.28   5.34    1.00    1.00    7.13    1.00    1.00    1.00    1.00   1.00
Total  0       4       2       4       5       3       5       6       7       11     9

parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or problem setup change. In our work we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.00    1.01
F02    747.56  7.87    5.25    3.43    5.09    7.10    3.07    1.87    1.60    1.00    1.01
F03    9.13    3.12E4  1.09    1.07    1.05    1.06    1.05    1.02    1.01    1.01    1.00
F04    2.10E6  1.00E4  3.95E4  9.48E3  5.97E3  2.95E3  1.51E3  1.50E3  46.94   1.00    2.37
F05    3.24E6  3.27E4  2.09E4  4.14E4  7.13E3  1.54E3  8.93E3  1.53E3  629.48  1.00    203.73
F06    1.00    5.94E4  422.04  632.15  6.00E4  1.92    1.91    1.91    1.91    1.91    1.91
F07    10.63   2.07E6  9.00    216.72  324.55  3.07E4  1.04    1.03    1.02    1.01    1.00
F08    59.50   9.88E3  3.04E4  141.80  217.01  324.68  3.07E4  1.07    1.07    1.03    1.00
F09    226.87  4.95E6  3.08    8.91    114.22  173.60  259.41  2.45E4  1.45    1.00    1.71
F10    257.37  2.83E4  9.35E5  1.37E4  97.09   66.56   99.25   149.40  1.39E4  1.00    2.73
F11    6.07    1.00    1.83E4  2.02    16.61   390.40  262.99  403.00  603.61  5.72E4  1.84
F12    3.00    2.79E4  3.60E4  1.00    2.82E4  270.95  194.14  130.77  200.40  300.16  2.84E4
F13    2.32    9.66    5.61E6  1.89E6  1.00    8.15    267.78  191.86  129.23  198.05  296.65
F14    184.66  3.84E5  1.17E4  1.85E3  1.00    5.71E3  25.11   55.39   39.60   26.57   41.49
Total  1       1       0       1       2       0       0       0       0       6       3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR    0       0.1    0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    3.91E3  1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F03    1.00    4.25   1.51    2.20    2.20    2.12    2.20    1.99    2.20    1.28    1.80
F04    1.00    2.55   65.54   2.55    2.55    2.55    2.55    2.55    2.55    2.55    2.55
F05    2.52    1.00   1.71    1.69E3  1.71    1.71    1.71    1.71    1.71    1.71    1.71
F06    1.00    59.87  34.18   22.85   46.52   23.44   43.54   44.84   36.88   18.99   8.12
F07    8.07    1.00   1.48    2.55    2.55    13.06   2.55    2.55    2.55    2.55    2.55
F08    5.43    1.00   1.92    1.00    1.00    1.00    11.45   1.00    1.00    1.00    1.00
F09    76.72   2.52   1.17    1.00    1.71    1.71    1.71    155.56  1.71    1.71    1.71
F10    929.10  1.48   1.00    4.91    2.55    2.55    2.55    2.22    2.35E3  2.55    2.55
F11    3.18E3  1.00   5.82E3  3.99E3  3.39E3  5.82E3  4.91E3  5.82E3  5.82E3  9.58E3  5.82E3
F12    305.06  1.00   537.28  97.34   187.58  97.34   97.34   97.34   97.34   97.34   398.27
F13    1.92    4.22   5.87    10.07   12.98   4.66    1.48    4.01    3.17    1.00    4.09
F14    88.12   1.00   1.48    2.55    2.55    4.91    2.55    2.55    2.55    2.55    2.55
Total  4       8      3       4       3       3       2       3       3       4       3

5 Conclusion and Future Work

This paper proposed a hybrid metaheuristic, the HSBA method, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm, and we evaluated the HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, in which an improvement is applied to mutate between bats, using the harmony search algorithm, during the process of updating the bats. Using the original configuration of the bat algorithm, we generate a new harmony based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often overtakes the original HS and BA; a minimal sketch of this step follows. The HSBA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA makes use of the information in past solutions more effectively to frequently generate better quality solutions, when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.
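A minimal sketch of this greedy selection step, assuming a helper `pitch_adjust` that builds a new harmony from the freshly updated bat and the harmony memory (both names are hypothetical, for illustration only):

```python
def substitute_if_better(bat, harmony_memory, fitness, pitch_adjust):
    # Build a new harmony from the newly generated bat (HS pitch
    # adjustment acting as a mutation operator), then keep whichever
    # of the two has the better (lower) fitness for minimization.
    harmony = pitch_adjust(bat, harmony_memory)
    return harmony if fitness(harmony) < fitness(bat) else bat
```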


Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR    0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    4.07    1.00    1.00    4.52    2.77    2.57    3.41    5.36    3.69    4.89    1.03
F03    1.00    1.27E4  1.34    1.35    1.35    1.35    1.34    1.35    1.34    1.34    1.34
F04    1.15    81.27   9.39E3  1.00    1.01    1.01    1.29    1.00    1.00    1.01    1.01
F05    345.73  50.61   86.19   9.07E3  25.01   1.08    1.01    1.91    3.33    1.00    1.18
F06    1.00    5.42E6  4.27E4  4.74E4  5.48E6  577.88  577.88  577.88  577.88  577.87  577.87
F07    4.86    1.53    1.00    95.43   105.96  1.22E4  1.34    1.36    1.32    1.33    1.35
F08    12.69   76.00   8.76E3  17.03   69.13   76.72   8.85E3  1.10    1.05    1.04    1.00
F09    63.45   230.47  1.82    1.67    12.78   48.60   53.49   6.09E3  1.11    1.30    1.00
F10    133.55  12.51   1.26    1.97E3  11.48   4.64    16.45   18.93   1.99E3  1.00    1.88
F11    63.02   1.00    6.39E3  79.44   59.24   3.96E3  1.42E3  5.81E3  6.45E3  7.45E5  79.81
F12    3.90    8.81E3  8.90E3  1.00    8.90E3  44.40   47.78   17.25   70.11   77.84   8.99E3
F13    1.00    21.66   2.07E3  6.69    5.80    4.30    271.67  282.46  105.39  429.26  476.62
F14    46.14   1.00    36.10   55.93   1.22    6.40E3  42.15   32.89   34.81   12.72   51.29
Total  4       4       3       3       1       1       1       2       2       3       3

Based on the results of the ten approaches on the test problems, we can conclude that the HSBA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only considered unconstrained function optimization in this work. Our future work will consist of adding diversity rules into HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HSBA to solve practical engineering optimization problems, for which HSBA can become a fascinating method; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.


Journal of Applied Mathematics 9

Table1Con

tinued

No

Nam

eDefinitio

nSource

F13

Sphere

119891(119909)=119899 sum 119894=1

1199092 119894

[7]

F14

Step

119891(119909)=6sdot119899

+119899 sum 119894=1

lfloor 119909119894rfloor

[7]

lowast

Inbenchm

arkfunctio

nF0

2them

atrix

elem

entsa 119899times119899b 119899times119899isin(minus100100)120572119899times1isin(minus120587120587)ared

rawnfro

mun

iform

distr

ibution[4]

lowastIn

benchm

arkfunctio

nsF0

4andF0

5thed

efinitio

nof

thefun

ction119906(119909119894119886119896119898)isgivenby119906(119909119894119886119896119898)=119896(119909119894minus119886)119898119909119894gt1198860minus119886le119909119894le119886119896(minus119909119894minus119886)119898119909119894ltminus119886

10 Journal of Applied Mathematics

Table 2 Properties of benchmark functions lb denotes lower bound ub denotes upper bound and opt denotes optimum point

No Function lb ub opt Continuity ModalityF01 Ackley minus32768 32768 0 Continuous MultimodalF02 Fletcher-Powell minus120587 120587 0 Continuous MultimodalF03 Griewank minus600 600 0 Continuous MultimodalF04 Penalty 1 minus50 50 0 Continuous MultimodalF05 Penalty 2 minus50 50 0 Continuous MultimodalF06 Quartic with noise minus128 128 1 Continuous MultimodalF07 Rastrigin minus512 512 0 Continuous MultimodalF08 Rosenbrock minus2048 2048 0 Continuous UnimodalF09 Schwefel 226 minus512 512 0 Continuous MultimodalF10 Schwefel 12 minus100 100 0 Continuous UnimodalF11 Schwefel 222 minus10 10 0 Continuous UnimodalF12 Schwefel 221 minus100 100 0 Continuous UnimodalF13 Sphere minus512 512 0 Continuous UnimodalF14 Step minus512 512 0 Discontinuous Unimodal

Table 3 Mean normalized optimization results in fourteen benchmark functions The values shown are the minimum objective functionvalues found by each algorithm averaged over 100 Monte Carlo simulations

ACO BA BBO DE ES GA HS HSBA PSO SGAF01 231 333 115 202 338 272 347 109 266 100F02 2458 2582 158 894 2435 545 1569 100 1396 133F03 316 6072 193 544 2385 322 7722 100 2577 142F04 100 30E38 40E32 56E33 27E38 31E32 14E39 23E32 41E36 96E31F05 100 11E8 29942 15E6 46E8 54E3 41E8 21551 55E7 11110F06 48901 68E3 3532 30829 18E4 27483 15E4 100 25E3 1009F07 809 1155 128 656 1187 617 1022 100 844 180F08 4225 2901 229 759 5999 905 4785 100 1204 215F09 317 2026 199 1358 1333 181 1992 100 1761 115F10 175 373 138 295 493 125 422 100 248 148F11 105 1970 183 714 2312 1113 1945 100 1322 246F12 186 403 100 299 391 192 374 138 238 112F13 9830 15084 380 1903 22652 4774 18232 100 7291 402F14 773 12048 393 1331 10256 1153 14655 100 6344 328TimE 274 100 132 164 167 179 233 143 203 176lowast

The values are normalized so that the minimum in each row is 100 These are not the absolute minima found by each algorithm but the average minimafound by each algorithm

by ACO after 5 generations For other algorithms althoughslower BBO GA and SGA eventually find the global mini-mum close to HSBA while BA DE ES HS and PSO fail tosearch the global minimum within the limited iterations

Figure 4 shows the results for F04 Penalty 1 functionFrom Figure 4 it is obvious that HSBA outperforms all othermethods during the whole progress of optimization in thismultimodal benchmark function Although slower SGA andBBO perform the second and third best at finding the globalminimum

Figure 5 shows the performance achieved for F05 Penalty2 function For this multimodal function similar to F04Penalty 1 function shown in Figure 4 HSBA is significantly

superior to all the other algorithms during the process ofoptimization Here PSO shows a faster convergence rateinitially than HSBA however it is outperformed by HSBAafter 3 generations

Figure 6 shows the results achieved for the ten methodswhen using the F06 Quartic (with noise) function FromTable 3 and Figure 6 we can conclude that HSBA performsthe best in this multimodal function Looking carefullyat Figure 6 PSO and HSBA show the same initial fastconvergence towards the known minimum as the procedureproceeds HSBA gets closer and closer to the minimumwhile PSO comes into being premature and traps into thelocal minimum BA ES GA HS and PSO do not manage

Journal of Applied Mathematics 11

Table 4 Best normalized optimization results in fourteen benchmark functions The values shown are the minimum objective functionvalues found by each algorithm

ACO BA BBO DE ES GA HS HSBA PSO SGAF01 185 231 100 143 225 203 230 105 191 104F02 1042 1448 109 422 1039 403 991 100 828 133F03 273 4622 187 447 2138 802 4324 100 1675 176F04 44E6 50E6 12E3 19E4 22E6 30E4 42E6 100 3755 322F05 30E4 32E4 5101 38633 16E4 88446 26E4 100 4113 457F06 5851 99255 650 2469 80848 5924 74691 100 18971 202F07 571 853 125 513 794 523 747 100 600 168F08 2642 2503 148 370 3352 616 2150 100 775 145F09 243 866 128 490 593 209 728 100 740 142F10 189 494 118 266 300 208 276 100 201 174F11 774 1261 121 336 1214 601 960 100 781 162F12 133 233 144 169 208 174 211 100 170 121F13 2804 5657 217 554 6057 1971 5286 100 2098 228F14 428 5402 197 485 3361 1060 4911 100 1972 185lowast

The values are normalized so that the minimum in each row is 100 These are the absolute best minima found by each algorithm

0 10 20 30 40 508

10

12

14

16

18

20

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 1 Comparison of the performance of the different methodsfor the F01 Ackley function

to succeed in this benchmark function within maximumnumber of generations At last BBO DE and SGA convergeto the value very close to HSBA

Figure 7 shows the optimization results for the F07Rastrigin function Very clearly HSBA has the fastest con-vergence rate at finding the globalminimumand significantlyoutperforms all other approaches Looking carefully at Fig-ure 7 approximately we can divide all the algorithms intothree groups one group including BBO HSBA and SGAperforming the best the other group including ACO DE

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

106

105

Figure 2 Comparison of the performance of the different methodsfor the F02 Fletcher-Powell function

GA and PSO performing the second best another groupincluding BA ES and HS performing the worst

Figure 8 shows the results for F08 Rosenbrock functionFrom Table 3 and Figure 8 we can see that HSBA is superiorto the other algorithms during the optimization process inthis relatively simple unimodal benchmark function Alsowe see that similar to the F06 Quartic (with noise) functionshown in Figure 6 PSO shows a faster convergence rateinitially than HSBA however it is outperformed by HSBAafter 4 generations and is premature to the local minimum

Figure 9 shows the equivalent results for the F09 Schwefel226 function From Figure 9 very clearly though HSBA isoutperformed by BBO between the generations 10ndash30 it hasthe stable convergence rate at finding the global minimum

12 Journal of Applied Mathematics

0 10 20 30 40 50

50

100

150

200

250

300

350

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 3 Comparison of the performance of the different methodsfor the F03 Griewank function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

1010

109

108

107

106

105

104

103

102

101

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 4 Comparison of the performance of the different methodsfor the F04 Penalty 1 function

and significantly outperforms all other approaches in thismultimodal benchmark function

Figure 10 shows the results for F10 Schwefel 12 functionFrom Figure 10 we can see that HSBA performs far betterthan other algorithms during the optimization process in thisrelative simple unimodal benchmark function PSO showsa faster convergence rate initially than HSBA howeverit is outperformed by HSBA after 3 generations Lookingcarefully at Figure 10 akin to F07 Rastrigin function shown inFigure 7 all the algorithms except BA can be approximately

0

Ben

chm

ark

func

tion

val

ue

5 10 15 20 25 30 35 40 45 50Number of generations

109

108

107

106

105

104

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 5 Comparison of the performance of the different methodsfor the F05 Penalty 2 function

0 10 20 30 40 500

5

10

15

20

25

30

35

40

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 6 Comparison of the performance of the different methodsfor the F06 Quartic (with noise) function

divided into three groups one group including BBO andHSBA performing the best the other group including ACOGA PSO and SGA performing the second best anothergroup including DE ES and HS performing the worst

Figure 11 shows the results for F11 Schwefel 222 functionVery clearly BBO shows a faster convergence rate initiallythan HSBA however it is outperformed by HSBA after 40generations At last HSBA reaches the optimal solution sig-nificantly superior to other algorithms BBO is only inferior

Journal of Applied Mathematics 13B

ench

mar

k fu

ncti

on v

alue

0 10 20 30 40 50

50

100

150

200

250

Number of generations

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 7 Comparison of the performance of the different methodsfor the F07 Rastrigin function

0 10 20 30 40 50

500

1000

1500

2000

2500

3000

3500

4000

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 8 Comparison of the performance of the different methodsfor the F08 Rosenbrock function

to HSBA and performs the second best in this unimodalfunction

Figure 12 shows the results for F12 Schwefel 221 functionVery clearly HSBAhas the fastest convergence rate at findingthe global minimum and significantly outperforms all otherapproaches

0 10 20 30 40 50

1000

2000

3000

4000

5000

6000

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 9 Comparison of the performance of the different methodsfor the F09 Schwefel 226 function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

104

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 10 Comparison of the performance of the differentmethodsfor the F10 Schwefel 12 function

Figure 13 shows the results for F13 Sphere function FromTable 3 and Figure 13 HSBA has the fastest convergencerate at finding the global minimum and outperforms all otherapproaches Looking carefully at Figure 13 for BBO and SGAwe can see that BBO has a faster convergence rate than SGAbut SGA does finally converge to the value of BBO that isapproaching to HSBA

Figure 14 shows the results for F14 Step function Appar-ently HSBA shows the fastest convergence rate at findingthe global minimum and significantly outperforms all otherapproaches Looking carefully at Figure 14 for BBO and SGA

14 Journal of Applied MathematicsB

ench

mar

k fu

ncti

on v

alue

0 5 10 15 20 25 30 35 40 45 50Number of generations

102

101

100

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 11 Comparison of the performance of the different methodsfor the F11 Schwefel 222 function

0 10 20 30 40 5030

35

40

45

50

55

60

65

70

75

80

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 12 Comparison of the performance of the different methodsfor the F12 Schwefel 221 function

we can see that BBO has a faster convergence rate than SGAbut SGA does finally converge to the value of BBO

From the above-analyses about Figures 1ndash14 we can cometo the conclusion that our proposed hybrid metaheuristicalgorithm HSBA significantly outperforms the other ninealgorithms In general ACO BBO and SGA are only inferiorto the HSBA and ACO BBO and SGA perform betterthanHSBA in the benchmarks F04-F05 the benchmark F12and the benchmark F01 respectively Further benchmarksF04 F05 F06 F08 and F10 illustrate that PSO has a faster

0 10 20 30 40 500

10

20

30

40

50

60

70

80

90

100

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 13 Comparison of the performance of the different methodsfor the F13 Sphere function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

104

103

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 14 Comparison of the performance of the differentmethodsfor the F14 Step function

convergence rate initially while later it converges slowerand slower to the true objective function value At lastwe must point out that in [24] Simon compared BBOwith seven state-of-the-art EAs over 14 benchmark func-tions and a real-world sensor selection problem The resultsdemonstrated the good performance of BBO It is also indi-rectly demonstrated that our proposed hybrid metaheuristicmethod HSBA is a more powerful and efficient optimiza-tion algorithm than other population-based optimizationalgorithms

Journal of Applied Mathematics 15

Table 5 Best normalized optimization results in fourteen benchmark functions with different 119860 The numbers shown are the best resultsfound after 100 Monte Carlo simulations HSBA algorithm

119860

0 01 02 03 04 05 06 07 08 09 10F01 166 133 155 140 100 133 121 122 116 113 114F02 125E4 100 100 100 100 100 100 100 100 100 100F03 341 1115 100 100 100 100 100 100 100 100 100F04 273E4 100 125E4 104 104 104 104 104 104 100 100F05 503E5 530 133 168E4 100 100 100 100 100 100 100F06 100 256 547 473 1781 990 289 674 990 260 557F07 1014 1492 435 110 110 1537 101 100 100 100 100F08 3849 108 1406 108 108 108 9901 101 100 100 100F09 28518 40458 138 474 119 119 119 36791 101 100 100F10 66522 468 169E3 1527 118 118 118 118 121 100 100F11 2003 100 996 1154 3922 996 996 996 996 3897 849F12 1032 100 1322 323 1469 279 279 279 279 279 1931F13 146 415 284 447 115 156 100 100 100 100 100F14 52752 1357 395 101 115 1291 100 100 100 100 100

1 4 2 2 3 3 5 6 7 10 10

42 Influence of Control Parameter In [28] Dr Yang con-cluded that if the parameters in BA can be adjusted properlyit can outperform GA HS (harmony search) [26] and PSOThe choice of the control parameters is of vital importancefor different problems To investigate the influence of theloudness119860 pulse rate 119903 harmonymemory consideration rateHMCR and pitch adjustment rate PAR on the performanceof HSBA we carry out this experiment in the benchmarkproblem with different parameters All other parametersettings are kept unchanged (unless noted otherwise in thefollowing paragraph) The results are recorded in Tables 5ndash12 after 100 Monte Carlo runs Among them Tables 5 7 9and 11 show the best minima found by HSBA algorithm over100Monte Carlo runs Tables 6 8 10 and 12 show the averageminima found byHSBA algorithm averaged over 100MonteCarlo runs In other words Tables 5 7 9 11 and Tables 68 10 12 show the best and average performance of HSBAalgorithm respectively In each table the last row is the totalnumber of functions on which HSBA performs the best withsome parameters

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with the loudness A = 0, 0.1, 0.2, …, 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for the benchmark functions F01, F06, F11, and F12, HS/BA performs better with smaller A (<0.5); (iii) however, for the functions F05, F07–F10, F13, and F14, HS/BA performs better with bigger A (>0.5). In sum, HS/BA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to both 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, …, 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HS/BA performs significantly better with bigger r (>0.5) than with smaller r (<0.5) and that HS/BA performs the best when r is equal or very close to 0.6, so we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, …, 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can recognize that the function values evaluated by HS/BA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5) and that most benchmarks reach their minimum when HMCR is equal or very close to 1, so we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, …, 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can recognize that the function values for HS/BA vary little with increasing PAR and that HS/BA reaches its optimum/minimum on most benchmarks when PAR is equal or very close to 0.1, so we set PAR = 0.1 in the other experiments.
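For reference, HMCR and PAR act through the harmony-improvisation step of HS [26]; a minimal sketch for a single decision variable follows (the pitch bandwidth bw and all names are illustrative assumptions, and the exact pitch-adjustment variant used inside HS/BA may differ):

    import random

    def improvise(memory, lb, ub, hmcr=0.95, par=0.1, bw=0.01):
        # With probability HMCR, reuse a value stored in the harmony
        # memory and, with probability PAR, pitch-adjust it within the
        # bandwidth bw; otherwise draw a fresh value uniformly from [lb, ub].
        if random.random() < hmcr:
            value = random.choice(memory)                # memory consideration
            if random.random() < par:
                value += bw * random.uniform(-1.0, 1.0)  # pitch adjustment
        else:
            value = random.uniform(lb, ub)               # random selection
        return min(max(value, lb), ub)                   # keep within bounds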

4.3. Discussion. In the HS/BA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR).


Table 6: Mean normalized optimization results on fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

A      0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    1.52    1.00    1.46    1.17    1.08    1.37    1.88    1.65    1.67    1.79    1.48
F03    6.69    10.41   1.06    1.01    1.03    1.03    1.01    1.02    1.05    1.00    1.00
F04    201.31  1.26    4.10    4.21    3.33    2.72    1.00    5.19    2.91    1.00    3.11
F05    591.22  16.82   4.25    5.87    1.01    4.14    24.04   1.00    3.68    7.33    9.00
F06    1.00    4.90E4  637.99  258.71  4.95E4  2.18    2.17    2.17    2.18    2.17    2.17
F07    8.57    2.21E6  6.43    272.34  110.48  2.11E4  1.02    1.01    1.00    1.00    1.00
F08    49.80   1.14E4  1.72E4  161.60  224.59  91.19   1.74E4  1.03    1.11    1.00    1.00
F09    82.37   1.73E6  1.06    3.14    74.45   103.10  42.26   7.94E3  1.00    1.37    1.29
F10    90.31   1.45E3  2.45E5  2.32E3  23.34   22.34   31.38   12.89   2.34E3  1.40    1.00
F11    4.98    1.00    3.15E4  2.33    14.41   520.23  443.65  616.26  249.91  4.78E4  2.14
F12    3.69    2.10E4  4.57E4  1.00    2.12E4  266.35  233.13  198.80  276.14  112.01  2.14E4
F13    1.90    8.25    4.48E6  2.14E6  1.00    6.15    254.51  222.75  189.96  263.86  107.01
F14    66.91   1.95E5  1.17E4  1.24E3  21.04   1.86E3  29.40   23.92   1.00    18.35   24.75
Best   1       2       1       2       2       1       2       2       4       5       5

Table 7: Best normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

r      0       0.1     0.2     0.3     0.4     0.5    0.6    0.7     0.8     0.9    1.0
F01    1.84    1.84    1.84    4.84    1.59    1.71   1.00   1.34    1.09    1.16   1.01
F02    9.44E3  1.00    1.00    1.00    1.00    1.00   1.00   1.00    1.00    1.00   1.00
F03    3.73    5.46    1.86    1.86    1.86    1.86   1.00   1.72    1.21    1.61   1.01
F04    11.42   1.11    24.29   1.23    1.26    1.00   1.29   1.29    1.29    1.29   1.29
F05    1.77E5  2.30    1.15    1.09E4  1.00    1.00   1.00   1.00    1.00    1.00   1.00
F06    62.44   29.96   48.58   15.60   26.34   7.80   1.82   1.00    35.20   2.39   1.55
F07    8.31    4.48    2.58    1.29    1.29    3.49   1.00   1.00    1.00    1.00   1.00
F08    9.61    1.18    4.20    1.37    1.37    1.37   6.95   1.01    1.01    1.00   1.00
F09    373.94  444.22  1.68    3.35    1.68    1.68   1.00   291.98  1.01    1.01   1.01
F10    386.93  3.16    13.97   4.63    1.58    1.58   1.00   1.58    309.31  1.58   1.01
F11    42.55   1.00    18.29   21.35   20.18   11.11  19.04  21.35   15.81   18.76  11.55
F12    114.40  1.00    141.84  44.48   125.47  44.48  44.48  44.48   44.48   44.48  69.43
F13    6.39    6.37    6.01    2.96    3.02    1.00   1.52   1.39    1.00    2.57   2.49
F14    120.52  4.04    2.33    1.00    1.16    3.40   1.00   1.00    1.16    1.16   1.16
Best   0       3       1       2       2       4      8      5       4       4      4

The appropriate updates of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.
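Although the schedules are not restated here, in the standard BA [28] these updates are commonly written as A_i(t+1) = alpha * A_i(t) and r_i(t+1) = r_i(0) * (1 - exp(-gamma * t)). A minimal sketch under that assumption (alpha and gamma are illustrative constants, not values fixed by this paper):

    import math

    def update_loudness_and_pulse_rate(A, r0, t, alpha=0.9, gamma=0.9):
        # Loudness decays geometrically once a bat closes in on its prey
        # (favoring exploitation), while the pulse emission rate rises
        # from 0 toward r0 with the iteration count t (more frequent,
        # more accurate pulses).
        A_next = alpha * A
        r_next = r0 * (1.0 - math.exp(-gamma * t))
        return A_next, r_next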

For all of the standard benchmark functions that have been considered, the HS/BA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HS/BA performing significantly better on some functions. The HS/BA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search, via the harmony search algorithm, and the global search, via the bat algorithm, concurrently. A similar behavior may be obtained in PSO by using multiple swarms drawn from an initial particle population [44]. However, HS/BA's advantages include being simple and easy to implement and having only four parameters to regulate. The work carried out here demonstrates the HS/BA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section.


Table 8: Mean normalized optimization results on fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

r      0      0.1     0.2    0.3     0.4     0.5     0.6     0.7    0.8     0.9   1.0
F01    1.84   1.99    1.86   1.94    1.59    1.00    1.42    1.34   1.09    1.16  1.71
F02    3.31   5.38    4.57   3.08    3.82    1.14    1.00    2.89   2.79    4.02  3.11
F03    3.73   5.46    5.33   2.29    3.36    2.41    1.00    1.72   1.21    1.61  1.36
F04    11.42  1.11    24.29  1.49E4  1.13E3  3.94E3  2.78E4  19.66  175.22  1.14  1.00
F05    16.09  128.62  32.69  24.46   64.40   12.77   29.59   76.17  4.56    1.00  4.46
F06    62.44  26.96   48.58  15.60   26.34   7.80    1.82    1.00   35.20   2.39  1.55
F07    3.30   1.78    1.34   1.03    1.23    1.39    1.55    1.00   1.19    1.36  3.49
F08    4.88   3.93    3.94   1.94    2.19    3.83    3.53    4.33   1.00    2.27  3.14
F09    1.39   1.66    1.14   1.00    1.21    1.15    1.15    1.09   1.08    1.04  1.09
F10    7.21   9.94    8.75   3.60    8.46    6.11    4.83    4.14   5.76    1.00  2.71
F11    3.82   4.23    3.73   2.05    1.81    1.00    1.71    2.20   1.42    1.68  1.89
F12    1.64   2.17    2.04   1.47    1.80    1.86    1.00    1.21   1.13    1.34  1.56
F13    6.39   6.37    6.01   2.96    3.02    1.90    1.52    1.39   1.00    2.57  2.49
F14    5.74   15.43   7.42   6.82    6.44    3.17    1.00    4.85   2.21    1.74  2.09
Best   0      0       0      1       0       2       4       2      2       2     1

Table 9: Best normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9   1.0
F01    1.64    1.64    1.64    1.64    1.64    1.64    1.64    1.51    1.28    1.00  1.63
F02    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00  1.00
F03    19.82   26.49   3.71    3.71    3.71    3.71    3.71    3.71    3.71    2.06  1.00
F04    1.26E5  1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00    1.00  1.00
F05    6.07E5  5.34    1.00    1.87E4  1.00    1.00    1.00    1.00    1.00    1.00  1.00
F06    108.92  246.31  365.04  345.40  338.57  234.65  143.49  45.28   23.30   1.00  25.75
F07    9.22    14.28   5.34    1.00    1.00    9.30    1.00    1.00    1.00    1.00  1.00
F08    18.10   1.08    7.72    1.08    1.08    1.08    35.94   1.00    1.00    1.00  1.00
F09    280.04  391.61  1.27    6.81    1.27    1.27    1.27    227.95  1.00    1.00  1.00
F10    551.65  8.76    1.76E3  11.70   1.64    1.64    1.64    1.64    274.48  1.00  1.00
F11    14.67   1.00    6.18    6.18    13.99   6.18    6.18    6.18    4.40    2.69  6.16
F12    7.68    1.00    11.10   2.73    10.21   2.73    2.73    2.73    2.73    2.73  4.73
F13    7.72    26.75   15.90   17.76   6.12    10.40   6.12    5.95    2.55    1.00  2.25
F14    537.16  14.28   5.34    1.00    1.00    7.13    1.00    1.00    1.00    1.00  1.00
Best   0       4       2       4       5       3       5       6       7       11    9

Different tuning-parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or the problem setup changes. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HS/BA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.


Table 10: Mean normalized optimization results on fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

HMCR   0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.01    1.00    1.01
F02    747.56  7.87    5.25    3.43    5.09    7.10    3.07    1.87    1.60    1.00    1.01
F03    9.13    3.12E4  1.09    1.07    1.05    1.06    1.05    1.02    1.01    1.01    1.00
F04    2.10E6  1.00E4  3.95E4  9.48E3  5.97E3  2.95E3  1.51E3  1.50E3  46.94   1.00    2.37
F05    3.24E6  3.27E4  2.09E4  4.14E4  7.13E3  1.54E3  8.93E3  1.53E3  629.48  1.00    203.73
F06    1.00    5.94E4  422.04  632.15  6.00E4  1.92    1.91    1.91    1.91    1.91    1.91
F07    10.63   2.07E6  9.00    216.72  324.55  3.07E4  1.04    1.03    1.02    1.01    1.00
F08    59.50   9.88E3  3.04E4  141.80  217.01  324.68  3.07E4  1.07    1.07    1.03    1.00
F09    226.87  4.95E6  3.08    8.91    114.22  173.60  259.41  2.45E4  1.45    1.00    1.71
F10    257.37  2.83E4  9.35E5  1.37E4  97.09   66.56   99.25   149.40  1.39E4  1.00    2.73
F11    6.07    1.00    1.83E4  2.02    16.61   390.40  262.99  403.00  603.61  5.72E4  1.84
F12    3.00    2.79E4  3.60E4  1.00    2.82E4  270.95  194.14  130.77  200.40  300.16  2.84E4
F13    2.32    9.66    5.61E6  1.89E6  1.00    8.15    267.78  191.86  129.23  198.05  296.65
F14    184.66  3.84E5  1.17E4  1.85E3  1.00    5.71E3  25.11   55.39   39.60   26.57   41.49
Best   1       1       0       1       2       0       0       0       0       6       3

Table 11: Best normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HS/BA algorithm.

PAR    0       0.1    0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    3.91E3  1.00   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F03    1.00    4.25   1.51    2.20    2.20    2.12    2.20    1.99    2.20    1.28    1.80
F04    1.00    2.55   65.54   2.55    2.55    2.55    2.55    2.55    2.55    2.55    2.55
F05    2.52    1.00   1.71    1.69E3  1.71    1.71    1.71    1.71    1.71    1.71    1.71
F06    1.00    59.87  34.18   22.85   46.52   23.44   43.54   44.84   36.88   18.99   8.12
F07    8.07    1.00   1.48    2.55    2.55    13.06   2.55    2.55    2.55    2.55    2.55
F08    5.43    1.00   1.92    1.00    1.00    1.00    11.45   1.00    1.00    1.00    1.00
F09    76.72   2.52   1.17    1.00    1.71    1.71    1.71    155.56  1.71    1.71    1.71
F10    929.10  1.48   1.00    4.91    2.55    2.55    2.55    2.22    2.35E3  2.55    2.55
F11    3.18E3  1.00   5.82E3  3.99E3  3.39E3  5.82E3  4.91E3  5.82E3  5.82E3  9.58E3  5.82E3
F12    305.06  1.00   537.28  97.34   187.58  97.34   97.34   97.34   97.34   97.34   398.27
F13    1.92    4.22   5.87    10.07   12.98   4.66    1.48    4.01    3.17    1.00    4.09
F14    88.12   1.00   1.48    2.55    2.55    4.91    2.55    2.55    2.55    2.55    2.55
Best   4       8      3       4       3       3       2       3       3       4       3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HS/BA, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the bat-updating process. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy, which often outperforms the original HS and BA. The HS/BA attempts to combine the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HS/BA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HS/BA makes good use of the information in past solutions to generate better-quality solutions more frequently when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.
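In code, the greedy replacement described above reduces to a one-line selection. A minimal sketch, assuming minimization (new_bat is the bat position after the BA update and harmony the vector produced from it by the HS operator; the names are illustrative):

    def greedy_select(new_bat, harmony, fitness):
        # The harmony vector substitutes the newly generated bat only if
        # it has better (lower) fitness; otherwise the bat is kept.
        return harmony if fitness(harmony) < fitness(new_bat) else new_bat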


Table 12: Mean normalized optimization results on fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HS/BA algorithm, averaged over 100 Monte Carlo simulations.

PAR    0       0.1     0.2     0.3     0.4     0.5     0.6     0.7     0.8     0.9     1.0
F01    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02    4.07    1.00    1.00    4.52    2.77    2.57    3.41    5.36    3.69    4.89    1.03
F03    1.00    1.27E4  1.34    1.35    1.35    1.35    1.34    1.35    1.34    1.34    1.34
F04    1.15    81.27   9.39E3  1.00    1.01    1.01    1.29    1.00    1.00    1.01    1.01
F05    345.73  50.61   86.19   9.07E3  25.01   1.08    1.01    1.91    3.33    1.00    1.18
F06    1.00    5.42E6  4.27E4  4.74E4  5.48E6  577.88  577.88  577.88  577.88  577.87  577.87
F07    4.86    1.53    1.00    95.43   105.96  1.22E4  1.34    1.36    1.32    1.33    1.35
F08    12.69   76.00   8.76E3  17.03   69.13   76.72   8.85E3  1.10    1.05    1.04    1.00
F09    63.45   230.47  1.82    1.67    12.78   48.60   53.49   6.09E3  1.11    1.30    1.00
F10    133.55  12.51   1.26    1.97E3  11.48   4.64    16.45   18.93   1.99E3  1.00    1.88
F11    63.02   1.00    6.39E3  79.44   59.24   3.96E3  1.42E3  5.81E3  6.45E3  7.45E5  79.81
F12    3.90    8.81E3  8.90E3  1.00    8.90E3  44.40   47.78   17.25   70.11   77.84   8.99E3
F13    1.00    21.66   2.07E3  6.69    5.80    4.30    271.67  282.46  105.39  429.26  476.62
F14    46.14   1.00    36.10   55.93   1.22    6.40E3  42.15   32.89   34.81   12.72   51.29
Best   4       4       3       3       1       1       1       2       2       3       3

Based on the results of the ten approaches on the test problems, we can conclude that the HS/BA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions are used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HS/BA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules to HS/BA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we would apply our proposed approach HS/BA to solve practical engineering optimization problems, and, obviously, HS/BA can become a fascinating method for real-world engineering optimization problems; on the other hand, we would develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large-scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.



[24] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[25] G Wang L Guo H Duan L Liu H Wang and M ShaoldquoHybridizing harmony search with biogeography based opti-mization for global numerical optimizationrdquo Journal of Com-putational and Theoretical Nanoscience In press

[26] Z W Geem J H Kim and G V Loganathan ldquoA new heuristicoptimization algorithm harmony searchrdquo Simulation vol 76no 2 pp 60ndash68 2001

[27] G Wang L Guo H Duan H Wang L Liu and J Li ldquoIncor-porating mutation scheme into krill herd algorithm for globalnumerical optimizationrdquo Neural Computing and Applications2012

[28] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress Frome UK 2nd edition 2010

[29] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing amp Applications In press

[30] X S Yang andAHGandomi ldquoBat algorithm a novel approachfor global engineering optimizationrdquo Engineering Computa-tions vol 29 no 5 pp 464ndash483 2012

[31] G Wang L Guo H Duan L Liu and H Wang ldquoA Bat algo-rithm with mutation for UCAV path planningrdquo The ScientificWorld Journal vol 2012 Article ID 418946 15 pages 2012

[32] R Parpinelli and H Lopes ldquoNew inspirations in swarm intelli-gence a surveyrdquo International Journal of Bio-Inspired Computa-tion vol 3 no 1 pp 1ndash16 2011

[33] P W Tsai J S Pan B Y Liao M J Tsai and V Istanda ldquoBatalgorithm inspired algorithm for solving numerical optimiza-tion problemsrdquo Applied Mechanics and Materials vol 148 pp134ndash137 2012

[34] Wikipedia ldquoMathematical optimizationrdquo httpenwikipediaorgwikiNumerical optimization

[35] W Y Zhang S Xu and S J Li ldquoNecessary conditions for weaksharp minima in cone-constrained optimization problemsrdquoAbstract and Applied Analysis vol 2011 Article ID 909520 11pages 2012

[36] Wikipedia ldquoGlobal optimizationrdquo httpenwikipediaorgwikiGlobal optimization

[37] S Kirkpatrick C D Gelatt Jr and M P Vecchi ldquoOptimizationby simulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983

[38] M Dorigo and T Stutzle Ant Colony Optimization The MITPress Cambridge Mass USA 2004

[39] M Dorigo Optimization Learning and Natural AlgorithmsPolitecnico di Milano Italy 1992

[40] H G Beyer The Theory of Evolution Strategies Springer NewYork NY USA 2001

[41] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquo inParallel Problem Solving From Nature pp 683ndash691 1998

[42] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[43] M Friedman ldquoA comparison of alternative tests of significancefor the problem of 119898 rankingsrdquo The Annals of MathematicalStatistics vol 11 no 1 pp 86ndash92 1940

[44] R Brits A P Engelbrecht and F van den Bergh ldquoLocatingmultiple optima using particle swarm optimizationrdquo AppliedMathematics and Computation vol 189 no 2 pp 1859ndash18832007

[45] K Tang X Li P N Suganthan Z Yang and T Weise ldquoBench-mark functions for the CECrsquo2010 special session and competi-tion on large scale global optimizationrdquo Nature Inspired Com-putation and Applications Laboratory USTC Hefei China2010

[46] Q-K Pan P N Suganthan J J Liang and M F Tasgetiren ldquoAlocal-best harmony search algorithm with dynamic subpopula-tionsrdquo Engineering Optimization vol 42 no 2 pp 101ndash117 2010

[47] Q-K Pan P N Suganthan J J Liang and M F TasgetirenldquoA local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop schedulingproblemrdquo Expert Systems with Applications vol 38 no 4 pp3252ndash3259 2011

[48] Q-K Pan P N Suganthan M F Tasgetiren and J J LiangldquoA self-adaptive global best harmony search algorithm forcontinuous optimization problemsrdquo Applied Mathematics andComputation vol 216 no 3 pp 830ndash848 2010

Journal of Applied Mathematics 21

[49] S M Islam S Das S Ghosh S Roy and P N Suganthan ldquoAnadaptive differential evolution algorithm with novel mutationand crossover strategies for global numerical optimizationrdquoIEEE Transactions on Systems Man and Cybernetics B vol 42no 2 pp 482ndash500 2012

[50] J J Liang A K Qin P N Suganthan and S Baskar ldquoCompre-hensive learning particle swarm optimizer for global optimiza-tion of multimodal functionsrdquo IEEE Transactions on Evolution-ary Computation vol 10 no 3 pp 281ndash295 2006

[51] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of


Table 4: Best normalized optimization results on the fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.

       ACO    BA     BBO   DE     ES     GA     HS     HSBA  PSO    SGA
F01    185    231    100   143    225    203    230    105   191    104
F02    1042   1448   109   422    1039   403    991    100   828    133
F03    273    4622   187   447    2138   802    4324   100   1675   176
F04    44E6   50E6   12E3  19E4   22E6   30E4   42E6   100   3755   322
F05    30E4   32E4   5101  38633  16E4   88446  26E4   100   4113   457
F06    5851   99255  650   2469   80848  5924   74691  100   18971  202
F07    571    853    125   513    794    523    747    100   600    168
F08    2642   2503   148   370    3352   616    2150   100   775    145
F09    243    866    128   490    593    209    728    100   740    142
F10    189    494    118   266    300    208    276    100   201    174
F11    774    1261   121   336    1214   601    960    100   781    162
F12    133    233    144   169    208    174    211    100   170    121
F13    2804   5657   217   554    6057   1971   5286   100   2098   228
F14    428    5402   197   485    3361   1060   4911   100   1972   185*

*The values are normalized so that the minimum in each row is 100; these are the absolute best minima found by each algorithm.
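For clarity, row-wise normalization of this kind can be computed as in the following minimal sketch; the function name and the use of NumPy are illustrative choices, since the paper does not publish an implementation:

```python
import numpy as np

def normalize_rows(raw_results):
    """Rescale each row so that its smallest entry becomes 100, the
    convention used in Tables 4-12: every value is then read as a
    multiple of the best algorithm's result on that benchmark."""
    raw_results = np.asarray(raw_results, dtype=float)
    row_min = raw_results.min(axis=1, keepdims=True)
    # Assumes strictly positive objective values; a zero minimum would
    # require shifting the row before scaling.
    return 100.0 * raw_results / row_min
```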

Figure 1: Comparison of the performance of the different methods for the F01 Ackley function.

to succeed in this benchmark function within the maximum number of generations. At last, BBO, DE, and SGA converge to a value very close to that of HSBA.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can approximately divide the algorithms into three groups: one group, including BBO, HSBA, and SGA, performing the best; another group, including ACO, DE, GA, and PSO, performing the second best; and a third group, including BA, ES, and HS, performing the worst.

Figure 2: Comparison of the performance of the different methods for the F02 Fletcher-Powell function.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HSBA is superior to the other algorithms during the optimization process in this relatively simple unimodal benchmark function. We also see that, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HSBA is outperformed by BBO between generations 10 and 30, it has a stable convergence rate at finding the global minimum and significantly outperforms all other approaches in this multimodal benchmark function.

Figure 3: Comparison of the performance of the different methods for the F03 Griewank function.

Figure 4: Comparison of the performance of the different methods for the F04 Penalty #1 function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HSBA performs far better than the other algorithms during the optimization process in this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be approximately divided into three groups: one group, including BBO and HSBA, performing the best; another group, including ACO, GA, PSO, and SGA, performing the second best; and a third group, including DE, ES, and HS, performing the worst.

Figure 5: Comparison of the performance of the different methods for the F05 Penalty #2 function.

Figure 6: Comparison of the performance of the different methods for the F06 Quartic (with noise) function.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HSBA; however, it is overtaken by HSBA after 40 generations. At last, HSBA reaches the optimal solution, significantly superior to the other algorithms; BBO is only inferior to HSBA and performs the second best on this unimodal function.

Figure 7: Comparison of the performance of the different methods for the F07 Rastrigin function.

Figure 8: Comparison of the performance of the different methods for the F08 Rosenbrock function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HSBA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 9: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function.

Figure 10: Comparison of the performance of the different methods for the F10 Schwefel 1.2 function.

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HSBA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13, for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HSBA.

Figure 14 shows the results for the F14 Step function. Apparently, HSBA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14, for BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function.

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function.

From the above analyses of Figures 1–14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm HSBA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are only inferior to HSBA; they perform better than HSBA only on benchmarks F04 and F05, benchmark F12, and benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster convergence rate initially, while later it converges more and more slowly toward the true objective function value.

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function.

Figure 14: Comparison of the performance of the different methods for the F14 Step function.

At last, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem. The results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HSBA is a more powerful and efficient optimization algorithm than the other population-based optimization algorithms considered.


Table 5: Best normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

A      0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    166    133    155    140    100    133    121    122    116    113    114
F02    125E4  100    100    100    100    100    100    100    100    100    100
F03    341    1115   100    100    100    100    100    100    100    100    100
F04    273E4  100    125E4  104    104    104    104    104    104    100    100
F05    503E5  530    133    168E4  100    100    100    100    100    100    100
F06    100    256    547    473    1781   990    289    674    990    260    557
F07    1014   1492   435    110    110    1537   101    100    100    100    100
F08    3849   108    1406   108    108    108    9901   101    100    100    100
F09    28518  40458  138    474    119    119    119    36791  101    100    100
F10    66522  468    169E3  1527   118    118    118    118    121    100    100
F11    2003   100    996    1154   3922   996    996    996    996    3897   849
F12    1032   100    1322   323    1469   279    279    279    279    279    1931
F13    146    415    284    447    115    156    100    100    100    100    100
F14    52752  1357   395    101    115    1291   100    100    100    100    100
#best  1      4      2      2      3      3      5      6      7      10     10

4.2. Influence of Control Parameters. In [28], Yang concluded that, if its parameters are adjusted properly, BA can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, the pulse rate r, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HSBA, we carried out experiments on the benchmark problems with different parameter values. All other parameter settings were kept unchanged (unless noted otherwise in the following paragraphs). The results after 100 Monte Carlo runs are recorded in Tables 5–12. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the minima averaged over 100 Monte Carlo runs; in other words, they show the best and the average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs the best with the given parameter value.
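The experimental protocol described above can be summarized by the following sketch; `run_hsba` and `BENCHMARKS` are hypothetical stand-ins for the authors' solver and test functions, which the paper does not publish as code:

```python
import numpy as np

def sweep_parameter(run_hsba, benchmarks, name, values, runs=100):
    """Collect the best and the mean minima over `runs` Monte Carlo runs
    for each setting of one control parameter, holding the others fixed,
    as done for Tables 5-12."""
    best, mean = {}, {}
    for value in values:
        for fname, func in benchmarks.items():
            minima = [run_hsba(func, **{name: value}) for _ in range(runs)]
            best[(fname, value)] = min(minima)             # Tables 5, 7, 9, 11
            mean[(fname, value)] = float(np.mean(minima))  # Tables 6, 8, 10, 12

    return best, mean

# Example: vary the loudness A over 0, 0.1, ..., 1.0 with r, HMCR, PAR fixed.
# best, mean = sweep_parameter(run_hsba, BENCHMARKS, "A",
#                              [i / 10 for i in range(11)])
```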

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and the fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (< 0.5); (iii) however, for functions F05, F07–F10, F13, and F14, HSBA performs better with bigger A (> 0.5). In sum, HSBA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to both 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and the fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HSBA performs significantly better with bigger r (> 0.5) than with smaller r (< 0.5) and that HSBA performs the best when r is equal or very close to 0.6, so we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and the fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can see that the function values evaluated by HSBA are better (smaller) with bigger HMCR (> 0.5) than with smaller HMCR (< 0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1, so we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and the fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can see that the function values for HSBA vary little with increasing PAR and that HSBA reaches its optimum (minimum) on most benchmarks when PAR is equal or very close to 0.1, so we set PAR = 0.1 in the other experiments.

4.3. Discussion. In HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR).


Table 6: Mean normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A      0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    101    101    100    100    100    100    100    100    100    100    100
F02    152    100    146    117    108    137    188    165    167    179    148
F03    669    1041   106    101    103    103    101    102    105    100    100
F04    20131  126    410    421    333    272    100    519    291    100    311
F05    59122  1682   425    587    101    414    2404   100    368    733    900
F06    100    490E4  63799  25871  495E4  218    217    217    218    217    217
F07    857    221E6  643    27234  11048  211E4  102    101    100    100    100
F08    4980   114E4  172E4  16160  22459  9119   174E4  103    111    100    100
F09    8237   173E6  106    314    7445   10310  4226   794E3  100    137    129
F10    9031   145E3  245E5  232E3  2334   2234   3138   1289   234E3  140    100
F11    498    100    315E4  233    1441   52023  44365  61626  24991  478E4  214
F12    369    210E4  457E4  100    212E4  26635  23313  19880  27614  11201  214E4
F13    190    825    448E6  214E6  100    615    25451  22275  18996  26386  10701
F14    6691   195E5  117E4  124E3  2104   186E3  2940   2392   100    1835   2475
#best  1      2      1      2      2      1      2      2      4      5      5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r      0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    184    184    184    484    159    171    100    134    109    116    101
F02    944E3  100    100    100    100    100    100    100    100    100    100
F03    373    546    186    186    186    186    100    172    121    161    101
F04    1142   111    2429   123    126    100    129    129    129    129    129
F05    177E5  230    115    109E4  100    100    100    100    100    100    100
F06    6244   2996   4858   1560   2634   780    182    100    3520   239    155
F07    831    448    258    129    129    349    100    100    100    100    100
F08    961    118    420    137    137    137    695    101    101    100    100
F09    37394  44422  168    335    168    168    100    29198  101    101    101
F10    38693  316    1397   463    158    158    100    158    30931  158    101
F11    4255   100    1829   2135   2018   1111   1904   2135   1581   1876   1155
F12    11440  100    14184  4448   12547  4448   4448   4448   4448   4448   6943
F13    639    637    601    296    302    100    152    139    100    257    249
F14    12052  404    233    100    116    340    100    100    116    116    116
#best  0      3      1      2      2      4      8      5      4      4      4

The appropriate update of the pitch adjustment rate (PAR) and of the rate of pulse emission (r) balances the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.
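A common way to realize this coupled behavior is the update rule of the standard bat algorithm [28]; in the sketch below, the constants α and γ are generic tuning parameters, not values prescribed by this paper:

```latex
A_i^{t+1} = \alpha\, A_i^{t}, \qquad
r_i^{t+1} = r_i^{0}\left[1 - \exp(-\gamma t)\right],
\qquad 0 < \alpha < 1,\ \gamma > 0.
```

Under this rule, the loudness A_i decays toward zero while the pulse rate r_i rises toward its initial value r_i^0 as bat i closes in on a solution.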

For all of the standard benchmark functions that have been considered, HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with HSBA performing significantly better on some functions. HSBA performs excellently and efficiently because of its ability to search locally and globally at the same time: the local search is carried out via the harmony search algorithm and the global search via the bat algorithm, concurrently. A similar behavior may be obtained in PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include being simple and easy to implement and having only four parameters to regulate. The work carried out here demonstrates HSBA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section.


Table 8: Mean normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r      0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    184    199    186    194    159    100    142    134    109    116    171
F02    331    538    457    308    382    114    100    289    279    402    311
F03    373    546    533    229    336    241    100    172    121    161    136
F04    1142   111    2429   149E4  113E3  394E3  278E4  1966   17522  114    100
F05    1609   12862  3269   2446   6440   1277   2959   7617   456    100    446
F06    6244   2696   4858   1560   2634   780    182    100    3520   239    155
F07    330    178    134    103    123    139    155    100    119    136    349
F08    488    393    394    194    219    383    353    433    100    227    314
F09    139    166    114    100    121    115    115    109    108    104    109
F10    721    994    875    360    846    611    483    414    576    100    271
F11    382    423    373    205    181    100    171    220    142    168    189
F12    164    217    204    147    180    186    100    121    113    134    156
F13    639    637    601    296    302    190    152    139    100    257    249
F14    574    1543   742    682    644    317    100    485    221    174    209
#best  0      0      0      1      0      2      4      2      2      2      1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR   0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    164    164    164    164    164    164    164    151    128    100    163
F02    187E4  100    100    100    100    100    100    100    100    100    100
F03    1982   2649   371    371    371    371    371    371    371    206    100
F04    126E5  100    187E4  100    100    100    100    100    100    100    100
F05    607E5  534    100    187E4  100    100    100    100    100    100    100
F06    10892  24631  36504  34540  33857  23465  14349  4528   2330   100    2575
F07    922    1428   534    100    100    930    100    100    100    100    100
F08    1810   108    772    108    108    108    3594   100    100    100    100
F09    28004  39161  127    681    127    127    127    22795  100    100    100
F10    55165  876    176E3  1170   164    164    164    164    27448  100    100
F11    1467   100    618    618    1399   618    618    618    440    269    616
F12    768    100    1110   273    1021   273    273    273    273    273    473
F13    772    2675   1590   1776   612    1040   612    595    255    100    225
F14    53716  1428   534    100    100    713    100    100    100    100    100
#best  0      4      2      4      5      3      5      6      7      11     9

Different tuning parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR   0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    101    101    101    101    101    101    101    101    101    100    101
F02    74756  787    525    343    509    710    307    187    160    100    101
F03    913    312E4  109    107    105    106    105    102    101    101    100
F04    210E6  100E4  395E4  948E3  597E3  295E3  151E3  150E3  4694   100    237
F05    324E6  327E4  209E4  414E4  713E3  154E3  893E3  153E3  62948  100    20373
F06    100    594E4  42204  63215  600E4  192    191    191    191    191    191
F07    1063   207E6  900    21672  32455  307E4  104    103    102    101    100
F08    5950   988E3  304E4  14180  21701  32468  307E4  107    107    103    100
F09    22687  495E6  308    891    11422  17360  25941  245E4  145    100    171
F10    25737  283E4  935E5  137E4  9709   6656   9925   14940  139E4  100    273
F11    607    100    183E4  202    1661   39040  26299  40300  60361  572E4  184
F12    300    279E4  360E4  100    282E4  27095  19414  13077  20040  30016  284E4
F13    232    966    561E6  189E6  100    815    26778  19186  12923  19805  29665
F14    18466  384E5  117E4  185E3  100    571E3  2511   5539   3960   2657   4149
#best  1      1      0      1      2      0      0      0      0      6      3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR    0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    100    100    100    100    100    100    100    100    100    100    100
F02    391E3  100    100    100    100    100    100    100    100    100    100
F03    100    425    151    220    220    212    220    199    220    128    180
F04    100    255    6554   255    255    255    255    255    255    255    255
F05    252    100    171    169E3  171    171    171    171    171    171    171
F06    100    5987   3418   2285   4652   2344   4354   4484   3688   1899   812
F07    807    100    148    255    255    1306   255    255    255    255    255
F08    543    100    192    100    100    100    1145   100    100    100    100
F09    7672   252    117    100    171    171    171    15556  171    171    171
F10    92910  148    100    491    255    255    255    222    235E3  255    255
F11    318E3  100    582E3  399E3  339E3  582E3  491E3  582E3  582E3  958E3  582E3
F12    30506  100    53728  9734   18758  9734   9734   9734   9734   9734   39827
F13    192    422    587    1007   1298   466    148    401    317    100    409
F14    8812   100    148    255    255    491    255    255    255    255    255
#best  4      8      3      4      3      3      2      3      3      4      3

Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR    0      0.1    0.2    0.3    0.4    0.5    0.6    0.7    0.8    0.9    1.0
F01    100    100    100    100    100    100    100    100    100    100    100
F02    407    100    100    452    277    257    341    536    369    489    103
F03    100    127E4  134    135    135    135    134    135    134    134    134
F04    115    8127   939E3  100    101    101    129    100    100    101    101
F05    34573  5061   8619   907E3  2501   108    101    191    333    100    118
F06    100    542E6  427E4  474E4  548E6  57788  57788  57788  57788  57787  57787
F07    486    153    100    9543   10596  122E4  134    136    132    133    135
F08    1269   7600   876E3  1703   6913   7672   885E3  110    105    104    100
F09    6345   23047  182    167    1278   4860   5349   609E3  111    130    100
F10    13355  1251   126    197E3  1148   464    1645   1893   199E3  100    188
F11    6302   100    639E3  7944   5924   396E3  142E3  581E3  645E3  745E5  7981
F12    390    881E3  890E3  100    890E3  4440   4778   1725   7011   7784   899E3
F13    100    2166   207E3  669    580    430    27167  28246  10539  42926  47662
F14    4614   100    3610   5593   122    640E3  4215   3289   3481   1272   5129
#best  4      4      3      3      1      1      1      2      2      3      3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HSBA, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. HSBA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA uses the information in past solutions more effectively to generate better-quality solutions frequently, when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.


Based on the results of the ten approaches on the test problems, we can conclude that HSBA significantly improves the performance of both the HS and the BA on most multimodal and unimodal problems.
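As a minimal sketch of this greedy scheme (names such as `objective`, `bats`, and the bandwidth `bw` are illustrative, and the standard BA position/velocity update is elided):

```python
import random

def improvise_and_replace(bats, fitness, objective, lo, hi,
                          HMCR=0.95, PAR=0.1, bw=0.01):
    """After the bats' positions are updated, build one harmony per bat
    from the population and keep it only if it improves that bat."""
    dim = len(bats[0])
    for i in range(len(bats)):
        harmony = []
        for d in range(dim):
            if random.random() < HMCR:        # harmony memory consideration
                x = random.choice(bats)[d]
                if random.random() < PAR:     # pitch adjustment (mutation)
                    x += bw * random.uniform(-1.0, 1.0)
            else:                             # random re-initialization
                x = random.uniform(lo, hi)
            harmony.append(min(max(x, lo), hi))   # keep within bounds
        new_fitness = objective(harmony)
        if new_fitness < fitness[i]:          # greedy replacement
            bats[i], fitness[i] = harmony, new_fitness
```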

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only considered unconstrained function optimization in this work; our future work consists of adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HSBA to solve practical engineering optimization problems, where HSBA can become a fascinating method for real-world engineering optimization; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.
[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.
[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.
[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.
[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.
[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.
[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.
[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.
[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.
[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.
[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.
[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS-BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.
[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.
[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.
[19] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.
[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.
[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.
[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.
[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.
[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.
[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.
[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.
[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.
[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.
[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.
[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.
[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.
[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.
[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.
[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.
[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.
[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.
[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.
[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.
[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.


12 Journal of Applied Mathematics

0 10 20 30 40 50

50

100

150

200

250

300

350

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 3 Comparison of the performance of the different methodsfor the F03 Griewank function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

1010

109

108

107

106

105

104

103

102

101

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 4 Comparison of the performance of the different methodsfor the F04 Penalty 1 function

and significantly outperforms all other approaches in thismultimodal benchmark function

Figure 10 shows the results for F10 Schwefel 12 functionFrom Figure 10 we can see that HSBA performs far betterthan other algorithms during the optimization process in thisrelative simple unimodal benchmark function PSO showsa faster convergence rate initially than HSBA howeverit is outperformed by HSBA after 3 generations Lookingcarefully at Figure 10 akin to F07 Rastrigin function shown inFigure 7 all the algorithms except BA can be approximately

0

Ben

chm

ark

func

tion

val

ue

5 10 15 20 25 30 35 40 45 50Number of generations

109

108

107

106

105

104

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 5 Comparison of the performance of the different methodsfor the F05 Penalty 2 function

0 10 20 30 40 500

5

10

15

20

25

30

35

40

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 6 Comparison of the performance of the different methodsfor the F06 Quartic (with noise) function

divided into three groups one group including BBO andHSBA performing the best the other group including ACOGA PSO and SGA performing the second best anothergroup including DE ES and HS performing the worst

Figure 11 shows the results for F11 Schwefel 222 functionVery clearly BBO shows a faster convergence rate initiallythan HSBA however it is outperformed by HSBA after 40generations At last HSBA reaches the optimal solution sig-nificantly superior to other algorithms BBO is only inferior

Journal of Applied Mathematics 13B

ench

mar

k fu

ncti

on v

alue

0 10 20 30 40 50

50

100

150

200

250

Number of generations

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 7 Comparison of the performance of the different methodsfor the F07 Rastrigin function

0 10 20 30 40 50

500

1000

1500

2000

2500

3000

3500

4000

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 8 Comparison of the performance of the different methodsfor the F08 Rosenbrock function

to HSBA and performs the second best in this unimodalfunction

Figure 12 shows the results for F12 Schwefel 221 functionVery clearly HSBAhas the fastest convergence rate at findingthe global minimum and significantly outperforms all otherapproaches

0 10 20 30 40 50

1000

2000

3000

4000

5000

6000

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 9 Comparison of the performance of the different methodsfor the F09 Schwefel 226 function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

104

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 10 Comparison of the performance of the differentmethodsfor the F10 Schwefel 12 function

Figure 13 shows the results for F13 Sphere function FromTable 3 and Figure 13 HSBA has the fastest convergencerate at finding the global minimum and outperforms all otherapproaches Looking carefully at Figure 13 for BBO and SGAwe can see that BBO has a faster convergence rate than SGAbut SGA does finally converge to the value of BBO that isapproaching to HSBA

Figure 14 shows the results for F14 Step function Appar-ently HSBA shows the fastest convergence rate at findingthe global minimum and significantly outperforms all otherapproaches Looking carefully at Figure 14 for BBO and SGA

14 Journal of Applied MathematicsB

ench

mar

k fu

ncti

on v

alue

0 5 10 15 20 25 30 35 40 45 50Number of generations

102

101

100

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 11 Comparison of the performance of the different methodsfor the F11 Schwefel 222 function

0 10 20 30 40 5030

35

40

45

50

55

60

65

70

75

80

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 12 Comparison of the performance of the different methodsfor the F12 Schwefel 221 function

we can see that BBO has a faster convergence rate than SGAbut SGA does finally converge to the value of BBO

From the above-analyses about Figures 1ndash14 we can cometo the conclusion that our proposed hybrid metaheuristicalgorithm HSBA significantly outperforms the other ninealgorithms In general ACO BBO and SGA are only inferiorto the HSBA and ACO BBO and SGA perform betterthanHSBA in the benchmarks F04-F05 the benchmark F12and the benchmark F01 respectively Further benchmarksF04 F05 F06 F08 and F10 illustrate that PSO has a faster

0 10 20 30 40 500

10

20

30

40

50

60

70

80

90

100

Number of generations

Ben

chm

ark

func

tion

val

ue

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 13 Comparison of the performance of the different methodsfor the F13 Sphere function

0 5 10 15 20 25 30 35 40 45 50Number of generations

Ben

chm

ark

func

tion

val

ue

104

103

ACOBABBODEES

GAHSHSBAPSOSGA

Figure 14 Comparison of the performance of the differentmethodsfor the F14 Step function

convergence rate initially while later it converges slowerand slower to the true objective function value At lastwe must point out that in [24] Simon compared BBOwith seven state-of-the-art EAs over 14 benchmark func-tions and a real-world sensor selection problem The resultsdemonstrated the good performance of BBO It is also indi-rectly demonstrated that our proposed hybrid metaheuristicmethod HSBA is a more powerful and efficient optimiza-tion algorithm than other population-based optimizationalgorithms

Journal of Applied Mathematics 15

Table 5 Best normalized optimization results in fourteen benchmark functions with different 119860 The numbers shown are the best resultsfound after 100 Monte Carlo simulations HSBA algorithm

119860

0 01 02 03 04 05 06 07 08 09 10F01 166 133 155 140 100 133 121 122 116 113 114F02 125E4 100 100 100 100 100 100 100 100 100 100F03 341 1115 100 100 100 100 100 100 100 100 100F04 273E4 100 125E4 104 104 104 104 104 104 100 100F05 503E5 530 133 168E4 100 100 100 100 100 100 100F06 100 256 547 473 1781 990 289 674 990 260 557F07 1014 1492 435 110 110 1537 101 100 100 100 100F08 3849 108 1406 108 108 108 9901 101 100 100 100F09 28518 40458 138 474 119 119 119 36791 101 100 100F10 66522 468 169E3 1527 118 118 118 118 121 100 100F11 2003 100 996 1154 3922 996 996 996 996 3897 849F12 1032 100 1322 323 1469 279 279 279 279 279 1931F13 146 415 284 447 115 156 100 100 100 100 100F14 52752 1357 395 101 115 1291 100 100 100 100 100

1 4 2 2 3 3 5 6 7 10 10

42 Influence of Control Parameter In [28] Dr Yang con-cluded that if the parameters in BA can be adjusted properlyit can outperform GA HS (harmony search) [26] and PSOThe choice of the control parameters is of vital importancefor different problems To investigate the influence of theloudness119860 pulse rate 119903 harmonymemory consideration rateHMCR and pitch adjustment rate PAR on the performanceof HSBA we carry out this experiment in the benchmarkproblem with different parameters All other parametersettings are kept unchanged (unless noted otherwise in thefollowing paragraph) The results are recorded in Tables 5ndash12 after 100 Monte Carlo runs Among them Tables 5 7 9and 11 show the best minima found by HSBA algorithm over100Monte Carlo runs Tables 6 8 10 and 12 show the averageminima found byHSBA algorithm averaged over 100MonteCarlo runs In other words Tables 5 7 9 11 and Tables 68 10 12 show the best and average performance of HSBAalgorithm respectively In each table the last row is the totalnumber of functions on which HSBA performs the best withsome parameters

421 Loudness 119860 Tables 5 and 6 recorded the resultsperformed in the benchmark problem with the loudness 119860 =0 01 02 09 10 and fixed pulse rate 119903 = 06 harmonymemory consideration rate HMCR= 095 and pitch adjust-ment rate PAR= 01 From Tables 5 and 6 obviously it can beseen that (i) for the three benchmark functions F02 F03 andF04 HSBA performs slightly differently that is to say thesethree benchmark functions are insensitive to the parameter119860 (ii) For benchmark functions F01 F06 F11 and F12HSBA performs better on smaller119860(lt05) (iii) However forfunctions F05 F07ndash10 F13 and F14 HSBA performs betteron bigger119860(gt05) In sum HSBA performs the best when119860is equal or very close to 10 So we set119860 = 095 which is veryclose to 09 and 10 in other experiments

422 Pulse Rate 119903 Tables 7 and 8 recorded the resultsperformed on the benchmark problem with the pulse rate119903 = 0 01 02 09 10 and fixed loudness119860 = 095harmony memory consideration rate HMCR=095 andpitch adjustment rate PAR= 01 From Tables 7 and 8 wecan evidently conclude that HSBA performs significantlybetter on bigger 119903(gt05) than on smaller 119903(lt05) and HSBAperforms the best when r is equal or very close to 06 So weset 119903 = 06 in other experiments

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and a fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10 we can recognize that the function values evaluated by HSBA are better (smaller) with bigger HMCR (> 0.5) than with smaller HMCR (< 0.5), and that most benchmarks reach their minimum when HMCR is equal or very close to 1. So we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and a fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12 we can recognize that the function values for HSBA vary little with increasing PAR, and that HSBA reaches its optimum (minimum) on most benchmarks when PAR is equal or very close to 0.1. So we set PAR = 0.1 in the other experiments.
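Since HMCR and PAR jointly control the harmony search step that HSBA borrows as its mutation operator, a minimal sketch of the standard HS improvisation step may help; the variable names and the bandwidth `bw` are our illustrative choices, not identifiers from the paper.

```python
import random

def improvise(harmony_memory, lower, upper, hmcr=0.95, par=0.1, bw=0.01):
    """Generate one new harmony (candidate solution), dimension by dimension."""
    new = []
    for d in range(len(lower)):
        if random.random() < hmcr:            # memory consideration
            value = random.choice(harmony_memory)[d]
            if random.random() < par:         # pitch adjustment (mutation)
                value += bw * (2.0 * random.random() - 1.0)
        else:                                 # random re-initialization
            value = random.uniform(lower[d], upper[d])
        new.append(min(max(value, lower[d]), upper[d]))  # clamp to bounds
    return new
```

With HMCR = 0.95 and PAR = 0.1, as chosen above, about 95% of the components are copied from the memory, and roughly one in ten of those copies is perturbed.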

4.3. Discussion. In the HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions).


Table 6: Mean normalized optimization results on the fourteen benchmark functions with different A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A       0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.01     1.01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     1.52     1.00     1.46     1.17     1.08     1.37     1.88     1.65     1.67     1.79     1.48
F03     6.69     10.41    1.06     1.01     1.03     1.03     1.01     1.02     1.05     1.00     1.00
F04     201.31   1.26     4.10     4.21     3.33     2.72     1.00     5.19     2.91     1.00     3.11
F05     591.22   16.82    4.25     5.87     1.01     4.14     24.04    1.00     3.68     7.33     9.00
F06     1.00     4.90E4   637.99   258.71   4.95E4   2.18     2.17     2.17     2.18     2.17     2.17
F07     8.57     2.21E6   6.43     272.34   110.48   2.11E4   1.02     1.01     1.00     1.00     1.00
F08     49.80    1.14E4   1.72E4   161.60   224.59   91.19    1.74E4   1.03     1.11     1.00     1.00
F09     82.37    1.73E6   1.06     3.14     74.45    103.10   42.26    7.94E3   1.00     1.37     1.29
F10     90.31    1.45E3   2.45E5   2.32E3   23.34    22.34    31.38    12.89    2.34E3   1.40     1.00
F11     4.98     1.00     3.15E4   2.33     14.41    520.23   443.65   616.26   249.91   4.78E4   2.14
F12     3.69     2.10E4   4.57E4   1.00     2.12E4   266.35   233.13   198.80   276.14   112.01   2.14E4
F13     1.90     8.25     4.48E6   2.14E6   1.00     6.15     254.51   222.75   189.96   263.86   107.01
F14     66.91    1.95E5   1.17E4   1.24E3   21.04    1.86E3   29.40    23.92    1.00     18.35    24.75
# best  1        2        1        2        2        1        2        2        4        5        5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r       0        0.1     0.2      0.3      0.4      0.5     0.6    0.7     0.8     0.9    1.0
F01     1.84     1.84    1.84     4.84     1.59     1.71    1.00   1.34    1.09    1.16   1.01
F02     9.44E3   1.00    1.00     1.00     1.00     1.00    1.00   1.00    1.00    1.00   1.00
F03     3.73     5.46    1.86     1.86     1.86     1.86    1.00   1.72    1.21    1.61   1.01
F04     11.42    1.11    24.29    1.23     1.26     1.00    1.29   1.29    1.29    1.29   1.29
F05     1.77E5   2.30    1.15     1.09E4   1.00     1.00    1.00   1.00    1.00    1.00   1.00
F06     62.44    29.96   48.58    15.60    26.34    7.80    1.82   1.00    35.20   2.39   1.55
F07     8.31     4.48    2.58     1.29     1.29     3.49    1.00   1.00    1.00    1.00   1.00
F08     9.61     1.18    4.20     1.37     1.37     1.37    6.95   1.01    1.01    1.00   1.00
F09     373.94   444.22  1.68     3.35     1.68     1.68    1.00   291.98  1.01    1.01   1.01
F10     386.93   3.16    13.97    4.63     1.58     1.58    1.00   1.58    309.31  1.58   1.01
F11     42.55    1.00    18.29    21.35    20.18    11.11   19.04  21.35   15.81   18.76  11.55
F12     114.40   1.00    141.84   44.48    125.47   44.48   44.48  44.48   44.48   44.48  69.43
F13     6.39     6.37    6.01     2.96     3.02     1.00    1.52   1.39    1.00    2.57   2.49
F14     120.52   4.04    2.33     1.00     1.16     3.40    1.00   1.00    1.16    1.16   1.16
# best  0        3       1        2        2        4       8      5       4       4      4

Four parameters govern this search: the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR). The appropriate updates for the pitch adjustment rate (PAR) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.
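These loudness and pulse-emission schedules follow the standard bat algorithm of Yang [28]; a minimal sketch, where the cooling constants `alpha` and `gamma` are typical illustrative values rather than values taken from this paper:

```python
import math

def update_loudness_and_pulse_rate(A, r0, t, alpha=0.9, gamma=0.9):
    """Standard BA schedules, applied once a bat improves on the current best:
    the bat 'quiets down' near its prey while pulsing faster at iteration t."""
    A_new = alpha * A                          # A_i(t+1) = alpha * A_i(t)
    r_new = r0 * (1.0 - math.exp(-gamma * t))  # r_i(t+1) = r0 * (1 - e^(-gamma*t))
    return A_new, r_new
```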

For all of the standard benchmark functions that have been considered, the HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HSBA performing significantly better on some functions. The HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search, via the harmony search algorithm, and the global search, via the bat algorithm, concurrently. A similar behavior may be obtained in the PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include its simplicity and ease of implementation, and the fact that it has only four parameters to regulate. The work carried out here demonstrates the HSBA to be robust over all kinds of benchmark functions.
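A minimal sketch of this interleaving, under the simplifying assumption that the loudness and pulse-rate bookkeeping is omitted; `hsba_minimize` and all other names are our own illustrations, not the paper's implementation:

```python
import random

def hsba_minimize(f, lower, upper, n_bats=30, n_iter=50,
                  fmin=0.0, fmax=2.0, hmcr=0.95, par=0.1, bw=0.01):
    """Toy HSBA loop: a BA position update (global search) followed by a
    greedy HS refinement (local search) of each newly moved bat."""
    dim = len(lower)
    clamp = lambda x, d: min(max(x, lower[d]), upper[d])
    bats = [[random.uniform(lower[d], upper[d]) for d in range(dim)]
            for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = list(min(bats, key=f))  # copy, so in-place moves cannot alias it

    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * random.random()  # random frequency
            for d in range(dim):  # BA move towards the current best bat
                vel[i][d] += (bats[i][d] - best[d]) * freq
                bats[i][d] = clamp(bats[i][d] + vel[i][d], d)
            harmony = []          # HS improvisation seeded by the population
            for d in range(dim):
                if random.random() < hmcr:
                    x = random.choice(bats)[d]
                    if random.random() < par:      # pitch adjustment
                        x += bw * (2.0 * random.random() - 1.0)
                else:
                    x = random.uniform(lower[d], upper[d])
                harmony.append(clamp(x, d))
            if f(harmony) < f(bats[i]):  # greedy: keep the fitter of the two
                bats[i] = harmony
            if f(bats[i]) < f(best):
                best = list(bats[i])
    return best
```

For example, `hsba_minimize(lambda x: sum(v * v for v in x), [-5.12] * 10, [5.12] * 10)` runs the sketch on a 10-dimensional sphere function.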

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section.


Table 8: Mean normalized optimization results on the fourteen benchmark functions with different r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r       0       0.1     0.2     0.3      0.4      0.5      0.6      0.7    0.8     0.9    1.0
F01     1.84    1.99    1.86    1.94     1.59     1.00     1.42     1.34   1.09    1.16   1.71
F02     3.31    5.38    4.57    3.08     3.82     1.14     1.00     2.89   2.79    4.02   3.11
F03     3.73    5.46    5.33    2.29     3.36     2.41     1.00     1.72   1.21    1.61   1.36
F04     11.42   1.11    24.29   1.49E4   1.13E3   3.94E3   2.78E4   19.66  175.22  1.14   1.00
F05     16.09   128.62  32.69   24.46    64.40    12.77    29.59    76.17  4.56    1.00   4.46
F06     62.44   26.96   48.58   15.60    26.34    7.80     1.82     1.00   35.20   2.39   1.55
F07     3.30    1.78    1.34    1.03     1.23     1.39     1.55     1.00   1.19    1.36   3.49
F08     4.88    3.93    3.94    1.94     2.19     3.83     3.53     4.33   1.00    2.27   3.14
F09     1.39    1.66    1.14    1.00     1.21     1.15     1.15     1.09   1.08    1.04   1.09
F10     7.21    9.94    8.75    3.60     8.46     6.11     4.83     4.14   5.76    1.00   2.71
F11     3.82    4.23    3.73    2.05     1.81     1.00     1.71     2.20   1.42    1.68   1.89
F12     1.64    2.17    2.04    1.47     1.80     1.86     1.00     1.21   1.13    1.34   1.56
F13     6.39    6.37    6.01    2.96     3.02     1.90     1.52     1.39   1.00    2.57   2.49
F14     5.74    15.43   7.42    6.82     6.44     3.17     1.00     4.85   2.21    1.74   2.09
# best  0       0       0       1        0        2        4        2      2       2      1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR    0        0.1     0.2      0.3      0.4     0.5     0.6     0.7     0.8     0.9    1.0
F01     1.64     1.64    1.64     1.64     1.64    1.64    1.64    1.51    1.28    1.00   1.63
F02     1.87E4   1.00    1.00     1.00     1.00    1.00    1.00    1.00    1.00    1.00   1.00
F03     19.82    26.49   3.71     3.71     3.71    3.71    3.71    3.71    3.71    2.06   1.00
F04     1.26E5   1.00    1.87E4   1.00     1.00    1.00    1.00    1.00    1.00    1.00   1.00
F05     6.07E5   5.34    1.00     1.87E4   1.00    1.00    1.00    1.00    1.00    1.00   1.00
F06     108.92   246.31  365.04   345.40   338.57  234.65  143.49  45.28   23.30   1.00   25.75
F07     9.22     14.28   5.34     1.00     1.00    9.30    1.00    1.00    1.00    1.00   1.00
F08     18.10    1.08    7.72     1.08     1.08    1.08    35.94   1.00    1.00    1.00   1.00
F09     280.04   391.61  1.27     6.81     1.27    1.27    1.27    227.95  1.00    1.00   1.00
F10     551.65   8.76    1.76E3   11.70    1.64    1.64    1.64    1.64    274.48  1.00   1.00
F11     14.67    1.00    6.18     6.18     13.99   6.18    6.18    6.18    4.40    2.69   6.16
F12     7.68     1.00    11.10    2.73     10.21   2.73    2.73    2.73    2.73    2.73   4.73
F13     7.72     26.75   15.90    17.76    6.12    10.40   6.12    5.95    2.55    1.00   2.25
F14     537.16   14.28   5.34     1.00     1.00    7.13    1.00    1.00    1.00    1.00   1.00
# best  0        4       2        4        5       3       5       6       7       11     9

Different tuning parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or the problem setup change. In our work we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.
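Runtime rankings of this kind are usually produced with a simple wall-clock harness; a minimal sketch, in which `solver` and `problem` are hypothetical placeholders rather than interfaces from this paper:

```python
import time

def mean_runtime(solver, problem, n_runs=100):
    """Average wall-clock seconds per run; perf_counter is monotonic."""
    start = time.perf_counter()
    for _ in range(n_runs):
        solver(problem)
    return (time.perf_counter() - start) / n_runs
```

Averaging such timings over the same benchmark set for each of the ten algorithms is one way to arrive at a statement like "HSBA was the third fastest."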


Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR    0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.00     1.01
F02     747.56   7.87     5.25     3.43     5.09     7.10     3.07     1.87     1.60     1.00     1.01
F03     9.13     3.12E4   1.09     1.07     1.05     1.06     1.05     1.02     1.01     1.01     1.00
F04     2.10E6   1.00E4   3.95E4   9.48E3   5.97E3   2.95E3   1.51E3   1.50E3   46.94    1.00     2.37
F05     3.24E6   3.27E4   2.09E4   4.14E4   7.13E3   1.54E3   8.93E3   1.53E3   629.48   1.00     203.73
F06     1.00     5.94E4   422.04   632.15   6.00E4   1.92     1.91     1.91     1.91     1.91     1.91
F07     10.63    2.07E6   9.00     216.72   324.55   3.07E4   1.04     1.03     1.02     1.01     1.00
F08     59.50    9.88E3   3.04E4   141.80   217.01   324.68   3.07E4   1.07     1.07     1.03     1.00
F09     226.87   4.95E6   3.08     8.91     114.22   173.60   259.41   2.45E4   1.45     1.00     1.71
F10     257.37   2.83E4   9.35E5   1.37E4   97.09    66.56    99.25    149.40   1.39E4   1.00     2.73
F11     6.07     1.00     1.83E4   2.02     16.61    390.40   262.99   403.00   603.61   5.72E4   1.84
F12     3.00     2.79E4   3.60E4   1.00     2.82E4   270.95   194.14   130.77   200.40   300.16   2.84E4
F13     2.32     9.66     5.61E6   1.89E6   1.00     8.15     267.78   191.86   129.23   198.05   296.65
F14     184.66   3.84E5   1.17E4   1.85E3   1.00     5.71E3   25.11    55.39    39.60    26.57    41.49
# best  1        1        0        1        2        0        0        0        0        6        3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR     0        0.1    0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.00     1.00   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     3.91E3   1.00   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03     1.00     4.25   1.51     2.20     2.20     2.12     2.20     1.99     2.20     1.28     1.80
F04     1.00     2.55   65.54    2.55     2.55     2.55     2.55     2.55     2.55     2.55     2.55
F05     2.52     1.00   1.71     1.69E3   1.71     1.71     1.71     1.71     1.71     1.71     1.71
F06     1.00     59.87  34.18    22.85    46.52    23.44    43.54    44.84    36.88    18.99    8.12
F07     8.07     1.00   1.48     2.55     2.55     13.06    2.55     2.55     2.55     2.55     2.55
F08     5.43     1.00   1.92     1.00     1.00     1.00     11.45    1.00     1.00     1.00     1.00
F09     76.72    2.52   1.17     1.00     1.71     1.71     1.71     155.56   1.71     1.71     1.71
F10     929.10   1.48   1.00     4.91     2.55     2.55     2.55     2.22     2.35E3   2.55     2.55
F11     3.18E3   1.00   5.82E3   3.99E3   3.39E3   5.82E3   4.91E3   5.82E3   5.82E3   9.58E3   5.82E3
F12     305.06   1.00   537.28   97.34    187.58   97.34    97.34    97.34    97.34    97.34    398.27
F13     1.92     4.22   5.87     10.07    12.98    4.66     1.48     4.01     3.17     1.00     4.09
F14     88.12    1.00   1.48     2.55     2.55     4.91     2.55     2.55     2.55     2.55     2.55
# best  4        8      3        4        3        3        2        3        3        4        3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HSBA, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats, using the harmony search algorithm, during the process of updating the bats. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HSBA attempts to take the merits of the BA and the HS in order to avoid all bats getting trapped in inferior local optimal regions. The HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA uses the information in past solutions more effectively to generate better-quality solutions frequently, when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.


Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR     0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     4.07     1.00     1.00     4.52     2.77     2.57     3.41     5.36     3.69     4.89     1.03
F03     1.00     1.27E4   1.34     1.35     1.35     1.35     1.34     1.35     1.34     1.34     1.34
F04     1.15     81.27    9.39E3   1.00     1.01     1.01     1.29     1.00     1.00     1.01     1.01
F05     345.73   50.61    86.19    9.07E3   25.01    1.08     1.01     1.91     3.33     1.00     1.18
F06     1.00     5.42E6   4.27E4   4.74E4   5.48E6   577.88   577.88   577.88   577.88   577.87   577.87
F07     4.86     1.53     1.00     95.43    105.96   1.22E4   1.34     1.36     1.32     1.33     1.35
F08     12.69    76.00    8.76E3   17.03    69.13    76.72    8.85E3   1.10     1.05     1.04     1.00
F09     63.45    230.47   1.82     1.67     12.78    48.60    53.49    6.09E3   1.11     1.30     1.00
F10     133.55   12.51    1.26     1.97E3   11.48    4.64     16.45    18.93    1.99E3   1.00     1.88
F11     63.02    1.00     6.39E3   79.44    59.24    3.96E3   1.42E3   5.81E3   6.45E3   7.45E5   79.81
F12     3.90     8.81E3   8.90E3   1.00     8.90E3   44.40    47.78    17.25    70.11    77.84    8.99E3
F13     1.00     21.66    2.07E3   6.69     5.80     4.30     271.67   282.46   105.39   429.26   476.62
F14     46.14    1.00     36.10    55.93    1.22     6.40E3   42.15    32.89    34.81    12.72    51.29
# best  4        4        3        3        1        1        1        2        2        3        3

Based on the results of the ten approaches on the test problems, we can conclude that the HSBA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work includes adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization of the CEC 2010 test suite [52].

In the field of optimization there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach, HSBA, to solve practical engineering optimization problems, and, obviously, HSBA can become a fascinating method for real-world engineering optimization problems; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.



convergence rate initially while later it converges slowerand slower to the true objective function value At lastwe must point out that in [24] Simon compared BBOwith seven state-of-the-art EAs over 14 benchmark func-tions and a real-world sensor selection problem The resultsdemonstrated the good performance of BBO It is also indi-rectly demonstrated that our proposed hybrid metaheuristicmethod HSBA is a more powerful and efficient optimiza-tion algorithm than other population-based optimizationalgorithms

Journal of Applied Mathematics 15

Table 5 Best normalized optimization results in fourteen benchmark functions with different 119860 The numbers shown are the best resultsfound after 100 Monte Carlo simulations HSBA algorithm

119860

0 01 02 03 04 05 06 07 08 09 10F01 166 133 155 140 100 133 121 122 116 113 114F02 125E4 100 100 100 100 100 100 100 100 100 100F03 341 1115 100 100 100 100 100 100 100 100 100F04 273E4 100 125E4 104 104 104 104 104 104 100 100F05 503E5 530 133 168E4 100 100 100 100 100 100 100F06 100 256 547 473 1781 990 289 674 990 260 557F07 1014 1492 435 110 110 1537 101 100 100 100 100F08 3849 108 1406 108 108 108 9901 101 100 100 100F09 28518 40458 138 474 119 119 119 36791 101 100 100F10 66522 468 169E3 1527 118 118 118 118 121 100 100F11 2003 100 996 1154 3922 996 996 996 996 3897 849F12 1032 100 1322 323 1469 279 279 279 279 279 1931F13 146 415 284 447 115 156 100 100 100 100 100F14 52752 1357 395 101 115 1291 100 100 100 100 100

1 4 2 2 3 3 5 6 7 10 10

42 Influence of Control Parameter In [28] Dr Yang con-cluded that if the parameters in BA can be adjusted properlyit can outperform GA HS (harmony search) [26] and PSOThe choice of the control parameters is of vital importancefor different problems To investigate the influence of theloudness119860 pulse rate 119903 harmonymemory consideration rateHMCR and pitch adjustment rate PAR on the performanceof HSBA we carry out this experiment in the benchmarkproblem with different parameters All other parametersettings are kept unchanged (unless noted otherwise in thefollowing paragraph) The results are recorded in Tables 5ndash12 after 100 Monte Carlo runs Among them Tables 5 7 9and 11 show the best minima found by HSBA algorithm over100Monte Carlo runs Tables 6 8 10 and 12 show the averageminima found byHSBA algorithm averaged over 100MonteCarlo runs In other words Tables 5 7 9 11 and Tables 68 10 12 show the best and average performance of HSBAalgorithm respectively In each table the last row is the totalnumber of functions on which HSBA performs the best withsome parameters

421 Loudness 119860 Tables 5 and 6 recorded the resultsperformed in the benchmark problem with the loudness 119860 =0 01 02 09 10 and fixed pulse rate 119903 = 06 harmonymemory consideration rate HMCR= 095 and pitch adjust-ment rate PAR= 01 From Tables 5 and 6 obviously it can beseen that (i) for the three benchmark functions F02 F03 andF04 HSBA performs slightly differently that is to say thesethree benchmark functions are insensitive to the parameter119860 (ii) For benchmark functions F01 F06 F11 and F12HSBA performs better on smaller119860(lt05) (iii) However forfunctions F05 F07ndash10 F13 and F14 HSBA performs betteron bigger119860(gt05) In sum HSBA performs the best when119860is equal or very close to 10 So we set119860 = 095 which is veryclose to 09 and 10 in other experiments

422 Pulse Rate 119903 Tables 7 and 8 recorded the resultsperformed on the benchmark problem with the pulse rate119903 = 0 01 02 09 10 and fixed loudness119860 = 095harmony memory consideration rate HMCR=095 andpitch adjustment rate PAR= 01 From Tables 7 and 8 wecan evidently conclude that HSBA performs significantlybetter on bigger 119903(gt05) than on smaller 119903(lt05) and HSBAperforms the best when r is equal or very close to 06 So weset 119903 = 06 in other experiments

423 Harmony Memory Consideration Rate119867119872119862119877 Tables9 and 10 recorded the results performed in the bench-mark problem with the harmony memory consideration rateHMCR = 0 01 02 09 10 fixed loudness 119860 = 095pulse rate 119903 = 06 and pitch adjustment rate PAR = 01 FromTables 9 and 10 we can recognize that the function valuesevaluated by HSBA are bettersmaller on bigger HMCR(gt05) than on smaller r (lt05) and most benchmarks reachminimum when HMCR is equal or very close to 1 So we setHMCR=095 in other experiments

424 Pitch Adjustment Rate 119875119860119877 Tables 11 and 12 recordedthe results performed on the benchmark problems with thepitch adjustment rate PAR = 0 01 02 09 10 fixedloudness119860 = 095 pulse rate 119903 = 06 and harmonymemory consideration rate HMCR = 095 From Tables 11and 12 we can recognize that the function values for HSBAvary little with the increasing PAR and HSBA reachesoptimumminimum in most benchmarks when PAR is equalor very close to 01 So we set PAR = 01 in other experiments

43 Discussion In the HSBA the bats fly in the sky usingecholocation to find foodprey (ie best solutions) Fourother parameters are the loudness (119860) the rate of pulseemission (119903) harmony memory consideration rate (HMCR)

16 Journal of Applied Mathematics

Table 6 Mean normalized optimization results in fourteen benchmark functions with different A The numbers shown are the minimumobjective function values found by HSBA algorithm averaged over 100 Monte Carlo simulations

119860

0 01 02 03 04 05 06 07 08 09 10F01 101 101 100 100 100 100 100 100 100 100 100F02 152 100 146 117 108 137 188 165 167 179 148F03 669 1041 106 101 103 103 101 102 105 100 100F04 20131 126 410 421 333 272 100 519 291 100 311F05 59122 1682 425 587 101 414 2404 100 368 733 900F06 100 490E4 63799 25871 495E4 218 217 217 218 217 217F07 857 221E6 643 27234 11048 211E4 102 101 100 100 100F08 4980 114E4 172E4 16160 22459 9119 174E4 103 111 100 100F09 8237 173E6 106 314 7445 10310 4226 794E3 100 137 129F10 9031 145E3 245E5 232E3 2334 2234 3138 1289 234E3 140 100F11 498 100 315E4 233 1441 52023 44365 61626 24991 478E4 214F12 369 210E4 457E4 100 212E4 26635 23313 19880 27614 11201 214E4F13 190 825 448E6 214E6 100 615 25451 22275 18996 26386 10701F14 6691 195E5 117E4 124E3 2104 186E3 2940 2392 100 1835 2475

1 2 1 2 2 1 2 2 4 5 5

Table 7 Best normalized optimization results in fourteen benchmark functions with different 119903 The numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

119903

0 01 02 03 04 05 06 07 08 09 10F01 184 184 184 484 159 171 100 134 109 116 101F02 944E3 100 100 100 100 100 100 100 100 100 100F03 373 546 186 186 186 186 100 172 121 161 101F04 1142 111 2429 123 126 100 129 129 129 129 129F05 177E5 230 115 109E4 100 100 100 100 100 100 100F06 6244 2996 4858 1560 2634 780 182 100 3520 239 155F07 831 448 258 129 129 349 100 100 100 100 100F08 961 118 420 137 137 137 695 101 101 100 100F09 37394 44422 168 335 168 168 100 29198 101 101 101F10 38693 316 1397 463 158 158 100 158 30931 158 101F11 4255 100 1829 2135 2018 1111 1904 2135 1581 1876 1155F12 11440 100 14184 4448 12547 4448 4448 4448 4448 4448 6943F13 639 637 601 296 302 100 152 139 100 257 249F14 12052 404 233 100 116 340 100 100 116 116 116

0 3 1 2 2 4 8 5 4 4 4

and pitch adjustment rate (PAR) The appropriate update forthe pitch adjustment rate (PAR) and the rate of pulse emission(119903) balances the exploration and exploitation behavior of eachbat respectively as the loudness usually decreases once a bathas found its preysolution while the rate of pulse emissionincreases in order to raise the attack accuracy

For all of the standard benchmark functions that havebeen considered the HSBA has been demonstrated toperform better than or be equal to the standard BA and otheracclaimed state-of-the-art population-based algorithms withthe HSBA performing significantly better in some functionsThe HSBA performs excellently and efficiently because ofits ability to simultaneously carry out a local search still

searching globally at the same time It succeeds in doingthis due to the local search via harmony search algorithmand global search via bat algorithm concurrently A similarbehavior may be performed in the PSO by using multiswarmfrom a particle population initially [44] However HSBArsquosadvantages include performing simply and easily and onlyhaving four parameters to regulate The work carried outhere demonstrates the HSBA to be robust over all kinds ofbenchmark functions

Benchmark evaluation is a good way for verifying theperformance of the metaheuristic algorithms but it also haslimitations First we did not make any special effort to tunethe optimization algorithms in this section Different tuning

Journal of Applied Mathematics 17

Table 8 Mean normalized optimization results in fourteen benchmark functions with different 119903 The numbers shown are the minimumobjective function values found by HSBA algorithm averaged over 100 Monte Carlo simulations

119903

0 01 02 03 04 05 06 07 08 09 10F01 184 199 186 194 159 100 142 134 109 116 171F02 331 538 457 308 382 114 100 289 279 402 311F03 373 546 533 229 336 241 100 172 121 161 136F04 1142 111 2429 149E4 113E3 394E3 278E4 1966 17522 114 100F05 1609 12862 3269 2446 6440 1277 2959 7617 456 100 446F06 6244 2696 4858 1560 2634 780 182 100 3520 239 155F07 330 178 134 103 123 139 155 100 119 136 349F08 488 393 394 194 219 383 353 433 100 227 314F09 139 166 114 100 121 115 115 109 108 104 109F10 721 994 875 360 846 611 483 414 576 100 271F11 382 423 373 205 181 100 171 220 142 168 189F12 164 217 204 147 180 186 100 121 113 134 156F13 639 637 601 296 302 190 152 139 100 257 249F14 574 1543 742 682 644 317 100 485 221 174 209

0 0 0 1 0 2 4 2 2 2 1

Table 9 Best normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 164 164 164 164 164 164 164 151 128 100 163F02 187E4 100 100 100 100 100 100 100 100 100 100F03 1982 2649 371 371 371 371 371 371 371 206 100F04 126E5 100 187E4 100 100 100 100 100 100 100 100F05 607E5 534 100 187E4 100 100 100 100 100 100 100F06 10892 24631 36504 34540 33857 23465 14349 4528 2330 100 2575F07 922 1428 534 100 100 930 100 100 100 100 100F08 1810 108 772 108 108 108 3594 100 100 100 100F09 28004 39161 127 681 127 127 127 22795 100 100 100F10 55165 876 176E3 1170 164 164 164 164 27448 100 100F11 1467 100 618 618 1399 618 618 618 440 269 616F12 768 100 1110 273 1021 273 273 273 273 273 473F13 772 2675 1590 1776 612 1040 612 595 255 100 225F14 53716 1428 534 100 100 713 100 100 100 100 100

0 4 2 4 5 3 5 6 7 11 9

parameter values in the optimization algorithmsmight resultin significant differences in their performance Second real-world optimization problems may not have much of arelationship to benchmark functionsThird benchmark testsmight result in different conclusions if the grading criteria orproblem setup change In our work we examined the meanand best results obtained with a certain population size andafter a certain number of generations However we mightarrive at different conclusions if (for example) we change thegeneration limit or look at how many generations it takes toreach a certain function value or if we change the populationsize In spite of these caveats the benchmark results shown

here are promising for HSBA and indicate that this novelmethod might be able to find a niche among the plethora ofpopulation-based optimization algorithms

We note that CPU time is a bottleneck to the implemen-tation of many population-based optimization algorithmsIf an algorithm cannot converge fast it will be impracticalsince it would take too long to find an optimal or suboptimalsolution HSBA does not seem to require an unreasonableamount of computational effort of the ten optimization algo-rithms compared in this paper HSBA was the third fastestNevertheless findingmechanisms to accelerate HSBA couldbe an important area for further research

18 Journal of Applied Mathematics

Table 10 Mean normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 101 101 101 101 101 101 101 101 101 100 101F02 74756 787 525 343 509 710 307 187 160 100 101F03 913 312E4 109 107 105 106 105 102 101 101 100F04 2101198646 1001198644 3951198644 9481198643 5971198643 2951198643 1511198643 1501198643 4694 100 237F05 3241198646 3271198644 2091198644 4141198644 7131198643 1541198643 8931198643 1531198643 62948 100 20373F06 100 5941198644 42204 63215 6001198644 192 191 191 191 191 191F07 1063 2071198646 900 21672 32455 307E4 104 103 102 101 100F08 5950 9881198643 304E4 14180 21701 32468 307E4 107 107 103 100F09 22687 4951198646 308 891 11422 17360 25941 245E4 145 100 171F10 25737 2831198644 935E5 137E4 9709 6656 9925 14940 139E4 100 273F11 607 100 183E4 202 1661 39040 26299 40300 60361 572E4 184F12 300 279E4 360E4 100 282E4 27095 19414 13077 20040 30016 284E4F13 232 966 561E6 189E6 100 815 26778 19186 12923 19805 29665F14 18466 384E5 117E4 185E3 100 571E3 2511 5539 3960 2657 4149

1 1 0 1 2 0 0 0 0 6 3

Table 11 Best normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 391E3 100 100 100 100 100 100 100 100 100 100F03 100 425 151 220 220 212 220 199 220 128 180F04 100 255 6554 255 255 255 255 255 255 255 255F05 252 100 171 169E3 171 171 171 171 171 171 171F06 100 5987 3418 2285 4652 2344 4354 4484 3688 1899 812F07 807 100 148 255 255 1306 255 255 255 255 255F08 543 100 192 100 100 100 1145 100 100 100 100F09 7672 252 117 100 171 171 171 15556 171 171 171F10 92910 148 100 491 255 255 255 222 235E3 255 255F11 318E3 100 582E3 399E3 339E3 582E3 491E3 582E3 582E3 958E3 582E3F12 30506 100 53728 9734 18758 9734 9734 9734 9734 9734 39827F13 192 422 587 1007 1298 466 148 401 317 100 409F14 8812 100 148 255 255 491 255 255 255 255 255

4 8 3 4 3 3 2 3 3 4 3

5 Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA methodfor optimization problemWe improved the BAby combiningoriginal harmony search (HS) algorithm and evaluating theHSBA on multimodal numerical optimization problemsA novel type of BA model has been presented and animprovement is applied to mutate between bats using har-mony search algorithm during the process of bats updatingUsing the original configuration of the bat algorithm wegenerate the new harmonies based on the newly generatedbat each iteration after batrsquos position has been updated Thenew harmony vector substitutes the newly generated bat only

if it has better fitness This selection scheme is rather greedywhich often overtakes original HS and BA The HSBAattempts to take merits of the BA and the HS in order toavoid all bats getting trapped in inferior local optimal regionsThe HSBA enables the bats to have more diverse exemplarsto learn from as the bats are updated each iteration andalso form new harmonies to search in a larger search spaceThis new method can speed up the global convergence ratewithout losing the strong robustness of the basic BA Fromthe analysis of the experimental results we observe that theproposed HSBA makes good use of the information in pastsolutions more effectively to generate better quality solutionsfrequently when compared to the other population-based

Journal of Applied Mathematics 19

Table 12Mean normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 407 100 100 452 277 257 341 536 369 489 103F03 100 127E4 134 135 135 135 134 135 134 134 134F04 115 8127 939E3 100 101 101 129 100 100 101 101F05 34573 5061 8619 907E3 2501 108 101 191 333 100 118F06 100 542E6 427E4 474E4 548E6 57788 57788 57788 57788 57787 57787F07 486 153 100 9543 10596 122E4 134 136 132 133 135F08 1269 7600 876E3 1703 6913 7672 885E3 110 105 104 100F09 6345 23047 182 167 1278 4860 5349 609E3 111 130 100F10 13355 1251 126 197E3 1148 464 1645 1893 199E3 100 188F11 6302 100 639E3 7944 5924 396E3 142E3 581E3 645E3 745E5 7981F12 390 881E3 890E3 100 890E3 4440 4778 1725 7011 7784 899E3F13 100 2166 207E3 669 580 430 27167 28246 10539 42926 47662F14 4614 100 3610 5593 122 640E3 4215 3289 3481 1272 5129

4 4 3 3 1 1 1 2 2 3 3

optimization algorithms such as ACO BA BBO DE ESGA HS PSO and SGA Based on the results of the tenapproaches on the test problems we can conclude that theHSBA significantly improves the performances of the HSand the BA on most multimodal and unimodal problems

In this work 14 benchmark functions are used to evaluatethe performance of our approach we will test our approachon more problems such as the high-dimensional (119863 ge20) CEC 2010 test suit [45] and the real-world problemsMoreover we will compare HSBA with other metaheuristicmethod such as DLHS [46 47] SGHS [48] adaptive DE[49] CLPSO [50] and DMS-PSO [51] In addition we onlyconsider the unconstrained function optimization in thiswork Our future work consists on adding the diversity rulesinto HSBA for constrained optimization problems such asconstrained real-parameter optimization CEC 2010 test suit[52]

In the field of optimization there are many issues worthyof further study and efficient optimizationmethod should bedeveloped depending on the analysis of specific engineeringproblem Our future work will focus on the two issues onthe one hand we would apply our proposed approach HSBA to solve practical engineering optimization problemsand obviously HSBA can become a fascinating method forreal-world engineering optimization problems on the otherhand we would develop new meta-hybrid approach to solveoptimization problem

Acknowledgments

This work was supported by State Key Laboratory of LaserInteraction with Material Research Fund under Grant noSKLLIM0902-01 and Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no LXJJ-11-Q80

References

[1] G Wang L Guo A H Gandomi et al ldquoLevy-flight krill herdalgorithmrdquo Mathematical Problems in Engineering vol 2013Article ID 682073 14 pages 2013

[2] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[3] D H Ackley ldquoAn empirical study of bit vector functionoptimizationrdquo in Genetic Algorithms and Simulated AnnealingL Davis Ed pp 170ndash204 Pitman London UK 1987

[4] R Fletcher and M J D Powell ldquoA rapidly convergent descentmethod for minimizationrdquoThe Computer Journal vol 6 no 2pp 163ndash168 1963

[5] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[6] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[7] K A De Jong Analysis of the Behavior of a Class of GeneticAdaptive Systems University of Michigan 1975

[8] L A Rastrigin Extremal Control Systems NaukaMoscow Rus-sian 1974Theoretical Foundations of Engineering CyberneticsSeries

[9] S-K S Fan and J-M Chang ldquoDynamic multi-swarm particleswarm optimizer using parallel PC cluster systems for globaloptimization of large-scale multimodal functionsrdquo EngineeringOptimization vol 42 no 5 pp 431ndash451 2010

[10] W Gong Z Cai and C X Ling ldquoDEBBO a hybrid differentialevolution with biogeography-based optimization for globalnumerical optimizationrdquo Soft Computing vol 15 no 4 pp 645ndash665 2010

[11] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

20 Journal of Applied Mathematics

[12] GWang LGuoHDuan L Liu andHWang ldquoAhybridmeta-heuristic DECS algorithm for UCAVpath planningrdquo Journal ofInformation and Computational Science vol 9 no 16 pp 1ndash82012

[13] GWang L Guo H Duan L Liu HWang andM Shao ldquoPathplanning for uninhabited combat aerial vehicle using hybridmeta-heuristic DEBBO algorithmrdquo Advanced Science Engi-neering and Medicine vol 4 no 6 pp 550ndash564 2012

[14] G Wang L Guo H Duan H Wang L Liu and M Shao ldquoAhybrid meta-heuristic DECS algorithm for UCAV three-dimension path planningrdquo The Scientific World Journal vol2012 Article ID 583973 11 pages 2012

[15] H DuanW Zhao GWang and X Feng ldquoTest-sheet composi-tion using analytic hierarchy process and hybrid metaheuristicalgorithmTSBBOrdquoMathematical Problems in Engineering vol2012 Article ID 712752 22 pages 2012

[16] G Wang L Guo H Duan L Liu and H Wang ldquoDynamicdeployment of wireless sensor networks by biogeography basedoptimization algorithmrdquo Journal of Sensor and Actuator Net-works vol 1 no 2 pp 86ndash96 2012

[17] X S Yang A H Gandomi S Talatahari and A H AlaviMetaheuristics in Water Geotechnical and Transport Engineer-ing Elsevier Waltham Mass USA 2013

[18] D E GoldbergGenetic Algorithms in Search Optimization andMachine Learning Addison-Wesley New York NY USA 1998

[19] R Storn and K Price ldquoDifferential evolutionmdasha simple andefficient heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997

[20] S Das and P N Suganthan ldquoDifferential evolution a survey ofthe state-of-the-artrdquo IEEE Transactions on Evolutionary Com-putation vol 15 no 1 pp 4ndash31 2011

[21] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[22] A H Gandomi G J Yun X-S Yang and S Talatahari ldquoChaos-enhanced accelerated particle swarm optimizationrdquo Communi-cations in Nonlinear Science and Numerical Simulation vol 18no 2 pp 327ndash340 2013

[23] S Talatahari X S Yang R Sheikholeslami andAH GandomildquoOptimal parameter estimation for muskingum model usinghybrid CSS and PSO methodrdquo Journal of Computational andTheoretical Nanoscience In press

[24] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[25] G Wang L Guo H Duan L Liu H Wang and M ShaoldquoHybridizing harmony search with biogeography based opti-mization for global numerical optimizationrdquo Journal of Com-putational and Theoretical Nanoscience In press

[26] Z W Geem J H Kim and G V Loganathan ldquoA new heuristicoptimization algorithm harmony searchrdquo Simulation vol 76no 2 pp 60ndash68 2001

[27] G Wang L Guo H Duan H Wang L Liu and J Li ldquoIncor-porating mutation scheme into krill herd algorithm for globalnumerical optimizationrdquo Neural Computing and Applications2012

[28] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress Frome UK 2nd edition 2010

[29] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing amp Applications In press

[30] X S Yang andAHGandomi ldquoBat algorithm a novel approachfor global engineering optimizationrdquo Engineering Computa-tions vol 29 no 5 pp 464ndash483 2012

[31] G Wang L Guo H Duan L Liu and H Wang ldquoA Bat algo-rithm with mutation for UCAV path planningrdquo The ScientificWorld Journal vol 2012 Article ID 418946 15 pages 2012

[32] R Parpinelli and H Lopes ldquoNew inspirations in swarm intelli-gence a surveyrdquo International Journal of Bio-Inspired Computa-tion vol 3 no 1 pp 1ndash16 2011

[33] P W Tsai J S Pan B Y Liao M J Tsai and V Istanda ldquoBatalgorithm inspired algorithm for solving numerical optimiza-tion problemsrdquo Applied Mechanics and Materials vol 148 pp134ndash137 2012

[34] Wikipedia ldquoMathematical optimizationrdquo httpenwikipediaorgwikiNumerical optimization

[35] W Y Zhang S Xu and S J Li ldquoNecessary conditions for weaksharp minima in cone-constrained optimization problemsrdquoAbstract and Applied Analysis vol 2011 Article ID 909520 11pages 2012

[36] Wikipedia ldquoGlobal optimizationrdquo httpenwikipediaorgwikiGlobal optimization

[37] S Kirkpatrick C D Gelatt Jr and M P Vecchi ldquoOptimizationby simulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983

[38] M Dorigo and T Stutzle Ant Colony Optimization The MITPress Cambridge Mass USA 2004

[39] M Dorigo Optimization Learning and Natural AlgorithmsPolitecnico di Milano Italy 1992

[40] H G Beyer The Theory of Evolution Strategies Springer NewYork NY USA 2001

[41] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquo inParallel Problem Solving From Nature pp 683ndash691 1998

[42] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[43] M Friedman ldquoA comparison of alternative tests of significancefor the problem of 119898 rankingsrdquo The Annals of MathematicalStatistics vol 11 no 1 pp 86ndash92 1940

[44] R Brits A P Engelbrecht and F van den Bergh ldquoLocatingmultiple optima using particle swarm optimizationrdquo AppliedMathematics and Computation vol 189 no 2 pp 1859ndash18832007

[45] K Tang X Li P N Suganthan Z Yang and T Weise ldquoBench-mark functions for the CECrsquo2010 special session and competi-tion on large scale global optimizationrdquo Nature Inspired Com-putation and Applications Laboratory USTC Hefei China2010

[46] Q-K Pan P N Suganthan J J Liang and M F Tasgetiren ldquoAlocal-best harmony search algorithm with dynamic subpopula-tionsrdquo Engineering Optimization vol 42 no 2 pp 101ndash117 2010

[47] Q-K Pan P N Suganthan J J Liang and M F TasgetirenldquoA local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop schedulingproblemrdquo Expert Systems with Applications vol 38 no 4 pp3252ndash3259 2011

[48] Q-K Pan P N Suganthan M F Tasgetiren and J J LiangldquoA self-adaptive global best harmony search algorithm forcontinuous optimization problemsrdquo Applied Mathematics andComputation vol 216 no 3 pp 830ndash848 2010

Journal of Applied Mathematics 21

[49] S M Islam S Das S Ghosh S Roy and P N Suganthan ldquoAnadaptive differential evolution algorithm with novel mutationand crossover strategies for global numerical optimizationrdquoIEEE Transactions on Systems Man and Cybernetics B vol 42no 2 pp 482ndash500 2012

[50] J J Liang A K Qin P N Suganthan and S Baskar ldquoCompre-hensive learning particle swarm optimizer for global optimiza-tion of multimodal functionsrdquo IEEE Transactions on Evolution-ary Computation vol 10 no 3 pp 281ndash295 2006

[51] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010


Figure 11: Comparison of the performance of the different methods for the F11 Schwefel 2.22 function. (Plot not reproduced: benchmark function value, log scale 10^0 to 10^2, versus number of generations, 0 to 50, for ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA.)

Figure 12: Comparison of the performance of the different methods for the F12 Schwefel 2.21 function. (Plot not reproduced: benchmark function value, 30 to 80, versus number of generations, 0 to 50, for the same ten methods.)

We can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value found by BBO.

From the above analyses of Figures 1-14, we can come to the conclusion that our proposed hybrid metaheuristic algorithm, HSBA, significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are inferior only to HSBA; ACO, BBO, and SGA perform better than HSBA on benchmarks F04-F05, benchmark F12, and benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO has a faster convergence rate initially, while later it converges more and more slowly toward the true objective function value. Lastly, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem. Those results demonstrated the good performance of BBO, and they also indirectly demonstrate that our proposed hybrid metaheuristic method, HSBA, is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.

Figure 13: Comparison of the performance of the different methods for the F13 Sphere function. (Plot not reproduced: benchmark function value, 0 to 100, versus number of generations, 0 to 50, for ACO, BA, BBO, DE, ES, GA, HS, HSBA, PSO, and SGA.)

Figure 14: Comparison of the performance of the different methods for the F14 Step function. (Plot not reproduced: benchmark function value, log scale 10^3 to 10^4, versus number of generations, 0 to 50, for the same ten methods.)


Table 5: Best normalized optimization results on the fourteen benchmark functions with different loudness A. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm (values in each row are normalized so that the best setting reads 1.00; the last row counts the functions on which each setting performs best).

A        0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.66     1.33     1.55     1.40     1.00     1.33     1.21     1.22     1.16     1.13     1.14
F02      1.25E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03      3.41     11.15    1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F04      2.73E4   1.00     1.25E4   1.04     1.04     1.04     1.04     1.04     1.04     1.00     1.00
F05      5.03E5   5.30     1.33     1.68E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00
F06      1.00     2.56     5.47     4.73     17.81    9.90     2.89     6.74     9.90     2.60     5.57
F07      10.14    14.92    4.35     1.10     1.10     15.37    1.01     1.00     1.00     1.00     1.00
F08      38.49    1.08     14.06    1.08     1.08     1.08     99.01    1.01     1.00     1.00     1.00
F09      285.18   404.58   1.38     4.74     1.19     1.19     1.19     367.91   1.01     1.00     1.00
F10      665.22   4.68     1.69E3   15.27    1.18     1.18     1.18     1.18     1.21     1.00     1.00
F11      20.03    1.00     9.96     11.54    39.22    9.96     9.96     9.96     9.96     38.97    8.49
F12      10.32    1.00     13.22    3.23     14.69    2.79     2.79     2.79     2.79     2.79     19.31
F13      1.46     4.15     2.84     4.47     1.15     1.56     1.00     1.00     1.00     1.00     1.00
F14      527.52   13.57    3.95     1.01     1.15     12.91    1.00     1.00     1.00     1.00     1.00
# best   1        4        2        2        3        3        5        6        7        10       10

4.2. Influence of the Control Parameters. In [28], Yang concluded that, if its parameters are adjusted properly, BA can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, the pulse rate r, the harmony memory consideration rate HMCR, and the pitch adjustment rate PAR on the performance of HSBA, we carried out experiments on the benchmark problems with different parameter values. All other parameter settings were kept unchanged (unless noted otherwise in the following paragraphs). The results, recorded after 100 Monte Carlo runs, are given in Tables 5-12. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HSBA algorithm over the 100 Monte Carlo runs, while Tables 6, 8, 10, and 12 show the minima found by the HSBA algorithm averaged over the 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and the average performance of the HSBA algorithm, respectively. In each table, the last row is the total number of functions on which HSBA performs best with a given parameter setting.
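For concreteness, the bookkeeping behind this protocol can be summarized in a short sketch. The code below is only an illustration of the sweep-and-normalize procedure used to build Tables 5-12; hsba_minimize is a hypothetical stand-in for an HSBA implementation (not code from this paper), and the normalization (dividing each row by its smallest entry so that the best setting reads 1.00) is our reading of how the normalized tables are produced.

    import numpy as np

    def sweep_parameter(objective, name, values, runs=100, hsba_minimize=None):
        # Defaults held fixed while one control parameter is swept (Section 4.2).
        base = {"A": 0.95, "r": 0.6, "HMCR": 0.95, "PAR": 0.1}
        best = np.empty(len(values))
        mean = np.empty(len(values))
        for j, v in enumerate(values):
            params = dict(base, **{name: v})
            # 100 independent Monte Carlo runs per parameter setting.
            minima = [hsba_minimize(objective, **params) for _ in range(runs)]
            best[j] = np.min(minima)
            mean[j] = np.mean(minima)
        # Normalize so the best-performing setting reads 1.00 in each table row.
        return best / best.min(), mean / mean.min()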

4.2.1. Loudness A. Tables 5 and 6 record the results obtained on the benchmark problems with the loudness A = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6 it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HSBA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for the benchmark functions F01, F06, F11, and F12, HSBA performs better with smaller A (<0.5); (iii) however, for functions F05, F07-F10, F13, and F14, HSBA performs better with bigger A (>0.5). In sum, HSBA performs best when A is equal or very close to 1.0, so in the other experiments we set A = 0.95, which is very close to 0.9 and 1.0.

4.2.2. Pulse Rate r. Tables 7 and 8 record the results obtained on the benchmark problems with the pulse rate r = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8 we can evidently conclude that HSBA performs significantly better with bigger r (>0.5) than with smaller r (<0.5), and that HSBA performs best when r is equal or very close to 0.6, so we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate HMCR. Tables 9 and 10 record the results obtained on the benchmark problems with the harmony memory consideration rate HMCR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10 we can see that the function values reached by HSBA are better (smaller) with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and that most benchmarks reach their minimum when HMCR is equal or very close to 1, so we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate PAR. Tables 11 and 12 record the results obtained on the benchmark problems with the pitch adjustment rate PAR = 0, 0.1, 0.2, ..., 0.9, 1.0 and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12 we can see that the function values for HSBA vary little with increasing PAR, and that HSBA reaches its minimum on most benchmarks when PAR is equal or very close to 0.1, so we set PAR = 0.1 in the other experiments.
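To make the roles of HMCR and PAR explicit, the following sketch shows the standard harmony-search improvisation step [26] that HSBA borrows; the bandwidth bw and the search bounds are illustrative inputs, not values taken from our experiments.

    import random

    def improvise(memory, lower, upper, hmcr=0.95, par=0.1, bw=0.01):
        # memory: list of solution vectors (the harmony memory).
        new = []
        for d in range(len(lower)):
            if random.random() < hmcr:
                # Memory consideration: reuse a stored component...
                x = random.choice(memory)[d]
                if random.random() < par:
                    # ...and occasionally pitch-adjust it by a small step.
                    x += bw * random.uniform(-1.0, 1.0)
            else:
                # Random selection from the feasible range.
                x = random.uniform(lower[d], upper[d])
            new.append(min(max(x, lower[d]), upper[d]))  # keep within bounds
        return new

With a large HMCR, almost every component is drawn from memory, so the search exploits past solutions; PAR then controls how often those reused components are locally perturbed.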

4.3. Discussion. In the HSBA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). Four further parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR).


Table 6: Mean normalized optimization results on the fourteen benchmark functions with different loudness A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

A        0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.01     1.01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02      1.52     1.00     1.46     1.17     1.08     1.37     1.88     1.65     1.67     1.79     1.48
F03      6.69     10.41    1.06     1.01     1.03     1.03     1.01     1.02     1.05     1.00     1.00
F04      201.31   1.26     4.10     4.21     3.33     2.72     1.00     5.19     2.91     1.00     3.11
F05      591.22   16.82    4.25     5.87     1.01     4.14     24.04    1.00     3.68     7.33     9.00
F06      1.00     4.90E4   637.99   258.71   4.95E4   2.18     2.17     2.17     2.18     2.17     2.17
F07      8.57     2.21E6   6.43     272.34   110.48   2.11E4   1.02     1.01     1.00     1.00     1.00
F08      49.80    1.14E4   1.72E4   161.60   224.59   91.19    1.74E4   1.03     1.11     1.00     1.00
F09      82.37    1.73E6   1.06     3.14     74.45    103.10   42.26    7.94E3   1.00     1.37     1.29
F10      90.31    1.45E3   2.45E5   2.32E3   23.34    22.34    31.38    12.89    2.34E3   1.40     1.00
F11      4.98     1.00     3.15E4   2.33     14.41    520.23   443.65   616.26   249.91   4.78E4   2.14
F12      3.69     2.10E4   4.57E4   1.00     2.12E4   266.35   233.13   198.80   276.14   112.01   2.14E4
F13      1.90     8.25     4.48E6   2.14E6   1.00     6.15     254.51   222.75   189.96   263.86   107.01
F14      66.91    1.95E5   1.17E4   1.24E3   21.04    1.86E3   29.40    23.92    1.00     18.35    24.75
# best   1        2        1        2        2        1        2        2        4        5        5

Table 7: Best normalized optimization results on the fourteen benchmark functions with different pulse rate r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

r        0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.84     1.84     1.84     4.84     1.59     1.71     1.00     1.34     1.09     1.16     1.01
F02      9.44E3   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03      3.73     5.46     1.86     1.86     1.86     1.86     1.00     1.72     1.21     1.61     1.01
F04      11.42    1.11     24.29    1.23     1.26     1.00     1.29     1.29     1.29     1.29     1.29
F05      1.77E5   2.30     1.15     1.09E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00
F06      62.44    29.96    48.58    15.60    26.34    7.80     1.82     1.00     35.20    2.39     1.55
F07      8.31     4.48     2.58     1.29     1.29     3.49     1.00     1.00     1.00     1.00     1.00
F08      9.61     1.18     4.20     1.37     1.37     1.37     6.95     1.01     1.01     1.00     1.00
F09      373.94   444.22   1.68     3.35     1.68     1.68     1.00     291.98   1.01     1.01     1.01
F10      386.93   3.16     13.97    4.63     1.58     1.58     1.00     1.58     309.31   1.58     1.01
F11      42.55    1.00     18.29    21.35    20.18    11.11    19.04    21.35    15.81    18.76    11.55
F12      114.40   1.00     141.84   44.48    125.47   44.48    44.48    44.48    44.48    44.48    69.43
F13      6.39     6.37     6.01     2.96     3.02     1.00     1.52     1.39     1.00     2.57     2.49
F14      120.52   4.04     2.33     1.00     1.16     3.40     1.00     1.00     1.16     1.16     1.16
# best   0        3        1        2        2        4        8        5        4        4        4

The appropriate updating of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balances the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.
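For reference, in Yang's original BA [28] these two quantities are typically updated, for bat i at iteration t, by rules of the standard form

    A_i^{t+1} = \alpha A_i^{t}, \qquad r_i^{t+1} = r_i^{0} \left[ 1 - \exp(-\gamma t) \right], \qquad 0 < \alpha < 1, \ \gamma > 0,

so that A_i^t decreases toward 0 and r_i^t increases toward r_i^0 as the search proceeds, shifting each bat from exploration toward exploitation.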

For all of the standard benchmark functions that have been considered, HSBA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with HSBA performing significantly better on some functions. HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this by running the local search via the harmony search algorithm and the global search via the bat algorithm concurrently. A similar behavior may be obtained in PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include its simplicity and ease of implementation, and the fact that it has only four parameters to regulate. The work carried out here demonstrates HSBA to be robust over all kinds of benchmark functions.
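A minimal sketch of one HSBA generation, as we read the scheme, is given below: a standard BA move, followed by an HS-style improvisation seeded with the newly generated bat (using the improvise operator sketched above), kept only under greedy acceptance. All numeric settings are illustrative assumptions, not the exact implementation used for the experiments.

    import random

    def hsba_generation(objective, bats, vel, best, lower, upper,
                        fmin=0.0, fmax=1.0, r=0.6,
                        hmcr=0.95, par=0.1, bw=0.01):
        for i in range(len(bats)):
            # Bat algorithm move: frequency, velocity, and position update.
            f = fmin + (fmax - fmin) * random.random()
            vel[i] = [v + (x - b) * f for v, x, b in zip(vel[i], bats[i], best)]
            new = [min(max(x + v, lo), hi)
                   for x, v, lo, hi in zip(bats[i], vel[i], lower, upper)]
            if random.random() > r:
                # Local random walk around the current best solution.
                new = [min(max(b + 0.001 * random.gauss(0.0, 1.0), lo), hi)
                       for b, lo, hi in zip(best, lower, upper)]
            # HS mutation seeded by the freshly generated bat, accepted greedily:
            # the harmony replaces the new bat only if it has better fitness.
            harmony = improvise(bats + [new], lower, upper, hmcr, par, bw)
            if objective(harmony) < objective(new):
                new = harmony
            if objective(new) < objective(bats[i]):
                bats[i] = new
            if objective(bats[i]) < objective(best):
                best = list(bats[i])
        return best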

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section; different tuning-parameter values in the optimization algorithms might result in significant differences in their performance.


Table 8: Mean normalized optimization results on the fourteen benchmark functions with different pulse rate r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

r        0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.84     1.99     1.86     1.94     1.59     1.00     1.42     1.34     1.09     1.16     1.71
F02      3.31     5.38     4.57     3.08     3.82     1.14     1.00     2.89     2.79     4.02     3.11
F03      3.73     5.46     5.33     2.29     3.36     2.41     1.00     1.72     1.21     1.61     1.36
F04      11.42    1.11     24.29    1.49E4   1.13E3   3.94E3   2.78E4   19.66    175.22   1.14     1.00
F05      16.09    128.62   32.69    24.46    64.40    12.77    29.59    76.17    4.56     1.00     4.46
F06      62.44    26.96    48.58    15.60    26.34    7.80     1.82     1.00     35.20    2.39     1.55
F07      3.30     1.78     1.34     1.03     1.23     1.39     1.55     1.00     1.19     1.36     3.49
F08      4.88     3.93     3.94     1.94     2.19     3.83     3.53     4.33     1.00     2.27     3.14
F09      1.39     1.66     1.14     1.00     1.21     1.15     1.15     1.09     1.08     1.04     1.09
F10      7.21     9.94     8.75     3.60     8.46     6.11     4.83     4.14     5.76     1.00     2.71
F11      3.82     4.23     3.73     2.05     1.81     1.00     1.71     2.20     1.42     1.68     1.89
F12      1.64     2.17     2.04     1.47     1.80     1.86     1.00     1.21     1.13     1.34     1.56
F13      6.39     6.37     6.01     2.96     3.02     1.90     1.52     1.39     1.00     2.57     2.49
F14      5.74     15.43    7.42     6.82     6.44     3.17     1.00     4.85     2.21     1.74     2.09
# best   0        0        0        1        0        2        4        2        2        2        1

Table 9: Best normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

HMCR     0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.64     1.64     1.64     1.64     1.64     1.64     1.64     1.51     1.28     1.00     1.63
F02      1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03      19.82    26.49    3.71     3.71     3.71     3.71     3.71     3.71     3.71     2.06     1.00
F04      1.26E5   1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F05      6.07E5   5.34     1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00
F06      108.92   246.31   365.04   345.40   338.57   234.65   143.49   45.28    23.30    1.00     25.75
F07      9.22     14.28    5.34     1.00     1.00     9.30     1.00     1.00     1.00     1.00     1.00
F08      18.10    1.08     7.72     1.08     1.08     1.08     35.94    1.00     1.00     1.00     1.00
F09      280.04   391.61   1.27     6.81     1.27     1.27     1.27     227.95   1.00     1.00     1.00
F10      551.65   8.76     1.76E3   11.70    1.64     1.64     1.64     1.64     274.48   1.00     1.00
F11      14.67    1.00     6.18     6.18     13.99    6.18     6.18     6.18     4.40     2.69     6.16
F12      7.68     1.00     11.10    2.73     10.21    2.73     2.73     2.73     2.73     2.73     4.73
F13      7.72     26.75    15.90    17.76    6.12     10.40    6.12     5.95     2.55     1.00     2.25
F14      537.16   14.28    5.34     1.00     1.00     7.13     1.00     1.00     1.00     1.00     1.00
# best   0        4        2        4        5        3        5        6        7        11       9

Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck for the implementation of many population-based optimization algorithms. If an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort: of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.


Table 10: Mean normalized optimization results on the fourteen benchmark functions with different HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

HMCR     0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.00     1.01
F02      747.56   7.87     5.25     3.43     5.09     7.10     3.07     1.87     1.60     1.00     1.01
F03      9.13     3.12E4   1.09     1.07     1.05     1.06     1.05     1.02     1.01     1.01     1.00
F04      2.10E6   1.00E4   3.95E4   9.48E3   5.97E3   2.95E3   1.51E3   1.50E3   46.94    1.00     2.37
F05      3.24E6   3.27E4   2.09E4   4.14E4   7.13E3   1.54E3   8.93E3   1.53E3   629.48   1.00     203.73
F06      1.00     5.94E4   422.04   632.15   6.00E4   1.92     1.91     1.91     1.91     1.91     1.91
F07      10.63    2.07E6   9.00     216.72   324.55   3.07E4   1.04     1.03     1.02     1.01     1.00
F08      59.50    9.88E3   3.04E4   141.80   217.01   324.68   3.07E4   1.07     1.07     1.03     1.00
F09      226.87   4.95E6   3.08     8.91     114.22   173.60   259.41   2.45E4   1.45     1.00     1.71
F10      257.37   2.83E4   9.35E5   1.37E4   97.09    66.56    99.25    149.40   1.39E4   1.00     2.73
F11      6.07     1.00     1.83E4   2.02     16.61    390.40   262.99   403.00   603.61   5.72E4   1.84
F12      3.00     2.79E4   3.60E4   1.00     2.82E4   270.95   194.14   130.77   200.40   300.16   2.84E4
F13      2.32     9.66     5.61E6   1.89E6   1.00     8.15     267.78   191.86   129.23   198.05   296.65
F14      184.66   3.84E5   1.17E4   1.85E3   1.00     5.71E3   25.11    55.39    39.60    26.57    41.49
# best   1        1        0        1        2        0        0        0        0        6        3

Table 11: Best normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

PAR      0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02      3.91E3   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03      1.00     4.25     1.51     2.20     2.20     2.12     2.20     1.99     2.20     1.28     1.80
F04      1.00     2.55     65.54    2.55     2.55     2.55     2.55     2.55     2.55     2.55     2.55
F05      2.52     1.00     1.71     1.69E3   1.71     1.71     1.71     1.71     1.71     1.71     1.71
F06      1.00     59.87    34.18    22.85    46.52    23.44    43.54    44.84    36.88    18.99    8.12
F07      8.07     1.00     1.48     2.55     2.55     13.06    2.55     2.55     2.55     2.55     2.55
F08      5.43     1.00     1.92     1.00     1.00     1.00     11.45    1.00     1.00     1.00     1.00
F09      76.72    2.52     1.17     1.00     1.71     1.71     1.71     155.56   1.71     1.71     1.71
F10      929.10   1.48     1.00     4.91     2.55     2.55     2.55     2.22     2.35E3   2.55     2.55
F11      3.18E3   1.00     5.82E3   3.99E3   3.39E3   5.82E3   4.91E3   5.82E3   5.82E3   9.58E3   5.82E3
F12      305.06   1.00     537.28   97.34    187.58   97.34    97.34    97.34    97.34    97.34    398.27
F13      1.92     4.22     5.87     10.07    12.98    4.66     1.48     4.01     3.17     1.00     4.09
F14      88.12    1.00     1.48     2.55     2.55     4.91     2.55     2.55     2.55     2.55     2.55
# best   4        8        3        4        3        3        2        3        3        4        3

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HSBA, for optimization problems. We improved BA by combining it with the original harmony search (HS) algorithm and evaluated HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the bat-updating process. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated; the new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. HSBA attempts to take the merits of BA and HS in order to avoid all bats getting trapped in inferior local optimal regions. HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA makes use of the information in past solutions more effectively, generating better-quality solutions more frequently than the other population-based optimization algorithms considered, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.


Table 12: Mean normalized optimization results on the fourteen benchmark functions with different PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

PAR      0        0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01      1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02      4.07     1.00     1.00     4.52     2.77     2.57     3.41     5.36     3.69     4.89     1.03
F03      1.00     1.27E4   1.34     1.35     1.35     1.35     1.34     1.35     1.34     1.34     1.34
F04      1.15     81.27    9.39E3   1.00     1.01     1.01     1.29     1.00     1.00     1.01     1.01
F05      345.73   50.61    86.19    9.07E3   25.01    1.08     1.01     1.91     3.33     1.00     1.18
F06      1.00     5.42E6   4.27E4   4.74E4   5.48E6   577.88   577.88   577.88   577.88   577.87   577.87
F07      4.86     1.53     1.00     95.43    105.96   1.22E4   1.34     1.36     1.32     1.33     1.35
F08      12.69    76.00    8.76E3   17.03    69.13    76.72    8.85E3   1.10     1.05     1.04     1.00
F09      63.45    230.47   1.82     1.67     12.78    48.60    53.49    6.09E3   1.11     1.30     1.00
F10      133.55   12.51    1.26     1.97E3   11.48    4.64     16.45    18.93    1.99E3   1.00     1.88
F11      63.02    1.00     6.39E3   79.44    59.24    3.96E3   1.42E3   5.81E3   6.45E3   7.45E5   79.81
F12      3.90     8.81E3   8.90E3   1.00     8.90E3   44.40    47.78    17.25    70.11    77.84    8.99E3
F13      1.00     21.66    2.07E3   6.69     5.80     4.30     271.67   282.46   105.39   429.26   476.62
F14      46.14    1.00     36.10    55.93    1.22     6.40E3   42.15    32.89    34.81    12.72    51.29
# best   4        4        3        3        1        1        1        2        2        3        3

Based on the results of the ten approaches on the test problems, we can conclude that HSBA significantly improves the performance of both HS and BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; in the future, we will test our approach on more problems, such as the high-dimensional (D >= 20) CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules to HSBA for constrained optimization problems, such as the constrained real-parameter optimization CEC 2010 test suite [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply our proposed approach, HSBA, to solve practical engineering optimization problems, where it can become a fascinating method for real-world engineering optimization; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by the Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170-204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163-168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11-39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431-451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645-665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123-144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1-8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550-564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86-96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4-31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327-340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60-68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464-483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1-16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134-137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671-680, 1983.

[38] M. Dorigo and T. Stutzle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683-691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86-92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859-1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101-117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252-3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830-848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482-500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735-3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Journal of Applied Mathematics 15

Table 5 Best normalized optimization results in fourteen benchmark functions with different 119860 The numbers shown are the best resultsfound after 100 Monte Carlo simulations HSBA algorithm

119860

0 01 02 03 04 05 06 07 08 09 10F01 166 133 155 140 100 133 121 122 116 113 114F02 125E4 100 100 100 100 100 100 100 100 100 100F03 341 1115 100 100 100 100 100 100 100 100 100F04 273E4 100 125E4 104 104 104 104 104 104 100 100F05 503E5 530 133 168E4 100 100 100 100 100 100 100F06 100 256 547 473 1781 990 289 674 990 260 557F07 1014 1492 435 110 110 1537 101 100 100 100 100F08 3849 108 1406 108 108 108 9901 101 100 100 100F09 28518 40458 138 474 119 119 119 36791 101 100 100F10 66522 468 169E3 1527 118 118 118 118 121 100 100F11 2003 100 996 1154 3922 996 996 996 996 3897 849F12 1032 100 1322 323 1469 279 279 279 279 279 1931F13 146 415 284 447 115 156 100 100 100 100 100F14 52752 1357 395 101 115 1291 100 100 100 100 100

1 4 2 2 3 3 5 6 7 10 10

42 Influence of Control Parameter In [28] Dr Yang con-cluded that if the parameters in BA can be adjusted properlyit can outperform GA HS (harmony search) [26] and PSOThe choice of the control parameters is of vital importancefor different problems To investigate the influence of theloudness119860 pulse rate 119903 harmonymemory consideration rateHMCR and pitch adjustment rate PAR on the performanceof HSBA we carry out this experiment in the benchmarkproblem with different parameters All other parametersettings are kept unchanged (unless noted otherwise in thefollowing paragraph) The results are recorded in Tables 5ndash12 after 100 Monte Carlo runs Among them Tables 5 7 9and 11 show the best minima found by HSBA algorithm over100Monte Carlo runs Tables 6 8 10 and 12 show the averageminima found byHSBA algorithm averaged over 100MonteCarlo runs In other words Tables 5 7 9 11 and Tables 68 10 12 show the best and average performance of HSBAalgorithm respectively In each table the last row is the totalnumber of functions on which HSBA performs the best withsome parameters

421 Loudness 119860 Tables 5 and 6 recorded the resultsperformed in the benchmark problem with the loudness 119860 =0 01 02 09 10 and fixed pulse rate 119903 = 06 harmonymemory consideration rate HMCR= 095 and pitch adjust-ment rate PAR= 01 From Tables 5 and 6 obviously it can beseen that (i) for the three benchmark functions F02 F03 andF04 HSBA performs slightly differently that is to say thesethree benchmark functions are insensitive to the parameter119860 (ii) For benchmark functions F01 F06 F11 and F12HSBA performs better on smaller119860(lt05) (iii) However forfunctions F05 F07ndash10 F13 and F14 HSBA performs betteron bigger119860(gt05) In sum HSBA performs the best when119860is equal or very close to 10 So we set119860 = 095 which is veryclose to 09 and 10 in other experiments

422 Pulse Rate 119903 Tables 7 and 8 recorded the resultsperformed on the benchmark problem with the pulse rate119903 = 0 01 02 09 10 and fixed loudness119860 = 095harmony memory consideration rate HMCR=095 andpitch adjustment rate PAR= 01 From Tables 7 and 8 wecan evidently conclude that HSBA performs significantlybetter on bigger 119903(gt05) than on smaller 119903(lt05) and HSBAperforms the best when r is equal or very close to 06 So weset 119903 = 06 in other experiments

423 Harmony Memory Consideration Rate119867119872119862119877 Tables9 and 10 recorded the results performed in the bench-mark problem with the harmony memory consideration rateHMCR = 0 01 02 09 10 fixed loudness 119860 = 095pulse rate 119903 = 06 and pitch adjustment rate PAR = 01 FromTables 9 and 10 we can recognize that the function valuesevaluated by HSBA are bettersmaller on bigger HMCR(gt05) than on smaller r (lt05) and most benchmarks reachminimum when HMCR is equal or very close to 1 So we setHMCR=095 in other experiments

424 Pitch Adjustment Rate 119875119860119877 Tables 11 and 12 recordedthe results performed on the benchmark problems with thepitch adjustment rate PAR = 0 01 02 09 10 fixedloudness119860 = 095 pulse rate 119903 = 06 and harmonymemory consideration rate HMCR = 095 From Tables 11and 12 we can recognize that the function values for HSBAvary little with the increasing PAR and HSBA reachesoptimumminimum in most benchmarks when PAR is equalor very close to 01 So we set PAR = 01 in other experiments

43 Discussion In the HSBA the bats fly in the sky usingecholocation to find foodprey (ie best solutions) Fourother parameters are the loudness (119860) the rate of pulseemission (119903) harmony memory consideration rate (HMCR)

16 Journal of Applied Mathematics

Table 6 Mean normalized optimization results in fourteen benchmark functions with different A The numbers shown are the minimumobjective function values found by HSBA algorithm averaged over 100 Monte Carlo simulations

119860

0 01 02 03 04 05 06 07 08 09 10F01 101 101 100 100 100 100 100 100 100 100 100F02 152 100 146 117 108 137 188 165 167 179 148F03 669 1041 106 101 103 103 101 102 105 100 100F04 20131 126 410 421 333 272 100 519 291 100 311F05 59122 1682 425 587 101 414 2404 100 368 733 900F06 100 490E4 63799 25871 495E4 218 217 217 218 217 217F07 857 221E6 643 27234 11048 211E4 102 101 100 100 100F08 4980 114E4 172E4 16160 22459 9119 174E4 103 111 100 100F09 8237 173E6 106 314 7445 10310 4226 794E3 100 137 129F10 9031 145E3 245E5 232E3 2334 2234 3138 1289 234E3 140 100F11 498 100 315E4 233 1441 52023 44365 61626 24991 478E4 214F12 369 210E4 457E4 100 212E4 26635 23313 19880 27614 11201 214E4F13 190 825 448E6 214E6 100 615 25451 22275 18996 26386 10701F14 6691 195E5 117E4 124E3 2104 186E3 2940 2392 100 1835 2475

1 2 1 2 2 1 2 2 4 5 5

Table 7 Best normalized optimization results in fourteen benchmark functions with different 119903 The numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

119903

0 01 02 03 04 05 06 07 08 09 10F01 184 184 184 484 159 171 100 134 109 116 101F02 944E3 100 100 100 100 100 100 100 100 100 100F03 373 546 186 186 186 186 100 172 121 161 101F04 1142 111 2429 123 126 100 129 129 129 129 129F05 177E5 230 115 109E4 100 100 100 100 100 100 100F06 6244 2996 4858 1560 2634 780 182 100 3520 239 155F07 831 448 258 129 129 349 100 100 100 100 100F08 961 118 420 137 137 137 695 101 101 100 100F09 37394 44422 168 335 168 168 100 29198 101 101 101F10 38693 316 1397 463 158 158 100 158 30931 158 101F11 4255 100 1829 2135 2018 1111 1904 2135 1581 1876 1155F12 11440 100 14184 4448 12547 4448 4448 4448 4448 4448 6943F13 639 637 601 296 302 100 152 139 100 257 249F14 12052 404 233 100 116 340 100 100 116 116 116

0 3 1 2 2 4 8 5 4 4 4

and pitch adjustment rate (PAR) The appropriate update forthe pitch adjustment rate (PAR) and the rate of pulse emission(119903) balances the exploration and exploitation behavior of eachbat respectively as the loudness usually decreases once a bathas found its preysolution while the rate of pulse emissionincreases in order to raise the attack accuracy

For all of the standard benchmark functions that havebeen considered the HSBA has been demonstrated toperform better than or be equal to the standard BA and otheracclaimed state-of-the-art population-based algorithms withthe HSBA performing significantly better in some functionsThe HSBA performs excellently and efficiently because ofits ability to simultaneously carry out a local search still

searching globally at the same time It succeeds in doingthis due to the local search via harmony search algorithmand global search via bat algorithm concurrently A similarbehavior may be performed in the PSO by using multiswarmfrom a particle population initially [44] However HSBArsquosadvantages include performing simply and easily and onlyhaving four parameters to regulate The work carried outhere demonstrates the HSBA to be robust over all kinds ofbenchmark functions

Benchmark evaluation is a good way for verifying theperformance of the metaheuristic algorithms but it also haslimitations First we did not make any special effort to tunethe optimization algorithms in this section Different tuning

Journal of Applied Mathematics 17

Table 8 Mean normalized optimization results in fourteen benchmark functions with different 119903 The numbers shown are the minimumobjective function values found by HSBA algorithm averaged over 100 Monte Carlo simulations

119903

0 01 02 03 04 05 06 07 08 09 10F01 184 199 186 194 159 100 142 134 109 116 171F02 331 538 457 308 382 114 100 289 279 402 311F03 373 546 533 229 336 241 100 172 121 161 136F04 1142 111 2429 149E4 113E3 394E3 278E4 1966 17522 114 100F05 1609 12862 3269 2446 6440 1277 2959 7617 456 100 446F06 6244 2696 4858 1560 2634 780 182 100 3520 239 155F07 330 178 134 103 123 139 155 100 119 136 349F08 488 393 394 194 219 383 353 433 100 227 314F09 139 166 114 100 121 115 115 109 108 104 109F10 721 994 875 360 846 611 483 414 576 100 271F11 382 423 373 205 181 100 171 220 142 168 189F12 164 217 204 147 180 186 100 121 113 134 156F13 639 637 601 296 302 190 152 139 100 257 249F14 574 1543 742 682 644 317 100 485 221 174 209

0 0 0 1 0 2 4 2 2 2 1

Table 9 Best normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 164 164 164 164 164 164 164 151 128 100 163F02 187E4 100 100 100 100 100 100 100 100 100 100F03 1982 2649 371 371 371 371 371 371 371 206 100F04 126E5 100 187E4 100 100 100 100 100 100 100 100F05 607E5 534 100 187E4 100 100 100 100 100 100 100F06 10892 24631 36504 34540 33857 23465 14349 4528 2330 100 2575F07 922 1428 534 100 100 930 100 100 100 100 100F08 1810 108 772 108 108 108 3594 100 100 100 100F09 28004 39161 127 681 127 127 127 22795 100 100 100F10 55165 876 176E3 1170 164 164 164 164 27448 100 100F11 1467 100 618 618 1399 618 618 618 440 269 616F12 768 100 1110 273 1021 273 273 273 273 273 473F13 772 2675 1590 1776 612 1040 612 595 255 100 225F14 53716 1428 534 100 100 713 100 100 100 100 100

0 4 2 4 5 3 5 6 7 11 9

parameter values in the optimization algorithmsmight resultin significant differences in their performance Second real-world optimization problems may not have much of arelationship to benchmark functionsThird benchmark testsmight result in different conclusions if the grading criteria orproblem setup change In our work we examined the meanand best results obtained with a certain population size andafter a certain number of generations However we mightarrive at different conclusions if (for example) we change thegeneration limit or look at how many generations it takes toreach a certain function value or if we change the populationsize In spite of these caveats the benchmark results shown

here are promising for HSBA and indicate that this novelmethod might be able to find a niche among the plethora ofpopulation-based optimization algorithms

We note that CPU time is a bottleneck to the implemen-tation of many population-based optimization algorithmsIf an algorithm cannot converge fast it will be impracticalsince it would take too long to find an optimal or suboptimalsolution HSBA does not seem to require an unreasonableamount of computational effort of the ten optimization algo-rithms compared in this paper HSBA was the third fastestNevertheless findingmechanisms to accelerate HSBA couldbe an important area for further research

18 Journal of Applied Mathematics

Table 10 Mean normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 101 101 101 101 101 101 101 101 101 100 101F02 74756 787 525 343 509 710 307 187 160 100 101F03 913 312E4 109 107 105 106 105 102 101 101 100F04 2101198646 1001198644 3951198644 9481198643 5971198643 2951198643 1511198643 1501198643 4694 100 237F05 3241198646 3271198644 2091198644 4141198644 7131198643 1541198643 8931198643 1531198643 62948 100 20373F06 100 5941198644 42204 63215 6001198644 192 191 191 191 191 191F07 1063 2071198646 900 21672 32455 307E4 104 103 102 101 100F08 5950 9881198643 304E4 14180 21701 32468 307E4 107 107 103 100F09 22687 4951198646 308 891 11422 17360 25941 245E4 145 100 171F10 25737 2831198644 935E5 137E4 9709 6656 9925 14940 139E4 100 273F11 607 100 183E4 202 1661 39040 26299 40300 60361 572E4 184F12 300 279E4 360E4 100 282E4 27095 19414 13077 20040 30016 284E4F13 232 966 561E6 189E6 100 815 26778 19186 12923 19805 29665F14 18466 384E5 117E4 185E3 100 571E3 2511 5539 3960 2657 4149

1 1 0 1 2 0 0 0 0 6 3

Table 11 Best normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 391E3 100 100 100 100 100 100 100 100 100 100F03 100 425 151 220 220 212 220 199 220 128 180F04 100 255 6554 255 255 255 255 255 255 255 255F05 252 100 171 169E3 171 171 171 171 171 171 171F06 100 5987 3418 2285 4652 2344 4354 4484 3688 1899 812F07 807 100 148 255 255 1306 255 255 255 255 255F08 543 100 192 100 100 100 1145 100 100 100 100F09 7672 252 117 100 171 171 171 15556 171 171 171F10 92910 148 100 491 255 255 255 222 235E3 255 255F11 318E3 100 582E3 399E3 339E3 582E3 491E3 582E3 582E3 958E3 582E3F12 30506 100 53728 9734 18758 9734 9734 9734 9734 9734 39827F13 192 422 587 1007 1298 466 148 401 317 100 409F14 8812 100 148 255 255 491 255 255 255 255 255

4 8 3 4 3 3 2 3 3 4 3

5 Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA methodfor optimization problemWe improved the BAby combiningoriginal harmony search (HS) algorithm and evaluating theHSBA on multimodal numerical optimization problemsA novel type of BA model has been presented and animprovement is applied to mutate between bats using har-mony search algorithm during the process of bats updatingUsing the original configuration of the bat algorithm wegenerate the new harmonies based on the newly generatedbat each iteration after batrsquos position has been updated Thenew harmony vector substitutes the newly generated bat only

if it has better fitness This selection scheme is rather greedywhich often overtakes original HS and BA The HSBAattempts to take merits of the BA and the HS in order toavoid all bats getting trapped in inferior local optimal regionsThe HSBA enables the bats to have more diverse exemplarsto learn from as the bats are updated each iteration andalso form new harmonies to search in a larger search spaceThis new method can speed up the global convergence ratewithout losing the strong robustness of the basic BA Fromthe analysis of the experimental results we observe that theproposed HSBA makes good use of the information in pastsolutions more effectively to generate better quality solutionsfrequently when compared to the other population-based

Journal of Applied Mathematics 19

Table 12Mean normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 407 100 100 452 277 257 341 536 369 489 103F03 100 127E4 134 135 135 135 134 135 134 134 134F04 115 8127 939E3 100 101 101 129 100 100 101 101F05 34573 5061 8619 907E3 2501 108 101 191 333 100 118F06 100 542E6 427E4 474E4 548E6 57788 57788 57788 57788 57787 57787F07 486 153 100 9543 10596 122E4 134 136 132 133 135F08 1269 7600 876E3 1703 6913 7672 885E3 110 105 104 100F09 6345 23047 182 167 1278 4860 5349 609E3 111 130 100F10 13355 1251 126 197E3 1148 464 1645 1893 199E3 100 188F11 6302 100 639E3 7944 5924 396E3 142E3 581E3 645E3 745E5 7981F12 390 881E3 890E3 100 890E3 4440 4778 1725 7011 7784 899E3F13 100 2166 207E3 669 580 430 27167 28246 10539 42926 47662F14 4614 100 3610 5593 122 640E3 4215 3289 3481 1272 5129

4 4 3 3 1 1 1 2 2 3 3

optimization algorithms such as ACO BA BBO DE ESGA HS PSO and SGA Based on the results of the tenapproaches on the test problems we can conclude that theHSBA significantly improves the performances of the HSand the BA on most multimodal and unimodal problems

In this work 14 benchmark functions are used to evaluatethe performance of our approach we will test our approachon more problems such as the high-dimensional (119863 ge20) CEC 2010 test suit [45] and the real-world problemsMoreover we will compare HSBA with other metaheuristicmethod such as DLHS [46 47] SGHS [48] adaptive DE[49] CLPSO [50] and DMS-PSO [51] In addition we onlyconsider the unconstrained function optimization in thiswork Our future work consists on adding the diversity rulesinto HSBA for constrained optimization problems such asconstrained real-parameter optimization CEC 2010 test suit[52]

In the field of optimization there are many issues worthyof further study and efficient optimizationmethod should bedeveloped depending on the analysis of specific engineeringproblem Our future work will focus on the two issues onthe one hand we would apply our proposed approach HSBA to solve practical engineering optimization problemsand obviously HSBA can become a fascinating method forreal-world engineering optimization problems on the otherhand we would develop new meta-hybrid approach to solveoptimization problem

Acknowledgments

This work was supported by State Key Laboratory of LaserInteraction with Material Research Fund under Grant noSKLLIM0902-01 and Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no LXJJ-11-Q80

References

[1] G Wang L Guo A H Gandomi et al ldquoLevy-flight krill herdalgorithmrdquo Mathematical Problems in Engineering vol 2013Article ID 682073 14 pages 2013

[2] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[3] D H Ackley ldquoAn empirical study of bit vector functionoptimizationrdquo in Genetic Algorithms and Simulated AnnealingL Davis Ed pp 170ndash204 Pitman London UK 1987

[4] R Fletcher and M J D Powell ldquoA rapidly convergent descentmethod for minimizationrdquoThe Computer Journal vol 6 no 2pp 163ndash168 1963

[5] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[6] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[7] K A De Jong Analysis of the Behavior of a Class of GeneticAdaptive Systems University of Michigan 1975

[8] L A Rastrigin Extremal Control Systems NaukaMoscow Rus-sian 1974Theoretical Foundations of Engineering CyberneticsSeries

[9] S-K S Fan and J-M Chang ldquoDynamic multi-swarm particleswarm optimizer using parallel PC cluster systems for globaloptimization of large-scale multimodal functionsrdquo EngineeringOptimization vol 42 no 5 pp 431ndash451 2010

[10] W Gong Z Cai and C X Ling ldquoDEBBO a hybrid differentialevolution with biogeography-based optimization for globalnumerical optimizationrdquo Soft Computing vol 15 no 4 pp 645ndash665 2010

[11] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

20 Journal of Applied Mathematics

[12] GWang LGuoHDuan L Liu andHWang ldquoAhybridmeta-heuristic DECS algorithm for UCAVpath planningrdquo Journal ofInformation and Computational Science vol 9 no 16 pp 1ndash82012

[13] GWang L Guo H Duan L Liu HWang andM Shao ldquoPathplanning for uninhabited combat aerial vehicle using hybridmeta-heuristic DEBBO algorithmrdquo Advanced Science Engi-neering and Medicine vol 4 no 6 pp 550ndash564 2012

[14] G Wang L Guo H Duan H Wang L Liu and M Shao ldquoAhybrid meta-heuristic DECS algorithm for UCAV three-dimension path planningrdquo The Scientific World Journal vol2012 Article ID 583973 11 pages 2012

[15] H DuanW Zhao GWang and X Feng ldquoTest-sheet composi-tion using analytic hierarchy process and hybrid metaheuristicalgorithmTSBBOrdquoMathematical Problems in Engineering vol2012 Article ID 712752 22 pages 2012

[16] G Wang L Guo H Duan L Liu and H Wang ldquoDynamicdeployment of wireless sensor networks by biogeography basedoptimization algorithmrdquo Journal of Sensor and Actuator Net-works vol 1 no 2 pp 86ndash96 2012

[17] X S Yang A H Gandomi S Talatahari and A H AlaviMetaheuristics in Water Geotechnical and Transport Engineer-ing Elsevier Waltham Mass USA 2013

[18] D E GoldbergGenetic Algorithms in Search Optimization andMachine Learning Addison-Wesley New York NY USA 1998

[19] R Storn and K Price ldquoDifferential evolutionmdasha simple andefficient heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997

[20] S Das and P N Suganthan ldquoDifferential evolution a survey ofthe state-of-the-artrdquo IEEE Transactions on Evolutionary Com-putation vol 15 no 1 pp 4ndash31 2011

[21] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[22] A H Gandomi G J Yun X-S Yang and S Talatahari ldquoChaos-enhanced accelerated particle swarm optimizationrdquo Communi-cations in Nonlinear Science and Numerical Simulation vol 18no 2 pp 327ndash340 2013


Table 6: Mean normalized optimization results on fourteen benchmark functions with different values of the loudness A. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

        A = 0    0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.01     1.01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     1.52     1.00     1.46     1.17     1.08     1.37     1.88     1.65     1.67     1.79     1.48
F03     6.69     10.41    1.06     1.01     1.03     1.03     1.01     1.02     1.05     1.00     1.00
F04     201.31   1.26     4.10     4.21     3.33     2.72     1.00     5.19     2.91     1.00     3.11
F05     591.22   16.82    4.25     5.87     1.01     4.14     24.04    1.00     3.68     7.33     9.00
F06     1.00     4.90E4   637.99   258.71   4.95E4   2.18     2.17     2.17     2.18     2.17     2.17
F07     8.57     2.21E6   6.43     272.34   110.48   2.11E4   1.02     1.01     1.00     1.00     1.00
F08     49.80    1.14E4   1.72E4   161.60   224.59   91.19    1.74E4   1.03     1.11     1.00     1.00
F09     82.37    1.73E6   1.06     3.14     74.45    103.10   42.26    7.94E3   1.00     1.37     1.29
F10     90.31    1.45E3   2.45E5   2.32E3   23.34    22.34    31.38    12.89    2.34E3   1.40     1.00
F11     4.98     1.00     3.15E4   2.33     14.41    520.23   443.65   616.26   249.91   4.78E4   2.14
F12     3.69     2.10E4   4.57E4   1.00     2.12E4   266.35   233.13   198.80   276.14   112.01   2.14E4
F13     1.90     8.25     4.48E6   2.14E6   1.00     6.15     254.51   222.75   189.96   263.86   107.01
F14     66.91    1.95E5   1.17E4   1.24E3   21.04    1.86E3   29.40    23.92    1.00     18.35    24.75
Total   1        2        1        2        2        1        2        2        4        5        5

The final row of each of Tables 6-12 counts the functions, ties included, for which the corresponding parameter setting yields the best result.
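The normalized entries can plausibly be read as follows: within each row, the raw results appear to be divided by the row minimum, so that the best-performing setting scores exactly 1.00 and at least one 1.00 appears per row. A minimal sketch of this convention in Python; the raw values below are hypothetical, not taken from the tables:

```python
import numpy as np

# Hypothetical raw mean-minimum objective values for one benchmark
# function across the eleven tested settings of a parameter.
raw = np.array([6.4e-3, 6.5e-3, 3.2e-3, 3.2e-3, 3.2e-3, 3.2e-3,
                3.2e-3, 3.2e-3, 3.2e-3, 3.2e-3, 3.2e-3])

# Row-wise normalization: the best setting maps to exactly 1.00.
normalized = raw / raw.min()
print(np.round(normalized, 2))  # [2.   2.03 1.   1.   ... 1.  ]
```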

Table 7: Best normalized optimization results on fourteen benchmark functions with different values of the pulse rate r. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

        r = 0    0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.84     1.84     1.84     4.84     1.59     1.71     1.00     1.34     1.09     1.16     1.01
F02     9.44E3   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03     3.73     5.46     1.86     1.86     1.86     1.86     1.00     1.72     1.21     1.61     1.01
F04     11.42    1.11     24.29    1.23     1.26     1.00     1.29     1.29     1.29     1.29     1.29
F05     1.77E5   2.30     1.15     1.09E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00
F06     62.44    29.96    48.58    15.60    26.34    7.80     1.82     1.00     35.20    2.39     1.55
F07     8.31     4.48     2.58     1.29     1.29     3.49     1.00     1.00     1.00     1.00     1.00
F08     9.61     1.18     4.20     1.37     1.37     1.37     6.95     1.01     1.01     1.00     1.00
F09     373.94   444.22   1.68     3.35     1.68     1.68     1.00     291.98   1.01     1.01     1.01
F10     386.93   3.16     13.97    4.63     1.58     1.58     1.00     1.58     309.31   1.58     1.01
F11     42.55    1.00     18.29    21.35    20.18    11.11    19.04    21.35    15.81    18.76    11.55
F12     114.40   1.00     141.84   44.48    125.47   44.48    44.48    44.48    44.48    44.48    69.43
F13     6.39     6.37     6.01     2.96     3.02     1.00     1.52     1.39     1.00     2.57     2.49
F14     120.52   4.04     2.33     1.00     1.16     3.40     1.00     1.00     1.16     1.16     1.16
Total   0        3        1        2        2        4        8        5        4        4        4

The appropriate updates for the loudness (A) and the rate of pulse emission (r) balance the exploration and exploitation behavior of each bat, respectively: the loudness usually decreases once a bat has found its prey (a solution), while the rate of pulse emission increases in order to raise the attack accuracy.
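For reference, these schedules take the following standard form in the basic bat algorithm [28, 30] (quoted here as background, with constants α and γ, a common choice being α = γ = 0.9):

```latex
A_i^{t+1} = \alpha\, A_i^{t}, \qquad
r_i^{t+1} = r_i^{0}\,\bigl[1 - \exp(-\gamma t)\bigr],
\qquad 0 < \alpha < 1,\; \gamma > 0,
```

so that A_i^t tends to 0 and r_i^t tends to r_i^0 as the iteration count t grows.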

For all of the standard benchmark functions that have been considered, the HSBA has been demonstrated to perform better than, or equal to, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HSBA performing significantly better on some functions. The HSBA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time: it runs the local search, via the harmony search algorithm, and the global search, via the bat algorithm, concurrently. A similar behavior may be obtained in the PSO by using multiple swarms drawn from an initial particle population [44]. However, HSBA's advantages include being simple and easy to implement and having only four parameters to regulate (the loudness A, the pulse rate r, HMCR, and PAR). The work carried out here demonstrates the HSBA to be robust over all kinds of benchmark functions.
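To make this interplay concrete, the following is a minimal Python sketch of one HSBA iteration for a minimization problem: a standard BA position update followed by an HS-style mutation of the new bat, with greedy replacement. It illustrates the mechanism described above and is not the authors' exact implementation; the helper names, parameter defaults, and local random-walk scale are assumptions.

```python
import numpy as np

def hsba_iteration(bats, velocities, fitness, objective, lower, upper,
                   f_min=0.0, f_max=2.0, loudness=0.9, pulse_rate=0.5,
                   hmcr=0.9, par=0.3, bw=0.01):
    """One illustrative HSBA iteration (minimization): BA update, then an
    HS-style pitch-adjustment mutation, then greedy replacement."""
    n, d = bats.shape
    best = bats[np.argmin(fitness)].copy()  # current global best bat
    for i in range(n):
        # Bat algorithm step (global search): frequency, velocity, position.
        freq = f_min + (f_max - f_min) * np.random.rand()
        velocities[i] = velocities[i] + (bats[i] - best) * freq
        candidate = np.clip(bats[i] + velocities[i], lower, upper)
        # Occasional local random walk around the best bat, gated by the
        # pulse emission rate, as in the basic BA.
        if np.random.rand() > pulse_rate:
            candidate = np.clip(best + 0.001 * np.random.randn(d),
                                lower, upper)
        # Harmony search step (local search): mutate the new bat
        # dimension-wise via harmony memory consideration (HMCR)
        # and pitch adjustment (PAR).
        harmony = candidate.copy()
        for j in range(d):
            if np.random.rand() < hmcr:
                harmony[j] = bats[np.random.randint(n), j]
                if np.random.rand() < par:
                    harmony[j] += bw * (2.0 * np.random.rand() - 1.0)
        harmony = np.clip(harmony, lower, upper)
        # Greedy selection: the harmony replaces the new bat only if fitter.
        if objective(harmony) < objective(candidate):
            candidate = harmony
        # Standard BA acceptance, gated by the loudness.
        f_cand = objective(candidate)
        if f_cand < fitness[i] and np.random.rand() < loudness:
            bats[i], fitness[i] = candidate, f_cand
    return bats, velocities, fitness
```

Driving this with, say, a 10-bat population on the sphere function (objective = lambda x: np.sum(x**2)) for a few hundred iterations reproduces the overall control flow, though not the exact parameter schedules, of the published method.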

Table 8: Mean normalized optimization results on fourteen benchmark functions with different values of the pulse rate r. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

        r = 0    0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.84     1.99     1.86     1.94     1.59     1.00     1.42     1.34     1.09     1.16     1.71
F02     3.31     5.38     4.57     3.08     3.82     1.14     1.00     2.89     2.79     4.02     3.11
F03     3.73     5.46     5.33     2.29     3.36     2.41     1.00     1.72     1.21     1.61     1.36
F04     11.42    1.11     24.29    1.49E4   1.13E3   3.94E3   2.78E4   19.66    175.22   1.14     1.00
F05     16.09    128.62   32.69    24.46    64.40    12.77    29.59    76.17    4.56     1.00     4.46
F06     62.44    26.96    48.58    15.60    26.34    7.80     1.82     1.00     35.20    2.39     1.55
F07     3.30     1.78     1.34     1.03     1.23     1.39     1.55     1.00     1.19     1.36     3.49
F08     4.88     3.93     3.94     1.94     2.19     3.83     3.53     4.33     1.00     2.27     3.14
F09     1.39     1.66     1.14     1.00     1.21     1.15     1.15     1.09     1.08     1.04     1.09
F10     7.21     9.94     8.75     3.60     8.46     6.11     4.83     4.14     5.76     1.00     2.71
F11     3.82     4.23     3.73     2.05     1.81     1.00     1.71     2.20     1.42     1.68     1.89
F12     1.64     2.17     2.04     1.47     1.80     1.86     1.00     1.21     1.13     1.34     1.56
F13     6.39     6.37     6.01     2.96     3.02     1.90     1.52     1.39     1.00     2.57     2.49
F14     5.74     15.43    7.42     6.82     6.44     3.17     1.00     4.85     2.21     1.74     2.09
Total   0        0        0        1        0        2        4        2        2        2        1

Table 9: Best normalized optimization results on fourteen benchmark functions with different values of HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

        HMCR = 0 0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.64     1.64     1.64     1.64     1.64     1.64     1.64     1.51     1.28     1.00     1.63
F02     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03     19.82    26.49    3.71     3.71     3.71     3.71     3.71     3.71     3.71     2.06     1.00
F04     1.26E5   1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F05     6.07E5   5.34     1.00     1.87E4   1.00     1.00     1.00     1.00     1.00     1.00     1.00
F06     108.92   246.31   365.04   345.40   338.57   234.65   143.49   45.28    23.30    1.00     25.75
F07     9.22     14.28    5.34     1.00     1.00     9.30     1.00     1.00     1.00     1.00     1.00
F08     18.10    1.08     7.72     1.08     1.08     1.08     35.94    1.00     1.00     1.00     1.00
F09     280.04   391.61   1.27     6.81     1.27     1.27     1.27     227.95   1.00     1.00     1.00
F10     551.65   8.76     1.76E3   11.70    1.64     1.64     1.64     1.64     274.48   1.00     1.00
F11     14.67    1.00     6.18     6.18     13.99    6.18     6.18     6.18     4.40     2.69     6.16
F12     7.68     1.00     11.10    2.73     10.21    2.73     2.73     2.73     2.73     2.73     4.73
F13     7.72     26.75    15.90    17.76    6.12     10.40    6.12     5.95     2.55     1.00     2.25
F14     537.16   14.28    5.34     1.00     1.00     7.13     1.00     1.00     1.00     1.00     1.00
Total   0        4        2        4        5        3        5        6        7        11       9

Table 10: Mean normalized optimization results on fourteen benchmark functions with different values of HMCR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

        HMCR = 0 0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.01     1.00     1.01
F02     747.56   7.87     5.25     3.43     5.09     7.10     3.07     1.87     1.60     1.00     1.01
F03     9.13     3.12E4   1.09     1.07     1.05     1.06     1.05     1.02     1.01     1.01     1.00
F04     2.10E6   1.00E4   3.95E4   9.48E3   5.97E3   2.95E3   1.51E3   1.50E3   46.94    1.00     2.37
F05     3.24E6   3.27E4   2.09E4   4.14E4   7.13E3   1.54E3   8.93E3   1.53E3   629.48   1.00     203.73
F06     1.00     5.94E4   422.04   632.15   6.00E4   1.92     1.91     1.91     1.91     1.91     1.91
F07     10.63    2.07E6   9.00     216.72   324.55   3.07E4   1.04     1.03     1.02     1.01     1.00
F08     59.50    9.88E3   3.04E4   141.80   217.01   324.68   3.07E4   1.07     1.07     1.03     1.00
F09     226.87   4.95E6   3.08     8.91     114.22   173.60   259.41   2.45E4   1.45     1.00     1.71
F10     257.37   2.83E4   9.35E5   1.37E4   97.09    66.56    99.25    149.40   1.39E4   1.00     2.73
F11     6.07     1.00     1.83E4   2.02     16.61    390.40   262.99   403.00   603.61   5.72E4   1.84
F12     3.00     2.79E4   3.60E4   1.00     2.82E4   270.95   194.14   130.77   200.40   300.16   2.84E4
F13     2.32     9.66     5.61E6   1.89E6   1.00     8.15     267.78   191.86   129.23   198.05   296.65
F14     184.66   3.84E5   1.17E4   1.85E3   1.00     5.71E3   25.11    55.39    39.60    26.57    41.49
Total   1        1        0        1        2        0        0        0        0        6        3

Table 11: Best normalized optimization results on fourteen benchmark functions with different values of PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of the HSBA algorithm.

        PAR = 0  0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     3.91E3   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F03     1.00     4.25     1.51     2.20     2.20     2.12     2.20     1.99     2.20     1.28     1.80
F04     1.00     2.55     65.54    2.55     2.55     2.55     2.55     2.55     2.55     2.55     2.55
F05     2.52     1.00     1.71     1.69E3   1.71     1.71     1.71     1.71     1.71     1.71     1.71
F06     1.00     59.87    34.18    22.85    46.52    23.44    43.54    44.84    36.88    18.99    8.12
F07     8.07     1.00     1.48     2.55     2.55     13.06    2.55     2.55     2.55     2.55     2.55
F08     5.43     1.00     1.92     1.00     1.00     1.00     11.45    1.00     1.00     1.00     1.00
F09     76.72    2.52     1.17     1.00     1.71     1.71     1.71     155.56   1.71     1.71     1.71
F10     929.10   1.48     1.00     4.91     2.55     2.55     2.55     2.22     2.35E3   2.55     2.55
F11     3.18E3   1.00     5.82E3   3.99E3   3.39E3   5.82E3   4.91E3   5.82E3   5.82E3   9.58E3   5.82E3
F12     305.06   1.00     537.28   97.34    187.58   97.34    97.34    97.34    97.34    97.34    398.27
F13     1.92     4.22     5.87     10.07    12.98    4.66     1.48     4.01     3.17     1.00     4.09
F14     88.12    1.00     1.48     2.55     2.55     4.91     2.55     2.55     2.55     2.55     2.55
Total   4        8        3        4        3        3        2        3        3        4        3

Table 12: Mean normalized optimization results on fourteen benchmark functions with different values of PAR. The numbers shown are the minimum objective function values found by the HSBA algorithm, averaged over 100 Monte Carlo simulations.

        PAR = 0  0.1      0.2      0.3      0.4      0.5      0.6      0.7      0.8      0.9      1.0
F01     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02     4.07     1.00     1.00     4.52     2.77     2.57     3.41     5.36     3.69     4.89     1.03
F03     1.00     1.27E4   1.34     1.35     1.35     1.35     1.34     1.35     1.34     1.34     1.34
F04     1.15     81.27    9.39E3   1.00     1.01     1.01     1.29     1.00     1.00     1.01     1.01
F05     345.73   50.61    86.19    9.07E3   25.01    1.08     1.01     1.91     3.33     1.00     1.18
F06     1.00     5.42E6   4.27E4   4.74E4   5.48E6   577.88   577.88   577.88   577.88   577.87   577.87
F07     4.86     1.53     1.00     95.43    105.96   1.22E4   1.34     1.36     1.32     1.33     1.35
F08     12.69    76.00    8.76E3   17.03    69.13    76.72    8.85E3   1.10     1.05     1.04     1.00
F09     63.45    230.47   1.82     1.67     12.78    48.60    53.49    6.09E3   1.11     1.30     1.00
F10     133.55   12.51    1.26     1.97E3   11.48    4.64     16.45    18.93    1.99E3   1.00     1.88
F11     63.02    1.00     6.39E3   79.44    59.24    3.96E3   1.42E3   5.81E3   6.45E3   7.45E5   79.81
F12     3.90     8.81E3   8.90E3   1.00     8.90E3   44.40    47.78    17.25    70.11    77.84    8.99E3
F13     1.00     21.66    2.07E3   6.69     5.80     4.30     271.67   282.46   105.39   429.26   476.62
F14     46.14    1.00     36.10    55.93    1.22     6.40E3   42.15    32.89    34.81    12.72    51.29
Total   4        4        3        3        1        1        1        2        2        3        3

Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section; different tuning parameter values might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or the problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations; we might arrive at different conclusions if, for example, we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HSBA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck in the implementation of many population-based optimization algorithms: if an algorithm cannot converge quickly, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HSBA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HSBA was the third fastest. Nevertheless, finding mechanisms to accelerate HSBA could be an important area for further research.
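Runtime comparisons of this kind can be made with a plain wall-clock harness; the sketch below shows one possible way, where run_algorithm stands in for any of the ten optimizers and is a hypothetical callable, not part of the paper's code:

```python
import time

def time_optimizer(run_algorithm, objective, runs=100):
    """Average wall-clock time of an optimizer over repeated Monte Carlo runs.

    run_algorithm(objective) is assumed to execute one full optimization
    run and return the best objective value it found.
    """
    start = time.perf_counter()
    best_values = [run_algorithm(objective) for _ in range(runs)]
    elapsed = time.perf_counter() - start
    return elapsed / runs, min(best_values)
```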


5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic method, HSBA, for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated HSBA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the bat-updating process. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration, after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness.

This selection scheme is rather greedy, and it often outperforms the original HS and BA. The HSBA attempts to combine the merits of the BA and the HS in order to prevent all bats from getting trapped in inferior local optimal regions. The HSBA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HSBA uses the information in past solutions more effectively to generate better-quality solutions more frequently, when compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.


Based on the results of the ten approaches on the test problems, we can conclude that the HSBA significantly improves the performances of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional (D ≥ 20) CEC 2010 test suite [45], and on real-world problems. Moreover, we will compare HSBA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we considered only unconstrained function optimization in this work; our future work will consist of adding diversity rules into HSBA for constrained optimization problems, such as the CEC 2010 constrained real-parameter optimization test suite [52].

In the field of optimization, many issues are worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem at hand. Our future work will focus on two issues: on the one hand, we will apply our proposed approach, HSBA, to practical engineering optimization problems, where it may prove an appealing method for real-world applications; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.

References

[1] G. Wang, L. Guo, A. H. Gandomi et al., "Lévy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[2] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[3] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987.

[4] R. Fletcher and M. J. D. Powell, "A rapidly convergent descent method for minimization," The Computer Journal, vol. 6, no. 2, pp. 163–168, 1963.

[5] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[6] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[7] K. A. De Jong, Analysis of the Behavior of a Class of Genetic Adaptive Systems, University of Michigan, 1975.

[8] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974, Theoretical Foundations of Engineering Cybernetics Series.

[9] S.-K. S. Fan and J.-M. Chang, "Dynamic multi-swarm particle swarm optimizer using parallel PC cluster systems for global optimization of large-scale multimodal functions," Engineering Optimization, vol. 42, no. 5, pp. 431–451, 2010.

[10] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.

[11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience. In press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience. In press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications. In press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.

[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Journal of Applied Mathematics 17

Table 8 Mean normalized optimization results in fourteen benchmark functions with different 119903 The numbers shown are the minimumobjective function values found by HSBA algorithm averaged over 100 Monte Carlo simulations

119903

0 01 02 03 04 05 06 07 08 09 10F01 184 199 186 194 159 100 142 134 109 116 171F02 331 538 457 308 382 114 100 289 279 402 311F03 373 546 533 229 336 241 100 172 121 161 136F04 1142 111 2429 149E4 113E3 394E3 278E4 1966 17522 114 100F05 1609 12862 3269 2446 6440 1277 2959 7617 456 100 446F06 6244 2696 4858 1560 2634 780 182 100 3520 239 155F07 330 178 134 103 123 139 155 100 119 136 349F08 488 393 394 194 219 383 353 433 100 227 314F09 139 166 114 100 121 115 115 109 108 104 109F10 721 994 875 360 846 611 483 414 576 100 271F11 382 423 373 205 181 100 171 220 142 168 189F12 164 217 204 147 180 186 100 121 113 134 156F13 639 637 601 296 302 190 152 139 100 257 249F14 574 1543 742 682 644 317 100 485 221 174 209

0 0 0 1 0 2 4 2 2 2 1

Table 9 Best normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 164 164 164 164 164 164 164 151 128 100 163F02 187E4 100 100 100 100 100 100 100 100 100 100F03 1982 2649 371 371 371 371 371 371 371 206 100F04 126E5 100 187E4 100 100 100 100 100 100 100 100F05 607E5 534 100 187E4 100 100 100 100 100 100 100F06 10892 24631 36504 34540 33857 23465 14349 4528 2330 100 2575F07 922 1428 534 100 100 930 100 100 100 100 100F08 1810 108 772 108 108 108 3594 100 100 100 100F09 28004 39161 127 681 127 127 127 22795 100 100 100F10 55165 876 176E3 1170 164 164 164 164 27448 100 100F11 1467 100 618 618 1399 618 618 618 440 269 616F12 768 100 1110 273 1021 273 273 273 273 273 473F13 772 2675 1590 1776 612 1040 612 595 255 100 225F14 53716 1428 534 100 100 713 100 100 100 100 100

0 4 2 4 5 3 5 6 7 11 9

parameter values in the optimization algorithmsmight resultin significant differences in their performance Second real-world optimization problems may not have much of arelationship to benchmark functionsThird benchmark testsmight result in different conclusions if the grading criteria orproblem setup change In our work we examined the meanand best results obtained with a certain population size andafter a certain number of generations However we mightarrive at different conclusions if (for example) we change thegeneration limit or look at how many generations it takes toreach a certain function value or if we change the populationsize In spite of these caveats the benchmark results shown

here are promising for HSBA and indicate that this novelmethod might be able to find a niche among the plethora ofpopulation-based optimization algorithms

We note that CPU time is a bottleneck to the implemen-tation of many population-based optimization algorithmsIf an algorithm cannot converge fast it will be impracticalsince it would take too long to find an optimal or suboptimalsolution HSBA does not seem to require an unreasonableamount of computational effort of the ten optimization algo-rithms compared in this paper HSBA was the third fastestNevertheless findingmechanisms to accelerate HSBA couldbe an important area for further research

18 Journal of Applied Mathematics

Table 10 Mean normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 101 101 101 101 101 101 101 101 101 100 101F02 74756 787 525 343 509 710 307 187 160 100 101F03 913 312E4 109 107 105 106 105 102 101 101 100F04 2101198646 1001198644 3951198644 9481198643 5971198643 2951198643 1511198643 1501198643 4694 100 237F05 3241198646 3271198644 2091198644 4141198644 7131198643 1541198643 8931198643 1531198643 62948 100 20373F06 100 5941198644 42204 63215 6001198644 192 191 191 191 191 191F07 1063 2071198646 900 21672 32455 307E4 104 103 102 101 100F08 5950 9881198643 304E4 14180 21701 32468 307E4 107 107 103 100F09 22687 4951198646 308 891 11422 17360 25941 245E4 145 100 171F10 25737 2831198644 935E5 137E4 9709 6656 9925 14940 139E4 100 273F11 607 100 183E4 202 1661 39040 26299 40300 60361 572E4 184F12 300 279E4 360E4 100 282E4 27095 19414 13077 20040 30016 284E4F13 232 966 561E6 189E6 100 815 26778 19186 12923 19805 29665F14 18466 384E5 117E4 185E3 100 571E3 2511 5539 3960 2657 4149

1 1 0 1 2 0 0 0 0 6 3

Table 11 Best normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 391E3 100 100 100 100 100 100 100 100 100 100F03 100 425 151 220 220 212 220 199 220 128 180F04 100 255 6554 255 255 255 255 255 255 255 255F05 252 100 171 169E3 171 171 171 171 171 171 171F06 100 5987 3418 2285 4652 2344 4354 4484 3688 1899 812F07 807 100 148 255 255 1306 255 255 255 255 255F08 543 100 192 100 100 100 1145 100 100 100 100F09 7672 252 117 100 171 171 171 15556 171 171 171F10 92910 148 100 491 255 255 255 222 235E3 255 255F11 318E3 100 582E3 399E3 339E3 582E3 491E3 582E3 582E3 958E3 582E3F12 30506 100 53728 9734 18758 9734 9734 9734 9734 9734 39827F13 192 422 587 1007 1298 466 148 401 317 100 409F14 8812 100 148 255 255 491 255 255 255 255 255

4 8 3 4 3 3 2 3 3 4 3

5 Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA methodfor optimization problemWe improved the BAby combiningoriginal harmony search (HS) algorithm and evaluating theHSBA on multimodal numerical optimization problemsA novel type of BA model has been presented and animprovement is applied to mutate between bats using har-mony search algorithm during the process of bats updatingUsing the original configuration of the bat algorithm wegenerate the new harmonies based on the newly generatedbat each iteration after batrsquos position has been updated Thenew harmony vector substitutes the newly generated bat only

if it has better fitness This selection scheme is rather greedywhich often overtakes original HS and BA The HSBAattempts to take merits of the BA and the HS in order toavoid all bats getting trapped in inferior local optimal regionsThe HSBA enables the bats to have more diverse exemplarsto learn from as the bats are updated each iteration andalso form new harmonies to search in a larger search spaceThis new method can speed up the global convergence ratewithout losing the strong robustness of the basic BA Fromthe analysis of the experimental results we observe that theproposed HSBA makes good use of the information in pastsolutions more effectively to generate better quality solutionsfrequently when compared to the other population-based

Journal of Applied Mathematics 19

Table 12Mean normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 407 100 100 452 277 257 341 536 369 489 103F03 100 127E4 134 135 135 135 134 135 134 134 134F04 115 8127 939E3 100 101 101 129 100 100 101 101F05 34573 5061 8619 907E3 2501 108 101 191 333 100 118F06 100 542E6 427E4 474E4 548E6 57788 57788 57788 57788 57787 57787F07 486 153 100 9543 10596 122E4 134 136 132 133 135F08 1269 7600 876E3 1703 6913 7672 885E3 110 105 104 100F09 6345 23047 182 167 1278 4860 5349 609E3 111 130 100F10 13355 1251 126 197E3 1148 464 1645 1893 199E3 100 188F11 6302 100 639E3 7944 5924 396E3 142E3 581E3 645E3 745E5 7981F12 390 881E3 890E3 100 890E3 4440 4778 1725 7011 7784 899E3F13 100 2166 207E3 669 580 430 27167 28246 10539 42926 47662F14 4614 100 3610 5593 122 640E3 4215 3289 3481 1272 5129

4 4 3 3 1 1 1 2 2 3 3

optimization algorithms such as ACO BA BBO DE ESGA HS PSO and SGA Based on the results of the tenapproaches on the test problems we can conclude that theHSBA significantly improves the performances of the HSand the BA on most multimodal and unimodal problems

In this work 14 benchmark functions are used to evaluatethe performance of our approach we will test our approachon more problems such as the high-dimensional (119863 ge20) CEC 2010 test suit [45] and the real-world problemsMoreover we will compare HSBA with other metaheuristicmethod such as DLHS [46 47] SGHS [48] adaptive DE[49] CLPSO [50] and DMS-PSO [51] In addition we onlyconsider the unconstrained function optimization in thiswork Our future work consists on adding the diversity rulesinto HSBA for constrained optimization problems such asconstrained real-parameter optimization CEC 2010 test suit[52]

In the field of optimization there are many issues worthyof further study and efficient optimizationmethod should bedeveloped depending on the analysis of specific engineeringproblem Our future work will focus on the two issues onthe one hand we would apply our proposed approach HSBA to solve practical engineering optimization problemsand obviously HSBA can become a fascinating method forreal-world engineering optimization problems on the otherhand we would develop new meta-hybrid approach to solveoptimization problem

Acknowledgments

This work was supported by State Key Laboratory of LaserInteraction with Material Research Fund under Grant noSKLLIM0902-01 and Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no LXJJ-11-Q80

References

[1] G Wang L Guo A H Gandomi et al ldquoLevy-flight krill herdalgorithmrdquo Mathematical Problems in Engineering vol 2013Article ID 682073 14 pages 2013

[2] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[3] D H Ackley ldquoAn empirical study of bit vector functionoptimizationrdquo in Genetic Algorithms and Simulated AnnealingL Davis Ed pp 170ndash204 Pitman London UK 1987

[4] R Fletcher and M J D Powell ldquoA rapidly convergent descentmethod for minimizationrdquoThe Computer Journal vol 6 no 2pp 163ndash168 1963

[5] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[6] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[7] K A De Jong Analysis of the Behavior of a Class of GeneticAdaptive Systems University of Michigan 1975

[8] L A Rastrigin Extremal Control Systems NaukaMoscow Rus-sian 1974Theoretical Foundations of Engineering CyberneticsSeries

[9] S-K S Fan and J-M Chang ldquoDynamic multi-swarm particleswarm optimizer using parallel PC cluster systems for globaloptimization of large-scale multimodal functionsrdquo EngineeringOptimization vol 42 no 5 pp 431ndash451 2010

[10] W Gong Z Cai and C X Ling ldquoDEBBO a hybrid differentialevolution with biogeography-based optimization for globalnumerical optimizationrdquo Soft Computing vol 15 no 4 pp 645ndash665 2010

[11] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

20 Journal of Applied Mathematics

[12] GWang LGuoHDuan L Liu andHWang ldquoAhybridmeta-heuristic DECS algorithm for UCAVpath planningrdquo Journal ofInformation and Computational Science vol 9 no 16 pp 1ndash82012

[13] GWang L Guo H Duan L Liu HWang andM Shao ldquoPathplanning for uninhabited combat aerial vehicle using hybridmeta-heuristic DEBBO algorithmrdquo Advanced Science Engi-neering and Medicine vol 4 no 6 pp 550ndash564 2012

[14] G Wang L Guo H Duan H Wang L Liu and M Shao ldquoAhybrid meta-heuristic DECS algorithm for UCAV three-dimension path planningrdquo The Scientific World Journal vol2012 Article ID 583973 11 pages 2012

[15] H DuanW Zhao GWang and X Feng ldquoTest-sheet composi-tion using analytic hierarchy process and hybrid metaheuristicalgorithmTSBBOrdquoMathematical Problems in Engineering vol2012 Article ID 712752 22 pages 2012

[16] G Wang L Guo H Duan L Liu and H Wang ldquoDynamicdeployment of wireless sensor networks by biogeography basedoptimization algorithmrdquo Journal of Sensor and Actuator Net-works vol 1 no 2 pp 86ndash96 2012

[17] X S Yang A H Gandomi S Talatahari and A H AlaviMetaheuristics in Water Geotechnical and Transport Engineer-ing Elsevier Waltham Mass USA 2013

[18] D E GoldbergGenetic Algorithms in Search Optimization andMachine Learning Addison-Wesley New York NY USA 1998

[19] R Storn and K Price ldquoDifferential evolutionmdasha simple andefficient heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997

[20] S Das and P N Suganthan ldquoDifferential evolution a survey ofthe state-of-the-artrdquo IEEE Transactions on Evolutionary Com-putation vol 15 no 1 pp 4ndash31 2011

[21] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[22] A H Gandomi G J Yun X-S Yang and S Talatahari ldquoChaos-enhanced accelerated particle swarm optimizationrdquo Communi-cations in Nonlinear Science and Numerical Simulation vol 18no 2 pp 327ndash340 2013

[23] S Talatahari X S Yang R Sheikholeslami andAH GandomildquoOptimal parameter estimation for muskingum model usinghybrid CSS and PSO methodrdquo Journal of Computational andTheoretical Nanoscience In press

[24] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[25] G Wang L Guo H Duan L Liu H Wang and M ShaoldquoHybridizing harmony search with biogeography based opti-mization for global numerical optimizationrdquo Journal of Com-putational and Theoretical Nanoscience In press

[26] Z W Geem J H Kim and G V Loganathan ldquoA new heuristicoptimization algorithm harmony searchrdquo Simulation vol 76no 2 pp 60ndash68 2001

[27] G Wang L Guo H Duan H Wang L Liu and J Li ldquoIncor-porating mutation scheme into krill herd algorithm for globalnumerical optimizationrdquo Neural Computing and Applications2012

[28] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress Frome UK 2nd edition 2010

[29] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing amp Applications In press

[30] X S Yang andAHGandomi ldquoBat algorithm a novel approachfor global engineering optimizationrdquo Engineering Computa-tions vol 29 no 5 pp 464ndash483 2012

[31] G Wang L Guo H Duan L Liu and H Wang ldquoA Bat algo-rithm with mutation for UCAV path planningrdquo The ScientificWorld Journal vol 2012 Article ID 418946 15 pages 2012

[32] R Parpinelli and H Lopes ldquoNew inspirations in swarm intelli-gence a surveyrdquo International Journal of Bio-Inspired Computa-tion vol 3 no 1 pp 1ndash16 2011

[33] P W Tsai J S Pan B Y Liao M J Tsai and V Istanda ldquoBatalgorithm inspired algorithm for solving numerical optimiza-tion problemsrdquo Applied Mechanics and Materials vol 148 pp134ndash137 2012

[34] Wikipedia ldquoMathematical optimizationrdquo httpenwikipediaorgwikiNumerical optimization

[35] W Y Zhang S Xu and S J Li ldquoNecessary conditions for weaksharp minima in cone-constrained optimization problemsrdquoAbstract and Applied Analysis vol 2011 Article ID 909520 11pages 2012

[36] Wikipedia ldquoGlobal optimizationrdquo httpenwikipediaorgwikiGlobal optimization

[37] S Kirkpatrick C D Gelatt Jr and M P Vecchi ldquoOptimizationby simulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983

[38] M Dorigo and T Stutzle Ant Colony Optimization The MITPress Cambridge Mass USA 2004

[39] M Dorigo Optimization Learning and Natural AlgorithmsPolitecnico di Milano Italy 1992

[40] H G Beyer The Theory of Evolution Strategies Springer NewYork NY USA 2001

[41] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquo inParallel Problem Solving From Nature pp 683ndash691 1998

[42] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[43] M Friedman ldquoA comparison of alternative tests of significancefor the problem of 119898 rankingsrdquo The Annals of MathematicalStatistics vol 11 no 1 pp 86ndash92 1940

[44] R Brits A P Engelbrecht and F van den Bergh ldquoLocatingmultiple optima using particle swarm optimizationrdquo AppliedMathematics and Computation vol 189 no 2 pp 1859ndash18832007

[45] K Tang X Li P N Suganthan Z Yang and T Weise ldquoBench-mark functions for the CECrsquo2010 special session and competi-tion on large scale global optimizationrdquo Nature Inspired Com-putation and Applications Laboratory USTC Hefei China2010

[46] Q-K Pan P N Suganthan J J Liang and M F Tasgetiren ldquoAlocal-best harmony search algorithm with dynamic subpopula-tionsrdquo Engineering Optimization vol 42 no 2 pp 101ndash117 2010

[47] Q-K Pan P N Suganthan J J Liang and M F TasgetirenldquoA local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop schedulingproblemrdquo Expert Systems with Applications vol 38 no 4 pp3252ndash3259 2011

[48] Q-K Pan P N Suganthan M F Tasgetiren and J J LiangldquoA self-adaptive global best harmony search algorithm forcontinuous optimization problemsrdquo Applied Mathematics andComputation vol 216 no 3 pp 830ndash848 2010

Journal of Applied Mathematics 21

[49] S M Islam S Das S Ghosh S Roy and P N Suganthan ldquoAnadaptive differential evolution algorithm with novel mutationand crossover strategies for global numerical optimizationrdquoIEEE Transactions on Systems Man and Cybernetics B vol 42no 2 pp 482ndash500 2012

[50] J J Liang A K Qin P N Suganthan and S Baskar ldquoCompre-hensive learning particle swarm optimizer for global optimiza-tion of multimodal functionsrdquo IEEE Transactions on Evolution-ary Computation vol 10 no 3 pp 281ndash295 2006

[51] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[52] R Mallipeddi and P Suganthan Problem Definitions and Eval-uation Criteria for the CEC 2010 Competition on ConstrainedReal-Parameter Optimization Nanyang Technological Univer-sity Singapore 2010

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

18 Journal of Applied Mathematics

Table 10 Mean normalized optimization results in fourteen benchmark functions with different HMCR The numbers shown are the bestresults found after 100 Monte Carlo simulations of HSBA algorithm

HMCR0 01 02 03 04 05 06 07 08 09 10

F01 101 101 101 101 101 101 101 101 101 100 101F02 74756 787 525 343 509 710 307 187 160 100 101F03 913 312E4 109 107 105 106 105 102 101 101 100F04 2101198646 1001198644 3951198644 9481198643 5971198643 2951198643 1511198643 1501198643 4694 100 237F05 3241198646 3271198644 2091198644 4141198644 7131198643 1541198643 8931198643 1531198643 62948 100 20373F06 100 5941198644 42204 63215 6001198644 192 191 191 191 191 191F07 1063 2071198646 900 21672 32455 307E4 104 103 102 101 100F08 5950 9881198643 304E4 14180 21701 32468 307E4 107 107 103 100F09 22687 4951198646 308 891 11422 17360 25941 245E4 145 100 171F10 25737 2831198644 935E5 137E4 9709 6656 9925 14940 139E4 100 273F11 607 100 183E4 202 1661 39040 26299 40300 60361 572E4 184F12 300 279E4 360E4 100 282E4 27095 19414 13077 20040 30016 284E4F13 232 966 561E6 189E6 100 815 26778 19186 12923 19805 29665F14 18466 384E5 117E4 185E3 100 571E3 2511 5539 3960 2657 4149

1 1 0 1 2 0 0 0 0 6 3

Table 11 Best normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 391E3 100 100 100 100 100 100 100 100 100 100F03 100 425 151 220 220 212 220 199 220 128 180F04 100 255 6554 255 255 255 255 255 255 255 255F05 252 100 171 169E3 171 171 171 171 171 171 171F06 100 5987 3418 2285 4652 2344 4354 4484 3688 1899 812F07 807 100 148 255 255 1306 255 255 255 255 255F08 543 100 192 100 100 100 1145 100 100 100 100F09 7672 252 117 100 171 171 171 15556 171 171 171F10 92910 148 100 491 255 255 255 222 235E3 255 255F11 318E3 100 582E3 399E3 339E3 582E3 491E3 582E3 582E3 958E3 582E3F12 30506 100 53728 9734 18758 9734 9734 9734 9734 9734 39827F13 192 422 587 1007 1298 466 148 401 317 100 409F14 8812 100 148 255 255 491 255 255 255 255 255

4 8 3 4 3 3 2 3 3 4 3

5 Conclusion and Future Work

This paper proposed a hybrid metaheuristic HSBA methodfor optimization problemWe improved the BAby combiningoriginal harmony search (HS) algorithm and evaluating theHSBA on multimodal numerical optimization problemsA novel type of BA model has been presented and animprovement is applied to mutate between bats using har-mony search algorithm during the process of bats updatingUsing the original configuration of the bat algorithm wegenerate the new harmonies based on the newly generatedbat each iteration after batrsquos position has been updated Thenew harmony vector substitutes the newly generated bat only

if it has better fitness This selection scheme is rather greedywhich often overtakes original HS and BA The HSBAattempts to take merits of the BA and the HS in order toavoid all bats getting trapped in inferior local optimal regionsThe HSBA enables the bats to have more diverse exemplarsto learn from as the bats are updated each iteration andalso form new harmonies to search in a larger search spaceThis new method can speed up the global convergence ratewithout losing the strong robustness of the basic BA Fromthe analysis of the experimental results we observe that theproposed HSBA makes good use of the information in pastsolutions more effectively to generate better quality solutionsfrequently when compared to the other population-based

Journal of Applied Mathematics 19

Table 12Mean normalized optimization results in fourteen benchmark functions with different PARThe numbers shown are the best resultsfound after 100 Monte Carlo simulations of HSBA algorithm

PAR0 01 02 03 04 05 06 07 08 09 10

F01 100 100 100 100 100 100 100 100 100 100 100F02 407 100 100 452 277 257 341 536 369 489 103F03 100 127E4 134 135 135 135 134 135 134 134 134F04 115 8127 939E3 100 101 101 129 100 100 101 101F05 34573 5061 8619 907E3 2501 108 101 191 333 100 118F06 100 542E6 427E4 474E4 548E6 57788 57788 57788 57788 57787 57787F07 486 153 100 9543 10596 122E4 134 136 132 133 135F08 1269 7600 876E3 1703 6913 7672 885E3 110 105 104 100F09 6345 23047 182 167 1278 4860 5349 609E3 111 130 100F10 13355 1251 126 197E3 1148 464 1645 1893 199E3 100 188F11 6302 100 639E3 7944 5924 396E3 142E3 581E3 645E3 745E5 7981F12 390 881E3 890E3 100 890E3 4440 4778 1725 7011 7784 899E3F13 100 2166 207E3 669 580 430 27167 28246 10539 42926 47662F14 4614 100 3610 5593 122 640E3 4215 3289 3481 1272 5129

4 4 3 3 1 1 1 2 2 3 3

optimization algorithms such as ACO BA BBO DE ESGA HS PSO and SGA Based on the results of the tenapproaches on the test problems we can conclude that theHSBA significantly improves the performances of the HSand the BA on most multimodal and unimodal problems

In this work 14 benchmark functions are used to evaluatethe performance of our approach we will test our approachon more problems such as the high-dimensional (119863 ge20) CEC 2010 test suit [45] and the real-world problemsMoreover we will compare HSBA with other metaheuristicmethod such as DLHS [46 47] SGHS [48] adaptive DE[49] CLPSO [50] and DMS-PSO [51] In addition we onlyconsider the unconstrained function optimization in thiswork Our future work consists on adding the diversity rulesinto HSBA for constrained optimization problems such asconstrained real-parameter optimization CEC 2010 test suit[52]

In the field of optimization there are many issues worthyof further study and efficient optimizationmethod should bedeveloped depending on the analysis of specific engineeringproblem Our future work will focus on the two issues onthe one hand we would apply our proposed approach HSBA to solve practical engineering optimization problemsand obviously HSBA can become a fascinating method forreal-world engineering optimization problems on the otherhand we would develop new meta-hybrid approach to solveoptimization problem

Acknowledgments

This work was supported by State Key Laboratory of LaserInteraction with Material Research Fund under Grant noSKLLIM0902-01 and Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no LXJJ-11-Q80

References

[1] G Wang L Guo A H Gandomi et al ldquoLevy-flight krill herdalgorithmrdquo Mathematical Problems in Engineering vol 2013Article ID 682073 14 pages 2013

[2] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[3] D H Ackley ldquoAn empirical study of bit vector functionoptimizationrdquo in Genetic Algorithms and Simulated AnnealingL Davis Ed pp 170ndash204 Pitman London UK 1987

[4] R Fletcher and M J D Powell ldquoA rapidly convergent descentmethod for minimizationrdquoThe Computer Journal vol 6 no 2pp 163ndash168 1963

[5] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[6] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[7] K A De Jong Analysis of the Behavior of a Class of GeneticAdaptive Systems University of Michigan 1975

[8] L A Rastrigin Extremal Control Systems NaukaMoscow Rus-sian 1974Theoretical Foundations of Engineering CyberneticsSeries

[9] S-K S Fan and J-M Chang ldquoDynamic multi-swarm particleswarm optimizer using parallel PC cluster systems for globaloptimization of large-scale multimodal functionsrdquo EngineeringOptimization vol 42 no 5 pp 431ndash451 2010

[10] W Gong Z Cai and C X Ling ldquoDEBBO a hybrid differentialevolution with biogeography-based optimization for globalnumerical optimizationrdquo Soft Computing vol 15 no 4 pp 645ndash665 2010

[11] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

20 Journal of Applied Mathematics

[12] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information and Computational Science, vol. 9, no. 16, pp. 1–8, 2012.

[13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Path planning for uninhabited combat aerial vehicle using hybrid meta-heuristic DE/BBO algorithm," Advanced Science, Engineering and Medicine, vol. 4, no. 6, pp. 550–564, 2012.

[14] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and M. Shao, "A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning," The Scientific World Journal, vol. 2012, Article ID 583973, 11 pages, 2012.

[15] H. Duan, W. Zhao, G. Wang, and X. Feng, "Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO," Mathematical Problems in Engineering, vol. 2012, Article ID 712752, 22 pages, 2012.

[16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "Dynamic deployment of wireless sensor networks by biogeography based optimization algorithm," Journal of Sensor and Actuator Networks, vol. 1, no. 2, pp. 86–96, 2012.

[17] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998.

[19] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[20] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[21] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, "Chaos-enhanced accelerated particle swarm optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.

[23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, "Optimal parameter estimation for Muskingum model using hybrid CSS and PSO method," Journal of Computational and Theoretical Nanoscience, in press.

[24] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, "Hybridizing harmony search with biogeography based optimization for global numerical optimization," Journal of Computational and Theoretical Nanoscience, in press.

[26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[27] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010.

[29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, in press.

[30] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A bat algorithm with mutation for UCAV path planning," The Scientific World Journal, vol. 2012, Article ID 418946, 15 pages, 2012.

[32] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[33] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, and V. Istanda, "Bat algorithm inspired algorithm for solving numerical optimization problems," Applied Mechanics and Materials, vol. 148, pp. 134–137, 2012.

[34] Wikipedia, "Mathematical optimization," http://en.wikipedia.org/wiki/Numerical_optimization.

[35] W. Y. Zhang, S. Xu, and S. J. Li, "Necessary conditions for weak sharp minima in cone-constrained optimization problems," Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 pages, 2012.

[36] Wikipedia, "Global optimization," http://en.wikipedia.org/wiki/Global_optimization.

[37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.

[38] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.

[39] M. Dorigo, Optimization, Learning and Natural Algorithms, Politecnico di Milano, Milan, Italy, 1992.

[40] H. G. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[41] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[42] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[43] M. Friedman, "A comparison of alternative tests of significance for the problem of m rankings," The Annals of Mathematical Statistics, vol. 11, no. 1, pp. 86–92, 1940.

[44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1859–1883, 2007.

[45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark functions for the CEC'2010 special session and competition on large scale global optimization," Nature Inspired Computation and Applications Laboratory, USTC, Hefei, China, 2010.

[46] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.

[47] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic sub-harmony memories for lot-streaming flow shop scheduling problem," Expert Systems with Applications, vol. 38, no. 4, pp. 3252–3259, 2011.

[48] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.


[49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 42, no. 2, pp. 482–500, 2012.

[50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[52] R. Mallipeddi and P. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological University, Singapore, 2010.


