ORIGINAL CONTRIBUTION
The Opposition-based Harmony Search Algorithm
R. P. Singh • V. Mukherjee • S. P. Ghoshal
Received: 16 April 2012 / Accepted: 25 November 2013 / Published online: 8 January 2014
© The Institution of Engineers (India) 2014
Abstract This paper proposes a novel approach to accelerate the harmony search (HS) algorithm. The proposed opposition-based HS of the present work employs opposition-based learning for harmony memory initialization and also for generation jumping. In the present work, opposite numbers have been utilized to improve the convergence rate of the HS. The potential of the proposed algorithm is assessed by means of an extensive comparative study of numerical results on benchmark test functions. The results obtained confirm the potential and effectiveness of the proposed algorithm compared to other algorithms reported in the recent state-of-the-art literature. Additionally, the opposition concept has been incorporated into an improved variant of HS, the local-best HS algorithm with dynamic subpopulations, and the benefit of incorporating the opposition concept into evolutionary optimization algorithms is established.
Keywords Benchmark test function · Harmony search · Opposite numbers · Optimization
Introduction
Researchers across the globe are regularly drawing on nature-inspired meta-heuristics [1] to meet the demands of complex real-world optimization problems. As a result, the computational costs of such algorithms have been reduced dramatically in the recent past.
Inspired by this tradition, Geem et al. [2] proposed harmony search (HS) in 2001. HS is a derivative-free meta-heuristic algorithm inspired by the natural musical performance process that occurs when a musician searches for a better state of harmony. In the HS algorithm, the solution vector is analogous to the harmony in music, and the local and global search schemes are analogous to the musician's improvisations. In comparison to other meta-heuristics in the literature, the HS algorithm imposes fewer mathematical requirements and can be easily adapted for solving various kinds of engineering optimization problems. Furthermore, numerical comparisons have demonstrated that evolution in the HS algorithm is faster than in the genetic algorithm [3]. Therefore, the HS algorithm has captured much attention and has been successfully applied to solve a wide range of practical optimization problems, such as structural optimization [4], parameter estimation of the nonlinear Muskingum model [5], pipe network design [6], vehicle routing [7], design of water distribution networks [8], scheduling of a multiple-dam system [9], and so on.
The HS algorithm is good at identifying the high-performance regions of the solution space within a reasonable time [10]. Mahdavi et al. [3] presented an improved HS (IHS) algorithm by introducing a strategy to dynamically tune the key parameters. Omran and Mahdavi [11] proposed a global-best HS (GHS) algorithm by borrowing the concept from swarm intelligence. Pan et al. [12] proposed a self-adaptive global-best HS (SGHS) algorithm for solving continuous optimization problems.
Tizhoosh [13] introduced the concept of opposition-based
learning (OBL). This notion has been applied to accelerate
R. P. Singh
Department of Electrical Engineering, Asansol Engineering
College, Asansol, India
V. Mukherjee (corresponding author)
Department of Electrical Engineering, Indian School of Mines,
Dhanbad, India
e-mail: [email protected]
S. P. Ghoshal
Department of Electrical Engineering, National Institute of
Technology, Durgapur, India
J. Inst. Eng. India Ser. B (December 2013–February 2014) 94(4):247–256
DOI 10.1007/s40031-013-0069-5
reinforcement learning and backpropagation learning in neural networks. The main idea behind OBL is the simultaneous consideration of an estimate and its corresponding opposite estimate (i.e., guess and opposite guess) in order to achieve a better approximation of the current candidate solution. In the recent literature, the concept of opposite numbers has been utilized to speed up the convergence rate of optimization algorithms, e.g., opposition-based differential evolution (ODE) [14]. In this paper, OBL has been utilized to accelerate the convergence rate of the HS; hence, the proposed approach is called opposition-based HS (OHS). OHS uses opposite numbers during harmony memory (HM) initialization and also for generating the new HM during the evolutionary process of HS. Additionally, the concept of OBL is applied to an improved variant of the HS algorithm reported by Pan et al. [15] (termed DLHS in Ref. [15]), and this new algorithm is termed ODLHS in the present work. The potential of ODLHS is tested on a suite of the first fourteen CEC 2005 benchmark test functions [16].
The objectives of the current article are as follows.
• The proposed algorithm has been tested on a suite of
standard benchmark test functions.
• The obtained optimal results on benchmark test functions are compared to other variants of HS reported in the recent literature.
• The comparative convergence profiles of fitness function
values for a few benchmark test functions are presented.
A Brief Description of HS Algorithm
In the basic HS algorithm, each solution is called a harmony.
It is represented by an n-dimensional real vector. An initial
randomly generated population of harmony vectors is stored
in an HM. Then, a new candidate harmony is generated from
all of the solutions in the HM by adopting a memory con-
sideration rule, a pitch adjustment rule and a random re-
initialization. Finally, HM is updated by comparing the new
candidate harmony vector and the worst harmony vector in
the HM. The worst harmony vector is replaced by the new
candidate vector if it is better than the worst harmony vector
in the HM. The above process is repeated until a certain
termination criterion is met. Thus, the basic HS algorithm
consists of three basic phases. These are initialization,
improvisation of a harmony vector and updating the HM.
Sequentially, these phases are described below.
(i) Initialization of the Problem and the Parameters of the HS Algorithm
In general, a global optimization problem can be stated as follows: min f(X) subject to x_j ∈ [para_j^min, para_j^max], j = 1, 2, …, n, where f(X) is the objective function, X = [x_1, x_2, …, x_n] is the set of design variables, and n is the number of design variables. Here, para_j^min and para_j^max are the lower and upper bounds for the design variable x_j, respectively. The parameters of the HS algorithm are the harmony memory size (HMS) (the number of solution vectors in the HM), harmony memory consideration rate (HMCR), pitch adjusting rate (PAR), distance bandwidth (BW), and number of improvisations (NI). The NI is the same as the total number of fitness function evaluations (NFFEs). It may be set as a stopping criterion.
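The problem statement and parameter set above can be sketched in Python. This is an illustrative setup only: the sphere function, the bounds and the parameter values are assumptions for the sake of the example, not the paper's tuning (the paper's own settings appear later under "Parameter Setting").

```python
import numpy as np

# Hypothetical problem instance: the 30-dimensional sphere function
# f(X) = sum_j x_j^2, with para_j^min = -100 and para_j^max = 100 for every j.
def sphere(x):
    return float(np.sum(x ** 2))

n = 30                        # number of design variables
lower, upper = -100.0, 100.0  # bounds for every x_j

# HS control parameters (values are illustrative, not the paper's tuning)
HMS = 10                      # harmony memory size
HMCR = 0.9                    # harmony memory consideration rate
PAR = 0.3                     # pitch adjusting rate
BW = 0.01 * (upper - lower)   # distance bandwidth
NI = 50_000                   # number of improvisations (= NFFEs), stop criterion
```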
(ii) Initialization of HM
The HM consists of HMS harmony vectors. Let X^j = (x_1^j, x_2^j, …, x_n^j) represent the j-th harmony vector, which is randomly generated within the parameter limits [para_j^min, para_j^max]. Then, the HM matrix is filled with the HMS harmony vectors as in (1):

HM = [ x_1^1    x_2^1    …  x_n^1
       x_1^2    x_2^2    …  x_n^2
       …
       x_1^HMS  x_2^HMS  …  x_n^HMS ]        (1)
(iii) Improvisation of a New Harmony
A new harmony vector X^new = (x_1^new, x_2^new, …, x_n^new) is generated (called improvisation) by applying three rules, namely, (i) memory consideration, (ii) pitch adjustment, and (iii) random selection. First of all, a uniform random number r1 is generated in the range [0, 1]. If r1 is less than HMCR, the decision variable x_j^new is generated by the memory consideration; otherwise, x_j^new is obtained by a random selection (i.e., random re-initialization between the search bounds). In the memory consideration, x_j^new is selected from any harmony vector i in [1, 2, …, HMS]. Secondly, each decision variable x_j^new will undergo a pitch adjustment with a probability of PAR if it was updated by the memory consideration. The pitch adjustment rule is given as follows:

x_j^new = x_j^new ± r3 × BW        (2)

where r3 is a uniform random number between 0 and 1.
(iv) Updating of HM
After a new harmony vector X^new is generated, the HM will be updated by the survival of the fittest vector between X^new and the worst harmony vector X^worst in the HM. That is, X^new will replace X^worst and become a new member of the HM if the fitness value of X^new is better than that of X^worst.
(v) Process of Computation
The computational procedure of the basic HS
algorithm can be summarized as follows [2].
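The pseudo code did not survive extraction here; a minimal Python sketch of the three phases is given below. The sphere objective, the bounds and the parameter values are assumptions for illustration, and the pitch adjustment draws the ± r3 × BW perturbation of Eq. (2) uniformly from [-BW, BW], a common implementation choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Illustrative objective: f(x) = sum_j x_j^2, minimum 0 at the origin."""
    return float(np.sum(x ** 2))

n, lower, upper = 5, -100.0, 100.0                   # problem definition
HMS, HMCR, PAR, BW, NI = 10, 0.9, 0.3, 1.0, 20_000   # illustrative parameters

# Phase 1 -- initialization of HM with HMS random harmony vectors
HM = rng.uniform(lower, upper, size=(HMS, n))
fitness = np.array([sphere(h) for h in HM])

for _ in range(NI):
    # Phase 2 -- improvisation of a new harmony vector
    x_new = np.empty(n)
    for j in range(n):
        if rng.random() < HMCR:                       # memory consideration
            x_new[j] = HM[rng.integers(HMS), j]
            if rng.random() < PAR:                    # pitch adjustment, Eq. (2)
                x_new[j] = np.clip(x_new[j] + (2 * rng.random() - 1) * BW,
                                   lower, upper)
        else:                                         # random selection
            x_new[j] = rng.uniform(lower, upper)
    # Phase 3 -- update HM: the new vector replaces the worst one if fitter
    f_new = sphere(x_new)
    worst = int(np.argmax(fitness))
    if f_new < fitness[worst]:
        HM[worst], fitness[worst] = x_new, f_new

print(round(float(fitness.min()), 3))
```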
IHS Algorithm
The basic HS algorithm uses fixed values for the PAR and BW parameters. The IHS algorithm, proposed by Mahdavi et al. [3], applies the same memory consideration, pitch adjustment and random selection as the basic HS algorithm but dynamically updates the values of PAR and BW as in (3) and (4), respectively.
PAR(gn) = PAR_min + ((PAR_max − PAR_min) / NI) × gn        (3)

BW(gn) = BW_max × exp( (ln(BW_min / BW_max) / NI) × gn )        (4)
In Eq. (3), PAR(gn) is the pitch adjustment rate in the current generation gn, and PAR_min and PAR_max are the minimum and maximum adjustment rates, respectively. In Eq. (4), BW(gn) is the distance bandwidth at generation gn, and BW_min and BW_max are the minimum and maximum bandwidths, respectively.
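The two schedules of Eqs. (3) and (4) can be sketched directly. The bound values below are assumptions for illustration; note that PAR grows linearly from PAR_min at gn = 0 to PAR_max at gn = NI, while BW decays exponentially from BW_max to BW_min.

```python
import math

# Illustrative bounds (not the paper's settings)
PAR_min, PAR_max = 0.35, 0.99
BW_min, BW_max = 1e-6, 5.0
NI = 50_000

def par(gn):
    """Eq. (3): linear growth of the pitch adjustment rate with generation gn."""
    return PAR_min + (PAR_max - PAR_min) / NI * gn

def bw(gn):
    """Eq. (4): exponential decay of the distance bandwidth with generation gn."""
    return BW_max * math.exp(math.log(BW_min / BW_max) / NI * gn)
```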
Opposition-based Learning: A Concept
Evolutionary optimization methods start with some initial
solutions (initial population) and try to improve them toward
some optimal solution(s). The process of searching terminates
when some predefined criteria are satisfied. In the absence of a priori information about the solution, the search usually starts with random guesses. The computation time is, among other factors, related to the distance of these initial guesses from the optimal solution. The chance of starting with a closer (fitter) solution can be improved by simultaneously checking the opposite solution [13]. By doing this, the fitter one (guess or opposite guess) can be chosen as the initial solution. In fact, according to probability theory, a guess is further from the solution than its opposite guess 50 % of the time. Therefore, starting with the closer of the two guesses (as judged by fitness) has the potential to accelerate convergence. The same approach can be applied not only to the initial solutions but also continuously to each solution in the current population.
(i) Definition of Opposite Number
Let x ∈ [lb, ub] be a real number. The opposite number x̂ is defined as in (5):

x̂ = ub + lb − x        (5)

Similarly, this definition can be extended to higher dimensions [13], as stated next.
(ii) Definition of Opposite Point
Let X = (x_1, x_2, …, x_n) be a point in n-dimensional space, where each x_i ∈ R and x_i ∈ [lb_i, ub_i] for all i ∈ {1, 2, …, n}. The opposite point X̂ = (x̂_1, x̂_2, …, x̂_n) is completely defined by its components as in Eq. (6):

x̂_i = ub_i + lb_i − x_i        (6)
Now, by employing the opposite-point definition, opposition-based optimization is defined next.
(iii) Opposition-based Optimization
Let X = (x_1, x_2, …, x_n) be a point in n-dimensional space (i.e., a candidate solution), and let f(·) be a fitness function which is used to measure the candidate's fitness. According to the definition of the opposite point, X̂ = (x̂_1, x̂_2, …, x̂_n) is the opposite of X. Now, if X̂ has a better fitness value than X, then X is replaced with X̂; otherwise, we continue with X. Hence, the point and its opposite point are evaluated simultaneously in order to continue with the fitter one.
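Definitions (5), (6) and the opposition-based selection rule fit in a few lines. The sphere objective and the box [0, 10]^3 in the usage example are assumptions for illustration; the comparison assumes minimization (lower fitness is better).

```python
import numpy as np

def opposite(x, lb, ub):
    # Eqs. (5)-(6): componentwise opposite point, x_hat_i = ub_i + lb_i - x_i
    return ub + lb - x

def opposition_select(x, lb, ub, f):
    # Opposition-based optimization: evaluate x and its opposite simultaneously
    # and continue with the fitter one (minimization assumed).
    x_hat = opposite(x, lb, ub)
    return x_hat if f(x_hat) < f(x) else x

# Hypothetical usage: sphere function on the box [0, 10]^3
f = lambda x: float(np.sum(x ** 2))
lb, ub = np.zeros(3), np.full(3, 10.0)
x = np.array([9.0, 9.0, 9.0])
print(opposition_select(x, lb, ub, f))  # the opposite point (1, 1, 1) is fitter
```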
Proposed Algorithms
(i) OHS
Similar to all population-based optimization algorithms, two main steps are distinguishable in HS, namely, HM initialization and producing a new HM by adopting the principles of HS. In the present work, the strategy of OBL [13] is incorporated in both steps. The original HS is chosen as the parent algorithm, and opposition-based ideas are embedded in it with the intention of exhibiting an accelerated convergence profile.
Corresponding pseudo code for the proposed OHS
approach can be summarized as follows:
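The OHS pseudo code did not survive extraction; the sketch below shows the two opposition-based steps grafted onto the basic HS loop. It is a sketch under assumptions: the objective, bounds and parameter values are illustrative, and the generation jumping uses the dynamic per-dimension min/max of the current HM as the interval bounds, following the practice of ODE [14] rather than a detail confirmed by this paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x ** 2))

n, lb, ub = 5, -100.0, 100.0
HMS, HMCR, PAR, BW, NI, Jr = 10, 0.9, 0.3, 1.0, 10_000, 0.3  # Jr = jumping rate

def fittest(pop):
    """Keep the HMS fittest vectors of a combined population."""
    order = np.argsort([sphere(p) for p in pop])
    return pop[order[:HMS]]

# Opposition-based HM initialization: evaluate the random HM and its opposite
# (Eq. (6)), then keep the best HMS vectors of the union.
HM0 = rng.uniform(lb, ub, size=(HMS, n))
HM = fittest(np.vstack([HM0, ub + lb - HM0]))
fitness = np.array([sphere(h) for h in HM])

for _ in range(NI):
    # Standard HS improvisation (memory consideration / pitch adjustment /
    # random selection), as in the basic algorithm.
    x_new = np.empty(n)
    for j in range(n):
        if rng.random() < HMCR:
            x_new[j] = HM[rng.integers(HMS), j]
            if rng.random() < PAR:
                x_new[j] = np.clip(x_new[j] + (2 * rng.random() - 1) * BW, lb, ub)
        else:
            x_new[j] = rng.uniform(lb, ub)
    f_new = sphere(x_new)
    worst = int(np.argmax(fitness))
    if f_new < fitness[worst]:
        HM[worst], fitness[worst] = x_new, f_new
    # Opposition-based generation jumping with probability Jr: reflect the
    # whole HM inside its current dynamic bounds and keep the fittest half.
    if rng.random() < Jr:
        lo, hi = HM.min(axis=0), HM.max(axis=0)
        HM = fittest(np.vstack([HM, lo + hi - HM]))
        fitness = np.array([sphere(h) for h in HM])

print(round(float(fitness.min()), 4))
```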
Table 1 Mean and standard deviation (±SD) of the benchmark function optimization results (n = 30)

Function | Global optimum | HS [2] | IHS [3] | GHS [11] | SGHS [12] | OHS (proposed)
Sphere function (f1) | 0 | 0.000187 ± 0.000032 | 0.000712 ± 0.000644 | 0.000010 ± 0.000022 | 0.000000 ± 0.000000 | 0.000000 ± 0.000000
Schwefel's problem 2.22 (f2) | 0 | 0.171524 ± 0.072851 | 1.097325 ± 0.181253 | 0.072815 ± 0.114464 | 0.000102 ± 0.000017 | 0.000101 ± 0.000013
Rosenbrock function (f3) | 0 | 340.297100 ± 266.691353 | 624.323216 ± 559.847363 | 49.669203 ± 59.161192 | 150.929754 ± 131.054916 | 47.36718 ± 130.17368
Step function (f4) | 0 | 4.233333 ± 3.029668 | 3.333333 ± 2.195956 | 0 ± 0 | 0.000000 ± 0.000000 | 0.000000 ± 0.000000
Rotated hyper-ellipsoid function (f5) | 0 | 4,297.816457 ± 1,362.148438 | 4,313.653320 ± 1,062.106222 | 5,146.176259 ± 6,348.792556 | 11.796490 ± 7.454435 | 10.371876 ± 6.394781
Schwefel's problem 2.26 (f6) | 0 | 30.262214 ± 11.960017 | 34.531375 ± 10.400177 | 0.041657 ± 0.050361 | 0.004015 ± 0.006237 | 0.003145 ± 0.005372
Rastrigin's function (f7) | 0 | 1.390625 ± 0.824244 | 3.499144 ± 1.182907 | 0.008629 ± 0.015277 | 0.017737 ± 0.067494 | 0.001376 ± 0.001387
Ackley's function (f8) | 0 | 1.130004 ± 0.407044 | 1.893394 ± 0.314610 | 0.020909 ± 0.021686 | 0.484445 ± 0.356729 | 0.0031678 ± 0.000136
Griewank function (f9) | 0 | 1.119266 ± 0.041207 | 1.120992 ± 0.040887 | 0.102407 ± 0.175640 | 0.050467 ± 0.035419 | 0.037451 ± 0.032175
Six-hump Camel-back function (f10) | -1.0316285 | -1.031628 ± 0.000000 | -1.031628 ± 0.000000 | -1.031600 ± 0.000018 | -1.031628 ± 0.000000 | -1.031628 ± 0.000000
Shifted sphere function (f11) | -450 | -443.553193 ± 2.777075 | -438.815459 ± 3.703810 | 1,353.211035 ± 361.763223 | -450.000000 ± 0.000000 | -450.000000 ± 0.000000
Shifted Schwefel's problem 1.2 (f12) | -450 | 3,888.178656 ± 1,115.259221 | 3,316.602220 ± 1,519.408280 | 18,440.504168 ± 4,537.943604 | -431.095663 ± 17.251617 | -441.367184 ± 16.310728
Shifted Rosenbrock function (f13) | 390 | 3,790.700528 ± 3,271.573964 | 5,752.122700 ± 3,762.543380 | 35,046,942.785443 ± 22,136,432.286008 | 2,511.678953 ± 3,966.480932 | 2,817.638432 ± 3,841.420371
Shifted Rastrigin's function (f14) | -330 | -329.128972 ± 0.808682 | -328.056701 ± 0.667483 | -263.271951 ± 9.356208 | -329.860811 ± 0.349908 | -329.860867 ± 0.349908
Shifted rotated Griewank's function (f15) | -180 | 547.868607 ± 0.500873 | 494.756015 ± 6.717343 | 546.625388 ± 0.613099 | -47.188943 ± 13.313494 | -58.312701 ± 12.513207
Shifted rotated Rastrigin function (f16) | -330 | -274.687463 ± 12.863299 | -270.694907 ± 16.223212 | -192.095804 ± 18.645826 | -232.136195 ± 30.033504 | -280.413671 ± 11.218671
(ii) ODLHS
An improved version of HS, the DLHS reported in Pan et al. [15], is chosen as the basic algorithm, and the concept of OBL described in this paper is blended with it; the resulting new variant of HS is termed ODLHS.
Experimental Bench: Optimization of Benchmark Test Functions
OHS for Global Optimization
(i) Benchmark Test Function
A suite of sixteen global optimization problems (Table 1) is used to test the performance of the proposed OHS algorithm. Among these sixteen
benchmark problems, sphere function, Schwefel’s
problem 2.22, step function, rotated hyper-ellipsoid
function, shifted sphere function and shifted Schwe-
fel’s problem 1.2 are unimodal. Step function is
discontinuous. Rosenbrock function, Schwefel’s
problem 2.26, Rastrigin function, Ackley function,
Griewank function, shifted Rosenbrock function,
shifted Rastrigin function, shifted rotated Griewank’s
function and shifted rotated Rastrigin function are
difficult multimodal problems where the number of
local optima increases with the problem dimension.
Six-hump Camel-back function is a low-dimensional
function with only a few local optima.
(ii) Parameter Setting
The chosen parameter values for the proposed OHS are Jr = 0.3, HMCR = 0.95, PAR_min = 0.35, PAR_max = 0.99, BW_min = 1.00e−06 and BW_max = (x_max − x_min)/20.
(iii) Discussion on Benchmark Function Optimization
Each benchmark test function is run 25 independent times. The average and standard
deviations over these 25 runs for 30 and 100
dimensions (except for the two-dimensional six-
hump Camel-back function) are presented in
Tables 1 and 2, respectively. Results of interest
are bold faced in the respective tables. The results
of the HS, IHS and GHS algorithms for these
problems are obtained from Omran and Mahdavi
[11] while those for SGHS are taken from Pan
et al. [12].
It can be observed from Table 1 that the OHS algorithm generates nine best results out of the sixteen functions, and for five test functions, OHS and SGHS yield the same results (for a dimension size of 30). For two functions, namely, the shifted
Table 2 Mean and standard deviation (±SD) of the benchmark function optimization results (n = 100)

Function | Global optimum | HS [2] | IHS [3] | GHS [11] | SGHS [12] | OHS (proposed)
f1 | 0 | 8.683062 ± 0.775134 | 8.840449 ± 0.762496 | 2.230721 ± 0.565271 | 0.000002 ± 0.000003 | 0.000001 ± 0.000002
f2 | 0 | 82.926284 ± 6.717904 | 82.548978 ± 6.341707 | 19.020813 ± 5.093733 | 0.017581 ± 0.021205 | 0.015438 ± 0.020179
f3 | 0 | 16,675,172.184717 ± 3,182,464.488466 | 17,277,654.059718 ± 2,945,544.275052 | 2,598,652.617273 ± 915,937.797217 | 621.749360 ± 583.889593 | 619.753628 ± 581.334539
f4 | 0 | 20,280.200000 ± 2,003.829956 | 20,827.733333 ± 2,175.284501 | 5,219.933333 ± 1,134.876027 | 0.100000 ± 0.305129 | 0.091036 ± 0.300141
f5 | 0 | 215,052.904398 ± 28,276.375538 | 213,812.584732 ± 28,305.249583 | 321,780.353575 ± 39,589.041160 | 37,282.096600 ± 5,913.489066 | 37,173.001346 ± 5,910.331444
f6 | 0 | 7,960.925495 ± 572.390489 | 8,301.390783 ± 731.191869 | 1,270.944476 ± 395.457330 | 35.675398 ± 86.000104 | 33.413687 ± 85.100030
f7 | 0 | 343.497796 ± 27.245380 | 343.232044 ± 25.149464 | 80.657677 ± 30.368471 | 12.353767 ± 2.63560 | 11.100003 ± 2.543010
f8 | 0 | 13.857189 ± 0.284945 | 13.801383 ± 0.530388 | 8.767846 ± 0.880066 | -0.000000 ± 0.000000 | -0.000000 ± 0.000000
f9 | 0 | 195.592577 ± 24.808359 | 204.291518 ± 19.157177 | 54.252289 ± 18.600195 | 0.027932 ± 0.009209 | 0.021349 ± 0.008312
f11 | -450 | 22,241.554607 ± 2,550.746480 | 23,026.241628 ± 2,304.787587 | 88,835.245672 ± 9,065.418923 | -449.999980 ± 0.000093 | -450.000000 ± 0.000072
f12 | -450 | 272,495.060293 ± 38,504.505752 | 274,439.336302 ± 37,300.950900 | 496,668.916387 ± 51,929.415486 | 63,251.604588 ± 12,430.053431 | 63,248.112343 ± 12,428.04301
f13 | 390 | 2,242,245,818.867268 ± 380,621,042.775803 | 2,211,121,263.779596 ± 358,676,387.353021 | 27,910,012,932.716747 ± 3,941,689,420.106002 | 781.510290 ± 293.228166 | 776.423648 ± 285.210031
f14 | -330 | 36.164513 ± 25.576559 | 36.685585 ± 25.311496 | 509.066964 ± 45.183819 | -317.225748 ± 2.732871 | -319.201033 ± 2.436781
f15 | -180 | 1,885.100054 ± 12.499888 | 1,883.499365 ± 15.485959 | 1,829.669549 ± 33.504803 | 1,006.117891 ± 35.307793 | 1,004.231340 ± 34.172839
f16 | -330 | 341.676241 ± 48.372925 | 334.747556 ± 54.693700 | 763.818874 ± 43.613654 | 66.915779 ± 55.375297 | 46.310278 ± 53.112855
Rosenbrock function and the shifted rotated Griewank's function, SGHS yields better results than OHS. It may also be noted from Table 2 that, with the increase in the dimensionality of the benchmark test functions, OHS offers significantly better results than the compared algorithms. Thus, as the dimension, and thereby the complexity, of the benchmark test functions increases, OHS offers superior results.
The convergence profiles of the fitness function value for the 30D (a) sphere function and (b) Schwefel's problem 2.22 against the NFFEs are presented in Fig. 1a, b, respectively. The HS-, IHS-, GHS-, SGHS- and OHS-based comparative convergence profiles of the fitness function values for the 30D shifted Rastrigin function against the NFFEs are presented in Fig. 2. It can be observed from these figures that the convergence profile of the proposed OHS-based optimum value for these selected test functions descends much faster than those of the other compared algorithms. This indicates that the proposed OHS-based results for these benchmark test functions are superior to those of the compared methods.
ODLHS for CEC Benchmark Test Functions
(i) Benchmark Test Functions
The proposed ODLHS algorithm is tested and evaluated on the CEC 2005 benchmark functions [16]. The CEC 2005 test suite includes twenty-five functions; five of them are unimodal problems and the other twenty are multimodal problems.
Out of these twenty-five test problems, the first fourteen are taken in the present work, and the definitions of these test functions [16] are given in Table 3. Suganthan et al. [16] may be referred to for the detailed mathematical formulae.

Fig. 1 Comparative convergence profiles of fitness function values for 30D: a sphere function, b Schwefel's problem 2.22

Fig. 2 Comparative convergence profiles of fitness function values for 30D shifted Rastrigin function

Table 3 Definition of the first fourteen CEC 2005 [16] benchmark test functions

Name | Definition | Range
f1 | Shifted sphere function | [-50, 100]
f2 | Shifted Schwefel's problem 1.2 | [-50, 100]
f3 | Shifted rotated high conditioned elliptic function | [-50, 100]
f4 | Shifted Schwefel's problem 1.2 with noise in fitness | [-50, 100]
f5 | Schwefel's problem 2.6 with global optimum on bounds | [-50, 100]
f6 | Shifted Rosenbrock's function | [-50, 100]
f7 | Shifted rotated Griewank's function without bounds | [-300, 600]
f8 | Shifted rotated Ackley's function with global optimum on bounds | [-16, 32]
f9 | Shifted Rastrigin's function | [-2.5, 5]
f10 | Shifted rotated Rastrigin's function | [-2.5, 5]
f11 | Shifted rotated Weierstrass function | [-0.25, 0.5]
f12 | Schwefel's problem 2.13 | [-50, 100]
f13 | Expanded extended Griewank's plus Rosenbrock's function | [-3, 1]
f14 | Expanded rotated extended Scaffer's F6 | [-50, 100]
(ii) Parameter Setting
The best chosen value of Jr is 0.3. The other
parameters of this algorithm are taken from Pan
et al. [15].
(iii) Discussion on CEC 2005 Benchmark Test Function Optimization
The results obtained by adopting the proposed ODLHS algorithm of the present work on the first fourteen CEC 2005 benchmark test functions are presented in Table 4 on a sample basis. Each benchmark test function is run 25 independent times. The results yielded by the proposed ODLHS algorithm are compared to those of SGHS [17], DLHS [17] and IGHS [17]. In Table 4, an entry shown as '–' means that the result for this function was not reported in the original reference. In this table, the SGHS, DLHS and IGHS results are taken directly from the respective references. The results show that the
proposed opposition-based strategy in the DLHS algorithm also performs well on unimodal and multimodal functions.

Table 4 Results of all the algorithms taken over 30 runs (for each function, the first row is the mean and the second row the SD)

Benchmark function | Problem size | SGHS [17] | DLHS [17] | IGHS [17] | ODLHS (proposed)
f1 | 30 | 0 | 2.44e-07 | 1.19e-07 | 1.11e-07
   |    | 0 | 1.33e-06 | 1.76e-08 | 1.04e-08
f2 |    | 18.90 | 2.84e+03 | 1.85e-06 | 1.11e-06
   |    | 17.25 | 1.77e+03 | 5.75e-07 | 5.18e-07
f3 |    | – | 3.19e+06 | 2.21e+06 | 2.11e+06
   |    |   | 1.72e+06 | 1.21e+06 | 1.01e+06
f6 |    | 2.12e+03 | 3.78e+03 | 1.67e+03 | 1.44e+03
   |    | 3.97e+03 | 4.84e+03 | 3.58e+03 | 3.14e+03
f9 |    | 1.39e-01 | 1.58 | 6.66e-01 | 6.48e-01
   |    | 3.50e-01 | 1.50 | 7.99e-01 | 7.89e-01
f10 |   | 97.86 | – | 62.02 | 62.00
    |   | 30.03 |   | 15.56 | 15.14
f1 | 50 | – | 7.48e-10 | 3.36e-07 | 3.36e-07
   |    |   | 8.04e-10 | 5.16e-08 | 5.11e-08
f2 |    | – | 7.98e+03 | 2.99e-04 | 2.41e-04
   |    |   | 3.29e+03 | 2.77e-04 | 2.48e-04
f3 |    | – | 1.01e+07 | 2.70e+06 | 2.59e+06
   |    |   | 3.09e+06 | 1.07e+06 | 1.01e+06
f6 |    | – | 3.08e+03 | 2.93e+03 | 2.78e+03
   |    |   | 4.11e+03 | 3.79e+03 | 3.19e+03
f9 |    | – | 2.57 | 3.24e-01 | 3.11e-01
   |    |   | 2.50 | 6.98e-01 | 6.48e-01
f1 | 100 | 2.00e-05 | – | 2.98e-06 | 2.49e-06
   |     | 9.30e-05 |   | 3.87e-07 | 3.78e-07
f2 |     | 6.37e+04 | – | 5.81e+04 | 5.11e+04
   |     | 1.24e+04 |   | 8.87e+03 | 8.41e+03
f6 |     | 391.51 | – | 138.91 | 138.14
   |     | 293.23 |   | 62.59 | 62.59
f9 |     | 12.77 | – | 16.54 | 16.11
   |     | 2.73 |   | 3.31 | 3.11
f10 |    | 396.92 | – | 371.98 | 370.17
    |    | 55.37 |   | 57.28 | 57.14
f11 |    | – | – | – | 5.14e+0
f12 |    | – | – | – | 2.61e+4
f13 |    | – | – | – | 1.98e+0
f14 |    | – | – | – | 3.14e+0

The SGHS-, DLHS-, IGHS- and the
proposed ODLHS-based comparative convergence profiles of the fitness function values for the 30D shifted sphere function, shifted Schwefel's problem 1.2 and shifted rotated high-conditioned elliptic function are depicted in Figs. 3, 4 and 5, in order. It may be observed from these figures that blending the opposition strategy into an improved variant of HS such as DLHS (termed ODLHS) offers a faster convergence profile of the fitness function value for these selected test functions as compared to the other algorithms.
Conclusion
In this paper, the concept of OBL has been employed to accelerate the basic HS algorithm as well as an improved variant of HS, the DLHS algorithm. The notion of OBL has been utilized to introduce opposition-based HM initialization and opposition-based generation jumping. By embedding these two steps within the HS framework, the OHS and ODLHS algorithms are proposed in this paper. The proposed algorithms are tested on benchmark test functions. The simulation results demonstrate the effectiveness and robustness of the proposed algorithms in solving benchmark test functions. Moreover, the results yielded by the proposed algorithms have been compared to those reported in the recent state-of-the-art literature. The comparison of the numerical results and the convergence profiles of the optimum objective function values confirms the effectiveness and superiority of the two approaches proposed in the current article.
References
1. R. Sarker, M. Mohammadian, X. Yao, Evolutionary Optimization (Kluwer Academic Publishers, New York, 2003)
2. Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation, 76(2), 60 (2001)
3. M. Mahdavi, M. Fesanghary, E. Damangir, An improved har-
mony search algorithm for solving optimization problems, Appl.
Math. Comput, 188, 1567 (2007)
4. K.S. Lee, Z.W. Geem, A new structural optimization method
based on the harmony search algorithm, Comput. Struct, 82(9/
10), 781 (2004)
Fig. 3 Comparative convergence profiles of fitness function values for 30D shifted sphere function
Fig. 4 Comparative convergence profiles of fitness function values for 30D shifted Schwefel's problem 1.2
Fig. 5 Comparative convergence profiles of the fitness function values for the 30D shifted rotated high-conditioned elliptic function
5. J.H. Kim, Z.W. Geem, E.S. Kim, Parameter estimation of the
nonlinear Muskingum model using harmony search, J. Am. Water
Resour. Assoc, 37, 1131 (2001)
6. Z.W. Geem, J.H. Kim, G.V. Loganathan, Harmony search opti-
mization: application to pipe network design, Int. J. Model.
Simul, 22(2), 125 (2002)
7. Z.W. Geem, K.S. Lee, Y. Park, Application of harmony search to
vehicle routing, Am. J. Appl. Sci, 2(12), 1552 (2005)
8. Z.W. Geem, Optimal cost design of water distribution networks
using harmony search, Eng. Optim, 38, 259 (2006)
9. Z.W. Geem, Optimal scheduling of multiple dam system using
harmony search algorithm (Springer, New York, 2007), 316
10. S. Das, A. Mukhopadhyay, A. Roy, A. Abraham, B.K. Panigrahi,
Exploratory power of the harmony search algorithm: analysis and
improvements for global numerical optimization, IEEE Trans.
Syst. Man Cybern. B Cybern, 41(1), 89 (2011)
11. M.G.H. Omran, M. Mahdavi, Global-best harmony search, Appl.
Math. Comput, 198, 643 (2008)
12. Q.-K. Pan, P.N. Suganthan, M.F. Tasgetiren, J.J. Liang, A self-
adaptive global best harmony search algorithm for continuous
optimization problems, Appl. Math. Comput, 216, 830 (2010)
13. H.R. Tizhoosh, Opposition-based learning: a new scheme for machine
intelligence, Proceedings of International Conference Computing
Intelligence Modeling Control and Automation, 1, 695 (2005)
14. S. Rahnamayan, H.R. Tizhoosh, M.M.A. Salama, Opposition-based differential evolution, IEEE Trans. Evol. Comput, 12(1), 64 (2008)
15. Q.-K. Pan, P.N. Suganthan, J.J. Liang, M.F. Tasgetiren, A local-
best harmony search algorithm with dynamic subpopulations,
Eng. Optim, 42(2), 101 (2010)
16. P.N. Suganthan, N. Hansen, J.J. Liang, K. Deb, Y.-P. Chen, A.
Auger, S. Tiwari, Problem Definitions and Evaluation Criteria for
the CEC 2005 Special Session on Real-Parameter Optimization,
Technical Report, Nanyang Technological University, Singapore,
May 2005 and KanGAL Report #2005005, IIT Kanpur, India
17. Mohammed El-Abd, An improved global-best harmony search
algorithm, Appl. Math. Comput, 222, 94 (2013)