

2013 Sixth International Conference on Advanced Computational Intelligence

October 19-21, 2013, Hangzhou, China

An Effective Improvement of JADE for Real-parameter Optimization

Chunjiang Zhang and Liang Gao

Abstract-Although metaheuristics cannot guarantee to find the global optimum, they are indeed efficient, especially for problems that are very difficult to optimize with traditional optimization methods. The differential evolution (DE) algorithm is one of the most competitive metaheuristics, and adaptive DE with optional external archive (JADE) is an excellent DE variant. Based on an analysis of the shortcomings of JADE, an effective improvement of JADE is put forward in this paper. Two parameters of JADE can be reinitialized, and two new mutation strategies are added in the improved JADE. The 28 benchmark problems of the competition on real-parameter single objective optimization at the 2013 IEEE Congress on Evolutionary Computation (CEC 2013) are used to test the performance of the proposed algorithm. The results, compared with DE/rand/1 and the original JADE, show that the improvement is effective.

Keywords-Differential evolution; Improved JADE; Real-parameter optimization

I. INTRODUCTION

Differential evolution (DE) is a powerful and simple stochastic algorithm for real-parameter optimization. DE has become a bright star in the sky of nature-inspired metaheuristics since it was first put forward by R. Storn and K. V. Price in 1995 [1], and DE or DE-based algorithms have always been near the top in previous IEEE Congress on Evolutionary Computation (CEC) competitions. For example, classical DE ranked second and SaDE (self-adaptive DE) ranked third on the 10-D problems in the CEC 2005 competition on real-parameter optimization. The review paper [3] pointed out four reasons why researchers have been looking at DE as an attractive optimization tool: 1) DE is simple; 2) DE is powerful; 3) the number of control parameters in DE is very small; 4) the space complexity of DE is low. For higher accuracy and efficiency, many DE variants, such as SaDE [4], jDE [5], DEGL (DE with global and local neighborhoods) [6], JADE (adaptive DE with optional external archive) [7], and CoDE (DE with composite trial vector generation strategies and control parameters) [8], have been proposed in recent years. Although these DE variants are more powerful than classical DE in some respects, they still have shortcomings and there is still room for improvement. This paper focuses on JADE. After analyzing the drawbacks of JADE, a simple, straightforward and effective improvement is made (IJADE).

This research work is supported by the National Basic Research Program of China (973 Program) under Grant No. 2011CB706804 and the Natural Science Foundation of China (NSFC) under Grant No. 51121002.

The authors are with the State Key Laboratory of Digital Manufacturing Equipment & Technology, Huazhong University of Science and Technology, Wuhan 430074, PR China (e-mails: zh [email protected]; [email protected]).


The benchmark problems of the competition on real-parameter single objective optimization in CEC 2013 are used to test the performance of our improved algorithm.

The remainder of this paper is organized as follows: DE and JADE are introduced in Section II. Section III provides the analysis of the drawbacks of JADE and the improvement of JADE. Experimental results are presented and discussed in Section IV. In Section V, conclusions are drawn.

II. DE AND JADE

A. DE

Like other evolutionary algorithms, DE has a population of NP individuals. Each individual is a D-dimensional vector representing a candidate solution. The generation index in DE is denoted by G = 0, 1, ..., Gmax. The ith individual of the population at the current generation is denoted as

$\vec{X}_{i,G} = [x_{i,1,G}, x_{i,2,G}, \ldots, x_{i,D,G}]$ (1)

At the beginning of DE, an initial population is generated by uniformly randomizing individuals within the feasible search space. For example, the jth component of the ith individual is generated at G = 0 as

$x_{i,j,0} = x_{j,\min} + rand_{i,j}(0,1) \cdot (x_{j,\max} - x_{j,\min})$ (2)

where $x_{j,\min}$ and $x_{j,\max}$ are the minimum and maximum bounds of the jth dimension and $rand_{i,j}(0,1)$ is a uniformly distributed random number lying between 0 and 1.

After initialization, DE performs three steps at each generation G: mutation, crossover and selection. In mutation, DE creates a mutant vector $\vec{v}_{i,G} = (v_{i,1,G}, v_{i,2,G}, \ldots, v_{i,D,G})$ for each population member $\vec{x}_{i,G}$. The five most frequently used mutation strategies are listed as follows.

1) "DE/randl1" : v"G = X/l,G + F; -( X/2,G -Xr3,G ) (3)

2) "DE/bestl1" : v"G = Xbesl,G + F; -(Xrl,G -Xr2,G) (4)

3) "DE/current-to-bestl1":

V"G =X"G +F;-(XbeS"G -x"G)+F;-(Xrl,G -X/2,G) (5)

4) "DE/best/2":

Vi,G = Xbesl,G + F; -(Xrl,G -X/2,G) + F; -(X/3,G -Xr4,G) (6)

Page 2: [IEEE 2013 Sixth International Conference on Advanced Computational Intelligence (ICACI) - Hangzhou, China (2013.10.19-2013.10.21)] 2013 Sixth International Conference on Advanced

5) "DE/rand/2":

V"e = Xrl,e + p, .(Xr2,e -Xr3,e) + p, .(Xr4,e -Xr5,e) (7)

In the above equations, r1, r2, r3, r4 and r5 are distinct integers randomly selected from the range [1, NP] and are also different from i, and $\vec{x}_{best,G}$ is the best individual in the current population. In classic DE, the parameter $F_i = F$ is a fixed positive value called the scaling factor, which amplifies the difference vectors such as $(\vec{x}_{r1,G} - \vec{x}_{r2,G})$. In many improved DE variants, taking JADE as an example, each individual i has its own scaling factor $F_i$.

After mutation, a crossover operator is applied to $\vec{x}_{i,G}$ and $\vec{v}_{i,G}$ to generate a trial vector $\vec{u}_{i,G} = (u_{i,1,G}, u_{i,2,G}, \ldots, u_{i,D,G})$. The DE family can use two kinds of crossover schemes: exponential and binomial crossover. In this paper, only the binomial crossover is used. Under the binomial crossover scheme, the trial vector is obtained as

$u_{i,j,G} = \begin{cases} v_{i,j,G}, & \text{if } rand_{i,j}(0,1) \le CR_i \text{ or } j = j_{rand} \\ x_{i,j,G}, & \text{otherwise} \end{cases}$ (8)

where $i = 1, 2, \ldots, NP$, $j = 1, 2, \ldots, D$, $j_{rand}$ is a randomly chosen integer in [1, D], and $rand_{i,j}(0,1)$ is a uniformly distributed random number between 0 and 1. In many adaptive DE variants, a $CR_i$ is associated with each individual and it may vary from generation to generation.

The selection operator is performed after crossover to determine whether the target or the trial vector survives to the next generation. For a minimization problem, it is expressed as follows:

$\vec{x}_{i,G+1} = \begin{cases} \vec{u}_{i,G}, & \text{if } f(\vec{u}_{i,G}) \le f(\vec{x}_{i,G}) \\ \vec{x}_{i,G}, & \text{otherwise} \end{cases}$ (9)

The above three steps are repeated generation after generation until a termination criterion is satisfied.
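To make the three steps concrete, the following C++ sketch runs classic DE/rand/1/bin on a minimization problem; the sphere objective, the fixed settings F = 0.5 and CR = 0.9, the bounds and all identifiers are illustrative assumptions, not part of the original paper.

// Minimal sketch of classic DE/rand/1/bin: initialization (Eq. (2)), mutation (Eq. (3)),
// binomial crossover (Eq. (8)) and selection (Eq. (9)). Illustrative settings only.
#include <vector>
#include <random>
#include <algorithm>
#include <cstdio>

int main() {
    const int NP = 50, D = 10, Gmax = 100;
    const double F = 0.5, CR = 0.9, xmin = -5.0, xmax = 5.0;
    std::mt19937 rng(2013);
    std::uniform_real_distribution<double> u01(0.0, 1.0);
    std::uniform_int_distribution<int> uidx(0, NP - 1), ujdx(0, D - 1);

    auto sphere = [](const std::vector<double>& x) {
        double s = 0.0; for (double v : x) s += v * v; return s;
    };

    // Initialization, Equation (2): uniform sampling inside the box constraints.
    std::vector<std::vector<double>> pop(NP, std::vector<double>(D));
    std::vector<double> fit(NP);
    for (int i = 0; i < NP; ++i) {
        for (int j = 0; j < D; ++j) pop[i][j] = xmin + u01(rng) * (xmax - xmin);
        fit[i] = sphere(pop[i]);
    }

    for (int G = 0; G < Gmax; ++G) {
        for (int i = 0; i < NP; ++i) {
            // Pick r1, r2, r3 distinct from each other and from i.
            int r1, r2, r3;
            do { r1 = uidx(rng); } while (r1 == i);
            do { r2 = uidx(rng); } while (r2 == i || r2 == r1);
            do { r3 = uidx(rng); } while (r3 == i || r3 == r1 || r3 == r2);

            // Mutation, Equation (3), combined with binomial crossover, Equation (8).
            std::vector<double> trial = pop[i];
            int jrand = ujdx(rng);
            for (int j = 0; j < D; ++j) {
                if (j == jrand || u01(rng) <= CR)
                    trial[j] = pop[r1][j] + F * (pop[r2][j] - pop[r3][j]);
            }

            // Selection, Equation (9): greedy in-place replacement (a common simplification).
            double ft = sphere(trial);
            if (ft <= fit[i]) { pop[i] = trial; fit[i] = ft; }
        }
    }
    std::printf("best f = %g\n", *std::min_element(fit.begin(), fit.end()));
    return 0;
}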

B. JADE

JADE is a DE variant that implements a new mutation strategy, "DE/current-to-pbest", with an optional external archive and updates its control parameters in an adaptive manner. Its earlier version was presented by Zhang and Sanderson at CEC 2007 [9]. The journal article on JADE was published in 2009 [7]. JADE is very competitive among DE variants. In the article on another DE variant, CoDE [8], the experimental results of five DE variants (JADE, jDE, SaDE, EPSDE and CoDE) on the CEC 2005 benchmark problems for real-parameter single objective optimization showed that JADE ranked second and was only slightly inferior to CoDE. The key points of JADE are introduced as follows.

1) DE/current-to-pbest/1

In JADE, a new mutation strategy named DE/current-to-pbest/1 was put forward. DE/current-to-pbest/1 is actually generalized from DE/current-to-best/1 in order to enhance its global exploration ability. The new mutation strategy has two schemes. In the first one, without the optional archive, a mutation vector is generated as

$\vec{v}_{i,G} = \vec{x}_{i,G} + F_i \cdot (\vec{x}^{p}_{best,G} - \vec{x}_{i,G}) + F_i \cdot (\vec{x}_{r1,G} - \vec{x}_{r2,G})$ (10)

where $\vec{x}^{p}_{best,G}$ is randomly chosen from the top 100p% individuals, with $p \in (0,1]$. Each individual i has its own $F_i$, which is updated at each generation through an adaptive mechanism introduced later. Comparing Equation (5) and Equation (10), the only difference is that $\vec{x}_{best,G}$ carries a superscript p in Equation (10). If the top 100p% contains only the single best individual, the two mutation strategies are identical.

In the second scheme, DE/current-to-pbest/1 with archive, a mutation vector is generated in the following manner:

$\vec{v}_{i,G} = \vec{x}_{i,G} + F_i \cdot (\vec{x}^{p}_{best,G} - \vec{x}_{i,G}) + F_i \cdot (\vec{x}_{r1,G} - \tilde{\vec{x}}_{r2,G})$ (11)

where $\tilde{\vec{x}}_{r2,G}$ is randomly selected from the union of the current population P and an archive population A. The archive A is initialized to be empty. Then, after each generation, the parent solutions that fail in the selection process are added to the archive. If the size of the archive exceeds a given threshold, some individuals are randomly removed from it.
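A minimal C++ sketch of how one mutant vector under Equation (11) could be formed is given below; the function name, the data layout and the index-drawing details are assumptions made only for illustration.

// Sketch of forming one DE/current-to-pbest/1 mutant with archive, Equation (11).
// All identifiers (pop, archive, fit, ...) are illustrative assumptions.
#include <vector>
#include <random>
#include <algorithm>
#include <numeric>
#include <cmath>

std::vector<double> mutate_current_to_pbest(
    const std::vector<std::vector<double>>& pop,      // current population P
    const std::vector<std::vector<double>>& archive,  // external archive A
    const std::vector<double>& fit,                   // objective values of P
    int i, double Fi, double p, std::mt19937& rng) {
    const int NP = (int)pop.size(), D = (int)pop[i].size();

    // x^p_best: pick uniformly among the ceil(p * NP) best individuals of P.
    std::vector<int> order(NP);
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(),
              [&](int a, int b) { return fit[a] < fit[b]; });
    int top = std::max(1, (int)std::ceil(p * NP));
    int pbest = order[std::uniform_int_distribution<int>(0, top - 1)(rng)];

    // r1 from P (distinct from i); r2 from the union P and A (distinct from i and r1).
    int r1;
    do { r1 = std::uniform_int_distribution<int>(0, NP - 1)(rng); } while (r1 == i);
    int unionSize = NP + (int)archive.size(), r2;
    do { r2 = std::uniform_int_distribution<int>(0, unionSize - 1)(rng); } while (r2 == i || r2 == r1);
    const std::vector<double>& xr2 = (r2 < NP) ? pop[r2] : archive[r2 - NP];

    std::vector<double> v(D);
    for (int j = 0; j < D; ++j)
        v[j] = pop[i][j] + Fi * (pop[pbest][j] - pop[i][j]) + Fi * (pop[r1][j] - xr2[j]);
    return v;
}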

2) Parameter Adaptation

At each generation, the scaling factor $F_i$ of each individual $\vec{x}_{i,G}$ is generated according to a Cauchy distribution with location parameter $\mu_F$ and scale parameter 0.1,

$F_i = randc_i(\mu_F, 0.1)$ (12)

and is then truncated to 1 if $F_i > 1$ or regenerated if $F_i \le 0$. $S_F$ denotes the set of successful scaling factors in each generation. $\mu_F$ is updated at the end of each generation as follows:

$\mu_F = (1 - c) \cdot \mu_F + c \cdot mean_L(S_F)$ (13)

where $mean_L(S_F)$ is the Lehmer mean

$mean_L(S_F) = \dfrac{\sum_{F \in S_F} F^2}{\sum_{F \in S_F} F}$ (14)

The crossover probability $CR_i$ of each individual i is generated according to a normal distribution with mean $\mu_{CR}$ and standard deviation 0.1,

$CR_i = randn_i(\mu_{CR}, 0.1)$ (15)

and is then truncated to [0, 1]. The mean $\mu_{CR}$ is initialized to 0.5 and updated at the end of each generation as

$\mu_{CR} = (1 - c) \cdot \mu_{CR} + c \cdot mean_A(S_{CR})$ (16)

where $mean_A(\cdot)$ is the arithmetic mean and $S_{CR}$ is the set of all successful crossover probabilities $CR_i$ at the current generation.
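The following C++ sketch, written under the assumption of a simple vector-based representation of the successful parameter sets, shows one possible implementation of the sampling and updating rules in Equations (12) to (16); the helper names are illustrative.

// Sketch of JADE's parameter adaptation, Equations (12)-(16). Illustrative names only.
#include <vector>
#include <random>
#include <algorithm>

// F_i ~ Cauchy(location = muF, scale = 0.1), truncated to 1 if F_i > 1,
// regenerated if F_i <= 0.
double sample_F(double muF, std::mt19937& rng) {
    std::cauchy_distribution<double> cauchy(muF, 0.1);
    double F;
    do { F = cauchy(rng); } while (F <= 0.0);
    return std::min(F, 1.0);
}

// CR_i ~ Normal(mean = muCR, sd = 0.1), truncated to [0, 1].
double sample_CR(double muCR, std::mt19937& rng) {
    std::normal_distribution<double> normal(muCR, 0.1);
    return std::min(1.0, std::max(0.0, normal(rng)));
}

// End-of-generation updates using the successful values S_F and S_CR.
void update_means(double& muF, double& muCR, double c,
                  const std::vector<double>& SF, const std::vector<double>& SCR) {
    if (SF.empty() || SCR.empty()) return;   // nothing succeeded this generation
    double sum = 0.0, sumSq = 0.0;
    for (double F : SF) { sum += F; sumSq += F * F; }
    double meanL = sumSq / sum;              // Lehmer mean, Equation (14)
    double meanA = 0.0;
    for (double CR : SCR) meanA += CR;
    meanA /= (double)SCR.size();             // arithmetic mean
    muF  = (1.0 - c) * muF  + c * meanL;     // Equation (13)
    muCR = (1.0 - c) * muCR + c * meanA;     // Equation (16)
}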


III. THE IMPROVED JADE

Although JADE is an outstanding algorithm, there is still room for improvement. The analysis of the deficiencies of JADE and the corresponding improvement is presented below.

A. Reinitialization of $\mu_F$ and $\mu_{CR}$

Firstly, the parameter adaptation mechanism might fail when the problem is very hard to optimize. If no individual is updated at a generation, the set of successful scaling factors $S_F$ and the set of successful crossover probabilities $S_{CR}$ will be empty. Then $\mu_F$ and $\mu_{CR}$ cannot be updated either. This situation is not considered in [7]. However, when solving the CEC 2013 real-parameter single objective benchmark functions with JADE, it happens often, especially for multimodal and composition functions. For example, because some composition functions have different properties around different local optima, different parameters are needed. The probability of generating a $CR_i$ value less than 0.6 is only about 0.0013 under a normal distribution with mean $\mu_{CR} = 0.9$ and standard deviation 0.1, and the probability of generating a $CR_i$ value less than 0.5 is nearly zero. If $\mu_{CR}$ is 0.9 and it cannot be updated unless a $CR_i$ value below 0.5 succeeds, then $\mu_{CR}$ can never be updated and the result cannot be improved any further. In order to avoid this situation, the improved JADE reinitializes $\mu_{CR}$ and $\mu_F$ when certain conditions are met. The details of the implementation are given in the last part of this section.
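As a rough check on the figure quoted above, the probability of drawing a $CR_i$ value below 0.6 from a normal distribution with mean 0.9 and standard deviation 0.1 follows from the standard normal CDF $\Phi$:

$P(CR_i < 0.6) = \Phi\!\left(\dfrac{0.6 - 0.9}{0.1}\right) = \Phi(-3) \approx 0.00135$

and likewise $P(CR_i < 0.5) = \Phi(-4) \approx 3.2 \times 10^{-5}$, which is why a stalled $\mu_{CR}$ around 0.9 almost never produces the small $CR_i$ values that would be needed to move it.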

B. Newly Added Mutation Strategies

Secondly, there is only one mutation strategy in JADE. The DE/current-to-pbest/1 mutation strategy in JADE is in general hard to replace with the other frequently used mutation strategies in Equations (3) to (7). This does not mean, however, that the new mutation strategy beats all others on all problems. Obviously, if JADE fails on a problem, it may perform better when another mutation strategy is adopted. It is also known that many DE variants (such as SaDE [4], EPSDE [10] and CoDE [8]) that use several mutation strategies have good performance. So we try to add more mutation strategies to JADE. The first mutation strategy added to JADE is DE/rand/1, the most commonly used strategy in the literature. It has no bias toward any special search direction, which leads to better perturbation than DE/current-to-pbest/1. The second mutation strategy added to JADE is DE/rand/2/dir [11], which incorporates objective function information to guide the direction of the mutation vector as

$\vec{v}_{i,G} = \vec{x}_{r1,G} + \dfrac{F_i}{2} \cdot (\vec{x}_{r1,G} - \vec{x}_{r2,G} - \vec{x}_{r3,G})$ (17)

where r1, r2 and r3 are distinct integers different from i, randomly selected from the range [1, NP] and subject to $f(\vec{x}_{r1,G}) < \min\{f(\vec{x}_{r2,G}), f(\vec{x}_{r3,G})\}$. According to the conclusion made by Mezura-Montes et al. in [12], where eight different DE schemes were compared over a test suite of 13 benchmark problems, DE/rand/2/dir remained the most competitive and was slightly faster in reaching the global optimum on multimodal and non-separable functions. Another reason for choosing it is that, when all the individuals are at almost the same position, only DE/rand/2/dir can still generate a new mutation vector.
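The sketch below shows one way the DE/rand/2/dir mutation of Equation (17) could be implemented, with the three indices relabeled so that $\vec{x}_{r1,G}$ has the best objective value; the function name and data layout are assumptions for illustration only.

// Sketch of the DE/rand/2/dir mutation used as the second added strategy, Equation (17):
// v = x_r1 + (F_i / 2) * (x_r1 - x_r2 - x_r3), with f(x_r1) better than f(x_r2) and f(x_r3).
// Identifiers are illustrative assumptions.
#include <vector>
#include <random>
#include <utility>

std::vector<double> mutate_rand2dir(const std::vector<std::vector<double>>& pop,
                                    const std::vector<double>& fit,
                                    int i, double Fi, std::mt19937& rng) {
    const int NP = (int)pop.size(), D = (int)pop[i].size();
    std::uniform_int_distribution<int> uidx(0, NP - 1);

    // Draw three distinct indices different from i.
    int r1, r2, r3;
    do { r1 = uidx(rng); } while (r1 == i);
    do { r2 = uidx(rng); } while (r2 == i || r2 == r1);
    do { r3 = uidx(rng); } while (r3 == i || r3 == r1 || r3 == r2);

    // Relabel so that r1 points at the best of the three (the "directed" part).
    if (fit[r2] < fit[r1]) std::swap(r1, r2);
    if (fit[r3] < fit[r1]) std::swap(r1, r3);

    std::vector<double> v(D);
    for (int j = 0; j < D; ++j)
        v[j] = pop[r1][j] + 0.5 * Fi * (pop[r1][j] - pop[r2][j] - pop[r3][j]);
    return v;
}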

C. Implementation of the Improved JADE

In the improved JADE, MSch is a parameter that determines the mutation strategy. Three values, 0, 1 and 2, which represent the three mutation strategies, can be chosen for MSch. MSch is initialized to 0, which means DE/current-to-pbest/1 is retained at the beginning, and the parameter adaptation mechanism of JADE is still used. If at least one individual is updated at every generation, IJADE is identical to JADE.

In order to determine when to use the reinitialization of $\mu_F$ and $\mu_{CR}$ and the new mutation strategies, three control variables, NoF1, NoF2 and NoF3, are added to IJADE. The three control variables have corresponding thresholds TH1, TH2 and TH3. NoF1 is the number of successive generations in which the update fails. Only when NoF1 reaches its threshold TH1 is the reinitialization of $\mu_F$ and $\mu_{CR}$ executed; one of the three values 0.167, 0.5 and 0.833 is selected at random for the reinitialization. NoF2 records the number of times NoF1 reaches TH1. If NoF2 reaches its threshold TH2, a value different from the current one is selected for MSch. For example, if MSch = 0, it is set to 1 or 2 at random, which means DE/rand/1 or DE/rand/2/dir becomes the new mutation strategy. Similarly, NoF3 records the number of times NoF2 reaches TH2. If NoF3 reaches TH3, the reinitialization of $\mu_F$ and $\mu_{CR}$ and the new mutation strategies have had no effect; in this case it is likely that the population is stuck in a local optimum, so the population is reinitialized in IJADE.

The pseudocode of the improved JADE with archive is shown below. It is based on the original version in [7]. The major differences lie in lines 08 to 13 and lines 34 to 46.

Line#  Procedure of Improved JADE with Archive
01  Begin
02    Set μCR = 0.5; μF = 0.5; NoF1 = NoF2 = NoF3 = 0
03    Set MSch = 0; A = ∅
04    Create a random population {x_{i,0} | i = 1, 2, ..., NP}
05    For G = 1 to Gmax
06      S_F = ∅; S_CR = ∅
07      For i = 1 to NP
08        If MSch = 0
09          v_{i,G} = x_{i,G} + F_i·(x^p_{best,G} − x_{i,G}) + F_i·(x_{r1,G} − x̃_{r2,G})
10        Else if MSch = 1
11          v_{i,G} = x_{r1,G} + F_i·(x_{r2,G} − x_{r3,G})
12        Else
13          v_{i,G} = x_{r1,G} + (F_i/2)·(x_{r1,G} − x_{r2,G} − x_{r3,G})
14        End If
15        Generate j_rand = randint(1, D)
16        For j = 1 to D
17          If j = j_rand or rand(0, 1) < CR_i
18            u_{i,j,G} = v_{i,j,G}
19          Else
20            u_{i,j,G} = x_{i,j,G}
21          End If
22        End For
23        If f(x_{i,G}) ≤ f(u_{i,G})
24          x_{i,G+1} = x_{i,G}
25        Else
26          x_{i,G+1} = u_{i,G}; x_{i,G} → A; CR_i → S_CR; F_i → S_F
27        End If
28      End For
29      Randomly remove solutions from A so that |A| ≤ NP
30      If S_F ≠ ∅ and S_CR ≠ ∅
31        NoF1 = NoF2 = NoF3 = 0
32        μF = (1 − c)·μF + c·mean_L(S_F)
33        μCR = (1 − c)·μCR + c·mean_A(S_CR)
34      Else
35        NoF1++
36        If NoF1 = TH1
37          Reinitialize μF and μCR; NoF1 = 0; NoF2++
38          If NoF2 = TH2
39            Select an MSch value different from the current one
40            NoF2 = 0; NoF3++
41            If NoF3 = TH3
42              Reinitialize the population; NoF3 = 0
43            End If
44          End If
45        End If
46      End If
47    End For
48  End

IV. NUMERICAL EXPERIMENT

The benchmark problems for the competition on real-parameter single objective optimization in CEC 2013 [13] are used to test the improved JADE. The test suite includes 28 benchmark functions: 5 unimodal, 15 basic multimodal and 8 composition functions. The description and the code of the functions can be downloaded from http://www.ntu.edu.sg/home/EPNSugan/index_files/CEC2013/CEC2013.htm.

The dimension D takes two values, 10 and 30. For each algorithm and each test function at each dimension, 51 independent runs are performed. The termination criterion is the maximum number of function evaluations (MaxFES), which equals 10000·D. The function error value $f_i(\vec{x}) - f_i(\vec{x}^*)$ is recorded in each run, and an error less than 1e-8 is taken as 0. C++ is used for coding.

The parameter settings of IJADE are as follows. The population size NP is set to 100 in all runs. $\mu_F$ and $\mu_{CR}$ are initialized to 0.5. The two important parameters of JADE, c, which controls the rate of parameter adaptation, and p, which determines the greediness of the mutation strategy, are set as recommended in the original paper [7]: c is 0.1 and p is 0.05. The newly added parameter MSch is initialized to 0. The three thresholds TH1, TH2 and TH3 are set to 4, 6 and 2, respectively.
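The settings above can be summarized in a small configuration sketch; the struct and helper below are illustrative assumptions, not the authors' actual code.

// Sketch of the experimental protocol and IJADE settings described in this section.
// Struct and field names are illustrative assumptions.
struct ExperimentConfig {
    int D;                          // 10 or 30
    int NP = 100;                   // population size
    double muF0 = 0.5, muCR0 = 0.5; // initial means
    double c = 0.1;                 // adaptation rate, as recommended in [7]
    double p = 0.05;                // greediness of current-to-pbest, as in [7]
    int MSch0 = 0;                  // start with DE/current-to-pbest/1
    int TH1 = 4, TH2 = 6, TH3 = 2;  // thresholds for NoF1, NoF2, NoF3
    int runs = 51;                  // independent runs per function
    long maxFES() const { return 10000L * D; }
};

// An error value below 1e-8 is reported as zero, following the CEC 2013 convention.
inline double reported_error(double f, double fstar) {
    double err = f - fstar;
    return (err < 1e-8) ? 0.0 : err;
}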

We compare five algorithms on the 28 benchmark functions with 10 and 30 dimensions: DE/rand/1, JADE without archive, JADE with archive, and the two versions of the improved JADE. The population size of all five algorithms is 100. In DE/rand/1, the scaling factor F is 0.5 and the crossover probability CR is set to 0.9; these two values are used or recommended in [2][5]. The parameters shared by JADE and IJADE are set to the same values for a fair comparison.

The means and standard deviations of the error values are shown in TABLE I. and TABLE II. IJADE with archive is compared with the four other algorithms, and the Wilcoxon rank sum test at a 0.05 significance level is conducted. In the tables, "−", "+" and "≈" mean that the performance of the corresponding algorithm is worse than, better than and similar to that of IJADE with archive, respectively. From TABLE I. it can be seen that the two versions of IJADE perform almost the same: they obtain similar results on 21 functions, and IJADE without archive performs better than the one with archive on 4 functions and worse on 3 functions. IJADE performs much better than the original JADE and DE/rand/1. It obtains better results than DE/rand/1 on 21 functions, and the two algorithms obtain similar results on 5 functions, including 4 unimodal functions. IJADE with archive performs better than JADE without archive and JADE with archive on 17 and 15 functions, respectively. In particular, the two IJADEs perform better than the other algorithms on the composition functions. The original JADE without archive and the one with archive perform better than IJADE with archive on only 2 and 3 functions, respectively. As for the 30-dimensional functions, IJADE with archive performs slightly better than IJADE without archive, and both are still much better than DE/rand/1: compared with DE/rand/1, IJADE still obtains better results on 21 functions and a worse result on only one. IJADE with archive obtains better results than JADE without archive and JADE with archive on 9 and 10 functions, while the two original JADEs perform better than IJADE with archive on 4 and 2 functions. It can be concluded that the two versions of IJADE have similar performance; they are much better than DE/rand/1 on the functions with 10 and 30 dimensions; and they perform much better than JADE on the 10-dimensional functions and better than JADE on the 30-dimensional functions. In a word, the improvement of JADE is effective.

V. CONCLUSION AND FUTURE WORK

JADE is an improved version of DE. In this paper, an improved JADE is proposed by reinitializing two control parameters and reselecting the mutation strategy. The reinitialization and reselection are managed by three newly added control variables that monitor the execution of the algorithm. The improvement is simple and straightforward, and the numerical experiments show that the improved JADE is effective.


Designing a better control method for the reinitialization of $\mu_F$ and $\mu_{CR}$ and the reselection of mutation strategies is part of the future work. Moreover, this pattern of improving an algorithm can be applied to other algorithms, such as other DE variants and PSO variants.

REFERENCES

[1] R. Storn and K. Price, "Differential evolution: A simple and efficient adaptive scheme for global optimization over continuous spaces," Int. Comput. Sci. Inst., Berkeley, CA, Tech. Rep. TR-95-012, 1995.

[2] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," J. Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.

[3] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4-31, 2011.

[4] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Trans. Evol. Comput., vol. 13, no. 2, pp. 398-417, Apr. 2009.

[5] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems," IEEE Trans. Evol. Comput., vol. 10, no. 6, pp. 646-657, Dec. 2006.

[6] S. Das, A. Abraham, U. K. Chakraborty, and A. Konar, "Differential evolution using a neighborhood-based mutation operator," IEEE Trans. Evol. Comput., vol. 13, no. 3, pp. 526-553, Jun. 2009.

[7] J. Zhang and A. C. Sanderson, "JADE: Adaptive differential evolution with optional external archive," IEEE Trans. Evol. Comput., vol. 13, no. 5, pp. 945-958, Oct. 2009.

[8] Y. Wang, Z. Cai, and Q. Zhang, "Differential evolution with composite trial vector generation strategies and control parameters," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 55-66, 2011.

[9] J. Zhang and A. C. Sanderson, "JADE: Self-adaptive differential evolution with fast and reliable convergence performance," in Proc. IEEE Congr. Evol. Comput., Singapore, Sep. 2007, pp. 2251-2258.

[10] R. Mallipeddi, P. N. Suganthan, Q. K. Pan, and M. F. Tasgetiren, "Differential evolution algorithm with ensemble of parameters and mutation strategies," Applied Soft Computing, vol. 11, no. 2, pp. 1679-1696, 2011.

[11] V. Feoktistov and S. Janaqi, "Generalization of the strategies in differential evolution," in Proc. 18th IPDPS, Apr. 2004, p. 165a.

[12] E. Mezura-Montes, J. Velazquez-Reyes, and C. A. Coello Coello, "A comparative study of differential evolution variants for global optimization," in Proc. Genet. Evol. Comput. Conf., 2006, pp. 485-492.

[13] J. J. Liang, B.-Y. Qu, P. N. Suganthan, and A. G. Hernandez-Diaz, "Problem Definitions and Evaluation Criteria for the CEC 2013 Special Session and Competition on Real-Parameter Optimization," Technical Report 201212, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China, and Technical Report, Nanyang Technological University, Singapore, January 2013.

TABLE I. COMPARISON BETWEEN JADE AND IJADE OVER 51 INDEPENDENT RUNS ON 28 TEST FUNCTIONS OF 10 DIMENSIONS

Function   DE/rand/1   JADE w/o archive   JADE w archive   IJADE w/o archive   IJADE w archive
1   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00

Unimodal 2 O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OO 3 2.42E-0 I ±S. 79E-0 1- 5.32E+01±1.29E+02- 4.10E+OI±8.4IE+OI- 1.24E-OI±8.76E-01- 2.80E-03± 1.39E-02

Functions 4 O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OO 5 O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OO 6 8.93E-OS±6.3IE-07+ 5.96E+00±4.79E+00- S.OSE+00±3.74E+00- 2.S6E+00±3.82E+00- 1.98E+00±3.5IE+00 7 6.54E-05± 7.91 E-05- 8.87E-02±1.31 E-OI- 9.39E-02±1.40E-01- 7.04E-06±1.60E-05z 4.77E-06±1.37E-05 S 2.04E+0 I±S.64E-02;::o 2.03E+OI±S.S3E-02;::o 2.04E+0 I±8. 77E-02z 2.04E+OI±6.80E-02z 2.04E+OI±6.02E-02 9 5.53E+00±2.I 8E+00- 3.74E+00± 7.68E-OI- 3.S2E+00±5.85E-0 1- 1.36E+00± 1.45E+00- 6.42E-0I±S.59E-OI

10 6.09E-02±6.0SE-02- 1.83E-02±9.33E-03;::o 1.72E-02±8.76E-03+ 6.0SE-03± 7 .33E-03+ I. 78E-02± 1.24E-02 II 3.64E-0 1±5.9IE-OI- O.OOE+OO±O.OOE+OO;::o O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OOz O.OOE+OO±O.OOE+OO

Basic 12 8.31 E+00±4.38E+00- 4.45E+00± 1.36E+00- 4.59E+00± 1.39E+00- 2.92E+00±8.60E-01;::o 3.0IE+00±9.50E-OI Multimodal 13 1.14E+01±7.I 8E+00- 5.19E+00±2.37E+00- 4.S0E+00±2.ISE+00- 2.S2E+00±1.37E+00z 2.83E+00± 1.2IE+00 Functions 14 5.40E+0 1±3.59E+0 I- I.7IE-02±3.05E-02- I.22E-02±2.77E-02;::o 1.IOE-02±2.68E-02+ l.35E-02±3.IIE-02

15 1.34E+03±3.22E+02- 4.92E+02±1.08E+02- 4.74E+02±1.19E+02- 4.17E+02± 1.62E+02z 4.24E+02± 1.41 E+02 16 8.44E-OI±3.29E-OI + 1.13E+00±2.51 E-OI- I.OSE+00±1.82E-0 I z 9.SIE-OI±1.90E-OI;::o 1.03E+00±2. IOE-O I 17 1.15E+01±7.77E-OI- 1.0 I E+O I± 1.78E-15z 1.01 E+OI± 7.91 E-15z 1.01 E+OI± 1.7SE-15z 1.0 I E+O I± 1.78E-15 IS 2.91 E+01±5.S4E+00- 1.88E+01±2.00E+00- I.S6E+0 1±2.21 E+OO- 1.77E +0 1±2. OOE+OOz 1.73E+0 I± 1.65E+00 19 6.17E-OI±1.75E-OI- 3.59E-OI±4.SSE-02+ 3.50E-0 I±5.15E-02+ 3.S6E-0 I±5.44E-02;::o 3. 78E-0 I±6.36E-02 20 2.94E+00±3.04E-OI- 2.26E+00±3.S9E-0 1- 2.29E+00±3.80E-0 1- 2.04E+00±3.66E-01;::o 1.95E+00±4.S0E-0 I 21 3.88E+02±4.7IE+OI- 4.00E+02±0.00E+00- 4.00E+02±0.00E+00- 3.50E+02±7.74E+Olz 3.59E+02±7.22E+OI 22 1.33E+02±6.02E+01- 4.54E+00±5.43E+00+ 3.S3E+00±3.80E+00+ 9.7IE+00±1.26E+01+ 1.15E+0 1±2.38E+0 I 23 1.48E+03±3.06E+02- 4.97E+02± 1.59E+02- 5.04E+02± 1.47E+02- 5.06E+02± 1.56E+02- 3.59E+02± 1.49E+02

Composition 24 2. I 8E+02±3.90E+00- 2.00E+02±4.05E+00- 2.01 E+02±1.19E+01- 1.97E+02±1.0SE+Olz I.93E+02± I.S2E+0 I Functions 25 2.19E+02±1.02E+01- 2.00E+02±7.71 E+OOz 2.00E+02±5.39E+00z 2.00E+02±6.29E-01;::o 2.0 I E+02±2.04E+00

26 2.00E+02±1.02E-04- l.37E+02±4.43E+OI- 1.27E+02±3.85E+OI- I.I3E+02±2.64E+Olz I. 13E+02±2.30E+0 I 27 4.95E+02±2.96E+0 1- 3.00E+02±1.00E-OI- 3.00E+02±1.85E+00- 3.00E+02±0.00E+00z 3.00E+02±0.00E+00 2S 2.88E+02±4. 71 E+OI- 2.96E+02±2. 77E+O 1- 3.00E+02±0.00E+00- 2.77E+02±6.0SE+OI;::o 2. 76E+02±6.44E+0 I

Total   −   21   17   15    3
        +    2    2    3    4
        ≈    5    9   10   21

TABLE II. COMPARISON BETWEEN JADE AND IJADE OVER 51 INDEPENDENT RUNS ON 28 TEST FUNCTIONS OF 30 DIMENSIONS

Function   DE/rand/1   JADE w/o archive   JADE w archive   IJADE w/o archive   IJADE w archive
1   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00≈   0.00E+00±0.00E+00

Unimodal 2 1.03E+06±6.96E+05- I.3SE+04±8.22E+03- 9.61 E+03±5.S9E+03z I. 70E+04± 1.06E+04- 1.04E+04± 7.83E+03 Functions 3 9.06E+05±2.82E+06- 3.29E+05±1.0IE+06z 4.12E+05±1.60E+06;::o I.23E+05±5.5SE+05+ 3.40E+05±1.4IE+06

4 9.39E+02±5.30E+02- 6.S7E+03±1.41 E+04- 3.27E+03± 1.02E+04- 1.36E-02±2.14E-02- 4.56E-04±1.59E-03



S O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO 6 I.7SE+O 1±1.l OE+OO-- S.18E-Ol±3.66E+00-- 1.1 lE+OO±S.14E+00-- S.74E-Ol±4.33E-Ol- 2.78E-01±1.96E+00 7 9.ISE-01±1.74E+00+ 2.64E +00±2. 67E+00:::; 3.0 I E+00±4.34E+00-- l.97E +00±2. 71 E+OO+ 2.21 E+00±2.39E+00 8 2. I OE+0I±S.S8E-02:::: 2.09E+01±9.7IE-02:::: 2.09E+01±9.12E-02:::; 2.09E+0I±S.29E-02:::: 2.09E+01±6.3IE-02 9 3.02E+Ol±S.72E+00-- 2.S9E+Ol±1.81 E+OO-- 2.62E+0 1±1.62E+00-- 1.9SE+Ol±S.0IE+00+ 2.08E+Ol±S.03E+00

10 2. 77E-02± I.S4E-02:::; 6.99E-02±3.99E-02- S.36E-02±3.96E-02- 6.44E-02±3.67E-02- 2.49E-02±I.S7E-02 11 1.44E+0 1±4.S2E+00-- O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO;::o O.OOE+OO±O.OOE+OO:::; O.OOE+OO±O.OOE+OO

Basic 12 3.42E+01± l.SI E+OI- 2.SIE+01±4.SIE+00-- 2.4 7E+0 1±4.S7E+00-- 2.4 7E+01±4. 7SE+00-- 2.26E+0I±S.22E+00 Multimodal 13 6.01 E+OI± l.92E+OI- 4.89E+01± 1.39E+OI;::o 4.88E+0I±l.OSE+O1;::0 4.32E+01±l.2IE+01+ 4.70E+01±l.14E+OI Functions 14 9.97E+02±2.44E+02- 9.90E-02±3.87E-02;::o 3.63E-02±2.63E-02+ 1.34E-Ol±4.06E-02- 9.33E-02±2.S6E-02

IS 6.86E+03± 7.47E+02- 3.27E+03±3.09E+02+ 3.33E+03±2.33E+02;::o 3.42E+03±3.3SE+02- 3.36E+03±3.12E+02 16 2.44E+00±2.92E-Ol- 2.01E+00±6.63E-0 1;::0 2. 1 SE+00±4. IOE-Ol- 2.03E+00±2.94E-OI ;::0 2.04E+00±2.S4E-Ol 17 S.20E+Ol±4.87E+00-- 3.04E+Ol±1.l3E-14z 3.04E+OI±7.96E-1S;::o 3.04E+Ol±4.71E-14z 3.04E+0 1±2.39E-14 18 1.69E+02±3.28E+OI- 7. 73E+01±6. 72E+OO+ 7.7SE+01±6.S2E+00+ 8.19E+01±8.77E+00:::; 8.07E+01±7.32E+00 19 2.74E+00±6.19E-Ol- 1. S6E+00±1. 62E-0 1 + 1.91E+00±1.24E-0 1;::0 1.9SE+00±1.44E-0 1 z 1.94E+00±2.4SE-Ol 20 I.3SE+01±2.63E-0 1- 1.04E+0I±S.46E-OI;::o I.OSE+0I±S.67E-Olz I.OSE+0I±S.76E-OI;::o 1.06E+0I±S. 77E-O I 21 2.83E+02±6.97E+OI- 3.00E+02± 7.86E+OI;::o 3.00E+02±6.92E+0 I z 2.69E+02±4.77E+OI- 2.S7E+02±4.7SE+OI 22 1. I 4E+03±2.S0E+02z 1.04E+02±1.7IE+01+ l.lSE+02± 1.81 E+O 1;::0 1.18E+02±3.66E+Ol- 1.14E+02±2. 73E+0 1 23 7.33E+03± 7.1 OE+02- 3.4SE+03±3.31 E+02- 3.42E+03±3.80E+02- 3.66E+03±3.78E+02- 3.17E+03±3.S7E+02

Composition 24 2.9SE+02±6.42E+00-- 2.06E+02±6.46E+00z 2.08E+02±8.39E+00;::o 2.0SE+02±3.96E+00z 2.09E+02±1.07E+Ol Functions 2S 2.97E+02±4.38E+00-- 2. 73E+02± l.23E+OI- 2.76E+02±I.SI E+OI- 2.62E+02± l.41 E+OI;::o 2.S8E+02± 1.28E+OI

26 3.34E+02±8.22E+OI- 2.13E+02±4.08E+OI;::o 2.ISE+02±4.12E+0 I z 2.0SE+02±2.21 E+OI + 2.ISE+02±4.0IE+OI 27 1.23E+03±8.2SE+Ol- 6.43E+02±2.31 E+02- 6.S7E+02±2.ISE+02- 4.68E+02±1.66E+02+ S.10E+02±1.9IE+02 28 3.00E+02±3.76E-13:::; 3.00E+02± l.96E-13:::; 3.00E+02± l.49E-13z 3.00E+02±3.70E-13:::; 3.00E+02±3.62E-13

Total   −   21    9   10    9
        +    1    4    2    6
        ≈    6   15   16   13
