
Page 1: [IEEE 2013 Sixth International Conference on Advanced Computational Intelligence (ICACI) - Hangzhou, China (2013.10.19-2013.10.21)] 2013 Sixth International Conference on Advanced

2013 Sixth International Conference on Advanced Computational Intelligence October 19-21, 2013, Hangzhou, China

A Differential Evolution Algorithm with Minimum Distance Mutation Operator

Wenchao Yi, Xinyu Li, Liang Gao, and Yunqing Rao

Abstract-This paper proposes a novel mutation operator named minimum distance mutation for the differential evolution (DE) algorithm. We aim to improve the local search ability of the algorithm in the mutation operation. During mutation, the selected base particle is compared with its nearest particle, and the better of the two is used for the mutation operation; in this way, neighborhood information can be exploited. A set of well-known benchmark functions is used to test and evaluate the performance of the proposed algorithm. The experimental results show that the proposed algorithm achieves a clear improvement.

Keywords: differential evolution algorithm (DE), minimum distance mutation strategy, local search

I. INTRODUCTION

Differential evolution (DE), first introduced by Storn and Price [1], is a stochastic direct search method. The DE algorithm applies mutation, crossover and selection operators to generate the new generation, a scheme it inherits from evolutionary algorithms (EAs). DE is an efficient global search algorithm, which makes it applicable in various fields [2][3][4][5][6][7].

Despite several attractive features, DE sometimes does not perform as expected. Some research [8] shows that DE may stop progressing toward the global optimum even though the population has not converged to a local optimum. Both the local and global search abilities of the DE algorithm can therefore be improved, and many researchers have proposed improvements since the algorithm was introduced.

Some focus on the optimization of the control parameters F and CR, since the optimization outcomes differ greatly under different control parameter settings. The most recent trend in this direction is self-adaptive optimization of the control parameters [8]. Brest [9] proposed a self-adapting control parameter strategy, noting that better control parameter values lead to better individuals, which are more likely to survive and produce offspring during the evolutionary process. In [10], Pan suggested a self-adaptive strategy in which each particle has its own trial vector generation strategy; both the mutation strategy and the control parameters are self-adaptive for each particle.

Manuscript received May 27, 2013. This work is supported by the National Basic Research Program of China (973 Program) under grant no. 2011CB706804 and the National Science and Technology Major Project of China under grant no. 2011ZX04015-011-07.

The authors are with the State Key Laboratory of Digital Manufacturing Equipment & Technology, Huazhong University of Science and Technology, Wuhan, China (Emails: [email protected], [email protected], [email protected], [email protected]).

978-1-4673-6343-3/13/$31.00 ©2013 IEEE

Other researchers have innovated on the mutation operators. An efficient mutation operator can effectively enhance the optimization outcomes of the differential evolution algorithm. In [11], Zhou introduced an effective intersect mutation operator that sorts particles into a better part and a worse part: the base vector in the mutation operator is selected from the worse part, while the difference vectors are chosen from the better part. Piotrowski [12] proposed an adaptive memetic DE algorithm with a global and neighborhood-based mutation operator, which shows a good balance between exploitation and exploration.

Research on the mutation operator shows that it can yield better outcomes [11]. In this paper, we propose a new, effective mutation operator to enhance the local search ability of the DE algorithm; the resulting algorithm is called the minimum distance differential evolution algorithm (MDDE). The proposed operator finds the particle nearest to the base vector and chooses whichever of the two has the better fitness function value.

The remainder of this paper is organized as follows. Section II presents the proposed MDDE, states and formulates the original differential evolution algorithm, and introduces the minimum distance mutation operator. Section III describes the benchmark test function set in detail and provides the computational results and comparisons. Finally, Section IV concludes the paper.

II. THE PROPOSED MDDE

A. Introduction of MDDE

In this section, we propose a novel way to enhance the local search ability of DE: a mutation operator based on the minimum distance around the randomly generated particle r0.

The framework of the proposed MDDE can be described as follows:

Page 2: [IEEE 2013 Sixth International Conference on Advanced Computational Intelligence (ICACI) - Hangzhou, China (2013.10.19-2013.10.21)] 2013 Sixth International Conference on Advanced

generate the initial population
evaluate the fitness of each individual
while the termination criterion is not satisfied do
  for i = 1 : NP do
    compute the distances from particle i and find the particle mi with the minimum distance dis[i][mi]
    generate mutually different random particles r0, r1, r2
    for j = 1 : N do
      if f(X_r0) < f(X_mi) then
        V_i^j = X_r0^j + F * (X_r1^j - X_r2^j)
      else
        V_i^j = X_mi^j + F * (X_r1^j - X_r2^j)
      end if
    end for
    for j = 1 : N do
      if rand(0,1) < CR or j is equal to j_rand then
        U_i^j = V_i^j
      else
        U_i^j = X_i^j
      end if
    end for
  end for
  for i = 1 : NP do
    evaluate U_i
    if f(U_i) <= f(X_i) then
      replace X_i with U_i
    end if
  end for
end while
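As a concrete illustration, the framework above can be translated into a short, self-contained Python sketch. The sphere objective, the bounds and the parameter values below are illustrative assumptions, not the paper's experimental setup; following the pseudocode, the neighbour mi is the particle nearest to the target particle i.

```python
import math
import random

def sphere(x):
    # Illustrative objective (benchmark f1): sum of squares, minimum 0 at the origin.
    return sum(v * v for v in x)

def mdde(f, N=10, NP=30, F=0.5, CR=0.9, max_gen=200, lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)
    # Initialization: uniform within the prescribed lower/upper bounds.
    pop = [[lo + rng.random() * (hi - lo) for _ in range(N)] for _ in range(NP)]
    fit = [f(x) for x in pop]
    for _ in range(max_gen):
        trials = []
        for i in range(NP):
            # Nearest neighbour mi of particle i (Euclidean distance).
            mi = min((j for j in range(NP) if j != i),
                     key=lambda j: math.dist(pop[i], pop[j]))
            # Three mutually different random indices, all different from i.
            r0, r1, r2 = rng.sample([j for j in range(NP) if j != i], 3)
            # Minimum distance mutation: base is the better of X_r0 and X_mi.
            base = r0 if fit[r0] < fit[mi] else mi
            v = [pop[base][j] + F * (pop[r1][j] - pop[r2][j]) for j in range(N)]
            # Binomial crossover; j_rand guarantees one component from v.
            j_rand = rng.randrange(N)
            u = [v[j] if (rng.random() < CR or j == j_rand) else pop[i][j]
                 for j in range(N)]
            trials.append(u)
        # Greedy survivor selection (minimization).
        for i in range(NP):
            fu = f(trials[i])
            if fu <= fit[i]:
                pop[i], fit[i] = trials[i], fu
    return min(fit)

best = mdde(sphere)
```

Note that, as an idiomatic simplification, r0, r1, r2 are drawn once per target particle rather than once per dimension.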

B. The Introduction of DE

In general, the original DE algorithm has three operations: mutation, crossover and selection. The original DE algorithm can be described as follows:

The initial generation is randomly generated from the feasible region with a uniform distribution between the prescribed lower and upper bounds X_i^L and X_i^U, i = 1, 2, ..., NP, where G denotes the generation index. The procedure ends when NP particles have been generated. The initialization procedure can be described as

X_i = X_i^L + rand[0,1) * (X_i^U - X_i^L)

In fact, many combinations of vectors can be used to generate the mutation vector; a generalized formula can be written as:

v_{i,G} = X_{r0,G} + (F/N) * sum_{n=0}^{N-1} (X_{r(2n+1),G} - X_{r(2n+2),G})

In this formula, X_{r0,G} is the base vector, and it should be mutually different from the other vectors in the formula. The most commonly used mutation strategy, which we apply in this paper, is represented by the following formula:

v_{i,G} = X_{r0,G} + F * (X_{r1,G} - X_{r2,G})
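The DE/rand/1 mutation above can be sketched in a few lines of Python; the toy population and the index draws below are illustrative assumptions:

```python
import random

def rand1_mutation(pop, F, rng):
    # Pick three mutually different random indices r0, r1, r2.
    r0, r1, r2 = rng.sample(range(len(pop)), 3)
    # v = X_r0 + F * (X_r1 - X_r2), applied component-wise.
    return [pop[r0][j] + F * (pop[r1][j] - pop[r2][j])
            for j in range(len(pop[0]))]

rng = random.Random(0)
pop = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(6)]
v = rand1_mutation(pop, F=0.5, rng=rng)
```

With F = 0 the mutant collapses onto the base vector X_r0, which makes the role of the scaled difference vector easy to see.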


The crossover operation is used to enhance the diversity of the population. We build the trial vector according to a random probability. The most commonly used binomial crossover is defined as:

u_{i,G} = { v_{i,G}, if rand(0,1) < CR
          { x_{i,G}, otherwise
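The binomial crossover can be sketched as follows. The j_rand component, which forces at least one gene to come from the mutant (as in the MDDE framework pseudocode earlier), is included here as a common DE convention:

```python
import random

def binomial_crossover(target, mutant, CR, rng):
    n = len(target)
    j_rand = rng.randrange(n)  # guaranteed mutant component
    # Take the mutant gene with probability CR (always at j_rand),
    # otherwise keep the target gene.
    return [mutant[j] if (rng.random() < CR or j == j_rand) else target[j]
            for j in range(n)]

rng = random.Random(42)
x = [0.0] * 5  # illustrative target vector
v = [1.0] * 5  # illustrative mutant vector
u = binomial_crossover(x, v, CR=0.9, rng=rng)
```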

For minimization problems, DE uses survivor selection, where the target vector X_{i,G} competes against the trial vector, and the one with the smaller fitness survives into the next generation. The strategy is shown as follows:

X_{i,G+1} = { X_{i,G}, if f(X_{i,G}) < f(U_{i,G})
            { U_{i,G}, otherwise

Mutation, crossover and selection repeat during the iteration procedure until the termination conditions of the algorithm are satisfied. The termination conditions can be set as a maximum number of iterations or a required solution precision.
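The survivor selection above amounts to a one-line greedy comparison; a minimal sketch with an illustrative objective:

```python
def select(target, trial, f):
    # Keep the target only if it is strictly better; otherwise the trial survives.
    return target if f(target) < f(trial) else trial

f = lambda x: sum(v * v for v in x)  # illustrative minimization objective
survivor = select([2.0, 2.0], [1.0, 1.0], f)
```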

C. Minimum Distance Mutation Strategy

The most frequently cited basic mutation operators are DE/rand/1/bin and DE/best/1/bin. DE/rand/1/bin has a stronger exploration capability. DE/best/1/bin is an exploitative and fast mutation strategy, but it relies on the best solution found so far. Compared to DE/best/1/bin, DE/rand/1/bin is more explorative and less greedy.

DE/rand/1/bin:

v_{i,G} = X_{r0,G} + F * (X_{r1,G} - X_{r2,G})

DE/best/1/bin:

v_{i,G} = X_{best,G} + F * (X_{r1,G} - X_{r2,G})

Choosing a mutation operator that is efficient enough is frequently a difficult problem. In [7], Pan stated that 'DE/rand/1/bin' and 'DE/rand/2/bin' share the similarity that they both have a slow convergence speed with superior exploration ability; therefore, they perform better than 'DE/best/2/bin' when solving multimodal problems. 'DE/current-to-best/1/bin' is a rotation-invariant operator. In [13], M. M. Ali mentions that if the base vector particle is the best of the three randomly generated particles, its vicinity can be explored.

These strategies mostly use random particles or the best particle to mutate, which, to some extent, ignores the inner relationship between particles. A suitable operator should enhance both the local and the global search ability of the mutation operation. We propose the minimum distance mutation operator, which considers the neighborhood particles around X_{i,G}. The evolutionary operator is based on the DE/rand/1/bin operator. When choosing the base particle X_{r0,G}, we also find its nearest particle X_{mi,G}. We then compare the fitness function values of the randomly generated particle X_{r0,G} and X_{mi,G}, and choose the better one for the mutation operation. The reasoning behind the improved operator is that if the nearest particle X_{mi,G} has a better fitness value than X_{r0,G}, using it can, to some extent, be regarded as the local search


process of the DE algorithm. The time complexity of calculating the distance between particles is O(N*NP*logNP).

Minimum distance mutation operator:

v_{i,G} = { X_{r0,G} + F * (X_{r1,G} - X_{r2,G}), if f(X_{r0,G}) < f(X_{mi,G})
          { X_{mi,G} + F * (X_{r1,G} - X_{r2,G}), otherwise
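A minimal sketch of the base-vector choice above, assuming Euclidean distance and a tiny illustrative population; here the neighbour is taken around X_r0, as in this subsection's description:

```python
import math

def min_distance_base(pop, fit, r0):
    # Nearest neighbour of X_r0 by Euclidean distance, excluding r0 itself.
    mi = min((j for j in range(len(pop)) if j != r0),
             key=lambda j: math.dist(pop[r0], pop[j]))
    # Keep whichever of the two has the better (smaller) fitness.
    return r0 if fit[r0] < fit[mi] else mi

pop = [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]]  # illustrative particles
fit = [0.0, 0.01, 50.0]                     # illustrative fitness values
base = min_distance_base(pop, fit, r0=1)
```

In this toy example, particle 1's nearest neighbour is particle 0, which also has the better fitness, so particle 0 becomes the base vector.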

III. EXPERIMENTAL RESULTS

Twenty-one benchmark functions from Yao et al. [14] were used to test the performance of MDDE. For each test function, S denotes the range of the variables, D denotes the dimensionality, and f_min denotes the global optimum, i.e. the minimum function value. All parameter settings are the same as in [14] to ensure a fair comparison. In [14], the benchmark functions are described as follows. Functions f1-f5 are unimodal. Function f6 is the step function, which has one minimum and is discontinuous. Function f7 is a noisy quartic function. Functions f8-f13 are multimodal functions in which the number of local minima increases exponentially with the problem dimension, and functions f14-f21 are low-dimensional functions with only a few local minima. The first thirteen benchmark functions are high-dimensional problems with 30 dimensions, while the last eight are low-dimensional problems with 2 or 4 dimensions according to the settings. The proposed MDDE algorithm is coded in C++ and run on a computer with a 1.8 GHz Intel(R) Pentium(R) Dual CPU E2160.

Motivated by the mutation operator in Section II, we can presume that properly raising the probability that the base particle is selected from the better part of the current generation (we define the top 50% of particles in the current generation as the better part) can improve the algorithm's performance. In DE/rand/1, the probability that the base vector is chosen from the better part is 50%. We first attempted to raise this probability to 100% by sorting the particles of the current generation. Since the computational cost of sorting the particles is high, we also attempted to select the better of two randomly generated particles as the base vector, which lowers the probability to 75%. The experimental results of these two mutation operators, DE/rand/1/sorting and rDE, are shown in Table 2.
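The 75% figure above can be checked by a quick simulation: if two particles are drawn uniformly at random, the better of the two lies in the top half of the population unless both draws miss it, i.e. with probability 1 - 0.5^2 = 0.75 (very slightly higher when sampling without replacement). The population size below is an illustrative assumption:

```python
import random

rng = random.Random(0)
NP = 100                      # illustrative population size
ranks = list(range(NP))       # rank 0 = best particle
trials = 100_000
hits = 0
for _ in range(trials):
    a, b = rng.sample(ranks, 2)
    if min(a, b) < NP // 2:   # better of the two lies in the top 50%
        hits += 1
prob = hits / trials
```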

We incorporate the proposed minimum distance framework into the original DE algorithm and compare the performance of 'DE/rand/1', 'DE/best/1', 'DE/rand/1/sorting', 'rDE' and MDDE. For the selected mutation operators, the probability that the base vector particle is chosen from the better part ranges from 50% to 100%; we mark the probability in brackets below. The referred mutation operators can be described as follows:

DE/rand/1 (50%):

v_{i,G} = X_{r0,G} + F * (X_{r1,G} - X_{r2,G})

DE/best/1 (100%):

v_{i,G} = X_{best,G} + F * (X_{r1,G} - X_{r2,G})

DE/rand/1 with sorted r0 (DE/rand/1/sorting) (100%):

v_{i,G} = X_{r0,G} + F * (X_{r1,G} - X_{r2,G})

(r0 is one of the top 50% best particles of the population)

DE with a randomly chosen better particle as base (rDE) (75%):

v_{i,G} = { X_{r0,G} + F * (X_{r1,G} - X_{r2,G}), if f(X_{r0,G}) < f(X_{wr1,G})
          { X_{wr1,G} + F * (X_{r1,G} - X_{r2,G}), otherwise

DE with the minimum distance mutation operator (MDDE) (75%):

v_{i,G} = { X_{r0,G} + F * (X_{r1,G} - X_{r2,G}), if f(X_{r0,G}) < f(X_{mi,G})
          { X_{mi,G} + F * (X_{r1,G} - X_{r2,G}), otherwise

The initial population is generated randomly within the range.

For the proposed DE algorithm and the original DE, the control parameter settings of F and CR are given in brackets in Table 2. The maximum number of generations for each benchmark function is shown in Table 1.

Table 1: Maximum generation number of benchmark functions

Maximum number of generations | Benchmark functions
1500   | f1, f6, f10, f12, f13
2000   | f2, f11
3000   | f7
4000   | f15
5000   | f3, f4, f9
9000   | f8
20000  | f5
100    | f14, f16, f17, f18, f19, f20, f21

B. Experimental Results

The average results of 20 independent runs are shown in Table 2. On functions f10-f13, MDDE shows performance superior to the other algorithms. MDDE outperforms DE/rand/1 except on f4 and f5, DE/best/1 except on f1-f4, DE/rand/1/sorting except on f4, f5, f8 and f9, and rDE except on f4 and f5. A comparison of the experimental results clearly shows this pattern: on most of the unimodal and multimodal functions, MDDE exhibits either an improvement or equal performance. This implies that the local search based on the minimum distance is effective in enhancing the performance of the DE algorithm, and that properly raising the probability that the base vector is chosen from the better part can improve the performance of the algorithm. Comparing the four algorithms with the proposed algorithm separately: the proposed algorithm clearly saves the sorting time required by DE/rand/1/sorting; compared with DE/best/1, the global search ability is enhanced by using the minimum distance particle; and the local search ability is improved compared with DE/rand/1.


Table 2: Experimental results (mean, best and standard deviation of 20 independent runs)

DE/rand/1 (F=0.5, CR=0.9)
f     Mean        Best        Stdv
f1    4.98E-14    9.11E-15    5.18E-14
f2    5.34E-10    1.42E-10    2.75E-10
f3    2.96E-12    6.46E-13    2.39E-12
f4    2.83E-03    6.97E-11    5.44E-01
f5    6.42E-30    0           1.85E-29
f6    0           0           0
f7    4.29E-03    2.03E-03    1.44E-03
f8    -11725.6    -12569.5    1316.79
f9    84.867      23.257      35.059
f10   5.81E-08    2.99E-08    1.69E-08
f11   2.35E-20    0           2.69E-20
f12   5.05E-15    3.87E-16    5.32E-15
f13   4.65E-12    4.58E-13    8.46E-12
f14   0.998004    0.998004    7.16E-18
f15   3.154E-04   3.075E-04   5.52E-05
f16   -1.03163    -1.03163    5.63E-12
f17   0.397887    0.397887    6.07E-18
f18   3           3           6.65E-15
f19   -10.1532    -10.1532    8.87E-07
f20   -10.4029    -10.4029    3.91E-07
f21   -10.5364    -10.5364    1.45E-06

DE/best/1 (F=0.7, CR=0.9)
f     Mean        Best        Stdv
f1    5.63E-38    6.09E-40    1.29E-37
f2    6.74E-27    5.47E-28    1.48E-26
f3    9.78E-34    3.63E-35    1.54E-33
f4    2.89E-14    1.47E-16    7.14E-14
f5    0.930212    0           1.68615
f6    0.7         0           0.737111
f7    4.27E-03    1.85E-03    1.43E-03
f8    -10469.8    -11187.7    485.586
f9    45.0311     27.8588     9.2403
f10   2.02E-01    7.55E-15    4.61E-01
f11   9.86E-04    7.31E-03    0
f12   1.70E-01    1.57E-32    2.59E-01
f13   3.30E-03    1.35E-32    5.04E-03
f14   1.32804     0.998004    3.60458
f15   4.601E-04   3.075E-04   9.35E-04
f16   -20.3064    -1.03163    2.60E-18
f17   0.397887    0.397887    0
f18   3           3           9.64E-16
f19   -6.88262    -10.1532    8.72263
f20   -9.47436    -10.4029    5.71999
f21   -9.6108     -10.5364    6.47618

DE/rand/1/sorting (F=0.5, CR=0.9)
f     Mean        Best        Stdv
f1    2.29E-27    1.66E-28    3.01E-27
f2    4.74E-19    9.93E-20    3.61E-19
f3    6.04E-23    9.51E-25    9.51E-23
f4    6.60E-04    7.20E-07    7.27E-04
f5    9.55E-30    0           1.60E-29
f6    0           0           0
f7    2.80E-03    8.19E-04    9.21E-04
f8    -12357      -12569.5    172.667
f9    10.6648     3.97984     14.9761
f10   3.25E-01    7.55E-15    6.22E-01
f11   8.12E-03    0           3.16E-03
f12   4.07E-28    3.66E-30    8.64E-28
f13   5.74E-28    3.21E-29    4.05E-28
f14   0.998004    0.998004    2.15E-16
f15   3.749E-04   3.094E-04   1.97E-05
f16   -1.03163    -1.03163    3.48E-17
f17   0.397887    0.397887    9.65E-18
f18   3           3           2.84E-15
f19   -10.1532    -10.1532    2.93E-12
f20   -10.4029    -10.4029    5.27E-13
f21   -10.5364    -10.5364    2.13E-13

rDE (F=0.5, CR=0.9)
f     Mean        Best        Stdv
f1    7.05E-25    9.37E-26    4.45E-24
f2    2.99E-17    1.18E-17    8.36E-17
f3    1.43E-20    6.27E-22    9.80E-20
f4    7.92E-04    9.39E-07    5.97E-03
f5    0           1.73E-28    4.41E-29
f6    0           0           0
f7    2.56E-03    1.07E-03    4.95E-03
f8    -12340.5    -12569.5    379.007
f9    4.9748      69.5481     12.503
f10   2.04E-13    8.24E-14    5.87E-13
f11   5.75E-04    0           0.01191
f12   6.76E-26    4.70E-27    6.40E-25
f13   5.18E-24    2.43E-25    2.99E-23
f14   0.998004    0.998004    5.86E-16
f15   3.075E-04   3.075E-04   1.69E-09
f16   -1.03163    -1.03163    3.24E-16
f17   0.397887    0.397887    1.35E-12
f18   3           3           4.47E-15
f19   -10.1532    -10.1532    6.62E-10
f20   -10.4029    -10.4029    4.65E-11
f21   -10.5364    -10.5364    2.65E-10

MDDE (F=0.5, CR=0.9)
f     Mean        Best        Stdv
f1    6.16E-34    1.45E-35    9.57E-34
f2    1.30E-24    2.25E-25    9.06E-25
f3    3.09E-24    1.19E-26    9.69E-24
f4    5.30E-03    1.28E-04    6.83E-03
f5    0           1.46E-29    3.22E-29
f6    0           0           0
f7    2.52E-03    1.54E-03    3.90E-04
f8    -11968.5    -12569.5    328.8
f9    4.975       3.71585     3.876
f10   2.91E-16    1.91E-17    1.24E-16
f11   1.63E-20    0           2.03E-20
f12   4.11E-31    5.64E-33    5.10E-31
f13   4.64E-29    1.42E-34    2.23E-28
f14   0.998004    0.998004    9.68E-19
f15   3.075E-04   3.0745E-04  1.42E-09
f16   -1.03163    -1.03163    2.98E-15
f17   0.397887    0.397887    0
f18   3           3           0
f19   -10.1532    -10.1532    6.23E-10
f20   -10.4029    -10.4029    3.75E-10
f21   -10.5364    -10.5364    4.58E-10

C. Convergence Curve of MDDE and DE Algorithms

The figures below show the convergence curves on four benchmark functions. We compare MDDE with the above four DE algorithms on f1, f3, f6 and f8.

Fig. 1. The convergence curve of function f1

Fig. 2. The convergence curve of function f3


Fig. 3. The convergence curve of function f6

Fig. 4. The convergence curve of function f8

Figs. 1 and 2 demonstrate that MDDE has a faster convergence speed and better results than DE/rand/1, DE/rand/1/sorting and rDE. For the step function f6, MDDE shows convergence speed and performance superior to the other four algorithms in Fig. 3. In Fig. 4, MDDE applied to the complex multimodal function f8 obtains a better result than DE/rand/1, DE/best/1 and rDE. This establishes that a local search based on the minimum distance is effective in enhancing the precision and efficiency of the DE algorithm.

IV. CONCLUSION

In this paper, a new mutation operator, the minimum distance mutation operator, has been proposed and shown to be efficient on the test benchmark functions. By replacing X_{r0,G} with the better particle in its neighborhood, the local search ability is improved. Compared with the DE algorithm, the proposed algorithm obtains better outcomes on the majority of the test benchmark functions. An interesting direction for future work is to combine the MDDE algorithm with other efficient DE algorithms that optimize the control parameters.


References

[1] R. Storn and K. Price, "Differential evolution---A simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, pp. 341-359, Dec. 1997.

[2] B. V. Babu and R. Angira, "Modified differential evolution for optimization of non-linear chemical processes," Computers and Chemical Engineering, vol. 30, pp. 989-1002, May 2006.

[3] S. M. Almeida-Luz, M. A. Vega-Rodriguez, J. A. Gomez-Pulido, and J. M. Sanchez-Perez, "Differential evolution for solving the mobile location management," Applied Soft Computing, vol. 11, pp. 410-427, Jan. 2011.

[4] A. K. Qin and P. N. Suganthan, "Self-adaptive differential evolution algorithm for numerical optimization," in IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005, pp. 1785-1791.

[5] T. W. Liao, "Two hybrid differential evolution algorithms for engineering design optimization," Applied Soft Computing, vol. 10, pp. 1188-1199, Sept. 2010.

[6] G. Onwubolu and D. Davendra, "Scheduling flow shops using differential evolution algorithm," European Journal of Operational Research, vol. 2, pp. 674-692, Jun. 2006.

[7] Q. K. Pan, L. Wang, L. Gao, and W. D. Li, "An effective hybrid discrete differential evolution algorithm for the flow shop scheduling with intermediate buffers," Information Sciences, vol. 181, pp. 668-685, Feb. 2011.

[8] H. J. Fu, D. T. Ouyang, and J. M. Xu, "A self-adaptive differential evolution algorithm for binary CSPs," Computers and Mathematics with Applications, vol. 62, pp. 2712-2718, Oct. 2011.

[9] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, pp. 646-657, Dec. 2006.

[10] Q. K. Pan, P. N. Suganthan, L. Wang, L. Gao, and R. Mallipeddi, "A differential evolution algorithm with self-adapting strategy and control parameters," Computers and Operations Research, vol. 38, pp. 394-408, Jan. 2011.

[11] Y. Z. Zhou, X. Y. Li, and L. Gao, "A differential evolution algorithm with intersect mutation operator," Applied Soft Computing, vol. 13, pp. 390-401, Jan. 2013.

[12] A. P. Piotrowski, "Adaptive memetic differential evolution with global and local neighborhood-based mutation operators," Information Sciences, vol. 241, pp. 164-194, Aug. 2013.

[13] M. M. Ali, "Differential evolution with generation differentials," Journal of Computational and Applied Mathematics, vol. 235, pp. 2205-2216, Feb. 2010.

[14] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, pp. 82-102, Jul. 1999.