
Applied Mathematics and Computation 159 (2004) 449–457

www.elsevier.com/locate/amc

doi:10.1016/j.amc.2003.10.028

Global optimizations and tabu search based on memory

Mingjun Ji *, Huanwen Tang

Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, People’s Republic of China

* Corresponding author. E-mail address: [email protected] (M. Ji).

Abstract

Based on the idea of tabu search that Glover et al. put forward, a new tabu search, named Memory Tabu Search (MTS), is proposed for solving the multiple-minima problem of continuous functions. Two convergence theorems, which show that MTS asymptotically converges to the global optimal solutions in probability one under suitable conditions, are given. Numerical results illustrate that this algorithm is efficient, robust and easy to implement.

© 2003 Elsevier Inc. All rights reserved.

Keywords: Memory tabu search; Convergence in probability one; Multiple-minima problem

1. Introduction

Tabu search (TS) is a metaheuristic originally developed by Glover [1,2], which has been successfully applied to a variety of combinatorial optimization problems. However, very few works deal with its application and theory for the global minimization of functions depending on continuous variables. Up to now, we are aware of some works [3–6] related to the subject. TS has been used and evaluated in various contexts, such as the structure of clusters [7], molecular docking [8] and conformational energy optimization of oligopeptides [9]. Some papers discuss convergence for discrete problems [10,11]. In this paper, we propose an adaptation of TS to continuous optimization problems,


called Memory Tabu Search (MTS), and prove that MTS converges to the global optimal solution in probability one. Numerical results illustrate that this algorithm is efficient, easy to implement and open to improvement. The rest of this paper is organized as follows: In Sections 2 and 3, a full description and implementation of MTS are presented. The convergence of this search for solving the problem (P) of Section 2 is proved in Section 4. The computational results are given in Section 5. In Appendix A, the six test functions which were used for testing MTS are given.

2. Memory tabu search

Consider the following continuous global optimization problem

$$(\mathrm{P})\qquad \min f(x) \quad \text{s.t.}\ x \in X,$$

where $X$ is a compact subset of the Lebesgue measure space $(\mathbb{R}^n, L(\mathbb{R}^n), \mu)$ and $f$ is a real-valued continuous function defined on $X$. MTS for solving problem (P) is described as follows:

Step 1: Generate an initial point $x_0 \in X$; set $x^*_0 := x_0$, $k := 0$.

Step 2: If a prescribed termination condition is satisfied, stop. Otherwise generate a random vector $y$ by using the generation probability density function.

Step 3: If $f(y) \le f(x^*_k)$, set $x^*_{k+1} := y$ and $x_{k+1} := y$; else if $f(y) \le f(x_k)$, set $x_{k+1} := y$; else if $y$ does not satisfy the tabu conditions, set $x_{k+1} := y$; else set $x_{k+1} := x_k$. Go to Step 2.

The main distinction between MTS and TS lies in Step 3. Here we introduce a variable $x^*_{k+1}$ to record the best of the solutions $x_1, \ldots, x_{k+1}$ generated so far (this is the reason why our algorithm is named as it is). As we show in Section 4, this memory ensures that the algorithm converges to a global optimal solution.
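To make the control flow of Steps 1–3 concrete, here is a minimal Python sketch of the MTS loop. The helper callables `generate_neighbor`, `is_tabu` and `terminated` stand for the implementation choices of Section 3; their names and signatures are ours, not the paper's.

```python
import numpy as np

def mts(f, x0, generate_neighbor, is_tabu, terminated, max_iter=100_000):
    """Minimal sketch of Memory Tabu Search (Steps 1-3 of Section 2)."""
    x_k = np.asarray(x0, dtype=float)    # current solution x_k
    x_best = x_k.copy()                  # the memory x*_k: best solution found so far
    for k in range(max_iter):
        if terminated(f, x_best, k):     # Step 2: prescribed termination condition
            break
        y = generate_neighbor(x_k, k)    # Step 2: candidate from the generation density
        fy = f(y)
        if fy <= f(x_best):              # Step 3: improves the memorised best
            x_best, x_k = y.copy(), y
        elif fy <= f(x_k):               # improves the current point
            x_k = y
        elif not is_tabu(x_k, y):        # worse but not tabu: still accept
            x_k = y
        # otherwise x_{k+1} := x_k and we return to Step 2
    return x_best, f(x_best)
```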

3. Implementation of memory tabu search

In this section, some implementation issues of MTS are discussed. It is possible to offer alternative approaches to the implementation of MTS.

3.1. Generation of an initial solution

We randomly generate $10n$ solutions in $X$ by using the uniform distribution and select the best of these $10n$ solutions as the initial solution $x_0$.
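For illustration, this initialisation could be coded as follows; `bounds` (a list of per-coordinate lower and upper limits describing the box $X$) and the function names are our own choices.

```python
import numpy as np

def initial_solution(f, bounds, rng=None):
    """Sample 10*n uniform points in X and return the best as x0 (Section 3.1)."""
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds, dtype=float).T   # bounds: [(l_1, u_1), ..., (l_n, u_n)]
    n = len(lo)
    candidates = rng.uniform(lo, hi, size=(10 * n, n))
    values = np.array([f(x) for x in candidates])
    return candidates[np.argmin(values)]         # best of the 10*n samples
```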


3.2. Generation probability of the new y

In this paper, we generate each component $y_j$ of $y$ from the Gaussian distribution with mean $x_k^j$ and standard deviation $\sigma$, $j = 1, 2, \ldots, n$. As $k$ increases, $\sigma$ is reduced by $\sigma := \delta\sigma$, starting from $\sigma = 1$; if $\sigma < 10^{-4}$, we set $\sigma = 10^{-4}$. The decay factor $\delta$ was chosen from 0.997 to 0.999.
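A possible realisation of this move generator is sketched below; clipping the candidate back into the box $X$ is our own addition, since the paper does not specify how out-of-range components are handled.

```python
import numpy as np

def generate_neighbor(x_k, k, bounds, sigma0=1.0, delta=0.998, sigma_min=1e-4, rng=None):
    """Gaussian move: y_j ~ N(x_k[j], sigma^2) with geometrically decaying sigma (Section 3.2)."""
    rng = rng or np.random.default_rng()
    sigma = max(sigma0 * delta ** k, sigma_min)   # sigma := delta * sigma, floored at 1e-4
    lo, hi = np.asarray(bounds, dtype=float).T
    y = rng.normal(loc=x_k, scale=sigma)          # perturb every component independently
    return np.clip(y, lo, hi)                     # keep the candidate inside X (our assumption)
```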

3.3. Tabu conditions

This subsection describes the tabu conditions of MTS. A move is tested to check whether it is tabu or not. In MTS, the following three criteria are used to determine whether a move is tabu:

(1) $\|x_k - y\|$, the total distance moved;

(2) $|f(x_k) - f(y)|$, the total change in the objective function;

(3) $|f(x_k) - f(y)|/f(y)$, the percentage improvement or deterioration that results if the new move is accepted.

Thus, the new solution generated at Step 2 is considered tabu if the total distance moved at the current iteration is less than $d_1$ and the total change in the objective function is less than $d_2$, or if the percentage deterioration of the objective function is higher than a percentage $d_3$. In this paper, $d_1 = 0.1$, $d_2 = 0.005$, and $d_3$ is a number generated randomly between 0.50 and 0.75. The tabu list size $L$ is chosen from 6 to 13. Set

$$S_1 = \{y \in X \mid \|x_k - y\| < d_1\},\qquad S_2 = \{y \in X \mid |f(x_k) - f(y)| < d_2\},\qquad S_3 = \{y \in X \mid |f(x_k) - f(y)|/f(y) \cdot 100 > 100\, d_3\}.$$

It follows that the acceptance probability is

$$A = \begin{cases} 1, & f(y) \le f(x_k), \\[2pt] \mu\bigl\{X \setminus \bigcup_{k-L}^{k}(S_1 \cap S_2 \cup S_3)\bigr\}/\mu\{X\}, & f(y) > f(x_k), \end{cases}$$

which must satisfy $\bigcup_{k-L}^{k}(S_1 \cap S_2 \cup S_3) \subset X$.
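One literal reading of these rules is the test below, which compares the candidate only against the current point $x_k$; all variable names are ours. In the full method, as the union over iterations $k-L, \ldots, k$ in the acceptance probability suggests, criteria (1) and (2) would also be checked against the last $L$ accepted solutions held in the tabu list.

```python
import numpy as np

def make_tabu_test(f, d1=0.1, d2=0.005, d3_range=(0.50, 0.75), rng=None):
    """Tabu test built from the three criteria of Section 3.3 (a sketch of one reading)."""
    rng = rng or np.random.default_rng()

    def is_tabu(x_k, y):
        d3 = rng.uniform(*d3_range)                    # d3 is drawn anew between 0.50 and 0.75
        close = np.linalg.norm(x_k - y) < d1           # (1) moved less than d1
        similar = abs(f(x_k) - f(y)) < d2              # (2) objective changed less than d2
        worse = (f(y) - f(x_k)) / abs(f(y)) > d3       # (3) relative deterioration above d3
        return (close and similar) or worse

    return is_tabu
```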

3.4. Termination condition

If the value of the objective function comes within 0.1–2% of the known optimal value of the function, the termination condition is satisfied.
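For the benchmark experiments this stopping rule can be written as a relative-error test against the known optimum $f^*$ of the test function; `tol` plays the role of the 0.1–2% figure, and the iteration cap is our own safeguard rather than part of the paper.

```python
def make_termination(f_star, tol=0.01, max_iter=100_000):
    """Stop once f(x_best) is within tol of the known optimum f_star (Section 3.4)."""
    def terminated(f, x_best, k):
        if k >= max_iter:                               # safety cap, not part of the paper
            return True
        return abs(f(x_best) - f_star) <= tol * max(abs(f_star), 1e-12)
    return terminated
```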

4. Convergence of MTS

In order to prove the convergence of MTS, we introduce the following definitions and theorem [12].


Definition 4.1. Let $\xi_n$ ($n = 0, 1, \ldots$) be a sequence of random variables defined on a probability space. We say that $\{\xi_n\}$ converges in probability to the random variable $\xi$ if for any $\varepsilon > 0$,

$$\lim_{n\to\infty} P\{|\xi_n - \xi| < \varepsilon\} = 1.$$

Definition 4.2. Let $\xi_n$ ($n = 0, 1, \ldots$) be a sequence of random variables defined on a probability space. We say that $\{\xi_n\}$ converges in probability one to the random variable $\xi$ if

$$P\Bigl\{\lim_{n\to\infty} \xi_n = \xi\Bigr\} = 1,$$

or equivalently, for any $\varepsilon > 0$,

$$P\Bigl\{\bigcap_{n=1}^{\infty}\bigcup_{k\ge n} \bigl[|\xi_k - \xi| \ge \varepsilon\bigr]\Bigr\} = 0.$$

Obviously, convergence in probability one is stronger than convergence in probability.

Theorem 4.1 (Borel–Cantelli theorem). Let $A_1, A_2, \ldots$ be a sequence of events on a probability space, and set $P_k = P\{A_k\}$. Then $\sum_{k=1}^{\infty} P_k < \infty$ implies $P\{\bigcap_{n=1}^{\infty}\bigcup_{k\ge n} A_k\} = 0$. If $\sum_{k=1}^{\infty} P_k = \infty$ and the $A_k$ are independent, then $P\{\bigcap_{n=1}^{\infty}\bigcup_{k\ge n} A_k\} = 1$.

The following lemma and theorems give the global convergence property of the optimal objective value sequence induced by MTS, as described above, for solving problem (P). $f$ is supposed to have a global minimum $f^* = \min_{x\in X} f(x)$. For any $\varepsilon > 0$, let $D_0 = \{x \in X \mid |f(x) - f^*| < \varepsilon\}$ and $D_1 = X \setminus D_0$.

Lemma 4.1. Solving the problem (P) by using MTS, suppose $x^*_k \in D_1$. Let the probability of $x^*_{k+1} \in D_1$ be $q_{k+1}$ and the probability of $x^*_{k+1} \in D_0$ be $p_{k+1}$. If $y_i$, $i = 1, 2, \ldots, n$, follows the Gaussian distribution, then $q_{k+1} \le c$ for some $c \in (0, 1)$.

Proof. Let $x_{\min}$ be a global optimal solution of problem (P). Since $f$ is a continuous function, there exists $r > 0$ such that $|f(x) - f(x_{\min})| < \varepsilon/2$ whenever $\|x - x_{\min}\| \le r$. Let $Q_{x_{\min},r} = \{x \in X \mid \|x - x_{\min}\| \le r\}$. Obviously, $Q_{x_{\min},r} \subset D_0$. By the assumption $x^*_k \in D_1$, we have $f(x^*_{k+1}) \le f(x^*_k) \le f(x_k)$. Since $y_i \sim N(x_k^i, \sigma^2)$, $i = 1, 2, \ldots, n$, the generation probability density function is

$$g = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{(y_i - x_k^i)^2}{2\sigma^2}\right].$$

The acceptance probability is

$$A = \begin{cases} 1, & f(y) \le f(x_k), \\[2pt] \mu\bigl\{X \setminus \bigcup_{k-L}^{k}(S_1 \cap S_2 \cup S_3)\bigr\}/\mu\{X\}, & f(y) > f(x_k). \end{cases}$$

Obviously, $A \le 1$.


The probability of $x^*_{k+1} \in Q_{x_{\min},r}$ is

$$P\{x^*_{k+1} \in Q_{x_{\min},r}\} = P\{y \in Q_{x_{\min},r}\} = \int_{Q_{x_{\min},r}} (g \cdot A)\,dX \le \int_{Q_{x_{\min},r}} g\,dX.$$

Since $\emptyset \ne Q_{x_{\min},r} \subset D_0$, we know $0 < P\{x^*_{k+1} \in Q_{x_{\min},r}\} < 1$. Since $y$ is a continuous random variable produced by the Gaussian distribution and $Q_{x_{\min},r}$ is a bounded closed set, there exists $\bar{P}$ such that $\bar{P} = \min_{y \in Q_{x_{\min},r}} P\{y \in Q_{x_{\min},r}\}$. As a result of $Q_{x_{\min},r} \subset D_0$, we have $p_{k+1} \ge P\{x^*_{k+1} \in Q_{x_{\min},r}\} \ge \bar{P}$. Let $c = 1 - \bar{P}$; obviously $c \in (0, 1)$. Based on $q_{k+1} + p_{k+1} = 1$, we have $q_{k+1} = 1 - p_{k+1} \le 1 - \bar{P} = c < 1$, so $q_{k+1} \le c \in (0, 1)$. □

Theorem 4.2. Solving the problem (P) by using MTS, if $y_i$, $i = 1, 2, \ldots, n$, follows the Gaussian distribution, then $P\{\lim_{k\to\infty} f(x^*_k) = f^*\} = 1$. Namely, $x^*_k$ converges in probability one to the global optimal solution of problem (P).

Proof. For any $\varepsilon > 0$, let $P_k = P\{|f(x^*_k) - f^*| \ge \varepsilon\}$. If there exists $j \in \{0, 1, \ldots, k\}$ such that $x^*_j \in D_0$, then $P_k = 0$. Otherwise $x^*_j \in D_1$ for all $j \in \{0, 1, \ldots, k\}$, and by Lemma 4.1 we have

$$P_k = P\{x^*_0 \in D_1,\ x^*_1 \in D_1,\ \ldots,\ x^*_k \in D_1\} \le c^k.$$

So

$$\sum_{k=1}^{\infty} P_k \le \sum_{k=1}^{\infty} c^k = \frac{c}{1-c} < \infty.$$

Then by Theorem 4.1 we know

$$P\Bigl\{\bigcap_{n=1}^{\infty}\bigcup_{k\ge n}\bigl[|f(x^*_k) - f^*| \ge \varepsilon\bigr]\Bigr\} = 0.$$

According to Definition 4.2, we obtain the result. □

Theorem 4.3. Solving the problem (P) by using MTS, if $y_i$, $i = 1, 2, \ldots, n$, follows the uniform distribution, then $P\{\lim_{k\to\infty} f(x^*_k) = f^*\} = 1$. Namely, $x^*_k$ converges in probability one to the global optimal solution of problem (P).

5. Computational results

Using our MTS, we conducted experiments for the six test functions listed in Table 1. All test functions are multimodal functions with many local minima. Because of this characteristic, it is difficult to locate the global minima.


Table 1
Functions

Function   Dimension   Local minima   Global minima   Reference
GP         2           4              1               [13]
BR         2           3              3               [13]
Hn3        3           4              1               [13]
Hn6        6           4              1               [13]
RA         2           50             1               [3]
SH         2           760            18              [3]


The results of MTS are listed in Table 3, compared with the results of the other methods listed in Table 2. The results of MTS optimization of the test functions (Table 1) are the average outcome of 100 independent runs. The reliability is excellent: in each case 70–100% of runs have been successful (with the final result within 0.1–2% of the global minimum). With this degree of precision, the global minimum in all our test functions was isolated from local minima, so that the solution can always be refined to any desired accuracy by any local optimizer. Results for a standard set of test functions thus indicate that MTS is reliable and efficient: more so than pure random search and the multi-start method. MTS significantly reduces the amount of blind search characteristic of earlier techniques. Compared with the results of SA and TS, our algorithm performs best.

Table 3
Average number of objective function evaluations used by six methods to optimize six functions

Method   GP     BR     Hn3    Hn6      RA     SH
PRS      5125   4850   5280   18,090   5964   6700
MS       4400   1600   2500   6000     N/A    N/A
SA1      5439   2700   3416   3975     N/A    241,215
SA2      563    505    1459   4648     N/A    780
TS       486    492    508    2845     540    727
MTS      378    166    240    2709     310    261

Table 2
Global optimization methods used for performance analysis

Method   Name [Reference]
PRS      Pure random search [14]
MS       Multi-start [15]
SA1      Simulated annealing based on stochastic differential equations [15]
SA2      Simulated annealing [15]
TS       Taboo search [3]
MTS      This work


Acknowledgements

The authors would like to thank Professor Jacek Klinowski for providing the bibliographies.

Appendix A

1. GP (Goldstein–Price function: $n = 2$)

$$f_{GP}(x) = \bigl[1 + (x_1 + x_2 + 1)^2\,(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2)\bigr]\,\bigl[30 + (2x_1 - 3x_2)^2\,(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2)\bigr],\qquad -2 < x_i < 2,\ i = 1, 2.$$

The global minimum is equal to 3 and the minimum solution is $(0, -1)$. There are four local minima in the minimization region.

2. BR (Branin: $n = 2$)

$$f(x_1, x_2) = a(x_2 - bx_1^2 + cx_1 - d)^2 + e(1 - f)\cos(x_1) + e,$$

where $a = 1$, $b = 5.1/(4\pi^2)$, $c = 5/\pi$, $d = 6$, $e = 10$, $f = 1/(8\pi)$, $-5 \le x_1 \le 10$, $0 \le x_2 \le 15$, $x_{\min} = (-\pi, 12.275)$, $(\pi, 2.275)$, $(3\pi, 2.475)$, $f(x_{\min}) = 5/(4\pi)$. There are no other minima.

3. Hn (Hartman functions: $n = 3, 6$)

$$f_H(x) = -\sum_{i=1}^{4} c_i \exp\left[-\sum_{j=1}^{n} a_{ij}(x_j - p_{ij})^2\right],\qquad 0 \le x_i \le 1,\ i = 1, 2, \ldots, n.$$

For $n = 3$, the global minimum is equal to $-3.86$ and it is reached at the point (0.114, 0.556, 0.882). For $n = 6$ the minimum is $-3.32$ at the point (0.201, 0.150, 0.477, 0.275, 0.311, 0.657).

$n = 3$:

i   a_i1   a_i2   a_i3   c_i   p_i1      p_i2     p_i3
1   3      10     30     1     0.3689    0.1170   0.2673
2   0.1    10     35     1.2   0.4699    0.4387   0.7470
3   3      10     30     3     0.1091    0.8742   0.5547
4   0.1    10     35     3.2   0.03815   0.5743   0.8828


$n = 6$:

i   a_i1   a_i2   a_i3   a_i4   a_i5   a_i6   c_i
1   10     3      17     3.5    1.7    8      1
2   0.05   10     17     0.1    8      14     1.2
3   3      3.5    1.7    10     17     8      3
4   17     8      0.05   10     0.1    14     3.2

i   p_i1     p_i2     p_i3     p_i4     p_i5     p_i6
1   0.1312   0.1696   0.5569   0.0124   0.8283   0.5886
2   0.2329   0.4135   0.8307   0.3736   0.1004   0.9991
3   0.2348   0.1451   0.3522   0.2883   0.3047   0.6650
4   0.4047   0.8828   0.8732   0.5743   0.1091   0.0381

4. RA (Rastrigin function: $n = 2$)

$$g(x_1, x_2) = x_1^2 + x_2^2 - \cos(18x_1) - \cos(18x_2),\qquad -1 \le x_1, x_2 \le 1,$$

which has 50 minima in the region. The global minimum is at $x = (0, 0)$, where $f = -2$.

5. SH (Shubert function)

$$f(x_1, x_2) = \left\{\sum_{i=1}^{5} i\cos\bigl[(i+1)x_1 + i\bigr]\right\}\left\{\sum_{i=1}^{5} i\cos\bigl[(i+1)x_2 + i\bigr]\right\},\qquad -10 \le x_1, x_2 \le 10.$$

In the region the function has 760 local minima, 18 of which are global with $f = -186.7309$.
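For reference, the RA and GP functions transcribe directly into Python, and the commented lines show how they could be wired to the MTS sketches given in Sections 2 and 3; that wiring reflects our sketches, not code from the paper.

```python
import numpy as np

def rastrigin2(x):
    """RA: x1^2 + x2^2 - cos(18 x1) - cos(18 x2); global minimum -2 at (0, 0)."""
    return x[0] ** 2 + x[1] ** 2 - np.cos(18 * x[0]) - np.cos(18 * x[1])

def goldstein_price(x):
    """GP: global minimum 3 at (0, -1)."""
    x1, x2 = x
    a = 1 + (x1 + x2 + 1) ** 2 * (19 - 14 * x1 + 3 * x1 ** 2
                                  - 14 * x2 + 6 * x1 * x2 + 3 * x2 ** 2)
    b = 30 + (2 * x1 - 3 * x2) ** 2 * (18 - 32 * x1 + 12 * x1 ** 2
                                       + 48 * x2 - 36 * x1 * x2 + 27 * x2 ** 2)
    return a * b

# Example wiring with the earlier sketches:
# bounds = [(-1.0, 1.0), (-1.0, 1.0)]
# x0 = initial_solution(rastrigin2, bounds)
# is_tabu = make_tabu_test(rastrigin2)
# stop = make_termination(f_star=-2.0, tol=0.01)
# x_best, f_best = mts(rastrigin2, x0,
#                      lambda x, k: generate_neighbor(x, k, bounds),
#                      is_tabu, stop)
```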

References

[1] F. Glover, Tabu search––Part I, ORSA Journal on Computing 1 (1989) 190–206.

[2] F. Glover, Tabu search––Part II, ORSA Journal on Computing 2 (1990) 4–32.

[3] D. Cvijović, J. Klinowski, Taboo search: an approach to the multiple minima problem, Science 267 (1995) 664–666.

[4] T. Trafalis, S. Kasap, A novel metaheuristics approach for continuous global optimization,

Journal of Global Optimization 23 (2002) 171–190.

[5] K.S. Al-Sultan, M.A. Al-Fawzan, A tabu search Hooke and Jeeves algorithm for

unconstrained optimization, European Journal of Operational Research 103 (1997) 198–208.


[6] V.V. Kovačević-Vujčić, M.M. Čangalović, M.D. Ašić, L. Ivanović, M. Dražić, Tabu search methodology in global optimization, Computers and Mathematics with Applications 37 (1999) 125–133.

[7] D.J. Wales, J.P.K. Doye, Global optimization by basin-hopping and the lowest energy structures of Lennard-Jones clusters containing up to 110 atoms, Journal of Physical Chemistry A 101 (1997) 5111–5116.

[8] D.R. Westhead, D.E. Clark, C.W. Murray, A comparison of heuristic search algorithms for molecular docking, Journal of Computer-Aided Molecular Design 11 (1997) 209–228.

[9] L.B. Morales, R. Garduno-Juarez, J.M. Aguilar-Alvarado, F.J. Castro, A parallel tabu search

for conformational energy optimization of oligopeptides, Journal of Computational Chemistry

21 (2000) 147–156.

[10] S. Hanafi, On the convergence of tabu search, Journal of Heuristics 7 (2000) 47–58.

[11] F. Glover, S. Hanafi, Tabu search and finite convergence, Discrete Applied Mathematics 119

(2002) 3–36.

[12] R.G. Laha, V.K. Rohatgi, Probability Theory, John Wiley & Sons, New York, 1979.

[13] A. Törn, A. Žilinskas, Global Optimization, Lecture Notes in Computer Science 350, Springer-Verlag, 1989.

[14] R.S. Anderssen, L.S. Jennings, D.M. Ryan, Optimization, University of Queensland Press, St. Lucia, Australia, 1972.

[15] A. Dekkers, E. Aarts, Global optimizations and simulated annealing, Mathematical

Programming 50 (1991) 367–393.