
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING
Int. J. Numer. Meth. Engng 2008; 76:1869–1891
Published online 10 July 2008 in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/nme.2393

A combined global and local search method to deal with constrained optimization for continuous tabu search

Xi Chen∗,†, Jing Yang, Zhaohua Li, Daqing Tian and Zhijiang Shao

State Key Laboratory of Industrial Control Technology, Department of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China

SUMMARY

Heuristic methods, such as tabu search, are efficient for global optimization. Most studies, however, have focused on constraint-free optimization. Penalty functions are commonly used by global optimization algorithms to deal with constraints. This is sometimes inefficient, especially for equality constraints, as it is difficult to keep the global search within the feasible region purely by adding a penalty to the objective function. A combined global and local search method is proposed in this paper to deal with constrained optimization. It is demonstrated by combining the continuous tabu search (CTS) and sequential quadratic programming (SQP) methods. First, a nested inner- and outer-loop method is presented to lead the search within the feasible region. SQP, a typical local search method, is used in the inner loop to quickly solve a non-linear programming problem involving only the constraints, and provides feasible neighbors for the outer loop. CTS, in the outer loop, is used to search for the global optimum. Finally, another local search using SQP is conducted with the results of CTS as initial points to refine the global search results. Efficiency is demonstrated by a number of benchmark problems. Copyright © 2008 John Wiley & Sons, Ltd.

Received 7 August 2007; Revised 30 April 2008; Accepted 30 April 2008

KEY WORDS: constrained optimization; tabu search; SQP

1. INTRODUCTION

Tabu search (TS) is a meta-heuristic method originally developed for combinatorial optimization by Glover [1, 2]. It employs adaptive memory and responsive exploration to effectively search within

∗Correspondence to: Xi Chen, State Key Laboratory of Industrial Control Technology, Department of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China.

†E-mail: [email protected]

Contract/grant sponsor: National Key Basic Research and Development Program of China; contract/grant number: 2002CB312200
Contract/grant sponsor: National Natural Science Foundation of China; contract/grant number: 60704029
Contract/grant sponsor: 863 Program of China; contract/grant number: 2007AA04Z192


the solution space and prevent it from being trapped in a local optimum. Good performance has been demonstrated by a number of successful applications to combinatorial optimization problems, such as graph coloring [3], quadratic assignment [4], electronic circuit design [5] and scheduling [6]. In recent years, it has been employed to solve continuous optimization problems. Hu [7] may be the first one who introduced TS to continuous optimization, but the main principle of his method seems rather far from the original TS. Siarry and Berthiau [8] presented an adaptation of the original simple TS to continuous optimization. However, it could deal only with low-dimension problems, and many concepts such as intensification, diversification and aspiration level were not included. Later, Chelouah and Siarry [9] further developed an improved continuous tabu search (CTS) algorithm, where diversification and intensification strategies were introduced. Wang et al. [10] studied the selection in the central area and proposed an improved CTS algorithm. The above-mentioned methods have been successfully demonstrated on a number of continuous optimization problems. However, due to their heuristic nature, these methods are limited to constraint-free problems. A similar situation arises in other global search methods. In real-world applications, however, many continuous optimization problems are constrained. They can be defined as follows:

min f(x)
s.t.  gi(x) ≤ 0,  i = 1, ..., p
      hj(x) = 0,  j = 1, ..., q
      xlb ≤ x ≤ xub      (1)

where x is a vector of n variables (x1, x2, ..., xn) with a lower bound, xlb, and an upper bound, xub; gi(x) ≤ 0, i = 1, ..., p, are inequality constraints; and hj(x) = 0, j = 1, ..., q, are equality constraints. Up to now, several methods for overcoming the constraints associated with constrained non-linear optimization problems with evolutionary algorithms have been reported. All of these strategies can be classified into the following categories [11, 12]:
(a) Methods based on penalty functions.
(b) Methods based on preserving feasibility of solutions.
(c) Methods that make a clear distinction between feasible and infeasible solutions.
(d) Methods based on decoders.
(e) Other hybrid methods.

Among them, the most commonly used approach is based on penalty functions, which can be static [13, 14], dynamic [15, 16] or adaptive [17]. The essential idea of a penalty function is to transform a constrained problem into a sequence of unconstrained problems. But when we apply this method combined with CTS to solve problems with equality constraints, we often cannot obtain even a feasible solution after lengthy calculation, let alone the optimal solution. This is because neighbor solutions are randomly generated by adding changes to the current one, and they have a very low probability of falling into the feasible region of problems with equality constraints. Lin and Miller [18] used a direct substitution method to remove the equality constraints for CTS. It can deal with problems with simple constraints; for relatively complex problems, however, its application is difficult. To utilize the advantage of TS in solving constrained optimization problems, a proper constraint handling method is required. In this paper, a combined global and local search method to deal with constrained optimization


for CTS is proposed. It introduces sequential quadratic programming (SQP), a typical local search method, into CTS to deal with constraints. It can lead and keep the global search in the feasible region and refine the global search result.

2. PROPOSED CONSTRAINT HANDLING METHOD FOR CTS

2.1. Overview of the proposed constraint handling method

Considering the negative feature of the penalty function method in overcoming constraints, we propose a new constraint handling method for CTS by first transforming the non-linear constrained problem into a nested optimization problem. As shown in Figure 1, the first step is a nested inner- and outer-loop optimization structure. It uses the SQP method in the inner loop to quickly lead the search back into the feasible region as much as possible, and CTS in the outer loop to deal with local optimality within the feasible region. The constraints in Equation (1) are handled by constructing a sub-optimization problem in the inner loop as follows:

min G(x, w1, w2) = Σ_{j=1}^{q} w1j hj²(x) + Σ_{i=1}^{p} w2i [max{0, gi(x)}]²
s.t. xlb ≤ x ≤ xub      (2)

Figure 1. A combined global and local search structure for constrained optimizations.


where w1 and w2 are two weight vectors. In our problem, the weights are simplified to unity vectors and Equation (2) is thus simplified to

min G(x) = Σ_{j=1}^{q} hj²(x) + Σ_{i=1}^{p} [max{0, gi(x)}]²
s.t. xlb ≤ x ≤ xub      (3)

The max and squared terms ensure that the objective function is non-negative. In addition, it can be proved that G has continuous gradients if h and g have continuous gradients. If all constraints of the original problem are satisfied, the objective function value of the above optimization problem is zero; if any constraint of the original problem is violated, the objective function is larger than zero. Therefore, the optimal solution of this sub-optimization problem is also a feasible solution of the original non-linear constrained problem. Minimizing the inner-loop sub-optimization problem thus provides a guide for the search in the feasible region. We use the SQP method to solve the inner-loop sub-optimization problem, which is associated with the constraints only. Ideally, the outer-loop optimization would only need to apply the global search to the original objective function with the feasible solutions obtained by the inner-loop optimization. But in some cases the inner-loop problem constructed as Equation (3) may not be able to reach its global minimum, or takes too long to reach it, due to the complexity of the constraints. To control the time cost, we fix the maximal iteration number in the inner loop. This means that keeping the outer-loop search within the feasible region is not 100% guaranteed. Therefore, we formulate a norm penalty function in the outer-loop optimization as

min F(x) = f(x) + G(x)
s.t. xlb ≤ x ≤ xub      (4)

where f is the original objective function in Equation (1) and G is the objective function constructed in Equation (3). If the inner-loop optimization succeeds, the objective function of the outer loop is identical to the original objective function. If, in any circumstance, the inner loop does not converge to zero, the objective function of the outer-loop optimization will be penalized by adding G.
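As a concrete illustration, the inner-loop repair of Equation (3) and the penalized outer-loop objective of Equation (4) can be sketched in a few lines of Python. The constraint set below is a hypothetical example invented for illustration only (one linear equality, one quadratic inequality), not one of the paper's benchmarks, and SciPy's SLSQP routine stands in for the SQP solver.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical illustrative constraints: h(x) = x1 + x2 - 1 = 0 and
# g(x) = x1^2 + x2^2 - 1 <= 0 (not one of the paper's test problems).
def h(x):
    return np.array([x[0] + x[1] - 1.0])

def g(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0])

def G(x):
    # Equation (3): squared equality violations plus squared positive
    # parts of the inequalities, with unit weights.
    return float(np.sum(h(x)**2) + np.sum(np.maximum(0.0, g(x))**2))

def repair(x0, bounds, max_iter=50):
    # Inner loop: minimize G with an SQP-type solver (SLSQP), capped at
    # max_iter iterations, mirroring the paper's time-control strategy.
    res = minimize(G, x0, method='SLSQP', bounds=bounds,
                   options={'maxiter': max_iter})
    return res.x

def F(x, f):
    # Equation (4): outer-loop objective, penalized by any residual G.
    return f(x) + G(x)

bounds = [(-2.0, 2.0), (-2.0, 2.0)]
x_new = repair(np.array([1.5, 1.5]), bounds)
print(G(x_new))  # close to zero: the point has been pulled back to feasibility
```

Because G is zero exactly on the feasible set, a successful inner run leaves F(x) = f(x), so the outer loop effectively optimizes the original objective over feasible neighbors.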

SQP, a gradient-based local search method, is used for the inner-loop problem based on the assumption that it is sufficient for solving the problem constructed by Equation (3). This assumption rests on the following reasons. First, many of the local minima and other difficulties of the benchmark examples are caused by the combination of the objective function and the constraints. For the constraints alone, there may not be much difficulty for local search methods to obtain a feasible solution if the constraint functions are continuous. Second, even if a local minimum is encountered in solving Equation (3), the penalty term in Equation (4) will help. In this case, the method is no worse than the traditional penalty-function-based methods. Results presented in Section 3 also support the above statement.

After CTS finishes its global search in the outer loop, another SQP module is activated, as also shown in Figure 1. Taking the final CTS result as an initial point, a gradient-based local search continues to refine the result obtained by the global search. This module also helps the search reach the valley bottom, as the global search has little chance of reaching the exact bottom due to the lack of gradient guidance. Moreover, as the global search result is already close to the global optimal point, the gradient-based local search costs little time.


2.2. SQP algorithm

SQP methods [19, 20] are a class of optimization methods that solve a quadratic programming (QP) sub-problem at each iteration. Each QP sub-problem minimizes a quadratic model of a certain modified Lagrangian function subject to linear constraints. A merit function is reduced along each search direction to ensure convergence from any starting point. Given a general NLP problem as shown in Equation (1), the QP sub-problem can be defined by linearizing both the inequality and equality constraints as follows:

min_{dk ∈ Rⁿ}  (1/2) dkᵀ Hk dk + ∇f(xk)ᵀ dk
s.t. ∇hj(xk)ᵀ dk + hj(xk) = 0,  j = 1, 2, ..., q
     ∇gi(xk)ᵀ dk + gi(xk) ≤ 0,  i = 1, 2, ..., p
     xlb ≤ xk ≤ xub      (5)

where the matrix Hk is a positive-definite approximation of the Hessian matrix of the Lagrangian function, which can be updated by any of the quasi-Newton methods. The QP sub-problem can be solved by using any QP algorithm. The solution dk is used to form a new iterate:

xk+1 = xk + αk dk      (6)

where the step length parameter, αk, is determined by an appropriate line search procedure so that a sufficient decrease in a merit function is obtained. Good performance of SQP has been demonstrated in dealing with constrained optimization problems.
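To make the QP sub-problem concrete, the sketch below performs a single SQP iteration by hand on a small hypothetical problem with one equality constraint (the data are invented for illustration). With only equality constraints active, the QP of Equation (5) reduces to a linear KKT system in the step dk and the multiplier.

```python
import numpy as np

# Toy problem (illustration only): minimize f(x) = x1^2 + x2^2
# subject to h(x) = x1 + x2 - 1 = 0, starting from xk = (2, 0).
xk = np.array([2.0, 0.0])
H = 2.0 * np.eye(2)               # exact Hessian of f, positive definite
grad_f = 2.0 * xk                 # gradient of f at xk
grad_h = np.array([1.0, 1.0])     # gradient of h (constant: h is linear)
h_val = xk[0] + xk[1] - 1.0       # h(xk)

# KKT system of the equality-constrained QP in Equation (5):
#   [ H      grad_h ] [ dk  ]   [ -grad_f ]
#   [ grad_h^T   0  ] [ lam ] = [ -h(xk)  ]
KKT = np.block([[H, grad_h[:, None]],
                [grad_h[None, :], np.zeros((1, 1))]])
sol = np.linalg.solve(KKT, np.concatenate([-grad_f, [-h_val]]))
dk = sol[:2]
x_next = xk + dk                  # Equation (6) with a full step, alpha_k = 1
print(x_next)                     # [0.5 0.5]: the exact constrained minimizer
```

Since f is quadratic and h linear here, one full step lands on the exact solution; in general αk would come from a line search on the merit function.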

2.3. CTS algorithm

TS is a meta-heuristic method originally developed for combinatorial optimization by Glover. Later, a number of adaptations extended its use to continuous optimization. CTS can be summarized as follows. For the current point x, its neighborhood, H(x), is first defined following certain rules. A set of neighbor candidates is randomly generated within the neighborhood. To cover as much of the feasible region as possible, a diversification strategy is required for the neighbor generation. The next move of point x to a new point is selected among the neighbor candidates, normally by comparing the objective function. A tabu list is created during the iteration to store points recently visited so as to avoid being trapped in a local minimum. Points in the tabu list are prohibited from selection as the next move. An aspiration criterion is also used to release some points in the tabu list during the iteration. The procedure is repeated until a certain termination criterion is satisfied. Some details of the developed algorithm are presented below.

2.3.1. Neighbor generation. Considering the diversification strategy, the neighbor space of the current solution x, as shown in Figure 2, is partitioned by a set of concentric hyper-rectangles Hi(x, hi−1, hi) with radii h0, h1, ..., hk such that

Hi(x, hi−1, hi) = {x′ | hi−1,j ≤ |x′j − xj| < hi,j, xlb,j < x′j < xub,j, j = 1, 2, ..., n},  i = 1, 2, ..., k      (7)
H0(x, h0) = {x′ | |x′j − xj| < h0,j, xlb,j < x′j < xub,j, j = 1, 2, ..., n}      (8)
hi = 2 hi−1,  i = 1, 2, ..., k      (9)


Figure 2. Neighbor space partition of CTS.

where the radius h0 is an independent parameter. Any new point, x′, satisfying Equations (7)–(9) is considered a neighbor candidate. A number of initial neighbors are first generated within the hyper-rectangles without considering the constraints. The constraint handling method proposed in the previous section is then used to move the initial neighbors to a group of new candidate solutions by quickly solving the sub-optimization problem in Equation (3) using the SQP method. The collection of new candidate solutions is sorted in terms of the objective function value of Equation (4) to form a new neighborhood for the outer-loop optimization using TS.
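A minimal sketch of this ring-based neighbor generation (Equations (7)–(9)) might look as follows; the function name and the scalar radius h0 (the paper allows per-coordinate radii h0,j) are simplifications introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_neighbors(x, h0, x_lb, x_ub, k=3, per_ring=2):
    # Radii double outward per Equation (9): h_i = 2 * h_{i-1}.
    radii = [h0 * 2.0**i for i in range(k + 1)]
    candidates = []
    for i in range(1, k + 1):
        lo, hi = radii[i - 1], radii[i]
        for _ in range(per_ring):
            # Each coordinate offset has magnitude in [h_{i-1}, h_i),
            # placing the point in ring H_i of Equation (7).
            mag = rng.uniform(lo, hi, size=x.size)
            sign = rng.choice([-1.0, 1.0], size=x.size)
            candidates.append(np.clip(x + sign * mag, x_lb, x_ub))
    return candidates

x = np.array([0.5, 0.5])
cands = generate_neighbors(x, h0=0.001, x_lb=0.0, x_ub=1.0)
```

Note that clipping to the box can pull a candidate slightly out of its ring near the bounds; the paper's definition only requires xlb,j < x′j < xub,j.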

2.3.2. Tabu criteria. To prevent the search from being trapped in a local minimum, TS uses a tabu list to restrict the visited solutions. But unlike in combinatorial optimization, continuous TS restricts an area rather than a point. In this paper the tabu criteria in [10] are adopted. The tabu condition is organized by a twofold tabu list, which contains all the visited solutions and their objective function values during the last L iterations. To check whether a solution is tabooed, it is first determined whether the current objective function value, F(x), is within the tolerance (Δf) of any function value in the tabu list. It should be noted that, to restrict an area rather than a point, the tolerance is not a very small value. Denote by xti any element in the tabu list. If

|F(x) − F(xti)| ≥ Δf  for i = 1, 2, ..., L      (10)

then the current solution has not been tabooed; otherwise there is a point, xti, in the tabu list whose objective function value is close to that of the current solution x. In that case, the second criterion is applied by checking the distance between these two points. Solution x is considered tabooed if

‖x − xti‖ < h0      (11)

where h0 is the radius of the inner hyper-rectangle.
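The twofold check of Equations (10) and (11) can be sketched directly (Δf written as `delta_f`; the list is assumed to store (point, objective value) pairs from the last L iterations):

```python
import numpy as np

def is_tabooed(x, Fx, tabu_list, delta_f, h0):
    # A candidate is tabooed only if some stored point is close both in
    # objective value (Equation (10) fails: |F(x) - F(xt)| < delta_f)
    # and in location (Equation (11): ||x - xt|| < h0).
    for xt, Ft in tabu_list:
        if abs(Fx - Ft) < delta_f and np.linalg.norm(x - xt) < h0:
            return True
    return False
```

Putting the scalar value test first lets the cheap comparison usually short-circuit before any distance computation.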

2.3.3. Aspiration and termination criteria. During the search procedure, if there is a candidate whose objective function value is better than that of the best point visited so far, it is taken as the new solution, no matter whether it is tabooed or not. The TS terminates if either of the following conditions is satisfied: a predefined maximum iteration number, Imax, is reached, or the optimal solution has not been improved for a given number of iterations (M).


2.4. Algorithm implementation

With the criteria proposed above, the basic procedure of the combined global and local search algorithm, as illustrated in Figure 3 (the overall flowchart), is summarized as follows:

Step 1: Initialize the parameters and randomly generate a point within the upper and lower bounds as an initial solution, x.

Step 2: Randomly select k points from the concentric hyper-rectangles of the current solution x, following the rules stated before, to construct an initial neighbor solution set, Xini.

Step 3: Use the inner-loop constraint handling method to obtain a new neighbor set, Xnew, and a set storing their function values, Fnew.

Step 4: Sort Xnew in terms of the objective function value and select the first candidate.

Step 5: Apply the aspiration criteria to the selected candidate; if satisfied, update the best-known solution and go to Step 8; otherwise, go to Step 6.

Step 6: Judge the tabu criteria; if the current solution is tabooed, go to Step 7; otherwise, go to Step 8.

Step 7: Judge whether all the points in the neighbor solution set Xnew have been evaluated. If yes, select and release the first point in the tabu list and go to Step 9; otherwise, select the next point in Xnew and return to Step 6.

Step 8: Update the tabu list and the current solution with the selected candidate.

Step 9: Judge the termination criteria; if satisfied, stop the global search and go to Step 10; otherwise, return to Step 2 and repeat the procedure.

Step 10: Use the result of the global search as an initial point and solve the original NLP by using pure SQP.

Step 11: Output the optimal solution after SQP terminates and end the program.

Figure 3. The overall flowchart of the combined algorithm.

Among these steps, Step 3 is the key to the nested strategy, being responsible for the inner-loop constraint handling. As also illustrated in Figure 4, the procedure of this step can be further elaborated as follows:

Figure 4. The flowchart of the inner-loop constraint handling method.

Step 3.1: Select a candidate, x, from the initial neighbor solution set, Xini.

Step 3.2: Judge whether the selected candidate is a feasible solution of the original optimization problem; if yes, go to Step 3.3; otherwise, go to Step 3.4.

Step 3.3: Compute the objective function, f(x), and add the candidate and its objective function value to a new neighbor solution set, Xnew, and the function value set, Fnew, respectively; then go to Step 3.5.

Step 3.4: Solve the sub-optimization problem shown in Equation (3) by using the SQP method; it terminates when either SQP converges or a predefined number of iterations is reached. Compute the objective function of the new point, x′, in terms of Equation (4); add the new point to Xnew and its function value to Fnew; then go to Step 3.5.

Step 3.5: Judge whether x is the last point in the initial neighbor solution set, Xini; if yes, go to Step 4; otherwise, select the next point in Xini and go to Step 3.2.
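The steps above can be condensed into a runnable sketch. The code below is a deliberately simplified rendition, not the authors' implementation: it drops the tabu list and aspiration bookkeeping, uses fixed neighbor counts, takes box bounds read off the figure axes, and lets SciPy's SLSQP stand in for both the inner-loop SQP (Step 3.4) and the final refinement (Step 10). It is applied to Test problem 2 of Section 3.1.2.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Test problem 2: min (x1-2)^2 + (x2-1)^2
# s.t. x1 - 2*x2 + 1 = 0 and x1^2/4 + x2^2 - 1 <= 0.
f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
h = lambda x: x[0] - 2.0*x[1] + 1.0
g = lambda x: x[0]**2/4.0 + x[1]**2 - 1.0
G = lambda x: h(x)**2 + max(0.0, g(x))**2        # Equation (3)
F = lambda x: f(x) + G(x)                        # Equation (4)
lb, ub = np.array([-2.0, -1.0]), np.array([2.0, 1.0])  # assumed box
bounds = list(zip(lb, ub))

def repair_and_score(cands, max_iter=30):
    # Step 3: pull each infeasible neighbor toward the feasible region
    # with a capped SLSQP run on G, then score everything with F.
    scored = []
    for c in cands:
        if G(c) > 1e-12:
            c = minimize(G, c, method='SLSQP', bounds=bounds,
                         options={'maxiter': max_iter}).x
        scored.append((F(c), c))
    return sorted(scored, key=lambda t: t[0])

x = rng.uniform(lb, ub)                          # Step 1
best_F, best_x = F(x), x.copy()
for _ in range(40):                              # Steps 2-9, simplified
    cands = [np.clip(x + rng.uniform(-0.3, 0.3, 2), lb, ub)
             for _ in range(8)]
    Fx, x = repair_and_score(cands)[0]           # Steps 3-4: best neighbor
    if Fx < best_F:                              # best-known update
        best_F, best_x = Fx, x.copy()

# Steps 10-11: refine the global-search result with a full SQP solve.
res = minimize(f, best_x, method='SLSQP', bounds=bounds,
               constraints=[{'type': 'eq', 'fun': h},
                            {'type': 'ineq', 'fun': lambda x: -g(x)}])
print(res.x, res.fun)
```

Because this particular instance is convex, the final SQP solve reliably reaches the unique constrained optimum near (0.823, 0.911); on multimodal problems the outer loop's diversification (and the omitted tabu machinery) carries the real weight.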

3. RESULTS AND DISCUSSIONS

In this section, a number of different constrained optimization problems are used to demonstrate the performance of the combined global and local search method. Comparisons with other methods are also discussed. The algorithm is implemented in Matlab 7.1 under the Windows XP system on a Pentium IV 3.0 GHz PC with 512 MB RAM. The standard SQP method in the Matlab Optimization Toolbox [21] is used in the inner loop. The outer-loop CTS algorithm is coded by the authors. Unless otherwise specified, for each problem 10 concentric hyper-rectangles are used with the radii defined in Equations (7)–(9). The innermost hyper-rectangle, H0, is tabooed to restrict the search around any local minima. The radius h0 in this project is set as a function of the bounds

h0 = 0.001 × (xub − xlb)/2      (12)

A set of neighbors is generated at each iteration. Among them, 10 points are selected randomly in H1, and the other points are selected inside Hi, for i varying from 2 to 9. The tolerance Δf is set to 10⁻⁶. The maximum iteration number (Imax) is set to 10⁴. The program stops if the objective function has not been improved for 80 iterations.

3.1. Feasibility demonstration with simple examples

As stated in Section 2, the constraint handling method for CTS is proposed based on the idea that it is capable of punishing the candidates outside the feasible region, leading the search back into the feasible region, and keeping the search within the feasible region. To demonstrate the feasibility of the proposed method, three relatively simple examples are first presented as follows. It should be noted that, for demonstration, the parameters are not exactly the same as specified in the previous paragraph. In addition, only the nested inner- and outer-loop strategy of Steps 1–9 is used at this stage; the further refinement of the global search result using SQP in Step 10 is not applied.

3.1.1. Connected feasible region with inequality constraints only. Test problem 1 [22]:

min f(x) = (x1 − 0.8)² + (x2 − 0.3)²
s.t. g1(x) = [(x1 − 0.2)² + (x2 − 0.5)²] − 0.16 ≤ 0
     g2(x) = 0.81 − [(x1 + 0.5)² + (x2 − 0.5)²] ≤ 0
     xi ∈ (0, 1), i = 1, 2

This is a two-dimensional non-linear problem with inequality constraints only. The crescent-shaped feasible region formed by the constraint functions g1(x) and g2(x), as shown in Figure 5, is non-convex. The contour plot of the objective function is also illustrated by the dashed ellipses in the figure. The optimal solution, xopt, marked with an asterisk in the figure, is at (0.5794736, 0.3735097). An initial solution, denoted as x0 in the figure, is randomly generated outside the feasible region. Following the neighbor generation method described before, four candidates, denoted as x0¹, x0², x0³ and x0⁴ in the figure, are first generated. Because of the stochastic feature of TS, the first three initial neighbor candidates fall outside the feasible region. By applying the constraint handling method, these initial candidates move into the feasible region to three new candidate points, x0¹′, x0²′ and x0³′. Analysis of the sub-optimization problem in Equation (3) shows that the inner-loop optimization terminates immediately after the candidates move into the feasible region. The trace of how the third candidate, x0³, moves to a new feasible candidate, x0³′, is also drawn in Figure 5. It shows that, by applying the SQP method to the sub-optimization problem, it converges to a feasible point after only three iterations. As all candidates have moved into the feasible region, the penalty part in Equation (4) does not influence the original objective function. Among the four new candidates, x0¹′ results in the minimum objective function value and is thus selected as the new solution in the outer-loop optimization.

Figure 5. Plot of feasible neighbor generation of Test problem 1.

Figure 6 illustrates how the program propagates from the initial solution to a final solution very close to the global optimal point. It should be noted that there are about 110 points in total in the figure, most of which cluster around the global optimum.

Figure 6. The searching result of Test problem 1 using the proposed nested optimization method.


3.1.2. Connected feasible region with both equality and inequality constraints. Test problem 2 [12]:

min f(x) = (x1 − 2)² + (x2 − 1)²
s.t. x1 − 2x2 + 1 = 0
     x1²/4 + x2² − 1 ≤ 0

This is also a two-dimensional non-linear problem, but with both equality and inequality constraints. As shown in Figure 7, the feasible region is a line intercepted by an ellipse. The optimal solution is at the upper-right intersection of the line and the ellipse. Because of the existence of the equality constraint, it is practically impossible to randomly generate a feasible neighbor candidate with the traditional CTS. Figure 7 also shows how the constraint handling method successfully leads the initial infeasible neighbor candidates to new feasible solutions, all on the line. It should be noted that the intermediate SQP iterates are omitted in the drawing; only the results at each iteration of the outer-loop optimization are presented. Figure 8 shows how the nested method drives the iteration to the global optimal point. We can see from the converging trace that the proposed nested optimization method can efficiently lead the outer-loop search along the line of the feasible region until the global optimal solution, xopt, is obtained. As a comparison, the result of Deb's approach is also presented in the figure. Deb's approach [22] is a typical and effective penalty-function-based approach that does not require any penalty parameter. Comparisons among feasible and infeasible solutions are made so as to provide a search direction towards the feasible region. Figure 8 shows that this penalty function approach is not as effective as our method in leading the search within the feasible region. This is also understandable given the existence of the equality constraint, which leaves the penalty function with little effect. Our nested method, in contrast, has the powerful ability to lead the search back into the feasible region by dealing with the constraints using SQP.

Figure 7. Plot of feasible neighbor generation of Test problem 2.


Figure 8. Comparison of Deb's approach and the nested method for Test problem 2.

3.1.3. Unconnected feasible region. Test problem 3:

min f (x)=(x1−0.8)2+(x2−0.3)2

s.t. g1(x)=[(x1−0.2)2+(x2−0.5)2]−0.16�0

g2(x)=0.81−[(x1+0.5)2+(x2−0.5)2]�0

g3(x)=0.04−[(x1−0.57)2+(x2−0.4)2]�0

xi ∈(0,1), i=1 :2This is a modification of Test problem 1 by adding another inequality constraints, g3(x). Thecrescent-shaped feasible region formed by g1(x) and g2(x) is divided by g3(x) into two unconnectedregions, which are bordered by solid lines as shown in Figure 9. It is also purposely designedthat the global optimal point lies in the part with the smaller area. It locates at (0.4848,0.2191),the rightmost point of the lower region. Figure 9 shows that both regions can be visited duringthe iteration and the global optimal point can be successfully reached even if it is located in thesmaller region. We believe that this is mainly due to the diversification feature of CTS in generatingneighbors. The unconnected feasible regions are like absorbers and the SQP method is somehowlike the magnetic force pulling the outside point into it. The bigger region may have bigger force.Even if the global optimal point is in a small region, it is still possible to be pulled into its regionif the randomly generated neighbor is close to that region.

3.2. Comparison with penalty function method

In the previous subsection, we demonstrated the proposed method on some relatively simple examples, which could also be solved readily by other methods. In this subsection, a number



Figure 9. The searching result of Test problem 3 using the proposed nested optimization method.

of relatively complex examples will be used to further compare the performance of the proposed nested method with that of Deb's penalty-function-based approach. The test problems are as follows:

Test problem 4 [22]:

min f(x) = (x1−10)^2 + 5(x2−12)^2 + x3^4 + 3(x4−11)^2 + 10x5^6 + 7x6^2 + x7^4 − 4x6x7 − 10x6 − 8x7

s.t. g1(x) = 127 − 2x1^2 − 3x2^4 − x3 − 4x4^2 − 5x5 ≥ 0

g2(x) = 282 − 7x1 − 3x2 − 10x3^2 − x4 + x5 ≥ 0

g3(x) = 196 − 23x1 − x2^2 − 6x6^2 + 8x7 ≥ 0

g4(x) = −4x1^2 − x2^2 + 3x1x2 − 2x3^2 − 5x6 + 11x7 ≥ 0

−10 ≤ xi ≤ 10, i = 1, ..., 7

The best-known solution is

x* = (2.330499, 1.951372, −0.4775414, 4.365726, −0.6244870, 1.038131, 1.594227)

f(x*) = 680.630111


Test problem 5 [22]:

min f(x) = exp(x1x2x3x4x5)

s.t. h1(x) = x1^2 + x2^2 + x3^2 + x4^2 + x5^2 = 10

h2(x) = x2x3 − 5x4x5 = 0

h3(x) = x1^3 + x2^3 = −1

−2.3 ≤ xi ≤ 2.3, i = 1, 2

−3.2 ≤ xi ≤ 3.2, i = 3, 4, 5

The best-known solution is

x* = (−1.717143, 1.595709, 1.827247, −0.7636413, −0.7636450) or

(−1.717143, 1.595709, 1.827247, 0.7636413, 0.7636450) or

(−1.717143, 1.595709, −1.827247, 0.7636413, −0.7636450) or

(−1.717143, 1.595709, −1.827247, −0.7636413, 0.7636450)

f(x*) = 0.0539498

Test problem 6 [18]:

min f(x) = x1^0.6 + x2^0.6 − 6x1 − 4x3 + 3x4

s.t. −3x1 + x2 − 3x3 = 0

x1 + 2x3 ≤ 4

x2 + 2x4 ≤ 4

(0, 0, 0, 0) ≤ x ≤ (3, 4, 2, 1)

The best-known solution is

x* = (1.333333, 4, 0, 0), f(x*) = −4.514202

Test problem 7 [23]:

min f(x) = (x1−1)^2 + (x1−x2)^2 + (x2−x3)^3 + (x3−x4)^4 + (x4−x5)^4

s.t. x1 + x2^2 + x3^3 = 3√2 + 2

x2 − x3^2 + x4 = 2√2 − 2

x1x5 = 2

xi ∈ [−5, 5], i = 1:5

The best-known solution is

x* = (1.1166, 1.2204, 1.5378, 1.9728, 1.7911)

f(x*) = 0.0293


Table I. Results of the proposed constraint handling method on Test problems 4–7.

Test  Average        Average        P1     P2     Best-known   Optimized objective function value
no.   iteration no.  time cost (s)  (%)    (%)    solution     Best        Median      Worst
4     936.1          204.72         44.74  100    680.630111   680.7820    681.1449    681.9540
5     82.5           65.399         0      66.27  0.0539498    0.0539498   0.0539506   0.0539943
6     57.1           9.3906         1.06   82.31  −4.514202    −4.5142     −4.5142     −4.5142
7     82.4           55.444         0      61.64  0.0293       0.029311    0.029317    0.029338

Table II. Detailed results of the optimized solutions of Test problems 4–7 using the nested method.

Test  Xopt                                 Xbkn                                  Xerr = ‖Xopt−Xbkn‖
4     (2.329751, 1.921844, −0.356137,      (2.330499, 1.951372, −0.4775414,      0.1502
       4.436580, −0.631569, 1.080501,        4.365726, −0.6244870, 1.038131,
       1.602742)                             1.594227)
5     (−1.716520, 1.594988, 1.828406,      (−1.717143, 1.595709, 1.827247,       0.0015
       0.7638827, 0.76354)                   0.7636413, 0.7636450)
6     (1.333333, 4, 0, 0)                  (1.333333, 4, 0, 0)                   0
7     (1.1169, 1.2202, 1.5378,             (1.1166, 1.2204, 1.5378,              6.8557e−4
       1.9731, 1.7906)                       1.9728, 1.7911)

Ten runs are performed on each problem with different random initial points. The results are presented in Table I, with the average iteration number and average time cost listed in the second and third columns, respectively. To analyze the constraint handling performance in detail, two statistical metrics relating to the probability of generating feasible neighbors, P1 and P2, are calculated and shown in the fourth and fifth columns of Table I. P1 denotes the probability that the initial neighbors randomly selected in the concentric hyper-rectangles are feasible. P2 is the probability that the neighbors fall into the feasible region after constraint handling. Both metrics are calculated by averaging all the intermediate results over all 10 runs. Since Test problem 4 has only inequality constraints, it has a relatively large probability of randomly generating a feasible initial solution. For the other cases, with equality constraints, it is very difficult to generate a feasible solution randomly; but after applying our proposed constraint handling method, the neighbors have a large probability of being led into the feasible region. The best, median and worst results of each problem are listed in the last three columns of Table I. Compared with the best-known solutions, our method shows repeatably good performance in reaching the global optima; even the worst solutions are very close to the optimal solutions. Besides the objective function, we also study the solution values. A norm distance metric of the optimized result, Xopt, from the best-known solution, Xbkn, of each example is calculated as follows:

Xerr=‖Xopt−Xbkn‖ (13)

The results are shown in Table II, in which the first column lists the best optimized solution obtained by the combined method, the second column the best-known solution, and the third column the distance between them calculated using Equation (13).
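As a concrete instance, applying Equation (13) to the Test problem 7 row of Table II reproduces the tabulated distance:

```python
import numpy as np

# Best result found vs. the best-known solution for Test problem 7 (Table II).
x_opt = np.array([1.1169, 1.2202, 1.5378, 1.9731, 1.7906])
x_bkn = np.array([1.1166, 1.2204, 1.5378, 1.9728, 1.7911])

x_err = np.linalg.norm(x_opt - x_bkn)   # Equation (13): Euclidean norm
print(f"{x_err:.4e}")                   # → 6.8557e-04, matching Table II
```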


Table III. Results of TS with Deb's constraint handling method on Test problems 4–7.

Test  Average        Average        P1     Best-known   Optimized objective function value
no.   iteration no.  time cost (s)  (%)    solution     Best        Median      Worst
4     8635           25.4           36.98  680.630111   680.8621    680.9314    681.0415
5     9474.2         27.75          0      0.0539498    —           —           —
6     5071.6         14.7           10.91  −4.514202    0           0           0
7     11330          36.25          0      0.0293       —           —           —

For comparison, we also ran the same tests using CTS with Deb's constraint handling approach. To avoid falling into local optima, the maximum number of iterations without any improvement of the objective function is increased to 5000. The results are listed in Table III. Test problem 4 is successfully solved after a long iteration: as it has no equality constraints, feasible solutions are generated randomly with relatively large probability during the iteration, but the optimized results are somewhat worse than those of our nested method. No feasible solutions are obtained for Test problems 5 and 7, owing to the equality constraints. In Deb's original results using a genetic algorithm [22], it is reported that only 19 out of 50 runs on Test problem 5 converge to within 1% of the best-known solution even at the maximum generation setting, with the other 31 runs converging only to within 50% of it. The best, median and worst solutions in that report are 0.053950, 0.241289 and 0.507761, which are clearly worse than our results. For Test problem 6, Deb's approach finds a feasible solution, the origin. The P1 value of 10.91% in this test results from a special action in our CTS algorithm whereby a randomly generated neighbor is pulled to the bound if it lies outside the bounds. The origin lies on such a bound, so some randomly generated points are pulled to it, and it happens to be a feasible point; however, this feasible solution is far from the best-known solution. In summary, our method has an obvious advantage in dealing with equality constraints.
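Deb's parameter-free selection rule, as paraphrased from [22], fits in a few lines; the tuple representation below (objective value, aggregate constraint violation) is our own simplification for illustration:

```python
def deb_is_better(a, b):
    """Return True if solution a beats solution b under Deb's rules [22]:
    a feasible solution beats an infeasible one; two feasible solutions are
    ranked by objective value; two infeasible ones by total constraint
    violation, ignoring the objective (hence no penalty parameter)."""
    (f_a, v_a), (f_b, v_b) = a, b
    if v_a == 0 and v_b == 0:
        return f_a < f_b          # both feasible: lower objective wins
    if (v_a == 0) != (v_b == 0):
        return v_a == 0           # feasibility beats infeasibility
    return v_a < v_b              # both infeasible: smaller violation wins

assert deb_is_better((5.0, 0.0), (1.0, 0.3))   # feasible beats infeasible
assert deb_is_better((2.0, 0.0), (5.0, 0.0))   # lower objective among feasible
assert deb_is_better((9.0, 0.1), (1.0, 0.4))   # smaller violation among infeasible
```

With an equality constraint, almost every randomly generated neighbor is infeasible, so this rule mostly compares violations and rarely exploits the objective, which is consistent with the behavior observed in Table III.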

3.3. Comparison with SQP

The above test problems demonstrate that the proposed nested method is efficient in dealing with constraints. However, most of these problems could also be solved quickly by standard SQP. We therefore present some examples that have many local optima or are non-smooth, and are thus unsuitable for pure SQP; the TS algorithm shows its superiority in solving such problems.

Test problem 8 [24]:

min f(x) = −a·exp(−b·√((1/n)·Σ xi^2)) − exp((1/n)·Σ cos(c·xi)) + a + exp(1)

where the sums run over i = 1, ..., n with n = 3, and

s.t. x1 + x2 − 10 ≤ 0

x1 − x2 − 10 ≤ 0

−x1 − x2 − 10 ≤ 0

−x1 + x2 − 10 ≤ 0

15 sin(x1^2 + x2^2) − x3 = 0

a = 20, b = 0.2, c = 2π

−32.768 ≤ xi ≤ 32.768, i = 1:3


Figure 10. Plot of modified Ackley’s path problem.

The best-known solution to this problem is as follows:

x∗ = (0,0,0)

f ∗ = 0

This problem is also referred to as Ackley's path function and has many local minima. The original problem is unconstrained; to test the performance in dealing with constraints, we add constraints that do not change the global optimal solution. The problem has one equality constraint and three independent variables, leaving two degrees of freedom. To illustrate its complexity in a three-dimensional plot, we substitute x3 with the other two variables and present the plot in Figure 10. The vertical axis is the objective function value, and the two horizontal axes are the variables x1 and x2. The shaded area is the feasible region of the transformed problem. Many local minima are clearly visible. Note that the substitution serves only to illustrate the complexity of the problem; we do not use it when solving the problem.
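A minimal transcription of the modified objective and the added equality constraint (our own rendering of the formulas above) confirms that both vanish at the global optimum x* = (0, 0, 0):

```python
import math

def ackley(x, a=20.0, b=0.2, c=2 * math.pi):
    """Ackley's path function for n = len(x) variables (Test problem 8)."""
    n = len(x)
    s1 = sum(xi ** 2 for xi in x) / n
    s2 = sum(math.cos(c * xi) for xi in x) / n
    return -a * math.exp(-b * math.sqrt(s1)) - math.exp(s2) + a + math.e

def equality_residual(x):
    # the added equality constraint: 15*sin(x1^2 + x2^2) - x3 = 0
    return 15 * math.sin(x[0] ** 2 + x[1] ** 2) - x[2]

x_star = (0.0, 0.0, 0.0)
print(ackley(x_star), equality_residual(x_star))   # both ~0 at the optimum
```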

The efficiencies of our nested method and standard SQP are further compared. The problem is tested over 10 runs from different random initial points using our nested method. All 10 runs come very close to the global optimum, with the results shown in Tables IV and V. The initial neighbors have a very low probability of falling into the feasible region, but after applying the constraint handling method they are led back to the feasible region with relatively large probability. The average computing time over the 10 runs is 10.639 s. As a comparison, we also conducted 10 runs from the same initial points using the standard SQP in Matlab. None converges to the global optimum; the best objective function value obtained is 16.09, far from the global minimum.


Table IV. Comparison results of the nested algorithm and SQP.

Test  Average        Average        P1    P2     Best-known   Best solution  Optimized objective function value
no.   iteration no.  time cost (s)  (%)   (%)    solution     of SQP         Best         Median       Worst
8     60.9           10.639         2.08  99.92  0            16.09          1.5099e−13   1.5099e−13   1.5099e−13
9     55.1           32.061         0     54.18  189.311627   —              189.311630   189.311630   189.311630

Table V. Detailed results of the optimized solutions of Test problems 8 and 9 using the nested method.

Test  Xopt                              Xbkn               Xerr = ‖Xopt−Xbkn‖
8     (−5.3e−15, 2.7e−15, 4.4e−16)      (0, 0, 0)          6.0e−15
9     (0, 16.6667, 100)                 (0, 16.6667, 100)  0

Test problem 9 [18]:

min f(x) = 35x1^0.6 + 35x2^0.6

s.t. 600x1 − 50x3 − x1x3 + 5000 = 0

600x2 + 50x3 − x1x3 − 15000 = 0

(0, 0, 100) ≤ x ≤ (34, 17, 300)

The best-known solution to this problem is as follows:

x* = (0, 16.6667, 100)

f* = 189.311627

This problem is non-differentiable at the optimal solution because of the exponent terms; moreover, the objective function is real-valued only if the variables are non-negative. As standard SQP does not limit the search to the feasible region, it always fails when its iterates cross the boundary while approaching the optimal solution. Table IV lists the results of this problem using our nested method, which solves the problem effectively. Table V lists the detailed solution values of these two test problems, further demonstrating the performance of the nested constraint handling method. Our nested method, though it also uses SQP in the inner loop, does not suffer from this failure, because the inner SQP is used only to deal with the constraints. Global optimization problems may have many local minima, but their constraints are often relatively simple; our nested method is well suited to such constrained problems.
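Both observations are easy to reproduce from the formulas alone: evaluating the objective at the reported solution recovers f* to the printed precision, and the derivative of x^0.6 indeed blows up as x approaches 0 from above, which is what trips a gradient-based step that lands near the boundary:

```python
# Objective of Test problem 9 at the best-known solution (0, 16.6667, 100);
# note that x3 does not enter the objective.
f_star = 35 * 0 ** 0.6 + 35 * 16.6667 ** 0.6
print(f_star)   # ≈ 189.3116, matching f* to the precision of x2

# d/dx x**0.6 = 0.6 * x**(-0.4) diverges at the boundary x = 0.
for x in (1e-2, 1e-4, 1e-6):
    print(0.6 * x ** -0.4)   # grows without bound as x -> 0+
```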

3.4. Comparison with CONOPT algorithm in GAMS

Three benchmark problems from GLOBALLib of GAMSWORLD [25] are also tested. Testproblem 12 has very complex constraints. The other two problems have many known local optima.Comparison with results using GAMS is also presented.


Test problem 10 (ex2_1_1 in [25]):

min f(x) = −50(x1^2 + x2^2 + x3^2 + x4^2 + x5^2) + 42x1 + 44x2 + 45x3 + 47x4 + 47.5x5

s.t. 20x1 + 12x2 + 11x3 + 7x4 + 4x5 ≤ 40

The best-known solution to this problem is as follows:

x∗ = (1,1,0,1,0)

f ∗ = −17

Test problem 11 (ex6_1_2 in [25]):

min f(x) = x1(0.06391 + log(x1)) + x2(log(x2) − 0.02875) + 0.925356626778358·x1x4 + 0.746014540096753·x2x3

s.t. x3(x1 + 0.159040857374844·x2) − x1 = 0

x4(0.307941026821595·x1 + x2) − x2 = 0

x1 + x2 = 1

The best-known solution to this problem is as follows:

x∗ = (0.004210226087,0.9957897739,0.02589609643,0.9986997100)

f ∗ = −0.0324637513048

Test problem 12 (ex7_3_5 in [25]):

min f(x) = x4

s.t. x13·x3^8 − x11·x3^6 + x9·x3^4 − x7·√x3 + x5 = 0

x12·x3^6 − x10·x3^4 + x8·√x3 − x6 = 0

−x1 − 0.145x4 ≤ −0.175

x1 − 0.145x4 ≤ 0.175

−x2 − 0.15x4 ≤ −0.2

x2 − 0.15x4 ≤ 0.2

−4.53√x1 + x5 = 0

−(5.28√x1 + 0.364x1) + x6 = 0

−(5.72√x1x2 + 1.13√x1 + 0.425x1) + x7 = 0

−(6.93√x1x2 + 0.0911x1) + x8 = 0.00422

−(1.45√x1x2 + 0.168x1x2) + x9 = 0.000338

−(1.56√x1x2 + 0.00084√x1x2 + 0.0135x1x2) + x10 = 0.0000135

−(0.125√x1x2 + 0.0000168√x1x2 + 0.000539x1x2) + x11 = 0.00000027

−(0.005√x1x2 + 0.0000108x1x2) + x12 = 0

−0.0001√x1x2 + x13 = 0

0 ≤ x3 ≤ 10


The best-known solution to this problem is as follows:

x∗ = (0,0.145269517241,0,1.20689655172,0,0,0,0.00422,0.000338,

0.0000135,0.00000027,0,0.000000177042)

f ∗ = 1.20689655172

Table VI presents the results of the proposed constraint handling method on Test problems 10–12. The P1 values in the table show that Test problem 10 has a large probability of generating feasible neighbors, as it has only inequality constraints. The other two problems, however, have a number of equality constraints, giving a very low probability of obtaining feasible neighbors purely by random generation. The nested constraint handling method effectively leads and keeps the search within the feasible region, as revealed by the P2 values in the table. Additionally, the best optimized results of our method on Test problems 11 and 12 are slightly better than the best-known solutions reported in [25]. The detailed optimized solutions of Test problems 10–12 are listed in Table VII.

The above test problems are also solved with GAMS, a well-known commercial optimization package, using CONOPT, its default NLP algorithm. Fifty runs are conducted for each test problem with different randomly initialized values. The results are listed in Table VIII. Only one among the 50 runs reaches the best-known solution of Test problem 10; the other 49 runs

Table VI. Results of the proposed constraint handling method on Test problems 10–12.

Test  Average        Average        P1     P2     Best-known   Optimized objective function value
no.   iteration no.  time cost (s)  (%)    (%)    solution     Best         Median       Worst
10    61.3           1.1438         96.17  100    −17          −17          −17          −17
11    64.2           76.593         0.60   36.73  −0.0324637   −0.0325065   −0.0324973   −0.0000048
12    176.5          180.80         0      100    1.20689655   1.20689572   1.21031143   1.32038654

Table VII. Detailed results of the optimized solutions of Test problems 10–12 using the nested method.

Test  Xopt                                  Xbkn                                  Xerr = ‖Xopt−Xbkn‖
10    (1, 1, 0, 1, 0)                       (1, 1, 0, 1, 0)                       0
11    (0.004224919, 0.9957751,              (0.004210226, 0.9957898,              1.3164e−4
       0.02598440, 0.9986951)                0.02589610, 0.9985997)
12    (0, 0.062140154, 0, 1.20689572,       (0, 0.1452695, 0, 1.20689655,         0.0831
       0, 0, 0, 0.00422, 0.000337,           0, 0, 0, 0.00422, 0.000338,
       0.0000130, 0, 0, 0)                   0.0000135, 0.00000027, 0, 0.000000177)

Table VIII. GAMS results of Test problems 10–12.

Test  Success run no.  Local optimum no.  Infeasible solution no.
      among 50 runs    among 50 runs      among 50 runs
10    1                20                 0
11    26               1                  0
12    13               2                  9


Table IX. Improvement of Xerr by applying SQP after the global search (for each test problem, the first row gives Xerr after the global search and the second row after the SQP refinement).

         Run 1    Run 2    Run 3    Run 4    Run 5    Run 6    Run 7    Run 8    Run 9    Run 10
Test 4   0.2356   0.1808   0.2246   0.3259   0.6586   0.1897   0.1502   0.8329   0.9938   0.1898
         2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07  2.2E−07
Test 5   2.3E−02  2.8E−03  3.6E−03  2.8E−03  1.5E−03  4.1E−03  2.4E−03  1.9E−03  3.2E−03  4.5E−03
         3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06  3.1E−06
Test 6   5.1E−09  1.2E−08  1.5E−08  8.4E−09  5.0E−09  8.2E−09  5.2E−09  1.9E−08  1.6E−09  1.2E−08
         0        0        0        0        0        0        0        0        0        0
Test 7   3.6E−03  6.6E−04  2.4E−03  4.3E−03  3.5E−03  9.1E−03  4.5E−03  2.7E−03  6.2E−03  5.8E−03
         0        0        0        0        0        0        0        0        0        0
Test 8   6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15
         6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15  6.0E−15
Test 9   0        0        0        0        0        0        0        0        0        0
         0        0        0        0        0        0        0        0        0        0
Test 10  0        0        0        0        0        0        0        0        0        0
         0        0        0        0        0        0        0        0        0        0
Test 11  9.2E−04  1.119    1.12     1.8E−04  4.6E−04  1.117    1.1E−03  5.6E−04  5.8E−04  1.119
         5.8E−08  1.117    1.118    5.8E−08  5.8E−08  1.117    5.8E−08  5.8E−08  5.8E−08  1.118
Test 12  0.132    0.038    0.327    0.083    0.186    0.125    0.095    0.146    0.107    0.1
         0.122    0.038    0.126    0.083    0.147    0.122    0.095    0.146    0.107    0.071


stop at 20 different local minima, all far from the best-known solution. For Test problem 11, 26 of the 50 runs reach the best solution; the other 24 runs are trapped in a local minimum. For Test problem 12, 13 of the 50 runs reach the global optimal point; 27 runs are trapped in one local minimum and 1 run in another, while the remaining 9 runs fail to find any feasible solution.

3.5. Refinement of global search using SQP

The above results reflect the performance of the nested inner- and outer-loop optimization method on constraint handling alone. As shown in Figure 1, a further module using SQP is run with the global search results as initial points. Table IX presents the results of this refinement for Test problems 4–12. For each test problem, the first row gives the global search results of 10 different runs in terms of the metric Xerr of Equation (13), which measures the distance of the optimized solution from the best-known solution; the second row gives the results after the subsequent NLP solution using SQP. As TS is not a gradient-based method, reaching the exact global minimum with it is difficult and time consuming, but applying SQP afterwards quickly refines the global search results with a gradient-based local search. Note that, except for Test problems 11 and 12, every test problem converges to the same minimum after the SQP step. The exception of Test problem 11 arises because some of the earlier results are trapped near a local minimum from which SQP cannot escape. The exception of Test problem 12 is due to the non-uniqueness of the global minimum: the reported best-known solution is only one of them, and in fact x2 of this problem can take any value in an interval. In conclusion, the further refinement using SQP effectively helps the search reach a minimum.
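The refinement step amounts to a plain NLP solve started from the tabu-search result. A sketch for Test problem 6 follows, with SciPy's SLSQP standing in for the SQP solver used in the paper and a hypothetical near-optimal TS output as the starting point:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):   # objective of Test problem 6
    return x[0] ** 0.6 + x[1] ** 0.6 - 6 * x[0] - 4 * x[2] + 3 * x[3]

constraints = [
    {"type": "eq",   "fun": lambda x: -3 * x[0] + x[1] - 3 * x[2]},
    {"type": "ineq", "fun": lambda x: 4 - x[0] - 2 * x[2]},
    {"type": "ineq", "fun": lambda x: 4 - x[1] - 2 * x[3]},
]
bounds = [(0, 3), (0, 4), (0, 2), (0, 1)]

x_ts = np.array([1.32, 3.96, 0.0, 0.0])   # hypothetical global-search result
res = minimize(f, x_ts, method="SLSQP", bounds=bounds, constraints=constraints)
print(res.x, res.fun)   # polishes towards x* = (1.333333, 4, 0, 0)
```

Starting from a feasible point near the optimum, the gradient-based solve closes the remaining gap in a handful of iterations, mirroring the second rows of Table IX.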

4. CONCLUSION

This paper presents a new constraint handling method for continuous tabu search (CTS). A combined global and local search method is proposed and demonstrated by combining SQP and CTS, and its efficiency is shown on a number of examples. Although the work focuses on TS, we believe the method can be extended to other global search methods that iterate with random features, for which a pure penalty function may have little effect. The local search module applied after the global search should likewise be generally useful for refining the result. Furthermore, the local search method is not limited to SQP; other efficient gradient-based local search methods may also be used in the combination.

ACKNOWLEDGEMENTS

We gratefully acknowledge the financial support of the National Key Basic Research and Development Program of China (No. 2002CB312200), the National Natural Science Foundation of China (No. 60704029) and the 863 Program of China (No. 2007AA04Z192).

REFERENCES

1. Glover F. Tabu search—Part I. ORSA Journal on Computing 1989; 1:190–206.
2. Glover F. Tabu search—Part II. ORSA Journal on Computing 1990; 2:4–32.
3. Dubois N, De Werra D. Epcot: an efficient procedure for coloring optimally with Tabu search. Computers and Mathematics with Applications 1993; 25:35–45.
4. Skorin-Kapov J. Tabu search applied to the quadratic assignment problem. ORSA Journal on Computing 1990; 2(1):33–41.
5. Bland A, Dawson GP. Tabu search and design optimization. Computer-Aided Design 1991; 23(3):195–201.
6. Amellal S, Kaminska B. Scheduling algorithm in data path synthesis using the tabu search technique. IEEE Proceedings of the EDAC-EUROASIC'93 Conference, Paris, 1993; 398–402.
7. Hu N. Tabu search method with random moves for globally optimal design. International Journal for Numerical Methods in Engineering 1992; 35:1055–1070.
8. Siarry P, Berthiau G. Fitting of tabu search to optimize functions of continuous variables. International Journal for Numerical Methods in Engineering 1997; 40:2449–2457.
9. Chelouah R, Siarry P. Tabu search applied to global optimization. European Journal of Operational Research 2000; 123:256–270.
10. Wang M, Chen X, Qian J. An improvement of continuous tabu search for global optimization. The 5th World Congress on Intelligent Control and Automation, Hangzhou, China, 2004; 375–377.
11. Michalewicz Z. A survey of constraint handling techniques in evolutionary computation methods. Proceedings of the 4th Annual Conference on Evolutionary Programming, San Diego, U.S.A., 1995; 135–155.
12. Coath G, Halgamuge SK. A comparison of constraint-handling methods for the application of particle swarm optimization to constrained nonlinear optimization problems. Proceedings of the 2003 Congress on Evolutionary Computation, Canberra, Australia, 2003; 2419–2425.
13. Homaifar A, Lai SH, Qi X. Constrained optimization via genetic algorithms. Simulation 1994; 64(4):242–254.
14. Kalyanmoy D. Optimization for Engineering Design: Algorithms and Examples. Prentice-Hall: New Delhi, 1995.
15. Joines J, Houck C. On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GAs. Proceedings of the First IEEE Conference on Evolutionary Computation, Orlando, U.S.A., vol. 2, 1994; 579–584.
16. Michalewicz Z, Attia N. Evolutionary optimization of constrained problems. Proceedings of the 3rd Annual Conference on Evolutionary Programming, River Edge, U.S.A., 1994; 98–108.
17. Hadj-Alouane AB, Bean JC. A genetic algorithm for the multiple-choice integer program. Operations Research 1997; 45:92–101.
18. Lin B, Miller DC. Tabu search algorithm for chemical process optimization. Computers and Chemical Engineering 2004; 28:2287–2306.
19. Edgar TF, Himmelblau DM, Ladson LS. Optimization of Chemical Processes. McGraw-Hill: New York, 2003; 288.
20. Nocedal J, Wright SJ. Numerical Optimization. Springer: Berlin, 2006; 193–201.
21. The Mathworks, Inc. Optimization Toolbox for Use with MATLAB, User's Guide, version 3. The Mathworks, Inc.: Natick, U.S.A., June 2004; 29–38.
22. Deb K. An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering 2000; 186:311–338.
23. Adjiman CS, Androulakis IP, Floudas CA. A global optimization method, αBB, for general twice-differentiable constrained NLPs: II—Implementation and computational results. Computers and Chemical Engineering 1998; 22(9):1159–1179.
24. Ackley DH. A Connectionist Machine for Genetic Hillclimbing. Kluwer Academic Publishers: Boston, MA, 1987.
25. http://www.gamsworld.org/global/globallib/globalstat.htm.
