
To cite this article: Victor V. Litinetski & Boris M. Abramzon (1998) MARS - A Multistart Adaptive Random Search Method for Global Constrained Optimization in Engineering Applications, Engineering Optimization, 30:2, 125-154, DOI: 10.1080/03052159808941241

To link to this article: http://dx.doi.org/10.1080/03052159808941241


Eng. Opt., 1998, Vol. 30, pp. 125-154. Reprints available directly from the publisher. Photocopying permitted by license only. © 1998 OPA (Overseas Publishers Association) Amsterdam B.V. Published under license under the Gordon and Breach Science Publishers imprint. Printed in India.

MARS - A MULTISTART ADAPTIVE RANDOM SEARCH METHOD FOR GLOBAL CONSTRAINED OPTIMIZATION IN ENGINEERING APPLICATIONS

VICTOR V. LITINETSKI(a) and BORIS M. ABRAMZON(b)

(a) Israel Electric Corporation, POB 10, Haifa, Israel 31000; (b) T.A.T. Aero Equipment Industries, POB 80, Gedera, Israel 70750

(Received 30 August 1996)

A multistart, step-controlled random search algorithm for global, constrained optimization is proposed. The method is found to be very efficient for solving a variety of constrained nonlinear optimization problems. The performance of the method, and its comparison with another stochastic search algorithm, simulated annealing, are demonstrated on a number of standard test problems involving multimodal objective functions with continuous and mixed-discrete variables. Applications of the method to a number of practical engineering optimization cases in the fields of turbine design and compact heat exchangers are discussed.

Keywords: Global constrained optimization; multistart random search; turbines; heat exchangers

INTRODUCTION

The problem of global optimization of a multimodal function with constraints has been intensively studied over the past two decades. Extended reviews of the subject were presented by Dixon and Szego [1] and by Torn and Zilinskas [2]. Objective functions arising in many engineering applications may be non-differentiable and even discontinuous, and some of the independent variables may take only discrete values. For such cases, the classical "deterministic" optimization methods, such as the gradient and direct methods described in several textbooks [3, 4, 5], are not applicable. In particular, deterministic methods suffer from two major deficiencies: (1) difficulty in treating complex nonlinear constraints, and (2) the search procedure can get stuck in local optima. Another class of methods applicable to the global optimization problem are the so-called stochastic or random search methods. In the past, random search methods were considered inefficient, since they required many more function evaluations than gradient or direct methods. However, the continuous growth of computer power and the reduction of computation cost have promoted the increasing use of stochastic methods.

A number of stochastic search methods are described in Refs. [1, 2, 3, 5] and in numerous papers in periodical journals. The mathematical background of stochastic algorithms (also called Monte Carlo methods) has been discussed by Rubinstein [7] and by Rinnooy Kan and Timmer [8, 9]. Random algorithms allow, in principle, finding the global minimum, and the solution does not depend on the starting point. The methods are simple, easily programmed, and applicable to a wide range of problems with arbitrary constraints. The stochastic methods of global optimization known as the simulated annealing (SA) algorithm [6, 10, 11, 12], the genetic algorithm (GA) [13, 14, 15, 16], and the tabu search algorithm [17, 18] have become very popular in the engineering design community during the last decade. Although originally developed for solving combinatorial optimization problems, these three methods have later been successfully applied to problems with continuous and discrete variables.

The simplest random search method makes random moves in the neighborhood of the current location, which is defined by the coordinate vector $x^{(k)}$. If some move $x^{(k)} \to x^{(k+1)}$ satisfies the constraints and improves the value of the objective function, $F(x^{(k+1)}) < F(x^{(k)})$, the new location is accepted as the current solution. Otherwise, the move is rejected. Various modifications of this basic algorithm are suggested in the literature [1-5] in order to improve the convergence rate and to avoid becoming trapped in local minima.
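For illustration, this accept/reject scheme can be written as a short Python sketch; the helper name pure_random_search and the quadratic test function are illustrative choices, not taken from the paper:

```python
import random

def pure_random_search(f, x0, step, n_moves=10000, seed=0):
    """Accept a random move only if it improves the objective F."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_moves):
        # Trial point: a uniform random move around the current location x.
        y = [xi + step * rng.uniform(-1.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:        # improving move: accept as the current solution
            x, fx = y, fy  # a worsening move is simply rejected
    return x, fx

# Example: minimize a simple quadratic bowl with minimum at the origin.
x_best, f_best = pure_random_search(lambda x: sum(xi**2 for xi in x),
                                    [3.0, -2.0], step=0.5)
```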

The simulated annealing method draws an analogy between finding the minimum of a function and the process of cooling a liquid to its lowest-energy configuration. Unlike the regular random search, SA may accept moves which do not improve the value of the objective function. The probability of accepting a move is given by the expression $P = \min\{1, \exp[-(F(x^{(k+1)}) - F(x^{(k)}))/KT]\}$, where $K$ is a normalization constant, and $T$ is a control parameter referred to as the "temperature", by analogy with the real annealing process. The temperature is gradually reduced during the search process. The SA method is well suited for solving problems of combinatorial optimization with large numbers of variables, such as the traveling salesman problem [6] or the design of complex integrated circuits. Recently, SA has also been used successfully for global optimization over continuous variables (Brooks and Verdini [11]) and for mixed-discrete variables (Zhang and Wang [12]).
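In code, this acceptance rule reads as follows (a minimal sketch; the name sa_accept and the default constant K = 1 are assumptions for illustration):

```python
import math
import random

def sa_accept(f_old, f_new, T, K=1.0, rng=random.Random(0)):
    """SA acceptance rule P = min{1, exp[-(F_new - F_old)/(K*T)]}."""
    if f_new <= f_old:
        return True                      # downhill moves are always accepted
    p = math.exp(-(f_new - f_old) / (K * T))
    return rng.random() < p              # uphill moves accepted with probability p

# At a high temperature most uphill moves pass; as T -> 0 they are rejected.
print(sa_accept(1.0, 1.5, T=10.0), sa_accept(1.0, 1.5, T=0.01))
```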

The genetic algorithms are based on the mechanisms of natural selection and natural genetics, leading to survival of the fittest individuals in a population. The algorithm starts with a random selection of an initial group of search points, which is considered the first generation of the population. The genetic algorithms involve an encoding procedure which first maps each problem variable to an integer in a specified range, and then encodes this integer as a binary string of fixed length. The binary codes of all the variables are then concatenated into a binary string representing an artificial chromosome. Every binary character (0 or 1) of the string is interpreted as an artificial gene. The fitness fit of a search point may be evaluated as a scaled value of the objective function at that point: $fit = (F_{\max} - F)/(F_{\max} - F_{\min})$, where $F_{\max}$, $F_{\min}$ are the maximum and minimum values of the objective function in the current population. The reproduction of a new generation from the current one is made by applying the main genetic operators: selection, crossover, mutation, etc. The selection operator selects the strings with the highest fitness values as parents to reproduce offspring; the strings with low fitness are excluded from the further process. The stochastic crossover operator exchanges the corresponding segments (or even individual bits) between the pair of parent strings to be crossed. The mutation operator arbitrarily alters a gene value with some prescribed probability. There are many different modifications of the genetic algorithms in the literature; the state of the art in the field is presented in the detailed review by Srinivas [13]. The genetic algorithms have been successfully applied to nonlinear mixed-discrete optimization problems [14, 15, 16]. The GA (like other stochastic algorithms) has a relatively low convergence rate and therefore requires a large number of function evaluations.
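The basic operators can be sketched as follows, with chromosomes as bit lists; these are generic textbook versions for illustration, not the specific GA implementations of Refs. [13-16]:

```python
import random

rng = random.Random(0)

def scaled_fitness(F, F_max, F_min):
    """Fitness fit = (F_max - F)/(F_max - F_min) over the current population."""
    return (F_max - F) / (F_max - F_min)

def crossover(parent_a, parent_b):
    """One-point crossover of two equal-length bit strings (chromosomes)."""
    cut = rng.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def mutate(chromosome, p_mut=0.01):
    """Flip each gene (bit) independently with probability p_mut."""
    return [1 - gene if rng.random() < p_mut else gene for gene in chromosome]

child_a, child_b = crossover([0, 1, 1, 0, 1, 0], [1, 1, 0, 0, 0, 1])
child_a = mutate(child_a)
```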


To improve the overall convergence rate, Syrjakow and Szczerbicka [15] proposed a method subdividing the optimization process into two steps: a) a global search by a GA, and b) fine local optimization using a fast-converging hill-climbing or steepest-descent strategy. In addition, their approach included a special feature that avoids re-exploration of already investigated regions of the search space in the course of the global pre-optimization.

The so-called tabu search (TS) method represents a memory-based stochastic technique. In the course of the search procedure, information about a certain number of recent moves is recorded in a "tabu list", which is used to avoid re-exploring already checked solutions. The TS algorithm has been successfully applied to problems with continuous and discrete variables [17, 18].

The above discussion shows that the newer stochastic methods SA, GA and TS can be considered efficient and general approaches for solving global optimization problems. However, since these methods were originally developed for combinatorial optimization, their adaptation to problems with continuous and mixed-discrete variables does not have adequate grounds. Furthermore, both the SA and GA methods originate from the simulation of natural phenomena (thermodynamics or biology). The artificial character of the analogy between these processes and the optimization search makes it difficult to assign a clear physical meaning to optimization control parameters such as the temperature and cooling schedule in SA, or the selection, crossover and mutation coefficients in GA. Since the algorithm performance is sensitive to the control parameter values, significant time must be spent by the user to "tune" the method for different classes of problems. Both the GA and the tabu search algorithms require storing a considerable amount of information in the computer memory.

This paper presents a new multistart, step-controlled random search algorithm for global, constrained optimization. The method can be considered as a generalization of known methods of random search with systematic reduction of the size of the search region [19, 20]. The algorithm performance is investigated on a number of standard test problems including multimodal objective functions, nonlinear constraints and mixed-discrete variables. The last section of the paper discusses the authors' experience with the method in the fields of turbomachinery and compact heat exchangers.

THE SOLUTION PROCEDURE

A general formulation of the nonlinear mathematical programming problem is as follows:

Find the minimum of a function of N variables

$$F = F(X) \qquad (1)$$

subject to the side constraints

$$x_i^{\min} \le x_i \le x_i^{\max}, \quad i = 1, 2, \ldots, N, \qquad (2)$$

to the inequality constraints

$$g_j(X) \le 0, \quad j = 1, 2, \ldots, m, \qquad (3)$$

and to the equality constraints

$$h_l(X) = 0, \quad l = 1, 2, \ldots, k. \qquad (4)$$

Here, $X = \{x_1, x_2, \ldots, x_N\}^T$ is the vector of the independent variables, and the number of equality constraints $k$ does not exceed the total number of variables $N$.

Note that, as in most optimization algorithms, the inequality constraints (3) may be treated by an automatic rejection of moves violating the constraints. Another approach, applicable to both inequality and equality constraints, converts the problem to an equivalent unconstrained one. This may be done, for instance, by employing the so-called "exterior penalty function" which is added to the objective function [3, 5]. The modified objective function for the minimization is expressed in the form

$$\Phi(X, r_p) = F(X) + r_p \left[ \sum_{j=1}^{m} \left( \max\{0, g_j(X)\} \right)^2 + \sum_{l=1}^{k} h_l^2(X) \right], \qquad (5)$$

where $r_p > 0$ is the penalty parameter. Recommendations on selecting the value of $r_p$ can be found in the literature (see, for instance, Refs. [3, 5, 24]).
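A sketch of this conversion, assuming the standard quadratic form of the exterior penalty as reconstructed in Eq. (5) above:

```python
def penalized_objective(F, g_list, h_list, r_p=10.0):
    """Exterior penalty Phi(X) = F(X) + r_p*[sum max(0, g_j)^2 + sum h_l^2]."""
    def phi(x):
        penalty = sum(max(0.0, g(x)) ** 2 for g in g_list)   # g_j(x) <= 0
        penalty += sum(h(x) ** 2 for h in h_list)            # h_l(x) = 0
        return F(x) + r_p * penalty
    return phi

# Example: minimize x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
phi = penalized_objective(lambda x: x[0] ** 2, [lambda x: 1.0 - x[0]], [])
print(phi([0.5]), phi([1.5]))  # the infeasible point is penalized
```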

The main ideas of the random search algorithm under consideration were discussed by Rastrigin [19] as early as 1965. Note that a somewhat similar approach is described in Refs. [5, 20]. Surprisingly, the algorithm remained almost unnoticed, and its performance has not been properly investigated. However, the method was successfully applied by the first author of the present paper to several practical optimization problems in the field of turbine design [26, 27].

The proposed algorithm starts at some point $x^{(0)} = \{x_1^{(0)}, x_2^{(0)}, \ldots, x_N^{(0)}\}^T$ within the domain of interest where all constraints are satisfied. This point is marked as "old", and the objective function at that point is evaluated: $F^{(\mathrm{old})} = F(x^{(0)})$.

The steps in the search procedure are described as follows:

Step 1. The next move is performed to a random point defined by the equations $x_i^{(\mathrm{new})} = x_i^{(\mathrm{old})} + \Delta z_i$ ($i = 1, 2, \ldots, N$), where $\Delta z_i$ is the actual move in the $i$th coordinate direction, calculated as $\Delta z_i = \Delta x_i \cdot r_i$. The value $\Delta x_i$ represents the so-called current maximum step size, and $r_i$ is a random number in the range $[-1, 1]$. The maximum step size $\Delta x_i$ decreases in the course of the optimization search. The initial values of the maximum step sizes are taken to be $\Delta x_i = S_f \cdot (x_i^{\max} - x_i^{\min})$, where the scale factor $S_f = O(1)$. Since the starting maximum step size is of the order of the whole interval length, a certain probability always exists that the new trial point will fall in the vicinity of the global minimum. The values of the objective function and of the side and inequality constraints are evaluated at the new point. If any constraint is violated, or the new value of the objective function is greater than at the previous point, the move is considered "wrong". In this case, Step 1 is repeated. If the random steps fail to improve the value of the objective function after some sufficiently large number of trials (say $N_{wt} = 50$), the maximum step size in each direction is reduced: $\Delta x_i^{\mathrm{new}} = S_{rf} \cdot \Delta x_i^{\mathrm{old}}$ ($0 < S_{rf} \le 1.0$). The maximum step size is reduced with the aim of accelerating convergence by locating the search domain in the vicinity of the possible minimum. The parameters $N_{wt}$ (the permitted number of wrong trials with the same maximum step size $\Delta x$) and $S_{rf}$ (the step reduction factor) are problem dependent.

Step 2. If the previous move was successful, the point is renamed "old": $x_i^{(\mathrm{old})} \leftarrow x_i^{(\mathrm{new})}$, $\Delta z_i^{(\mathrm{old})} \leftarrow \Delta z_i$ ($i = 1, 2, \ldots, N$), and a "deterministic" move of the same step size is made in the "right" direction: $x_i^{(\mathrm{new})} = x_i^{(\mathrm{old})} + \Delta z_i^{(\mathrm{old})}$ ($i = 1, 2, \ldots, N$). If the new step along the selected line turns out to be "wrong", the algorithm returns to the random search (Step 1). Otherwise, Step 2 is repeated.

The allowed number of wrong moves and the step reduction factor are usually varied in the ranges $15 \le N_{wt} \le 100$ and $0.4 \le S_{rf} \le 0.75$. Generally, $S_{rf}$ may be different for each variable. The optimization search is terminated when the maximum step size $\Delta x_i$ for each independent variable becomes less than the required accuracy for that variable: $\Delta x_i < \varepsilon_i$ ($i = 1, 2, \ldots, N$).
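The following Python sketch is one possible reading of Steps 1-2 with box constraints only; the "Encouragement by Randomness" cap on consecutive right moves, described in the next paragraph, is omitted for brevity:

```python
import random

def rspr(f, x_min, x_max, s_f=1.0, s_rf=0.5, n_wt=50, eps=1e-7, seed=0):
    """Sketch of the RSPR search: random moves (Step 1) plus repeated
    deterministic moves along any successful direction (Step 2)."""
    rng = random.Random(seed)
    in_box = lambda y: all(lo <= yi <= hi for yi, lo, hi in zip(y, x_min, x_max))
    x = [rng.uniform(lo, hi) for lo, hi in zip(x_min, x_max)]   # starting point
    fx = f(x)
    dx = [s_f * (hi - lo) for lo, hi in zip(x_min, x_max)]      # max step sizes
    wrong = 0
    while max(dx) > eps:                 # terminate when all steps are tiny
        dz = [d * rng.uniform(-1.0, 1.0) for d in dx]           # Step 1
        y = [xi + dzi for xi, dzi in zip(x, dz)]
        fy = f(y) if in_box(y) else float("inf")
        if fy < fx:
            x, fx, wrong = y, fy, 0
            while True:                  # Step 2: march in the same direction
                y = [xi + dzi for xi, dzi in zip(x, dz)]
                fy = f(y) if in_box(y) else float("inf")
                if fy < fx:
                    x, fx = y, fy
                else:
                    break                # "wrong" line move: back to Step 1
        else:
            wrong += 1
            if wrong >= n_wt:            # N_wt failures in a row:
                dx = [s_rf * d for d in dx]                     # shrink steps
                wrong = 0
    return x, fx
```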

In Rastrigin's book [19], this method is given the name "Random Search with Punishment by Randomness" (RSPR). The name refers to the situation when the deterministic line search fails to find a better solution. The present algorithm also includes a feature which may be called "Encouragement by Randomness". This feature comes into action when the deterministic line moves continuously improve the solution during a very large number of moves. To prevent the situation where "right" moves are performed with a vanishingly small step size along a comparatively flat valley, the line search is switched to the random one after some sufficiently large number of right moves in the same direction.

When the search is terminated, there is still some uncertainty as to whether the approached solution represents the global minimum. Thus, convergence to local minima of the multimodal function may occur if the search region is narrowed too fast (at low values of $N_{wt}$). The obvious recommendation is to repeat the whole optimization process, starting from different initial points and trying several values of the parameters $N_{wt}$ and $S_{rf}$.

It has been found that the RSPR method performs well even for difficult test problems (see the next section), but in some cases the number of function evaluations required to find the minima was very high. In an attempt to improve the convergence rate, the following alteration involving a "self-learning" strategy [7] has been examined. After some prescribed number $N_s < N_{wt}$ of "wrong" random moves of Step 1, which are stored in the computer memory, the statistical direction of the "steepest descent" at the point $x^{(\mathrm{old})}$ is evaluated. Next, a deterministic move is tried in that direction. Unfortunately, this modification has not been successful on a number of the tested multimodal functions. Probably, the use of the statistical gradient makes sense only at the final stage of the global search, when the search region is narrow.

A remarkable improvement in the convergence rate and in the ability to reach the global minimum has been obtained by combining the basic RSPR method with the multistart approach. The new algorithm is named MARS (Multistart Adaptive Random Search). The algorithm is self-adaptive in the sense that the step size is adjusted, and the switching between the random and deterministic moves occurs, depending on the search results. In addition to the three governing parameters $N_{wt}$, $S_{rf}$ and $\varepsilon$ of the RSPR method, two new parameters are introduced into the MARS algorithm: $N_{st}$, the number of independent starts or paths, and $\beta$, the prescribed accuracy of the global search during the multistart stage. The MARS algorithm involves two stages: 1) rough multistart global optimization, and 2) fine global optimization within a reduced search domain near the approximate minimum discovered at the first stage. The method proceeds as follows:

Stage 1: Multistart global search (pre-optimization). $N_{st}$ points $X_s^{(0)}$ ($s = 1, 2, \ldots, N_{st}$) are randomly selected within the search domain: $x_i^{\min} \le x_i^{(0)} \le x_i^{\max}$ ($i = 1, 2, \ldots, N$). The selected points should satisfy all constraints. Next, $N_{st}$ independent global searches starting from the points $X_s^{(0)}$ are performed in accordance with the above-described RSPR algorithm. Every particular global search is terminated when the maximum step size for all variables becomes lower than the specified tolerance: $\Delta x_i < \beta_i$ ($i = 1, 2, \ldots, N$). Note that the precision of the multistart search is much rougher than that of the whole method: $\beta_i \gg \varepsilon_i$. Finally, the end points of the different search paths are compared, and the point with the lowest value of the objective function is taken as the starting point for the second stage of the algorithm.

Stage 2: Fine optimization search. At this stage, the RSPR algorithm is applied to the optimum search in the vicinity of the "best" point found during the global multistart optimization. The maximum initial step size is set to $\Delta x_i = S_f \cdot (x_i^{\max} - x_i^{\min})$, where the scale factor $S_f$ is now much lower than at the first stage. In addition, the permitted number of wrong moves, $N_{wt}$, may be increased so as to allow a thorough search for the global minimum. For example, the values $S_f = 0.1$ and $N_{wt}^{\mathrm{new}} = 2 N_{wt}$ can be employed, thus reducing the search interval for each variable to 10% of its maximum size and increasing the permitted number of wrong moves by a factor of 2. The termination criterion remains the same as for the RSPR algorithm: $\Delta x_i < \varepsilon_i$ ($i = 1, 2, \ldots, N$).
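Built on the rspr() sketch above, the two-stage driver could look as follows; restricting the Stage-2 search box to ±5% of each variable's range around the best Stage-1 point is one assumed way to realize the reduced initial step $\Delta x_i = 0.1(x_i^{\max} - x_i^{\min})$:

```python
def mars(f, x_min, x_max, n_st=15, n_wt=10, beta=0.05, eps=1e-7, s_rf=0.5):
    """Sketch of the two-stage MARS driver built on rspr() above."""
    # Stage 1: N_st independent rough RSPR searches over the whole domain.
    ends = [rspr(f, x_min, x_max, s_f=1.0, s_rf=s_rf, n_wt=n_wt,
                 eps=beta, seed=s) for s in range(n_st)]
    x_best, _ = min(ends, key=lambda e: e[1])
    # Stage 2: fine RSPR search near the best end point, with the search
    # interval reduced to 10% of its maximum size and 2*N_wt wrong moves.
    lo = [max(a, xb - 0.05 * (b - a)) for xb, a, b in zip(x_best, x_min, x_max)]
    hi = [min(b, xb + 0.05 * (b - a)) for xb, a, b in zip(x_best, x_min, x_max)]
    return rspr(f, lo, hi, s_f=1.0, s_rf=s_rf, n_wt=2 * n_wt, eps=eps, seed=n_st)
```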

It should be emphasized that in the course of each independent start, the MARS algorithm performs a global search for the minima over the whole domain of interest. In this regard, the MARS algorithm differs fundamentally from the multistart (MS) methods described by Dixon and Szego [1], Rinnooy Kan and Timmer [9], and in the recent paper by Jain and Agogino [23], where local deterministic or stochastic optimization searches are performed from random starting points.

The weak point of most multistart methods is that each local minimum may be found several times. If the stopping criterion of each independent start requires high precision, a great deal of computational effort may be wasted. In the MARS method this disadvantage is less pronounced, since the local minima are found in the course of a global search with low precision ($\beta \gg \varepsilon$).

The above-described random search algorithms may be easily extended to cases with mixed-integer or discrete variables. Thus, if a certain variable $x_i$ takes discrete values from the set $\{d_1 < d_2 < d_3 < \cdots < d_{ND}\}$, and a random move from the point $d_m$ falls between two points $d_k$ and $d_{k+1}$, the size of this move is corrected so as to reach one of those points.
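A nearest-level rule is one simple way to implement this correction; the paper does not specify which of the two neighboring points is chosen, and snap_to_discrete is an illustrative name:

```python
import bisect

def snap_to_discrete(value, levels):
    """Correct a move on a discrete variable so that it lands on one of the
    admissible values d_1 < d_2 < ... < d_ND (here: the nearest level).
    `levels` must be sorted in increasing order."""
    k = bisect.bisect_left(levels, value)
    if k == 0:
        return levels[0]
    if k == len(levels):
        return levels[-1]
    lower, upper = levels[k - 1], levels[k]
    return lower if value - lower <= upper - value else upper

print(snap_to_discrete(3.7, [1, 2, 4, 8]))  # -> 4
```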

NUMERICAL TESTS AND DISCUSSION

The MARS algorithm has been tested on a number of test problems collected in Refs. [1, 2, 3, 15, 21]. These tests have been used by many authors [1, 11, 12] for the evaluation and comparison of different optimization algorithms. A detailed description of the test problems discussed in the present paper is given in the Appendix.

For each problem, a large number of independent runs (100 or 1000) was performed, starting at randomly selected points within the search domain. The performance of the optimization algorithm is characterized by the following main parameters:

- The probability $P_{FGM}$ of finding the global minimum. This value is estimated as the percentage of runs converging to the global minimum relative to the total number of runs. Note that a run is considered to have ended at the global minimum if the final value of the objective function does not exceed the global minimum by more than one percent: $F < F_{gmin} + 0.01|F_{gmin}|$, where $F_{gmin}$ is the known value of the global minimum ($F_{gmin} \ne 0$).
- The average number of function evaluations, $N_{FE}$, required to reach the global minimum. The value $N_{FE}$ is calculated based on the runs which attain the global minimum.
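The success criterion can be stated compactly in code (an illustrative helper, assuming a nonzero known minimum $F_{gmin}$):

```python
def reached_global_minimum(f_final, f_gmin):
    """A run is successful if its final value is within 1% of the known
    global minimum: F < F_gmin + 0.01 * |F_gmin| (with F_gmin != 0)."""
    return f_final < f_gmin + 0.01 * abs(f_gmin)

# Shekel-5: F_gmin = -10.1532, so the success threshold is about -10.0517.
print(reached_global_minimum(-10.10, -10.1532))   # True
print(reached_global_minimum(-9.90, -10.1532))    # False
```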

Unconstrained Test Problems

The first group of test problems contains seven multimodal functions of continuous variables without constraints. The so-called Shekel-5 function of four variables has one global minimum $F_{\min} = -10.1532$ at the point $X = \{4, 4, 4, 4\}^T$ and four local minima in the interval $-10 \le x_i \le 10$ ($i = 1, \ldots, 4$). Figure 1 illustrates the behavior of the Shekel-5 function in the plane $x_1 = x_3$, $x_2 = x_4$. A case study has been performed in order to find preferable values of the control parameters $\beta$, $N_{st}$, $N_{wt}$ for the Shekel-5 function optimization. The step reduction factor and the termination criterion were kept constant: $S_{rf} = 0.5$, $\varepsilon = 10^{-7}$. The results shown in Table I, in terms of the percentage of runs achieving the global minimum, $P_{FGM}$, and the number of function evaluations, $N_{FE}$, represent values averaged over 100 independent runs from randomly selected starting points. The cases in Table I with a single-start search ($N_{st} = 1$) correspond to the RSPR algorithm.

As can be noticed, in some cases an increase in the number of allowable wrong moves yields a surprising reduction in the percentage of runs ending at the global minimum. This fact indicates that, due to the stochastic character of the algorithm, the number of independent trials needed for good statistical estimates should be much higher than 100.

FIGURE 1 Two-dimensional view of the Shekel-5 function

In addition, the following observations can be made:

- The probability of finding the global minimum with the MARS algorithm rises significantly with the number of starts $N_{st}$.
- The required number of function evaluations, $N_{FE}$, increases with the number of permitted wrong moves, $N_{wt}$.
- The value of $P_{FGM}$ is not very sensitive to the precision of the multistart search, $\beta$, or to the permitted number of wrong trials, $N_{wt}$.

Strictly speaking, these conclusions are only valid for the tested Shekel-5 function in the considered range of control parameters.

TABLE I Performance of the MARS algorithm for the Shekel-5 function at different values of the control parameters ($S_{rf} = 0.5$, $\varepsilon = 10^{-7}$)

For each particular case, the "optimum" set of control parameters $\beta$, $N_{st}$, $N_{wt}$, allowing discovery of the global minimum with almost 100% probability at a minimum number of function evaluations, may be established experimentally. As can be seen from Table I, such a set for the Shekel-5 function is $\beta = 0.05$, $N_{st} = 15$ and $N_{wt} = 10$. Fortunately, it was found that the above values can be successfully applied to many other cases. Table II illustrates the performance of the MARS algorithm for seven test functions given in the Appendix. (Here and below, the sequential number of each problem corresponds to that in the Appendix.)

In Table II, the results for the MARS algorithm are averaged over 1000 independent runs. In addition to the percentage of runs ending at the global minimum, $P_{FGM}$, and the average number of function evaluations, $N_{FE}$, the average number of successful moves, $N_{sm}$, and the average CPU time, expressed as a number of standardized time units, $T_{STU}$, are shown. Note that the standardized time unit is taken as the CPU time needed for 1000 calls of the subroutine evaluating the Shekel-5 function at the point $(4, 4, 4, 4)^T$. This time is about 0.06 sec on a Pentium 60 MHz personal computer. The final values of the objective functions coincide (to an accuracy of about $10^{-5}$) with the literature data given in the Appendix. Table II also shows the computational results of Brooks and Verdini [11] for the SRSA (Self-Regulating Simulated Annealing, with adjustment of the step size and direction) and GSA (Generalized Simulated Annealing, with a two-phase procedure) methods.

TABLE II Unconstrained optimization of different multimodal test functions

Function        | MARS ($\beta = 0.05$, $N_{st} = 15$, $N_{wt} = 10$): $P_{FGM}$, $N_{FE}/N_{sm}$, $T_{STU}$ | SRSA (Ref. [11]): $P_{FGM}/N_{FE}$
Goldstein-Price | 100.0, 2011/167, 2.5 | 99/1186
Branin (RCOS)   | 100.0, 2465/201, 3.1 | 100/557

The authors of Ref. [11] noted that in some cases, such as for the Hartman-3 function, the algorithm convergence is very sensitive to the control parameter settings, particularly the step size and the "temperature". Therefore, significant time must be spent selecting reasonable values for the control parameters in the SA search. Moreover, these parameters vary from case to case in Table II. Generally, the results shown in Table II indicate that the MARS algorithm outperforms the SA methods for the most difficult functions: the Hartman and Shekel families. Note that for the "easy" cases in Table II, the global minimum can be reached by the MARS method with much less effort. For instance, $P_{FGM} = 100\%$ and $N_{FE} = 391$ are obtained with the parameter values $N_{st} = 1$, $N_{wt} = 10$ for the Branin (RCOS) function, and $P_{FGM} = 98.8\%$ and $N_{FE} = 626$ at $N_{st} = 2$, $N_{wt} = 10$ for the Goldstein-Price function. On the other hand, for the more difficult case of the Shekel functions, $P_{FGM} = 100\%$ is achieved only at $N_{st} \ge 13$.

The disadvantage of the simulated annealing strategy for global optimization with continuous variables is explained graphically in Figure 2. In the SA algorithm, the ability to escape a local minimum is provided by accepting uphill moves. Consider the points A and B in Figure 2, which are located in the vicinity of the local and the global minimum, respectively. The probability of moving uphill from point A toward the global minimum is essentially the same as the probability of moving uphill from point B and leaving the global minimum.

FIGURE 2 Uphill and downhill moves in the simulated annealing algorithm.

In addition, if the step size at point A is too small, a number of sequential uphill moves will be required to avoid being trapped in the local minimum. The probability of several sequential moves in the uphill direction decreases rapidly with the number of moves. Therefore, the acceptance of uphill moves can lead to cycling in the procedure. In contrast, the MARS algorithm escapes local minima by exploring the whole domain of interest in the course of the global pre-optimization search.

One of the most attractive features of the MARS method, allowing a significant reduction of algorithm "tuning" for a given class of problems, is its robustness with respect to the search control parameters. The parameters $N_{st}$, $N_{wt}$, $S_{rf}$, $\beta$ and $\varepsilon$ have clear physical meanings and predictable effects on the search performance.

The last test problem in this section (Problem 8 of the Appendix) was studied by Price [22], who noted that this function is particularly difficult to optimize due to the presence of exponential terms producing a very deep and narrow multidimensional canyon. This function of nine variables originates from the least-squares approach to the solution of a set of nine simultaneous nonlinear equations arising in the course of transistor modeling. The global minimum of this non-negative function represents, at the same time, the solution of the equation $F(X) = 0$. It is unknown how many minima exist within the search domain $0 \le x_i \le 10$, but one is located near the point $X_{\min}$ shown in Table III below. The value of the objective function at that point, calculated with double precision, is $F(X) = 1.876 \times 10^{-7}$. The behavior of the Price function is illustrated in Figure 3, where only one of the variables $x_1, \ldots, x_9$ (marked on each curve) is varied while the other eight variables remain fixed at the above point $X_{\min}$. In the close vicinity of the needle-shaped minimum the function changes by seven orders of magnitude. Price [22] employed his CRS (Controlled Random Search) method, which combines random sampling over the whole domain of interest with a clustering technique. The CRS algorithm did not find the global minimum after six sequential runs, which together involved about 200,000 function evaluations. The results of the first, second and sixth runs are presented in Table III in the columns denoted Price-1, Price-2 and Price-6. In all sequential runs, the center of the search interval for each variable was shifted to the best value found in the previous run, while the interval length remained the same: $[x_i - 5, x_i + 5]$. Starting from the second run, the minimum points were outside the initial search interval [0, 10].

TABLE III Optimization of the Price test function (Ref. [22]). The columns compare the rounded-off minimum point $X_{\min}$ with the end points of Price's runs 1, 2 and 6 (Price-1, Price-2, Price-6) and with the MARS results (MARS-1, averaged over 100 runs, and MARS-1 followed by the RSPR fine search); for each case the table lists the objective function $F(X_{\min})$ at the rounded-off point and the actual minimum value $f_{\min}$ found during the search.

Finally, even the lowest function value established by Price remained above the global minimum.

It should be noted that the present objective function is very sensitive to small variations of the variables in the vicinity of the local minima. Thus, simply rounding off the x-values causes a significant change in the objective function. This effect is illustrated in Table III, where the actual function values found during the optimization search, $f_{\min}$, are compared with the function values $F(X_{\min})$ calculated at the rounded-off values of $x_i$ given in the same column. Although such sensitivity is not typical for engineering applications, the Price problem may be recommended for testing different optimization algorithms.

In the present study, the Price problem has been treated by running the MARS algorithm 100 times from random starting points. The following values of the control parameters were used: $\beta = 0.001$, $\varepsilon = 1 \times 10^{-7}$, $N_{st} = 50$ and $N_{wt} = 50$. In 22% of the cases, the final result was close to the global minimum point described above. The values of $x_i$ and $f_{\min}$ averaged over these successful runs are shown in the column denoted MARS-1. The average number of function evaluations is very high: 3,673,500, which corresponds to a CPU time of 22.6 min. Note that there is also an unusually high number of successful moves: 1,070,000. The last column in Table III shows the case where the previous results were taken as the starting point for a fine search by the RSPR method with a reduced search interval ($S_f = 0.1$), an increased number of allowed wrong moves, $N_{wt} = 500$, and a tightened tolerance.

This search converges very closely to the global minimum after 516,000 moves (CPU time 3.25 min).

Test Problems for Continuous Variables with Constraints

Generally, in constrained optimization by the MARS method, a considerable number of moves turn out to be infeasible due to constraint violations. In some cases, the constraint evaluation involves the function evaluation as well. As a result, many more function evaluations are required in comparison with the unconstrained case.

Problem 9 Quadratic objective function, 5 continuous variables, 6 nonlinear inequality constraints.

The MARS algorithm easily solves this problem with the following values of the control parameters: $\beta = 0.05$, $\varepsilon = 1 \times 10^{-7}$, $S_{rf} = 0.5$, $N_{st} = 15$ and $N_{wt} = 25$. The moves violating the constraints were rejected. The minimum values found in each of 100 independent runs are below $f = -30561.41$, which is only 0.36% higher than the known value of the global minimum. The average number of moves is 7615, and the average number of right moves is 619. The average value of the vector of variables is $X = \{78.022, 33.106, 30.054, 44.517, 36.818\}^T$, and the average minimum value of the objective function is $f_{\min} = -30642.22$.

Problem 10 Nonlinear optimization, 3 continuous variables, 2 equality constraints.

Using the constraint equations, the variables $x_2$ and $x_3$ can be expressed in terms of $x_1$. Finally, the problem is reduced to the following one:

Minimize $f(X) = 1000 - x_1^2 - 2x_2^2 - x_3^2 - x_1 x_2 - x_1 x_3$

where

$x_2 = \left(-b - \sqrt{b^2 - 4ac}\right)/(2a)$,
$a = 5$; $b = -32(1 - x_1/7)$; $c = 64(1 - x_1/7)^2 - 25 + x_1^2$,
$x_3 = 8(1 - x_1/7) - 2x_2$,
$0 \le x_1 \le 5$.
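With the coefficients reconstructed above, the reduced one-variable objective can be coded directly; taking the smaller root of the quadratic for $x_2$ reproduces the reported minimum $F_{\min} = 961.7151$ at $x_1 = 3.51241$:

```python
import math

def f_reduced(x1):
    """Problem 10 reduced to a single variable via the equality constraints."""
    a = 5.0
    b = -32.0 * (1.0 - x1 / 7.0)
    c = 64.0 * (1.0 - x1 / 7.0) ** 2 - 25.0 + x1 ** 2
    x2 = (-b - math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)  # smaller root
    x3 = 8.0 * (1.0 - x1 / 7.0) - 2.0 * x2
    return 1000.0 - x1**2 - 2.0 * x2**2 - x3**2 - x1 * x2 - x1 * x3

print(f_reduced(3.51241))  # approximately 961.715
```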

The RSPR algorithm was first applied to this single-variable unconstrained problem with the search control parameters $\varepsilon = 1 \times 10^{-7}$, $N_{st} = 1$ and $N_{wt} = 15$. The global minimum $x_1 = 3.51241$, $F_{\min} = 961.7151$ is found in 100% of the runs, with an average number of function evaluations $N_{FE} = 485$. Note that the calculations should be performed with double precision, since the effect of round-off errors in the square-root evaluation in the vicinity of the minimum is very strong. Then, the full three-variable equality-constrained problem was solved by the MARS method using the exterior penalty approach. The value of the penalty multiplier for the constraints was selected to be $r_p = 10$. The following values of the control parameters were used: $\beta = 0.001$, $\varepsilon = 1 \times 10^{-7}$, $N_{st} = 50$ and $N_{wt} = 50$. Statistics over 100 independent runs show that 76% of the cases reached a minimum function value below 961.719, which exceeds the known value of the global minimum by only 0.0005%. The average number of function evaluations is 56,740. The average value of the vector of variables is $X = \{3.5201, 0.21645, 3.5441\}^T$, which yields the objective function value $f = 961.717$ and near-zero residuals of the equality constraints $h_1$ and $h_2$.

Test Problems for Mixed-Discrete Variables

In this section the MARS algorithm performance is examined for a number of problems with mixed-discrete variables.

Problem 11 Unconstrained optimization, 4 integer variables.

The best global minimum value for this problem has been found using genetic methods [14, 16]: $f_{\min} = 2.7 \times 10^{-12}$. Wu and Chow [14] used population sizes between 20 and 60 and about 500 generations. The best results obtained by the deterministic methods (see Refs. [12, 14]) and by the SA method [12] are somewhat higher. The MARS algorithm has been run over 100 trials with the following control parameters: $N_{st} = 25$, $N_{wt} = 25$, $S_{rf} = 0.5$, $\beta = 2.0$, $\varepsilon = 1.0$. In 95% of the trials the minimum values were the same as or better than those given by the SA method. The global minimum, $F_{\min} = 2.7 \times 10^{-12}$ at $X = \{19, 16, 43, 49\}^T$, has been found in 6% of the trials, with an average number of function evaluations $N_{FE} \approx 5100$.

Problem 12 Nonlinear optimization, 2 discrete and 2 continuous variables, 6 inequality constraints.

The best minimum value for this problem found by the simulated annealing [12] and genetic [16] algorithms is $f_{\min} = 7197.7$. Wu and Chow [14] reported a somewhat higher value, $f_{\min} = 7207.5$, obtained by a meta-genetic algorithm. The MARS method has been run 100 times with the following control parameters: $N_{st} = 5$, $N_{wt} = 50$, $S_{rf} = 0.7$ and $\beta = 0.05$. In 56% of the trials the above best minimum value, $f_{\min} = 7197.729$ at $X = \{1.125, 0.625, 58.29012, 44.69286\}^T$, was reached, with an average number of function evaluations $N_{FE} \approx 12{,}050$.

EXPERIENCE WITH ENGINEERING PROBLEMS

The algorithms presented in this paper have been successfully used by the authors for solving various engineering design problems in the fields of turbomachinery and compact heat exchangers. Some of these studies have been published in the open literature (Refs. [26-30]). For example, the RSPR and MARS methods have been applied to the following types of problems (a partial list):

- nonlinear regression analysis;
- optimal experimental design for testing turbine and compressor characteristics;
- analysis of hydraulic networks;
- geometric shape optimization of turbine blades;
- optimum design of airborne compact heat exchangers and of devices for cooling electronic equipment.

For most of the problems, the objective functions could not be explicitly expressed. In some cases, the objective functions are evaluated as a result of a computational procedure involving numerical errors (noise). The nonlinear constraints include parameters which depend on the objective function itself. The following is a brief description of three typical problems which were solved by the above-described algorithms.

Hydraulic Analysis of Pipe-Networks for Internal Cooling of Turbine Blades

The cooling fluid is a compressible gas flowing at high subsonic velocities through a system of interconnected ducts made within the turbine blade. The problem is to calculate the flow rate and pressure drop through each branch, provided that the inlet and exit pressures of the network are known. The schematic of a simple pipe network is shown in Figure 4. Mathematically, the problem is reduced to a system of algebraic equations containing two types of equations: a) nonlinear equations expressing the relationships between the flow rate $G_{i,j}$ and the pressure values at the ends of each branch $(i \to j)$, and b) mass conservation equations at each node:

$$G_{i,j} = f(P_i, P_j) \quad \text{(for all branches)}; \qquad \sum_{k} G_{i,k} = 0 \quad \text{(at each node } i\text{)}.$$

This is a highly nonlinear system, and standard methods of solution, such as Newton-Raphson, did not converge. The alternative approach was to apply the RSPR algorithm to the minimization of the sum of the absolute residuals of the mass balances over all nodes: $F = \sum_i \left| \sum_k G_{i,k} \right| \to \min$. The minimum of the objective function should approach zero at the solution point. The optimization variables are the pressure values at the different nodes. Thus, in this case, the optimization program has been used as an efficient solver of nonlinear algebraic equations.
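A sketch of such a residual objective follows; the data structures branches and nodes are illustrative assumptions, not taken from the paper:

```python
def network_residual(P, branches, nodes):
    """Sum of absolute node mass imbalances, F = sum_i |sum_k G_{i,k}|.
    P maps node -> pressure; branches maps (i, j) -> flow law G(P_i, P_j);
    nodes maps node -> list of (branch, sign) pairs, sign = +1 for inflow."""
    G = {edge: g(P[edge[0]], P[edge[1]]) for edge, g in branches.items()}
    return sum(abs(sum(sign * G[edge] for edge, sign in incident))
               for incident in nodes.values())

# Minimizing network_residual over the unknown node pressures, e.g. with the
# rspr() sketch above, drives F toward zero and thus solves the flow equations.
```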

FIGURE 4 Pipe network scheme.

Optimization of the Blade Twist Angle of a Radial Turbine (Reference [26])

The problem is to achieve the maximum thermodynamic efficiency of a radial turbine, $\eta$, by specifying the optimum distribution of the impeller exit angle $\beta = \beta(r)$ along the radius $r$. The problem geometry is shown schematically in Figure 5. Mathematically, in the context of a one-dimensional gas-dynamics model, the problem is described as follows:

Maximize the function $\eta = \eta[\beta(r)]$, defined as an integral over the exit radius $r$ of the streamline, which varies within the range $[r_1, r_2]$; here $r_1$ and $r_2$ are the hub and tip radii at the impeller exit. The lower bound of the integral, $r_1$, is prescribed, while the upper bound, $r_2$, is evaluated from a set of equations in which $G$ is a constant (the mass flow rate through the turbine) and $f_1$, $f_2$, $f_3$, $f_4$ are prescribed functions.

FIGURE 5 Geometry of the radial impeller

The unknown distribution of the twist angle $\beta = \beta(r)$ was specified at six points and then approximated by a cubic spline. Finally, the problem had six variables and both equality and inequality constraints. A special feature of the problem is that the objective function, $\eta$, experiences fluctuations as a result of truncation errors appearing in the course of the numerical evaluation of the above finite integrals. In this case, the RSPR method exhibited good performance. Attempts to improve the convergence at the end of the search by switching to the gradient steepest-descent method failed because of the low accuracy of the numerically evaluated derivatives.

Optimum Design of a Compact Airborne Heat Exchanger (Reference [28])

The problem of optimum design of compact heat exchangers is of major importance in the aircraft industry because of the demands for highly effective, light and reliable components. The typical core of a fin-and-plate heat exchanger is a sandwich-type construction where the hot and cold fluid streams pass through alternate finned layers separated by flat (parting) plates. The schematic of the core for a cross-flow arrangement is shown in Figure 6a. Here, $L_h$, $L_c$ and $L_{nf}$ are the core dimensions termed the hot-flow length, the cold-flow length and the no-flow length, respectively. Compact heat exchangers are built using a variety of plate-fin geometries, such as the plain straight fins shown in Figure 6b. The most important parameter characterizing heat exchanger performance is the thermal effectiveness, $\varepsilon$, representing the ratio of the actual heat transfer rate between the hot and cold fluids to the theoretically maximum heat transfer rate under certain ideal conditions. The simplest way to improve the effectiveness is to increase the area of the heat transfer surfaces on both the hot and cold sides. That may be achieved either by increasing the core dimensions ($L_h \times L_c \times L_{nf}$) or by reducing the fin pitch, $s$, and height, $b$. In both cases, the heat exchanger weight grows. In addition, making the fins with lower height and pitch results in a reduction of the hydraulic diameter of the finned passage, which, in turn, causes the flow pressure drops in the hot and cold circuits, $\Delta p_h$ and $\Delta p_c$, to grow (the subscripts "h" and "c" relate to the hot and cold flows, respectively). Usually, the customer specifications contain the minimum permitted value of the thermal effectiveness, $\varepsilon \ge \varepsilon^*$, and the maximum available values of the flow pressure drops, $\Delta p_h \le \Delta p_h^*$ and $\Delta p_c \le \Delta p_c^*$, under a number of operating conditions.

FIGURE 6 Compact fin-plate heat exchanger: (a) heat exchanger core; (b) finned passage geometry.

The core dimensions should be kept within the specified limits dictated by the available space and the installation interface, $L_h \le L_h^{\max}$, $L_c \le L_c^{\max}$, $L_{nf} \le L_{nf}^{\max}$, and the heat exchanger weight should not exceed the permitted value: $W \le W_{\max}$.

The optimal heat exchanger is defined as one that, while satisfying the imposed performance and construction requirements, achieves the highest thermal effectiveness or the lowest weight (or volume). The thermal effectiveness and the heat exchanger weight represent nonlinear multivariable objective functions. The core dimensions and the fin geometry parameters are the design variables. Note that the objective functions may be discontinuous. Thus, for example, the number of finned layers which can be packed within the fixed height of the heat exchanger is always an integer value. A small variation of the fin height on one or both sides of the heat exchanger may result in a change of the number of finned layers in the stack and, in turn, in a discrete change of the performance values $\varepsilon$, $\Delta p_h$, $\Delta p_c$. The maximum allowable pressure drops in the hot and cold flows, along with the restrictions on the heat exchanger weight, core size and minimum permitted thermal effectiveness, are typical examples of nonlinear implicit constraints for this problem.

CONCLUSIONS

In this paper a new Multistart Adaptive Random Search algorithm (MARS) has been presented for global, constrained optimization with continuous and mixed-discrete design variables. The effectiveness of the method was demonstrated on twelve benchmark problems found in the literature. The results indicate that the MARS method is able to discover the best known minima obtained by other techniques. The method requires a substantial number of objective function evaluations, which is, however, comparable with that of other stochastic methods: simulated annealing and genetic algorithms. Moreover, for all the considered test problems, the MARS algorithm generally outperforms simulated annealing. The clear physical meaning of the algorithm control parameters and their predictable effect on the search performance allow considerable time to be saved in the course of "tuning" the algorithm to a specific class of problems. The method is easily programmed on the computer and does not require extensive memory storage.

The authors' encouraging experience with the MARS algorithm for solving optimization problems in the field of turbomachinery and compact heat exchangers allows them to recommend the method for various engineering design applications.

References

[1] Dixon, L. C. W. and Szego, G. P. (Eds.) (1971). Towards Global Optimization. North-Holland Publishing Company, New York.
[2] Torn, A. and Zilinskas, A. (1988). Global Optimization. Lecture Notes in Computer Science, 350, Springer-Verlag, New York.
[3] Edgar, T. F. and Himmelblau, D. M. (1988). Optimization of Chemical Processes. McGraw-Hill, New York.
[4] Himmelblau, D. M. (1972). Applied Nonlinear Programming. McGraw-Hill, New York.
[5] Vanderplaats, G. N. (1984). Numerical Optimization Techniques for Engineering Design: with Applications. McGraw-Hill, New York.
[6] Press, W. H., Flannery, B. P., Teukolsky, S. A. and Vetterling, W. T. (1989). Numerical Recipes: The Art of Scientific Computing. Cambridge University Press.
[7] Rubinstein, R. Y. (1981). Simulation and the Monte Carlo Method. John Wiley and Sons, New York.
[8] Rinnooy Kan, A. H. G. and Timmer, G. T. (1984). Stochastic methods for global optimization. American Journal of Mathematical and Management Sciences, 4, 7-40.
[9] Rinnooy Kan, A. H. G. and Timmer, G. T. (1987). Stochastic global optimization methods. Part I: Clustering methods; Part II: Multi level methods. Mathematical Programming, 39, 27-78.
[10] Kirkpatrick, S., Gelatt, C. D. Jr. and Vecchi, M. P. (1983). Optimization by simulated annealing. Science, 220(4598), 671-680.
[11] Brooks, D. G. and Verdini, W. A. (1988). Computational experience with generalized simulated annealing over continuous variables. American Journal of Mathematical and Management Sciences, 8(3-4), 425-449.
[12] Zhang, C. and Wang, H. P. (1993). Mixed-discrete nonlinear optimization with simulated annealing. Engineering Optimization, 21, 277-291.
[13] Srinivas, M. (1994). Genetic algorithms: A survey. Computer, June 1994, 17-26.
[14] Wu, S. J. and Chow, P. T. (1995). Genetic algorithms for nonlinear mixed discrete-integer optimization problems via meta-genetic parameter optimization. Engineering Optimization, 24, 137-159.
[15] Syrjakow, M. and Szczerbicka, H. (1994). Optimization of simulation models with REMO. In Proceedings of the Conference on Modelling and Simulation ESM94, Barcelona, Spain (Eds. A. Guasch and R. M. Huber), 274-281.
[16] Lin, S., Zhang, C. and Wang, H. P. (1995). On mixed-discrete nonlinear optimization problems: A comparative study. Engineering Optimization, 23, 287-300.
[17] Hu, N. (1992). Tabu search method with random moves for globally optimum design. International Journal for Numerical Methods in Engineering, 35, 1055-1070.
[18] Dhingra, A. K. and Bennage, W. A. (1995). Discrete and continuous variable optimization using tabu search. Engineering Optimization, 24, 177-196.
[19] Rastrigin, L. A. (1965). Random Optimization Search in Multi-Parametric Systems (in Russian). Zinatne, Riga.
[20] Luus, R. and Jaakola, T. H. I. (1973). Optimization by direct search and systematic reduction of the size of the search region. AIChE Journal, 19(4), 760-765.
[21] Floudas, C. A. and Pardalos, P. M. (1989). A Collection of Test Problems for Constrained Global Optimization Algorithms. Lecture Notes in Computer Science, 455, Springer-Verlag, New York.
[22] Price, W. L. (1971). A controlled random search procedure for global optimization. In Towards Global Optimization (Eds. L. C. W. Dixon and G. P. Szego), North-Holland Publishing Company, New York, 71-84.
[23] Jain, P. and Agogino, A. M. (1993). Global optimization using the multistart method. ASME Journal of Mechanical Design, 115, 770-775.



[24] Santi, L. M., Townsend, M. A. and Johnson, G. E. (1982). Noniterative penalty function technique for constrained optimization. Engineering Optimization, 6, 63-76.

[25] Sandgren, E. (1990). Nonlinear integer and discrete programming in mechanical design optimization. Transactions of the ASME, Journal of Mechanical Design, 112(2), 223-229.

[26] Kirillov, A. I., Birdjakov, M. B. and Litinetski, V. V. (1975). O ratsionalnoi zakrutke rabotchich lopatok radialno-osevoi turbinnoi stupeni. Izvestija Vischich Uchebnich Zavedenij SSSR, Energetika, 6, 64-68. (In Russian). (On optimal blade twisting in a centripetal turbine.)

[27] Kirillov, I. I. and Litinetski, V. V. (1985). O vibore optimalnich parametrov radialno-osevych turbinnich stupenej. Izvestija Vischich Uchebnich Zavedenij SSSR, Energetika, 3, 118-121. (In Russian). (Optimal parameters of radial turbines.)

[28] Abramzon, B. and Ostersetzer, S. (1993). Optimal thermal and hydraulic design of compact heat exchangers and cold plates for cooling of electronic components. In Aerospace Heat Exchanger Technology 1993, Eds. R. K. Shah and A. Hashemi, Elsevier, Amsterdam, 349-368.

[29] Abramzon, B. (1996). Optimum design of the compact fin-and-plate heat exchangers. Proceedings of the 26th Israel Conference on Mechanical Engineering, Haifa, 312-338.

[30] Abramzon, B. (1995). Thermal analysis and optimal design of heat sinks for cooling of electronic components. Proceedings of the Sixth Israel Conference on Packaging and Cooling of the Electronic Equipment, Herzlia, Israel.

APPENDIX: TEST FUNCTIONS FOR GLOBAL OPTIMIZATION

This Appendix contains some of the test functions for global optimization used in the present study. The vector of variables is denoted as $X = \{x_1, \ldots, x_n\}^T$.

Problem 1 Goldstein and Price function (Refs. [1, 2]), two variables.

$$F(x_1, x_2) = p(x_1, x_2)\, q(x_1, x_2)$$

$$p(x_1, x_2) = 1 + (x_1 + x_2 + 1)^2\,(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)$$

$$q(x_1, x_2) = 30 + (2x_1 - 3x_2)^2\,(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)$$

Region of interest: $-2 \le x_1, x_2 \le 2$. There are four local minima in this region. Global minimum: $X = \{0, -1\}^T$; $F_{\min} = 3$.
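A direct transcription of Problem 1, useful for checking a candidate optimizer (a sketch; the function and variable names are ours, not the paper's):

```python
def goldstein_price(x1, x2):
    """Goldstein-Price test function (Problem 1), transcribed from the formulas above."""
    p = 1 + (x1 + x2 + 1)**2 * (19 - 14*x1 + 3*x1**2 - 14*x2 + 6*x1*x2 + 3*x2**2)
    q = 30 + (2*x1 - 3*x2)**2 * (18 - 32*x1 + 12*x1**2 + 48*x2 - 36*x1*x2 + 27*x2**2)
    return p * q

assert goldstein_price(0.0, -1.0) == 3.0  # the global minimum quoted above
```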

Problem 2 Branin (RCOS) function (Refs. [1, 2]), two variables.

$$F(x_1, x_2) = a\,(x_2 - b\,x_1^2 + c\,x_1 - d)^2 + e\,(1 - f)\cos x_1 + e$$

where: $a = 1$, $b = 5.1/(4\pi^2)$, $c = 5/\pi$, $d = 6$, $e = 10$, $f = 1/(8\pi)$.


Region of interest: $-5 \le x_1 \le 10$, $0 \le x_2 \le 15$. There are three minima, all global, in this region: $F_{\min} = 0.39789$ at

$$X_1 = \{-3.1416, 12.275\}^T, \quad X_2 = \{3.1416, 2.275\}^T, \quad X_3 = \{9.4248, 2.475\}^T$$
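Problem 2 transcribed the same way (a sketch; the three minimizers quoted above are $-\pi$, $\pi$ and $3\pi$ in the first coordinate to the stated precision):

```python
import math

def branin(x1, x2):
    """Branin (RCOS) test function (Problem 2), using the constants a..f above."""
    a, b, c = 1.0, 5.1 / (4 * math.pi**2), 5.0 / math.pi
    d, e, f = 6.0, 10.0, 1.0 / (8 * math.pi)
    return a * (x2 - b * x1**2 + c * x1 - d)**2 + e * (1 - f) * math.cos(x1) + e

# All three global minimizers give the same value, about 0.39789.
for x in [(-math.pi, 12.275), (math.pi, 2.275), (3 * math.pi, 2.475)]:
    print(branin(*x))
```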

Problems 3 and 4 Hartman's functions (Refs. [1, 2])

$$F(X) = -\sum_{i=1}^{m} c_i \exp\left[-\sum_{j=1}^{n} \alpha_{ij}\,(x_j - p_{ij})^2\right]$$

where the coefficients $\alpha_{ij}$, $p_{ij}$ and $c_i$ ($i = 1, 2, \ldots, m$; $j = 1, 2, \ldots, n$) are given in Tables AI, AII below for $n = 3, 6$ (three and six variables). The search domain is $0 \le x_j \le 1$, $j = 1, \ldots, n$.

The function has one global minimum in each case: for $n = 3$, $F_{\min} = -3.8628$ at $X = \{0.1146, 0.5556, 0.8525\}^T$; for $n = 6$, $F_{\min} = -3.3224$ at $X = \{0.2017, 0.1500, 0.4769, 0.2753, 0.3117, 0.6573\}^T$.

TABLE AI Coefficients of the Hartman function: n = 3

TABLE AII Coefficients of the Hartman function: n = 6
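Given coefficient arrays laid out as in Tables AI and AII, the evaluator is only a few lines. A sketch, where the argument names alpha, p, c are ours and are assumed to hold the tabulated $\alpha_{ij}$, $p_{ij}$, $c_i$:

```python
import numpy as np

def hartman(x, alpha, p, c):
    """Hartman test function: alpha and p are (m, n) arrays of tabulated
    coefficients, c is a length-m vector; x is a point with 0 <= x_j <= 1."""
    x = np.asarray(x, dtype=float)
    # Broadcasting (x - p) subtracts the point from every row p_i of the table.
    return -float(np.sum(c * np.exp(-np.sum(alpha * (x - p)**2, axis=1))))
```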


Problems 5, 6 and 7 Shekel's functions (Refs. [1, 2])

$$F(X) = -\sum_{i=1}^{m} \frac{1}{(X - a_i)^T (X - a_i) + c_i}$$

where the coefficients $a_{i,j}$, $c_i$ ($i = 1, 2, \ldots, m$; $j = 1, 2, \ldots, n$) are given in Table AIII below for $n = 4$ (four variables). The search domain is $0 \le x_j \le 10$, $j = 1, \ldots, n$. The Shekel functions have one global minimum at the point $X = \{4.0, 4.0, 4.0, 4.0\}^T$ with the following values: at $m = 5$: $F_{\min} = -10.1532$; at $m = 7$: $F_{\min} = -10.4029$; and at $m = 10$: $F_{\min} = -10.5364$. There are a number of local minima near the points $X_i = \{a_{i,1}, \ldots, a_{i,4}\}^T$, with values approximately equal to $-1/c_i$.

TABLE AIII Coefficients of Shekel's function: n = 4; m = 5, 7, 10
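A sketch of a Shekel evaluator follows. The coefficient values hard-coded below are the standard ones from the global-optimization literature (cf. Refs. [1, 2]) and are assumed to match Table AIII:

```python
import numpy as np

# Standard Shekel coefficients (assumed to match Table AIII): rows of A are
# the points a_i, C holds the corresponding c_i.
A = np.array([[4, 4, 4, 4], [1, 1, 1, 1], [8, 8, 8, 8], [6, 6, 6, 6],
              [3, 7, 3, 7], [2, 9, 2, 9], [5, 5, 3, 3], [8, 1, 8, 1],
              [6, 2, 6, 2], [7, 3.6, 7, 3.6]], dtype=float)
C = np.array([0.1, 0.2, 0.2, 0.4, 0.4, 0.6, 0.3, 0.7, 0.5, 0.5])

def shekel(x, m):
    """Shekel function with m = 5, 7 or 10 terms."""
    x = np.asarray(x, dtype=float)
    return -float(np.sum(1.0 / (np.sum((x - A[:m])**2, axis=1) + C[:m])))

print(shekel([4.0, 4.0, 4.0, 4.0], 10))  # approximately -10.536
```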

Problem 8 Price's Function (Ref. [22]), nine variables



Numerical constants $g_{i,k}$ are given in Table AIV. Region of interest: $-12 \le x_i \le 12$, $i = 1, \ldots, 9$. The Price function has one known global minimum, $F_{\min} = 1.876 \cdot 10^{-7}$, at the point $X = \{0.9, 0.45, 1.0, 2.0, 8.0, 8.0, 5.0, 1.0, 2.0\}^T$.

TABLE AIV Constants $g_{i,k}$ for the Price function

Problem 9 (Cited in Ref. [24]; also Ref. [4], problem 11). Nonlinear optimization, 5 continuous variables, 6 inequality constraints.

Minimize

$$F(X) = 5.3578547\,x_3^2 + 0.8356891\,x_1 x_5 + 37.293239\,x_1 - 40792.141$$

Subject to

$$0 \le 85.334407 + 0.0056858\,x_2 x_5 + 0.0006262\,x_1 x_4 - 0.0022053\,x_3 x_5 \le 92$$

$$90 \le 80.51249 + 0.0071317\,x_2 x_5 + 0.0029955\,x_1 x_2 + 0.0021813\,x_3^2 \le 110$$

$$20 \le 9.300961 + 0.0047026\,x_3 x_5 + 0.0012547\,x_1 x_3 + 0.0019085\,x_3 x_4 \le 25$$

$$78 \le x_1 \le 102; \quad 33 \le x_2 \le 45; \quad 27 \le x_3 \le 45; \quad 27 \le x_4 \le 45; \quad 27 \le x_5 \le 45$$

Constrained minimum: $F_{\min} = -30665.5$ at $X = \{78.0, 33.0, 29.995, 45.0, 36.776\}^T$

Feasible starting point: $X = \{78.0, 33.0, 27.0, 27.0, 27.0\}^T$
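As a worked check, the sketch below evaluates Problem 9 at the quoted constrained minimum. The simple quadratic exterior penalty is purely illustrative; it is not the constraint-handling scheme of the paper:

```python
def f9(x):
    """Objective of Problem 9."""
    x1, x2, x3, x4, x5 = x
    return 5.3578547*x3**2 + 0.8356891*x1*x5 + 37.293239*x1 - 40792.141

def g9(x):
    """The three double-sided constraints of Problem 9 as (value, lo, hi) triples."""
    x1, x2, x3, x4, x5 = x
    return [(85.334407 + 0.0056858*x2*x5 + 0.0006262*x1*x4 - 0.0022053*x3*x5, 0, 92),
            (80.51249 + 0.0071317*x2*x5 + 0.0029955*x1*x2 + 0.0021813*x3**2, 90, 110),
            (9.300961 + 0.0047026*x3*x5 + 0.0012547*x1*x3 + 0.0019085*x3*x4, 20, 25)]

def penalized(x, r=1e6):
    """Illustrative quadratic penalty formulation (not the paper's scheme)."""
    pen = sum(max(0.0, lo - v)**2 + max(0.0, v - hi)**2 for v, lo, hi in g9(x))
    return f9(x) + r * pen

x_star = [78.0, 33.0, 29.995, 45.0, 36.776]
print(f9(x_star))  # about -30665.5; all three constraints hold at x_star
```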

Problem 10 (Cited in Ref. [4], problem 5). Nonlinear optimization, 3 continuous variables, 2 equality constraints.

Minimize

$$F(X) = 1000 - x_1^2 - 2x_2^2 - x_3^2 - x_1 x_2 - x_1 x_3$$

Subject to

$$h_1(X) = x_1^2 + x_2^2 + x_3^2 - 25 = 0$$

$$h_2(X) = 8x_1 + 14x_2 + 7x_3 - 56 = 0; \quad 0 \le x_i \le 5 \;\; (i = 1, 2, 3)$$



Constrained minimum: $F_{\min} = 961.715$ at $X = \{3.512, 0.217, 3.552\}^T$.

Feasible starting point: $X = \{2.0, 2.0, 2.0\}^T$
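A quick numerical check of Problem 10 at the quoted minimum (a sketch; because the quoted point is rounded to three decimals, the equality residuals come out only approximately zero):

```python
def f10(x):
    """Objective of Problem 10."""
    x1, x2, x3 = x
    return 1000 - x1**2 - 2*x2**2 - x3**2 - x1*x2 - x1*x3

def h10(x):
    """Equality-constraint residuals h1, h2 of Problem 10."""
    x1, x2, x3 = x
    return (x1**2 + x2**2 + x3**2 - 25, 8*x1 + 14*x2 + 7*x3 - 56)

x_star = (3.512, 0.217, 3.552)
print(f10(x_star), h10(x_star))  # about 961.7; both residuals near zero
```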

Problem 11 Gear Train Design (Sandgren [25]). Unconstrained optimization, 4 integer variables. The objective function for this problem is expressed as

$$F(X) = \left(\frac{1}{6.931} - \frac{x_1 x_2}{x_3 x_4}\right)^2$$

where the design variables $x_i$ are integers varying in the range $12 \le x_i \le 60$ ($i = 1, \ldots, 4$). The best global minimum value for this problem has been found using the genetic methods [12, 13]: $F_{\min} = 2.7 \cdot 10^{-12}$.
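Because the search space contains only $49^4$ points, Problem 11 can even be solved by exhaustive enumeration, which makes it a convenient sanity check for any stochastic method. A sketch (pure Python; the full sweep may take on the order of a minute):

```python
from itertools import product

def gear(x1, x2, x3, x4):
    """Gear train objective: squared error of the tooth ratio against 1/6.931."""
    return (1.0/6.931 - (x1 * x2) / (x3 * x4))**2

# Exhaustive search over the integer box 12..60 in each variable.
best = min(product(range(12, 61), repeat=4), key=lambda x: gear(*x))
print(best, gear(*best))  # the best known value is about 2.7e-12
```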

Problem 12 Pressure Vessel Design (Sandgren [25]). Nonlinear optimization, 2 discrete and 2 continuous variables, 6 inequality constraints.

Minimize

$$F(X) = 0.6224\,x_1 x_3 x_4 + 1.7781\,x_2 x_3^2 + 3.1661\,x_1^2 x_4 + 19.84\,x_1^2 x_3$$

Subject to

$$g_1(X) = x_1 - 0.0193\,x_3 \ge 0; \quad g_2(X) = x_2 - 0.00954\,x_3 \ge 0$$

$$g_3(X) = (4\pi/3)\,x_3^3 + \pi\,x_3^2 x_4 - 750 \times 1728.0 \ge 0$$

$$g_4(X) = 240.0 - x_4 \ge 0; \quad g_5(X) = x_1 - 1.1 \ge 0; \quad g_6(X) = x_2 - 0.6 \ge 0$$

$x_1$, $x_2$ are discrete variables varying with the step 0.0625, with lower bounds $x_1 \ge 1.125$ and $x_2 \ge 0.625$; $x_3$, $x_4$ are continuous variables, $40 \le x_3 \le 80$ and $20 \le x_4 \le 60$.

The best minimum is $F_{\min} = 7197.7$ at $X = \{1.125, 0.625, 58.29012, 43.69286\}^T$.
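A direct transcription of Problem 12 for checking the quoted optimum (a sketch; function names are ours):

```python
import math

def f12(x1, x2, x3, x4):
    """Pressure vessel cost function of Problem 12."""
    return (0.6224*x1*x3*x4 + 1.7781*x2*x3**2
            + 3.1661*x1**2*x4 + 19.84*x1**2*x3)

def feasible12(x1, x2, x3, x4):
    """All six inequality constraints of Problem 12."""
    return (x1 - 0.0193*x3 >= 0 and x2 - 0.00954*x3 >= 0
            and (4*math.pi/3)*x3**3 + math.pi*x3**2*x4 - 750*1728.0 >= 0
            and 240.0 - x4 >= 0 and x1 - 1.1 >= 0 and x2 - 0.6 >= 0)

x_best = (1.125, 0.625, 58.29012, 43.69286)
# About 7198, matching the quoted 7197.7 to the rounding of the point; True.
print(f12(*x_best), feasible12(*x_best))
```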
