Tabu search for fuzzy optimization and applications


Chunguang Li *, Xiaofeng Liao, Juebang Yu

Institute of Electronic Systems, College of Electronic Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054, PR China

Received 10 June 2003; accepted 23 July 2003

Information Sciences 158 (2004) 3-13
www.elsevier.com/locate/ins

Abstract

In this paper, we present a tabu search technique to approximately solve fuzzy optimization problems. We demonstrate the performance of the proposed method by applying it to an elementary fuzzy optimization problem. Other applications of the method to fuzzy linear programming, fuzzy regression and the training of fuzzy neural networks are also presented.
© 2003 Elsevier Inc. All rights reserved.

Keywords: Tabu search; Fuzzy optimization; Fuzzy linear programming; Fuzzy linear regression; Fuzzy neural network

* Corresponding author. Tel.: +86-28-83202508. E-mail address: cgli@uestc.edu.cn (C. Li).
doi:10.1016/j.ins.2003.07.015

1. Introduction

Fuzzy optimization problems play a very important role in many fuzzy systems. In recent years, several kinds of fuzzy optimization problems have appeared in the literature [2-5,7,12,13], and with them different approaches to their resolution have been proposed. Among the many methods presented in the literature, we mention the fuzzy genetic algorithm (FGA) proposed by Buckley and Hayashi [5], which can produce good approximate solutions to fuzzy optimization problems. It should be pointed out, however, that the FGA is computationally expensive.

We introduce here a tabu search (TS) method for fuzzy optimization problems. TS is a stochastic global optimization method originally developed by Glover [8,9] for very large combinatorial optimization tasks and extended to continuous-valued functions in [6]. For notational convenience, we denote our method FTS in the rest of this paper.

Our work has been inspired by the fact that TS is very general and conceptually much simpler than simulated annealing (SA) or genetic algorithms (GA). Besides, it has been shown that TS is superior to SA and GA both in the time required to obtain a solution and in the solution quality for many optimization problems [6,8-11].

The rest of this paper is organized as follows. In the next section, we briefly describe the basic TS technique. In Section 3, we present our FTS algorithm. In Section 4, the FTS is applied to generate good approximate solutions to an elementary fuzzy optimization problem; simulation results show that it outperforms the FGA on this problem. More substantial applications are given in Section 5, where we discuss FTS solutions to fuzzy linear programming (FLP), fuzzy linear regression (FLR) and the training of fuzzy neural networks (FNNs). Finally, conclusions are summarized in Section 6.

Let us now introduce some basic notation. We place a bar over a symbol if it represents a fuzzy set, so $\bar{A}, \bar{B}, \bar{C}, \ldots, \bar{X}, \bar{Y}, \bar{Z}$ all denote fuzzy sets. All our fuzzy sets are fuzzy subsets of the real numbers. If $\bar{A}$ is a fuzzy set, then $\bar{A}(x)$ denotes its membership function evaluated at $x$. A triangular fuzzy number $\bar{N}$ is defined by three numbers $a < b < c$ where (1) $\bar{N}(x) = 0$ for $x \le a$ and $x \ge c$, and $\bar{N}(b) = 1$; and (2) the graph of $y = \bar{N}(x)$ is a straight line segment from $(a, 0)$ to $(b, 1)$ on $[a, b]$ and a straight line segment from $(b, 1)$ to $(c, 0)$ on $[b, c]$. We write $\bar{N} = (a, b, c)$ for triangular fuzzy numbers. We use the standard arithmetic of fuzzy sets based on the extension principle.
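To make the notation concrete, the following Python sketch evaluates the membership function of a triangular fuzzy number $\bar{N} = (a, b, c)$ exactly as defined above; the function name is ours, not the paper's.

```python
def tri_membership(x, a, b, c):
    """Membership value of the triangular fuzzy number N = (a, b, c) at x."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising segment from (a, 0) to (b, 1)
    return (c - x) / (c - b)       # falling segment from (b, 1) to (c, 0)

# Example: N = (0.0, 0.5, 1.0) evaluated at a few points
print([round(tri_membership(x, 0.0, 0.5, 1.0), 2) for x in (0.0, 0.25, 0.5, 0.75, 1.0)])
# [0.0, 0.5, 1.0, 0.5, 0.0]
```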
2. General description of tabu search

TS is a metaheuristic global optimization method originally developed by Glover [8,9] for large combinatorial optimization tasks, and extended to continuous-valued functions in [6]. It differs from the well-known hill-climbing local search techniques in that it does not become trapped in local solutions: TS allows moves out of the current solution that worsen the objective function, in the hope of eventually reaching a better solution. It has been shown in the literature [1,6,8-11] that TS is superior to SA and GA, both in the time required to obtain a solution and in the solution quality, on some combinatorial optimization problems.

TS requires the following basic elements to be defined [1]:

• Configuration: a solution or an assignment of values to variables.
• Move: a transition $s' \to s''$ from one trial solution $s'$ to another $s''$.
• Set of candidate moves (neighborhood, or trial solutions): the set of all possible moves out of a current configuration.
• Tabu restrictions: certain conditions imposed on moves which make some of them forbidden; these forbidden moves are known as tabu and are maintained in short-term memory on a list, called the tabu list. After a specified duration (the tabu list size), they are removed from the list and are free to be visited again.
• Aspiration criteria: rules that override tabu restrictions, i.e. if a certain move is forbidden by a tabu restriction, then the aspiration criterion, when satisfied, can make this move allowable.

Given the above basic elements, the TS scheme can be described as follows. Start with a certain configuration and evaluate the objective function for that configuration, then follow a certain set of candidate moves. If the best of these moves is not tabu, or if it is tabu but satisfies the aspiration criterion, pick that move and consider it to be the new current configuration; otherwise, pick the best move that is not tabu and consider it to be the new current configuration. Repeat the procedure for a certain number of iterations. On termination, the best solution obtained so far is the solution returned by the TS approach. A recent trend in the TS field is the use of dynamic tabu lists [10,11], which in their most common form use variable list sizes as a way of managing short-term memory. In this paper, we use a dynamic tabu list size strategy during the course of the search.
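The scheme just described can be summarized in a short, generic sketch. The following Python code is a minimal illustration for a maximization problem, assuming a user-supplied objective f and neighborhood generator neighbors and using a simple fixed-length tabu list of visited solutions; it is not the paper's FTS, which uses the cell-based neighborhood and the dynamic list sizes described in Section 3.

```python
from collections import deque
import random

def tabu_search(f, initial, neighbors, n_iters=200, tabu_size=10):
    """Generic TS scheme: accept the best candidate move that is either
    not tabu or satisfies the aspiration criterion (beats the best so far)."""
    current = best = initial
    best_val = f(initial)
    tabu = deque(maxlen=tabu_size)            # short-term memory of recent solutions
    for _ in range(n_iters):
        candidates = neighbors(current)       # must return a list of trial solutions
        candidates.sort(key=f, reverse=True)  # best (largest objective) first
        for cand in candidates:
            aspiration = f(cand) > best_val   # aspiration overrides the tabu status
            if cand not in tabu or aspiration:
                current = cand
                break                         # otherwise keep the current solution
        tabu.append(current)
        if f(current) > best_val:
            best, best_val = current, f(current)
    return best, best_val

# Toy usage: maximize f(x) = -(x - 3)^2 over the integers with small random moves
f = lambda x: -(x - 3) ** 2
neigh = lambda x: [x - 1, x + 1, x + random.choice((-2, 2))]
print(tabu_search(f, initial=0, neighbors=neigh))
```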
3. Tabu search for fuzzy optimization

A general function $\bar{F}$ with fuzzy input and output can be expressed as

$\bar{Y} = \bar{F}(\bar{X})$    (1)

where the input $\bar{X}$ is a fuzzy subset of some interval $[0, M]$, $M > 0$, and $\bar{Y}$ is the output from $\bar{F}$ given $\bar{X}$. In this section we assume that there is only one independent variable; the case of more independent variables is treated in Section 5. Also, we can work with any interval for $\bar{X}$; we specify the interval $[0, M]$ only for notational convenience. As described below, solving a fuzzy optimization problem is not trivial. Fuzzy optimization problems are much more complex than their crisp counterparts. Moreover, the optimal solutions of a fuzzy optimization problem cannot be obtained by differentiating the objective function with respect to the fuzzy variables. We provide a TS method for solving such fuzzy optimization problems in this paper.

We wish to find $\bar{X}$ in $[0, M]$ to maximize $\bar{Y}$. However, we cannot maximize $\bar{Y}$ directly since it is a fuzzy set. We use the centroid of $\bar{Y}$ to measure the largeness of $\bar{Y}$, as in [5]. So we wish to find $\bar{X}$ in $[0, M]$ to maximize the centroid

$h = \mathrm{centroid}(\bar{Y})$    (2)

$h$ will be the objective function in our FTS. Next, we carry out the same discretization of the fuzzy set $\bar{X}$ as in [5]. Let $N$ be a positive integer and choose

$z_i = i \, (M/N), \quad 0 \le i \le N$    (3)

Thus we discretize $[0, M]$ into $N + 1$ points $z_i$, with $z_0 = 0$ and $z_N = M$. We now use a vector $X$, comprised of $N + 1$ points, to represent the fuzzy variable:

$X = (x_0, x_1, \ldots, x_N)$    (4)

where

$x_i = \bar{X}(z_i), \quad 0 \le i \le N$    (5)

We input the discrete version of $\bar{X}$ to $\bar{F}$ and obtain $h = F(x_0, x_1, \ldots, x_N)$. So $F$ maps $[0, 1]^{N+1}$ into the real numbers, and we want to find a vector $X$ in $[0, 1]^{N+1}$ that maximizes $h$; the FTS is designed to find this vector in $[0, 1]^{N+1}$.
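As an illustration of the discretization in Eqs. (3)-(5) and of the centroid objective in Eq. (2), here is a small Python sketch. It assumes the output fuzzy set is likewise represented by discrete support values with membership grades (as in the example of Section 4); the helper names are ours, not the paper's.

```python
import numpy as np

def discretize(membership, M, N):
    """Represent a fuzzy subset of [0, M] by its membership values at
    z_i = i * (M / N), i = 0..N, as in Eqs. (3)-(5)."""
    z = np.linspace(0.0, M, N + 1)
    return np.array([membership(zi) for zi in z])

def centroid(y, e):
    """Centroid of a discrete fuzzy set with support points y_j and
    membership grades e_j: sum(y_j * e_j) / sum(e_j)."""
    y, e = np.asarray(y, dtype=float), np.asarray(e, dtype=float)
    return float(np.dot(y, e) / e.sum()) if e.sum() > 0 else 0.0

# Discretize a triangular fuzzy number (0, 0.5, 1) on [0, 1] with N = 10
x = discretize(lambda t: max(0.0, 1.0 - abs(t - 0.5) / 0.5), M=1.0, N=10)
print(np.round(x, 2))                                  # membership vector x_0..x_10
print(centroid([0.0, 1.0, 2.0], [1.0, 2.0, 1.0]))      # 1.0
```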
There are some important points to describe before we state our FTS.

• Let $\bar{X}_c$, $\bar{X}_t$, $\bar{X}_b$ denote the current, trial and best solutions, and let $h_c$, $h_t$, $h_b$ denote the corresponding current, trial and best objective values, respectively.

• Given a current solution $\bar{X}_c$, one can generate a trial solution using several strategies. We use here the neighborhood structure described in [6]. The solution space $S = [0, 1]^{N+1}$ is partitioned into disjoint cells by dividing the coordinate interval along the $x_0, x_1, \ldots, x_N$ axes into $p_0, p_1, \ldots, p_N$ parts. The problem-specific empirical partition parameter $P = (p_0, p_1, \ldots, p_N)$ determines a unique partition of $S$ into cells and thus specifies the address of each cell. At each iteration step, we draw $n_s$ sample points from a uniform distribution over each of $n_c$ chosen cells. These points are the neighbors of the current solution $\bar{X}_c$ (the trial solutions), so the number of trial solutions is $n_c n_s$. The address of a cell can be expressed as the array $A = (n_0, n_1, n_2, \ldots, n_N)$, where $1 \le n_i \le p_i$, $0 \le i \le N$, denotes the $n_i$-th interval of the $i$-th axis. We use the following strategy to choose cells at each iteration: given $\bar{X}_c$, which belongs to cell $A_c$, and a probability threshold $P$, for $i = 0, 1, 2, \ldots, N$ draw a random number $r \sim U(0, 1)$; if $r < P$, then $A_t(i) = A_c(i)$; otherwise, draw at random an integer $l$ from the set $\{l : l = 1, 2, \ldots, p_i,\ l \ne A_c(i)\}$ and let $A_t(i) = l$, where $A_c$ and $A_t$ are the addresses of the current and trial solutions.

• The probability threshold $P$ controls the shake-up performed on a current solution to choose a neighbor. The higher the value of $P$, the less shake-up is allowed and consequently the closer the neighbor is to the current solution, and vice versa.

• The cell from which a move is picked at a certain iteration is tabu and is maintained on the tabu list. After a specified duration (the tabu list size, or tabu period), it is removed from the list and is free to be visited again. We use a dynamic tabu list size strategy in this paper.

• The dynamic tabu list strategy presented in [10] is adopted here. There are three list sizes: small (S), medium (M) and large (L), set equal to $N$, $1.5N$ and $2N$, respectively. The tabu list size is initially set to S and then systematically changed every $N$ iterations following the sequence {S, M, S, L, M, L}. The sequence is repeated as many times as necessary until the end of the search.

• The number of trial solutions ($NTS = n_c n_s$) controls how many trial moves are generated from the current solution before deciding on the next move. Clearly, the larger the value of NTS the better, simply because one has more choices to select from; however, this comes at the expense of more computational effort. Therefore, if one decides a priori on a certain amount of computational effort for the algorithm, larger values of NTS at each iteration mean that fewer iterations can be made, and vice versa.

• Let the aspiration condition be $h(\bar{F}(\bar{X}_t)) > h_b$: if a move to $\bar{X}_t$ is forbidden by a tabu restriction, the aspiration condition, when satisfied, makes this move allowable.

Now we are ready to state our FTS.

(1) Initialization: let $\bar{X}_c$ be an arbitrary initial solution generated randomly from $[0, 1]^{N+1}$ and let $h_c$ be the corresponding objective function value computed using Eqs. (1) and (2); let $\bar{X}_b = \bar{X}_c$, $h_b = h_c$. Select values for the following parameters: $P$ (probability threshold), $NTS = n_c n_s$ (number of trial solutions) and $N_{\max}$ (the maximal number of iterations allowed without improvement); set NNI (number of non-improved iterations) = 0. Go to step 2.

(2) Use $\bar{X}_c$ to generate NTS trial solutions $\bar{X}_t^1, \bar{X}_t^2, \ldots, \bar{X}_t^{NTS}$, and evaluate their corresponding objective function values $h_t^1, h_t^2, \ldots, h_t^{NTS}$; go to step 3.

(3) Order $h_t^1, h_t^2, \ldots, h_t^{NTS}$ in descending order and denote them by $h_t^{(1)}, h_t^{(2)}, \ldots, h_t^{(NTS)}$. If $h_t^{(1)}$ is not tabu, or if it is tabu but $h_t^{(1)} > h_b$, then let $\bar{X}_c = \bar{X}_t^{(1)}$, $h_c = h_t^{(1)}$ and go to step 4; otherwise let $\bar{X}_c = \bar{X}_t^{L}$, $h_c = h_t^{L}$, where $h_t^{L}$ is the maximum objective value among $\bar{X}_t^{(2)}, \ldots, \bar{X}_t^{(NTS)}$ that is not tabu, and go to step 4. If all of $\bar{X}_t^{(2)}, \ldots, \bar{X}_t^{(NTS)}$ are tabu, let NNI = NNI + 1 and go to step 2.

(4) Let $A_c$ keep tabu status for a number of iterations corresponding to the current size of the tabu list. If $h_c > h_b$, let $\bar{X}_b = \bar{X}_c$, $h_b = h_c$ and NNI = 0; if $h_c \le h_b$, let NNI = NNI + 1. If NNI < $N_{\max}$, go to step 2; otherwise stop.
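The cell-based generation of trial solutions used in step (2) can be sketched as follows. This Python fragment is our interpretation of the neighborhood structure described above (cell indices are 0-based here and the tabu bookkeeping is omitted); the function and parameter names are ours.

```python
import numpy as np
rng = np.random.default_rng(0)

def trial_solutions(x_c, parts, prob_keep, nc, ns):
    """Generate nc * ns trial solutions around the current solution x_c,
    a vector in [0, 1]^(N+1). Axis i is split into parts[i] equal intervals;
    with probability prob_keep the trial cell keeps the current cell index
    on axis i, otherwise a different index is drawn at random."""
    parts = np.asarray(parts)
    a_c = np.minimum((x_c * parts).astype(int), parts - 1)   # current cell address
    trials = []
    for _ in range(nc):
        a_t = a_c.copy()
        for i, p in enumerate(parts):
            if rng.random() >= prob_keep and p > 1:          # shake up this axis
                a_t[i] = rng.choice([l for l in range(p) if l != a_c[i]])
        low, high = a_t / parts, (a_t + 1) / parts           # bounds of the chosen cell
        trials.extend(rng.uniform(low, high) for _ in range(ns))
    return trials

# Example: N = 10, each of the 11 axes split into two parts, (nc, ns) = (1, 10)
x_c = rng.uniform(size=11)
print(len(trial_solutions(x_c, parts=[2] * 11, prob_keep=0.7, nc=1, ns=10)))  # 10
```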
The above algorithm is designed for the fuzzy optimization task in which the independent variable $\bar{X}$ is a general fuzzy subset of $[0, M]$. When the independent variable $\bar{X}$ is a special type of fuzzy set (triangular, trapezoidal, Gaussian, ... fuzzy numbers), TS can also be applied to produce (approximate) solutions. Assume $\bar{X}$ is a triangular fuzzy number, $\bar{X} = (a, b, c)$ with $0 \le a < b < c \le M$. We can then partition the solution space $[0, M]^3$ into cells and carry out the same search as described above, except that at each iteration we sort the three numbers of the newly produced trial solution in ascending order so that it remains a triangular fuzzy number.

In the next section, we use our algorithm to calculate an approximate solution to an elementary fuzzy optimization problem.

4. An example

In this section we demonstrate the performance of the proposed method by applying it to an elementary fuzzy optimization example. The corresponding crisp optimization problem is simple, but the fuzzy version is very complex. We wish to solve the fuzzy optimization problem [5]:

max $\bar{Y} = \bar{X}(1 - \bar{X})$    (6)

for $\bar{X}$ a fuzzy subset of $[0, 1]$. We choose the centroid $h$ of $\bar{Y}$ as a measure of the largeness of $\bar{Y}$, so we wish to find $\bar{X}$, a fuzzy subset of $[0, 1]$, to maximize $h$.

Now assume that $\bar{X}$ is a discrete fuzzy subset of $[0, 1]$ as required in our FTS, and let $\bar{X}(z_i) = x_i$ for $z_i = i/N$, $i = 0, 1, \ldots, N$. We do not know which values of the $x_i$ maximize $h$; from the functional dependence of $h$ on the $x_i$ shown below, we cannot differentiate $h$ with respect to the $x_i$ to obtain an optimal solution [5].

Let $\bar{Y}(y_i) = e_i$, $0 \le i \le M$; then $h = \sum_{i=0}^{M} y_i e_i \big/ \sum_{i=0}^{M} e_i$. For simplicity let $N = 10$, and let us compute $e_4$, corresponding to $y_4 = 0.04$. Due to fuzzy multiplication, $e_4$ is not a simple function of the $x_i$. We may obtain $y_4 = 0.04$ in three ways: as $z_1(1 - z_6)$, $z_2(1 - z_8)$ and $z_4(1 - z_9)$ from $\bar{X}$ and $1 - \bar{X}$, respectively. Fuzzy multiplication then implies that $e_4 = \max\{\min\{x_1, x_6\}, \min\{x_2, x_8\}, \min\{x_4, x_9\}\}$. Obviously, we cannot differentiate $h$ with respect to the $x_i$ to obtain an optimal solution [5]. After computing every $y_i$, we find that $M = 42$ when $N = 10$.

Now let us apply our FTS algorithm to the problem. The parameters for the algorithm are:

(1) The discretization of $[0, 1]$ is $z_0 = 0$, $z_1 = 0.1, \ldots, z_{10} = 1.0$ ($N = 10$).
(2) The partition of the solution space is $P = (2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2)$; that is, each axis is divided into two parts.
(3) The number of trial solutions is $(n_c, n_s) = (1, 10)$.
(4) The probability threshold is P...
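For reference, the objective $h$ for this example can be evaluated by enumerating the products $z_i(1 - z_j)$ exactly as in the $e_4$ computation above. The following Python sketch is our illustration of that max-min (extension principle) arithmetic; it is not the authors' code, and the helper name is ours.

```python
import itertools
from collections import defaultdict

def objective_h(x, N=10):
    """Centroid h of Y = X(1 - X) for a discrete fuzzy subset x of [0, 1],
    where x[i] is the membership of X at z_i = i / N. The membership grade of
    a product value is the max over all ways of producing it of min(x_i, x_j),
    as in the e_4 example in the text."""
    z = [i / N for i in range(N + 1)]
    e = defaultdict(float)                      # membership grade of each product value
    for i, j in itertools.product(range(N + 1), repeat=2):
        y = round(z[i] * (1 - z[j]), 10)        # one way of producing this value
        e[y] = max(e[y], min(x[i], x[j]))       # (1 - X)(1 - z_j) equals X(z_j) = x_j
    num = sum(y * m for y, m in e.items())
    den = sum(e.values())
    return num / den if den > 0 else 0.0

# Example: a triangular membership vector peaked at z = 0.5
x = [max(0.0, 1.0 - abs(i / 10 - 0.5) / 0.5) for i in range(11)]
print(round(objective_h(x), 4))
```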
