
Abdel Rahman Ahmed Ali, Mohamed H. Gadallah, Hesham A. Hegazi, A Modification to Multi Objective NSGA II Optimization Algorithm, Proceedings of the Int'l Conference on Industrial Engineering and Operations Management, Washington DC, USA, September 27-29, 2018

© IEOM Society International

A Modification to Multi Objective NSGA II Optimization Algorithm

Abdel Rahman Ali M. Ahmed, Master Student, Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected]

Mohamed H. Gadallah, Professor, Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected]

Hesham A. Hegazi, Associate Professor, Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected]

Abstract — This paper provides a new approach to handling constrained multi-objective problems through a modification to the fast elitist multi-objective genetic algorithm NSGA-II (Non-dominated Sorting Genetic Algorithm II). The modification overcomes shortcomings of earlier constraint-handling approaches such as the penalty function approach, the Jimenez-Verdegay-Gomez-Skarmeta method, and the Ray-Tai-Seow method. Comparative studies are conducted using a group of quality indices [2,19] to compare three algorithms: regular NSGA, NSGA-II with a penalty function, and M-NSGA-II with the proposed constraint-handling technique. Results show that the new modification performs best among the three approaches. Due to limited space, only recent studies on multi-objective optimization are reviewed.

Keywords — Multi-objective Problems, NSGA II Algorithm, Quality Indices.

Nomenclature
Pt      Parent population
Pp      Total number of different non-dominated sets in the population
P       Pareto-optimal front
Qt      Offspring population
Rt      Combined population of offspring and parent populations
Di      Normalized Euclidean distance
MOEAs   Multi-Objective Evolutionary Algorithms
Do      Scaled area: dominant area
AC      Accuracy of observed Pareto frontier
NDEM    Non-Dominated Evaluation Metric
MPFE    Maximum Pareto-Optimal Front Error
NDC     Number of Distinct Choices
CLu(P)  Number of clusters on the obtained Pareto frontier
S       Spacing
∆       Spread
OS      Overall Pareto spread

I. INTRODUCTION

I. INTRODUCTION

Multi-objective optimization continues to attract considerable research interest. Unlike single-objective optimization, finding an optimal trade-off among conflicting objectives in a multi-objective problem is often more complex and challenging [20]. The variety of Pareto fronts produced over the years by different techniques adds to this difficulty relative to single-objective techniques. Several quality indices have been developed to assess the resulting solutions; convergence, diversity of solutions, time required to reach a solution set, and the ability to handle constraints are some such measures. In an earlier paper [19], various performance indices were used to measure the quality of Pareto-optimal sets and to compare


the performance of three different multi-objective optimization (MOO) algorithms: regular NSGA, M-NSGA-II, and NSGA with the penalty function approach. In this paper, twelve quality indices are used to compare the algorithms in a meaningful manner and to help practicing engineers interpret the different solutions. In addition, a selection procedure is developed to extend NSGA-II with respect to constraint handling. Four benchmark problems are used for comparative analysis. Past studies are reviewed next.

II. BACKGROUND

The Non-Dominated Sorting Genetic Algorithm (NSGA) was first implemented by Srinivas and Deb [18]. It sorts an initial population into a number of non-domination-based sets, each containing solutions that are non-dominated with respect to one another. All solutions in the first set are the best in the population: each of them dominates all solutions in the remaining sets. Solutions in the second set are dominated only by solutions of the first set while dominating those in later sets, and so on; the worst solutions belong to the last rank, being dominated by solutions in all other sets. A fitness and a niche count are assigned to each solution to calculate the shared fitness function. Although this approach provides trade-off solutions close to the true Pareto front in a single run, the sharing-function approach requires fixing the sharing parameter σ-share, which is not practical.

The Elitist Non-Dominated Sorting Genetic Algorithm (NSGA-II) was proposed by Deb et al. (2000a, 2000b). It employs an explicit diversity-preserving mechanism. An offspring population Qt is created from the parent population Pt, and non-domination sorting is performed on the combined population Rt = Qt ∪ Pt of size 2N to form the first non-dominated front, then the second, the third, and so on. The algorithm proceeds in five major stages:
1- Combine the parent and offspring populations into one population Rt.
2- Perform non-domination sorting to identify the fronts F1, F2, ..., Fn, where F1 is the best front, followed by F2, ..., Fn.
3- Create an empty population Pt+1 = Ø with a target size of N solutions.
4- Add the solutions of F1 to Pt+1, then F2, F3, ..., until adding the next whole front would exceed N; stop adding whole fronts there.
5- Apply crowded tournament selection to the last front considered: solutions residing in less crowded areas (larger crowding distance) win and the rest are omitted, so that Pt+1 contains exactly N solutions.

Figure 1 depicts the various stages of the NSGA-II procedure.

Figure 1. Schematic of the NSGA-II procedure [18]. Adapted from Multi-Objective Optimization Using Evolutionary Algorithms (John Wiley & Sons).
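The survival step in stages 1-5 above can be sketched compactly. The following is an illustrative Python sketch, not the authors' implementation: it partitions objective vectors into non-dominated fronts and fills the next population front by front, breaking the tie on the last front by crowding distance (minimization is assumed throughout).

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    """Partition points (list of objective tuples) into fronts F1, F2, ..."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

def crowding_distance(points, front):
    """Crowding distance of each index in one front (larger = less crowded)."""
    dist = {i: 0.0 for i in front}
    n_obj = len(points[front[0]])
    for m in range(n_obj):
        ordered = sorted(front, key=lambda i: points[i][m])
        dist[ordered[0]] = dist[ordered[-1]] = float("inf")   # keep extremes
        span = points[ordered[-1]][m] - points[ordered[0]][m] or 1.0
        for k in range(1, len(ordered) - 1):
            dist[ordered[k]] += (points[ordered[k + 1]][m] - points[ordered[k - 1]][m]) / span
    return dist

def survive(points, n):
    """Select n survivors: whole fronts first, then least-crowded of the split front."""
    chosen = []
    for front in non_dominated_fronts(points):
        if len(chosen) + len(front) <= n:
            chosen.extend(front)
        else:
            dist = crowding_distance(points, front)
            chosen.extend(sorted(front, key=lambda i: -dist[i])[: n - len(chosen)])
            break
    return chosen
```

The quadratic-per-front sorting here favors brevity; the original NSGA-II paper uses a faster bookkeeping scheme with the same result.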

One major disadvantage of NSGA-II, like most other MOEAs, is the assumption that the optimization problem is unconstrained. Since constraints are inevitable in real-world multi-objective optimization problems, many approaches have been developed to handle constrained problems; some of them are summarized here:
- Penalty Function Approach [1]: Constraint violations are folded into the objective functions through a penalty term. The sum of the constraint violations is multiplied by a penalty parameter, and the product is added to each objective function to form the so-called penalized functions; any optimization algorithm can then be applied to obtain a Pareto-optimal set. The penalty function approach requires several trials and considerable effort to obtain solutions close to the true Pareto-optimal set.

- Jimenez-Verdegay-Gomez-Skarmeta's Method [18]: This method deals only with inequality constraints. It requires careful handling of feasible and infeasible solutions and the use of niching to maintain diversity in the obtained Pareto-optimal solutions.

- Ray-Tai-Seow's Method [1]: A non-domination check of the constraint violations is made. Three different non-dominated rankings of the population are performed: one on the objective function values, one on the constraint violations, and a third on the combined objective and constraint-violation values. During the crossover operation, three offspring are created. The major disadvantage of this approach arises when all population members are feasible and belong to a suboptimal non-dominated front: since every solution is feasible, all members receive rank 1, they fill up the population, no further processing is made, and the algorithm stagnates.


- The Monkey Algorithm and the Krill Herd Algorithm (Khalil et al., 2015) provide a compromise between exploration (diversification) and exploitation (intensification) to solve large-scale problems. Zixing and Wang (2006) reviewed existing evolutionary algorithms and introduced a new algorithm in which a parent is replaced if it is dominated by a non-dominated individual in the offspring population; crossover is then used as a recombination operator to enrich the exploration and exploitation abilities of the approach.
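The penalty function approach described at the top of this list can be sketched as follows. This is a minimal illustration, not the paper's code; the static penalty parameter R and the helper names are our own assumptions, since the paper gives no formulas.

```python
# Static penalty sketch: each objective is augmented by R * (total constraint
# violation), after which any unconstrained multi-objective solver can be used.
# R is a user-chosen penalty parameter -- the tuning burden the paper criticizes.

def violation(x, inequality_ge):
    """Sum of violations of constraints written as g(x) >= 0."""
    return sum(max(0.0, -g(x)) for g in inequality_ge)

def penalized(objectives, inequality_ge, R=1e3):
    """Return penalized objective functions f_m(x) + R * violation(x)."""
    return [lambda x, f=f: f(x) + R * violation(x, inequality_ge)
            for f in objectives]
```

A feasible point is unchanged by the penalty; an infeasible one is pushed away from the front in every objective at once, which is why a poorly chosen R either distorts the front or fails to repel infeasible solutions.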

III. MODIFICATION TO NSGA II ALGORITHM

The fast elitist multi-objective genetic algorithm NSGA-II was developed by Deb, Pratap, Agarwal, and Meyarivan (2002). Constraints divide the search space into two domains: feasible and infeasible regions. A constrained multi-objective optimization problem can be stated as:

Minimize / Maximize:  f_m(x),  m = 1, 2, ..., M;
Subject to:           g_j(x) ≥ 0,  j = 1, 2, ..., J;
                      h_k(x) = 0,  k = 1, 2, ..., K;
                      x_i(L) ≤ x_i ≤ x_i(U),  i = 1, 2, ..., n;

where f_m(x), m = 1, 2, ..., M is the set of objective functions, g_j(x) ≥ 0, j = 1, 2, ..., J is the inequality constraint set, h_k(x) = 0, k = 1, 2, ..., K is the equality constraint set, and x_i(L) ≤ x_i ≤ x_i(U), i = 1, 2, ..., n are the lower and upper decision-variable limits. In constructing the new approach, the main shortcomings of the older constraint-handling methods are taken into account: the objective functions are predefined and coded together with the constraints, so that the optimization algorithm converges directly to the true optimal solutions and covers the entire objective space. Better convergence and diversity of solutions are obtained from the developed algorithm. This is the essence of the NSGA-II modification given next.

Main advantages of the new approach:
1- Contrary to the ignore-infeasible-solutions method, which is only valid for single-objective problems, the approach can solve constrained multi-objective optimization problems.
2- No penalty function is needed, which saves the trial-and-error time spent determining the best penalty parameter to obtain solutions close to the true Pareto front.
3- The approach is valid for both equality and inequality constraints.
4- Contrary to Ray-Tai-Seow's method [18], only a single run is required to solve the multi-objective problem.
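A minimal feasibility test for the formulation above might look like the following sketch; the tolerance eps used for the equality constraints is our assumption, as the paper does not specify one.

```python
# Feasibility check for the stated formulation: inequality constraints
# g_j(x) >= 0, equality constraints h_k(x) = 0 (checked to tolerance eps),
# and box bounds x_L <= x <= x_U.

def is_feasible(x, g_list=(), h_list=(), lower=(), upper=(), eps=1e-6):
    if any(g(x) < 0 for g in g_list):          # inequality violated
        return False
    if any(abs(h(x)) > eps for h in h_list):   # equality violated beyond eps
        return False
    return all(lo <= xi <= hi for xi, lo, hi in zip(x, lower, upper))
```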

The new approach is illustrated on the Binh and Korn (1997) two-variable constrained problem:

Minimize:  f1(x, y) = 4x² + 4y²
           f2(x, y) = (x − 5)² + (y − 5)²

Subject to:  g1(x, y) = (x − 5)² + y² ≤ 25
             g2(x, y) = (x − 8)² + (y + 3)² ≥ 7.7
             0 ≤ x ≤ 5;  0 ≤ y ≤ 3

Figure 2 illustrates the bounds of the BNH solutions. The line OCD represents the Pareto-optimal set when no constraints exist, so all solutions above this line are feasible. The boundary OA(A')EDCB(B')O encloses the solutions of the feasible space resulting from the interaction of the constraints. The part of the dashed line ABC that lies under the feasible solutions of the objective functions f1 and f2 is neglected; as a result, the line ABD is the boundary of the feasible solutions that satisfy all constraints. Penalty function approaches depend on a penalty factor that pushes solutions away from constraint violation; since the solutions on a constraint boundary in the objective space are, when they exist, the closest to the optimal set, converging toward the Pareto-optimal set this way requires more time and iterations. In the modified NSGA-II algorithm, the constraints are normalized and converted from inequality to equality form, and the objective functions are coded and solved only where the decision variables lie on the constraint boundaries. This leads the algorithm directly to solutions that satisfy all constraints and lie near the true Pareto front.
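The core idea, restricting the search to constraint boundaries, can be illustrated on BNH. This is our reading of the method, not the authors' code: candidate points are sampled directly on the boundary of g1, the circle (x − 5)² + y² = 25, kept only if they respect the variable bounds, and the objectives are evaluated there.

```python
import math

def bnh_objectives(x, y):
    """Both BNH objectives at a point (x, y)."""
    f1 = 4 * x**2 + 4 * y**2
    f2 = (x - 5)**2 + (y - 5)**2
    return f1, f2

def boundary_candidates(n=50):
    """Points on the circle (x-5)^2 + y^2 = 25 that satisfy the variable bounds."""
    pts = []
    for k in range(n):
        theta = math.pi * k / (n - 1)            # sweep the upper half-circle
        x, y = 5 + 5 * math.cos(theta), 5 * math.sin(theta)
        if 0 <= x <= 5 and 0 <= y <= 3:          # keep only in-bounds points
            pts.append((x, y))
    return pts
```

Every surviving candidate satisfies g1 with equality, so a search seeded this way starts on the constraint boundary instead of having to be pushed toward it by a penalty.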


Figure 2. Objective space for the test problem BNH [18]. Adapted from Multi-Objective Optimization Using Evolutionary Algorithms (John Wiley & Sons).

The steps of the M-NSGA-II algorithm are:

1: Code the constraints and objective functions, transforming the constraints from inequality to equality form.
2: Build one combined code for the objective functions and constraints, forcing the algorithm to produce trade-off solutions only where the constraints are active.
3: procedure NSGA-II (N', g, fk(xk)): N' members evolved for g generations to solve fk(x)
4: Initialize population P'
5: If a solution does not violate the constraints, go to step 3; otherwise go to step 7
6: Repeat steps 1-4 to construct the initial population
7: Generate a random population of size N'
8: Evaluate objective values
9: Assign rank (level) based on Pareto dominance - sort
10: Generate child population
11: Binary tournament selection
12: Recombination and mutation
13: for i = 1 to g do
14:   for each parent and child in the population do
15:     Assign rank (level) based on Pareto dominance - sort
16:     Generate sets of non-dominated vectors along the known Pareto front
17:     Loop, adding solutions to the next generation starting from the first front until N' individuals are found; determine the crowding distance between points on each front
18:   end for
19:   Select points (elitist) on the lower fronts (lower rank) that lie outside a crowding distance
20:   Create the next generation
21:   Binary tournament selection
22:   Recombination and mutation
23: end for
24: end procedure

Several case studies from the literature are investigated next using the proposed algorithm.
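Steps 1-2 above (normalizing each constraint and recasting it in equality form) can be sketched as follows; the normalization scale and the boundary tolerance are illustrative assumptions, since the paper does not give explicit formulas for them.

```python
# Sketch of steps 1-2: a constraint g(x) >= 0 is normalized by a scale factor
# and recast as the boundary condition g(x)/scale == 0, so the search is
# restricted to points where the constraint is active.

def to_equality(g, scale=1.0):
    """Recast g(x) >= 0 as the normalized boundary condition g(x)/scale == 0."""
    return lambda x: g(x) / scale

def on_boundary(x, equality_forms, eps=1e-3):
    """True if every normalized constraint is (approximately) active at x."""
    return all(abs(c(x)) <= eps for c in equality_forms)
```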

Figure 3 Schematic of the M-NSGA-II procedure


IV. CASE STUDIES

The regular NSGA, NSGA-II (with the penalty approach), and the developed M-NSGA-II are applied to four benchmark problems. To measure the quality of the new approach against the other algorithms, the following aspects are discussed:
1. Comparison among the three approaches with respect to convergence, diversity, and the number of iterations needed to reach the true Pareto front.
2. Comparison among the three approaches with respect to quality indices used as measures of convergence and diversity. The indices Maximum Pareto-Optimal Front Error (MPFE), Accuracy of the Pareto frontier (AC), Spacing, Overall Spread (OS), Objective Spread, Maximum Spread, Number of Distinct Choices (NDC), Cluster CLu(P), Hyper Volume (HV), Dominant Area (Do), Non-Dominated Evaluation Metric (NDEM), and crowding distances are employed to assess the resulting Pareto fronts.

Case study 1: CONSTR_EX

Minimize:  f1(x, y) = x
           f2(x, y) = (1 + y) / x

Subject to:  g1(x, y) = y + 9x ≥ 6
             g2(x, y) = −y + 9x ≥ 1
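For reference, CONSTR_EX can be coded directly. Note one assumption: we read f2 as (1 + y)/x, the usual form of this benchmark, since the fraction bar appears to have been lost in extraction.

```python
# CONSTR_EX objectives and constraints; constraints are feasible when >= 0.

def constr_ex(x, y):
    f1 = x
    f2 = (1 + y) / x        # assumed reading of the garbled "1+yx"
    g1 = y + 9 * x - 6      # feasible when g1 >= 0
    g2 = -y + 9 * x - 1     # feasible when g2 >= 0
    return (f1, f2), (g1, g2)
```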

Table 1 gives a comparison between the three approaches over the set of quality indices. Figure 4 gives the Pareto fronts of the three approaches.

Figure 4 M-NSGA II, vs. NSGA II with Penalty and Regular NSGA II

Table 1 Quality indices obtained by M-NSGA II, NSGA + Penalty, and Regular NSGA for CONSTR_EX Problem

CONSTR_EX | Developed NSGA II | NSGA + Penalty | Regular NSGA
Quality Indices | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj.
Min | 0.36148 | 1.0393 | 0.57845 | 1.0599 | 0.57509 | 1.1305
Max | 1.6937 | 9 | 1.6571 | 3.1029 | 1.601 | 3.3005
Range | 1.3322 | 7.9607 | 1.0787 | 2.043 | 1.026 | 2.17
StD. | 0.36866 | 1.8166 | 0.32187 | 0.58967 | 0.32016 | 0.65252
Mean | 0.99115 | 2.3654 | 1.0331 | 1.8713 | 0.96541 | 2.0834
Obj. Spread | 1 | 1 | 1 | 1 | 1 | 1
O.S | 1 | | 1 | | 1 |
Max. Spread | 5.7073 | | 1.6336 | | 1.6973 |
Crowding distances | 0.10281 | | 0.20308 | | 0.11028 |
Scaled H.V | 0.76815 | | 0.67079 | | 0.67523 |
Scaled Dominant Area | 0.17065 | | 0.30866 | | 0.30099 |
Accuracy of observed Pareto frontier | 16.3395 | | 48.656 | | 42.0482 |
Non-Dominated Evaluation Metric | 50 | | 50 | | 50 |
Maximum Pareto-Optimal Front Error | 0 | | 0.014149 | | 0.0029519 |
NDC (at u = 0.1) | 34 | | 34 | | 33 |
Spacing | 1.4756 | | 0.63307 | | 0.75967 |
CLu(P) | 1.4706 | | 1.4706 | | 1.5152

Comparing the three approaches with respect to number of iterations (CONSTR_EX):


Figure 5 shows the behavior of the different algorithms over the entire solution space with respect to the number of iterations. The developed M-NSGA-II is the fastest of the three algorithms to converge to the true Pareto front and has the best diversity as well.

Figure 5: M-NSGA II, NSGA II + Penalty, and Regular NSGA II with respect to iterations 50 and 100

Quality indices for NSGA-II, M-NSGA-II, and NSGA-II with penalty function: the quality indices are plotted on the x-axis in the following order: 1-OS1, 2-OS, 3-Max Spread, 4-Crowding distances, 5-HV, 6-Do, 7-AC, 8-NDEM, 9-MPFE, 10-NDC, 11-Spacing, 12-CLu(P). The values are calculated with the three algorithms for the problem CONSTR_EX. Figure 6 compares the quality indices. The developed M-NSGA-II has the advantage in:

- Objective and Overall Spread, a measure of diversity
- Max Spread
- Hyper Volume, a good measure of both diversity and convergence
- Non-Dominated Evaluation Metric
- Spacing

The NSGA-II and NSGA with penalty function have almost the same crowding distances and Maximum Pareto-Optimal Front Error (MPFE). Thus, M-NSGA-II is better than both regular NSGA and NSGA-II with penalty function.

Figure 6. Comparison between Quality indices for NSGA II, M-NSGAII and NSGA II + Penalty function

Figure 6 shows that M-NSGA-II beats the other two approaches in the following:
- Objective and Overall Spread, a measure of diversity
- Max Spread
- Hyper Volume, a good measure of both diversity and convergence
- Non-Dominated Evaluation Metric
- Spacing

M-NSGA-II and NSGA-II with penalty are almost the same, and both beat regular NSGA, in the following:


- Crowding distances
- Maximum Pareto-Optimal Front Error (MPFE)

From the above analysis, we conclude that M-NSGA-II is better than both regular NSGA and NSGA-II with penalty.

Case Study 2: MOPC1

Minimize:  f1(x, y) = 4x² + 4y²
           f2(x, y) = (x − 5)² + (y − 5)²

Subject to:  g1(x, y) = (x − 5)² + 4.1y² − 25 ≤ 0
             g2(x, y) = −(x − 8)² − (y + 3)² + 7.7 ≤ 0
             0 ≤ x ≤ 5;  0 ≤ y ≤ 5

Table 2 gives a comparison between the three approaches over the set of quality indices. Figure 7 gives the Pareto fronts of the three approaches.

Figure 7 M- NSGA II, vs. NSGA II with Penalty and Regular NSGA II for MOPC1

Table 2 Quality indices obtained by Developed NSGA II, NSGA + Penalty, and Regular NSGA for MOPC1

MOPC1 | Developed NSGA II | NSGA + Penalty | Regular NSGA
Quality Indices | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj.
Min | 256 | 9 | 52.9049 | 6.4185 | 99.9485 | 0.015932
Max | 355.9447 | 33.9961 | 123.5625 | 12.9019 | 193.9704 | 20.2255
Range | 99.9447 | 24.9961 | 70.6576 | 6.4834 | 94.0219 | 20.2096
StD. | 32.7995 | 7.861 | 19.9953 | 1.9925 | 28.2308 | 5.1023
Mean | 292.6528 | 17.0758 | 83.3958 | 8.7153 | 147.8387 | 4.4134
Obj. Spread | 1 | 1 | 1 | 1 | 1 | 1
O.S | 1 | | 1 | | 1 |
Max. Spread | 72.8483 | | 50.1724 | | 68.002 |
Crowding distances | Inf | | 2.5165 | | 10.8654 |
Scaled H.V | 0.8243 | | 0.70035 | | 0.76159 |
Scaled Dominant Area | 0.15839 | | 0.28141 | | 0.21154 |
Accuracy of observed Pareto frontier | 57.7957 | | 54.8276 | | 37.2242 |
Non-Dominated Evaluation Metric | 50 | | 50 | | 47 |
Maximum Pareto-Optimal Front Error | 0 | | 0 | | 0.19187 |
NDC (at u = 0.1) | 31 | | 31 | | 30 |
Spacing | 140.5009 | | 40.1027 | | 74.8671 |
CLu(P) | 1.6129 | | 1.6129 | | 1.5667

Comparing the three approaches with respect to number of iterations (MOPC1): Figure 8 shows the behavior of each algorithm over the entire solution space with respect to diversity and convergence versus iterations. M-NSGA-II is clearly the fastest of the three algorithms to converge close to the true Pareto front and also has the best diversity; it moves directly to the optimal solution set.


Figure 8: Regular NSGA, NSGA + Penalty, M-NSGA II vs. iterations (MOPC1)

Quality indices for NSGA II, M-NSGA II, NSGA II with Penalty function

Figure 9 compares the quality indices obtained by the three algorithms for MOPC1.

Figure 9. Quality indices plot for MOPC1

Figure 9 shows that M-NSGA-II is superior to the other algorithms with respect to maximum spread, spread, and hyper volume. It can also be concluded that NSGA-II with the penalty approach can misdirect its search away from the solution.

Case Study 3: BinhKorn Function

Minimize:  f1(x, y) = 4x² + 4y²
           f2(x, y) = (x − 5)² + (y − 5)²

Subject to:  g1(x, y) = (x − 5)² + y² ≤ 25
             g2(x, y) = (x − 8)² + (y + 3)² ≥ 7.7
             0 ≤ x ≤ 5;  0 ≤ y ≤ 3

Table 3 gives a comparison between the three approaches over the set of quality indices, while Figure 10 gives the Pareto fronts of the three approaches.


Figure 10. M-NSGA II, vs. NSGA II with Penalty and Regular NSGA II for BinhKorn Function

Table 3 Quality indices obtained by Developed NSGA-II, NSGA + Penalty, and regular NSGA for BinhKorn Function

BinhKorn | Developed NSGA II | NSGA + Penalty | Regular NSGA
Quality Indices | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj. | 1st Obj. | 2nd Obj.
Min | 0.0032746 | 7.0687e-006 | 14.2925 | 2.0253 | 0.79522 | 0.0011349
Max | 200.0893 | 49.6888 | 211.9004 | 51.9105 | 200.9968 | 44.0043
Range | 200.086 | 49.6888 | 197.6079 | 49.8852 | 200.2016 | 44.0032
StD. | 60.4636 | 15.3539 | 60.1117 | 16.0311 | 61.6188 | 13.7956
Mean | 63.2803 | 17.8917 | 84.8387 | 18.3387 | 71.0469 | 15.2682
Obj. Spread | 1 | 1 | 1 | 1 | 1 | 1
O.S | 1 | | 1 | | 1 |
Max. Spread | 145.7796 | | 144.1135 | | 144.943 |
Crowding distances | 4.8357 | | 3.004 | | 5.5304 |
Scaled H.V | 0.82112 | | 0.81879 | | 0.80437 |
Scaled Dominant Area | 0.16055 | | 0.1618 | | 0.17833 |
Accuracy of observed Pareto frontier | 54.5631 | | 51.5096 | | 57.808 |
Non-Dominated Evaluation Metric | 50 | | 50 | | 50 |
Maximum Pareto-Optimal Front Error | 0.24156 | | 0.69844 | | 0.374 |
NDC (at u = 0.1) | 33 | | 34 | | 31 |
Spacing | 49.4608 | | 55.0671 | | 52.5274 |
CLu(P) | 1.5152 | | 1.4706 | | 1.6129

Comparing the three approaches with respect to number of iterations (BinhKorn function): Figure 11 shows the behavior of the different algorithms over the entire solution space with respect to the number of iterations. The developed M-NSGA-II is the fastest of the three algorithms to converge to the true Pareto front and has the best diversity as well.

Figure 11 Regular NSGA, NSGA + Penalty, Developed NSGA II with respect to number of iterations


Again, M-NSGA-II gives better and faster convergence and diversity than regular NSGA-II or NSGA-II with penalty functions. Quality indices for NSGA-II, M-NSGA-II, and NSGA-II with penalty function: Figure 12 compares the quality indices obtained by regular NSGA-II, M-NSGA-II, and NSGA-II + penalty function (BinhKorn function).

Figure 12. Quality indices plot for BinhKorn Function

Figure 12 shows that M-NSGA-II is superior to the other algorithms with respect to maximum spread, crowding distance, spread, and hyper volume. It can also be concluded that NSGA-II with the penalty approach can misdirect its search away from the solution.

Case Study 4: SRN

Minimize:  f1(x, y) = (x − 2)² + (y − 1)² + 2
           f2(x, y) = 9x + (y − 1)²

Subject to:  g1(x, y) = −x − y + 225 ≥ 0
             g2(x, y) = −x + 3y − 10 ≥ 0
             0 ≤ x ≤ 6;  0 ≤ y ≤ 2.25

Table 4 gives a comparison between the three approaches over the set of quality indices. Figure 13 shows the observed Pareto front for the SRN test problem using the three different algorithms. The same conclusion holds.

Figure 13 M-NSGA II vs. NSGA II with Penalty and Regular NSGA II for SRN Function

Table 4 Quality indices obtained by M-NSGA II, NSGA + Penalty, and Regular NSGA for SRN Function

SRN Quality Indices Developed NSGA II NSGA + Penalty Regular NSGA

1st Obj. 2nd Obj. 1st Obj. 2nd Obj. 1st Obj. 2nd Obj. Min 14.1001 -54.6041 53.6315 -53.6533 9.6329 -52.8773 Max 495.3465 38.8549 130.0986 19.0137 65.2305 -6.4547 Rang 481.2464 93.459 76.467 72.667 55.5976 46.4226 StD. 166.8636 30.5476 25.0216 22.5332 16.6956 13.9729 Mean 198.5193 -17.3044 88.3413 -23.2208 33.8199 -30.7653 Obj. Spread 1 1 1 1 1 1 O.S 1 1 1



Table 4 (continued).

Quality Index                                Developed NSGA II   NSGA + Penalty   Regular NSGA
Max. Spread                                         346.6502          74.5912        51.2160
Crowding distance                                        Inf           4.0856         1.7084
Scaled Hypervolume                                   0.74119          0.62813        0.57818
Scaled Dominant Area                                 0.23618          0.34428        0.39555
Accuracy of observed Pareto frontier                 44.1954          36.2372        38.0715
Non-Dominated Evaluation Metric (NDEM)                    50               50             50
Maximum Pareto-Optimal Front Error (MPFE)                  0         0.069263        0.42172
NDC (at u = 0.1)                                          30               32             33
Spacing                                             161.2624          60.8617        35.8879
CLu(P)                                                1.6667           1.5625         1.5152
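Two of the indices reported above can be reproduced from an observed front with a few lines of code. This is a sketch using common textbook definitions (Schott's spacing and the bounding-box maximum spread); the exact scaling used by the authors may differ:

```python
import math

def spacing(front):
    """Schott's spacing: deviation of the nearest-neighbour Manhattan
    distances over the observed front (0 means perfectly even spacing)."""
    d = []
    for i, p in enumerate(front):
        d.append(min(sum(abs(a - b) for a, b in zip(p, q))
                     for j, q in enumerate(front) if j != i))
    mean = sum(d) / len(d)
    return math.sqrt(sum((x - mean) ** 2 for x in d) / (len(d) - 1))

def max_spread(front):
    """Euclidean diagonal of the bounding box enclosing the front."""
    dims = list(zip(*front))  # one tuple per objective
    return math.sqrt(sum((max(c) - min(c)) ** 2 for c in dims))
```

For example, the evenly spaced front [(0, 2), (1, 1), (2, 0)] yields a spacing of 0 and a maximum spread of sqrt(8).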

Figure 14 compares the three approaches with respect to the number of iterations, showing the behavior of each algorithm over the solution space as the iterations progress. The developed M-NSGA II is the fastest of the three to converge to the true Pareto front and also has the best diversity.

Figure 14. Regular NSGA, NSGA + Penalty, and M-NSGA II with respect to iterations (SRN)

Quality indices for NSGA II, M-NSGA II, and NSGA II with penalty function: Figure 15 compares the quality indices for the SRN function.

Figure 15. Quality indices plot for the SRN function

Figure 15 shows that M-NSGA II is superior to the other algorithms with respect to maximum spread, crowding distance, spread, Pareto accuracy, spacing, and hypervolume. It can also be concluded that NSGA II with the penalty approach can misdirect its search and miss parts of the solution set.

V. Conclusion

A modification to the regular NSGA II algorithm is proposed to handle constrained multi objective optimization problems. A comparative analysis is conducted between the regular NSGA II, NSGA II with penalty functions, and M-NSGA II using a set of quality indices. Results indicate that M-NSGA II is the best of the three algorithms in terms of convergence and diversity. In addition, M-NSGA II can handle various constrained multi objective problems in a single run and reaches the optimal set very quickly.



VI. References

[1] Jin Wu, Shapour Azarm, "Metrics for Quality Assessment of a Multiobjective Design Optimization Solution Set", Transactions of the ASME, Vol. 123, 2001, pp. x-x.

[2] Kalyanmoy Deb, "Scalable Test Problems for Evolutionary Multi-Objective Optimization", Kanpur Genetic Algorithms Laboratory, Indian Institute of Technology Kanpur, India, TIK-Technical Report No. 112, July 17, 2001.

[3] R. V. Dharaskar, V. M. Thakare, P. M. Chaudhari, "Computing the Most Significant Solution from Pareto Front obtained in Multi-objective Evolutionary Algorithms", (IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 1, No. 4, 2010, pp. xx-xx.

[4] Ran Cheng, Miqing Li, Ye Tian, Xingyi Zhang, Shengxiang Yang, Yaochu Jin, Xin Yao, "A Benchmark Test Suite for Evolutionary Many-Objective Optimization", Complex & Intelligent Systems, Springer, 2017.

[5] Kalyanmoy Deb, Amrit Pratap, Sameer Agarwal, T. Meyarivan, "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II", IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, 2002, pp. 182-197.

[6] Eckart Zitzler, "Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications", Ph.D. Dissertation, Swiss Federal Institute of Technology Zurich, 1999.

[7] Margarita Reyes-Sierra, Carlos A. Coello Coello, "Multi-Objective Particle Swarm Optimizers: A Survey of the State-of-the-Art", International Journal of Computational Intelligence Research, Vol. 2, No. 3, 2006, pp. 287-308.

[8] Miqing Li, Shengxiang Yang, Xiaohui Liu, "Diversity Comparison of Pareto Front Approximations in Many-Objective Optimization", supported by the Engineering and Physical Sciences Research Council (EPSRC) of the U.K. under Grant EP/K001310/1.

[9] Siwei Jiang, Yew-Soon Ong, Jie Zhang, Liang Feng, "Consistencies and Contradictions of Performance Metrics in Multiobjective Optimization", IEEE Transactions on Cybernetics, 2014, pp. 1-14.

[10] Dan A. Iancu, Nikolaos Trichakis, "Pareto Efficiency in Robust Optimization", Management Science, Vol. 60, No. 1, 2014, pp. 130-147.

[11] Test functions for optimization, Wikipedia, https://en.wikipedia.org/wiki/Test_functions_for_optimization.

[12] Olacir Rodrigues Castro Jr., "Bio-Inspired Optimization Algorithms for Multiobjective Problems", Ph.D. Thesis, Department of Exact Sciences, Federal University of Parana, Brazil, 2017.

[13] Nafiseh Naji, "A Review of the Metaheuristic Algorithms and their Capabilities (Particle Swarm Optimization, Firefly and Genetic Algorithms)", International Journal of Current Engineering & Technology, Vol. 7, No. 3, 2017, pp. 921-925.

[14] Bernd Bischl, Jakob Richter, Jakob Bossek, Daniel Horn, Janek Thomas, Michel Lang, "mlrMBO: A Modular Framework for Model-Based Optimization of Expensive Black-Box Functions", preprint submitted to Computational Statistics and Data Analysis, 2017.

[15] Ahmed M. E. Khalil, Seif-Eddeen K. Fateen, Adrian Bonilla-Petriciolet, "MAKHA - A New Hybrid Swarm Intelligence Global Optimization Algorithm", Algorithms, Vol. 8, 2015, pp. 336-365.

[16] Zixing Cai, Yong Wang, "A Multiobjective Optimization-Based Evolutionary Algorithm for Constrained Optimization", IEEE Transactions on Evolutionary Computation, Vol. 10, No. 6, 2006, pp. 658-675.

[17] Tan Liu, Xianwen Gao, Qingyun Yuan, "An Improved Gradient-Based NSGA-II Algorithm by a New Chaotic Map Model", Soft Computing, Springer, 2016.

[18] Kalyanmoy Deb, "Multi-Objective Optimization using Evolutionary Algorithms", John Wiley & Sons, New York, NY, USA, 2001, ISBN: 047187339X.

[19] Abdel Rahman Ali, Mohamed H. Gadallah, Hesham A. Hegazi, "Multi-objective Optimization Indices: A Comparative Analysis", Australian Journal of Basic and Applied Sciences (AJBAS), Vol. 10, No. 15, 2016, pp. 10-25.

[20] Sh. Lotfi, F. Karimi, "A Hybrid MOEA/D-TS for Solving Multi-Objective Problems", Department of Computer Science, University of Tabriz, Tabriz, Iran.

Acknowledgments

I would like to express my gratitude and appreciation to my supervisors, Prof. Mohamed H. Gadallah and Associate Prof. Hesham A. Hegazi, for their great support and encouragement in conducting this research. Their broad knowledge and extensive experience helped me overcome all the obstacles encountered during this work.

BIOGRAPHIES

Abdel Rahman Ali M. Ahmed, Master Student, Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected]. Contributions: "Multi-objective Optimization Indices: A Comparative Analysis".

Mohamed H. Gadallah, Technical Advisor, Education Development Fund (EDF), The Egyptian Cabinet of Ministers; Professor of Industrial Engineering & Operations Research, Faculty of Engineering, Cairo University 12613. Tel (Office): +202-35678165; Fax: +202-35693025; Cell: +0122-213-9310, +0109-760-2342. Email: [email protected], [email protected]. Web: https://aucegypt.academia.edu/MohamedHGadallah, http://staff.eng.cu.edu.eg/mhg, http://scholar.cu.edu.eg/mgadallah.

Hesham A. Hegazi, Associate Professor, Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected].
