Improved Memetic NSGA-II Using a Deep Neighborhood Search


Samir Mahdi, Brahim Nini
Copyright: © 2021 | Pages: 17
DOI: 10.4018/IJAMC.2021100108

Abstract

The elitist non-dominated sorting genetic algorithm (NSGA-II), as part of the family of Pareto-based multi-objective evolutionary algorithms, seems to be one of the most efficient algorithms for multi-objective optimization. However, it has some shortcomings, such as low convergence accuracy, an uneven Pareto front distribution, and slow convergence. A number of papers using the memetic technique to improve NSGA-II have been published. Hence, it is imperative to improve memetic NSGA-II by increasing its solving accuracy. In this paper, an improved memetic NSGA-II, called the deep memetic non-dominated sorting genetic algorithm (DM-NSGA-II), is proposed, aiming to obtain more uniformly distributed non-dominated solutions that converge better toward the true Pareto-optimal front. The proposed algorithm combines the advantages of both exact and heuristic approaches. The effectiveness of DM-NSGA-II is validated on well-known instances taken from the standard literature on the multi-objective knapsack problem. As will be shown, the performance of the proposed algorithm is demonstrated by comparing it with M-NSGA-II using the hypervolume metric.

Introduction

Multi-objective optimization problems (MOPs) appear in a natural way in various fields of industry, engineering, environment, economy, and healthcare. They have two or more conflicting objective functions to be optimized at the same time. Such problems admit a very large number of solutions, each of which represents a trade-off between the different objectives. The final aim is to find the set of optimal compromise solutions known as the Pareto-optimal front. However, the computation time needed to find the Pareto-optimal front by exact methods increases exponentially with the size of the problem. Large problems, which cannot be solved exactly in a reasonable amount of time, are therefore usually tackled with heuristics or metaheuristics. These do not guarantee optimal solutions, but they try to find a set of solutions as diverse and as close as possible to the Pareto-optimal front. Among the many metaheuristics in current use, multi-objective evolutionary algorithms (MOEAs) are clearly the most popular in today's specialized literature and still have several research opportunities to offer to newcomers (Carlos & Coello, 2017).
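The notion of Pareto dominance underlying the discussion above can be stated compactly in code. The following sketch (for a maximization problem; function names are illustrative, not from the paper) checks whether one objective vector dominates another and filters a set of candidate solutions down to its non-dominated subset:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximization):
    a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated(points):
    """Return the solutions not dominated by any other solution in the set."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Example: (2, 2) is dominated by (3, 3); the rest are mutual trade-offs.
front = non_dominated([(1, 5), (3, 3), (2, 2), (5, 1)])
```

An exact method would enumerate (implicitly or explicitly) all feasible solutions and keep this non-dominated subset; the metaheuristics discussed next only approximate it.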

The ‘race’ to improve heuristic results so as to effectively approach the true Pareto front is an important challenge for researchers. In response, a large number of methods have been developed, either by designing new efficient algorithms or by improving existing ones. In this view, Deb et al. (2002) proposed a fast and elitist multi-objective genetic algorithm (NSGA-II) to enhance the robustness, convergence, and computational complexity of the NSGA proposed by Srinivas and Deb (1994). NSGA-II, as part of the MOEA family, seems to be one of the reference algorithms because of the characteristics it presents, including sorting speed and elitism. However, NSGA-II has some shortcomings, such as low convergence accuracy and slow convergence. Hence, it is imperative to improve it by increasing its convergence speed and improving its solving accuracy (Xiaoyun & Yi, 2018). To improve the performance of EAs, the concept of memetic algorithms (MAs) was introduced by Moscato (1989), combining the global search ability of an evolutionary algorithm with a local search (LS) optimization method. This has been shown to be an effective way to solve MOPs (Fang et al., 2018). Memetic MOEAs can speed up convergence and obtain a high-performance approximate Pareto front (Gong et al., 2016). A number of papers using the memetic technique to improve NSGA-II have been published: a memetic NSGA-II (M-NSGA-II) (Wang et al., 2017); NSGA-CMA, a memetic evolutionary algorithm that combines NSGA-II with a local search strategy based on the covariance matrix adaptation evolution strategy (Zhang & Ma, 2015); and NSGA-II-SA, a multi-objective memetic algorithm based on NSGA-II and simulated annealing (Cobos et al., 2016).
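The memetic idea described above, a global evolutionary loop whose individuals are periodically refined by local search, can be sketched on a toy single-objective problem. The following is a minimal illustration (one-max with a first-improvement bit-flip hill climber, a steady-state replacement scheme, and parameter values chosen for brevity), not the paper's algorithm:

```python
import random

def one_max(bits):                      # toy objective: number of 1-bits
    return sum(bits)

def hill_climb(bits):
    """First-improvement local search: flip each bit in turn and keep
    the first flip that improves the objective; repeat until no gain."""
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            cand = bits[:i] + [1 - bits[i]] + bits[i + 1:]
            if one_max(cand) > one_max(bits):
                bits, improved = cand, True
                break
    return bits

def memetic_search(pop_size=10, n_bits=16, generations=5, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    pop = [hill_climb(ind) for ind in pop]       # refine the initial population
    for _ in range(generations):
        a, b = rng.sample(pop, 2)                # one-point crossover of two parents
        cut = rng.randrange(1, n_bits)
        child = hill_climb(a[:cut] + b[cut:])    # memetic step: LS on the offspring
        worst = min(range(len(pop)), key=lambda i: one_max(pop[i]))
        pop[worst] = child                       # steady-state replacement
    return max(one_max(ind) for ind in pop)
```

In a memetic MOEA such as M-NSGA-II, the same interleaving applies, but selection uses non-dominated sorting and crowding distance rather than a scalar objective.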

In this paper, the authors try to benefit from the strengths of exact methods in order to enhance the performance of the memetic non-dominated sorting genetic algorithm (M-NSGA-II). M-NSGA-II is an integrative hybrid optimizer that incorporates a hill-climbing first-improvement local search component (HCFI-LS) into a standard NSGA-II. The proposed algorithm to improve M-NSGA-II is a novel collaborative hybrid scheme combining M-NSGA-II as a Pareto global search method with a branch-and-bound algorithm as a Pareto local search method (B&B-PLS). B&B-PLS, which is based on the dominance criterion and the neighborhood search concept, applies a 'deep search' to some non-dominated solutions during the optimization process, contributing to improving the quality of solutions. The proposed method, named the deep memetic non-dominated sorting genetic algorithm (DM-NSGA-II), has been tested on standard benchmarks for the multi-objective knapsack problem. As will be shown, the proposed algorithm produces better results than M-NSGA-II under the hypervolume performance metric (HV).
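The preview does not give pseudocode for the HCFI-LS component. As a point of reference, a hill-climbing first-improvement local search on a 0/1 knapsack instance with a single-bit-flip neighborhood can be sketched as follows (single-objective for readability; the function names and the neighborhood choice are assumptions, not taken from the paper):

```python
def knapsack_value(x, values, weights, capacity):
    """Total value of selection x, or None if it exceeds the capacity."""
    if sum(w for xi, w in zip(x, weights) if xi) > capacity:
        return None
    return sum(v for xi, v in zip(x, values) if xi)

def hcfi_ls(x, values, weights, capacity):
    """Hill-Climbing First-Improvement: accept the first feasible 1-flip
    move that increases total value; rescan until no flip improves."""
    best = knapsack_value(x, values, weights, capacity)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x[:i] + [1 - x[i]] + x[i + 1:]
            v = knapsack_value(y, values, weights, capacity)
            if v is not None and v > best:
                x, best, improved = y, v, True
                break
    return x, best
```

In the multi-objective setting used by the paper, the acceptance test `v > best` would be replaced by a Pareto-dominance check over all objectives, and B&B-PLS would then explore the neighborhood of selected non-dominated solutions exhaustively by branch and bound rather than stopping at the first improvement.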

The remainder of this paper is organized as follows. In Section 2, some background concepts on multi-objective optimization and hybrid methods are presented. The mathematical model of multi-objective knapsack optimization is described in Section 3. In Section 4, the proposed algorithm is designed for the multi-objective knapsack problem. Section 5 discusses the computational tests and the effectiveness of the proposed approach, which is compared with M-NSGA-II. Finally, Section 6 summarizes the work with a discussion of the contribution of this paper, as well as future directions.
