Adaptive Multiobjective Memetic Optimization

Hieu V. Dang, Witold Kinsner
DOI: 10.4018/IJCINI.2016100102

Abstract

Multiobjective memetic optimization algorithms (MMOAs) have recently been applied to solve nonlinear optimization problems with conflicting objectives. An important issue in an MMOA is how to identify the relative best solutions to guide its adaptive processes. In this paper, the authors introduce a framework of adaptive multiobjective memetic optimization algorithms (AMMOA) with an information-theoretic criterion for guiding the adaptive selection, clustering, and local learning processes, together with a robust stopping criterion for AMMOA. The implementation of AMMOA is applied to several benchmark test problems with remarkable results. The paper also presents the application of AMMOA to designing an optimal image watermarking scheme that maximizes both the quality of the watermarked images and the robustness of the watermark.

1. Introduction

Multiobjective optimization deals with the simultaneous optimization of more than one objective function. Most practical decision-making problems involve multiple conflicting objectives or multiple criteria. Unfortunately, these real-world problems are often difficult, if not impossible, to solve without advanced and efficient optimization techniques, because the presence of multiple objectives makes them much more complex than single-objective problems. Consequently, a multiobjective optimization problem has traditionally been scalarized and solved as a single-objective optimization problem. However, this approach yields a single compromise solution instead of a set of Pareto-optimal solutions.
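As a minimal illustration (ours, not the authors'; the toy objectives f1 and f2 below are assumptions chosen for the example), the weighted-sum scalarization in the following Python sketch returns exactly one compromise point per weight vector, so recovering a set of trade-offs would require many separate runs:

from scipy.optimize import minimize

def f1(x):
    return x[0] ** 2           # first toy objective: minimize x^2

def f2(x):
    return (x[0] - 2.0) ** 2   # second, conflicting objective: minimize (x - 2)^2

def scalarized(x, w):
    # The weighted sum collapses both objectives into a single scalar value.
    return w * f1(x) + (1.0 - w) * f2(x)

# Each weight yields exactly one compromise solution, not a Pareto set.
for w in (0.2, 0.5, 0.8):
    res = minimize(scalarized, x0=[0.0], args=(w,))
    print(f"w={w}: x*={res.x[0]:.3f}  f1={f1(res.x):.3f}  f2={f2(res.x):.3f}")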

An evolutionary algorithm (EA) mimics nature's evolutionary principles to formulate search procedures. An EA is a population-based algorithm that works with a population of solutions in each iteration, instead of the single solution used in classical methods. The outcome of an EA is also a population of solutions. Thus, an EA can efficiently capture multiple optimal solutions in its final population of a multiobjective optimization problem. This ability to sample multiple solutions simultaneously has made EAs widely used in multiobjective optimization (multiobjective optimization using EAs, MOEA). Many MOEAs have been proposed in the literature, most of them based on models of genetic algorithms (GA) (Deb, 2001). Recently, biologically inspired models such as particle swarm (PS), differential evolution (DE), and memetic algorithms (MA) have been introduced for multiobjective optimization (Lee & Kim, 2013; Wang & Cai, 2012; Ishibuchi et al., 2009). The main difference between these approaches lies in how they generate new candidate solutions.
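The core operation that lets such a population deliver a whole trade-off set is Pareto-dominance filtering. The sketch below is our illustration with assumed objective vectors; production MOEAs such as NSGA-II add ranking and crowding-based selection on top of this. It keeps only the nondominated members of a population:

from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    # a dominates b (minimization) if it is no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(pop: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
    # Keep only the members that no other member dominates.
    return [p for p in pop if not any(dominates(q, p) for q in pop if q != p)]

# Assumed objective vectors (f1, f2) for five candidate solutions.
population = [(1.0, 4.0), (2.0, 3.0), (3.0, 3.5), (4.0, 1.0), (2.5, 2.5)]
print(nondominated(population))   # (3.0, 3.5) is dominated by (2.0, 3.0)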

The term “meme” was first introduced and defined by Richard Dawkins as the basic unit of cultural transmission or imitation (Dawkins, 1989). Inspired by Darwin's evolutionary theory and Dawkins's theory of memes, the term memetic algorithm (MA) was first introduced by Moscato in 1989 (Moscato, 1989). In that work, Moscato viewed MAs as extensions of EAs that hybridize an EA with an individual learning procedure performing local refinements. The use of MAs for multiobjective optimization (multiobjective memetic optimization algorithm, MMOA) has attracted much attention and effort in recent years. In the literature, MMOAs have been demonstrated to be considerably more effective and efficient than MOEAs and traditional optimization searches for some specific optimization problem domains (Krasnogor & Smith, 2006; Kennedy & Eberhart, 2001; Ishibuchi et al., 2009; Neri & Cotta, 2012; Chen et al., 2011; Bergmeir et al., 2012; Dang & Kinsner, 2014). The performance of an MMOA relies not only on the evolutionary framework but also on the local searches.
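The hybridization Moscato described can be sketched in a few lines. The following single-objective toy is our illustration, not the authors' algorithm; the objective, mutation operator, and hill-climbing parameters are all assumptions. It shows the memetic pattern: an evolutionary loop whose offspring are refined by an individual local search before selection.

import random

def fitness(x: float) -> float:
    return -(x - 1.0) ** 2               # toy objective to maximize (optimum at x = 1)

def mutate(x: float) -> float:
    return x + random.gauss(0.0, 0.5)    # global, evolutionary variation

def local_search(x: float, step: float = 0.05, iters: int = 20) -> float:
    # Hill climber standing in for the "individual learning" stage of an MA.
    for _ in range(iters):
        for cand in (x - step, x + step):
            if fitness(cand) > fitness(x):
                x = cand
    return x

population = [random.uniform(-5.0, 5.0) for _ in range(10)]
for _ in range(30):                      # generations
    offspring = [local_search(mutate(p)) for p in population]   # evolve, then refine
    population = sorted(population + offspring, key=fitness, reverse=True)[:10]

print(f"best x = {max(population, key=fitness):.3f}")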
