Firefly Algorithm Based on Euclidean Metric and Dimensional Mutation


Jing Wang, Yanfeng Ji
DOI: 10.4018/IJCINI.286769

Abstract

The firefly algorithm (FA) is a meta-heuristic stochastic search algorithm with strong robustness and easy implementation. However, it also has some shortcomings, such as the "oscillation" phenomenon caused by too many attractions, which makes convergence too slow or premature. In the original FA, the full attraction model consumes a large number of fitness evaluations, and the time complexity is high. Therefore, in this paper, a novel firefly algorithm (EMDmFA) based on the Euclidean metric (EM) and dimensional mutation (DM) is proposed. The EM strategy makes each firefly learn from its nearest neighbor; when a firefly is already better than its neighbor, it learns from the best individual in the population instead. This improves the FA attraction model and dramatically reduces the computational time complexity. At the same time, the DM strategy improves the ability of the algorithm to escape local optima. The experimental results show that the proposed EMDmFA significantly improves the accuracy of the solution and outperforms most state-of-the-art FA variants.
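The EM neighbor-selection rule stated above (learn from the nearest neighbor by Euclidean distance; fall back to the population best when the firefly is already better than that neighbor) can be sketched as follows. This is a minimal illustration of the rule as described in the abstract, not the authors' exact implementation; the function name and the minimization convention are assumptions.

```python
import numpy as np

def em_attraction_targets(pop, fitness):
    """For each firefly, pick the single individual it learns from under the
    EM strategy: its nearest neighbor by Euclidean distance if that neighbor
    is better, otherwise the best individual in the population.

    pop     : (n, d) array of firefly positions
    fitness : (n,) array of objective values (smaller is better)
    returns : (n,) array of indices, one learning target per firefly
    """
    n = pop.shape[0]
    best = int(np.argmin(fitness))        # population-best individual
    targets = np.empty(n, dtype=int)
    for i in range(n):
        d2 = np.sum((pop - pop[i]) ** 2, axis=1)  # squared Euclidean distances
        d2[i] = np.inf                            # exclude the firefly itself
        nn = int(np.argmin(d2))                   # nearest neighbor
        # Learn from the neighbor only if it is better; otherwise from the best.
        targets[i] = nn if fitness[nn] < fitness[i] else best
    return targets
```

Note that each firefly now has exactly one learning target per iteration, instead of being attracted by every brighter firefly, which is how the EM strategy reduces the cost of the full attraction model.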

1. Introduction

Many problems in the real world can be transformed into optimization problems, but these problems are becoming increasingly complex; one of the greatest challenges is high-dimensional nonlinear problems. Therefore, new meta-heuristic swarm intelligence algorithms have been proposed, such as the artificial bee colony algorithm (ABC) (Karaboga & Basturk, 2007), particle swarm optimization (PSO) (Kennedy & Eberhart, 1995), and cuckoo search (CS) (Yang & Deb, 2009). As a member of this family, the firefly algorithm (FA) (Yang, 2010) has attracted many scholars' attention due to its simple structure, few parameters, and easy implementation. To date, FA has been successfully applied in various fields such as image recognition and path planning (Wang, Guo, Duan, Liu & Wang, 2012; Yang & He, 2013; Zhou, Tian, Zhao & Zhao, 2015).

However, FA also has some shortcomings. The fixed step factor and attractiveness do not match many practical situations, and the large number of mutual attractions between fireflies leads to the oscillation phenomenon (Wang, Wang, Zhou, Sun, Zhao, Yu & Cui, 2017), which hinders the rapid convergence of the algorithm.

To overcome these shortcomings, some researchers first adjusted the parameters of FA. Yu et al. (Yu, Zhu, Ma & Mao, 2015) proposed a variable step size firefly algorithm (VSSFA), which explores the space with a larger step size in the early stage and exploits the optimal solution with a smaller step size in the later stage. Gandomi et al. (Gandomi, Yang, Talatahari & Alavi, 2013) used 12 chaotic maps to tune the algorithm's control parameters (the step factor α, the attractiveness β, and the light absorption coefficient γ), and their experiments found that the best chaotic variant is the one that uses the Gauss map to adjust the attractiveness parameter. Yelghi (Yelghi & Köse, 2018) introduced the tidal force into FA, replacing the original attractiveness parameter β0. Liu et al. (Liu, Li, Deng & Ren, 2020) proposed an attraction formula based on the sigmoid function, which reconstructed the definition of the attraction β and enhanced the local search ability of the algorithm. A novel courtship learning framework was proposed by Peng et al. (Peng, Zhu, Deng & Wu, 2020). It uses an archive mechanism to allow a female firefly to lead a weaker male firefly, and uses a logistic regression function to replace the attraction β = β0·exp(−γr²), which solves the problem that when the distance between two fireflies is too large the attraction is close to 0. This improvement balances the exploration and exploitation of the algorithm well.
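For reference, the variants above all modify the standard full-attraction FA update (Yang, 2010), in which each firefly moves toward every brighter firefly with attraction β = β0·exp(−γr²) plus a random perturbation. A minimal sketch of one iteration is below; the O(n²) inner loop is the source of the high evaluation cost discussed above, and the parameter defaults here are illustrative, not the values used in any of the cited papers.

```python
import numpy as np

def firefly_step(pop, fitness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One iteration of the standard (full-attraction) firefly algorithm.

    Each firefly i moves toward every brighter firefly j:
        x_i <- x_i + beta0 * exp(-gamma * r_ij^2) * (x_j - x_i) + alpha * eps
    where r_ij is the Euclidean distance and eps is uniform noise.

    pop     : (n, d) array of positions
    fitness : (n,) array of objective values (smaller = brighter)
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        for j in range(n):            # full attraction model: O(n^2) moves
            if fitness[j] < fitness[i]:           # j is brighter than i
                r2 = np.sum((pop[j] - new_pop[i]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)  # attraction decays with distance
                eps = rng.uniform(-0.5, 0.5, d)     # random step component
                new_pop[i] += beta * (pop[j] - new_pop[i]) + alpha * eps
    return new_pop
```

Because a firefly is pulled by all brighter individuals in turn, a single iteration can drag it back and forth, which is exactly the oscillation behavior the cited improvements try to suppress.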
