Domain Learning Particle Swarm Optimization With a Hybrid Mutation Strategy

Zixuan Xie, Xueyu Huang, Wenwen Liu
Copyright: © 2022 | Pages: 27
DOI: 10.4018/IJSIR.303572

Abstract

When particle swarm optimization algorithms are applied to highly complex, ultra-high-dimensional problems, traditional particle learning strategies provide little help. This paper proposes a particle swarm optimization algorithm with a hybrid-mutation domain dimension learning strategy, which uses the domain dimension average of the current particle's dimensions to generate guiding particles. An improved inertia weight is also adopted, which effectively prevents the algorithm from falling prematurely into local optima. To verify its competitiveness, the algorithm is tested on nineteen benchmark functions and compared with several well-known particle swarm variants. The experimental results show that the proposed algorithm performs markedly well on unimodal functions and even better on multimodal functions. The guiding particles, improved inertia weight, and mutation strategy together balance local and global search effectively, allowing the algorithm to converge more reliably to the global optimal solution.
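The paper's exact update equations appear in the full text; purely as a hedged illustration of the idea sketched in the abstract, a guiding particle built from the per-dimension average of a particle's neighborhood, combined with an occasional mutation, might look like the following Python sketch. The ring-topology neighborhood, the Gaussian mutation, and every parameter value here are assumptions for illustration, not the authors' specification.

```python
import numpy as np

def guiding_particle(positions, i, k=2, mut_prob=0.1, sigma=0.1, rng=None):
    """Hypothetical sketch: guide particle i using the per-dimension mean of
    its ring-topology neighbors, with an occasional Gaussian mutation as the
    'hybrid mutation' step. Neighborhood shape and mutation are assumptions,
    not the formulation from the paper."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = positions.shape
    # k neighbors on each side in an assumed ring topology
    idx = [(i + off) % n for off in range(-k, k + 1) if off != 0]
    guide = positions[idx].mean(axis=0)        # neighborhood dimension average
    if rng.random() < mut_prob:                # rare Gaussian jitter (assumed)
        guide = guide + rng.normal(0.0, sigma, dim)
    return guide

# Example: a guide for particle 0 in a small random swarm
pos = np.random.default_rng(1).uniform(-5, 5, (10, 3))
print(guiding_particle(pos, 0, rng=np.random.default_rng(2)))
```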

1. Introduction

Swarm intelligence optimization algorithms have a long history and have been developed over several decades. Common examples include particle swarm optimization (PSO) (Yada, 2018), the artificial bee colony algorithm (ABC) (Xue, 2018), the ant colony algorithm (ACO) (Engin, 2018), the fish swarm algorithm (FSA) (Neshat, 2014), cuckoo search (CS) (Yang, 2014), the hunting search algorithm (HuS) (Oftadeh, 2010), and differential evolution (DE) (Li, 2019). Evolutionary computing, inspired by biological evolution, aims to solve complex problems in less time (Nayyar, 2018). Metaheuristic algorithms have made great progress in recent decades; all of the above algorithms were proposed on this basis and are used to find optimized solutions to problems in computer science (Nayyar, 2018). Popular stochastic biological optimization techniques also include genetic algorithms, which perform well on multi-objective optimization problems (Nayyar, 2018), while the ant colony algorithm remains an efficient method for solving discrete optimization problems (Nayyar, 2016).

The particle swarm optimization algorithm was first proposed by Eberhart and Kennedy in 1995. The algorithm has a simple structure, is easy to implement, and can handle some highly complex problems. It simulates the foraging behavior of a bird flock searching for the best food source (the global optimal solution): the flock divides the work and cooperates, and while searching, individuals transmit their position information to one another (Kennedy & Eberhart, 1995; Eberhart & Kennedy, 1995). By the end of the iterations, the whole flock gathers near the food source; at this point the flock has found the optimal solution to the problem, and the algorithm has converged at this position.
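As context for the discussion above, the canonical PSO velocity and position update (with the inertia weight w, which was introduced after the original 1995 formulation) can be sketched in a few lines of Python. This is a generic reference sketch, not the variant proposed in this paper; the sphere objective and all parameter values are illustrative choices only.

```python
import numpy as np

def pso(objective, dim=10, n_particles=30, iters=200,
        w=0.729, c1=1.494, c2=1.494, bounds=(-5.0, 5.0)):
    """Minimal canonical PSO: inertia-weighted velocity/position updates."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))    # positions
    v = np.zeros((n_particles, dim))               # velocities
    pbest = x.copy()                               # personal best positions
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()         # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: minimize the sphere function (illustrative objective)
best_x, best_f = pso(lambda z: float(np.sum(z * z)))
print(best_f)
```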

Since it was first proposed, the particle swarm optimization algorithm has had an extensive influence in academia, because it is structurally simple, easy to understand and implement, and requires few parameter settings. It has been applied to many real-world problems, such as function optimization (Chen, 2018), motion planning (Kim & Lee, 2015), resource allocation (Gong, 2012), and image processing (Setayesh, 2013).
