Social Spider Algorithm for Training Artificial Neural Networks

Burak Gülmez, Sinem Kulluk
Copyright: © 2019 | Pages: 18
DOI: 10.4018/IJBAN.2019100103

Abstract

Artificial neural networks (ANNs) are among the most widely used techniques for generalization, classification, and optimization. ANNs are inspired by the human brain and can automatically learn new information and draw new inferences. Back-propagation (BP) is the most common algorithm for training ANNs, but it converges slowly and can become trapped in local optima. Meta-heuristic algorithms overcome these drawbacks and are frequently used for training ANNs. In this study, a new-generation meta-heuristic, the Social Spider (SS) algorithm, is adapted for training ANNs. The performance of the algorithm is compared with conventional and meta-heuristic algorithms on classification benchmark problems from the literature. The algorithm is also applied to real-world data to predict the production of a factory in Kayseri and is compared with several regression-based algorithms and ANN models. The results on the classification benchmark datasets show that the SS algorithm is competitive for training ANNs, and on the real-world production dataset it outperforms all compared algorithms. These experimental studies indicate that the SS algorithm is highly capable of training ANNs and can be used for both classification and regression.
Article Preview

Introduction

Artificial neural networks are mathematical models inspired by the biological nervous system (Kulluk, 2013). ANNs are trained by adjusting the weights between neurons: the training dataset is presented to the network, and the weights are updated to improve its performance. After training, the network is evaluated on a separate testing dataset. To achieve high accuracy, the network must be trained well. There are various algorithms for training neural networks, the most common being the back-propagation algorithm, a gradient-descent method. Despite benefits such as its simplicity and effectiveness, the BP algorithm has drawbacks: it can become trapped in local optima and requires a differentiable activation function. To overcome the local-minimum problem of the BP algorithm, meta-heuristic algorithms can be used, since they treat the network's weight vector as a candidate solution to be optimized directly against the network's error, as the sketch below illustrates.
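
To make the weight-optimization idea concrete, here is a minimal sketch in Python (using only NumPy). It is not the authors' Social Spider algorithm: the network size, the 30-candidate population, and the "move toward the best solution plus random noise" rule are illustrative assumptions standing in for the SS algorithm's vibration-guided spider movements. It only demonstrates the shared principle of encoding all weights and biases as one flat vector and scoring each candidate by network error, here on the XOR problem.

import numpy as np

# Minimal sketch (NOT the authors' Social Spider algorithm): a generic
# population-based search that trains a tiny feed-forward network on XOR
# by optimizing its flattened weight vector against the network error.

rng = np.random.default_rng(0)

# XOR training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # all weights + biases

def unpack(w):
    # Split a flat candidate vector into weight matrices and bias vectors.
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(w):
    # Fitness: mean squared error of the network encoded by w.
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return float(np.mean((out - y) ** 2))

# Population-based search loop: each candidate drifts toward the best
# solution found so far plus Gaussian noise (an assumed stand-in for the
# SS algorithm's vibration-guided moves).
pop = rng.normal(0.0, 1.0, size=(30, DIM))
best = pop[0].copy()
for _ in range(2000):
    fitness = np.array([mse(w) for w in pop])
    best = pop[fitness.argmin()].copy()
    pop = pop + 0.5 * (best - pop) + rng.normal(0.0, 0.1, size=pop.shape)
    pop[0] = best  # elitism: never lose the best candidate

print("best MSE after search:", mse(best))

Note that the search only ever evaluates the network's error, so no gradient, and hence no differentiable activation function, is required. This is precisely the property that lets meta-heuristics sidestep the BP drawbacks described above.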

There are many studies on training neural networks, most of which use meta-heuristic algorithms because of these advantages. Among them, Sexton and Gupta (2000) trained ANNs with a genetic algorithm and obtained better results than the BP algorithm. Gudise and Venayagamoorthy (2003) compared the particle swarm optimization algorithm with the BP algorithm for training ANNs and found that particle swarm optimization gave the lowest error rates. Blum and Socha (2005) used the ant colony optimization algorithm to train ANNs, evaluating its performance on medical datasets; their results showed that a hybrid version of the ant colony algorithm gave the best results. Kim et al. (2005) trained ANNs with a modified genetic algorithm, which was faster and gave better results than the compared algorithms.

Dengiz et al. (2009) used the tabu search algorithm to train ANNs; it obtained better and faster results than genetic and simulated annealing algorithms. The harmony search algorithm was used for training ANNs by Kattan et al. (2010), who compared its performance with the BP algorithm and found that harmony search gave better results. Zamani and Sadeghi (2010) trained ANNs with the particle swarm optimization algorithm, using several benchmark datasets in their experimental study, and achieved high accuracy ratios. Mirjalili et al. (2012) proposed a hybrid of particle swarm optimization and the gravitational search algorithm (PSOGSA) as a new training method; their experiments showed that PSOGSA outperforms both particle swarm optimization and gravitational search for training feed-forward ANNs in terms of convergence speed and avoidance of local minima. Kawam and Mansour (2012) applied the cuckoo search algorithm to training a feed-forward multi-layer perceptron (MLP) network and compared its performance with other competing meta-heuristic algorithms. Green II et al. (2012) compared the central force optimization algorithm with the particle swarm optimization algorithm for training ANNs; evaluated on the XOR and iris datasets, central force optimization gave the better results.

The firefly algorithm was used for training ANNs by Nandy et al. (2012a); results on several benchmark problems showed it to be more effective than the genetic algorithm. The same authors also trained ANNs with the artificial bee colony algorithm (Nandy, Sarkar, & Das, 2012b), which likewise gave good results. Askarzadeh and Rezazadeh (2013) used the bird mating algorithm for training ANNs; it did not give the best results but produced acceptable results on the test datasets. ANNs were trained with a hybrid of the harmony search and hunting search algorithms by Kulluk (2013), whose results were compared with several traditional and meta-heuristic algorithms; in most cases her hybrid algorithm outperformed them.
