Functional Link Neural Network with Modified Artificial Bee Colony for Data Classification


Tutut Herawan, Yana Mazwin Mohmad Hassim, Rozaida Ghazali
Copyright: © 2017 | Pages: 14
DOI: 10.4018/IJIIT.2017070101

Abstract

The Functional Link Neural Network (FLNN) has emerged as an important tool for solving non-linear classification problems and has been successfully applied to many engineering and scientific problems. The FLNN structure is much more modest than that of an ordinary feedforward network such as the Multilayer Perceptron (MLP), because its flat architecture employs fewer tunable weights for training. However, the standard Backpropagation (BP) learning used for FLNN training is prone to getting trapped in local minima, which degrades the FLNN's classification performance. To overcome this drawback of BP learning, this paper proposes an Artificial Bee Colony (ABC) optimization with a modification of the bee foraging behaviour (mABC) as an alternative learning scheme for the FLNN. This is motivated by the good exploration and exploitation capabilities in searching for optimal weight parameters exhibited by the ABC algorithm. The classification accuracy of the FLNN with mABC (FLNN-mABC) is compared with that of the original FLNN architecture trained with standard Backpropagation (FLNN-BP) and with the standard ABC algorithm (FLNN-ABC). The FLNN-mABC algorithm provides a better learning scheme for the FLNN network, with an average overall improvement of 4.29% compared to FLNN-BP and FLNN-ABC.

Introduction

Classification is one of the most frequently studied problems in the area of Artificial Neural Networks (ANNs) and lies at the heart of many human decision-making tasks. The vast recent research activity in neural classification has established that ANNs are a genuinely promising tool, and they have been widely applied to various real-world classification tasks, including bankruptcy prediction, handwriting recognition, speech recognition, application usage patterns, fault detection and medical diagnosis (Banu & Nagaveni, 2012; Chen, 2010; Kumar & Rath, 2016; Grace & Williams, 2016; Anitha & Acharjya, 2016). One of the best-known types of neural network is the Multilayer Perceptron (MLP). The MLP structure consists of multiple layers of nodes, which gives the network the ability to solve problems that are not linearly separable. However, the MLP usually requires a fairly large amount of training data to achieve good classification ability. When the number of inputs to the model and the number of hidden nodes become large, the MLP architecture becomes complex and therefore slower to operate. Furthermore, the difficulty of choosing an appropriate number of neurons per layer and an appropriate number of hidden layers also makes the MLP architecture hard to train. An alternative way to avoid these problems is to remove the hidden layers from the architecture, which led to an alternative network architecture named the Functional Link Neural Network (FLNN) (Pao & Takefuji, 1992). The FLNN is a flat network (without hidden layers); this topology reduces the architectural complexity while still providing a nonlinear decision boundary for solving non-linearly separable classification tasks, as sketched below.
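To make the flat topology concrete, the following is a minimal sketch of an FLNN forward pass, assuming for illustration a trigonometric functional expansion (a common choice for FLNNs, though the expansion order and basis here are not taken from this paper). Each input feature is expanded with nonlinear basis functions, and a single weighted layer maps the expanded vector to the output, so no hidden layers are required.

```python
import numpy as np

def functional_expansion(x, order=2):
    """Expand each input feature with trigonometric basis terms.

    A feature x_i becomes [x_i, sin(pi*x_i), cos(pi*x_i), sin(2*pi*x_i), ...],
    which lets a flat (hidden-layer-free) network form a nonlinear
    decision boundary.
    """
    blocks = [x]
    for k in range(1, order + 1):
        blocks.append(np.sin(k * np.pi * x))
        blocks.append(np.cos(k * np.pi * x))
    return np.concatenate(blocks, axis=-1)

def flnn_forward(x, weights, bias):
    """Single-layer FLNN output: sigmoid over the expanded feature vector."""
    z = functional_expansion(x) @ weights + bias
    return 1.0 / (1.0 + np.exp(-z))

# Example: a 4-feature input expands to 4 * (1 + 2*order) = 20 terms,
# so only 20 weights (plus a bias) need to be trained.
x = np.random.rand(4)
w = np.random.uniform(-1.0, 1.0, size=20)
print(flnn_forward(x, w, bias=0.0))
```

Because all the nonlinearity comes from the fixed expansion rather than from hidden layers, the only trainable parameters are the weights of this single layer, which is what keeps the architecture modest.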

The FLNN network is usually trained by adjusting the weights of the connections between neurons. The standard method for tuning the weights in the FLNN is the Backpropagation (BP) learning algorithm. The BP-learning algorithm, developed by Rumelhart et al. (1986), is the most well-known and most widely used algorithm for training neural networks. The idea of BP learning is to reduce the network error until the network has learned the training data. During the FLNN-BP training phase, the feedforward calculation is combined with backward error propagation in order to adjust the connection weights. However, one crucial problem with the standard BP-learning algorithm is that it can easily get trapped in local minima, especially on non-linearly separable classification problems (Dehuri & Cho, 2010a). The performance of network learning is also strictly dependent on the shape of the error surface, the values of the initial connection weights, and parameters such as the learning rate and momentum (Liu et al., 2004). To improve on this, the Artificial Bee Colony (ABC) optimization algorithm can be used to optimize the FLNN weights instead of the BP-learning algorithm (Hassim & Ghazali, 2013). The ABC algorithm, which simulates the intelligent foraging behavior of a honey bee swarm, was proposed by Karaboga (2005) for solving numerical optimization problems. Training the connection weights of a neural network can be treated as an optimization task (Karaboga et al., 2007), since the goal of the training process is to find the optimal weight set of the network. Training the FLNN with the ABC algorithm (FLNN-ABC) as the learning scheme on Boolean function classification (Hassim & Ghazali, 2013) and on 2-class classification datasets (Hassim & Ghazali, 2012) has shown good exploration and exploitation capabilities in searching for optimal weights, with better accuracy results. However, the random foraging behaviour of the employed bees when exploiting the FLNN weight parameters results in poor classification accuracy when applied to multiclass classification problems.
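As an illustration of how ABC can serve as the FLNN's learning scheme, the sketch below outlines a standard ABC loop over candidate weight vectors (employed, onlooker and scout bee phases). The `loss` function, colony size, abandonment limit and bounds are assumptions made for the example, not the settings used in this paper, and the sketch shows the unmodified ABC rather than the proposed mABC.

```python
import numpy as np

def abc_train(loss, dim, colony=20, limit=50, cycles=200, bound=1.0):
    """Standard ABC search over FLNN weight vectors.

    'loss' is assumed to map a weight vector of length 'dim' to the
    network's classification error on the training set.
    """
    rng = np.random.default_rng(0)
    foods = rng.uniform(-bound, bound, size=(colony, dim))  # candidate weight sets
    costs = np.array([loss(f) for f in foods])
    trials = np.zeros(colony, dtype=int)

    def neighbour(i):
        # Perturb one randomly chosen dimension toward a random partner source.
        k = rng.choice([c for c in range(colony) if c != i])
        j = rng.integers(dim)
        v = foods[i].copy()
        v[j] += rng.uniform(-1.0, 1.0) * (foods[i, j] - foods[k, j])
        return np.clip(v, -bound, bound)

    def greedy(i, v):
        # Keep the candidate only if it improves the food source.
        c = loss(v)
        if c < costs[i]:
            foods[i], costs[i], trials[i] = v, c, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(colony):                              # employed bee phase
            greedy(i, neighbour(i))
        fitness = 1.0 / (1.0 + costs)                        # higher fitness = lower error
        probs = fitness / fitness.sum()
        for i in rng.choice(colony, size=colony, p=probs):   # onlooker bee phase
            greedy(i, neighbour(i))
        worst = int(np.argmax(trials))                       # scout bee phase
        if trials[worst] > limit:
            foods[worst] = rng.uniform(-bound, bound, size=dim)
            costs[worst] = loss(foods[worst])
            trials[worst] = 0

    best = int(np.argmin(costs))
    return foods[best], costs[best]
```

The random single-dimension perturbation in `neighbour` is precisely the exploitation step whose unguided behaviour the paper identifies as a weakness on multiclass problems, and it is this step that the proposed mABC modifies.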
