Improving Performance of Higher Order Neural Network using Artificial Chemical Reaction Optimization: A Case Study on Stock Market Forecasting

Sarat Chandra Nayak, Bijan Bihari Misra, Himansu Sekhar Behera
DOI: 10.4018/978-1-5225-0788-8.ch070

Abstract

Multilayer neural networks are a commonly used technique for mapping complex nonlinear input-output relationships. However, their structural complexity adds considerable computational cost. This chapter presents different functional link networks (FLN), a class of higher order neural network (HONN). FLNs can handle linearly non-separable classes by increasing the dimensionality of the input space through nonlinear combinations of the input signals. Such networks are usually trained with gradient descent based back propagation, which suffers from many drawbacks. To overcome these drawbacks, a metaheuristic inspired by natural chemical reactions, called artificial chemical reaction optimization (ACRO), is used here to train the network. As a case study, forecasting the stock index prices of different stock markets such as BSE, NASDAQ, TAIEX, and FTSE is considered to compare and analyze the performance gain over traditional techniques.
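The functional expansion described in the abstract — mapping each input signal through nonlinear basis functions to enlarge the input space — can be sketched as follows. This is a minimal illustration only; the trigonometric basis and the expansion order are common choices in the FLANN literature, not necessarily the exact scheme used in this chapter.

```python
import numpy as np

def functional_expansion(x, order=2):
    """Expand each raw input feature with trigonometric basis
    functions, as in a functional link network (FLN).

    Each of the d input signals is mapped to 1 + 2*order terms
    (the signal itself plus sin/cos terms), so the expanded
    vector has d * (1 + 2*order) components. The sin/cos basis
    and the default order are illustrative assumptions."""
    expanded = [x]
    for k in range(1, order + 1):
        expanded.append(np.sin(k * np.pi * x))
        expanded.append(np.cos(k * np.pi * x))
    return np.concatenate(expanded)

# Two raw input signals expand to 2 * (1 + 2*2) = 10 components,
# which a single-layer network can then combine linearly.
features = np.array([0.2, 0.5])
z = functional_expansion(features, order=2)
print(z.shape)  # → (10,)
```

Because the nonlinearity lives in the expansion rather than in hidden layers, the trainable part of an FLN stays a single layer of weights, which is what makes it cheaper than a multilayer network.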

Introduction

Artificial Neural Networks (ANN) are good universal approximators that can approximate any continuous function to a desired accuracy. They also allow adaptive adjustment of the model and a nonlinear description of the problem. Some early uses of ANN for financial forecasting can be found in the research work carried out by Refenes et al. (1994), Schoeneburg (1990), Yoon et al. (1994), Yoon and Swales (1991), Choi et al. (1995), Gately (1996), and Drossu and Obradovic (1996). ANNs have since been applied to many areas such as data mining, stock market analysis, medicine, and many other fields. Gradient based methods are among the most widely used error-minimization methods for training back propagation networks. The back propagation algorithm is a classical technique for supervised training. It works by measuring the output error, calculating the gradient of this error, and adjusting the ANN weights and biases in the descending gradient direction. Back propagation is the most commonly used and simplest training algorithm for feed forward networks used in classification. Back propagation based ANNs are popular methods for stock market prediction because of their computational efficiency, generalization ability, and strong nonlinear mapping ability.
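The error-gradient-update cycle described above can be sketched for a single linear unit with a squared-error loss. This is a minimal sketch of the general gradient descent idea, not the chapter's specific network; the linear unit, the loss, and the learning rate are simplifying assumptions.

```python
import numpy as np

def gradient_descent_step(w, x, t, lr=0.1):
    """One back-propagation-style update: measure the output error,
    compute the gradient of the squared error w.r.t. the weights,
    and move the weights in the descending gradient direction."""
    y = x @ w                  # forward pass: network output
    error = y - t              # output error against the target t
    grad = error * x           # gradient of 0.5 * error**2 w.r.t. w
    return w - lr * grad       # step against the gradient

# Repeatedly applying the update drives the output toward the target.
w = np.zeros(3)
x = np.array([1.0, 0.5, -0.2])
target = 1.0
for _ in range(100):
    w = gradient_descent_step(w, x, target)
# After training, x @ w is close to the target of 1.0.
```

Metaheuristics such as ACRO replace this gradient-following loop with a population-based search over the weight space, which avoids the local-minima and gradient-computation drawbacks mentioned above.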
