Adaptive Hybrid Higher Order Neural Networks for Prediction of Stock Market Behavior


Sarat Chandra Nayak, Bijan Bihari Misra, Himansu Sekhar Behera
DOI: 10.4018/978-1-5225-0063-6.ch007

Abstract

This chapter presents two higher order neural network (HONN) models, the Pi-Sigma and the Sigma-Pi networks, for efficient prediction of stock market behavior. Along with traditional gradient descent learning, it also discusses how an evolutionary computation technique such as the genetic algorithm (GA) can be used effectively for the learning process. The learning process is made adaptive to handle the noise and uncertainty associated with stock market data. Further, different prediction approaches are discussed, and the application of HONNs to time series forecasting is illustrated with real-life data taken from a number of stock markets across the globe.

1. Introduction

The chapter gives a deep insight into the architecture, background, and applications of higher order neural networks in the areas of data mining, control, and function approximation. In particular, three HONNs have been developed and applied to the task of short- and long-term prediction of the daily closing prices of five fast-growing global stock markets. The concept of stock market prediction, the problems involved in it, the pitfalls of statistical methods, and the application of HONNs are addressed in this chapter.

During the last two decades there has been tremendous development in the areas of soft computing, which include Artificial Neural Networks (ANN), evolutionary algorithms, and fuzzy systems. This improvement in computational intelligence capabilities has enhanced the modeling of complex, dynamic, and multivariate nonlinear systems. These soft computing methodologies have been applied successfully to data classification, financial forecasting, credit scoring, portfolio management, risk evaluation, and related areas, and have been found to produce better performance. An advantage of applying ANNs to stock market forecasting is that prior knowledge can be incorporated into the network to improve prediction accuracy. ANNs also allow adaptive adjustment of the model and a nonlinear description of the problem. They are good universal approximators, capable of approximating any continuous function to a desired accuracy.

Most of the research work in the financial forecasting area has used ANNs, particularly the Multilayer Perceptron (MLP). The ability of the MLP to perform complex nonlinear mappings and to tolerate noise in financial time series is well established. However, slow convergence and the tendency to get stuck in local minima are two well-known shortcomings of the MLP. To escape local minima, more nodes can be added to the hidden layers, but multiple hidden layers and more neurons in each layer also add computational complexity to the network. Moreover, various feed forward and multilayer neural networks are characterized by several drawbacks, such as poor generalization, limited nonlinear input-output mapping capability, and a slow rate of learning.

On the other hand, HONNs are described by Guler and Sahin (1994) as a type of feed forward network that provides nonlinear decision boundaries, hence offering better classification capability than linear neurons. They differ from ordinary feed forward networks in the introduction of higher order terms into the network. HONNs have fast learning properties, stronger approximation ability, greater storage capacity, higher fault tolerance, and powerful mapping with a single layer of trainable weights, as described by Wang et al. (2008). In most neural network models, inputs are combined using summing operations only, whereas a HONN contains not only summing units but also units that compute the product of weighted inputs, known as higher order terms. Because only a single layer of trainable weights is needed to achieve nonlinear separability, HONNs are simple in architecture and require fewer weights to capture the associated nonlinearity, as suggested by Shin and Ghosh (1995) and Park et al. (2000). Compared to networks using summation units alone, the higher order terms in a HONN increase the information capacity of the network. This representational power of higher order terms helps to solve complex nonlinear problems with small networks while maintaining fast convergence, as discussed by Leerink et al. (1995). A minimal sketch of such a network is given below.
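To make the architecture concrete, the following is a minimal sketch of a Pi-Sigma style network: a single layer of trainable summing (sigma) units whose outputs are multiplied by a fixed product (pi) unit before a sigmoid output. The class name, the choice of network order, and the gradient descent update shown here are illustrative assumptions for exposition, not the exact formulation or training scheme used in the chapter.

import numpy as np

class PiSigmaNetwork:
    # Sketch of a Pi-Sigma HONN: a single layer of trainable weights feeds
    # K summing (sigma) units whose outputs are multiplied by a product (pi) unit.
    def __init__(self, n_inputs, order, seed=None):
        rng = np.random.default_rng(seed)
        # One weight vector (plus bias) per summing unit; these are the
        # only trainable parameters of the network.
        self.W = rng.normal(0.0, 0.1, size=(order, n_inputs))
        self.b = np.zeros(order)

    def forward(self, x):
        h = self.W @ x + self.b            # sigma (summing) units
        net = np.prod(h)                   # pi (product) unit: higher order terms
        return 1.0 / (1.0 + np.exp(-net))  # sigmoid output

    def train_step(self, x, target, lr=0.01):
        # One gradient descent step on squared error; only the summing-unit
        # weights and biases are updated (single trainable layer).
        h = self.W @ x + self.b
        net = np.prod(h)
        y = 1.0 / (1.0 + np.exp(-net))
        err = y - target
        dnet = err * y * (1.0 - y)         # dE/d(net) through the sigmoid
        for k in range(self.W.shape[0]):
            # d(net)/d(h_k) is the product of the other summing-unit outputs.
            prod_others = np.prod(np.delete(h, k))
            self.W[k] -= lr * dnet * prod_others * x
            self.b[k] -= lr * dnet * prod_others
        return 0.5 * err ** 2

In a forecasting setting, x would typically be a sliding window of normalized closing prices and the target the next normalized closing price; the number of summing units (the order) controls the degree of the higher order terms.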

Evolutionary training algorithms are capable of searching the weight space more effectively than gradient descent based techniques. Evolutionary hybrid networks have received wide application in nonlinear forecasting due to their broad adaptive and learning abilities, as discussed by Kwon and Moon (2007). The hybrid iterative evolutionary learning algorithm is more effective than conventional algorithms in terms of both learning accuracy and prediction accuracy, as discussed by Yu and Zhang (2005). A sketch of GA-based weight learning follows.
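The sketch below illustrates one way such evolutionary learning of the network weights could be organized: a real-coded GA evolves flattened weight vectors, and the fitness of a chromosome is the negative training error of the network defined by those weights. The operator choices (tournament selection, arithmetic crossover, Gaussian mutation) and all parameter values are illustrative assumptions rather than the specific hybrid algorithms of Kwon and Moon (2007) or Yu and Zhang (2005).

import numpy as np

def ga_optimize(fitness_fn, dim, pop_size=40, generations=200,
                crossover_rate=0.8, mutation_rate=0.05, mutation_sigma=0.1,
                seed=None):
    # Generic real-coded GA for weight learning: each chromosome is a
    # flattened weight vector of length dim; fitness_fn maps a chromosome
    # to a scalar fitness (e.g. negative mean squared prediction error).
    rng = np.random.default_rng(seed)
    pop = rng.normal(0.0, 0.5, size=(pop_size, dim))

    def tournament(fitness):
        # Binary tournament: return the fitter of two randomly chosen individuals.
        i, j = rng.integers(pop_size, size=2)
        return pop[i] if fitness[i] > fitness[j] else pop[j]

    best, best_fit = None, -np.inf
    for _ in range(generations):
        fitness = np.array([fitness_fn(ind) for ind in pop])
        if fitness.max() > best_fit:
            best_fit = fitness.max()
            best = pop[fitness.argmax()].copy()
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(fitness), tournament(fitness)
            if rng.random() < crossover_rate:
                alpha = rng.random()
                child = alpha * p1 + (1.0 - alpha) * p2   # arithmetic crossover
            else:
                child = p1.copy()
            mask = rng.random(dim) < mutation_rate        # Gaussian mutation
            child[mask] += rng.normal(0.0, mutation_sigma, size=mask.sum())
            children.append(child)
        pop = np.array(children)
    return best

One common hybrid arrangement is to use the weight vector returned by the GA to initialize the network and then refine it with a few gradient descent epochs, combining the global search of the GA with local fine-tuning.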
