Higher Order Neural Network for Financial Modeling and Simulation

Partha Sarathi Mishra, Satchidananda Dehuri
DOI: 10.4018/978-1-5225-0788-8.ch030

Abstract

The financial market creates a complex, ever-changing environment in which a population of investors competes for profit. Predicting the future for financial gain is a difficult and challenging task; at the same time, it is a profitable activity. Hence, the ability to obtain a highly efficient financial model has become increasingly important in this competitive world. To cope with this, we consider functional link artificial neural networks (FLANNs) trained by particle swarm optimization (PSO) for stock index prediction (PSO-FLANN). Our experiments confirm that the performance of the PSO-tuned FLANN model is promising for short-horizon (few-steps-ahead) prediction tasks. In most cases, the LMS-updated FLANN model proved to be as good as or better than the RLS-updated FLANN, but at the same time the RLS-updated FLANN model cannot be ignored for stock index prediction.
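To make the setup concrete, below is a minimal sketch (not the authors' exact model or data) of a trigonometric FLANN whose single weight layer is tuned by a basic global-best PSO on a synthetic one-step-ahead series. The expansion order, lag window, swarm size, and PSO coefficients are illustrative assumptions.

```python
# Sketch: PSO-tuned FLANN for one-step-ahead prediction (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def expand(window, order=2):
    """Trigonometric functional expansion of a 1-D input window, plus a bias term."""
    feats = [window]
    for k in range(1, order + 1):
        feats.append(np.sin(k * np.pi * window))
        feats.append(np.cos(k * np.pi * window))
    return np.concatenate(feats + [np.ones(1)])

def mse(w, X, y):
    return np.mean((X @ w - y) ** 2)

# Synthetic "index" series (placeholder data) and one-step-ahead targets.
series = np.cumsum(rng.normal(size=300)) / 30.0
lag = 5
X = np.array([expand(series[t - lag:t]) for t in range(lag, len(series))])
y = series[lag:]

# Global-best PSO over the FLANN weight vector (coefficient values are assumptions).
dim, n_particles, iters = X.shape[1], 20, 200
pos = rng.normal(scale=0.1, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_err = np.array([mse(p, X, y) for p in pos])
gbest = pbest[np.argmin(pbest_err)].copy()

inertia, c1, c2 = 0.7, 1.5, 1.5
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([mse(p, X, y) for p in pos])
    better = err < pbest_err
    pbest[better], pbest_err[better] = pos[better], err[better]
    gbest = pbest[np.argmin(pbest_err)].copy()

print("training MSE of the PSO-tuned FLANN:", mse(gbest, X, y))
```

In the chapter's framing, the gradient-based LMS or RLS update would play the role that the PSO velocity update plays here, adjusting the same single layer of weights over the expanded inputs.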
Chapter Preview

1. Introduction

Higher order neural networks (HONNs) have re-awakened the scientific and engineering community to the modeling and processing of numerous quantitative phenomena, particularly in the financial domain. These networks are specifically designed to handle linearly non-separable problems through an appropriate input representation. A suitably enhanced representation of the input data must therefore be found, which can be achieved by increasing the dimensionality of the input space; the expanded input data is then used for training instead of the original input data. The higher order input terms are chosen so that they are linearly independent of the original pattern components. With this enhanced input representation, linear separability can be achieved in the extended space, as illustrated in the sketch below.
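A minimal sketch of this idea, using the classic XOR problem (data and weights here are illustrative, not taken from the chapter): XOR is not linearly separable in (x1, x2), but appending the product term x1*x2 as an extra input makes it separable by a single linear unit.

```python
# Sketch: input expansion with a higher-order (product) term makes XOR separable.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])                       # XOR targets

X_ext = np.column_stack([X, X[:, 0] * X[:, 1]])  # append the product term x1*x2

# One hand-picked hyperplane in the extended space separates the two classes:
w, b = np.array([1.0, 1.0, -2.0]), -0.5
print((X_ext @ w + b > 0).astype(int))           # -> [0 1 1 0], matching y
```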

Ongoing developments in the field of neural networks (NNs) have made their structures increasingly complex. This complexity arises from combining many hidden layers with a large number of neurons in each layer, which makes the training time of NN models impractically long. HONNs alleviate this problem by providing simpler networks in which all of the possible higher order multiplicative or functional interactions between the elements of the input vector are supplied explicitly.

A HONN is a different type of neural network, with an expanded input space in its single-layer feed-forward architecture. It contains summing units and product units that multiply their inputs. These higher order terms, or product units, increase the information capacity of the input features and provide nonlinear decision boundaries, giving better classification and prediction capability than a linear neuron (Sahin, 1994). A major advantage of HONNs is that only one layer of trainable weights is needed to achieve nonlinear separability, unlike in a typical MLP or feed-forward neural network (Mishra and Dehuri, 2007). A single-layer forward pass of this kind is sketched below.
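The following sketch shows one possible single-layer HONN forward pass: the input vector is augmented with all pairwise product terms (second-order units), and a single trainable weight vector maps the expanded vector to the output. The input dimensions and the sigmoid output unit are illustrative assumptions.

```python
# Sketch: second-order input expansion feeding one trainable weight layer.
import numpy as np
from itertools import combinations_with_replacement

def second_order_expand(x):
    """Original inputs plus all degree-2 product terms x_i * x_j (i <= j), plus a bias."""
    products = [x[i] * x[j] for i, j in combinations_with_replacement(range(len(x)), 2)]
    return np.concatenate([x, np.array(products), np.ones(1)])

def honn_forward(x, w):
    z = second_order_expand(x)
    return 1.0 / (1.0 + np.exp(-(w @ z)))        # single sigmoid summing unit

x = np.array([0.2, -0.5, 0.8])                   # example 3-feature input
n_weights = 3 + 6 + 1                            # inputs + product terms + bias
w = np.random.default_rng(1).normal(size=n_weights)
print(honn_forward(x, w))
```

Only the vector w is trained; the product units themselves contain no adjustable parameters, which is what keeps the trainable part of the network to a single layer.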

Although most neural network models share the common goal of performing a functional mapping, different network architectures may vary significantly in their ability to handle different types of problems. For some tasks, higher order combinations of some of the inputs or activations may provide a better representation for solving the problem. HONNs are needed because ordinary feed-forward networks such as the MLP cannot avoid the problem of slow learning, especially on highly complex nonlinear problems (Chen and Leung, 2004).
