Artificial Multi-Polynomial Higher Order Neural Network Models


DOI: 10.4018/978-1-4666-2175-6.ch001

Abstract

This chapter introduces Multi-Polynomial Higher Order Neural Network (MPHONN) models with improved accuracy. A MPHONN simulator has been built on a Sun workstation using C++ and Motif. Real-world data cannot always be modeled and simulated with high accuracy by a single polynomial function, so ordinary higher order neural networks can fail to simulate complicated real-world data. The MPHONN model, however, can simulate multi-polynomial functions and, as experiments show, produces results with improved accuracy. In experiments on financial modeling and simulation, MPHONN is consistently 0.5051% to 0.8661% more accurate than ordinary higher order neural network models.
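To make the idea concrete, the sketch below shows a single higher-order neural unit whose output is a weighted sum of all input products up to a chosen polynomial order. This is a minimal illustration of the general higher-order-network principle only; the chapter's actual MPHONN formulation, which combines several polynomial function families, is not reproduced here, and the weights shown are arbitrary example values.

```python
import numpy as np
from itertools import combinations_with_replacement

def honn_output(x, weights, order=2):
    """Output of one higher-order neural unit: a weighted sum of all
    products of the inputs up to the given polynomial order.

    Illustrative sketch only -- not the chapter's exact MPHONN model.
    """
    terms = [1.0]  # bias term
    for k in range(1, order + 1):
        # every multiset of k input indices yields one product term
        for idx in combinations_with_replacement(range(len(x)), k):
            terms.append(np.prod([x[i] for i in idx]))
    terms = np.array(terms)
    return float(weights @ terms)

# Example: 2 inputs, order 2 -> terms [1, x0, x1, x0^2, x0*x1, x1^2]
x = np.array([2.0, 3.0])
w = np.array([1.0, 0.5, 0.5, 0.1, 0.2, 0.1])  # arbitrary example weights
y = honn_output(x, w, order=2)  # 1 + 1.0 + 1.5 + 0.4 + 1.2 + 0.9 = 6.0
```

Note that the number of product terms grows combinatorially with the order, which is one practical motivation for restricted architectures such as the Pi-Sigma networks mentioned later in this chapter.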

Introduction

HONN Applications

Artificial Higher Order Neural Networks (HONNs) have many applications in different areas. Barron, Gilstrap, and Shrier (1987) developed polynomial and neural networks for analogies and engineering applications. An, Mniszewski, Lee, Papcun, and Doolen (1988) tested a learning procedure, based on a default hierarchy of high-order neural networks, which exhibited an enhanced capability of generalization and a good efficiency in learning to read English. Mao, Selviah, Tao, and Midwinter (1991) designed a holographic high-order associative memory system. Mendel (1991) studied higher-order statistics (spectra) system theory and applied it to signal processing. Rovithakis, Kosmatopoulos, and Christodoulou (1993) researched robust adaptive control of unknown plants using recurrent high-order neural networks, with application to mechanical systems. Miyajima, Yatsuki, and Kubota (1995) built higher order neural networks with product connections that hold the weighted sum of products of input variables, and showed that they are superior in ability to traditional neural networks in applications. Xu, Liu, and Liao (2005) explored global asymptotic stability of high-order Hopfield-type neural networks with time delays. There are two major ways of encoding a neural network into a chromosome, as required in the design of a Genetic Algorithm (GA): explicit (direct) and implicit (indirect) encoding. Siddiqi (2005) genetically evolved higher order neural networks by the direct encoding method. Ren and Cao (2006) provided LMI-based criteria for the stability of high-order neural networks with time-varying delay for nonlinear analysis. More recently, Selviah (2009) described the progress in using optical technology to construct high-speed artificial higher order neural network systems.
That chapter reviews how optical technology can speed up searches within large databases to identify relationships and dependencies between individual data records, such as financial or business time series, as well as trends and relationships within them. Epitropakis, Plagianakos, and Vrahatis (2010) proposed evolutionary algorithm training of higher order neural networks, aiming to further explore the capabilities of the HONN class, and especially of Pi-Sigma neural networks. Selviah and Shawash (2010) celebrated 50 years of first and higher order neural network implementations in terms of the physical layout and structure of electronic hardware, which offers high speed and low latency in compact, low-cost, low-power, mass-produced systems. Low latency is essential for practical applications in real-time control, for which software implementations running on Central Processing Units (CPUs) are too slow. Gupta, Homma, Hou, Solo, and Bukovsky (2010) gave the fundamental principles of Higher Order Neural Units (HONUs) and Higher Order Neural Networks (HONNs). An essential core of HONNs lies in higher-order weighted combinations or correlations between the input variables; using typical examples, their chapter describes how and why such higher-order combinations or correlations can be effective. Das, Lewis, and Subbarao (2010) pursued a dynamically tuned higher-order-like neural network approach to the control of a quad-rotor. The dynamics of a quad-rotor are a simplified form of helicopter dynamics that exhibit the same basic problems of strong coupling, multi-input/multi-output design, and unknown nonlinearities. Yu (2010) offered a robust adaptive control using higher order neural networks and projection, presenting a novel robust adaptive approach for a class of unknown nonlinear systems. The structure is composed of two parts: the neuron-observer and the tracking controller. Simulations of a two-link robot show the effectiveness of the proposed algorithm.
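The Pi-Sigma networks mentioned above avoid the combinatorial growth of full product-term networks by taking the product of a small number of linear (sigma) sums. A minimal sketch of one Pi-Sigma output unit follows; the tanh squashing function and all weight values are illustrative assumptions, not taken from the works cited.

```python
import numpy as np

def pi_sigma_output(x, W, theta):
    """One Pi-Sigma unit: K linear sums of the inputs are multiplied
    together, giving a degree-K polynomial in x with only K*n weights.

    Illustrative sketch; W has shape (K, n), theta has shape (K,).
    The tanh activation is an assumed choice for this example.
    """
    sums = W @ x + theta                  # K linear combinations (sigma layer)
    return float(np.tanh(np.prod(sums)))  # product (pi layer), then squashing

# Example with 2 inputs and K = 2 summing units (arbitrary weights)
x = np.array([1.0, -1.0])
W = np.array([[0.5, 0.5],
              [1.0, 0.0]])
theta = np.array([0.5, 0.0])
y = pi_sigma_output(x, W, theta)  # sums = [0.5, 1.0], so y = tanh(0.5)
```

With n inputs and K summing units, this unit realizes a degree-K polynomial using K(n+1) parameters, rather than the roughly n^K parameters a fully connected product-term network of the same order would need.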
