Artificial Higher Order Neural Network Nonlinear Models: SAS NLIN or HONNs?

Ming Zhang
Copyright © 2009 | Pages: 47
DOI: 10.4018/978-1-59904-897-0.ch001

Abstract

This chapter presents the general format of Higher Order Neural Networks (HONNs) for nonlinear data analysis, together with six different HONN models. It proves mathematically that HONN models can converge with mean squared errors close to zero, and it illustrates the learning algorithm with its weight-update formulas. HONN models are compared with SAS Nonlinear (NLIN) models, and the results show that HONN models are 3 to 12% better than the SAS NLIN models. Moreover, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.

Introduction

Background of Higher-Order Neural Networks (HONNs)

Although traditional Artificial Neural Network (ANN) models are recognized for their strong performance in pattern matching, pattern recognition, and mathematical function approximation, they often become stuck in local, rather than global, minima. In addition, ANNs take an unacceptably long time to converge in practice (Fulcher, Zhang, and Xu, 2006). Moreover, ANNs are unable to handle non-smooth, discontinuous training data and the complex mappings that arise in financial time series simulation and prediction. ANNs are also 'black box' in nature, meaning the explanations for their outputs are not obvious. These limitations motivate the study of Higher Order Neural Networks (HONNs).

A HONN extends the standard architecture through its neuron activation functions, preprocessing of the neuron inputs, and connections to more than one layer (Bengtsson, 1990). In this chapter, HONN refers to the neuron type, which can be linear, power, multiplicative, sigmoid, logarithmic, and so on. First-order neural networks can be formulated using linear neurons, which are only capable of capturing first-order correlations in the training data (Giles & Maxwell, 1987). HONNs of second order and above capture higher-order correlations in the training data, which requires more complex neuron activation functions (Barron, Gilstrap, & Shrier, 1987; Giles & Maxwell, 1987; Psaltis, Park, & Hong, 1988). Neurons that include terms up to and including degree k are referred to as kth-order neurons (Lisboa and Perantonis, 1991).
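
To make the notion of a kth-order neuron concrete, the following minimal Python sketch (illustrative names and a tanh activation chosen for the example; this is not code from the chapter) computes the output of a second-order neuron whose net input combines linear terms with all pairwise products of the inputs:

```python
import numpy as np

def second_order_neuron(x, w1, w2, bias):
    """Hypothetical second-order neuron: the net input combines
    degree-1 terms (w1 . x) with degree-2 terms built from all
    pairwise products x_i * x_j, weighted by the matrix w2."""
    net = w1 @ x + x @ w2 @ x + bias
    return np.tanh(net)  # any bounded activation would do here

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0, 2.0])
w1 = rng.normal(size=3)        # weights on first-order terms
w2 = rng.normal(size=(3, 3))   # weights on second-order terms
print(second_order_neuron(x, w1, w2, 0.1))
```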

Rumelhart, Hinton, and McClelland (1986) introduce 'sigma-pi' neurons and show that the standard backpropagation algorithm, developed for simple additive neurons, generalizes to these units. Both Hebbian and Perceptron learning rules can be employed when no hidden layers are involved (Shin, 1991). The performance of first-order ANNs can be improved by utilizing sophisticated learning algorithms (Karayiannis and Venetsanopoulos, 1993). Redding, Kowalczyk, and Downs (1993) develop a constructive HONN algorithm. Zhang and Fulcher (2004) develop Polynomial, Trigonometric, and other HONN models. Giles, Griffin, and Maxwell (1988) and Lisboa and Perantonis (1991) show that multiplicative interconnections within ANNs have been used in many applications, including invariant pattern recognition.
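
As a rough illustration of how such a unit trains, the sketch below (again illustrative Python, not the chapter's own algorithm) implements a sigma-pi unit, i.e., a weighted sum of products of selected inputs passed through a logistic sigmoid, together with one gradient-descent weight update on squared error:

```python
import numpy as np

def sigma_pi_forward(x, weights, index_sets):
    """Sigma-pi unit: each weight multiplies the product of a chosen
    subset of inputs (the 'pi' part); the weighted sum (the 'sigma'
    part) is passed through a logistic sigmoid."""
    pi_terms = np.array([np.prod(x[idx]) for idx in index_sets])
    y = 1.0 / (1.0 + np.exp(-(weights @ pi_terms)))
    return y, pi_terms

def sigma_pi_update(weights, y, target, pi_terms, lr=0.1):
    """One gradient-descent step on squared error; y * (1 - y) is the
    sigmoid derivative, as in standard backpropagation."""
    delta = (target - y) * y * (1.0 - y)
    return weights + lr * delta * pi_terms

x = np.array([1.0, 0.5, -0.2])
index_sets = [[0], [1], [2], [0, 1], [1, 2]]  # degree-1 and degree-2 terms
w = 0.1 * np.random.default_rng(0).normal(size=len(index_sets))
y, pi_terms = sigma_pi_forward(x, w, index_sets)
w = sigma_pi_update(w, y, target=1.0, pi_terms=pi_terms)
```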

Others suggest groups of individual neurons (Willcox, 1991; Hu and Pan, 1992). ANNs can approximate any nonlinear function to any degree of accuracy (Hornik, 1991; Leshno, 1993).

Zhang, Fulcher, and Scofield (1997) show that ANN groups offer superior performance compared with single ANNs when dealing with discontinuous and non-smooth piecewise nonlinear functions. Compared with the Polynomial Higher Order Neural Network (PHONN) and the Trigonometric Higher Order Neural Network (THONN), the Neural Adaptive Higher Order Neural Network (NAHONN) offers more flexibility and more accurate approximation, since its hidden-layer variables are adjustable (Zhang, Xu, and Fulcher, 2002). Zhang, Xu, and Fulcher (2002) also prove that NAHONN groups are capable of approximating any kind of piecewise continuous function to any degree of accuracy. In addition, these models can automatically select both the optimum model for a particular time series and the appropriate model order.
