A Hybrid Higher Order Neural Structure for Pattern Recognition


Mehdi Fallahnezhad, Salman Zaferanlouei
DOI: 10.4018/978-1-4666-2175-6.ch017

Abstract

Considering higher order correlations of selected features alongside the raw input features can facilitate target pattern recognition. In artificial intelligence, this is addressed by Higher Order Neural Networks (HONNs). In general, HONN structures offer advantages over traditional neural networks: they resolve the dilemma of choosing the number of neurons and layers, fit data better, train faster, and behave as open-box models. This chapter introduces a hybrid structure of higher order neural networks that can be applied broadly across branches of pattern recognition. The structure, learning algorithm, and network configuration are introduced, and the structure is applied either as a classifier (where it is called HHONC) to different benchmark statistical data sets or as a functional-behavior approximator (where it is called HHONN) to a heat and mass transfer problem. In each case, results are compared with previous studies and show superior performance in addition to the other advantages mentioned.

Introduction

Among the many different structures of Artificial Neural Networks (ANNs), backpropagation networks (BPNs; Rumelhart, et al., 1986) are the most widely used across application areas. Although these networks are recognized for their excellent performance in mathematical function approximation, nonlinear classification, pattern matching, and pattern recognition, they have several limitations. They do not excel at mining and learning from discontinuous or non-smooth data, where small changes in inputs cause large changes in outputs (Fulcher, et al., 2006; Zhang, et al., 2002). In addition, they do not deal well with incomplete or noisy data (Dong, 2007; Peng & Zhu, 2007; Wang, 2003). A traditional ANN with simple first-order connections and no hidden layer can only map inputs to outputs linearly. Single-hidden-layer Feedforward Neural Networks (FNNs) are known for their ability to approximate functions and learn pattern recognition, but they cannot realize every nonlinear mapping. FNNs with more hidden layers are known for slow convergence and often become stuck in local minima rather than reaching the global minimum.

To overcome these limitations of traditional neural networks, Higher Order Neural Networks (HONNs) can be considered. By combining inputs nonlinearly, HONNs capture higher order correlations among the inputs (Zurada, 1992). By removing the need for hidden layers, HONN structures are simpler than FNN structures, and initialization of the learning parameters (weights) becomes less challenging. In particular, the method presented in this chapter requires no hidden layers (and consequently no hidden neurons), which eases the process of finding an appropriate network structure. In addition, unlike a BPN, a HONN can provide an efficient open-box model of a nonlinear input-output mapping, which makes the learned relationship easier to interpret. Moreover, HONNs often run faster than FNNs. Implementations of two-input and three-input XOR functions using a Second Order Neural Network (SONN) showed that the SONN is several times faster than an FNN (Gupta, et al., 2009; Zhang, 2009).
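As a minimal illustration of the idea (our sketch, not code from the chapter; the helper name `expand` and all hyperparameters are assumptions), the XOR mapping, which a single-layer first-order network cannot learn, becomes linearly separable once a second-order product term x1*x2 is appended to the input. A single output neuron over the expanded input can then be trained with plain gradient descent:

```python
import numpy as np

# XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def expand(X):
    """Second-order expansion: [x1, x2, x1*x2] (hypothetical helper)."""
    return np.column_stack([X, X[:, 0] * X[:, 1]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
Xe = expand(X)                      # no hidden layer: one neuron on expanded input
w = rng.normal(scale=0.1, size=Xe.shape[1])
b = 0.0
lr = 1.0

# Full-batch gradient descent on the logistic (cross-entropy) loss
for _ in range(2000):
    p = sigmoid(Xe @ w + b)
    grad = p - y                    # dL/dz for the logistic loss
    w -= lr * Xe.T @ grad / len(y)
    b -= lr * grad.mean()

pred = (sigmoid(Xe @ w + b) > 0.5).astype(int)
print(pred.tolist())                # expect [0, 1, 1, 0]
```

In the expanded space the target is reachable because, for example, z = 4*x1 + 4*x2 - 8*x1*x2 - 2 is positive exactly on the XOR-true inputs; the first-order terms alone admit no such separating hyperplane.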

In the following sections, we introduce a hybrid higher order structure, first proposed in classifier form in Fallahnezhad et al. (2011) and later developed into a more general form. We discuss its pros and cons either as a classifier (where it is called HHONC) on the Iris, breast cancer, Wine recognition, Glass identification, Balance scale, and Pima diabetes data sets, or as a functional-behavior approximator (where it is called HHONN) on a heat and mass transfer problem.
