Data Classification Using Ultra-High Frequency SINC and Trigonometric Higher Order Neural Networks

DOI: 10.4018/978-1-7998-3563-9.ch007

Abstract

This chapter develops a new nonlinear model, the ultra high frequency sinc and trigonometric higher order neural network (UNT-HONN), for data classification. UNT-HONN includes the ultra high frequency sinc and sine higher order neural network (UNS-HONN) and the ultra high frequency sinc and cosine higher order neural network (UNC-HONN). Data classification using the UNS-HONN and UNC-HONN models is tested. Results show that the UNS-HONN and UNC-HONN models are more accurate than polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since the UNS-HONN and UNC-HONN models can classify data with errors approaching 10⁻⁶.
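
To make the model family concrete, the sketch below illustrates the general form of an open-box HONN output as a weighted sum of products of basis terms, here a sinc-based term in one input and a sine-based term in the other, in the spirit of UNS-HONN. This is a minimal sketch under assumed basis definitions and weight layout; the function name, the order, and the exact exponents are illustrative and not the chapter's exact formulation.

```python
import numpy as np

def unshonn_output(x, y, weights, order=6):
    """Hedged sketch of an UNS-HONN-style output: a weighted sum of
    products of sinc**k and sin**j basis terms. The published model's
    exact basis functions and weight structure may differ."""
    z = 0.0
    for k in range(1, order + 1):
        for j in range(1, order + 1):
            # np.sinc(t) = sin(pi*t)/(pi*t), so this is (sin(kx)/(kx))**k.
            sinc_term = np.sinc(k * x / np.pi) ** k
            sine_term = np.sin(j * y) ** j
            z += weights[k - 1, j - 1] * sinc_term * sine_term
    return z

# Example usage with random weights (illustrative only).
rng = np.random.default_rng(0)
w = rng.normal(size=(6, 6))
print(unshonn_output(0.3, 0.7, w))
```
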
Chapter Preview

Introduction

The contributions of this chapter are to:

  • Introduce the background of HONNs and their applications in the classification area.

  • Develop new HONN models called UNS-HONN and UNC-HONN.

  • Provide the UNS-HONN and UNC-HONN learning algorithm and weight update formulae (a generic sketch follows this list).

  • Compare the UNS-HONN and UNC-HONN models with other HONN models.

  • Apply the UNS-HONN and UNC-HONN models to ultra-high frequency data.
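
As a hedged illustration of the kind of weight update the chapter later derives, the sketch below applies ordinary gradient descent on a squared-error loss to the output-layer weights of the basis-product model sketched earlier. The learning rate, loss, and delta-rule update here are generic assumptions, not the chapter's exact UNS-HONN/UNC-HONN formulae.

```python
import numpy as np

def basis_matrix(x, y, order=6):
    """Products of sinc**k and sin**j basis terms for one (x, y) pair
    (same illustrative basis as the earlier sketch)."""
    B = np.empty((order, order))
    for k in range(1, order + 1):
        for j in range(1, order + 1):
            B[k - 1, j - 1] = (np.sinc(k * x / np.pi) ** k) * (np.sin(j * y) ** j)
    return B

def train_output_weights(samples, targets, order=6, lr=0.01, epochs=200):
    """Gradient descent on squared error for the output weights only.
    This is a generic HONN-style delta rule, not the chapter's algorithm."""
    w = np.zeros((order, order))
    for _ in range(epochs):
        for (x, y), d in zip(samples, targets):
            B = basis_matrix(x, y, order)
            z = np.sum(w * B)      # network output
            err = d - z            # desired minus actual output
            w += lr * err * B      # delta rule: -dE/dw = (d - z) * B
    return w

# Illustrative usage on a toy two-class problem (labels -1 and +1).
data = [(0.1, 0.2), (0.8, 0.9)]
labels = [-1.0, 1.0]
w = train_output_weights(data, labels)
```
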

This chapter is organized as follows: the Background section gives background knowledge of HONNs and HONN applications in the classification area. The HONN Models section introduces the UNS-HONN and UNC-HONN structures. The Update Formula section provides the UNS-HONN and UNC-HONN model update formulae, learning algorithms, and convergence theories of HONNs. The Test section describes UNS-HONN and UNC-HONN testing results in the data classification area. Conclusions are presented in the last section.


Background

Artificial neural networks (ANNs) have been widely used in classification. Lippman (1989) studies pattern classification using neural networks. Moon and Chang (1994) study classification and prediction of the critical heat flux using fuzzy clustering and artificial neural networks. Lin and Cunningham (1995) develop a new approach to fuzzy-neural system modelling. Behnke and Karayiannis (1998) present competitive neural trees for pattern classification. Bukovsky, Bila, Gupta, Hou, and Homma (2010) provide a foundation and classification of nonconventional neural units and a paradigm of non-synaptic neural interaction.

Artificial higher order neural network models have been widely used for pattern recognition, with the benefit that HONNs are open-box models (Bishop, 1995; Park, Smith, & Mersereau, 2000; Spirkovska & Reid, 1992, 1994; Zhang, Xu, & Fulcher, 2007). Shin and Ghosh (1991) introduce a novel feedforward network called the pi-sigma network. This network utilizes product cells as the output units to indirectly incorporate the capabilities of higher order networks while using fewer weights and processing units. The pi-sigma network is an efficient higher order neural network for pattern classification and function approximation. Linhart and Dorffner (1992) present a self-learning visual pattern explorer and recognizer using a higher order neural network, which is built into a pattern recognition system that autonomously learns to categorize and recognize patterns independently of their position in an input image. Schmidt and Davis (1993) explore alternatives that reduce the number of network weights while preserving the pattern recognition properties of various feature spaces for higher order neural networks. Spirkovska and Reid (1993) describe coarse-coded higher order neural networks for PSRI object recognition. The authors describe a coarse coding technique and present simulation results illustrating its usefulness and its limitations. Simulations show that a third-order neural network can be trained to distinguish between two objects in images of 4096×4096 pixels. Wan and Sun (1996) show that higher order neural networks (HONNs) have numerous advantages over other translational, rotational, and scaling invariant (TRSI) pattern recognition techniques for automatic target recognition. Morad and Yuan (1998) present a method for automatic model building from multiple images of an object to be recognized. The model contains knowledge computed during the learning phase from large 2D images of an object, for automatic model building and 3D object recognition. A neuro-based adaptive higher order neural network model has been developed by Zhang, Xu, and Fulcher (2002) for data model recognition. Voutriaridis, Boutalis, and Mertzios (2003) propose ridge polynomial networks for pattern recognition. Ridge polynomial networks (RPNs) are a special class of higher order neural networks with the ability of higher order neural networks to perform shift- and rotation-invariant recognition.
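
For concreteness, the sketch below shows the forward pass of a single pi-sigma unit in the spirit of Shin and Ghosh (1991): K linear summing units whose outputs are multiplied together, so a degree-K polynomial response is obtained with only K weight vectors rather than one weight per input combination. The variable names and the choice of a sigmoid output are illustrative assumptions, not the original paper's exact specification.

```python
import numpy as np

def pi_sigma_forward(x, W, b):
    """Forward pass of a K-th order pi-sigma unit.
    x: input vector of shape (d,)
    W: weight matrix of shape (K, d), one row per summing unit
    b: bias vector of shape (K,)
    The K linear sums are multiplied (the 'pi' stage), giving a degree-K
    polynomial in the inputs with only K*(d+1) adjustable weights."""
    sums = W @ x + b                     # sigma stage: K linear combinations
    net = np.prod(sums)                  # pi stage: product of the K sums
    return 1.0 / (1.0 + np.exp(-net))    # sigmoid output (illustrative)

# Example: a third-order pi-sigma unit on a 5-dimensional input.
rng = np.random.default_rng(1)
x = rng.normal(size=5)
W = rng.normal(size=(3, 5))
b = np.zeros(3)
print(pi_sigma_forward(x, W, b))
```
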

Key Terms in this Chapter

HONN: Artificial higher order neural network.

UNS-HONN: Ultra high frequency sinc and sine higher order neural networks.

UCSHONN: Artificial ultra-high frequency trigonometric higher order neural network.

THONN: Artificial trigonometric higher order neural network.

SPHONN: Artificial sigmoid polynomial higher order neural network.

PHONN: Artificial polynomial higher order neural network.

SSINCHONN: Artificial sine and sinc higher order neural network.

UNT-HONN: Ultra high frequency sinc and trigonometric higher order neural networks.

UNC-HONN: Ultra high frequency sinc and cosine higher order neural networks.
