Symbolic Function Network: Theory and Implementation

George S. Eskander (ETS, Quebec University, Canada) and Amir Atiya (Cairo University, Egypt)
DOI: 10.4018/978-1-4666-2175-6.ch014


This chapter reviews a recent HONN-like model called the Symbolic Function Network (SFN). This model is designed with the goal of imparting more flexibility than both traditional neural networks and HONNs. The main idea behind this scheme is that different functional forms suit different applications and that no single architecture is best for all of them. Accordingly, the model is designed as an evolving network that can discover the best functional basis, adapt its parameters, and select its structure simultaneously. Despite the high modeling capability of SFN, it can be considered a starting point for developing more powerful models. This chapter aims to open a door for researchers to propose new formulations and techniques that impart more flexibility and result in sparser and more accurate models. The theoretical basis of SFN is discussed, and the model's optimization computations are illustrated in detail to enable researchers to implement and test the model easily.
Chapter Preview

1. Introduction

The Symbolic Function Network (SFN) is designed with the goal of imparting more flexibility than traditional neural networks (Eskander & Atiya, 2009). It is related to Higher Order Neural Networks (HONNs) in the sense that both aim to model complex nonlinearities as early as possible, thereby avoiding the need for dense connectivity between simple network inputs (like the first-order elements in ordinary NNs) by using more complex network inputs (like the higher-order elements in HONNs). The difference between traditional HONNs and SFNs lies in the methodology used to achieve this aim.

In HONNs, the complex correlations among system inputs are modeled early in the network, so that simple synaptic and somatic activations suffice to model the complex system. In this case, the modeling operation starts by discovering the correlations between either linear or simple nonlinear terms (powers) of the system inputs, and then applies simple synaptic and somatic activations to these complex terms. In SFN, by contrast, the design operation starts by learning suitable, and somewhat complex, somatic activations that act on individual system inputs; the correlations between these complex terms are then modeled by cascading them in a tree-based network structure. The wider range of available activation functions and the evolutionary structure of SFN are expected to permit higher modeling power with a sparser structure than traditional NNs and even HONNs.

The symbolic function network can be represented as a tree-based neural network. The terminals of the tree are the features most relevant to the system output, and the weights of the neural tree are the parameters that determine its overall functionality. Through the training phase, the symbolic function that best fits the training data is discovered. Two main goals have to be achieved simultaneously: goodness of fit and sparsity. A sparse representation of a system is motivated by two aims: achieving good generalization performance, and achieving a simple and concise representation of the modeled system that results in fast recall.
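To make the tree representation concrete, the following is a minimal sketch of how such a symbolic neural tree might be evaluated: terminals return raw input features, and internal nodes apply a parameterized elementary activation to a weighted sum of their children. The node layout, field names, and the guarded logarithm are illustrative assumptions, not the exact SFN formulation from the chapter.

```python
import math

def eval_node(node, x):
    """Recursively evaluate a symbolic tree node on input vector x.
    Illustrative sketch only; node schema is a hypothetical assumption."""
    if node["kind"] == "input":                 # terminal: a raw feature
        return x[node["index"]]
    children = [eval_node(c, x) for c in node["children"]]
    # Weighted combination of child outputs (the "synaptic" part).
    z = sum(w * c for w, c in zip(node["weights"], children)) + node["bias"]
    f = node["func"]                            # elementary somatic activation
    if f == "exp":
        return math.exp(z)
    if f == "log":
        return math.log(abs(z) + 1e-12)         # domain-guarded logarithm
    if f == "pow":
        return math.copysign(abs(z) ** node["exponent"], z)
    return z                                    # identity / linear node

# A tiny tree computing exp(0.5 * x0) + 2 * x1 at the root:
tree = {
    "kind": "op", "func": "linear", "bias": 0.0, "weights": [1.0, 2.0],
    "children": [
        {"kind": "op", "func": "exp", "bias": 0.0, "weights": [0.5],
         "children": [{"kind": "input", "index": 0}]},
        {"kind": "input", "index": 1},
    ],
}
print(eval_node(tree, [0.0, 1.0]))   # exp(0) + 2*1 = 3.0
```

Training such a tree would then amount to fitting the weights, biases, and exponents while also searching over the tree structure itself, which is the dual goal (goodness of fit plus sparsity) described above.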

Although the concept of representing systems in a flexible, evolving symbolic form is realized in the SFN scheme, we believe that this realization has only opened the door to the design of more reliable models. In SFN, a small set of elementary functions, such as powers, the exponential function, and the logarithm, is used as building blocks. However, a much larger pool of basic functions could be investigated instead. Moreover, the tree structure of the SFN scheme could be replaced by various evolving network structures. Finally, the forward-backward evolving mechanisms and the steepest-descent-based optimization methods used in building the Symbolic Function Network are just samples from a huge pool of candidate methodologies.
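The idea of an extensible pool of elementary basis functions can be sketched as below. The chapter names powers, exponentials, and logarithms as SFN building blocks; the parameterized forms, domain guards, and the extra trigonometric entries here are illustrative assumptions showing how the pool could be enlarged.

```python
import math

# Hypothetical pool of parameterized elementary basis functions.
# Each entry maps a scalar input x and a parameter a to an output.
BASIS_POOL = {
    "power": lambda x, a: math.copysign(abs(x) ** a, x),   # signed power
    "exp":   lambda x, a: math.exp(a * x),
    "log":   lambda x, a: math.log(abs(a * x) + 1e-12),    # domain-guarded
    # A larger pool could be investigated, e.g.:
    "sin":   lambda x, a: math.sin(a * x),
    "tanh":  lambda x, a: math.tanh(a * x),
}

def evaluate_basis(name, x, a):
    """Apply the named basis function with parameter a to input x."""
    return BASIS_POOL[name](x, a)

print(evaluate_basis("exp", 1.0, 0.0))     # exp(0) = 1.0
print(evaluate_basis("power", 2.0, 2.0))   # 2^2 = 4.0
```

Keeping the pool as a simple mapping makes it easy to add new candidate functions; a constructive search would then select, for each node, the basis function and parameter that most reduce the training error.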

In this chapter, we introduce some of the topics related to the SFN model and some implementations of them. We first present a survey of the literature related to this work. Then, we discuss evolutionary neural network techniques, as the SFN model has some aspects of evolving construction. A special case of such evolving networks is the tree-based network; we present some models that follow this type of structure. Next, the model's implementation details are discussed and clearly explained. These details include the elementary functions used in the model, the network structure, the model notation and computations, the network optimization techniques, and the constructive algorithms used to build the network. Finally, the power of SFN in system modeling is investigated by considering some approximation, regression, and time series prediction problems.
