Higher Order Neural Networks for Symbolic, Sub-symbolic and Chaotic Computations

João Pedro Neto
DOI: 10.4018/978-1-61520-711-4.ch002

Abstract

This chapter deals with discrete and recurrent artificial neural networks built from a homogeneous type of neuron. With this architecture we show how to perform symbolic computations by executing high-level programs within the network dynamics. Next, using higher order synaptic connections, it is possible to integrate common sub-symbolic learning algorithms into the same architecture. Finally, taking advantage of the chaotic properties of dynamical systems, we present some uses of chaotic computations with the same neurons and synapses, thus creating a hybrid system of three computation types.
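As a point of reference, a higher order (sigma-pi) neuron extends the usual weighted sum with products of presynaptic activations. The following is a minimal sketch of one such update, assuming the saturated-linear activation common in this line of work; all names are illustrative, not the chapter's own code.

```python
def sat(x):
    """Saturated-linear activation: clamp x to [0, 1]."""
    return max(0.0, min(1.0, x))

def higher_order_update(state, weights, bias=0.0):
    """One update of a single higher-order neuron (illustrative).

    `weights` maps tuples of presynaptic indices to coefficients:
    a key (i,) is an ordinary first-order synapse, while a key
    (i, j) is a second-order synapse multiplying two activations.
    """
    total = bias
    for idxs, w in weights.items():
        prod = 1.0
        for i in idxs:
            prod *= state[i]
        total += w * prod
    return sat(total)

# one first-order synapse plus one second-order synapse:
print(higher_order_update([0.5, 1.0], {(0,): 0.5, (0, 1): 1.0}))  # 0.75
```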

Background

Although the seminal paper by McCulloch & Pitts (1943) introduced neural networks as logic computing devices, symbolic computation is not a common thread in the neural network literature. There are, however, some articles and books that focus on symbolic capabilities (Gruau, 1995; Siegelmann, 1996; Carnell, 2007).

The jannet system (see Gruau (1995) for details) introduces a dialect of Pascal with some parallel constructs. This algorithmic description is translated, through several automated steps (first into a tree-like data structure and then into a low-level code, named cellular code), to produce a non-homogeneous neural network (there are four different neuron types) able to perform the required computations. In jannet, a neuron is activated only when all its synapses have transferred their values. Since this may not occur at the same instant, the global dynamics is not synchronous. Special attention is given to design automation of the final network architecture.
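As an illustration of that firing condition only (jannet's actual semantics are given in Gruau (1995)), the sketch below delivers synaptic values into per-neuron inboxes and activates a neuron only once its whole fan-in has arrived; all names are hypothetical.

```python
def dataflow_fire(inboxes, fan_in, activation):
    """Fire every neuron whose inputs have all arrived (illustrative).

    `inboxes[n]` holds the values already transferred to neuron n;
    the neuron activates only when len(inboxes[n]) == fan_in[n],
    so different neurons fire at different instants and the global
    dynamics is not synchronous.
    """
    fired = {}
    for n, box in inboxes.items():
        if len(box) == fan_in[n]:
            fired[n] = activation(sum(box))
            box.clear()
    return fired
```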

The neural language project nil, outlined in Siegelmann (1993) and Siegelmann (1996), is also able to perform symbolic computations: its constructs are compiled into an appropriate neural net (using the same homogeneous neural architecture presented herein). It has a rich set of data types, from boolean and scalar types to lists, stacks and sets, which are kept inside a single neuron using fractal coding.
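One way to picture such fractal coding is the base-4, Cantor-like stack encoding from Siegelmann and Sontag's construction (nil's precise coding is detailed in Siegelmann (1993, 1996)): a binary stack b1 b2 ... bk is stored in a single neuron state q = (2*b1+1)/4 + (2*b2+1)/16 + ..., so pushing and popping are affine operations on q. The sketch below is illustrative and omits the emptiness test of the full construction.

```python
def push(q, bit):
    """Push a bit: shift the encoding right and prepend (2*bit+1)/4."""
    return q / 4.0 + (2 * bit + 1) / 4.0

def top(q):
    """Top bit is 1 iff q >= 1/2 (encodings never fall in [1/2, 3/4))."""
    return 1 if q >= 0.5 else 0

def pop(q):
    """Remove the top bit, recovering the encoding of the rest."""
    return 4.0 * q - (2 * top(q) + 1)

q = 0.0          # empty stack encodes 0
q = push(q, 1)   # q = 3/4
q = push(q, 0)   # q = 7/16
assert top(q) == 0
q = pop(q)       # q = 3/4 again
assert top(q) == 1
```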
