Artificial Neural Networks Tutorial

Crescenzio Gallo
Copyright: © 2015 |Pages: 10
DOI: 10.4018/978-1-4666-5888-2.ch626

Chapter Preview



Artificial neural networks (Bandy, 1997; Haykin, 1999) are information processing structures that capture the (often unknown) relationship between input and output data (Honkela, Duch, & Girolami, 2011) by artificially simulating the physiological structure and functioning of the human brain.

A natural neural network, by contrast, consists of a very large number of nerve cells (about ten billion in humans), called neurons, linked together in a complex network. Intelligent behavior emerges from the extensive interaction between these interconnected units. The input of a neuron is composed of the output signals of the neurons connected to it. When the combined contribution of these inputs exceeds a certain threshold, the neuron, through a suitable transfer function, generates a bioelectric signal that propagates across the synapses to other neurons.
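The behavior just described — a weighted sum of inputs compared against a threshold, passed through a transfer function — can be sketched as a simple artificial neuron. The weights, threshold, and inputs below are illustrative assumptions, and a step function stands in for the transfer function:

```python
# Minimal sketch of an artificial neuron: a weighted sum of inputs is
# compared against a threshold, and a step transfer function decides
# whether the neuron "fires" (1) or stays silent (0).
# Weights, threshold, and inputs are illustrative assumptions.

def neuron_output(inputs, weights, threshold):
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation > threshold else 0

inputs = [1, 0, 1]
weights = [0.5, -0.6, 0.8]
print(neuron_output(inputs, weights, threshold=1.0))  # 0.5 + 0.8 = 1.3 > 1.0, so prints 1
```

With these inputs the active connections contribute 0.5 + 0.8 = 1.3, which exceeds the threshold of 1.0, so the neuron fires.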

Significant features of this network, which artificial neural models intend to simulate, are:

  • Parallel processing: neurons process information simultaneously;

  • The twofold function of the neuron, which acts simultaneously as a memory and a signal processor;

  • The distributed nature of data representation: knowledge is spread throughout the network, not circumscribed or predetermined;

  • The network’s ability to learn from experience.

This last, fundamental capacity enables neural networks to self-organize, adapt to new incoming information, and extract from known examples the input-output relationships that underlie their organization. An artificial neural network captures this ability in an appropriate “learning” stage.

Despite the great success achieved by artificial neural networks, it is better to remain aware of the limits of this technology, which stem from the necessary simplification of the real system being modeled.


Structure Of A Neural Network

Artificial neural networks are composed of elementary computational units called neurons (McCulloch & Pitts, 1943), combined according to different architectures: for example, they can be arranged in layers (multi-layer network), or they may follow other connection topologies. Layered networks consist of:

  • Input layer, made of n neurons (one for each network input);

  • Hidden layer, composed of one or more hidden (or intermediate) layers consisting of m neurons;

  • Output layer, consisting of p neurons (one for each network output).

The connection mode distinguishes two types of architecture:

  • The feedback (recurrent) architecture, in which connections may also reach neurons of the same or a previous layer;

  • The feedforward architecture (Hornik, Stinchcombe, & White, 1989), without feedback connections (signals travel only toward the next layer’s neurons).
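A forward pass through such a layered feedforward network — n input, m hidden, and p output neurons — can be sketched as follows. The layer sizes, the random weights, and the sigmoid transfer function are illustrative assumptions:

```python
# Sketch of a forward pass in a layered feedforward network:
# signals travel only from the input layer to the hidden layer,
# and from the hidden layer to the output layer.
import math
import random

random.seed(0)  # reproducible illustrative weights

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    # Each neuron computes the sigmoid of a weighted sum of its inputs.
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws))) for ws in weights]

n, m, p = 3, 4, 2  # input, hidden, output layer sizes (illustrative)
w_hidden = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(m)]
w_output = [[random.uniform(-1, 1) for _ in range(m)] for _ in range(p)]

x = [0.2, 0.7, 0.1]               # one input pattern (n values)
hidden = layer(x, w_hidden)       # input layer -> hidden layer
output = layer(hidden, w_output)  # hidden layer -> output layer
print(len(output))                # p = 2 output values
```

Because there are no feedback connections, one pass over the layers in order suffices to compute the network's output.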

Key Terms in this Chapter

Artificial Neural Network (ANN): An artificial neural network defines a mathematical model for the simulation of a network of biological neurons (e.g. the human nervous system). It simulates different aspects of the behavior and capacity of the human brain, such as intelligent information processing, distributed processing, a high level of parallelism, the faculty of learning, generalization and adaptation, and a high tolerance to inaccurate (or wrong) information.

Perceptron: A type of binary classifier that maps its inputs (a real-valued vector) to an output value (a real scalar). The perceptron may be considered the simplest model of feedforward neural network, as the inputs directly feed the output units through weighted connections.
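As a concrete illustration of this definition, a minimal sketch of a perceptron trained with the classic perceptron learning rule; the learning rate, epoch count, and training data (the logical AND function, which is linearly separable) are illustrative assumptions:

```python
# Minimal perceptron sketch: inputs directly feed the output unit
# through weighted connections, and weights are adjusted by the
# classic perceptron learning rule. Hyperparameters are illustrative.

def predict(weights, bias, x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train(samples, epochs=20, lr=0.1):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Logical AND is linearly separable, so a single perceptron can learn it.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # prints [0, 0, 0, 1]
```

Note that a single perceptron can only separate classes with a hyperplane; this is why linearly separable targets such as AND work, while XOR would not.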

Mathematical Model: A mathematical model is a model built using the language and tools of mathematics. It is often constructed with the aim of providing predictions about the future ‘state’ of a phenomenon or a system.

Synapse: The synapse (or synaptic junction) is a highly specialized structure that enables nerve cells (neurons) to communicate with each other or with other cells (muscle cells, sensory cells, or endocrine glands).

Neuron: The neuron is the cellular unit of nervous tissue, contributing (together with neuroglial and vascular tissue) to the formation of the nervous system.

Artificial Intelligence: The term “artificial intelligence” generally refers to the ability of a computer to perform functions and reasoning typical of the human mind. It covers the theory and techniques for developing algorithms that allow computers to exhibit intelligent ability and/or activity, at least in specific domains.

Learning: The acquisition of new, or modification of existing, knowledge, behaviors, skills, values, or preferences; it may involve synthesizing different types of information.
