Neural Networks and Their Accelerated Evolution From an Economic Analysis Perspective

Copyright: © 2018 | Pages: 16
DOI: 10.4018/978-1-5225-2255-3.ch571

Abstract

In 1943, the neuropsychologist Warren McCulloch and the mathematician Walter Pitts published the paper “A Logical Calculus of the Ideas Immanent in Nervous Activity”, establishing the foundations of neural networks. The transition from the biological neuron to the artificial one, from the perceptron to the multi-layer perceptron, from Hopfield networks to Kohonen networks, from bi-directional associative memories to Boltzmann machines, and from radial basis functions to Hamming networks is strong proof of the long journey in the study of neural networks. Drawing on the characteristics of a neural network, the chapter includes an overview of work in neural computing, namely a taxonomy of neural networks based on a number of criteria. The chapter presents the conclusions of the research and, at the end, highlights possible directions for further research.
Chapter Preview

Background

The history of neural networks can be divided into five stages: the beginning of neural networks; the golden age; the quiet years; the years of renewed enthusiasm, marked by the interaction between biological experimentation, modelling, computer simulation, and hardware implementation; and the fifth stage, permanent development.

1940-1950: The Beginning of Neural Networks

In 1943 the neuropsychologist Warren McCulloch and the mathematician Walter Pitts published the paper “A Logical Calculus of the Ideas Immanent in Nervous Activity”, laying the foundations of neural networks. The first precursors of computers were conceived as true electronic brains; among their proponents was Konrad Zuse, who had previously calculated ballistic trajectories by hand. In 1941, at the German Institute for Aviation Research in Berlin, Konrad Zuse (1993) designed the Z3, an electromechanical computer that was the first fully automated programmable computing machine; it was used to perform statistical analyses of wing vibrations.

Warren McCulloch and Walter Pitts (1947) indicated a practical field of application for neural networks: the recognition of spatial patterns.

Other researchers, such as Norbert Wiener and John von Neumann, showed that studying the design of the human brain by means of computers could be a very promising direction.

In 1949, Donald Hebb wrote The Organization of Behaviour and showed that a neuronal connection becomes stronger the more it is used, a fundamental concept for the learning process of a network. Moreover, Hebb developed the rule that bears his name, which underlies almost all neuronal learning procedures. Hebb could only postulate this rule, since the neurological research that alone could have confirmed it was not yet available.
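
Hebb's rule is commonly written as Δw = η · x · y: the change in a connection weight is proportional to the product of the presynaptic activity x and the postsynaptic activity y, scaled by a learning rate η. As an illustration only, here is a minimal Python sketch of such an update; the learning rate, layer sizes, and activity values are arbitrary choices of the sketch, not taken from the chapter.

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Hebb's rule: strengthen w[i, j] in proportion to the
    co-activity of presynaptic x[j] and postsynaptic y[i]."""
    return w + eta * np.outer(y, x)

# Toy example (illustrative values): 3 inputs, 2 outputs.
w = np.zeros((2, 3))
x = np.array([1.0, 0.0, 1.0])   # presynaptic activity
y = np.array([1.0, 0.0])        # postsynaptic activity
w = hebbian_update(w, x, y)
print(w)  # only w[0, 0] and w[0, 2] are strengthened
```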

The neuropsychologist Karl Lashley (1950) argued that the brain stores information as a distributed system, a thesis based on his experiments with rats.

Key Terms in this Chapter

Arbor: Usually used in the context of a dendritic arbor, the tree-like structure associated with dendritic branching.

Soma: The cell body.

Dendrite: One of the branching fibres of a neuron, which convey input information via PSPs.

Action Potential: The stereotypical voltage spike that constitutes an active output from a neuron. Action potentials are propagated along the axon to other neurons.

Synapse: The site of physical and signal contact between neurons. On receipt of an action potential at the axon terminal of a synapse, neurotransmitter is released into the synaptic cleft and propagates to the postsynaptic membrane. There it undergoes chemical binding with receptors, which, in turn, initiates the production of a postsynaptic potential (PSP).

Potential Difference: The voltage difference across the cell membrane.

Feed-Forward Neural Networks (Where the Signal Propagates Forward Only): Those networks in which the neurons of a layer are interconnected with those of the next layer, whereas the neurons of the same layer are not connected.
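
As an illustration of this definition, here is a minimal Python sketch of a feed-forward pass; the layer sizes, random weights, and ReLU activation are arbitrary choices of the sketch, not specified by the chapter.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Hypothetical layer sizes: 3 inputs -> 4 hidden -> 2 outputs.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # input layer -> hidden layer
W2 = rng.normal(size=(2, 4))   # hidden layer -> output layer

def forward(x):
    """Signal flows strictly from one layer to the next; there are
    no connections within a layer and no feedback paths."""
    h = relu(W1 @ x)
    return relu(W2 @ h)

print(forward(np.array([1.0, 0.5, -0.2])))
```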

PSP: Postsynaptic Potential. The change in membrane potential brought about by activity at a synapse.

Axon: The fibre that emanates from the neuron cell body or soma and that conducts action potentials to other neurons.

Recurrent Neural Networks (Feedback): Artificial neural networks of the forward type in which the outputs are connected back to the inputs.
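
A minimal Python sketch of this feedback idea, in which the previous output is connected back to the inputs at each step; the sizes, random weights, and tanh activation are arbitrary choices of the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
W_in  = rng.normal(size=(4, 3))   # input -> state
W_rec = rng.normal(size=(4, 4))   # previous output fed back in

def step(x, h_prev):
    """One recurrent step: the previous output h_prev is connected
    back to the inputs alongside the new input x."""
    return np.tanh(W_in @ x + W_rec @ h_prev)

# Feed a short input sequence; the state carries feedback over time.
h = np.zeros(4)
for x in [np.array([1.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 0.0])]:
    h = step(x, h)
print(h)
```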

Receptor Sites: The sites on the postsynaptic membrane to which molecules of neurotransmitter bind. This binding initiates the generation of a PSP.

Presynaptic Membrane: That part of a synapse which is located on the axon terminal.

Feedback Neural Networks: Those networks in which the output neurons can be interconnected with those of the input layer, thereby giving rise to an iterative process.
