Fifty years of Electronic Hardware Implementations of First and Higher Order Neural Networks

David R. Selviah (University College London, UK) and Janti Shawash (University College London, UK)
DOI: 10.4018/978-1-61520-711-4.ch012

Abstract

This chapter celebrates 50 years of first and higher order neural network (HONN) implementations in terms of the physical layout and structure of electronic hardware, which offers high speed, low latency, compact, low cost, low power, mass produced systems. Low latency is essential for practical applications in real time control for which software implementations running on CPUs are too slow. The literature review chapter traces the chronological development of electronic neural networks (ENN) discussing selected papers in detail from analog electronic hardware, through probabilistic RAM, generalizing RAM, custom silicon Very Large Scale Integrated (VLSI) circuit, Neuromorphic chips, pulse stream interconnected neurons to Application Specific Integrated circuits (ASICs) and Zero Instruction Set Chips (ZISCs). Reconfigurable Field Programmable Gate Arrays (FPGAs) are given particular attention as the most recent generation incorporate Digital Signal Processing (DSP) units to provide full System on Chip (SoC) capability offering the possibility of real-time, on-line and on-chip learning.

2. Chronological Review of Electronic Hardware Implementations of ANNs and HONNs

1950s

After several years studying biological neurons, culminating in a paper in Nature on lateral inhibition and adaptation in the mammalian retina, Dr Wilfred K. Taylor published the first paper discussing how pattern recognition could be performed in analog electronic hardware (Taylor, 1959). His circuit diagrams show arrays of parallel differential amplifiers, built in those days from thermionic valves together with resistors. In the same paper he introduces the maximum amplitude filter, later called the Maxnet, which outputs the maximum value of several inputs, and describes parallel processing machines for recognizing letters, numbers, and other simple patterns in various positions and orientations, with different contrast and against a noisy background.
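Taylor's maximum amplitude filter was realized in analog valve circuitry, but its behavior is easiest to see in the discrete-time Maxnet formulation used in later neural network texts: each node repeatedly inhibits every other node by a small factor until only the node with the largest initial activation remains positive. The sketch below is illustrative, not Taylor's circuit; the function name and the inhibition constant `epsilon` (which must be less than 1/(n-1) for n nodes to guarantee the winner survives) are choices made here for the example.

```python
def maxnet(inputs, epsilon=0.1, max_iters=1000):
    """Iterative winner-take-all (Maxnet): each node subtracts a small
    fraction of the other nodes' activations each step, so every node
    except the one with the largest input is driven to zero."""
    x = list(map(float, inputs))
    for _ in range(max_iters):
        total = sum(x)
        # lateral inhibition: clip at zero, as a neuron cannot go negative
        x = [max(0.0, xi - epsilon * (total - xi)) for xi in x]
        if sum(1 for xi in x if xi > 0) <= 1:
            break  # a single winner remains
    return x

# only the node with the largest input (index 2) stays positive
print(maxnet([0.2, 0.5, 0.9, 0.4]))
```

Because every node applies the same inhibition rule in parallel, the circuit selects the maximum without any central comparator, which is what made it attractive for the parallel hardware Taylor describes.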
