An Event-Based Neural Network Architecture with Content Addressable Memory


Sivaganesan S, Maria Antony S, Udayakumar E
DOI: 10.4018/IJERTCS.2020010102

Abstract

A hybrid analog/digital very large-scale integration (VLSI) implementation of a spiking neural network with programmable synaptic weights was designed. The synaptic weight values are stored in an asynchronous content-addressable memory (CAM) module, which is interfaced to a fast current-mode event-driven DAC that produces synaptic currents with the appropriate amplitudes. The chip acts as a transceiver: it receives asynchronous events as input, performs neural computations on the input spikes with hybrid analog/digital circuits, and eventually produces digital asynchronous events as output. Input, output, and synaptic weight values are transmitted to and from the chip using a common communication protocol based on the address-event representation (AER). With this representation, the device can be interfaced to a workstation or a microcontroller, making it possible to explore the effect of different spike-timing-dependent plasticity (STDP) learning algorithms on the synaptic weight values stored in the CAM module.
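As a concrete illustration, the Python sketch below shows a pair-based STDP rule of the kind that could run on the host workstation and write updated weight values back into the CAM module. This is a minimal sketch, not the authors' implementation: the amplitudes, time constants, and weight bounds (A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS, W_MIN, W_MAX) are illustrative assumptions, and the spike times are assumed to come from time-stamped AER events.

import math

# Illustrative STDP parameters (assumed, not taken from the paper).
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU_PLUS = TAU_MINUS = 20.0     # STDP time constants in ms
W_MIN, W_MAX = 0.0, 1.0         # weight bounds matching a finite-resolution memory word

def stdp_update(w, t_pre, t_post):
    """Return the new weight after one pre/post spike pairing.

    t_pre, t_post: spike times in ms, taken from time-stamped AER events.
    """
    dt = t_post - t_pre
    if dt > 0:       # pre before post: potentiate
        w += A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:     # post before pre: depress
        w -= A_MINUS * math.exp(dt / TAU_MINUS)
    return min(max(w, W_MIN), W_MAX)  # clip to the storable weight range

# Example pairing: pre spike at 10 ms, post spike at 15 ms -> potentiation.
print(stdp_update(0.5, t_pre=10.0, t_post=15.0))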

1. Introduction

1.1 Biological Neuron

A neuron is a specialized type of cell found in the bodies of all eumetazoans; only sponges and a few other simple animals lack neurons. The features that define a neuron are electrical excitability and the presence of synapses, complex membrane junctions that transmit signals to other cells. The body's neurons, together with the glial cells that give them structural and metabolic support, constitute the nervous system. In vertebrates, the majority of neurons belong to the central nervous system, but some reside in peripheral ganglia, and many sensory neurons are situated in sensory organs such as the retina and cochlea (Myers et al., 2007).

The soma is usually compact; the axon and dendrites are filaments that extrude from it. Dendrites typically branch profusely, getting thinner with each branching, and extending their farthest branches a few hundred micrometers from the soma. The axon leaves the soma at a swelling called the axon hillock, and can extend for great distances, giving rise to hundreds of branches. Unlike dendrites, an axon usually maintains the same diameter as it extends. The soma may give rise to numerous dendrites, but never to more than one axon (Liu et al., 2010). Synaptic signals from other neurons are received by the soma and dendrites; signals to other neurons are transmitted by the axon. A typical synapse, then, is a contact between the axon of one neuron and a dendrite or soma of another. Synaptic signals may be excitatory or inhibitory. If the net excitation received by a neuron over a short period of time is large enough, the neuron generates a brief pulse called an action potential, which originates at the soma and propagates rapidly along the axon, activating synapses onto other neurons as it goes.
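The threshold behavior described above, net excitation charging the membrane until an action potential fires, is commonly captured by the leaky integrate-and-fire abstraction. The Python sketch below is a minimal illustration of that idea, not a model from the article; the membrane time constant, threshold, reset value, and input currents are assumed for demonstration.

# Illustrative membrane parameters in arbitrary units (assumed values).
V_REST, V_THRESH, V_RESET = 0.0, 1.0, 0.0
TAU_M = 20.0   # membrane time constant in ms
DT = 1.0       # integration step in ms

def simulate(input_current):
    """Integrate a list of per-step input currents; return spike times (ms)."""
    v, spikes = V_REST, []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest and is driven
        # by the net (excitatory minus inhibitory) synaptic current.
        v += DT * (-(v - V_REST) / TAU_M + i_in)
        if v >= V_THRESH:            # threshold crossing -> action potential
            spikes.append(step * DT)
            v = V_RESET              # membrane resets after the spike
    return spikes

# A constant suprathreshold drive produces a regular spike train.
print(simulate([0.08] * 50))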

1.2 Artificial Neural Networks

Artificial neural networks are parallel computational models comprising densely interconnected adaptive processing units. These networks are composed of many simple processors (in contrast, say, to a PC, which generally has a single powerful processor) acting in parallel to model nonlinear static or dynamic systems in which a complex relationship exists between an input and its corresponding output (Hasler et al., 2010). Artificial neural networks are viable models for a wide variety of problems, including pattern classification, speech synthesis and recognition, adaptive interfaces between humans and complex physical systems, function approximation, image compression, forecasting and prediction, and nonlinear system modeling.
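The computation performed by one such "simple processor" can be sketched in a few lines: a weighted sum of the unit's inputs passed through a nonlinearity. The Python sketch below is illustrative only; the tanh activation and all numeric values are assumptions, not taken from the article.

import math

def unit(inputs, weights, bias):
    """One adaptive processing unit: a nonlinear function of a weighted sum."""
    return math.tanh(sum(x * w for x, w in zip(inputs, weights)) + bias)

# Many such units acting in parallel form a layer; composing layers yields a
# network able to approximate nonlinear input-output relationships.
layer_out = [unit([0.5, -1.2], w, b) for w, b in [([0.8, 0.1], 0.0),
                                                  ([-0.3, 0.7], 0.1)]]
print(layer_out)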

These networks are “neural” in the sense that they may have been inspired by the brain and neuroscience, but not necessarily because they are faithful models of biological, neural, or cognitive phenomena. In fact, many artificial neural networks are more closely related to traditional mathematical and statistical models, such as nonparametric pattern classifiers, clustering algorithms, nonlinear filters, and statistical regression models, than they are to neurobiological models (Sontag et al., 2000; Indiveri et al., 2007).
