Application of Brain-Inspired Computing for Daily Assistance

Princy Diwan, Bhupesh Kumar Dewangan
DOI: 10.4018/978-1-6684-6980-4.ch001

Abstract

The domains of artificial intelligence and machine learning continue to advance at a rapid pace in terms of algorithms, models, applications, and hardware, driven by an exponential increase in the amount of data collected every day. Deep neural networks have transformed these domains by achieving human-like performance on a variety of real-world challenges, such as image and speech recognition. Considerable effort is also being devoted to uncovering the principles of computation in large-scale biological neural networks, especially biologically plausible spiking neural networks. Neural-inspired algorithms (e.g., deep ANNs and deep RL) and brain-inspired intelligent systems have revolutionized the fields of machine learning and cognitive computing in the last decade, assisting in a variety of real-world learning tasks ranging from robot monitoring and interaction at home to complex decision-making about emotions and behaviors in humans and animals. While these brain-inspired algorithms and systems have made significant progress, they still require large data sets to train, and their outcomes lack the flexibility to adapt to a variety of learning tasks and to sustain long-term performance. Addressing these issues requires an analytical understanding of the principles that allow brain-inspired intelligent systems to acquire information, as well as of how those principles can be translated to hardware for everyday assistance and practical applications. This chapter focuses on the applications, challenges, and solutions of brain-inspired computing for daily assistance.

Introduction

Cognitive computing and artificial intelligence (AI) are about to undergo a revolution. The computing systems that power today's AI algorithms are based on the von Neumann architecture, which requires massive volumes of data to be shuttled back and forth during processing. The result is a performance bottleneck and substantial waste of space and power. It is therefore increasingly evident that building effective cognitive computers demands innovative architectures in which memory and computation are more tightly integrated. The von Neumann separation of memory and processing units necessitates frequent data transfers between storage and processing sections, which results in high energy expenditure and performance degradation (Wang, Z. et al., 2020). Studies indicate that the brain's memory and learning processes are carried out by a network of roughly 10¹¹ neurons linked together by on the order of 10¹⁵ synapses. These synapses provide a wealth of functional connections between neuronal systems and intricate internal relationships that shape information flow. Indeed, the neural-network organization of the human brain processes information with remarkable precision and efficiency (Wang, W. et al., 2018). The brain accomplishes this feat by executing computational operations inside the memory itself and approximating them in the analogue domain, avoiding data migration between storage and processing (Wang, P. et al., 2019). Replicating brain function in this way introduces a new computational methodology called “neuromorphic computing” (Xia, Q., & Yang, J. J., 2019). Because of the structural resemblance between biological synapses and memristive devices, neuromorphic computing is seen as a promising field. Figure 1 depicts a brain synapse and a memristor. Progress in synaptic-device-based neuromorphic computing may pave the way for very fast, energy-efficient computers (Zhang, X. Y. et al., 2020).

In the past ten years, neural-inspired algorithms (such as deep ANNs and deep RL) and brain-intelligent systems have revolutionized the fields of machine learning and cognitive computing. These systems support a wide range of real-world learning tasks, from robot monitoring and interaction at home to complex decision-making about emotions and behaviors in humans and animals. The long evolutionary history of living things has given them structures and abilities that allow them to survive in their surroundings, and emulating these characteristics is the aim of biomimetic technology. Research in bioinspired electronics seeks to develop electronic sensors and motor systems that resemble biological sensory organs. Bioinspired applications such as humanoid robots, exoskeletons, and other devices with motor systems combine electronic components with a living body. To create bioinspired robotic and electronic devices that are compatible with the living body at the neuronal level and that operate through processes like those in a real body, researchers must develop biomimetic electronic sensors, motor systems, brains, and nerves (Zhang, X. et al., 2020). Artificial organic synapses have mimicked the plasticity of the brain with considerably simpler architectures and cheaper production than neurons based on silicon circuits, and with less energy usage than conventional von Neumann computing techniques. Future neuromorphic systems may therefore benefit from the incorporation of organic synapses.
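To make the in-memory computing idea concrete, the sketch below simulates how a memristive crossbar could carry out a matrix-vector multiplication directly in the analogue domain: input voltages applied to the rows and device conductances (the stored "synaptic weights") combine via Ohm's and Kirchhoff's laws so that each column current is a weighted sum. The array size, conductance range, and noise level are illustrative assumptions, not values taken from this chapter.

```python
import numpy as np

# Illustrative crossbar dimensions (assumed, not from the chapter).
N_ROWS, N_COLS = 4, 3          # 4 inputs, 3 outputs
G_MIN, G_MAX = 1e-6, 1e-4      # assumed conductance range in siemens

rng = np.random.default_rng(0)

# Synaptic weights are stored as device conductances G[i, j].
G = rng.uniform(G_MIN, G_MAX, size=(N_ROWS, N_COLS))

# Input vector encoded as row voltages (volts).
v_in = np.array([0.2, 0.0, 0.5, 0.1])

# Ohm's law + Kirchhoff's current law: each column current is the
# weighted sum of the input voltages, i.e. a matrix-vector product
# computed "inside the memory" rather than in a separate processing unit.
i_out_ideal = v_in @ G

# Real devices are noisy; add a small read-noise term (assumed 1%).
i_out_noisy = i_out_ideal * (1 + 0.01 * rng.standard_normal(N_COLS))

print("ideal column currents :", i_out_ideal)
print("noisy column currents :", i_out_noisy)
```

Because the multiply-accumulate happens where the weights are stored, no weight data has to cross a memory bus, which is the source of the energy savings discussed above.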
Thanks to their high level of parallelism, neuromorphic systems allow memory and computation to operate simultaneously in the same computing core. Artificial neurons receive regulated inputs through synaptic memory junctions, which encode the information (Park, T. J. et al., 2022). Because this kind of neural network uses “spikes” for computation and communication, such networks are commonly referred to as spiking neural networks (SNNs). SNNs are highly efficient because of their sparse, event-driven, spike-based operation. In fact, it has been demonstrated that an SNN implemented on low-power asynchronous neuromorphic hardware can consume an order of magnitude less power than an iso-network ANN (Sengupta, A. et al., 2019). These networks should scale to their full capacity and approach the brain's efficiency as larger, more bio-realistic systems are built. The flow of the chapter is shown in the figure below.
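As a minimal illustration of the spike-driven, event-driven computation described above, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron, a standard building block of SNNs. The membrane time constant, threshold, synaptic weight, and input spike train are assumed values chosen only to show how the neuron integrates incoming spikes and fires when its membrane potential crosses the threshold; they are not parameters given in this chapter.

```python
import numpy as np

# Assumed LIF parameters (illustrative, not from the chapter).
TAU_M = 20.0      # membrane time constant (ms)
V_THRESH = 1.0    # firing threshold (arbitrary units)
V_RESET = 0.0     # reset potential after a spike
DT = 1.0          # simulation time step (ms)
W_IN = 0.4        # synaptic weight of the single input

# A sparse input spike train: work is only done at these events.
input_spikes = np.zeros(100, dtype=bool)
input_spikes[[5, 12, 14, 16, 40, 42, 44, 46, 48]] = True

v = V_RESET
output_spikes = []

for t, spike_in in enumerate(input_spikes):
    # Leak: the membrane potential decays toward rest between events.
    v += DT * (-v) / TAU_M
    # Integrate: an incoming spike deposits charge via its synaptic weight.
    if spike_in:
        v += W_IN
    # Fire: emit a spike and reset once the threshold is crossed.
    if v >= V_THRESH:
        output_spikes.append(t)
        v = V_RESET

print("output spike times (ms):", output_spikes)
```

Only time steps that carry a spike change the neuron's state appreciably, which is why event-driven neuromorphic hardware can idle between spikes and achieve the power savings reported for SNNs.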
