LSTM Network: A Deep Learning Approach and Applications

Anil Kumar (RD Engineering College, India), Abhay Bhatia (Roorkee Institute of Technology, India), Arun Kashyap (GL Bajaj Institute of Technology and Management, India), and Manish Kumar (Ajay Kumar Garg Engineering College, India)
DOI: 10.4018/978-1-6684-6909-5.ch007

Abstract

The World Wide Web (WWW) is a vast system containing an unmatched amount of digital data, and today the internet is typically accessed through common search engines such as Google and Yahoo. Cybercriminals have become more aggressive on social media; as a result, numerous commercial and trade websites are hacked, leading to the forced trafficking of women and children as well as a number of other cybercrimes. It is therefore important to identify social media crimes as early as possible in order to prevent them, which calls for a machine learning technique for early crime detection. Long short-term memory (LSTM) networks are a type of recurrent neural network that can learn order dependence in sequence prediction problems.
Chapter Preview

Introduction

Sequence prediction problems have been around for a while and are regarded as among the most difficult challenges in the data science community. They cover a wide range of tasks, from data prediction and pattern recognition to understanding movie storylines and speech recognition, and from language translation to word prediction on a phone's keypad. Thanks to recent advancements in data science, LSTM networks have proven to be the most widely used solution for practically all of these sequence prediction problems.

During the past several years, deep learning (DL) algorithms have been well developed and frequently used to extract information from many types of data. Different DL architectures take different aspects of the input data into account, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), and deep neural networks (DNNs). The temporal structure of incoming data is typically too complex for CNNs and DNNs to handle; RNNs are therefore prevalent in research fields that deal with sequential data, such as text, audio, and video. Unfortunately, plain RNNs are unable to connect the pertinent pieces of information when there is a significant gap between the relevant inputs. Hochreiter et al. (1997) proposed long short-term memory (LSTM) to address this "long-term dependency" problem. Since then, nearly all of the intriguing RNN-based results have been produced with LSTM, and it has taken centre stage in deep learning. LSTMs perform incredibly well and have been employed extensively in a variety of tasks, including speech recognition, acoustic modelling, trajectory prediction, phrase embedding, and correlation analysis, mostly due to their powerful learning ability. This review examines LSTM networks, focusing on developments of the LSTM network cell and on LSTM network topologies. Here, the recurrent unit of an LSTM network is referred to as the LSTM cell.
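To make the LSTM cell mentioned above concrete, the following is a minimal NumPy sketch of one forward step of a standard LSTM cell: forget, input, and output gates modulate a memory (cell) state, which is how the network bridges long gaps between relevant inputs. The function and parameter names (`lstm_cell_step`, `W_*`, `U_*`, `b_*`) are illustrative choices, not taken from the chapter.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, params):
    """One forward step of a standard LSTM cell.
    x: input vector; h_prev, c_prev: previous hidden and cell states.
    params: weight matrices W_* (input), U_* (recurrent), biases b_*."""
    f = sigmoid(params["W_f"] @ x + params["U_f"] @ h_prev + params["b_f"])  # forget gate
    i = sigmoid(params["W_i"] @ x + params["U_i"] @ h_prev + params["b_i"])  # input gate
    o = sigmoid(params["W_o"] @ x + params["U_o"] @ h_prev + params["b_o"])  # output gate
    g = np.tanh(params["W_g"] @ x + params["U_g"] @ h_prev + params["b_g"])  # candidate values
    c = f * c_prev + i * g        # new cell state: keep old memory, add new content
    h = o * np.tanh(c)            # new hidden state exposed to the next layer/step
    return h, c

# Tiny usage example with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = {}
for gate in "figo":
    params[f"W_{gate}"] = 0.1 * rng.standard_normal((n_hid, n_in))
    params[f"U_{gate}"] = 0.1 * rng.standard_normal((n_hid, n_hid))
    params[f"b_{gate}"] = np.zeros(n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):  # run a length-5 input sequence
    h, c = lstm_cell_step(x, h, c, params)
```

Because the cell state `c` is updated additively (gated by `f` and `i`) rather than repeatedly squashed through a nonlinearity, gradients can flow across many time steps, which is what lets LSTMs handle the long-term dependencies that plain RNNs cannot.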
