Enhanced Sentiment Classification Using Recurrent Neural Networks

Arunmozhi Mourougappane (St.Joseph's College of Engineering, India) and Suresh Jaganathan (SSN College of Engineering, India)
Copyright: © 2020 |Pages: 11
DOI: 10.4018/978-1-7998-1159-6.ch010

Abstract

Sentiment analysis and classification has become a key technique for analyzing the nature and quality of products, people's emotions, and opinions about products and movies. Sentiment analysis is a classification task: it labels an opinion or review as positive or negative. Because labeled data are expensive and difficult to gather, training is hard; sarcastic text and homonyms are also difficult to identify, so reviews may be misclassified. Recurrent neural networks offer a way to identify sarcastic words and words with multiple meanings.

Introduction

Sentiment analysis has become a promising area because of online services. Online reviews play a vital role in business development, marketing, and the enrichment of people's lives, and they help surface subjective and objective information, reviews, and inferences. Sentiment analysis is generally performed on polarity, which may be positive or negative. Beyond this, there is a constructive type of classification, in which end users or reviewers give suggestions based on their use of a product or their own perspective. At the document level, classification is performed dynamically over user-created content. At the sentence level, classification first separates subjective from objective statements and then determines polarity. At the aspect level, the idea or notion expressed about each aspect is analyzed. Opinion mining, the science of using text analysis to determine the sentiment orientation of text, is carried out through several approaches: the lexicon-based approach, which relies on sentiment lexicons; the learning-based approach, which relies on machine learning algorithms; and the hybrid approach, which combines both.
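The lexicon-based approach mentioned above can be sketched very simply: count matches against positive and negative word lists and take the sign of the difference. The tiny word lists here are illustrative placeholders; real systems use curated lexicons such as SentiWordNet or VADER.

```python
# Minimal lexicon-based polarity scorer (illustrative word lists only;
# production lexicons are far larger and weighted).
POSITIVE = {"good", "great", "excellent", "love", "enjoyable"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "boring"}

def polarity(review: str) -> str:
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("a great and enjoyable movie"))  # positive
print(polarity("poor plot terrible acting"))    # negative
```

This also illustrates why the chapter turns to learning-based methods: a fixed lexicon cannot detect sarcasm or resolve homonyms, since it ignores word order and context entirely.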

Neural networks consist of neurons connected to one another to perform computationally intensive tasks and to solve complex problems whose inputs are independent. Input is processed through activation functions and the weights between layers. However, a plain feed-forward network cannot handle time-dependent data. To overcome this independence limitation, recurrent neural networks (RNNs) and convolutional neural networks (CNNs) are used.
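The basic computation described above, inputs combined with layer weights and passed through an activation function, can be sketched for a single dense layer (the sizes and sigmoid activation here are arbitrary choices for illustration):

```python
import numpy as np

def sigmoid(z):
    # Standard logistic activation function.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector (3 features)
W = rng.normal(size=(2, 3))   # weights between the input and hidden layer
b = np.zeros(2)               # biases
h = sigmoid(W @ x + b)        # layer output: activation(weights @ input + bias)
print(h.shape)  # (2,)
```

Note that this layer sees only the current input `x`; nothing connects one input to the next, which is exactly the time-dependence limitation the RNN removes.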

Recurrent neural networks deal with time-series data and data that are not independent (Jian Zhang and Li, 2014). They process sequential and dynamic data: the output depends on the previous hidden state as well as the current input. Different weights between the nodes and layers are used for computation, and the network has memory to store already computed values and past history, passing information forward selectively. However, it can process only one element at a time, and the input size varies with the application. Applications such as image classification, sentiment analysis, machine translation, and image captioning are built on neural networks. RNN configurations include fixed-size input to fixed-size output, image input to text output (image captioning), text input to text output (language translation), and classification over video sequences.
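The recurrence described above, a hidden state that carries past information while the network consumes one element at a time, can be sketched as a vanilla RNN cell (dimensions and weight initialization here are arbitrary for illustration):

```python
import numpy as np

# Vanilla RNN: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b).
# The hidden state h is the network's memory; the output at step t
# depends on the current input AND on everything seen before it.
def rnn_forward(xs, W_xh, W_hh, b):
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:                            # one element at a time
        h = np.tanh(W_xh @ x + W_hh @ h + b)
        states.append(h)
    return states

rng = np.random.default_rng(1)
xs = [rng.normal(size=4) for _ in range(5)]  # a sequence of 5 input vectors
W_xh = rng.normal(size=(3, 4)) * 0.1         # input-to-hidden weights
W_hh = rng.normal(size=(3, 3)) * 0.1         # hidden-to-hidden (recurrent) weights
b = np.zeros(3)
states = rnn_forward(xs, W_xh, W_hh, b)
print(len(states), states[-1].shape)  # 5 (3,)
```

Because the same `W_xh`, `W_hh`, and `b` are reused at every step, the network handles sequences of any length with a fixed number of parameters.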

Training a recurrent neural network is a highly difficult task. Training involves a forward pass, a backward pass, and backpropagation: in the forward pass, the weighted sums are evaluated from the initial node, and in the backward pass, backpropagation is performed to minimize the squared error. In an RNN, backpropagation is carried out by unfolding the network into a general feed-forward network. Because an RNN relies on a memory of past values, Long Short-Term Memory (LSTM) is used to process the data. An LSTM cell consists of four gates: the input gate, forget gate, memory gate, and output gate. These gates are analog (sigmoid-valued), which makes them differentiable during computation, and they allow the cell to select what needs to be remembered and what can be discarded.
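One step of the LSTM cell described above can be sketched as follows. Note an assumption: the text's "memory gate" is rendered here as the candidate cell update (often written g_t), which is the usual formulation; the dimensions and initialization are arbitrary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One LSTM step. W, U, b stack the parameters of all four gates.
def lstm_step(x, h_prev, c_prev, W, U, b):
    z = W @ x + U @ h_prev + b      # stacked pre-activations for the 4 gates
    n = h_prev.shape[0]
    i = sigmoid(z[0:n])             # input gate: how much new info to admit
    f = sigmoid(z[n:2*n])           # forget gate: how much old memory to keep
    g = np.tanh(z[2*n:3*n])         # candidate memory update ("memory gate")
    o = sigmoid(z[3*n:4*n])         # output gate: how much memory to expose
    c = f * c_prev + i * g          # new cell state: the long-term memory
    h = o * np.tanh(c)              # new hidden state
    return h, c

rng = np.random.default_rng(2)
n, d = 3, 4                         # hidden size, input size
h, c = np.zeros(n), np.zeros(n)
W = rng.normal(size=(4*n, d)) * 0.1
U = rng.normal(size=(4*n, n)) * 0.1
b = np.zeros(4*n)
h, c = lstm_step(rng.normal(size=d), h, c, W, U, b)
print(h.shape, c.shape)  # (3,) (3,)
```

The sigmoid gates are exactly the "analog" values the text mentions: each lies in (0, 1), so the cell can partially keep, partially forget, and partially expose its memory, and every operation remains differentiable.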

Ordinary artificial neural networks are trained using backpropagation, but an RNN is trained using Backpropagation Through Time (BPTT). The difference is that in BPTT the gradient at each step depends on the previous time steps; the unrolled RNN effectively becomes a feed-forward network with time delays. After accounting for the previous and current time steps, normal backpropagation is performed.
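BPTT as described above can be made concrete on the smallest possible case: a scalar RNN with loss taken as the final hidden state. This is a didactic sketch, not the chapter's training procedure; it shows how the gradient of the shared recurrent weight accumulates one contribution per unrolled time step.

```python
import numpy as np

# Scalar RNN: h_t = tanh(w_h * h_{t-1} + w_x * x_t), loss L = h_T.
# BPTT: unroll the forward pass, then walk backward through time,
# accumulating the gradient of the shared weight w_h at every step.
def bptt_grad_wh(xs, w_x, w_h):
    hs = [0.0]
    for x in xs:                                  # forward pass through time
        hs.append(np.tanh(w_h * hs[-1] + w_x * x))
    grad, dh = 0.0, 1.0                           # dL/dh_T = 1 since L = h_T
    for t in range(len(xs), 0, -1):               # backward pass through time
        dz = dh * (1.0 - hs[t] ** 2)              # backprop through tanh
        grad += dz * hs[t - 1]                    # this step's contribution
        dh = dz * w_h                             # propagate to h_{t-1}
    return grad

xs = [0.5, -0.3, 0.8]
g = bptt_grad_wh(xs, w_x=0.7, w_h=0.4)

# Sanity check against a central finite difference.
def loss(w_h):
    h = 0.0
    for x in xs:
        h = np.tanh(w_h * h + 0.7 * x)
    return h

eps = 1e-6
num = (loss(0.4 + eps) - loss(0.4 - eps)) / (2 * eps)
print(abs(g - num) < 1e-6)  # True
```

The repeated multiplication by `w_h` in the backward loop is also the source of the vanishing/exploding gradient problem that motivates the LSTM.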

A further impediment is the lack of long-term dependency support (Rodrigo Moraes and Neto, 2013), so RNNs are extended to bidirectional and deep bidirectional RNNs. The concept of the bidirectional RNN (BRNN) is to split the units of a normal RNN into a forward state and a backward state, so that future elements of the sequence, not only past ones, contribute to each neuron's value. In a deep BRNN, the number of bidirectional hidden layers is increased, stacking them into a multilayer network.
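The forward-state/backward-state split described above can be sketched by running two independent RNN passes, one left-to-right and one right-to-left, and concatenating their hidden states at each position (dimensions and initialization are again arbitrary):

```python
import numpy as np

# One directional RNN pass (shared helper for both directions).
def rnn_pass(xs, W_xh, W_hh):
    h, out = np.zeros(W_hh.shape[0]), []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h)
        out.append(h)
    return out

rng = np.random.default_rng(3)
xs = [rng.normal(size=4) for _ in range(5)]          # sequence of 5 inputs

# Separate parameters for the forward and backward states.
Wf_xh, Wf_hh = rng.normal(size=(3, 4)) * 0.1, rng.normal(size=(3, 3)) * 0.1
Wb_xh, Wb_hh = rng.normal(size=(3, 4)) * 0.1, rng.normal(size=(3, 3)) * 0.1

fwd = rnn_pass(xs, Wf_xh, Wf_hh)                     # reads left-to-right
bwd = rnn_pass(xs[::-1], Wb_xh, Wb_hh)[::-1]         # reads right-to-left, realigned
states = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
print(len(states), states[0].shape)  # 5 (6,)
```

Each position's representation now summarizes both the past (forward state) and the future (backward state) of the sequence, which is why BRNNs help with context-dependent cues such as sarcasm. Stacking several such layers gives the deep BRNN.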
