Attention-Based Convolution Bidirectional Recurrent Neural Network for Sentiment Analysis

Soubraylu Sivakumar, Haritha D. (https://orcid.org/0000-0003-2772-2081), Sree Ram N. (http://orcid.org/0000-0002-2721-7678), Naveen Kumar, Rama Krishna G. (https://orcid.org/0000-0003-3572-6517), Dinesh Kumar A. (https://orcid.org/0000-0003-2008-6828)
Copyright: © 2022 | Pages: 21
DOI: 10.4018/IJDSST.300368

Abstract

A customer conveys an opinion about an entity in natural language. Applying sentiment analysis to such reviews is a complex task: the significant terms that influence the polarity of a review are often not examined, and terms whose contextual meaning spreads across multiple sentences of a review are not recognized. To address these two issues, the authors propose an attention-based convolution bi-directional recurrent neural network (ACBRNN). In this model, two convolution layers capture phrase-level features, a self-attention layer in the middle assigns high weights to significant terms, and a bi-directional GRU performs a contextual scan of the review in the forward and backward directions. The authors conducted four groups of experiments (i.e., unidirectional, bidirectional, hybrid, and the proposed model) on the IMDB dataset to show the significance of the proposed model. The proposed model obtained an F1 score of 87.94% on the IMDB dataset, which is 5.41% higher than CNN. Thus, the proposed architecture performs well compared with all the baseline models.

1. Introduction

Sentiment Analysis is the computational study of public opinions, emotions, sentiments, attitudes, and appraisals towards entities. An entity can be an issue, an event, a service, an individual, a product, etc. Every day, more reviews are posted on the internet due to advancements in technology. The objective of Sentiment Analysis is to determine the polarity of a review as positive (good) or negative (bad). Decision making therefore plays an important role in Sentiment Analysis.

The decision-making process helps the customer to get the right product and helps the organization to sell the right product to the customer; both consumers and corporations benefit from the decision. Applying machine learning algorithms to this decision-making process over opinions is a challenging task. The goal is to apply deep learning algorithms to build models that automatically extract features from the text. When Deep Learning Methods (DLM) are used for feature engineering, high-level features are learned automatically without any human bias. In deep learning, the features are learned during the training process, and no specialized domain knowledge is required of the researchers.

A Convolutional Neural Network (CNN) (Yoav Goldberg, 2015) has a local pattern of connections between the neurons of adjacent layers. These connections preserve spatially local correlations, a characteristic that is helpful for sentence classification in NLP. A CNN finds strong local clues that may appear in different places in the input regardless of the input's class membership. These local indicators are the key phrases that help to identify the sentiment of a sentence. Consider the following movie review from the IMDB dataset:

  • Review 1: “The movie has a moral story where the actor helps to resolve the social issue from the society.”

A convolutional layer extracts local features from the movie review, such as “actor”, “resolve”, “social”, and “issue”. Self-attention distinguishes relevant from irrelevant parts of a movie review based on Parts of Speech (POS). It correlates distinct parts of a longer sequence to compute the weight of each part of the sequence. In the above review, “actor”, “resolve”, “social”, and “issue” carry noun, verb, adjective, and noun POS tags, respectively. The Self-Attention layer assigns these higher weights and improves (Sindoori et al., 2017) the prediction score of the movie review. GRU (Chen Tao et al., 2017) is a variant of the recurrent network that has no separate internal memory and only two gates, compared with LSTM (Sepp Hochreiter et al., 1997). The internal design of GRU is simpler, and it takes less training time than LSTM. Bidirectional GRU scans the review in the forward and reverse directions (Li Zhang et al., 2017; Jianqiao Hu et al., 2017). It has higher learning power to better understand contextual information, relating features located in distinct parts of the sentence.
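To make the interplay of these layers concrete, the following is a minimal sketch in Keras of a convolutional layer followed by an additive self-attention step that re-weights token positions. The vocabulary size, sequence length, and filter counts are illustrative assumptions, not the paper's hyperparameters:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative sizes; the paper's exact hyperparameters are not given here.
vocab_size, seq_len, embed_dim, n_filters = 10000, 200, 128, 64

tokens = layers.Input(shape=(seq_len,), dtype="int32")
x = layers.Embedding(vocab_size, embed_dim)(tokens)

# The convolution captures phrase-level (n-gram) clues such as "social issue".
x = layers.Conv1D(n_filters, kernel_size=3, padding="same", activation="relu")(x)

# Additive self-attention: score every time step, softmax the scores into
# weights over the sequence, and re-weight the feature map so that
# significant terms ("actor", "resolve", ...) contribute more.
scores = layers.Dense(1)(x)               # (batch, seq_len, 1)
weights = layers.Softmax(axis=1)(scores)  # attention weight per token
attended = layers.Multiply()([x, weights])

model = tf.keras.Model(tokens, attended)
model.summary()
```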

The term “movie” is located on the left side of the sentence, while “social issue” is located on the right side. BGRU understands the context and correlates terms or phrases located at the two extreme ends of the movie review. We have conducted four groups of experiments on the IMDB dataset, viz., Unidirectional Neural Networks (CNN, LSTM, GRU), Bidirectional Neural Networks (BLSTM (Yu Zhao et al., 2017), BGRU), Hybrid Neural Networks (CNN+LSTM, CNN+BGRU), and the proposed Attention-Based Neural Network. The proposed attention-based architecture was compared with the baseline architectures and obtained better results than all of them. The contributions of this research work are listed below:
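The bidirectional scan itself is a standard building block. The short sketch below (again in Keras, with toy dimensions as assumptions) shows how a forward and a backward GRU pass over the same sequence and concatenate their states, so every position sees context from both ends of the review:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Toy batch of already-embedded reviews: (batch, time steps, feature dim).
inputs = tf.random.normal((2, 200, 64))

# The forward GRU reads "movie ... social issue" left to right, the backward
# GRU reads it right to left; their hidden states are concatenated per step.
bgru = layers.Bidirectional(layers.GRU(64, return_sequences=True))
outputs = bgru(inputs)
print(outputs.shape)  # (2, 200, 128): 64 forward units + 64 backward units
```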

  • We have designed a new architecture by integrating an attention layer with a hybrid convolution bidirectional recurrent neural network (ACBRNN); a minimal sketch of this layout follows the list.

  • The proposed architecture extracts more relevant terms and assigns high weights to those terms based on a context that influences the polarity of the review.

  • We have highlighted the importance of different layers in the proposed architecture with a movie review.
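Putting the pieces together, the following is a minimal sketch of the ACBRNN layout as described in the abstract: two convolution layers with self-attention between them, followed by a bidirectional GRU and a sigmoid classifier. All layer sizes here are illustrative assumptions rather than the paper's reported configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_acbrnn(vocab_size=10000, seq_len=200, embed_dim=128,
                 n_filters=64, gru_units=64):
    """Sketch of the ACBRNN layout described in the abstract;
    hyperparameters are illustrative, not the paper's."""
    tokens = layers.Input(shape=(seq_len,), dtype="int32")
    x = layers.Embedding(vocab_size, embed_dim)(tokens)

    # First convolution: phrase-level features.
    x = layers.Conv1D(n_filters, 3, padding="same", activation="relu")(x)

    # Self-attention in the middle assigns high weight to significant terms.
    weights = layers.Softmax(axis=1)(layers.Dense(1)(x))
    x = layers.Multiply()([x, weights])

    # Second convolution over the re-weighted features.
    x = layers.Conv1D(n_filters, 3, padding="same", activation="relu")(x)

    # Bidirectional GRU scans the review forward and backward.
    x = layers.Bidirectional(layers.GRU(gru_units))(x)

    output = layers.Dense(1, activation="sigmoid")(x)  # positive vs. negative
    model = tf.keras.Model(tokens, output)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_acbrnn()
model.summary()
```

Training such a sketch on IMDB reviews would use the usual binary cross-entropy setup; the F1 score reported in the abstract refers to the authors' full configuration, not this toy one.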
