Mental Health Detection Using Transformer BERT

Kuldeep Kumar Patel, Anikesh Pal, Kumar Saurav, Pooja Jain
DOI: 10.4018/978-1-7998-8786-7.ch006

Abstract

The COVID-19 pandemic has drastically affected the daily life of every individual at a global level. Its adverse effects on individuals and the people around them have created an anxious and depressive environment. The virus has changed the way most people live and increased the distance between individuals. As COVID-19 spread, people experienced persistently poor mental health, including fear, boredom, sadness, and stress. Based on this situation, in this chapter the authors analyze the mental health of people affected by COVID-19 through two parameters of mental health, boredom and stress, drawn from social media posts by detecting the emotions and feelings expressed in text. The authors apply the pre-trained BERT model to preprocessed data to build classification models for boredom and stress and, consequently, to determine the emotion of the person. These models are used to track these emotions (i.e., stress and boredom) during different stages of the COVID-19 pandemic.
Chapter Preview

Most emotion or mental-state classification works in nearly the same way as sentiment analysis. Sentiment classification generally identifies the positive, negative, or neutral nature of a text, which also reflects the mental state of the person who wrote it. Boredom and stress, which depict the mental state of a person, can be classified with the same procedure used for sentiment or emotion. These classifications have been performed using transformer-based techniques such as BERT.
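To make the classification setup concrete, the sketch below labels a post as expressing boredom, stress, or neither. The keyword scorer is a toy stand-in for the chapter's fine-tuned BERT classifier (the cue lists and function names are illustrative assumptions); in the actual pipeline, the scores would come from the model's output over the same labels.

```python
# Toy stand-in for a fine-tuned BERT classifier over the labels
# "boredom", "stress", "neutral". Purely illustrative.
BORED_CUES = {"bored", "boring", "nothing to do", "dull"}
STRESS_CUES = {"stressed", "anxious", "worried", "overwhelmed"}

def classify_post(text: str) -> str:
    """Return 'boredom', 'stress', or 'neutral' for a social media post."""
    lowered = text.lower()
    bored = sum(cue in lowered for cue in BORED_CUES)
    stressed = sum(cue in lowered for cue in STRESS_CUES)
    if bored == 0 and stressed == 0:
        return "neutral"
    # Pick the label with more matching cues (ties go to boredom).
    return "boredom" if bored >= stressed else "stress"

print(classify_post("I am so bored in lockdown, nothing to do"))   # boredom
print(classify_post("Deadlines have me anxious and overwhelmed"))  # stress
```

In the BERT version, `classify_post` would tokenize the text, run it through the fine-tuned model, and take the argmax of the predicted label scores.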

2.1 Transformer

The Transformer is an architecture in Natural Language Processing used to solve tasks involving sequential input data, such as translation and text summarization. In doing so, it also handles long-range dependencies easily. The main idea behind the Transformer is to capture the dependencies between input and output entirely with attention, without recurrence.
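The attention mechanism at the heart of the Transformer can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product attention with the usual query/key/value naming, not the chapter's implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight the rows of V by the similarity between Q and K.

    Q, K: (seq_len, d_k); V: (seq_len, d_v).
    Returns (output, attention_weights).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq_len, seq_len)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))                          # 5 tokens, d_model = 8
out, w = scaled_dot_product_attention(x, x, x)       # self-attention
print(out.shape)                                     # (5, 8)
print(w.sum(axis=-1))                                # each row sums to 1
```

Because every token attends directly to every other token, a dependency between distant positions costs a single attention step, which is why the Transformer handles long-range dependencies more easily than recurrent models.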

Figure 1. Transformer

Figure 1 describes the Transformer architecture. Both the encoding and decoding parts actually consist of multiple encoder and decoder blocks stacked together.

These encoding and decoding stacks work as follows:

  • The first encoder takes the word embeddings of the input sequence.

  • Each encoder transforms its input and propagates the result to the next encoder.

  • The last encoder in the stack generates an output, which is then passed to all the decoders in the stack.

The number of blocks in the encoder stack and the decoder stack is the same, and it is a hyperparameter of the Transformer.
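The steps above can be sketched as a loop over N identical blocks, where each block transforms its input and feeds the next. The block internals here are heavily simplified (uniform token mixing stands in for self-attention, plus a residual feed-forward); the point is only the stacking pattern and the shared hyperparameter N:

```python
import numpy as np

def encoder_block(x, W):
    """One simplified encoder block: mix tokens, then a residual feed-forward."""
    mixed = x + x.mean(axis=0, keepdims=True)    # stand-in for self-attention
    return mixed + np.tanh(mixed @ W)            # residual feed-forward layer

N = 6                                            # number of stacked blocks (hyperparameter)
d_model = 8
rng = np.random.default_rng(0)
x = rng.normal(size=(5, d_model))                # embeddings of 5 input tokens
weights = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(N)]

for W in weights:                                # output of one block feeds the next
    x = encoder_block(x, W)

print(x.shape)                                   # shape preserved through the stack
```

Each block preserves the sequence shape, which is what allows an arbitrary number of blocks to be stacked, and the final encoder's output is what the decoder stack attends to.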
