1. Introduction
Emotions are an inherent part of an individual and offer a glimpse into their cognitive state of mind. They provide insights that help us understand an individual's reasoning. Various forms of research have been done in this field. In the earliest known approaches, the emotion being expressed was identified manually by people, mainly from facial and speech data gathered from individuals who had consented to share that information (Bosker, 2013). Sentiment analysis can be performed on two forms of input: written texts and speech (dialogues) (Shivhare & Khethawat, 2012). In written texts, the focus has been on extracting the “words or phrases” in sentences that express emotions (Ezhilarasi & Minu, 2012; Krcadinac et al., 2013). Heightened emotional involvement helps make the material and the learning experience more memorable. Research has shown that emotional engagement is associated with positive outcomes for student success, including academic achievement. Emotion recognition can be implemented using various features such as the face (Rangayya et al., 2021), speech, and text. Recently, facial emotion recognition has become especially popular because of its accuracy in determining an emotion and the ready availability of datasets. Fig. 1 shows sample images from the FER dataset.
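The keyword-based “words or phrases” approach to text sentiment mentioned above can be sketched minimally as follows. This is a hypothetical illustration, not the method of any cited work; the lexicon and function names are invented for the example.

```python
# Hypothetical sketch of keyword-based emotion extraction from text.
# The small lexicon below is illustrative only, mapping emotion-bearing
# words to the emotion category they express.
EMOTION_LEXICON = {
    "happy": "happiness", "glad": "happiness", "joy": "happiness",
    "sad": "sadness", "unhappy": "sadness",
    "angry": "anger", "furious": "anger",
    "afraid": "fear", "scared": "fear",
}

def detect_emotions(sentence):
    """Return the set of emotions whose keywords appear in the sentence."""
    words = sentence.lower().split()
    return {EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON}

print(detect_emotions("I am so glad and happy today"))  # {'happiness'}
```

Real systems replace the toy lexicon with curated resources and handle negation and phrases, but the core idea of spotting emotion-bearing words is the same.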
A digital image is a matrix of integers, where each entry encodes an RGB or grey-level intensity; these entries are known as pixels, and each pixel is addressed by its (x, y) position in the spatial coordinate system defined by the x-axis and y-axis. A digital image is often not of adequate quality, or does not show the scene with full clarity; this is where image processing comes in, manipulating the image to bring it to an appropriate quality (Virupakshappa & Basavaraj A, 2019).
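The pixel-matrix view above can be made concrete with a short sketch: a toy grayscale image as a NumPy array, pixel access by spatial coordinates, and a simple contrast stretch as a minimal example of image-quality manipulation. The values and the `contrast_stretch` helper are illustrative assumptions, not from the cited work.

```python
import numpy as np

# A toy 4x4 grayscale "image": each entry is a grey-level intensity in [0, 255].
image = np.array([
    [52, 55, 61, 59],
    [79, 61, 76, 41],
    [26, 54, 65, 59],
    [62, 44, 68, 52],
], dtype=np.uint8)

# A pixel is addressed by its spatial coordinates (row y, column x).
y, x = 2, 1
print(image[y, x])  # 54

def contrast_stretch(img):
    """Linearly rescale intensities so they span the full 0-255 range."""
    lo, hi = img.min(), img.max()
    return ((img.astype(float) - lo) / (hi - lo) * 255).astype(np.uint8)

enhanced = contrast_stretch(image)
print(enhanced.min(), enhanced.max())  # 0 255
```

Contrast stretching is one of the simplest enhancement operations; real pipelines also apply filtering, denoising, and normalization before feeding images to a recognizer.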
The eye-tracking system is another method used to identify the attention span of a student. Eye-tracking is the process of measuring the motion of the eye and its point of gaze. A person's gaze reveals where their attention is directed, so locating the point of gaze allows us to estimate the attention span of the student. In an online education environment like today's, this helps us understand students better so that they can be attended to adequately. These insights enable a better evaluation of a student's attentiveness in class (Suhail et al., 2019; Gaikwad Kiran Pandhari & Manna Sheela Rani Chetty, 2020).
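One simple way to turn gaze points into an attentiveness estimate, sketched under assumptions not stated in the text, is to measure what fraction of gaze samples fall inside a region of interest such as the lecture-content window. The coordinates, region, and `attention_ratio` helper below are all hypothetical.

```python
def attention_ratio(gaze_points, roi):
    """Fraction of gaze samples (x, y) that fall inside the region of interest.

    roi is (left, top, right, bottom) in the same screen-pixel coordinates
    as the gaze samples.
    """
    left, top, right, bottom = roi
    inside = sum(
        1 for (x, y) in gaze_points
        if left <= x <= right and top <= y <= bottom
    )
    return inside / len(gaze_points) if gaze_points else 0.0

# Example: five gaze samples checked against a 1280x720 content window;
# one sample (1900, 50) lies off to the side of the window.
samples = [(400, 300), (500, 310), (1900, 50), (640, 360), (100, 700)]
print(attention_ratio(samples, roi=(0, 0, 1280, 720)))  # 0.8
```

A threshold on this ratio over a time window could then flag a student as attentive or inattentive; real systems also smooth over blinks and tracker noise.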
This paper proposes an integration of the above techniques, which would ultimately allow us to determine whether a student is attentive or not.
Figure 1.
Images from FER dataset showing expressions of anger, fear, happiness, and sadness
There have been various works in the field of emotion recognition. The prominent features used to determine emotion have been biological signals and facial data. Biological signals such as heart rate, breathing rate, and blood pressure can be used to identify what a person is feeling. These certainly provide good results but cannot be collected from a remote location. Wijeratne later used brain waves, examining the frequency produced and the region of the brain being activated, to assess the emotional state of an individual with the help of electroencephalography (EEG) (Wijeratne & Perera, 2012).