Affective Video Tagging Framework using Human Attention Modelling through EEG Signals

Shanu Sharma, Ashwani Kumar Dubey, Priya Ranjan
Copyright: © 2022 | Pages: 18
DOI: 10.4018/IJIIT.306968

Abstract

Given the explosion of multimedia content in recent years, efficient methods for its management and analysis are in constant demand. The effectiveness of any multimedia content is best assessed by analyzing human perception and cognition while it is being watched. Human attention is an important parameter in this assessment, as it reflects the viewer's engagement with and interest in the content. Considering this aspect, a video tagging framework is proposed in which participants' EEG signals are used to analyze human perception while watching videos. A rigorous analysis of different scalp locations and frequency rhythms of the brain signals was performed to formulate significant features corresponding to affective and interesting video content. The analysis presented in this paper shows that the extracted human attention-based features yield promising results, with an accuracy of 93.2% using an SVM-based classification model, which supports the applicability of the framework to various BCI-based applications for the automatic classification of multimedia content.

Introduction

Recent years have seen incredible growth in multimedia content creation, alongside a huge transformation in technology and devices. Today, high-quality digital video is used and transmitted almost everywhere, whether in online learning, surveillance, entertainment, or self-made smartphone videos (Caviedes, 2012). With this explosive increase in the use and transmission of images and videos, efficient techniques for managing them are in high demand. Technological advances have also created new opportunities for accessing multimedia content archives, but efficient search methods require that this content be properly annotated with significant features (Dimitrova et al., 2002; Smith & Chen, 2005).

Multimedia content of any kind, such as images, videos, or music, usually carries emotions that the creator wants to evoke in its audience, and it affects viewers' or listeners' emotional states in different ways (Isola et al., 2014). This effect is highly subjective and is usually not incorporated when indexing and analyzing multimedia content. Thus, in this paper, the most important such factor, "affect," is considered for the analysis of multimedia content. Affectiveness is directly related to the viewer's emotion and can serve as a significant subjective feature for the classification and tagging of multimedia content (Siddharth et al., 2019). Affective analysis of multimedia content can be performed by analyzing human perception and cognition while that content is being watched (Isola et al., 2014; Siddharth et al., 2019).

In today’s era, advancements in neuroscience and brain-computer interface (BCI) technologies have provided a deep understanding of complex information processing, enabling the automatic detection of a range of cognitive states (Müller, 2008; Hassanien & Azar, 2015). Brain imaging research can play an effective role in predicting non-conscious user reactions to various types of multimedia content, such as movies, cricket videos, and online video advertisements. To date, many technologies have been developed and successfully used for capturing and analyzing brain reactions, such as fMRI, PET, CT, and EEG (Ghaemmaghami, 2017). Over the past few decades, owing to the development of affordable and portable EEG devices, researchers have been exploring their use in many fields (Hassanien & Azar, 2015). The use of EEG signals for multimedia content assessment is also an active research area.

Motivated by the applicability of EEG signals and the role of affect in multimedia content assessment, this paper presents a model for the automatic tagging of videos by modeling viewers' EEG responses to specific video content. The work presented here extends the previously published work of Sharma et al. (2021), where the authors explored the relationship between affective content and viewers' corresponding EEG responses. Different brain regions, namely the frontal, temporal, parietal, and occipital areas, were explored using the extracted frequency bands of the EEG signals, i.e., alpha, beta, gamma, delta, and theta. In continuation of this, an automatic video tagging model is developed here using the analysis presented in Sharma et al. (2021); a minimal sketch of such a band-power pipeline is given below. The significant contributions of the presented research are as follows:
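
To make the described pipeline concrete, the following is a minimal illustrative sketch of one plausible implementation: per-channel band powers (delta through gamma) are extracted from EEG epochs and classified with an SVM. This is not the authors' published method; the sampling rate, band edges, channel count, RBF kernel, and synthetic stand-in data are all assumptions for demonstration.

```python
# Illustrative sketch only: band-power EEG features + SVM classifier.
# Sampling rate, band edges, channel count, and data are assumptions.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # assumed sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(epoch):
    """epoch: (n_channels, n_samples) -> flat vector of log band powers."""
    feats = []
    for ch in epoch:
        f, pxx = welch(ch, fs=FS, nperseg=FS * 2)  # Welch PSD estimate
        for lo, hi in BANDS.values():
            mask = (f >= lo) & (f < hi)
            feats.append(trapezoid(pxx[mask], f[mask]))  # integrate PSD over band
    return np.log(np.asarray(feats) + 1e-12)  # log-power, a common normalization

# Synthetic stand-in data: 40 trials, 14 channels, 10-second epochs.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 14, FS * 10))
labels = rng.integers(0, 2, size=40)  # per-trial affect tags (binary here)

X = np.stack([band_power_features(e) for e in epochs])  # (40, 14 * 5) features
clf = SVC(kernel="rbf", C=1.0)
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

With real recordings, the epochs would be time-locked to the video stimuli and the labels drawn from affect annotations; scalp-location analysis as described above would correspond to grouping or selecting channels by region before feature extraction.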
