Sharing Video Emotional Information in the Web

Eva Oliveira (Digital Games Research Centre (DIGARC), Polytechnic Institute of Cávado and Ave, Barcelos, Portugal), Teresa Chambel (LaSIGE, University of Lisbon FCUL, Lisbon, Portugal) and Nuno Magalhães Ribeiro (Centro de Estudos e Recursos Multimediáticos (CEREM), Universidade Fernando Pessoa, Porto, Portugal)
Copyright: © 2013 | Pages: 21
DOI: 10.4018/ijwp.2013070102

Abstract

The growth of video on the Internet has changed the way users search, browse and view video content. Watching movies over the Internet is increasingly common and is becoming a pastime. The possibility of streaming Internet content to TV, together with advances in video compression and streaming techniques, has made this recent way of watching movies easy and practical. Web portals, as a worldwide means of accessing multimedia data, need to have their contents properly classified in order to meet users’ needs and expectations. The authors propose a set of semantic descriptors based both on users’ physiological signals, captured while watching videos, and on the extraction of low-level video features. These XML-based descriptors contribute to the creation of automatic affective meta-information that will not only enhance a web-based video recommendation system based on emotional information, but also enhance the search and retrieval of videos’ affective content from both users’ personal classifications and content classifications in the context of a web portal.
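
As an illustration only, the following minimal Python sketch shows one way such an XML-based affective descriptor could be assembled with the standard library. The element and attribute names (affectiveDescriptor, physiologicalSignals, lowLevelFeatures) and the example values are hypothetical, since the article's actual descriptor schema is not shown in this preview.

# Minimal sketch: building a hypothetical XML affective descriptor that combines
# implicit user feedback (physiological signals) with low-level video features.
# All element names, signal types and values are illustrative assumptions.
import xml.etree.ElementTree as ET

descriptor = ET.Element("affectiveDescriptor", {"videoId": "example-video-001"})

# Implicit feedback captured from a user while watching the video.
signals = ET.SubElement(descriptor, "physiologicalSignals", {"userId": "user-42"})
ET.SubElement(signals, "signal", {"type": "heartRate", "mean": "78", "unit": "bpm"})
ET.SubElement(signals, "signal", {"type": "skinConductance", "mean": "3.1", "unit": "uS"})

# Low-level features extracted from the video content itself.
features = ET.SubElement(descriptor, "lowLevelFeatures")
ET.SubElement(features, "feature", {"name": "shotLength", "value": "4.2", "unit": "s"})
ET.SubElement(features, "feature", {"name": "colorWarmth", "value": "0.63"})

# Serialize to an XML string that could be stored as meta-information.
print(ET.tostring(descriptor, encoding="unicode"))

Serialized this way, such a descriptor could be stored alongside the video and queried by a portal's search or recommendation components.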
Article Preview

1. Introduction

With the advent of rich interactive multimedia content over the Internet, in environments such as education or entertainment, the way people use online multimedia content, such as film viewing and video and image sharing, calls for new ways to access, explore and interact with such information (Purcell, 2010). Video on the Web has grown explosively, which enriches the user experience but raises new challenges in content discovery, search and access. Information needs to be labeled or annotated to be accessible, shareable and searchable. The Multimedia Information Retrieval (MIR) research area is still seeking solutions for automated content and user analysis techniques, and for media annotation techniques, driven by the great need for content descriptors (metadata) that can be understood by computers and remain accessible to humans. In addition, data quality for Web portal consumers is still under debate, and new methods to ensure and improve data quality are being studied (Herrera et al., 2010). Thus, there is a need for automatic methods for gathering information both from multimedia objects (video, images, audio, text) and from users (preferences, emotions, likes, comments, descriptions, annotations), and for subsequently making this information available, searchable and accessible (Lew, 2006). In the literature, several studies have attempted to define standards that establish structures of descriptors and concepts for affective applications and their categorization (Devillers, Vidrascu & Lamel, 2005; Douglas-Cowie et al., 2007; Luneski & Bamidis, 2007; Schroder et al., 2007). One of the first works developed towards this goal was the HUMAINE database, although it did not propose a formal definition for structuring emotions, but simply identified the main concepts.

It is clear that there is a demand for new tools that enable automatic annotation and labeling of digital videos. In addition, the development of new techniques for gathering emotional information about videos, whether through content analysis or through users’ implicit feedback via physiological signals, is opening up new ways of exploring emotional information in videos, films or TV series. In fact, gathering emotional information in this context brings new perspectives for personalizing user information by creating emotional profiles for both users and videos. In a collaborative web environment, the collection of user profiles and video profiles has the potential to: (a) empower the discovery of interesting emotional information in unknown or unseen movies, (b) compare reactions to the same movies among other users, (c) compare directors’ intentions with their effective impact on users, and (d) analyze, over time, our own reactions or directors’ tendencies. The reason for the growing interest in emotion research lies precisely in the importance of emotions for human beings. According to Axelrod and Hone (2006), emotions are currently regarded as important for human development, for social relationships, and for thinking processes such as reasoning, problem solving, motivation, consciousness, memory, learning, and creativity. Thus, the relationship between people and the world they live in is clearly emotional. Nowadays, with technological development, the world in which people live also includes computers and their applications. When considering this particular relationship, it becomes evident that we must support user experiences that are engaging and enjoyable.
