BCI-Based User-Centered Design for Emotionally-Driven User Experience

Valeria Carofiglio (Università degli Studi, Bari, Italy) and Fabio Abbattista (Università degli Studi, Bari, Italy)
DOI: 10.4018/978-1-4666-4046-7.ch013

In order to develop a complex interactive system, user-centered evaluation (UCE) is an essential component. New interaction paradigms encourage exploring new variables to account for the user's experience in terms of needs and preferences. This is especially important for Adaptable Virtual Environments (AVEs). In this context, to obtain a more engaging overall user experience, a good designer should perform proper formative and summative usability tests based on the user's emotional state, which thus become UCE activities. Our methodology tries to overcome the weaknesses of traditional methods by employing a Brain-Computer Interface (BCI) to collect additional information on users' needs and preferences. A set of preliminary usability experiments has been conducted to (i) determine whether the output of a BCI is suitable to guide the designer in organizing the user-system dialog within an AVE and (ii) evaluate the user-system dialog in terms of a dynamic increase in the customization of the emotionally driven interaction.
Chapter Preview

Setting The Stage

In the global race for more intuitive interfaces that must allow non-expert users to operate increasingly complex technology, we explored Virtual Environments, paying attention to the role of emotions in the design and use of such interfaces. The area to which our past work belonged most directly was Multimodal Interfaces. Our work specifically addressed this area's objective of developing natural and adaptive multimodal interfaces, and its focus on interaction between and among humans and the virtual and physical environment, with particular emphasis on recognizing and responding to emotional user reactions.

The employment of 3D Virtual Environments (VEs) is continuously growing in several different application domains. VEs show great potential in fields such as Virtual Heritage, Serious Gaming, and Visual Analytics. The flexibility of a VE allows domain experts to communicate specific views and interpretations of reality in a way accessible to final users through a proper choice of contents, representation, and rendering. Interaction with 3D VEs is inherently multimodal: spatial input devices (such as trackers, 3D pointing devices, gesture and vocal devices) and multisensory output technologies (head-mounted displays, spatial audio, and haptic devices) allow interaction to be carried out with advanced input/output devices involving different sensory channels (sight, hearing, touch, etc.) in an integrated way. Each device addresses a particular sense and exhibits a different interface: Bowman (Bowman et al., 2005) offers a broad review of multimodal interaction, while Salisbury (Salisbury et al., 2004) provides a good introduction to haptics. Interaction with VEs requires 3D user interfaces to be organized into metaphors based on user tasks (navigation, selection, and manipulation) (Bowman et al., 2001). Currently, one of the most important open issues in this field is the development of engineered methods for designing multimodal interaction with VEs.

On the other side, the development of emotion-oriented systems relates to several objectives: (i) to develop interfaces that are natural and intuitive (that appear “believable,” are easy to use, and are accepted by the human side of the interaction loop) and that are either capable of pro-active emotion-oriented behaviour or of adapting to the user's environment (in particular its emotion-related aspects) through continued interaction; (ii) to respond intelligently and in an emotionally appropriate way; (iii) to achieve robust dialog capability.
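As a rough illustration of objective (i), the adaptation step can be thought of as a loop that maps affective estimates (such as those a BCI might provide) to a dialog customization. The sketch below is purely illustrative: the indices, thresholds, and policy names are assumptions, not part of the chapter's methodology, and a real system would read its inputs from an actual BCI device.

```python
# Hypothetical sketch of an emotion-driven adaptation loop for an AVE.
# All names and thresholds are illustrative assumptions; real systems
# would stream engagement/frustration estimates from a BCI device.

def classify_state(engagement, frustration):
    """Map two normalized affective indices (0..1) to a coarse user state."""
    if frustration > 0.7:
        return "frustrated"
    if engagement < 0.3:
        return "bored"
    return "engaged"

def adapt_dialog(state):
    """Choose a dialog customization for the current affective state."""
    policy = {
        "frustrated": "simplify interface and offer help",
        "bored": "increase challenge or introduce new content",
        "engaged": "keep current interaction style",
    }
    return policy[state]

def adaptation_trace(readings):
    """Run the adaptation loop over a sequence of (engagement, frustration) samples."""
    return [adapt_dialog(classify_state(e, f)) for e, f in readings]

if __name__ == "__main__":
    # Simulated samples standing in for BCI output.
    samples = [(0.8, 0.1), (0.2, 0.2), (0.5, 0.9)]
    for action in adaptation_trace(samples):
        print(action)
```

The point of such a loop is that the customization is driven dynamically by the user's affective state rather than by a fixed script, which is what distinguishes the emotionally driven interaction discussed above.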
