A Predictive Linear Regression Model for Affective State Detection of Mobile Touch Screen Users


Samit Bhattacharya (IIT Guwahati, Guwahati, India)
Copyright: © 2017 |Pages: 15
DOI: 10.4018/IJMHCI.2017010103


Emotion, being an important human factor, should be considered when improving the user experience of interactive systems. To do so, we first need to recognize the user's emotional state. In this work, the author proposes a model to predict the affective state of a touch screen user. The prediction is based on the user's finger strokes, from which the author defined seven features. The proposed predictor is a linear combination of these features, obtained using a linear regression approach. The predictor assumes three affective states in which a user can be: positive, negative and neutral. The existing works on affective touch interaction are few and rely on many features, some of which require special sensors that may not be present in many devices. The seven features proposed here do not require any special sensor for their computation; hence, the predictor can be implemented on any device. The model was developed and validated with empirical data from 57 participants performing 7 touch input tasks. The validation study demonstrates a high prediction accuracy of 90.47%.
Article Preview

1. Introduction

Affective computing is a relatively recent field of study that investigates the role of affect and emotion in the design of computing systems (Picard, 1997). It is well known that emotion is an important human factor in human-computer interaction (HCI). Researchers have therefore tried to incorporate affect and emotion in HCI to improve usability and user experience by making systems more natural and responsive to the goals and expectations of the user.

There are broadly two components of an affective interaction: recognition of the affective state of the user, and design of the interaction that complements and/or changes that state. Although the second component is non-trivial, the more difficult part is the first: how do we recognize the emotional state of an individual? A substantial body of literature is available on emotion recognition. In these works, methods were proposed to recognize emotion from facial expressions, gestures, posture and physiological signals. These mostly involve computer vision and image processing techniques, which are computationally expensive. In addition, such methods often require additional setups such as cameras, or probes and wires to record physiological signals.

Devices operated by touch are now all around us, and the popularity of mobile touch screen devices has risen significantly in recent years. A vast range of such devices is available, including smart phones, tablets and laptops; even desktop systems are being made touch-enabled. According to a survey by the Federation of Indian Chambers of Commerce and Industry (FICCI) and KPMG International, 59 million people in India were using smart phones by 2013, a figure expected to reach 265 million by 2016. These statistics reflect a global trend. Clearly, mobile touch input devices have become very popular and are likely to remain so in the near future. Since they are used by the masses, HCI issues are very important for improving their usability. Emotion being an integral component of the human factors that influence usability, it is necessary to work in the direction of affective touch interaction. In this article, we present the first step towards that objective: detecting the emotion of touch screen users.

Since we are dealing with mobile devices, we should keep in mind that such devices (particularly the affordable ones) come with limited resources in terms of battery and processor speed. Moreover, attaching extra hardware may not be feasible given the mobility requirement. The existing emotion recognition approaches are therefore not suitable for such devices; we require something that needs neither an extra setup nor expensive computation. The method we propose in this work relies on the user's touch interaction behaviour (finger strokes) to detect their emotional state. It requires no additional sensors or wires to record the finger strokes, and its computation is cheaper than that of existing methods. Hence, the proposed approach is expected to be more suitable for mobile touch input devices.

Briefly, our proposed approach works as follows. We categorize users into three emotional states, namely positive (representing the emotions happy, excited and elated), negative (representing sadness, anger, fear and disgust) and neutral (representing calm, relaxed and contented). Given a user's finger stroke behaviour during touch interaction, we predict his/her affective state as one of these three categories. In order to build the predictor, we identified seven features based on the user's finger stroke behaviour, which we assume provide an indirect indication of the user's emotional state. The idea comes from research findings that touch can convey emotion types (Hertenstein, Holmes, McCullough, & Keltner, 2009). As our proposed predictive model, we developed a linear combination of the seven features using linear regression on empirical data. Empirical validation of the regression model demonstrates a prediction accuracy of about 91%, making it suitable for practical use. The proposed model, along with the empirical data collection and analysis, is described in this article. The article is organized as follows.
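The shape of such a predictor, a linear combination of seven stroke features whose continuous output is discretized into three states, can be sketched as below. This is a hypothetical illustration only: the weights, intercept, and the nearest-target discretization rule are placeholders assumed for the sketch, not the coefficients or procedure reported in the article.

```python
# Hypothetical sketch of a linear-regression affective-state predictor.
# The seven inputs stand for stroke-derived features; the coefficients
# below are illustrative placeholders, NOT values from the article.
WEIGHTS = [0.2] * 7     # placeholder regression coefficients
INTERCEPT = -0.1        # placeholder intercept

def predict_state(features):
    """Map a 7-element stroke-feature vector to an affective state label."""
    assert len(features) == len(WEIGHTS)
    # Continuous output of the fitted linear model
    score = INTERCEPT + sum(w * x for w, x in zip(WEIGHTS, features))
    # Discretize by nearest target value, assuming the three states were
    # coded as negative = -1, neutral = 0, positive = +1 during fitting.
    targets = {-1.0: "negative", 0.0: "neutral", 1.0: "positive"}
    return targets[min(targets, key=lambda t: abs(score - t))]

print(predict_state([0.0] * 7))   # near-zero score -> "neutral"
```

The discretization step reflects one common way to turn a regression output into class labels; the article's own mapping from regression output to the three states may differ.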

Works related to emotion detection are discussed in Section 2. The feature set is described in detail in Section 3. The empirical study details are presented in Section 4. The proposed model is discussed in Section 5. A discussion on the proposed model, including its pros and cons, along with the scope for future work, is presented in Section 6. Section 7 concludes the article.
