Affective computing is a relatively recent field of study that investigates the role of affect and emotion in the design of computing systems (Picard, 1997). Emotion is well known to be an important human factor in human-computer interaction (HCI). Researchers have therefore tried to incorporate affect and emotion into HCI to improve usability and user experience, making systems more natural and more responsive to the goals and expectations of the user.
An affective interaction broadly comprises two components: recognizing the affective state of the user, and designing the interaction so that it complements and/or changes that emotional state. Although the second component is non-trivial, the more difficult part is the first: how do we recognize the emotional state of an individual? A substantial body of literature is available on emotion recognition, proposing methods that recognize emotion from facial expressions, gestures, posture, and physiological signals. These methods mostly rely on computer vision and image processing techniques, which are computationally expensive. In addition, they often require extra setup, such as cameras, or probes and wires to record physiological signals.
Devices operated by touch are now all around us. The popularity of mobile touch screen devices has risen significantly in recent years, and a vast range of such devices is available, including smartphones, tablets, and laptops; even desktop systems are being made touch-enabled. According to a survey by the Federation of Indian Chambers of Commerce and Industry (FICCI) and KPMG International, 59 million people in India were using smartphones as of 2013, a figure expected to reach 265 million by 2016. These statistics reflect a global trend. Clearly, mobile touch input devices are already very popular and will remain so in the near future. Since they are used by the masses, HCI issues are central to improving their usability. Emotion is an integral component of the human factors that influence usability, so it is necessary to work towards affective touch interaction. In this article, we present the first step towards that objective: detecting the emotion of touch screen users.
Since we are dealing with mobile devices, we should keep in mind that such devices (particularly affordable ones) come with limited resources in terms of battery capacity and processor speed. Moreover, attaching extra hardware may not be feasible given the mobility requirement. The existing emotion recognition approaches are therefore unsuitable for these devices; we require an approach that needs no extra setup and does not depend on expensive computations. The method we propose in this work relies on the user's touch interaction behaviour (finger strokes) to detect their emotional state of mind. It requires no additional sensors or wires to record the finger strokes, and its computations are cheaper than those of existing methods. Hence, the proposed approach is expected to be more suitable for mobile touch input devices.
Briefly, our proposed approach works as follows. We categorize users into three emotional states: positive (representing the emotions happy, excited, and elated), negative (representing sad, angry, fearful, and disgusted), and neutral (representing calm, relaxed, and contented). Given the finger stroke behaviour of a user during touch interaction, we classify his/her affective state into one of these three categories. To build the predictor, we identified seven features based on the user's finger stroke behaviour, which we assume provide indirect indications of the user's emotional state. The idea stems from research findings that touch can convey emotion (Hertenstein, Holmes, McCullough, & Keltner, 2009). Our proposed predictive model is a linear combination of the seven features, fitted to empirical data using linear regression. Empirical validation of the regression model demonstrates a prediction accuracy of about 91%, making it suitable for practical use. The proposed model, along with the empirical data collection and analysis, is described in this article. The article is organized as follows.
Work related to emotion detection is discussed in Section 2. The feature set is described in detail in Section 3. The empirical study is presented in Section 4. The proposed model is discussed in Section 5. A discussion of the model, including its pros and cons, along with the scope for future work, is presented in Section 6. Section 7 concludes the article.
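As a concrete illustration of the prediction step described above, the following minimal sketch shows how a fitted linear model over seven stroke features could be thresholded into the three emotion classes. The feature names, weights, and thresholds here are hypothetical placeholders for illustration only; they are not the features or coefficients from our study.

```python
# Hypothetical sketch: a linear combination of seven finger-stroke
# features, thresholded into negative / neutral / positive classes.
# All names and numeric values below are illustrative placeholders.

# Seven hypothetical stroke features (assumed normalized per user)
FEATURES = ["speed", "pressure", "stroke_length", "duration",
            "direction_change", "inter_stroke_time", "contact_area"]

# Placeholder regression coefficients: intercept plus one weight per feature
INTERCEPT = 0.0
WEIGHTS = [0.8, 0.5, -0.3, -0.2, 0.4, -0.6, 0.1]

def predict_emotion(x, lo=-0.5, hi=0.5):
    """Map a 7-dimensional feature vector to an emotion class.

    The linear model yields a continuous score; two thresholds
    (lo, hi) split the score range into the three categories.
    """
    if len(x) != len(WEIGHTS):
        raise ValueError("expected %d features, got %d" % (len(WEIGHTS), len(x)))
    score = INTERCEPT + sum(w * xi for w, xi in zip(WEIGHTS, x))
    if score < lo:
        return "negative"
    if score > hi:
        return "positive"
    return "neutral"

# Example: classify one hypothetical normalized feature vector
print(predict_emotion([0.9, 0.7, 0.2, 0.1, 0.8, 0.1, 0.3]))
```

In practice, the coefficients would be obtained by fitting a linear regression to labelled interaction data, and the class thresholds chosen from the empirical score distribution.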