1. Introduction
There has been an increase in the use of electroencephalography (EEG) and other physiological signals, motivated in part by the availability of relatively low-cost hardware such as modern fitness devices. This has naturally led to new possibilities for user-device interaction and, equally, to higher user expectations. Many users now expect personalization, through user preferences and adaptation, as a standard function of mobile and wearable devices. An emerging and ever-expanding approach to such personalization is emotion recognition, in which the mood or affective state of the user is approximated and then used to modify or adapt the system's functionality or appearance (Arevalillo-Herráez et al., 2014). This is particularly apparent in recommender systems, such as (Posner, Russell, & Peterson, 2015; Qin & Zhang, 2016; Rosa, Rodriguez, & Bressan, 1980), to give but a few examples.
Many approaches to emotion recognition from EEG signals rely on identifying a very small number of classes and training a classifier. The interpretation of these classes varies from a single emotion, such as stress, to features of an emotional model, such as valence-arousal. There are two major issues here. First, the classification approach limits the analysis of the data to the selected classes; it is also highly dependent on training, which limits generalization. If we are to advance towards personalized emotion models (Ayesh, Arevalillo-Herráez, & Ferri, 2016), we need a more dynamic framework for modelling and identifying emotions. Such a framework could then be naturally extended to include, implicitly or explicitly, other intertwining factors, such as personality, in representing and updating user affective states. Second, the classification approach does not explore the inter-relationships within the collected data, missing out on correlations that could reveal interesting facts beyond emotion recognition. This second issue would be of particular interest to psychologists and medical professionals.
In this paper, we investigate the use of Self-Organizing Maps (SOM) for identifying clusters in EEG signals that could then be mapped to emotional classes. We trained SOMs of varying sizes on EEG data from DEAP (Koelstra et al., 2012), a publicly available dataset. The resulting plots, showing neighbor distances, sample hits, and weight positions, are analyzed holistically to identify patterns in the map structure. Following from that, we compare node density and sample clustering against the sample classification provided in (Koelstra et al., 2012) to identify correlations between that classification and some of the generated clusters. The results show the potential for class discovery. We conclude with a discussion of the implications of this work and the difficulties in evaluating its outcome.
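To make the clustering step concrete, the following is a minimal NumPy sketch of SOM training together with the sample-hits computation on which the analysis relies. This is an illustrative implementation, not the code used in the experiments; the grid size, learning rate, and decay schedule are assumptions chosen for the example.

```python
import numpy as np

def train_som(data, grid_w, grid_h, n_iter=1000, lr0=0.5, sigma0=None, seed=0):
    """Train a rectangular SOM on data of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    if sigma0 is None:
        sigma0 = max(grid_w, grid_h) / 2.0
    # One weight vector per map node
    weights = rng.random((grid_w, grid_h, n_features))
    # Node grid coordinates, used for the neighborhood function
    gx, gy = np.meshgrid(np.arange(grid_w), np.arange(grid_h), indexing="ij")
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                      # linearly decaying learning rate
        sigma = max(sigma0 * (1.0 - frac), 0.5)      # shrinking neighborhood radius
        x = data[rng.integers(len(data))]            # random training sample
        # Best-matching unit (BMU): node whose weights are closest to x
        d = np.linalg.norm(weights - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighborhood centered on the BMU
        h = np.exp(-((gx - bi) ** 2 + (gy - bj) ** 2) / (2.0 * sigma ** 2))
        # Pull each node's weights toward x, scaled by neighborhood strength
        weights += lr * h[..., None] * (x - weights)
    return weights

def winner(weights, x):
    """Return the grid coordinates of the BMU for sample x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

def sample_hits(weights, data):
    """Count how many samples map to each node (the 'sample hits' view)."""
    hits = np.zeros(weights.shape[:2], dtype=int)
    for x in data:
        hits[winner(weights, x)] += 1
    return hits
```

In this view, dense regions of the hits map correspond to candidate clusters; comparing which labelled samples land on which nodes is the kind of correspondence check described above.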
The paper is organized as follows. We first give background on the data used and how it was analyzed and prepared. We then present the experiments with SOM and the analysis of the results, and discuss the main conceptual contribution of this paper in detail. We conclude with a critical discussion covering outstanding research questions.