Biometric Pattern Recognition from Social Media Aesthetics

Samiul Azam, Marina L. Gavrilova
DOI: 10.4018/IJCINI.2017070101

Abstract

Online social media (OSN) has witnessed significant growth over the past decade. Millions of people now share their thoughts, emotions, preferences, opinions and aesthetic information in the form of images, videos, music, texts, blogs and emoticons. Recently, due to the existence of person-specific traits in media data, researchers have started to investigate such traits with the goal of biometric pattern analysis and recognition. Until now, gender recognition from image aesthetics has not been explored in the biometric community. In this paper, the authors present an authentic model for gender recognition based on the discriminating visual features found in users' favorite images. They validate the model on a publicly shared database consisting of 24,000 images provided by 120 Flickr (image-based OSN) users. The authors propose a method based on the mixture-of-experts model to estimate the discriminating hyperplane in a 56-dimensional aesthetic feature space. The experts are based on k-nearest neighbor, support vector machine and decision tree methods. To improve model accuracy, they apply systematic feature selection using a statistical two-sample t-test. Moreover, the authors provide a statistical feature analysis with graph visualization to show the discriminating behavior between males and females for each feature. The proposed method achieves 77% accuracy in predicting gender, which is 5% better than recently reported results.
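
The two-sample t-test feature selection mentioned above can be outlined as follows. This is an illustrative sketch only, assuming per-user aesthetic feature vectors and a 0/1 gender label encoding; the function name, variable names and the alpha threshold are assumptions, not the authors' implementation.

# Illustrative sketch (not the authors' code) of two-sample t-test feature
# selection over per-user aesthetic feature vectors. The alpha threshold and
# the 0/1 label encoding are assumptions made for this example.
import numpy as np
from scipy import stats

def select_features_ttest(X, y, alpha=0.05):
    """Keep the columns of X whose means differ significantly between the
    two gender classes (two-sample t-test, p-value below alpha)."""
    X_male, X_female = X[y == 0], X[y == 1]
    keep = []
    for j in range(X.shape[1]):
        # Welch's two-sample t-test on feature column j
        _, p_value = stats.ttest_ind(X_male[:, j], X_female[:, j],
                                     equal_var=False)
        if p_value < alpha:
            keep.append(j)
    return np.array(keep)

# Example use: X is a (120, 56) matrix of aesthetic features, y the labels.
# selected = select_features_ttest(X, y)
# X_reduced = X[:, selected]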
Article Preview

Introduction

Human aesthetics is the set of cognitive rules that drives a person to choose what he or she likes (Aydın et al., 2015). Aesthetic rules vary from one person to another and differ across ages, races and genders. They also change slowly over time depending on a person's social environment (Gavrilova et al., 2016). Due to this variability, aesthetics has recently been introduced as a social behavioral trait in the biometric community (Lovato et al., 2014). Although there has been great advancement in the area of cognitive computing and informatics (Wang, 2016; Wang et al., 2013), it is still not a trivial task to reconstruct a person's or a group's cognitive rules of aesthetics. However, with the rapid expansion of online social media (OSN), a person's aesthetic data is becoming increasingly available in the form of images, videos, text and music. For example, in the OSN Flickr (Flickr, 2004), people share pictures from different sources, as well as archive their favorite pictures separately. Data mining and data analysis can be applied to these images to reconstruct the cognitive rules of image aesthetics.

In this research, our primary focus is to identify a person's gender from visual aesthetics. According to Cela et al. (2009), significant differences exist between males and females in the neural correlates of their aesthetic preferences. Moreover, males and females have different physiology and cognitive processing (Buss et al., 1992); the authors infer that the observed gender-related differences are a product of human evolution. In the last decade, research has emerged in the area of web and image aesthetics (Moss et al., 2006; Moss & Gunn, 2007; You et al., 2014).

Demographic information such as gender, age, height and hair color can be combined with primary biometric traits to enhance the performance of biometric security for humans (Dantcheva et al., 2016) or robots (Yampolskiy et al., 2012). These traits are known as soft biometrics. Soft biometrics are also extensively used in forensic analysis to cluster potential suspects based on their gender or age. It is very common to utilize primary biometric traits such as fingerprint, iris, face and gait to extract gender information (Dantcheva et al., 2016). However, extracting gender information from image aesthetics has hardly been explored by researchers, despite several application areas such as aesthetic-based recommender systems and forensic data extraction (Lovato et al., 2014). In this article, we propose a gender prediction model utilizing human visual aesthetic features. The preliminary proof of the hypothesis that gender prediction from image aesthetics is possible was presented at the ICCI*CC 2016 Conference (Azam & Gavrilova, 2016a). The current article is an extended version of that conference paper, invited to IJCINI following the conference. As an extension, we provide a statistical feature analysis to visualize the discriminating behavior of individual features. The proposed method utilizes 14 different types of image perceptual features, stored in a 56-dimensional feature vector. To improve prediction accuracy and speed up the training and testing process, a statistical (t-test) feature selection method is applied. The model is based on a mixture of three conventional but powerful classifiers: support vector machine, k-nearest neighbor and decision tree. Instead of giving each classifier equal influence, we apply a weighted combination of their individual probabilities, which improves the ensemble performance. The model is tested on a real image database collected from 120 Flickr users, available upon request. Among them, 60 are males and the rest are females. Each user provided 200 of his or her favorite images. Figure 1 shows some example images from a male and a female Flickr user. The proposed method correctly predicts the gender of 92 out of 120 users based on aesthetic preferences alone. We also present a statistical analysis of features to understand the discriminating behavior of aesthetic features between males and females.
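
As a rough illustration of the weighted combination of the three experts described above, the following sketch sums each classifier's class-probability estimates under fixed weights. The weights and hyperparameters shown are placeholder assumptions, not the values derived in the paper.

# Hedged sketch of a weighted mixture of the three experts (SVM, k-nearest
# neighbor, decision tree). Weights and hyperparameters are placeholders,
# not the values used by the authors.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

class WeightedMixtureOfExperts:
    def __init__(self, weights=(0.4, 0.3, 0.3)):
        # probability=True lets the SVM expose class-probability estimates
        self.experts = [SVC(probability=True),
                        KNeighborsClassifier(n_neighbors=5),
                        DecisionTreeClassifier()]
        self.weights = weights

    def fit(self, X, y):
        for expert in self.experts:
            expert.fit(X, y)
        return self

    def predict(self, X):
        # Weighted sum of each expert's class probabilities (rather than an
        # equal-influence vote), then pick the class with the highest score.
        proba = sum(w * e.predict_proba(X)
                    for w, e in zip(self.weights, self.experts))
        return self.experts[0].classes_[np.argmax(proba, axis=1)]

# Example use, on the t-test-reduced aesthetic features:
# model = WeightedMixtureOfExperts().fit(X_train, y_train)
# y_pred = model.predict(X_test)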
