Modification of Happiness Expression in Face Images


Dao Nam Anh, Trinh Minh Duc
Copyright: © 2017 | Pages: 12
DOI: 10.4018/IJNCR.2017070104
Abstract

Facial expression detection and adjustment touches complex psychological aspects of vision and is central to a number of visual and cognitive computing applications. This article presents an algorithm for automatically estimating the happiness expression of face images whose demographic aspects, such as race, gender and eye direction, are changeable. The method is further extended to alter the level of happiness expression in face images, and a weighted modification scheme is proposed for enhancing the happiness expression. The authors employ a robust face representation that combines color patch similarity and the self-resemblance of image patches. A large set of face images exhibiting these properties is learned in a statistical model for interpreting the facial expression of happiness. The authors report experiments with such a model, using the face features for learning by SVM, and analyze its performance.

1. Introduction

Detecting personal characteristics in face images is an essential problem involving multiple operations in psychological and visual analysis. The observation of face images from a community with a variety of happiness expressions would benefit from a productive method of perceiving challenging situations in order to recognize happiness (Ozer et al., 2006). Since much work has been devoted to this and to associated fields, such as perceptible personal characteristics from a statistical point of view, the field has seen considerable progress. From this viewpoint, we find that happiness expression can be used in reasoning, in particular when accessing the distributions of image features such as color and patch structures.

Using such analysis, the facial expression question can be interpreted as a matching problem. Because actual facial scenes often involve happiness expressions that are not identical, we need to consider wide variation. This consistently leads to a suitable concept of face image representation that preserves the significant appearance of facial happiness. There are two main aspects we would like such a concept to cover: (a) the image features should be generic enough to represent image examples efficiently, and (b) the features must be explicit enough to specify facial characteristics. This places strict conditions on the search for pattern features for judging happiness in face images, since a proper selection of features in actual facial scenes is often hard to obtain. The most recent advances in facial expression detection adopt local image features to encode image structure in spatial neighborhoods, where appropriate features are required to provide locality, invariance, distinctiveness and repeatability (Brown & Lowe, 2007). The features detected from a set of face image examples are used to learn the happiness expression distribution, and the estimation of happiness is obtained by a reasoning algorithm (Murphy, 2012; Cootes & Taylor, 2001). This method works well for spatial image features, but extending it with personal characteristics such as age and sex can improve the correctness of the happiness expression estimation.
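To make this learning step concrete, the following minimal sketch (Python with scikit-learn) trains an SVM on local patch statistics concatenated with demographic attributes and reads the predicted class probability as a happiness-level estimate. The patch statistics, the toy demographic coding and the random data are illustrative assumptions made here; they stand in for the color patch similarity and self-resemblance features described later in the paper.

# A minimal sketch, not the authors' exact pipeline: local patch features plus
# demographic attributes are concatenated and fed to an SVM that predicts a
# happiness label.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def patch_features(face, patch=8):
    """Mean and std of non-overlapping patches of a grayscale face crop."""
    h, w = face.shape
    feats = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            p = face[y:y + patch, x:x + patch]
            feats.extend([p.mean(), p.std()])
    return np.asarray(feats)

def build_sample(face, demographics):
    """Concatenate local appearance features with demographic attributes
    (e.g. coded race, gender, eye direction)."""
    return np.concatenate([patch_features(face), demographics])

# Toy data: 200 random 64x64 "faces", 3 demographic codes, binary labels.
rng = np.random.default_rng(0)
faces = rng.random((200, 64, 64))
demo = rng.integers(0, 2, size=(200, 3)).astype(float)
labels = rng.integers(0, 2, size=200)          # 1 = happy, 0 = not happy

X = np.stack([build_sample(f, d) for f, d in zip(faces, demo)])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X, labels)

# The probability of the "happy" class serves as a happiness-level estimate.
print(clf.predict_proba(X[:5])[:, 1])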

In this paper, we outline an application that supports recognition and modification of the happiness expression of face images using local image features together with personal characteristics. We first provide a model of rational judgments to demonstrate how well-combined features can raise the reliability of estimating the happiness appearance of face images. This is achieved by implementing a relevance analysis of the features that a face image has and studying the efficacy of different image features against facial object specification. We then describe estimators based on the relevance analysis in order to evaluate the level of happiness expression of a face image in the most efficient manner. In the next stage, face image modification aims at enhancing the happiness expression by gathering target happy face images and adjusting the image features while keeping essential personal qualities; see the two examples in Figure 1. A simplified sketch of this adjustment follows Figure 1.

Figure 1. Face image modification for enhancing the happiness expression
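As an illustration of the modification stage, the sketch below blends a source face toward a target happy face through a spatial weight map, so that expression-related regions move while the rest of the face stays close to the original. The mouth-centred Gaussian weight and the blending strength are hypothetical choices made for this example; the paper derives its weighted modification from the learned features.

# A minimal sketch of the weighted-modification idea, under the assumption that
# the adjustment can be approximated by blending the source face toward a
# target happy face with a spatial weight map.
import numpy as np

def gaussian_weight_map(shape, center, sigma):
    """Weight map peaking at `center`, so only expression-related regions move."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    d2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def modify_happiness(source, target, strength=0.6, mouth_center=None, sigma=12.0):
    """Blend source toward target where the weight map is high, keeping the
    rest of the face (identity, demographics) close to the original."""
    if mouth_center is None:
        # Hypothetical assumption: the mouth sits at about 3/4 of the crop height.
        mouth_center = (int(source.shape[0] * 0.75), source.shape[1] // 2)
    w = strength * gaussian_weight_map(source.shape, mouth_center, sigma)
    return (1.0 - w) * source + w * target

# Toy usage with random 64x64 grayscale crops standing in for aligned faces.
rng = np.random.default_rng(1)
src, tgt = rng.random((64, 64)), rng.random((64, 64))
out = modify_happiness(src, tgt, strength=0.8)
print(out.shape, float(abs(out - src).mean()))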

In the remainder of this paper we present our method for modeling the relationship between personal characteristics and geometric local appearance features. We outline how the model can be used in image perception and describe a feasible application for face image interpretation. Finally, we discuss the consistency and the weaknesses of the method, and draw conclusions.


2. Prior Works

The variability of facial expressions and the complexity of psychological emotions (Ekman, 1993) make face image recognition and analysis a challenging task. There has been considerable interest in the study of emotional measures for the physical appearance of faces. One motivation is to achieve reliability and validity of the measurement in interpreting face images by using appropriate machine learning methods, whose central theme is the suitable selection of image features (Boucher & Le, 2005) to represent facial objects. This section presents related work on patch color similarity (Bhateja et al., 2015) and self-resemblance, both of which are used in our study (Figure 2).
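The following rough sketch shows how the two cues can be computed in a simplified form: patch color similarity is taken here as one minus the mean absolute color difference between two patches, and the self-resemblance of a pixel as the average similarity of its patch to the shifted patches in a small surrounding window. Both definitions are illustrative stand-ins for the formulations in the cited works.

# A rough sketch of the two patch cues, with simplifying assumptions about how
# similarity and self-resemblance are measured.
import numpy as np

def color_patch_similarity(p, q):
    """Similarity in [0, 1]: 1 means identical patches (values assumed in [0, 1])."""
    return 1.0 - float(np.abs(p - q).mean())

def self_resemblance_map(img, patch=5, search=3):
    """For each pixel, average similarity of its patch to shifted patches
    within a (2*search+1)^2 neighbourhood."""
    h, w = img.shape[:2]
    r = patch // 2
    pad = r + search
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            cy, cx = y + pad, x + pad
            ref = padded[cy - r:cy + r + 1, cx - r:cx + r + 1]
            sims = []
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    if dy == 0 and dx == 0:
                        continue
                    cand = padded[cy + dy - r:cy + dy + r + 1,
                                  cx + dx - r:cx + dx + r + 1]
                    sims.append(color_patch_similarity(ref, cand))
            out[y, x] = np.mean(sims)
    return out

# Toy usage on a random 32x32 RGB crop standing in for a face region.
face = np.random.default_rng(2).random((32, 32, 3))
print(self_resemblance_map(face).shape)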
