Human Face Recognition using Gabor Based Kernel Entropy Component Analysis


Arindam Kar, Debotosh Bhattacharjee, Dipak Kumar Basu, Mita Nasipuri, Mahantapas Kundu
Copyright © 2012 | Pages: 20
DOI: 10.4018/ijcvip.2012070101

Abstract

In this paper, the authors present a novel Gabor wavelet based Kernel Entropy Component Analysis (KECA) method that integrates the Gabor wavelet transformation (GWT) of facial images with KECA for enhanced face recognition performance. First, the most discriminative facial features, characterized by spatial frequency, spatial locality, and orientation selectivity, are derived from the Gabor wavelet transformed images to cope with variations due to illumination and facial expression changes. Next, KECA, which is based on the Renyi entropy, is extended to include the cosine kernel function. KECA with the cosine kernel is then applied to the extracted discriminative feature vectors to obtain only those real kernel ECA eigenvectors that are associated with eigenvalues having a positive entropy contribution. Finally, these real KECA features are used for image classification with the L1, L2, and Mahalanobis distance measures and the cosine similarity measure. The feasibility of the Gabor based KECA method with the cosine kernel has been successfully tested on both frontal and pose-angled face recognition, using datasets from the ORL, FRAV2D, and FERET databases.
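
As a concrete illustration of the Gabor feature extraction step summarized above, the sketch below builds a bank of complex Gabor kernels and stacks their magnitude responses into one feature vector, as is typical for Gabor face features. The specific parameters (5 scales, 8 orientations, 31x31 kernels, and the sigma/wavelength progression) are common defaults assumed for illustration; they are not values taken from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size, sigma, theta, lam):
    """Complex Gabor kernel: a Gaussian envelope times a complex
    sinusoidal carrier at orientation theta and wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates to the filter orientation.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.exp(1j * 2 * np.pi * xr / lam)
    return envelope * carrier

def gabor_features(image, scales=5, orientations=8):
    """Concatenate the magnitude responses of the whole filter bank
    into a single (high-dimensional) feature vector."""
    feats = []
    for s in range(scales):          # spatial frequency (scale)
        for o in range(orientations):  # orientation selectivity
            k = gabor_kernel(31,
                             sigma=2.0 * 1.3**s,
                             theta=o * np.pi / orientations,
                             lam=4.0 * 1.3**s)
            resp = fftconvolve(image, k, mode='same')
            feats.append(np.abs(resp).ravel())
    return np.concatenate(feats)
```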

1. Introduction

Face authentication has gained considerable attention in recent years due to the increasing need for access verification systems, surveillance, security monitoring, and so on. Such systems are used to verify a user's identity on the internet, in automated banking, for authentication at entry to secured buildings, etc. Face recognition involves recognizing personal identity based on geometric or statistical features derived from face images. Even though humans can identify faces with ease, developing an automated system that accomplishes this task is very challenging. The challenges are even more intense when there are large variations in illumination conditions, viewing directions or poses, facial expression, aging, etc.

Robust face recognition schemes require features with a low-dimensional representation for storage purposes and enhanced discrimination abilities for subsequent image retrieval. Representation methods usually start with a dimensionality reduction procedure, as the high dimensionality of the original visual space makes statistical estimation very difficult, if not impossible, because high-dimensional space is mostly empty. Data transformation is of fundamental importance in pattern analysis and machine intelligence: the goal is to transform potentially high-dimensional data into an alternative, typically lower-dimensional representation that reveals the underlying structure of the data. The best-known method for this is principal component analysis (PCA) (Jolliffe, 1986; Smith, 2002), which is based on the data correlation matrix. It is a linear method ensuring that the transformed data are uncorrelated and maximally preserve the second-order statistics of the original data. Such spectral methods are based on selecting the top or bottom eigenvalues (spectrum) and eigenvectors of specially constructed data matrices. Another linear method is metric multidimensional scaling (MDS), which preserves inner products; metric MDS and PCA can be shown to be equivalent (Hotelling, 1933; Williams, 2002).

In recent years, a number of advanced nonlinear spectral data transformation methods have been proposed. A very influential method is kernel PCA (KPCA) (Heo, Gader, & Frigui, 2009). KPCA performs traditional PCA in a so-called kernel feature space, which is nonlinearly related to the input space (Schölkopf, Smola, & Müller, 1998). This is enabled by a positive semidefinite (psd) kernel function, which computes inner products in the kernel feature space. An inner-product matrix, or kernel matrix, can thus be constructed, and performing metric MDS on the kernel matrix, based on its top eigenvalues, provides the KPCA data transformation. KPCA has been used in many contexts. A spectral clustering method was, for example, developed in Jenssen and Eltoft (2008), where C-means (Zhang & Chen, 2004) is performed on the KPCA eigenvectors. KPCA has also been used for pattern de-noising (Jade, Srikanth, Jayaraman, Kulkarni, Jog, & Priya, 2003; MacQueen, 1967; Kwok & Tsang, 2004; Li, Li, & Tao, 2008) and classification (Mika, Schölkopf, Smola, Müller, Scholz, & Rätsch, 1999). Many other spectral methods exist, which differ in the way the data matrices are constructed: in Mika, Schölkopf, Smola, Müller, Scholz, and Rätsch (1999), for example, the data transformation is based on the normalized Laplacian matrix, and clustering is performed using C-means on unit-norm spectral data. Manifold learning and other clustering variants using the Laplacian matrix also exist (Ng, Jordan, & Weiss, 2002).
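
To make the KPCA transformation described above concrete, the following is a minimal sketch: build a kernel matrix, center it in feature space, and project the data onto its top eigenvectors. The RBF kernel and its width sigma are assumptions chosen for illustration, not choices taken from the paper.

```python
import numpy as np

def kpca_transform(X, n_components, sigma=1.0):
    """Kernel PCA sketch with an RBF kernel: project training data
    onto the top eigenvectors of the centered kernel matrix."""
    # Pairwise squared distances and RBF kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-d2 / (2 * sigma**2))
    # Center the kernel matrix in feature space.
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition; eigh returns eigenvalues in ascending order.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]  # keep top eigenvalues
    vals, vecs = vals[idx], vecs[:, idx]
    # Projections of the training points: sqrt(lambda_i) * e_i.
    return vecs * np.sqrt(np.maximum(vals, 0))
```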

Kernel Entropy Component Analysis (KECA) (Jenssen, 2010) is a recently proposed spectral data transformation method and, to date, the only one founded on information theory. It is directly related to the Renyi entropy of the input-space data set via a kernel-based Renyi entropy estimator, which is expressed in terms of projections onto the principal axes of the kernel feature space. The KECA transformation is based on the most entropy-preserving axes. To the best of the authors' knowledge, the present work is possibly the first application of Gabor based KECA to a face recognition or image classification problem.
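
The selection rule that distinguishes KECA from KPCA is easy to state in code: rank the kernel eigenpairs by their contribution lambda_i * (1^T e_i)^2 to the Renyi entropy estimate V = (1/N^2) * 1^T K 1, rather than by eigenvalue magnitude alone. The sketch below follows the formulation in Jenssen (2010); the cosine kernel shown is the normalized inner product, which we take to be what the abstract calls the cosine kernel function.

```python
import numpy as np

def cosine_kernel(X):
    """Cosine kernel: inner products of length-normalized feature vectors."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    Xn = X / np.maximum(norms, 1e-12)
    return Xn @ Xn.T

def keca_transform(K, n_components):
    """Kernel ECA sketch: keep the eigenvectors of the (uncentered)
    kernel matrix K that contribute most to the Renyi entropy estimate."""
    n = K.shape[0]
    vals, vecs = np.linalg.eigh(K)       # ascending eigenvalues
    ones = np.ones(n)
    # Entropy contribution of each eigenpair: lambda_i * (1^T e_i)^2.
    contrib = vals * (vecs.T @ ones) ** 2
    idx = np.argsort(contrib)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Projections onto the most entropy-preserving axes.
    return vecs * np.sqrt(np.maximum(vals, 0))

# Example use (hypothetical): KECA on Gabor feature vectors.
# Z = keca_transform(cosine_kernel(gabor_feature_matrix), n_components=50)
```

Note that, unlike KPCA, the kernel matrix is not centered here, and the retained axes need not correspond to the largest eigenvalues, which is precisely what makes the transformation entropy-preserving rather than variance-preserving.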
