Incremental and Decremental Exponential Discriminant Analysis for Face Recognition

Nitin Kumar, R.K. Agrawal, Ajay Jaiswal
Copyright: © 2014 | Pages: 16
DOI: 10.4018/ijcvip.2014010104

Abstract

Linear Discriminant Analysis (LDA) is widely used for feature extraction in face recognition but suffers from the small sample size (SSS) problem in its original formulation. Exponential Discriminant Analysis (EDA) is one of the variants of LDA suggested recently to overcome this problem. For many real-time systems, it may not be feasible to have all the data samples available before the actual model is developed; new data samples may arrive in chunks at different points of time. In this paper, the authors propose an incremental formulation of EDA to avoid learning from scratch. The proposed incremental algorithm requires less computation time and memory. Experiments are performed on three publicly available face datasets. Experimental results demonstrate the effectiveness of the proposed incremental formulation in comparison to its batch formulation in terms of computation time and memory requirement. Also, the proposed incremental and decremental algorithms (IEDA, DEDA) outperform the incremental formulation of LDA in terms of classification accuracy.

Introduction

Appearance-based methods (Turk & Pentland, 1991; Murase & Nayar, 1995) have been widely used in the recent past for feature extraction in face recognition. In these methods, the facial image of a person with size l×w pixels is represented as a vector in n-dimensional space, where n = l×w. Such image data are usually characterized by high dimensionality and small sample size (SSS). This necessitates dimensionality reduction to avoid the curse of dimensionality (Bellman, 1961) prior to developing any learning model. It also improves the performance of the learning model in terms of accuracy, computation time and memory storage. One of the popular dimensionality reduction methods is Linear Discriminant Analysis (LDA) (Fukunaga, 1990; Duda, Hart & Stork, 2000), a supervised technique which aims at finding an optimal transformation matrix W that simultaneously maximizes the between-class scatter and minimizes the within-class scatter. If Sb and Sw denote the between-class and within-class scatter matrices respectively, then Fisher's criterion is given by (Duda, Hart & Stork, 2000):

$$J(W) = \frac{\left|W^{T} S_b W\right|}{\left|W^{T} S_w W\right|}$$
(1)
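
Both scatter matrices can be assembled directly from the training samples. The snippet below is only a minimal sketch of that computation (not the authors' implementation), assuming the flattened face images are stacked as rows of a NumPy array X with integer class labels in y; the helper name scatter_matrices is introduced here purely for illustration.

```python
import numpy as np

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices.

    X : (N, n) array, one flattened l*w face image per row
    y : (N,)  array of integer class labels
    """
    overall_mean = X.mean(axis=0)
    n = X.shape[1]
    Sb = np.zeros((n, n))
    Sw = np.zeros((n, n))
    for c in np.unique(y):
        Xc = X[y == c]                        # samples of class c
        mean_c = Xc.mean(axis=0)
        d = (mean_c - overall_mean)[:, None]  # class-mean deviation (column vector)
        Sb += Xc.shape[0] * (d @ d.T)         # between-class term, weighted by class size
        Sw += (Xc - mean_c).T @ (Xc - mean_c) # within-class scatter of class c
    return Sb, Sw
```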

The optimal transformation matrix W is computed by solving the following generalized eigenvalue decomposition problem:

$$S_b w_i = \lambda_i S_w w_i$$
(2)
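
In practice, the generalized eigenproblem in Equation (2) can be solved with a standard routine such as scipy.linalg.eigh. The sketch below reuses the scatter matrices from the previous snippet; the small ridge added to Sw is only a numerical convenience of this illustration, to keep the solver well defined when Sw is near-singular, and is not part of LDA itself.

```python
import numpy as np
from scipy.linalg import eigh

def lda_transform(Sb, Sw, d):
    """Solve Sb w = lambda Sw w and keep the d leading eigenvectors."""
    # Ridge term: assumed here for numerical stability only.
    ridge = 1e-6 * np.trace(Sw) / Sw.shape[0]
    evals, evecs = eigh(Sb, Sw + ridge * np.eye(Sw.shape[0]))
    order = np.argsort(evals)[::-1]   # largest eigenvalues first
    return evecs[:, order[:d]]        # columns span the LDA subspace
```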

The resultant transformation provides a more compact representation of the original data while preserving the features salient for classification. LDA has been successfully applied in situations where there are enough samples to be analyzed. However, when the dimensionality of the samples is large compared with the number of samples, LDA suffers from the SSS problem: in such situations, the within-class scatter matrix Sw becomes singular.
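
This singularity is easy to verify numerically: with N training samples drawn from C classes, the rank of Sw is at most N − C, so Sw cannot be inverted whenever the pixel count n exceeds N − C. A small illustrative check on synthetic data, reusing the scatter_matrices helper sketched above:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 1024))   # 20 images of 32x32 = 1024 pixels
y = np.repeat(np.arange(4), 5)        # 4 subjects, 5 images each
Sb, Sw = scatter_matrices(X, y)       # helper from the earlier sketch
print(np.linalg.matrix_rank(Sw))      # 16 (= N - C), far below n = 1024
```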

In the literature, several approaches have been proposed to address the SSS problem. Raudys and Duin (1998) suggested replacing the inverse of Sw with its pseudoinverse. Belhumeur et al. proposed Fisherfaces (Belhumeur, Hespanha & Kriegman, 1997), which employs principal component analysis (PCA) (Duda, Hart & Stork, 2000) for dimensionality reduction prior to LDA (a sketch of this two-stage pipeline is given after Equation (3)). However, some discriminant information useful for classification may be lost in the PCA transformation. Chen et al. (Chen, Liao, Ko, Lin & Yu, 2000) proposed null-space LDA (NLDA), which modifies the Fisher criterion as:

$$W_{\mathrm{opt}} = \arg\max_{W^{T} S_w W = 0} \left|W^{T} S_b W\right|$$
(3)
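
For contrast with NLDA, the Fisherfaces approach mentioned above simply composes a PCA projection with the ordinary LDA step. The sketch below illustrates that two-stage idea using the helpers introduced earlier; pca_dim and lda_dim are free parameters of this illustration, not values taken from the paper.

```python
import numpy as np

def fisherfaces(X, y, pca_dim, lda_dim):
    """PCA followed by LDA, in the spirit of Fisherfaces (a sketch only).

    Reuses scatter_matrices() and lda_transform() from the earlier sketches;
    choosing pca_dim <= N - C keeps Sw nonsingular in the reduced space.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    # PCA basis from the SVD of the centred data (top pca_dim components)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:pca_dim].T                  # (n, pca_dim) PCA projection
    Z = Xc @ P                          # images in the PCA subspace
    Sb, Sw = scatter_matrices(Z, y)
    W = lda_transform(Sb, Sw, lda_dim)
    return P @ W                        # combined (n, lda_dim) projection
```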
