Tensor Independent Component Analysis and Tensor Non-Negative Factorization

David Zhang, Fengxi Song, Yong Xu, Zhizhen Liang
DOI: 10.4018/978-1-60566-200-8.ch010


In this chapter, we describe two tensor-based subspace analysis approaches (tensor ICA and tensor NMF) that can be used in many fields like face recognition and other biometric recognition. Section 10.1 gives the background and development of the two tensor-based subspace analysis approaches. Section 10.2 introduces tensor independent component analysis. Section 10.3 presents tensor nonnegative factorization. Section 10.4 discusses some potential applications of these two subspace analysis approaches in biometrics. Finally, we summarize this chapter in Section 10.5.
Chapter Preview


Independent component analysis (ICA) (Hyvärinen & Oja, 2001) is a statistical signal processing technique. The basic idea of ICA is to represent a set of random variables as a combination of basis functions whose components are statistically independent, or as independent as possible. In general, there are two arguments for using ICA for image representation and recognition. First, the high-order relationships among image pixels may contain important information for recognition tasks. Second, ICA seeks directions such that the projections of the data onto those directions have maximally non-Gaussian distributions, which may be useful for classification tasks. In addition, ICA can be viewed as a generalization of PCA, since it is concerned not only with the second-order dependencies between variables but also with the high-order dependencies between them.
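The idea above can be illustrated with a small blind source separation sketch. This is a toy example, not from the chapter: scikit-learn's FastICA (the fixed-point algorithm of Hyvärinen & Oja) is used as one possible implementation, and the two synthetic sources are chosen only because they are non-Gaussian and independent.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
# Two statistically independent, non-Gaussian sources
s1 = np.sin(2 * t)            # sinusoid
s2 = np.sign(np.cos(3 * t))   # square wave
S = np.c_[s1, s2]

# Observations are unknown linear mixtures of the sources
A = np.array([[1.0, 0.5], [0.4, 1.0]])  # mixing matrix (unknown to ICA)
X = S @ A.T

# ICA recovers components that are as independent (non-Gaussian) as possible
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)

# Each recovered component correlates strongly with one true source
# (ICA cannot determine permutation or sign, hence the abs and max)
corr = np.abs(np.corrcoef(S.T, S_est.T)[:2, 2:])
print(corr.max(axis=1))
```

Note that PCA applied to the same mixtures would only decorrelate them (second-order statistics); ICA's use of higher-order statistics is what lets it unmix the sources.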

During the past several years, the ICA algorithm has been widely used in face recognition and in biomedical data analysis. Bartlett and Sejnowski (1997) demonstrated, on a set of 200 face images, that recognition accuracy using ICA basis vectors is higher than that obtained with PCA basis vectors. They found that the ICA representation of faces is invariant to large changes in pose and to small changes in illumination. In Bartlett, Movellan and Sejnowski (2002), the authors first organized the database into a matrix X in which each row vector is a different image. In this representation, the images are random variables and the pixels are trials, so it makes sense to talk about the independence of images or of functions of images: two images i and j are independent if, moving across pixels, the pixel values of one image cannot be predicted from the corresponding pixel values of the other. They then transposed the matrix X and organized the data so that the images are in the columns of X. In this representation, the pixels are random variables and the images are trials, and it makes sense to talk about the independence of pixels or of functions of pixels: two pixels i and j are independent if, moving across the entire set of images, the values taken by one pixel cannot be predicted from the values taken by the other. Based on these two views of the data, they proposed two ICA architectures (ICA Architectures I and II) for face representation and used the Infomax algorithm (Bell & Sejnowski, 1995, 1997) to implement ICA. Both architectures were evaluated on a subset of the FERET face database and were found to be effective for face recognition.

Yuen and Lai (2000, 2002) adopted the fixed-point algorithm to obtain the independent components (ICs) and used a Householder transform to obtain the least-squares solution of a face image for representation. Liu and Wechsler (1999, 2003) assessed the performance of ICA for face identification. All of these researchers claimed that ICA outperforms PCA in face recognition. Other researchers, however, reported differently.
Baek, Draper, Beveridge, and She (2002) reported that PCA outperforms ICA, while Moghaddam (2002) and Jin and Davoine (2004) reported no significant performance difference between the two methods. Socolinsky and Selinger (2002) reported that ICA outperforms PCA on visible-light images, but that PCA outperforms ICA on infrared images.
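The two data organizations behind ICA Architectures I and II can be sketched on toy data. This sketch uses scikit-learn's FastICA as a stand-in for the Infomax algorithm that Bartlett et al. actually employed, and the matrix sizes and random "images" are purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
n_images, n_pixels = 40, 100           # toy sizes; real face images are far larger
X = rng.rand(n_images, n_pixels)       # each row is one flattened image

# Architecture I: images are random variables, pixels are trials.
# Feed ICA the transpose (pixels as observations); each recovered
# component is a function over pixels, i.e. an independent basis image.
ica1 = FastICA(n_components=10, random_state=0, max_iter=1000)
basis_images = ica1.fit_transform(X.T).T   # shape (10, n_pixels)

# Architecture II: pixels are random variables, images are trials.
# Feed ICA the matrix as-is; each image is represented by a vector of
# statistically independent coefficients (a factorial code).
ica2 = FastICA(n_components=10, random_state=0, max_iter=1000)
codes = ica2.fit_transform(X)              # shape (n_images, 10)

print(basis_images.shape, codes.shape)
```

In face-recognition experiments, Architecture I yields spatially localized independent basis images, whereas Architecture II yields independent per-image coefficients; which representation works better is exactly the kind of question the conflicting reports above were probing.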
