Swarm Intelligence for Non-Negative Matrix Factorization

Andreas Janecek, Ying Tan
Copyright: © 2011 | Pages: 23
DOI: 10.4018/jsir.2011100102

Abstract

The Non-negative Matrix Factorization (NMF) is a special low-rank approximation which allows for an additive, parts-based, and interpretable representation of the data. This article presents efforts to improve the convergence, approximation quality, and classification accuracy of NMF using five different meta-heuristics based on swarm intelligence. Several properties of the NMF objective function motivate the use of meta-heuristics: this function is non-convex, discontinuous, and may possess many local minima. The proposed optimization strategies are two-fold: on the one hand, a new initialization strategy for NMF is presented in order to initialize the NMF factors prior to the factorization; on the other hand, an iterative update strategy is proposed which improves the accuracy per runtime for the multiplicative update NMF algorithm. The success of the proposed optimization strategies is shown by applying them to synthetic data and to data sets from the area of spam filtering/email classification, and by evaluating them in their application context. Experimental results show that both optimization strategies are able to improve NMF in terms of faster convergence, lower approximation error, and better classification accuracy. In particular, the initialization strategy leads to significant reductions of the runtime-per-accuracy ratio for both the NMF approximation and the classification results achieved with NMF.

1. Introduction

Low-rank approximations are utilized in several content-based retrieval and data mining applications, such as text and multimedia mining and web search, and achieve a more compact representation of the data with only limited loss of information. They reduce storage and runtime requirements, and also reduce redundancy and noise in the data representation while capturing the essential associations. The Non-negative Matrix Factorization (NMF) (Lee & Seung, 1999) leads to a low-rank approximation which satisfies non-negativity constraints. NMF approximates an m×n data matrix A by A ≈ WH, where the NMF factors W and H are of size m×k and k×n, respectively, with k ≪ min(m, n). NMF requires all entries in A, W, and H to be zero or positive. Contrary to other low-rank approximations such as the Singular Value Decomposition (SVD), these constraints force NMF to produce so-called “additive parts-based” representations. This is an impressive benefit of NMF, since it makes the interpretation of the NMF factors much easier than for factors containing positive and negative entries (Berry, Browne, Langville, Pauca, & Plemmons, 2007; Janecek & Gansterer, 2010; Lee & Seung, 1999).
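To make the factorization model concrete, the following Python sketch implements the standard multiplicative update rules of Lee and Seung (1999) for the Frobenius norm ||A − WH||; the function name, parameter values, and random initialization shown here are illustrative assumptions rather than the implementation used in this article.

```python
import numpy as np

def nmf_multiplicative(A, k, n_iter=200, eps=1e-9, seed=0):
    """Minimal NMF sketch: A (m x n) ~ W (m x k) @ H (k x n), Frobenius norm."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Random non-negative initialization; one contribution of the article is
    # to replace this step by a swarm-intelligence-based initialization.
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(n_iter):
        # Lee & Seung multiplicative updates; eps avoids division by zero.
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factorize a random non-negative 100 x 50 matrix with rank k = 10.
A = np.random.rand(100, 50)
W, H = nmf_multiplicative(A, k=10)
print(np.linalg.norm(A - W @ H, "fro"))
```

Because the updates are element-wise and multiplicative, they never introduce negative entries, so the non-negativity constraints are preserved without an explicit projection step.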

The NMF is usually not unique if different initializations of the factors W and H are used. Moreover, there are several different NMF algorithms which all follow different strategies (e.g., mean squared error, least squares, gradient descent, ...) and produce different results. Mathematically, the goal of NMF is to find a “good” (ideally the best) solution of an optimization problem with bound constraints of the form min_{x∈Ω} f(x), where f is the nonlinear objective function of NMF and Ω is the feasible region (for NMF, Ω is restricted to non-negative values). f is usually not convex, is discontinuous, and may possess many local minima (Stadlthanner, Lutter, Theis, Lang, Tome, Georgieva, & Puntonet, 2007). Since meta-heuristic optimization algorithms are known to deal well with such difficulties, they seem to be a promising choice for improving the quality of NMF. Over the last decades, nature-inspired meta-heuristics, including those based on swarm intelligence, have gained much popularity due to their applicability to various optimization problems. They benefit from the fact that they are able to find acceptable results within a reasonable amount of time for many complex, large, and dynamic problems (Blackwell, 2007). Although they cannot guarantee the optimal solution for a given problem (comparable to NMF in this respect), it has been shown that they are able to tackle various kinds of real-world optimization problems (Chiong, 2009). Meta-heuristics as well as the principles of NMF are in accordance with the law of sufficiency (Kennedy, Eberhart, & Shi, 2001): if a solution to a problem is good enough, fast enough, and cheap enough, then it is sufficient.
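This article investigates five swarm-intelligence meta-heuristics; as one representative of that family, the sketch below shows a generic particle swarm optimization (PSO) loop minimizing an objective over a non-negative search space. All names, parameter settings, and the toy rank-one NMF objective used in the usage example are illustrative assumptions, not the algorithms or configurations evaluated in the article.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic PSO sketch for min f(x) subject to x >= 0 (illustrative only)."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, dim))        # particle positions, all >= 0
    v = np.zeros((n_particles, dim))          # particle velocities
    pbest = x.copy()                          # personal best positions
    pbest_val = np.array([f(p) for p in x])   # personal best objective values
    gbest = pbest[pbest_val.argmin()].copy()  # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive term + social term.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        # Position update, clamped to the non-negative feasible region.
        x = np.maximum(x + v, 0.0)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Usage: search for a non-negative rank-1 factor pair (w, h) of a small matrix.
A = np.random.rand(5, 4)

def frob_error(z):
    w_col, h_row = z[:5].reshape(5, 1), z[5:].reshape(1, 4)
    return np.linalg.norm(A - w_col @ h_row, "fro")

best, err = pso_minimize(frob_error, dim=9)
print(err)
```

Clamping the positions to zero is one simple way of keeping the swarm inside the non-negative feasible region Ω; more elaborate constraint-handling schemes are of course possible.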
