Abstract
Although nonnegative matrix factorization (NMF) favors a parts-based and sparse representation of its input, there is no guarantee of this behavior. Several extensions to NMF have been proposed that introduce sparseness via the ℓ1-norm, while little work has been done with the more natural sparseness measure, the ℓ0-pseudo-norm. In this work we propose two NMF algorithms with ℓ0-sparseness constraints on the basis matrix and the coefficient matrix, respectively. We show that classic NMF [1] is a well-suited tool for ℓ0-sparse NMF algorithms, owing to a property we call sparseness maintenance. We apply our algorithms to synthetic and real-world data and compare our results to sparse NMF [2] and nonnegative KSVD [3].
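To illustrate the idea of combining classic multiplicative NMF updates with a hard ℓ0 constraint on the coefficients, the following minimal NumPy sketch alternates an unconstrained NMF phase with a per-column hard-thresholding projection of H and a refinement phase. This is not the algorithm from the paper; the function name `l0_sparse_nmf`, the projection strategy, and all parameters are illustrative assumptions.

```python
import numpy as np

def l0_sparse_nmf(V, r, k, n_iter=200, n_refine=50, eps=1e-9, seed=0):
    """Illustrative sketch (not the paper's algorithm): classic multiplicative
    NMF updates combined with a hard l0 projection that keeps at most k
    nonzero entries per column of the coefficient matrix H."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps

    def update(W, H):
        # Lee-Seung multiplicative updates for the Frobenius cost [1]
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H

    for _ in range(n_iter):                  # unconstrained NMF phase
        W, H = update(W, H)

    if k < r:                                # l0 projection of H
        idx = np.argsort(H, axis=0)[:-k, :]  # indices of the r-k smallest entries per column
        np.put_along_axis(H, idx, 0.0, axis=0)

    for _ in range(n_refine):                # refinement: multiplicative updates
        W, H = update(W, H)                  # keep zero entries at zero, so each
                                             # column of H stays at most k-sparse
    return W, H

# Usage example: at most 3 active basis vectors per column of H
V = np.abs(np.random.default_rng(1).random((50, 100)))
W, H = l0_sparse_nmf(V, r=10, k=3)
print(np.count_nonzero(H, axis=0).max())     # <= 3
```

The refinement phase relies on the fact that multiplicative updates leave zero entries at zero, so the ℓ0 constraint enforced by the projection is preserved automatically; this zero-preservation is in the spirit of the sparseness-maintenance property the abstract refers to, though the paper's own construction may differ.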
Original language | English |
---|---|
Title of host publication | Proceedings of the 2010 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010 |
Publisher | Institute of Electrical and Electronics Engineers |
Pages | 83-88 |
Number of pages | 6 |
ISBN (Print) | 9781424478774 |
DOIs | |
Publication status | Published - 24 Nov 2010 |
Externally published | Yes |
Event | 20th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010 - Kittila, Finland; Duration: 29 Aug 2010 → 1 Sept 2010; Conference number: 20 |
Conference
Conference | 20th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010 |
---|---|
Abbreviated title | MLSP 2010 |
Country/Territory | Finland |
City | Kittila |
Period | 29/08/10 → 1/09/10 |