Joint Majorization-Minimization for Nonnegative Matrix Factorization with the β-divergence

06/29/2021
by Arthur Marmin, et al.

This article proposes new multiplicative updates for nonnegative matrix factorization (NMF) with the β-divergence objective function. Our new updates are derived from a joint majorization-minimization (MM) scheme, in which an auxiliary function (a tight upper bound of the objective function) is built for the two factors jointly and minimized at each iteration. This contrasts with the classic approach, in which the factors are optimized alternately and an MM scheme is applied to each factor individually. Like the classic approach, our joint MM algorithm results in multiplicative updates that are simple to implement. However, they yield a significant reduction in computation time (for equally good solutions), in particular for several β-divergences of significant applied interest, such as the squared Euclidean distance and the Kullback-Leibler or Itakura-Saito divergences. We report experimental results using diverse datasets: face images, audio spectrograms, hyperspectral data, and song play counts. Depending on the value of β and on the dataset, our joint MM approach reduces CPU time by about 10% to 78% compared with the classic alternating scheme.
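For readers unfamiliar with the objective, the β-divergence referred to above is conventionally defined as follows; this standard definition is not spelled out in the abstract itself, but it is the usual one in the β-NMF literature and interpolates the three named cost functions:

```latex
d_\beta(x \mid y) =
\begin{cases}
\dfrac{x}{y} - \log\dfrac{x}{y} - 1, & \beta = 0 \ \text{(Itakura-Saito)} \\[6pt]
x \log\dfrac{x}{y} - x + y, & \beta = 1 \ \text{(Kullback-Leibler)} \\[6pt]
\dfrac{x^{\beta} + (\beta - 1)\, y^{\beta} - \beta\, x\, y^{\beta - 1}}{\beta(\beta - 1)}, & \beta \in \mathbb{R} \setminus \{0, 1\}.
\end{cases}
```

Setting β = 2 recovers half the squared Euclidean distance, while β = 1 and β = 0 give the Kullback-Leibler and Itakura-Saito divergences. The NMF objective is then the entrywise sum D_β(V | WH) = Σ_{f,n} d_β(v_{fn} | [WH]_{fn}), minimized over nonnegative factors W and H.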
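The joint MM updates themselves are not given in the abstract, so the sketch below shows only the classic alternating multiplicative-update baseline that the paper compares against; the function name `beta_nmf_mu` and its parameters are illustrative, not taken from the paper. For β ∈ [1, 2] these updates are exact MM descent steps; outside that range a corrective exponent is typically applied, omitted here for brevity.

```python
import numpy as np

def beta_nmf_mu(V, rank, beta=1.0, n_iter=200, seed=0, eps=1e-12):
    """Classic alternating multiplicative updates for beta-NMF
    (the baseline the paper's joint MM scheme is benchmarked against).
    Minimizes D_beta(V | WH) over nonnegative W (F x K) and H (K x N)."""
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((F, rank)) + eps  # random nonnegative initialization
    H = rng.random((rank, N)) + eps

    for _ in range(n_iter):
        # Update H with W fixed:
        # H <- H * (W^T [(WH)^(beta-2) * V]) / (W^T (WH)^(beta-1))
        WH = W @ H + eps  # eps guards against division by zero
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)

        # Update W with H fixed (the symmetric expression)
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)

    return W, H

# Example: KL-divergence NMF (beta = 1) on random nonnegative data
V = np.random.default_rng(0).random((100, 50)) + 1e-3
W, H = beta_nmf_mu(V, rank=10, beta=1.0)
```

Each update multiplies the current factor entrywise by a nonnegative ratio, which is why nonnegativity is preserved automatically; the abstract's point is that deriving both updates from a single joint auxiliary function yields the same multiplicative simplicity at lower computational cost.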
