On Efficient and Scalable Computation of the Nonparametric Maximum Likelihood Estimator in Mixture Models
In this paper we study the computation of the nonparametric maximum likelihood estimator (NPMLE) in multivariate mixture models. Our first approach discretizes this infinite-dimensional convex optimization problem by fixing the support points of the NPMLE and optimizing over the mixture proportions. In this context, leveraging the sparsity of the solution, we propose an efficient and scalable semismooth Newton-based augmented Lagrangian method (ALM). Our algorithm outperforms the state-of-the-art methods <cit.> and can handle n ≈ 10^6 data points with m ≈ 10^4 support points. Our second procedure, which combines the expectation-maximization (EM) algorithm with the ALM approach above, allows for joint optimization of both the support points and the probability weights. For both algorithms we provide formal results on their (superlinear) convergence properties. The computed NPMLE can be immediately used for denoising the observations in the framework of empirical Bayes. We propose new denoising estimands in this context along with consistent estimates of them. Extensive numerical experiments illustrate the effectiveness of our methods. In particular, we employ our procedures to analyze two astronomy data sets: (i) the Gaia-TGAS Catalog <cit.>, containing n ≈ 1.4 × 10^6 data points in two dimensions, and (ii) the d = 19 dimensional data set from the APOGEE survey <cit.> with n ≈ 2.7 × 10^4 observations.
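For concreteness, the fixed-support discretization described above can be sketched via the standard Kiefer–Wolfowitz formulation of the NPMLE; the Gaussian kernel and the symbols $\mu_j$, $w_j$ below are illustrative assumptions, not notation taken from the paper. The NPMLE solves
\[
\hat{G} \in \operatorname*{arg\,max}_{G \in \mathcal{P}(\mathbb{R}^d)} \; \sum_{i=1}^{n} \log \int \phi_d(x_i - \theta)\, \mathrm{d}G(\theta),
\]
where $\phi_d$ denotes a $d$-dimensional Gaussian density. Fixing candidate support points $\mu_1, \dots, \mu_m$ reduces this infinite-dimensional problem to the finite-dimensional convex program
\[
\max_{w \in \Delta_m} \; \sum_{i=1}^{n} \log \Big( \sum_{j=1}^{m} w_j \, \phi_d(x_i - \mu_j) \Big),
\qquad
\Delta_m = \Big\{ w \in \mathbb{R}^m : w \ge 0, \; \textstyle\sum_{j=1}^{m} w_j = 1 \Big\},
\]
whose optimal weight vector $w$ is typically sparse; exploiting this sparsity is what makes the semismooth Newton-based ALM scale to large $n$ and $m$.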