Modular proximal optimization for multidimensional total-variation regularization
One of the most frequently used notions of "structured sparsity" is that of sparse (discrete) gradients, a structure typically elicited through Total-Variation (TV) regularizers. This paper focuses on anisotropic TV regularizers, in particular on ℓ_p-norm weighted TV regularizers, for which it develops efficient algorithms to compute the corresponding proximity operators. Our algorithms enable one to scalably incorporate TV regularization of vector, matrix, or tensor data into proximal convex optimization solvers. For the special case of vectors, we derive and implement a highly efficient weighted 1D-TV solver. This solver provides a backbone for subsequently handling the more complex task of higher-dimensional (two or more) TV by means of a modular proximal optimization approach. We present numerical experiments demonstrating that our 1D-TV solver matches or exceeds the best known 1D-TV solvers. Thereafter, we illustrate the benefits of our modular design through extensive experiments on: (i) image denoising; (ii) image deconvolution; and (iii) four variants of the fused lasso. Our results show the flexibility and speed that our TV solvers offer over competing approaches. To underscore our claims, we provide our TV solvers in an easy-to-use multi-threaded C++ library (which also aids reproducibility of our results).
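To make the central object concrete: the 1D-TV proximity operator solves min_x (1/2)||x − y||² + λ Σ|x_{i+1} − x_i|. The sketch below is not the paper's specialized solver; it is a minimal, hedged illustration that computes the same prox via projected gradient on the dual problem (z constrained to ||z||_∞ ≤ λ, with x = y − Dᵀz), which is slow but simple to verify. The function name `prox_tv1d` and the iteration budget are our own choices, not the paper's API.

```python
import numpy as np

def prox_tv1d(y, lam, n_iter=5000):
    """Proximity operator of lam * sum_i |x_{i+1} - x_i| (unweighted 1D TV).

    Solves the dual problem  min_{||z||_inf <= lam} (1/2)||y - D^T z||^2
    by projected gradient, where D is the forward-difference operator.
    The primal solution is recovered as x = y - D^T z.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    if n < 2 or lam == 0.0:
        return y.copy()
    z = np.zeros(n - 1)
    step = 0.25  # 1 / ||D D^T||: the spectral norm of D D^T is at most 4
    for _ in range(n_iter):
        # x = y - D^T z, where (D^T z)_i = z_{i-1} - z_i (boundary terms zero)
        dtz = np.zeros(n)
        dtz[:-1] -= z
        dtz[1:] += z
        x = y - dtz
        # dual gradient ascent step: grad = D x = x[1:] - x[:-1], then clip
        z = np.clip(z + step * (x[1:] - x[:-1]), -lam, lam)
    return x

# A piecewise-constant signal with one jump has a closed-form prox:
# each level shrinks toward the other by lam / (block size).
y = np.array([1.0, 1.0, 4.0, 4.0])
x = prox_tv1d(y, 0.5)  # expected: [1.25, 1.25, 3.75, 3.75]
```

In the modular approach the paper describes, a fast 1D prox like this becomes the building block for higher-dimensional TV: for a matrix, one alternates 1D-TV prox steps along rows and columns inside an outer proximal splitting scheme (e.g., proximal Dykstra), rather than solving the 2D problem monolithically.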