A subspace-accelerated split Bregman method for sparse data recovery with joint l1-type regularizers
We propose a subspace-accelerated Bregman method for the linearly constrained minimization of functions of the form E(u) = f(u) + τ_1‖u‖_1 + τ_2‖Du‖_1, where f is a smooth convex function and D is a linear operator, e.g. a finite-difference operator, as in anisotropic Total Variation and fused-lasso regularizations. Problems of this type arise in a wide variety of applications, including portfolio optimization and the learning of predictive models from fMRI data. The term ‖Du‖_1 is aimed at encouraging structured sparsity in the solution. The subspaces where the acceleration is performed are selected so that the restriction of the objective function to them is smooth in a neighborhood of the current iterate. Numerical experiments on multi-period portfolio selection problems using real datasets show the effectiveness of the proposed method.
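To make the structure of the objective concrete, below is a minimal sketch of a plain split Bregman iteration for minimizing 0.5‖Au − b‖² + τ_1‖u‖_1 + τ_2‖Du‖_1, i.e. assuming the smooth term f(u) = 0.5‖Au − b‖² and omitting both the linear constraint and the subspace-acceleration step that the paper introduces. All names (split_bregman, shrink, the penalty parameters lam1, lam2) are illustrative and are not the authors' implementation.

```python
import numpy as np

def shrink(x, gamma):
    # Soft-thresholding: proximal operator of gamma * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def split_bregman(A, b, D, tau1, tau2, lam1=1.0, lam2=1.0,
                  n_iter=200, tol=1e-8):
    """Basic split Bregman scheme (no subspace acceleration) for
    0.5*||A u - b||^2 + tau1*||u||_1 + tau2*||D u||_1,
    using the splittings d1 = u and d2 = D u."""
    n = A.shape[1]
    u = np.zeros(n)
    d1 = np.zeros(n); b1 = np.zeros(n)                    # variables for ||u||_1
    d2 = np.zeros(D.shape[0]); b2 = np.zeros(D.shape[0])  # variables for ||D u||_1

    # Matrix of the quadratic u-subproblem (kept fixed across iterations)
    M = A.T @ A + lam1 * np.eye(n) + lam2 * (D.T @ D)
    Atb = A.T @ b

    for _ in range(n_iter):
        u_old = u
        # u-update: solve the smooth quadratic subproblem
        rhs = Atb + lam1 * (d1 - b1) + lam2 * (D.T @ (d2 - b2))
        u = np.linalg.solve(M, rhs)
        # d-updates: componentwise soft-thresholding for both l1 terms
        d1 = shrink(u + b1, tau1 / lam1)
        d2 = shrink(D @ u + b2, tau2 / lam2)
        # Bregman (dual) updates
        b1 = b1 + u - d1
        b2 = b2 + D @ u - d2
        if np.linalg.norm(u - u_old) <= tol * max(1.0, np.linalg.norm(u_old)):
            break
    return u

# Example: forward-difference operator D and synthetic least-squares data
rng = np.random.default_rng(0)
m, n = 40, 100
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
D = np.diff(np.eye(n), axis=0)   # (n-1) x n forward differences
u_hat = split_bregman(A, b, D, tau1=0.1, tau2=0.5)
```

In this sketch the subspace acceleration described in the abstract would replace some of these iterations with steps restricted to subspaces on which the objective is locally smooth; the details are in the full paper.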