Submodular + Concave

06/09/2021
by Siddharth Mitra, et al.

It has been well established that first-order optimization methods can converge to the maximal objective value of concave functions and provide constant factor approximation guarantees for (non-convex/non-concave) continuous submodular functions. In this work, we initiate the study of the maximization of functions of the form F(x) = G(x) + C(x) over a solvable convex body P, where G is a smooth DR-submodular function and C is a smooth concave function. This class of functions is a strict extension of both concave and continuous DR-submodular functions, and no theoretical guarantee was previously known for it. We provide a suite of Frank-Wolfe style algorithms which, depending on the nature of the objective function (i.e., whether G and C are monotone or not, and non-negative or not) and on the nature of the set P (i.e., whether it is downward closed or not), provide 1 - 1/e, 1/e, or 1/2 approximation guarantees. We then use our algorithms to obtain a framework that smoothly interpolates between choosing a diverse set of elements from a given ground set (corresponding to the mode of a determinantal point process) and choosing a clustered set of elements (corresponding to the maxima of a suitable concave function). Additionally, we apply our algorithms to various functions in the above class (DR-submodular + concave) in both constrained and unconstrained settings, and show that they consistently outperform natural baselines.
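
To illustrate the Frank-Wolfe style approach the abstract describes, below is a minimal continuous-greedy sketch for maximizing F = G + C using only first-order access to F and a linear maximization oracle over P. The function names (`continuous_greedy_sketch`, `grad_F`, `lmo`) and the box-constrained example objective are illustrative assumptions, not the paper's actual algorithms, which differ depending on the monotonicity and non-negativity of G and C and on whether P is downward closed.

```python
import numpy as np

def continuous_greedy_sketch(grad_F, lmo, dim, num_steps=100):
    """Generic Frank-Wolfe / continuous-greedy style ascent for F = G + C.

    grad_F    : callable returning the gradient of F at a point x.
    lmo       : linear maximization oracle over the convex body P,
                i.e. lmo(g) returns argmax_{v in P} <g, v>.
    dim       : problem dimension.
    num_steps : number of Frank-Wolfe iterations.
    """
    x = np.zeros(dim)          # assumes the origin is feasible
    step = 1.0 / num_steps
    for _ in range(num_steps):
        g = grad_F(x)          # first-order information only
        v = lmo(g)             # best feasible direction for the linearized objective
        x = x + step * v       # small step toward v; x remains a convex combination of points in P
    return x

# Hypothetical example: P is the unit box [0, 1]^n,
# G(x) = sum(log(1 + x_i)) (a separable, monotone DR-submodular surrogate),
# and C(x) = -0.5 * ||x - 0.5||^2 (smooth concave).
n = 5
grad_F = lambda x: 1.0 / (1.0 + x) - (x - 0.5)
lmo = lambda g: (g > 0).astype(float)   # argmax of <g, v> over the unit box
x_hat = continuous_greedy_sketch(grad_F, lmo, n)
print(x_hat)
```

The 1/num_steps step size mirrors the standard continuous-greedy schedule, which keeps the final iterate inside P whenever the starting point is feasible; it is used here only to convey the overall structure of the method.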
