Continuous Submodular Maximization: Beyond DR-Submodularity

06/21/2020
by Moran Feldman, et al.

In this paper, we propose the first continuous optimization algorithms that achieve a constant-factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of vanilla coordinate ascent, called Coordinate-Ascent+, achieves a ((e-1)/(2e-1) - ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose Coordinate-Ascent++, which achieves the tight (1 - 1/e - ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^2.5 + n^3 log n/ε^2) per iteration. However, the computation of each round of Coordinate-Ascent++ can be easily parallelized, so that the computational cost per machine scales as O(n/√ε + n log n).
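To make the setting concrete, the sketch below shows plain (vanilla) coordinate ascent for maximizing a monotone continuous submodular function over [0, 1]^n under a linear budget constraint ⟨c, x⟩ ≤ b. The objective, the per-coordinate search grid, and the stopping rule are illustrative assumptions; this is not the paper's Coordinate-Ascent+ or Coordinate-Ascent++ procedure, only the baseline they refine.

```python
import numpy as np

def coordinate_ascent(F, c, b, n, n_rounds=50, grid=20):
    """Greedy coordinate ascent for max F(x), x in [0,1]^n, subject to <c, x> <= b.

    Illustrative baseline only: repeatedly sweeps the coordinates and sets each
    one to the best feasible value on a small grid. Not the paper's algorithm.
    """
    x = np.zeros(n)
    for _ in range(n_rounds):
        improved = False
        for i in range(n):
            # Budget left for coordinate i if it were reset to zero.
            remaining = b - c @ x + c[i] * x[i]
            upper = min(1.0, remaining / c[i])
            candidates = np.linspace(0.0, upper, grid)
            best_val, best_xi = F(x), x[i]
            for v in candidates:
                y = x.copy()
                y[i] = v
                val = F(y)
                if val > best_val:
                    best_val, best_xi = val, v
            if best_xi != x[i]:
                x[i] = best_xi
                improved = True
        if not improved:  # no coordinate improved in a full sweep
            break
    return x

if __name__ == "__main__":
    # Hypothetical example objective: a coverage-style function
    # F(x) = sum_j w_j * (1 - prod_i (1 - p_ij * x_i)),
    # which is monotone and continuous submodular on [0, 1]^n.
    rng = np.random.default_rng(0)
    n, m = 5, 8
    p = rng.uniform(0, 1, size=(n, m))
    w = rng.uniform(0, 1, size=m)
    c = rng.uniform(0.5, 1.5, size=n)   # positive linear constraint weights
    b = 2.0                             # budget

    def F(x):
        return float(w @ (1.0 - np.prod(1.0 - p * x[:, None], axis=0)))

    x = coordinate_ascent(F, c, b, n)
    print("solution:", np.round(x, 3), "value:", round(F(x), 3))
```

In each sweep this baseline spends O(n · grid) objective evaluations per coordinate; the paper's contribution is to modify this kind of coordinate update so that the guarantees quoted above hold with the stated per-iteration costs.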
