Lower Bounds on the Worst-Case Complexity of Efficient Global Optimization

09/20/2022
by   Wenjie Xu, et al.

Efficient global optimization is a widely used method for optimizing expensive black-box functions, with applications such as hyperparameter tuning and new material design. Despite its popularity, little attention has been paid to analyzing the inherent hardness of the problem, although, given its extensive use, it is important to understand the fundamental limits of efficient global optimization algorithms. In this paper, we study the worst-case complexity of the efficient global optimization problem and, in contrast to existing kernel-specific results, we derive a unified lower bound for the complexity of efficient global optimization in terms of the metric entropy of a ball in its corresponding reproducing kernel Hilbert space (RKHS). Specifically, we show that if there exists a deterministic algorithm that achieves a suboptimality gap smaller than ϵ for any function f ∈ S within T function evaluations, then T must be at least Ω(log 𝒩(S(𝒳), 4ϵ, ‖·‖_∞) / log(R/ϵ)), where 𝒩(·,·,·) is the covering number, S is the ball centered at 0 with radius R in the RKHS, and S(𝒳) is the restriction of S to the feasible set 𝒳. Moreover, we show that this lower bound nearly matches the upper bound attained by non-adaptive search algorithms for the commonly used squared exponential kernel and the Matérn kernel with a large smoothness parameter ν, up to a replacement of d/2 by d and a logarithmic factor log(R/ϵ). That is to say, our lower bound is nearly optimal for these kernels.
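The lower bound above can be evaluated numerically once the metric entropy log 𝒩 is known or estimated. The following minimal sketch plugs an *assumed* covering-number scaling into the formula; the polylogarithmic growth rate used for illustration, (log(R/ϵ))^{1 + d/2}, is an assumption stated in the comments, not a result taken from the paper.

```python
import math

def lower_bound_T(log_covering_number: float, R: float, eps: float) -> float:
    # Evaluates the bound T = Omega( log N(S(X), 4*eps, ||.||_inf) / log(R/eps) ).
    # log_covering_number is log N(S(X), 4*eps, ||.||_inf), supplied by the caller.
    return log_covering_number / math.log(R / eps)

# Hypothetical illustration: assume the metric entropy of the RKHS ball grows
# polylogarithmically, log N ~ (log(R/eps))**(1 + d/2). This scaling is an
# assumption chosen for the example, not a value from the abstract.
d, R, eps = 2, 1.0, 1e-3
assumed_log_N = math.log(R / eps) ** (1 + d / 2)
print(lower_bound_T(assumed_log_N, R, eps))
```

With such a polylogarithmic entropy, the resulting lower bound on T grows only polylogarithmically in R/ϵ, consistent with the near-optimality claim for the squared exponential kernel.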
