Practical Batch Bayesian Optimization for Less Expensive Functions
Bayesian optimization (BO) and its batch extensions have been successful for optimizing expensive black-box functions. However, these traditional BO approaches are not yet ideal for optimizing less expensive functions, where the computational cost of BO can dominate the cost of evaluating the black-box function. Examples of such less expensive functions include cheap machine learning models, inexpensive physical experiments run through simulators, and acquisition function optimization within Bayesian optimization itself. In this paper, we consider a batch BO setting for situations where function evaluations are less expensive. Our model is based on a new exploration strategy that uses geometric distance as an alternative way to explore, selecting points far from the observed locations. Building on this intuition, we propose using a Sobol sequence to guide exploration, which removes the need to run the multiple global optimization steps used in previous works. Based on the proposed distance-based exploration, we present an efficient batch BO approach. We demonstrate that our approach outperforms other baselines and global optimization methods when function evaluations are less expensive.
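To make the core idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of distance-based exploration guided by a Sobol sequence: candidates are drawn from a Sobol sequence over the unit hypercube, and the candidate farthest from all observed (and already-selected) points is added to the batch, avoiding any inner global optimization loop. The function name and parameters are hypothetical, and the sketch assumes inputs have been scaled to [0, 1]^d.

```python
import numpy as np
from scipy.stats import qmc
from scipy.spatial.distance import cdist

def distance_exploration_batch(X_observed, batch_size, dim,
                               n_candidates=1024, seed=0):
    """Pick exploratory batch points far from observed locations.

    Candidates come from a scrambled Sobol sequence over [0, 1]^dim,
    so no global optimization of an exploration objective is needed.
    """
    sobol = qmc.Sobol(d=dim, scramble=True, seed=seed)
    candidates = sobol.random(n_candidates)           # (n_candidates, dim)

    batch = []
    points = np.asarray(X_observed, dtype=float)
    for _ in range(batch_size):
        # Distance from each candidate to its nearest observed/selected point.
        min_dist = cdist(candidates, points).min(axis=1)
        best = candidates[np.argmax(min_dist)]        # farthest candidate
        batch.append(best)
        points = np.vstack([points, best])            # spread points within the batch
    return np.array(batch)

# Example: a 2-D problem with three observed points in [0, 1]^2.
X_obs = np.array([[0.1, 0.2], [0.5, 0.5], [0.9, 0.8]])
print(distance_exploration_batch(X_obs, batch_size=3, dim=2))
```

In a full batch BO loop, such distance-driven exploratory points would be combined with points chosen by an acquisition function; the sketch above only illustrates the exploration component described in the abstract.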