Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case

02/06/2019
by Alina Beygelzimer, et al.

We study the problem of efficient online multiclass linear classification with bandit feedback, where all examples belong to one of K classes and lie in the d-dimensional Euclidean space. Previous works have left open the challenge of designing efficient algorithms with finite mistake bounds when the data is linearly separable by a margin γ. In this work, we take a first step towards this problem. We consider two notions of linear separability, strong and weak. 1. Under the strong linear separability condition, we design an efficient algorithm that achieves a near-optimal mistake bound of O(K/γ^2). 2. Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of min(2^Õ(K log^2(1/γ)), 2^Õ(√(1/γ) log K)). Our algorithm is based on a kernel Perceptron and is inspired by the work of Klivans and Servedio (2008) on improperly learning intersections of halfspaces.
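As a concrete illustration of the protocol the abstract describes, the Python sketch below runs a simple greedy Perceptron-style learner under bandit feedback: it keeps one weight vector per class, predicts the highest-scoring class, and only observes whether that prediction was correct. The greedy update rule and the synthetic strongly separable stream are assumptions made for this toy example; this is not the kernel Perceptron algorithm developed in the paper.

import numpy as np

def bandit_multiclass_perceptron(stream, num_classes, dim):
    # Toy learner: one linear scorer per class, greedy prediction,
    # Perceptron-style updates driven only by correct/incorrect feedback.
    W = np.zeros((num_classes, dim))
    mistakes = 0
    for x, y in stream:
        pred = int(np.argmax(W @ x))   # predict the highest-scoring class
        if pred == y:                  # bandit feedback: a single bit
            W[pred] += x               # reinforce the confirmed class
        else:
            mistakes += 1
            W[pred] -= x               # demote the refuted class; the true label stays hidden
    return W, mistakes

if __name__ == "__main__":
    # Strongly separable toy stream with d = K: with w_k = e_k, the true class
    # scores about 2 - 1/K while every other class scores about -1/K.
    rng = np.random.default_rng(0)
    K, n = 3, 2000
    labels = rng.integers(0, K, size=n)
    X = 2.0 * np.eye(K)[labels] - 1.0 / K + 0.05 * rng.normal(size=(n, K))
    _, m = bandit_multiclass_perceptron(zip(X, labels), K, K)
    print(f"mistakes over {n} rounds: {m}")

The point where this sketch diverges from the paper is exactly the hard part of the problem: on a mistake the learner never sees the true label, so a naive greedy update is not known to enjoy a finite mistake bound; designing efficient algorithms that do, under the strong and weak separability conditions, is the contribution of the paper.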
