Online Clustering of Bandits

01/31/2014
by Claudio Gentile, et al.

We introduce a novel algorithmic approach to content recommendation based on adaptive clustering of exploration-exploitation ("bandit") strategies. We provide a sharp regret analysis of this algorithm in a standard stochastic noise setting, demonstrate its scalability properties, and prove its effectiveness on a number of artificial and real-world datasets. Our experiments show a significant increase in prediction performance over state-of-the-art methods for bandit problems.
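
The abstract describes the approach only at a high level. Below is a minimal, hedged sketch of one way adaptive clustering of linear "bandit" strategies can be implemented in a stochastic linear-payoff setting. It is an illustration under assumed details, not the paper's algorithm: every identifier here (ClusteredBandits, alpha, gap, and so on) is invented for the example, per-user models are LinUCB-style ridge regressors, users sharing a cluster pool their statistics, and a user is split off when its individual weight estimate drifts from a cluster-mate's by more than a confidence-scaled gap.

```python
# Sketch of adaptive clustering of linear bandits. All names, thresholds,
# and the peel-off splitting rule are illustrative assumptions, not taken
# from the paper.
import numpy as np

class ClusteredBandits:
    def __init__(self, n_users, dim, alpha=1.0, gap=0.5):
        self.d = dim
        self.alpha = alpha          # exploration strength (UCB width)
        self.gap = gap              # threshold for splitting users apart
        # Per-user ridge-regression statistics: A = I + sum x x^T, b = sum r x
        self.A = np.stack([np.eye(dim) for _ in range(n_users)])
        self.b = np.zeros((n_users, dim))
        self.T = np.zeros(n_users)  # per-user interaction counts
        # Start with everyone in one cluster; refine as evidence accumulates.
        self.cluster = np.zeros(n_users, dtype=int)

    def _theta(self, u):
        # Individual ridge-regression estimate of user u's weight vector.
        return np.linalg.solve(self.A[u], self.b[u])

    def recommend(self, u, arms):
        """Pick the arm with the highest UCB under u's shared cluster model."""
        members = np.flatnonzero(self.cluster == self.cluster[u])
        # Aggregate statistics over the cluster (one shared linear model,
        # counting the identity prior once).
        A = np.eye(self.d) + sum(self.A[v] - np.eye(self.d) for v in members)
        b = self.b[members].sum(axis=0)
        theta = np.linalg.solve(A, b)
        A_inv = np.linalg.inv(A)
        ucb = [x @ theta + self.alpha * np.sqrt(x @ A_inv @ x) for x in arms]
        return int(np.argmax(ucb))

    def update(self, u, x, reward):
        """Record feedback, then split u's cluster-mates that now disagree."""
        self.A[u] += np.outer(x, x)
        self.b[u] += reward * x
        self.T[u] += 1
        for v in np.flatnonzero(self.cluster == self.cluster[u]):
            if v == u or self.T[v] == 0:
                continue
            # Confidence-scaled distance between individual estimates; a
            # large, well-supported gap is evidence of distinct user models.
            width = 1.0 / np.sqrt(1 + self.T[u]) + 1.0 / np.sqrt(1 + self.T[v])
            if np.linalg.norm(self._theta(u) - self._theta(v)) > self.gap * width:
                self.cluster[v] = self.cluster.max() + 1  # peel v off

# Illustrative usage with random contexts:
#   bandit = ClusteredBandits(n_users=50, dim=10)
#   arms = [np.random.randn(10) for _ in range(20)]
#   k = bandit.recommend(user_id, arms)
#   bandit.update(user_id, arms[k], reward)
```

One simplification to note: this sketch only ever peels a single user off into its own cluster, so clusters can fragment but never re-merge, whereas a graph-based formulation (deleting edges between disagreeing users and taking connected components as clusters) lets transitively consistent users remain grouped. The sketch trades that fidelity for brevity.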
