Although the bandit framework is a classical and well-suited approach f...
We study a posterior sampling approach to efficient exploration in
const...
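For concreteness, the sketch below illustrates posterior sampling in the simplest case of a Bernoulli bandit with independent Beta priors; the priors, the number of arms, and the horizon are illustrative assumptions rather than the setting studied here. At each round the learner samples a mean-reward estimate for every arm from its posterior and plays the arm with the largest sample.

```python
import numpy as np

def thompson_sampling_bernoulli(reward_fn, n_arms, horizon, rng=None):
    """Posterior (Thompson) sampling for a Bernoulli bandit.

    reward_fn(arm) should return a 0/1 reward. The Beta(1, 1) priors and
    the fixed horizon are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    successes = np.ones(n_arms)  # Beta posterior parameter alpha per arm
    failures = np.ones(n_arms)   # Beta posterior parameter beta per arm
    for _ in range(horizon):
        # Sample a mean-reward estimate for each arm from its posterior,
        # then play the arm whose sample is largest.
        samples = rng.beta(successes, failures)
        arm = int(np.argmax(samples))
        reward = reward_fn(arm)
        # Update the posterior of the played arm with the observed reward.
        successes[arm] += reward
        failures[arm] += 1 - reward
    return successes, failures
```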
We consider a special case of bandit problems, namely batched bandits.
M...
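To make the batched protocol concrete, the sketch below adapts the posterior sampling routine above to one common formalization of batched bandits: the learner commits to all arm choices in a batch using only information from previous batches, and rewards are revealed only once the batch ends. The batch sizes and the Beta-Bernoulli model are illustrative assumptions.

```python
import numpy as np

def run_batched_bandit(reward_fn, n_arms, batch_sizes, rng=None):
    """Batched bandit protocol with a posterior sampling policy.

    Within a batch, arm choices cannot depend on that batch's rewards;
    feedback is revealed and the posterior is updated only between batches.
    batch_sizes and the Beta(1, 1) priors are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    successes = np.ones(n_arms)
    failures = np.ones(n_arms)
    for batch_size in batch_sizes:
        # Commit to the whole batch using only information from past batches.
        samples = rng.beta(successes, failures, size=(batch_size, n_arms))
        arms = samples.argmax(axis=1)
        # Rewards for the batch are observed only after all pulls are made.
        rewards = np.array([reward_fn(int(a)) for a in arms], dtype=float)
        # Posterior update happens once, at the end of the batch.
        successes += np.bincount(arms, weights=rewards, minlength=n_arms)
        failures += np.bincount(arms, weights=1 - rewards, minlength=n_arms)
    return successes, failures
```

The key difference from the fully sequential sketch is that the posterior is frozen within each batch, so all decisions in a batch rely on the same, possibly stale, information.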