Expected Worst Case Regret via Stochastic Sequential Covering

09/09/2022 · by Changlong Wu, et al.
We study the problem of sequential prediction and online minimax regret with stochastically generated features under a general loss function. We introduce a notion of expected worst-case minimax regret that generalizes and encompasses previously known minimax regrets. For such regrets, we establish tight upper bounds via a novel concept of stochastic global sequential covering. We show that for a hypothesis class of VC dimension 𝖵𝖢 and an i.i.d.-generated feature sequence of length T, the cardinality of the stochastic global sequential covering is upper bounded, with high probability (whp), by e^O(𝖵𝖢 · log^2 T). We then improve this bound by introducing a new complexity measure called the Star-Littlestone dimension, and show that classes with Star-Littlestone dimension 𝖲𝖫 admit a stochastic global sequential covering of order e^O(𝖲𝖫 · log T). We further establish upper bounds for real-valued classes with finite fat-shattering numbers. Finally, by applying information-theoretic tools developed for fixed-design minimax regret, we provide lower bounds for the expected worst-case minimax regret. We demonstrate the effectiveness of our approach by establishing tight bounds on the expected worst-case minimax regret for logarithmic loss and general mixable losses.
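As a hedged sketch (the precise definition appears in the full paper; the quantifier ordering and symbols below are our assumptions, with ℓ the loss, 𝓗 the hypothesis class, and μ the feature distribution), the expected worst-case minimax regret takes an expectation over the stochastic features and a worst case over adversarially chosen labels:

\tilde{r}_T(\mathcal{H}, \mu) \;=\; \inf_{\hat{y}} \, \mathbb{E}_{x^T \sim \mu}\!\left[ \sup_{y^T} \left( \sum_{t=1}^{T} \ell(\hat{y}_t, y_t) \;-\; \inf_{h \in \mathcal{H}} \sum_{t=1}^{T} \ell(h(x_t), y_t) \right) \right],

where the infimum runs over prediction strategies whose round-t output \hat{y}_t may depend on x^t and y^{t-1}. For intuition on how a covering yields a regret bound: under logarithmic loss, a Bayesian mixture over a finite cover of cardinality N incurs regret at most log N against the best element of the cover, so a stochastic global sequential covering of size e^O(𝖵𝖢 · log^2 T) (whp) translates into a regret bound of order 𝖵𝖢 · log^2 T.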
