Online nonparametric regression with Sobolev kernels

02/06/2021
by Oleksandr Zadorozhnyi, et al.

In this work we investigate a variant of the online kernelized ridge regression algorithm in the setting of d-dimensional adversarial nonparametric regression. We derive regret upper bounds over the classes of Sobolev spaces W_p^β(𝒳), p ≥ 2, β > d/p. These upper bounds are supported by a minimax regret analysis, which reveals that in the cases β > d/2 or p = ∞ the rates are (essentially) optimal. Finally, we compare the performance of the kernelized ridge regression forecaster to known nonparametric forecasters in terms of regret rates and computational complexity, as well as to excess-risk rates in the setting of statistical (i.i.d.) nonparametric regression.
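To make the setting concrete, here is a minimal sketch of online kernelized ridge regression: at each round the forecaster predicts the label of a new point using a ridge-regularized fit to all previously revealed pairs, then observes the true label. This is an illustration only, not the paper's exact algorithm; the Matérn-1/2 (Laplacian) kernel is used here because Matérn-type RKHSs are norm-equivalent to Sobolev spaces, and the naive per-round linear solve (cubic in the number of past points) is an assumption made for simplicity.

```python
import numpy as np

def matern_half_kernel(x, y, scale=1.0):
    # Matérn-1/2 (Laplacian) kernel; chosen for illustration since
    # Matérn RKHSs are norm-equivalent to Sobolev spaces.
    return np.exp(-np.linalg.norm(np.asarray(x) - np.asarray(y)) / scale)

def online_krr(stream, lam=1.0, kernel=matern_half_kernel):
    """Online kernelized ridge regression (sketch).

    `stream` yields pairs (x_t, y_t). At round t the forecaster predicts
    y_t from the t-1 previously revealed pairs via the closed-form
    kernel ridge solution, then the true y_t is revealed and stored.
    Naive implementation: one dense linear solve per round.
    """
    X, Y, preds = [], [], []
    for x_t, y_t in stream:
        if X:
            # Gram matrix of past points and cross-kernel to the new point.
            K = np.array([[kernel(a, b) for b in X] for a in X])
            k = np.array([kernel(x_t, a) for a in X])
            # Ridge-regularized coefficients: (K + lam I)^{-1} y.
            alpha = np.linalg.solve(K + lam * np.eye(len(X)), np.array(Y))
            preds.append(float(k @ alpha))
        else:
            preds.append(0.0)  # no data yet: default prediction
        X.append(x_t)
        Y.append(y_t)
    return preds
```

With a repeated point x and constant label y = 1, the prediction at round t equals (t-1)/(t-1+λ), so the forecaster's per-round loss shrinks as data accumulates, which is the behaviour the regret bounds quantify.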
