Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations

02/12/2021
by Winnie Xu, et al.

We perform scalable approximate inference in a recently proposed family of continuous-depth Bayesian neural networks. In this model class, uncertainty about separate weights in each layer produces dynamics that follow a stochastic differential equation (SDE). We demonstrate gradient-based stochastic variational inference in this infinite-parameter setting, producing arbitrarily flexible approximate posteriors. We also derive a novel gradient estimator whose variance approaches zero as the approximate posterior approaches the true posterior. This approach further inherits the memory-efficient training and tunable precision of neural ODEs.
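To make the continuous-depth picture concrete, the sketch below simulates a toy SDE with fixed-step Euler-Maruyama integration, where the drift stands in for the network's layer dynamics and the diffusion term injects weight uncertainty. This is an illustrative sketch only, not the paper's method or solver; the function names, the drift `-0.5 * h`, and the noise scale `0.1` are hypothetical choices for the example.

```python
import numpy as np

def euler_maruyama(f, g, h0, t0, t1, n_steps, rng):
    """Simulate dh = f(h, t) dt + g(h, t) dW with fixed-step Euler-Maruyama."""
    dt = (t1 - t0) / n_steps
    h = np.array(h0, dtype=float)
    t = t0
    for _ in range(n_steps):
        # Brownian increment: Gaussian with variance dt per dimension.
        dW = rng.normal(0.0, np.sqrt(dt), size=h.shape)
        h = h + f(h, t) * dt + g(h, t) * dW
        t += dt
    return h

# Toy "continuous-depth" dynamics: the drift plays the role of the layers,
# and the diffusion term models stochasticity from uncertain weights.
rng = np.random.default_rng(0)
drift = lambda h, t: -0.5 * h                    # hypothetical drift term
diffusion = lambda h, t: 0.1 * np.ones_like(h)   # hypothetical noise scale

h1 = euler_maruyama(drift, diffusion, h0=[1.0, -1.0],
                    t0=0.0, t1=1.0, n_steps=100, rng=rng)
print(h1.shape)
```

A fixed-step scheme like this trades accuracy for simplicity; adaptive SDE solvers (as used for neural ODE/SDE training) adjust the step size to a requested error tolerance instead.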
