Convolutional Normalizing Flows for Deep Gaussian Processes

04/17/2021
by   Haibin Yu, et al.

Deep Gaussian processes (DGPs), hierarchical compositions of GP models, have successfully boosted expressive power over their single-layer counterparts. However, exact inference in DGPs is intractable, which has motivated the recent development of variational inference based methods. Unfortunately, these methods either yield a biased posterior belief or make convergence difficult to evaluate. In contrast, this paper introduces a new approach for specifying flexible, arbitrarily complex, and scalable approximate posterior distributions. The posterior distribution is constructed through a normalizing flow (NF), which transforms a simple initial probability distribution into a more complex one through a sequence of invertible transformations. Moreover, a novel convolutional normalizing flow (CNF) is developed to improve time efficiency and capture dependency between layers. Empirical evaluation demonstrates that the CNF DGP outperforms state-of-the-art approximation methods for DGPs.
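The paper's convolutional flow is not reproduced here, but the core normalizing-flow mechanism the abstract describes, pushing a simple base sample through a sequence of invertible maps while accumulating the change-of-variables correction to the log-density, can be sketched minimally. The example below uses a planar flow, a standard NF building block; all names and parameter choices are illustrative assumptions, not the paper's CNF.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar-flow transform f(z) = z + u * tanh(w.z + b).
    Returns the transformed sample and log|det Jacobian| of the map.
    (Illustrative sketch; not the paper's convolutional flow.)"""
    a = np.tanh(np.dot(w, z) + b)           # scalar activation
    z_new = z + u * a                       # invertible map for suitable u, w
    psi = (1.0 - a ** 2) * w                # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + np.dot(u, psi)))
    return z_new, log_det

# Compose K transforms: a standard-normal sample z_0 becomes z_K, and the
# log-density is corrected by the sum of log-determinants:
#   log q_K(z_K) = log q_0(z_0) - sum_k log|det J_k|
rng = np.random.default_rng(0)
z = rng.standard_normal(2)
log_q = -0.5 * np.sum(z ** 2) - np.log(2 * np.pi)   # log N(z; 0, I) in 2-D
for _ in range(3):
    u = rng.standard_normal(2)
    w = rng.standard_normal(2)
    b = rng.standard_normal()
    z, log_det = planar_flow(z, u, w, b)
    log_q -= log_det
```

In a variational setting such as the one the paper targets, `log_q` is exactly the quantity needed to evaluate the (entropy term of the) evidence lower bound under the flow-based posterior.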
