Recent successful generative models are trained by fitting a neural netw...
Generative Flow Networks (GFlowNets) are a new family of probabilistic s...
We propose Riemannian Flow Matching (RFM), a simple yet powerful framewo...
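In the Euclidean special case (which Riemannian Flow Matching generalizes to manifolds), the flow-matching objective regresses a velocity field onto the displacement along a straight-line conditional path. The sketch below is a toy illustration under that Euclidean assumption; the data, model, and names are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Euclidean flow-matching sketch (toy assumptions throughout):
# conditional path x_t = (1 - t) x0 + t x1, target velocity u = x1 - x0.
def flow_matching_loss(v_field, x0, x1, t):
    xt = (1 - t)[:, None] * x0 + t[:, None] * x1   # points on the path
    u = x1 - x0                                     # regression target
    pred = v_field(xt, t)
    return np.mean(np.sum((pred - u) ** 2, axis=1))

x0 = rng.normal(size=(64, 2))          # noise samples
x1 = rng.normal(size=(64, 2)) + 3.0    # hypothetical "data" samples
t = rng.uniform(size=64)

# the best constant field is the mean displacement E[x1 - x0]
best_const = (x1 - x0).mean(axis=0)
loss = flow_matching_loss(
    lambda x, s: np.broadcast_to(best_const, x.shape), x0, x1, t)
```

The residual loss of the constant field reflects the variance of the displacements it cannot explain; a trained network would reduce this further by conditioning on position and time.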
Neural compression offers a domain-agnostic approach to creating codecs ...
We introduce a new paradigm for generative modeling built on Continuous ...
We investigate the parameterization of deep neural networks that by desi...
While the maximum entropy (MaxEnt) reinforcement learning (RL) framework...
There are many frameworks for deep generative modeling, each often prese...
Continuous Normalizing Flows (CNFs) are a class of generative models tha...
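A CNF transports samples by integrating an ODE and tracks the log-density via the instantaneous change of variables, d log p(z)/dt = -tr(∂f/∂z). A minimal numerical sketch, assuming a toy linear vector field (so the trace is exact and constant) rather than a neural network:

```python
import numpy as np

A = np.array([[0.5, 0.0],
              [0.0, -0.3]])            # hypothetical linear field f(z) = A z

def cnf_euler(z0, t1=1.0, steps=100):
    """Euler-integrate a sample and its log-density change jointly."""
    dt = t1 / steps
    z, delta_logp = z0.astype(float), 0.0
    for _ in range(steps):
        z = z + dt * (A @ z)             # dz/dt = f(z)
        delta_logp -= dt * np.trace(A)   # d log p / dt = -tr(df/dz)
    return z, delta_logp

z1, dlogp = cnf_euler(np.array([1.0, 1.0]))
```

With a neural vector field the trace is typically estimated stochastically (e.g. Hutchinson's estimator) rather than computed exactly as here.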
Mapping between discrete and continuous distributions is a difficult tas...
We perform scalable approximate inference in a recently proposed family ...
Flow-based models are powerful tools for designing probabilistic models ...
Standard first-order stochastic optimization algorithms base their updat...
We propose a new class of parameterizations for spatio-temporal point pr...
The existing Neural ODE formulation relies on an explicit knowledge of t...
Neural differential equations may be trained by backpropagating gradient...
Standard variational lower bounds used to train latent variable models p...
The adjoint sensitivity method scalably computes gradients of solutions ...
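The adjoint method obtains parameter gradients by integrating a second ODE for a(t) = dL/dz(t) backwards in time and accumulating dL/dθ from a(t) and ∂f/∂θ. A hedged sketch on a scalar ODE dz/dt = θz with loss L = z(T); the setup and names are illustrative assumptions, and the backward pass here is the exact adjoint of the discrete Euler forward pass:

```python
def adjoint_grad(theta, z0=1.0, T=1.0, steps=1000):
    dt = T / steps
    # forward pass: Euler steps, storing the trajectory
    zs = [z0]
    for _ in range(steps):
        zs.append(zs[-1] * (1.0 + theta * dt))
    # backward pass: propagate the adjoint and accumulate dL/dtheta
    a, grad = 1.0, 0.0                 # a = dL/dz(T)
    for n in reversed(range(steps)):
        grad += a * zs[n] * dt         # df/dtheta = z at step n
        a *= (1.0 + theta * dt)        # pull the adjoint back one step
    return zs[-1], grad

zT, g = adjoint_grad(0.5)
# analytic: z(T) = e^{theta T} and dL/dtheta = T z0 e^{theta T} ≈ 1.6487
```

Storing the trajectory keeps this sketch simple; the memory-efficient variant instead re-integrates z backwards alongside the adjoint, at the cost of some numerical error.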
Gradients of neural networks can be computed efficiently for any archite...
Time series with non-uniform intervals occur in many applications, and a...
Flow-based generative models parameterize probability distributions thro...
A promising class of generative models maps points from a simple distrib...