Scalability is a significant challenge when it comes to applying differe...
Diffusion models (DMs) are widely used for generating high-quality image...
A major challenge in applying differential privacy to training deep neur...
Maximum mean discrepancy (MMD) is a particularly useful distance metric ...
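As background on MMD itself (a textbook sketch, not the method of the paper above; the RBF kernel, bandwidth, and toy data are illustrative choices), here is the standard unbiased estimator of squared MMD between two samples:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """RBF (Gaussian) kernel matrix between rows of X and rows of Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    """Unbiased U-statistic estimate of squared MMD between samples X and Y."""
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    # Drop the diagonal (self-similarity) terms to remove the bias.
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))
Y_same = rng.normal(0.0, 1.0, size=(500, 2))
Y_diff = rng.normal(1.0, 1.0, size=(500, 2))
print(mmd2_unbiased(X, Y_same))  # near zero: same distribution
print(mmd2_unbiased(X, Y_diff))  # clearly positive: distributions differ
```

The estimate is near zero when both samples come from the same distribution and grows as the distributions separate, which is what makes MMD usable as a training signal or test statistic.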
Training even moderately sized generative models with differentially pri...
We are interested in privatizing an approximate posterior inference algo...
Kernel mean embedding is a useful tool to compare probability measures.
...
We introduce Dirichlet pruning, a novel post-processing technique to tra...
We introduce a novel framework to quantify the importance of each input ...
We present a differentially private data generation paradigm using rando...
Developing a differentially private deep learning algorithm is challengi...
We develop a novel approximate Bayesian computation (ABC) framework, ABC...
Convolutional neural networks (CNNs) in recent years have made a dramati...
Interpretable predictions, where it is clear why a machine learning mode...
The use of propensity score methods to reduce selection bias when determ...
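As general background on propensity score methods (an illustrative textbook sketch with simulated data, not drawn from the paper above): estimate the propensity score with a logistic regression, then reweight observations by inverse propensities to remove the selection bias a naive comparison suffers from.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                      # confounder
p_treat = 1 / (1 + np.exp(-x))              # true propensity depends on x
t = (rng.random(n) < p_treat).astype(float) # treatment assignment
y = 2.0 * t + x + rng.normal(size=n)        # outcome; true effect = 2

# Step 1: estimate the propensity score e(x) = P(T=1 | x) with a plain
# logistic regression fit by gradient descent (no external libraries).
X = np.column_stack([np.ones(n), x])
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - t) / n
e_hat = 1 / (1 + np.exp(-X @ w))

# Step 2: naive difference vs. inverse-propensity-weighted estimate.
naive = y[t == 1].mean() - y[t == 0].mean()
ipw = np.mean(t * y / e_hat) - np.mean((1 - t) * y / (1 - e_hat))
print(naive, ipw)  # naive is biased upward; ipw is close to 2
```

Because treated units tend to have larger x, the naive difference overstates the effect; weighting by the estimated inverse propensities recovers an estimate near the true value of 2.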
We propose a new variational family for Bayesian neural networks. We dec...
Kernel two-sample testing is a useful statistical tool in determining wh...
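As background, a textbook permutation-based kernel two-sample test built on an MMD statistic (the kernel, bandwidth, permutation count, and data are all illustrative choices, not the paper's): under the null hypothesis the two samples are exchangeable, so random relabellings calibrate the statistic.

```python
import numpy as np

def rbf_mmd2(X, Y, sigma=1.0):
    """Biased (V-statistic) squared MMD with an RBF kernel."""
    Z = np.vstack([X, Y])
    sq = np.sum(Z**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * Z @ Z.T
    K = np.exp(-sq / (2 * sigma**2))
    m = len(X)
    return K[:m, :m].mean() + K[m:, m:].mean() - 2 * K[:m, m:].mean()

def permutation_test(X, Y, n_perms=200, sigma=1.0, seed=0):
    """p-value: fraction of random relabellings with an MMD at least
    as large as the one actually observed."""
    rng = np.random.default_rng(seed)
    observed = rbf_mmd2(X, Y, sigma)
    Z = np.vstack([X, Y])
    m = len(X)
    count = 0
    for _ in range(n_perms):
        perm = rng.permutation(len(Z))
        count += rbf_mmd2(Z[perm[:m]], Z[perm[m:]], sigma) >= observed
    return (count + 1) / (n_perms + 1)

rng = np.random.default_rng(1)
X = rng.normal(0, 1, (100, 1))
Y = rng.normal(1, 1, (100, 1))
print(permutation_test(X, Y))  # small p-value: distributions differ
```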
We provide a general framework for privacy-preserving variational Bayes ...
We develop a privatised stochastic variational inference method for Late...
Iteratively reweighted least squares (IRLS) is a widely-used method in m...
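As a refresher on IRLS itself (generic background, not the method developed in the paper above): for logistic regression, Newton's method can be written as a sequence of weighted least-squares solves, which is the classic IRLS recipe.

```python
import numpy as np

def irls_logistic(X, y, n_iters=20):
    """Fit logistic regression by IRLS: each Newton step solves a
    weighted least-squares problem on a 'working response' z."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
        s = p * (1 - p)                             # per-example weights
        z = X @ w + (y - p) / np.maximum(s, 1e-9)   # working response
        # Weighted least squares: solve (X^T S X) w = X^T S z.
        WX = X * s[:, None]
        w = np.linalg.solve(X.T @ WX, WX.T @ z)
    return w

rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
true_w = np.array([0.5, 2.0, -1.0])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_w))).astype(float)
w_hat = irls_logistic(X, y)
print(w_hat)  # roughly recovers true_w
```

Each iteration touches the full dataset through the reweighted normal equations, which is one reason IRLS-style updates are a natural target for privatization.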
The iterative nature of the expectation maximization (EM) algorithm pres...
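For context on what the "iterative nature" of EM refers to (a generic textbook sketch, not the paper's algorithm): EM alternates an E-step, computing responsibilities under current parameters, with an M-step, re-estimating parameters from the reweighted data. A minimal two-component 1D Gaussian mixture:

```python
import numpy as np

def em_gmm_1d(x, n_iters=50):
    """EM for a two-component 1D Gaussian mixture."""
    mu = np.array([x.min(), x.max()])   # crude initialisation
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iters):
        # E-step: responsibility of each component for each point.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
               / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(1, keepdims=True)
        # M-step: re-estimate mixture weights, means, variances.
        nk = r.sum(0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])
pi, mu, var = em_gmm_1d(x)
print(pi, mu)  # weights near [0.3, 0.7], means near [-2, 3]
```

Every iteration reads the whole dataset in the E-step, which is exactly what makes repeated EM updates costly from a privacy-budget perspective.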
Complicated generative models often result in a situation where computin...
We introduce the Locally Linear Latent Variable Model (LL-LVM), a probab...
Neural population activity often exhibits rich variability and temporal ...
The ultimate goal of optimization is to find the minimizer of a target f...