These lectures, presented at the 2022 Les Houches Summer School on Stati...
We study the distribution of a fully connected neural network with rando...
This article derives and validates three principles for initialization a...
In this short note we consider random fully connected ReLU networks of w...
This article concerns Bayesian inference using deep linear networks with...
Training a neural network requires choosing a suitable learning rate, in...
Advanced deep neural networks (DNNs), designed by either human or AutoML...
This article considers fully connected neural networks with Gaussian ran...
We prove a precise geometric description of all one layer ReLU networks ...
This article gives a new proof that fully connected neural networks with...
This book develops an effective theory approach to understanding deep ne...
Assessing the complexity of functions computed by a neural network helps...
Neural Networks (NNs) are the method of choice for building learning alg...
We present a theoretical framework recasting data augmentation as stocha...
We prove the precise scaling, at finite depth and width, for the mean an...
The success of deep networks has been attributed in part to their expres...
It is well-known that the expressivity of a neural network depends on it...
We study products of random matrices in the regime where the number of t...
We investigate the effects of initialization and architecture on the sta...
We give a rigorous analysis of the statistical behavior of gradients in ...
This article concerns the expressive power of depth in deep feed-forward...
This article concerns the expressive power of depth in neural nets with ...