Large language models based on transformers have achieved great empirica...
Self-supervised learning (SSL) has emerged as a powerful framework to le...
Unsupervised representation learning aims at describing raw data efficie...
Single-index models are a class of functions given by an unknown univari...
Several recent works have proposed a class of algorithms for the offline...
Neural networks are known to be highly sensitive to adversarial examples...
In this paper, we tackle the computational efficiency of kernelized UCB...
Large-scale machine learning systems often involve data distributed acro...
Energy-based models (EBMs) are generative models that are usually traine...
Many supervised learning problems involve high-dimensional data such as...
We study the approximation power of Graph Neural Networks (GNNs) on late...
Energy-based models (EBMs) are a simple yet powerful framework for gener...
The success of deep convolutional networks on tasks involving high-di...
Deep networks are often considered to be more expressive than shallow on...
We study properties of Graph Convolutional Networks (GCNs) by analyzing ...
Counterfactual reasoning from logged data has become increasingly import...
State-of-the-art neural networks are heavily over-parameterized, making ...
Despite their success, deep neural networks suffer from several drawback...
We study and empirically optimize contextual bandit learning, exploratio...
In this paper, we study deep signal representations that are invariant t...
Stochastic optimization algorithms with variance reduction have proven s...