Robustness and compactness are two essential components of deep learning...
We propose a novel framework and a solution to tackle the continual lear...
We design a new family of hybrid CNN-ViT neural networks, named FasterVi...
Cascaded computation, whereby predictions are recurrently refined over s...
Structural pruning can simplify network architecture and improve inferen...
We propose global context vision transformer (GC ViT), a novel architect...
In this work we demonstrate the vulnerability of vision transformers (Vi...
Federated learning (FL) allows the collaborative training of AI models w...
We introduce AdaViT, a method that adaptively adjusts the inference cost...
Pruning enables appealing reductions in network memory footprint and tim...
Transformers yield state-of-the-art results across many tasks. However, ...
Understanding the behavior and vulnerability of pre-trained deep neural ...
Training deep neural networks requires gradient estimation from data bat...
Mental health problems impact quality of life of millions of people arou...
Modern deep neural networks are powerful and widely applicable models th...
Deep neural networks (DNNs) have been deployed in myriad machine learnin...
We introduce DeepInversion, a new method for synthesizing images from th...
Diabetes impacts the quality of life of millions of people. However, dia...
Deep neural networks (DNNs) have become a widely deployed model for nume...
Many long short-term memory (LSTM) applications need fast yet compact mo...
This paper proposes an efficient neural network (NN) architecture design...
Long short-term memory (LSTM) has been widely used for sequential data m...
Neural networks (NNs) have begun to have a pervasive impact on various a...