The prompt-based learning paradigm, which bridges the gap between pre-tr...
Federated learning (FL) enables multiple data owners to build machine le...
Recent studies on adversarial images have shown that they tend to leave ...
The fine-tuning of pre-trained language models has achieved great success in ma...
Current state-of-the-art cross-lingual summarization models employ multi...
Automatic question generation can benefit many applications ranging from...
Many state-of-the-art neural models for NLP are heavily parameterized an...
This paper tackles the problem of reading comprehension over long narrat...
Recurrent neural networks (RNNs) such as long short-term memory and gate...
We propose DecaProp (Densely Connected Attention Propagation), a new den...
Learning a matching function between two text sequences is a long-standi...
The dominant, state-of-the-art collaborative filtering (CF) methods toda...
Attention is typically used to select informative sub-phrases that are u...
Sarcasm is a sophisticated speech act which commonly manifests on social...
We propose MRU (Multi-Range Reasoning Units), a new fast compositional e...
Many recent state-of-the-art recommender systems such as D-ATT, TransNet...
This paper presents a new deep learning architecture for Natural Languag...
Temporal gates play a significant role in modern recurrent-based neural ...
Deep learning has demonstrated tremendous potential for Automatic Text S...
Many popular knowledge graphs such as Freebase, YAGO or DBPedia maintain...
The dominant neural architectures in question answer retrieval are based...
The YouTube-8M video classification challenge requires teams to classify...