Numerical reasoning is vital for natural language processing models to u...
Sound classification models' performance suffers when generalizing to ou...
Multi-modal sarcasm detection has attracted much recent attention. Never...
Prompting methods have shown impressive performance in a variety of text...
Structured text extraction is one of the most valuable and challenging a...
Two-Tower Vision-Language (VL) models have shown promising improvements ...
Spoken Language Understanding (SLU) is one of the core components of a t...
Recently, much Chinese text error correction work has focused on Chinese...
The limited scale of annotated data constrains existing context-depende...
Prompt-based learning reformulates downstream tasks as cloze problems by...
Cross-domain text classification aims to adapt models to a target domain...
Zero-shot dialogue understanding aims to enable dialogue to track the us...
Image augmentation is a common mechanism to alleviate data scarcity in c...
Multilingual spoken language understanding (SLU) consists of two sub-tas...
In this paper, we study the problem of knowledge-intensive text-to-SQL, ...
Text-to-SQL semantic parsing is an important NLP task, which greatly fac...
Table-and-text hybrid question answering (HybridQA) is a widely used and...
Natural language processing for programming, which aims to use NLP techn...
The Pre-trained Language Model (PLM) has become a representative foundation ...
The prompting method is regarded as one of the crucial advances in few-shot...
In this paper, we present an overview of the CTC 2021, a Chinese text co...
Pre-trained language models learn large amounts of knowledge from their ...
Due to the high data demands of current methods, attention to zero-shot cros...
Existing Chinese text error detection mainly focuses on spelling and sim...
Prompting methods recently achieve impressive success in few-shot learni...
Existing text-to-SQL semantic parsers are typically designed for particu...
Knowledge graph embedding (KGE) models learn the representation of entit...
Deep neural networks are vulnerable to adversarial examples in Natural L...
Current research on spoken language understanding (SLU) is heavily li...
As an effective strategy, data augmentation (DA) alleviates data scarcit...
The Interaction between Drugs and Targets (DTI) in the human body plays a cr...
Consistency Identification has achieved remarkable success on open-domai...
In this paper, we provide a bilingual parallel human-to-human recommenda...
Compared to monolingual models, cross-lingual models usually require a m...
Achieving human-level performance on some of Machine Reading Comprehensi...
Fine-tuning pre-trained cross-lingual language models can transfer task-...
Adversarial training (AT), as a regularization method, has proved its effe...
Multi-intent SLU can handle multiple intents in an utterance, which has ...
Multilingual pre-trained models have achieved remarkable transfer perfor...
Spoken Language Understanding (SLU) aims to extract the semantic frame ...
Learning interpretable dialog structure from human-human dialogs yields ...
Pre-training of text and layout has proved effective in a variety of vis...
In a dialog system, dialog act recognition and sentiment classification ...
Slot filling, a fundamental module of spoken language understanding, oft...
Most existing approaches to disfluency detection heavily rely on human-a...
In this paper, we study the few-shot multi-label classification for user...
Intent detection and slot filling are two closely related tasks for buil...
Intent detection and slot filling are two main tasks for building a spok...
We introduce N-LTP, an open-source Python Chinese natural language proce...