Adapting Task-Oriented Dialogue Models for Email Conversations

08/19/2022
by Soham Deshmukh, et al.

Intent detection is a key component of any Natural Language Understanding (NLU) system in a conversational assistant. Detecting the correct intent is essential yet difficult for email conversations, where multiple directives and intents can be present. In such settings, conversation context becomes a key disambiguating factor for detecting the user's request to the assistant. One prominent way of incorporating context is to model past conversation history, as task-oriented dialogue models do. However, the long-form nature of email conversations prevents direct use of the latest advances in task-oriented dialogue models. In this paper, we provide an effective transfer learning framework (EMToD) that allows the latest developments in dialogue models to be adapted for long-form conversations. We show that the proposed EMToD framework improves intent detection performance over pre-trained language models by 45% and over pre-trained dialogue models by 30%. Additionally, the modular nature of the proposed framework allows plug-and-play integration of any future developments in both pre-trained language and task-oriented dialogue models.
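To make the plug-and-play idea concrete, the sketch below shows one way a modular intent classifier might be structured, where the encoder checkpoint (a pre-trained language model or a dialogue-pretrained model) is an interchangeable component. This is only an illustration of the general pattern, not the EMToD architecture itself; the checkpoint names, the number of intents, and the simple truncation strategy are assumptions for the example.

```python
# Illustrative sketch: a modular intent classifier where the encoder checkpoint
# is a plug-and-play choice. Not the paper's EMToD framework; checkpoint names,
# num_intents, and the truncation strategy are assumptions for this example.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer


class ModularIntentClassifier(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased", num_intents: int = 10):
        super().__init__()
        # Any encoder checkpoint can be dropped in here, e.g. a dialogue-pretrained
        # model, subject to its input-length limit (the long-form email problem).
        self.tokenizer = AutoTokenizer.from_pretrained(encoder_name)
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_intents)

    def forward(self, context: str, current_email: str) -> torch.Tensor:
        # Pair the prior conversation context with the current email and truncate
        # to the encoder's maximum input length.
        inputs = self.tokenizer(
            context,
            current_email,
            truncation=True,
            max_length=512,
            return_tensors="pt",
        )
        hidden = self.encoder(**inputs).last_hidden_state[:, 0]  # [CLS] pooling
        return self.classifier(hidden)  # intent logits


# Usage: swap the encoder checkpoint without touching the rest of the pipeline.
model = ModularIntentClassifier("bert-base-uncased", num_intents=10)
logits = model("Earlier thread: budget approval request.", "Can you schedule a review meeting?")
predicted_intent = logits.argmax(dim=-1)
```

The design point this illustrates is that the encoder is isolated behind a single constructor argument, so improvements in either pre-trained language models or task-oriented dialogue models can be adopted without changing the surrounding intent-detection pipeline.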
