Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning

Abstract

Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Unlike existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework, aiming to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets show the effectiveness of our method.
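The paper's exact objective is not reproduced on this page. As a rough illustration only, the family of objectives the abstract refers to (supervised contrastive learning over labeled IND intents) can be sketched as below; the NumPy formulation, function name, and temperature value are my assumptions, not the authors' implementation.

```python
import numpy as np

def supervised_contrastive_loss(z, labels, temperature=0.1):
    """Illustrative supervised contrastive loss (not the paper's exact objective).

    z: (n, d) embeddings; labels: (n,) intent labels.
    Each anchor is pulled toward same-label samples ("positives")
    and pushed away from all other samples in the batch.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize embeddings
    sim = z @ z.T / temperature                        # scaled cosine similarities
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    exp_sim = np.exp(sim)
    exp_sim[self_mask] = 0.0                           # anchor never its own positive
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    positives = (labels[:, None] == labels[None, :]) & ~self_mask
    # average log-probability over each anchor's positives, then negate
    per_anchor = -(log_prob * positives).sum(axis=1) / np.maximum(positives.sum(axis=1), 1)
    return per_anchor.mean()
```

With well-separated intent clusters and correct labels, this loss is lower than with mismatched labels, which is the property such pre-training exploits before clustering OOD utterances.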

Publication
ACL 2022
Yutao Mu
Postgraduate Student

Task-oriented Dialogue System, Spoken Language Understanding

Keqing He
Postgraduate Student

Dialogue System, Summarization, Pre-trained Language Models

Yanan Wu
Postgraduate Student

Spoken Language Understanding

Zhiyuan Zeng
Postgraduate Student

Spoken Language Understanding, Text Generation

Hong Xu
Postgraduate Student

Natural Language Processing, Intent Detection

Weiran Xu
Associate Professor, Master Supervisor, Ph.D. Supervisor

Information Retrieval, Pattern Recognition, Machine Learning