Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning

Abstract

Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Unlike existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework, which aims to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets show the effectiveness of our method.
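To make the multi-head contrastive learning idea concrete, below is a minimal, illustrative sketch rather than the authors' implementation: it assumes a shared encoder feeding two hypothetical projection heads (ind_head for labeled IND pre-training, ood_head for OOD clustering) and uses a generic supervised contrastive loss; all names, dimensions, and the loss form are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProjectionHead(nn.Module):
    """Small MLP mapping shared encoder features into a contrastive space."""
    def __init__(self, in_dim=768, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, in_dim), nn.ReLU(),
                                 nn.Linear(in_dim, out_dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings


def supervised_contrastive_loss(z, labels, temperature=0.07):
    """Generic supervised contrastive loss: same-label pairs are positives."""
    sim = z @ z.t() / temperature                         # pairwise similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    logits = sim.masked_fill(self_mask, float('-inf'))    # never contrast with self
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)   # keep only positive pairs
    return -(pos_log_prob.sum(1) / pos_mask.sum(1).clamp(min=1)).mean()


# Two separate heads over one encoder output (hypothetical setup):
# one head is trained with labeled IND intents, the other is reserved
# for the OOD clustering stage, so the two objectives stay disentangled.
encoder_dim = 768
ind_head = ProjectionHead(encoder_dim)
ood_head = ProjectionHead(encoder_dim)
feats = torch.randn(16, encoder_dim)        # stand-in for BERT [CLS] features
ind_labels = torch.randint(0, 4, (16,))     # stand-in IND intent labels
loss_ind = supervised_contrastive_loss(ind_head(feats), ind_labels)
```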

Conference
ACL 2022
牟宇滔
Master's student

Task-oriented dialogue systems, natural language understanding

何可清
Master's student

Dialogue systems, summarization, pre-training

吴亚楠
Master's student

Natural language understanding

曾致远
Master's student

Natural language understanding, text generation

徐红
Master's student

Natural language processing, intent recognition

徐蔚然
Associate professor, supervisor of master's and doctoral students

Information retrieval, pattern recognition, machine learning