Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation

Abstract

Out-of-Domain (OOD) detection is a key component of a task-oriented dialog system, aiming to identify whether a query falls outside the predefined set of supported intents. Previous softmax-based detection algorithms have been shown to be overconfident on OOD samples. In this paper, we analyze how this overconfidence arises from distribution uncertainty caused by the mismatch between training and test distributions, which prevents the model from making confident predictions and thus produces abnormal softmax scores. We propose a Bayesian OOD detection framework that calibrates distribution uncertainty using Monte-Carlo Dropout. Our method is flexible and easily pluggable into existing softmax-based baselines, gaining a 33.33% OOD F1 improvement while increasing inference time by only 0.41% compared to MSP. Further analyses demonstrate the effectiveness of Bayesian learning for OOD detection.
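
The following is a minimal sketch of how Monte-Carlo Dropout can be used to score OOD queries, assuming a PyTorch intent classifier that contains dropout layers; the function names and the threshold-based decision are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def enable_mc_dropout(model: nn.Module) -> None:
    """Keep dropout layers stochastic at inference time while the rest
    of the model stays in eval mode."""
    model.eval()
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()


@torch.no_grad()
def mc_dropout_score(model: nn.Module, x: torch.Tensor, n_samples: int = 10):
    """Average the softmax over several stochastic forward passes and use
    the maximum of the mean distribution as the confidence score; a low
    score suggests the query is out of domain."""
    enable_mc_dropout(model)
    probs = torch.stack(
        [F.softmax(model(x), dim=-1) for _ in range(n_samples)], dim=0
    )
    mean_probs = probs.mean(dim=0)            # calibrated predictive distribution
    confidence, pred = mean_probs.max(dim=-1)
    return confidence, pred                   # flag OOD when confidence < threshold
```

Because only the softmax score is replaced by its Monte-Carlo average, this kind of scoring can be plugged into existing softmax-based baselines such as MSP without retraining.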

Publication
COLING 2022
Yanan Wu
Postgraduate Student

Spoken Language Understanding

Zhiyuan Zeng
Postgraduate Student

Spoken Language Understanding, Text Generation

Keqing He
Postgraduate Student

Dialogue System, Summarization, Pre-training Language Model

Yutao Mu
Postgraduate Student

Task-oriented Dialogue System, Spoken Language Understanding

Pei Wang
Postgraduate Student

Weiran Xu
Associate Professor, Master Supervisor, Ph.D. Supervisor

Information Retrieval, Pattern Recognition, Machine Learning