PSSAT: A Perturbed Semantic Structure Awareness Transferring Method for Perturbation-Robust Slot Filling

Abstract

Most existing slot filling models tend to memorize inherent patterns of entities and their corresponding contexts from training data. However, this memorization can lead to system failures or undesirable outputs when such models are exposed to spoken language perturbations or variations in practice. We propose a perturbed semantic structure awareness transferring method for training perturbation-robust slot filling models. Specifically, we introduce two MLM-based training strategies to learn, respectively, the contextual semantic structure and the word distribution of an unsupervised language perturbation corpus. We then transfer the semantic knowledge learned in this upstream training procedure into the original samples and filter the generated data through consistency processing. These procedures enhance the robustness of slot filling models. Experimental results show that our method consistently outperforms previous baseline methods and generalizes strongly while preventing the model from memorizing inherent patterns of entities and contexts.
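The MLM-based training strategies above rest on standard masked language modeling over the perturbation corpus. As a minimal sketch (not the paper's implementation), the snippet below shows BERT-style masking of a tokenized perturbed utterance: a fraction of positions is selected, and each selected token becomes `[MASK]` 80% of the time, a random vocabulary token 10% of the time, and stays unchanged 10% of the time, with labels kept only at the selected positions. The function and constant names are illustrative.

```python
import random

MASK, IGNORE = "[MASK]", -100  # conventional MLM mask token and ignore-index

def mask_for_mlm(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style MLM masking.

    Of the positions selected with probability `mask_prob`, 80% are replaced
    by [MASK], 10% by a random vocabulary token, and 10% left unchanged.
    `labels` holds the original token at selected positions and IGNORE
    elsewhere, so the loss is computed only on masked positions.
    """
    rng = random.Random(seed)
    inputs, labels = list(tokens), [IGNORE] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK     # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.choice(vocab)  # 10%: random token
            # else 10%: keep the original token as input

    return inputs, labels

# Example on a typo-perturbed utterance ("jazzz"):
tokens = "play some jazzz music on spotify".split()
vocab = ["play", "some", "music", "on", "song", "the"]
inputs, labels = mask_for_mlm(tokens, vocab, mask_prob=0.5, seed=1)
```

A model trained on such (inputs, labels) pairs drawn from perturbed utterances is pushed to reconstruct tokens from perturbed contexts, which is the intuition behind learning contextual semantic structure and word distribution from the perturbation corpus.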

Publication
COLING 2022
Guanting Dong
Postgraduate Student

Spoken Language Understanding and related applications

Daichi Guo
Postgraduate Student

Debiasing, Class Imbalance

Liwen Wang
Postgraduate Student

Spoken Language Understanding and related applications

Xuefeng Li
Postgraduate Student

Slot Filling, Intent Detection

Zechen Wang
Postgraduate Student

Chen Zeng
Postgraduate Student
Keqing He
Postgraduate Student

Dialogue System, Summarization, Pre-training Language Model

Hao Lei
Postgraduate Student

Machine Reading Comprehension

Weiran Xu
Associate Professor, Master Supervisor, Ph.D. Supervisor

Information Retrieval, Pattern Recognition, Machine Learning