Generative zero-shot prompt learning for cross-domain slot filling with inverse prompting

Abstract

Zero-shot cross-domain slot filling aims to transfer knowledge from a labeled source domain to an unlabeled target domain. Existing models either encode slot descriptions and examples or design handcrafted question templates using heuristic rules, and thus suffer from poor generalization or robustness. In this paper, we propose a generative zero-shot prompt learning framework for cross-domain slot filling that improves both generalization and robustness over previous work. In addition, we introduce a novel inverse prompting strategy that distinguishes different slot types to avoid the multiple-prediction problem, as well as an efficient prompt-tuning strategy that boosts performance while training only a small number of prompt parameters. Experiments and analysis demonstrate the effectiveness of our proposed framework, with especially large improvements (+13.44% F1) on unseen slots.
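The forward/inverse prompting idea can be sketched with plain string templates: a forward prompt asks a generative model to produce the value of a given slot type, while an inverse prompt asks which slot type a candidate value belongs to, filtering spans that get predicted for multiple slot types. The template wording and function names below are illustrative assumptions, not the paper's exact formulation.

```python
def forward_prompt(utterance: str, slot_type: str) -> str:
    """Ask the generative model to produce the value of `slot_type`
    mentioned in `utterance` (template is a hypothetical example)."""
    return f'{utterance} The "{slot_type}" is'

def inverse_prompt(utterance: str, value: str) -> str:
    """Ask the model which slot type the candidate `value` refers to;
    used to reject values generated for more than one slot type."""
    return f'{utterance} "{value}" refers to the'

utt = "book a flight from denver to philadelphia"
print(forward_prompt(utt, "departure city"))
print(inverse_prompt(utt, "denver"))
```

In a zero-shot setting, prompts built from natural-language slot names let the model generalize to target-domain slots never seen during source-domain training.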

Publication
ACL 2023
Xuefeng Li
Postgraduate Student

Slot Filling, Intent Detection

Liwen Wang
Postgraduate Student

Spoken Language Understanding and related applications

Guanting Dong
Postgraduate Student

Spoken Language Understanding and related applications

Keqing He
Postgraduate Student

Dialogue Systems, Summarization, Pre-trained Language Models

Hao Lei
Postgraduate Student

Machine Reading Comprehension

Weiran Xu
Associate Professor, Master's Supervisor, Ph.D. Supervisor

Information Retrieval, Pattern Recognition, Machine Learning