Natural Language Processing Research Group of Prof. Weiran Xu
Guanting Dong (董冠霆)
Master's Student
Beijing University of Posts and Telecommunications
Research Interests
Natural Language Understanding
Education
M.S. student, 2021–present
School of Artificial Intelligence, Beijing University of Posts and Telecommunications
B.Eng. in Communication Engineering, 2021
School of Information and Communication Engineering, Beijing University of Posts and Telecommunications
Latest Publications
CS-Bench: A Comprehensive Benchmark for Large Language Models towards Computer Science Mastery
PreAct: Predicting Future in ReAct Enhances Agent's Planning Ability
Multi-Perspective Consistency Enhances Confidence Estimation in Large Language Models
DolphCoder: Echo-Locating Code Large Language Models with Diverse and Multi-Objective Instruction Tuning
Knowledge Editing on Black-box Large Language Models
A Multi-Task Semantic Decomposition Framework with Task-Specific Pre-training for Few-Shot NER
Bridging the KB-Text Gap: Leveraging Structured Knowledge-aware Pre-training for KBQA
DemoNSF: A Multi-task Demonstration-based Generative Framework for Noisy Slot Filling Task
Large Language Models Meet Open-World Intent Discovery and Recognition: An Evaluation of ChatGPT
Revisit Input Perturbation Problems for LLMs: A Unified Robustness Evaluation Framework for Noisy Slot Filling Task
Semantic Parsing by Large Language Models for Intricate Updating Strategies of Zero-Shot Dialogue State Tracking
A Prototypical Semantic Decoupling Method via Joint Contrastive Learning for Few-Shot Named Entity Recognition
Generative Zero-Shot Prompt Learning for Cross-Domain Slot Filling with Inverse Prompting
Revisit Out-Of-Vocabulary Problem For Slot Filling: A Unified Contrastive Framework With Multi-Level Data Augmentations
Entity-level Interaction via Heterogeneous Graph for Multimodal Named Entity Recognition
Exploiting Domain-Slot Related Keywords Description for Few-Shot Cross-Domain Dialogue State Tracking
Semi-Supervised Knowledge-Grounded Pre-training for Task-Oriented Dialog Systems
PSSAT: A Perturbed Semantic Structure Awareness Transferring Method for Perturbation-Robust Slot Filling
A Robust Contrastive Alignment Method For Multi-Domain Text Classification