Calibrating Structured Output Predictors for Natural Language Processing

Cited by: 0
Authors
Jagannatha, Abhyuday [1]
Yu, Hong [1,2]
Affiliations
[1] Univ Massachusetts, Coll Informat & Comp Sci, Amherst, MA 01003 USA
[2] Univ Massachusetts Lowell, Dept Comp Sci, Lowell, MA USA
Funding
US National Institutes of Health (NIH);
Keywords
DOI
Not available
CLC (Chinese Library Classification) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We address the problem of calibrating prediction confidence for output entities of interest in natural language processing (NLP) applications. It is important that NLP applications such as named entity recognition and question answering produce calibrated confidence scores for their predictions, especially if the applications are to be deployed in a safety-critical domain such as healthcare. However, the output space of such structured prediction models is often too large to adapt binary or multi-class calibration methods directly. In this study, we propose a general calibration scheme for output entities of interest in neural-network-based structured prediction models. Our proposed method can be used with any binary class calibration scheme and a neural network model. Additionally, we show that our calibration method can also be used as an uncertainty-aware, entity-specific decoding step to improve the performance of the underlying model with no additional training cost or data requirements. We show that our method outperforms current calibration techniques for named entity recognition, part-of-speech tagging, and question answering. Our decoding step also improves model performance across several tasks and benchmark datasets. Our method additionally improves calibration and model performance in out-of-domain test scenarios.
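The abstract outlines an entity-level calibration idea: rather than calibrating over the full structured output space, each predicted entity of interest receives a confidence feature from the model, and a binary calibrator is fit on whether that entity was correct. The sketch below is a minimal illustration of that general setup, not the authors' actual method; the mean-token-probability feature, the toy held-out data, and the Platt-style logistic calibrator (scikit-learn's LogisticRegression) are assumptions made for demonstration only.

# Minimal sketch of entity-level confidence calibration in the spirit of the
# abstract above. This is NOT the paper's method; it only illustrates the
# general recipe: score each predicted entity of interest with a confidence
# feature from the underlying model, fit a binary calibrator on whether that
# entity was correct on held-out data, and reuse the calibrated probability at
# prediction time (optionally as an uncertainty-aware filter during decoding).
import numpy as np
from sklearn.linear_model import LogisticRegression

def entity_confidence(token_probs):
    # Aggregate the model's per-token probabilities for one predicted entity
    # span into a single uncalibrated confidence score (assumed feature).
    return float(np.mean(token_probs))

# Held-out predictions: one record per predicted entity. `token_probs` would
# come from the underlying NER/QA model; `correct` is 1 if the prediction
# matches the gold annotation.
dev_predictions = [
    {"token_probs": [0.98, 0.95, 0.97], "correct": 1},
    {"token_probs": [0.60, 0.55], "correct": 0},
    {"token_probs": [0.88, 0.91], "correct": 1},
    {"token_probs": [0.52, 0.49, 0.57], "correct": 0},
]

X = np.array([[entity_confidence(p["token_probs"])] for p in dev_predictions])
y = np.array([p["correct"] for p in dev_predictions])

# Binary calibrator (Platt-style): maps raw entity confidence to a calibrated
# estimate of P(entity is correct). Any binary calibration scheme could be
# substituted here, as the abstract notes the approach is agnostic to that choice.
calibrator = LogisticRegression().fit(X, y)

# At test time, each newly predicted entity receives a calibrated confidence;
# thresholding it gives a simple uncertainty-aware decoding/filtering step.
raw = entity_confidence([0.80, 0.76, 0.83])
calibrated = calibrator.predict_proba([[raw]])[0, 1]
print(f"raw confidence = {raw:.3f}, calibrated P(correct) = {calibrated:.3f}")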
Pages: 2078 - 2092
Page count: 15
Related papers
50 records in total
  • [1] Deep Structured Learning for Natural Language Processing
    Li, Yong
    Yang, Xiaojun
    Zuo, Min
    Jin, Qingyu
    Li, Haisheng
    Cao, Qian
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (03)
  • [2] The application of structured learning in natural language processing
    Ni, Yizhao
    Saunders, Craig
    Szedmak, Sandor
    Niranjan, Mahesan
    MACHINE TRANSLATION, 2010, 24 (02) : 71 - 85
  • [3] Special issue on statistical learning of natural language structured input and output
    Marquez, Lluis
    Moschitti, Alessandro
    NATURAL LANGUAGE ENGINEERING, 2012, 18 : 147 - 153
  • [4] Special Issue on Deep Structured Learning for Natural Language Processing
    Manogaran, Gunasekaran
    Qudrat-Ullah, Hassan
    Xin, Qin
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (01)
  • [5] Inferring attribute grammars with structured data for natural language processing
    Starkie, B
    GRAMMATICAL INFERENCE: ALGORITHMS AND APPLICATIONS, 2002, 2484 : 237 - 248
  • [6] Processing natural language without natural language processing
    Brill, E
    COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING, PROCEEDINGS, 2003, 2588 : 360 - 369
  • [7] Natural Language Processing for EHR-Based Pharmacovigilance: A Structured Review
    Luo, Yuan
    Thompson, William K.
    Herr, Timothy M.
    Zeng, Zexian
    Berendsen, Mark A.
    Jonnalagadda, Siddhartha R.
    Carson, Matthew B.
    Starren, Justin
    DRUG SAFETY, 2017, 40 (11) : 1075 - 1089
  • [8] Introduction to the Special Issue on Deep Structured Learning for Natural Language Processing
    Manogaran, Gunasekaran
    Qudrat-Ullah, Hassan
    Xin, Qin
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (03)
  • [9] Using natural language processing and the gene ontology to populate a structured pathway database
    Dehoney, D
    Harte, R
    Lu, Y
    Chin, D
    PROCEEDINGS OF THE 2003 IEEE BIOINFORMATICS CONFERENCE, 2003, : 646 - 647