Mapping probability word problems to executable representations

Cited by: 0
Authors
Suster, Simon [1 ]
Fivez, Pieter [2 ]
Totis, Pietro [3 ]
Kimmig, Angelika [3 ]
Davis, Jesse [3 ]
De Raedt, Luc [3 ]
Daelemans, Walter [2 ]
Affiliations
[1] Univ Melbourne, Melbourne, Vic, Australia
[2] Univ Antwerp, Antwerp, Belgium
[3] Katholieke Univ Leuven, Leuven, Belgium
Keywords: none listed
DOI: none available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract
While solving math word problems automatically has received considerable attention in the NLP community, few works have addressed probability word problems specifically. In this paper, we employ and analyse various neural models for answering such word problems. In a two-step approach, the problem text is first mapped to a formal representation in a declarative language using a sequence-to-sequence model, and the resulting representation is then executed by a probabilistic programming system to produce the answer. Our best-performing model incorporates general-domain contextualised word representations that were fine-tuned via transfer learning on another in-domain dataset. We also apply end-to-end models to this task, which underscores the importance of the two-step approach for obtaining correct solutions to probability problems.
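
The record does not name the declarative language or the probabilistic programming system used in the execution step. As a rough illustration of that step only, the sketch below encodes a toy probability problem ("two biased coins are flipped; what is the probability both land heads?") in ProbLog, a probabilistic logic programming system from the same KU Leuven group, and evaluates it with the problog Python package. The choice of ProbLog and the specific predicates are assumptions made for illustration, not necessarily the paper's actual representation.

# Minimal sketch of the "execute the formal representation" step,
# assuming a ProbLog-style declarative language (an assumption; the
# record does not name the system). Requires: pip install problog
from problog.program import PrologString
from problog import get_evaluatable

# Hypothetical formal representation that a sequence-to-sequence model
# might produce for: "Coin c1 lands heads with probability 0.5 and coin
# c2 with probability 0.6. What is the probability both land heads?"
program = PrologString("""
0.5::heads(c1).
0.6::heads(c2).
both_heads :- heads(c1), heads(c2).
query(both_heads).
""")

# Compile the program and evaluate all queries; the result is a dict
# mapping each query term to its probability.
result = get_evaluatable().create_from(program).evaluate()
print(result)  # {both_heads: 0.3}

In a pipeline like the one the abstract describes, the string passed to PrologString would be the output of the sequence-to-sequence model rather than hand-written, and the probability returned by evaluate() would be the final answer to the word problem.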
Pages: 3627-3640 (14 pages)