CHECKWHY: Causal Fact Verification via Argument Structure

Cited by: 0
Authors
Si, Jiasheng [1 ,4 ]
Zhao, Yibo [2 ,3 ]
Zhu, Yingjie [2 ,3 ]
Zhu, Haiyang [2 ,3 ]
Lu, Wenpeng [1 ,4 ]
Zhou, Deyu [2 ,3 ]
Affiliations
[1] Qilu Univ Technol, Shandong Comp Sci Ctr, Key Lab Comp Power Network & Informat Secur, Minist Educ,Shandong Acad Sci, Jinan, Peoples R China
[2] Southeast Univ, Sch Comp Sci & Engn, Nanjing, Peoples R China
[3] Southeast Univ, Minist Educ, Key Lab New Generat Artificial Intelligence Techn, Nanjing, Peoples R China
[4] Shandong Fundamental Res Ctr Comp Sci, Shandong Prov Key Lab Comp Networks, Jinan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC classification number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the growing complexity of fact verification tasks, the concern with "thoughtful" reasoning capabilities is increasing. However, recent fact verification benchmarks mainly focus on checking a narrow scope of semantic factoids within claims and lack an explicit logical reasoning process. In this paper, we introduce CHECKWHY, a challenging dataset tailored to a novel causal fact verification task: checking the truthfulness of the causal relation within claims through rigorous reasoning steps. CHECKWHY consists of over 19K "why" claim-evidence-argument structure triplets with supports, refutes, and not enough info labels. Each argument structure is composed of connected evidence, representing the reasoning process that begins with foundational evidence and progresses toward claim establishment. Through extensive experiments on state-of-the-art models, we validate the importance of incorporating the argument structure for causal fact verification. Moreover, the automated and human evaluation of argument structure generation reveals the difficulty in producing satisfactory argument structures by fine-tuned models or Chain-of-Thought prompted LLMs, leaving considerable room for future improvements.
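To make the described data format concrete, the following is a minimal, hypothetical sketch of what a single claim-evidence-argument structure triplet could look like in code; the field names, edge encoding, and example values are illustrative assumptions and do not reflect the dataset's actual published schema.

```python
from dataclasses import dataclass

# Hypothetical representation of one CHECKWHY instance, based only on the
# abstract's description: a causal claim, an argument structure of connected
# evidence, and a three-way veracity label. All names here are assumptions.

@dataclass
class ArgumentStructure:
    evidence: dict[str, str]            # evidence sentences keyed by id, e.g. "E1"
    edges: list[tuple[str, str]]        # directed links from foundational evidence toward the claim

@dataclass
class CheckWhyInstance:
    claim: str                          # a "why" claim asserting a causal relation
    argument: ArgumentStructure         # connected evidence forming the reasoning process
    label: str                          # "SUPPORTS", "REFUTES", or "NOT ENOUGH INFO"

example = CheckWhyInstance(
    claim="Event A caused outcome B.",
    argument=ArgumentStructure(
        evidence={
            "E1": "Foundational evidence sentence.",
            "E2": "Intermediate evidence that builds on E1.",
        },
        edges=[("E1", "E2"), ("E2", "CLAIM")],
    ),
    label="SUPPORTS",
)
```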
Pages: 15636-15659
Number of pages: 24