Reasoning About Trust and Belief Change on a Social Network: A Formal Approach

Cited: 2
Authors
Hunter, Aaron [1 ]
Affiliations
[1] BC Inst Technol, Burnaby, BC, Canada
DOI
10.1007/978-3-319-72359-4_49
Chinese Library Classification (CLC)
TP [automation and computer technology]
Subject Classification Code
0812
Abstract
One important aspect of trust is the following: when a trusted source reports some new information, we are likely to believe that the new information is true. As such, the notion of trust is closely connected to the notion of belief change. In this paper, we demonstrate how a formal model of trust developed in the Artificial Intelligence community can be used to model the dynamics of belief on a social network. We use a formal model to capture the perceived areas of expertise of each agent, and we introduce a logical operator to determine how beliefs change following reported information. Significantly, the trust held in another agent is not determined solely by individual expertise; the extent to which an agent is trusted is also influenced by the social relationships between agents. We prove a number of formal properties and demonstrate that our approach can model a wide range of practical trust problems involving social agents. This work is largely foundational, and it connects two different research communities. In particular, it illustrates how fundamentally logic-based models of reasoning can be applied to solve problems related to trust on social networks.
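For readers approaching this from an implementation angle, the following is a minimal Python sketch of the kind of mechanism the abstract describes: agents on a social network with perceived areas of expertise, and a report-driven belief change step in which trust depends both on perceived expertise and on social relationships. The data structures, the endorsement threshold, and the functions trusts and report are illustrative assumptions for this sketch, not the paper's actual formal operator.

```python
# Illustrative sketch only; the paper's logical operator is not reproduced here.
# Assumptions: beliefs are truth-value assignments to propositional atoms; each
# agent records which atoms it perceives other agents to be experts on; a
# reporter is trusted on an atom if the receiver perceives it as an expert, or
# if at least `threshold` of the receiver's neighbours do (a hypothetical
# social-endorsement rule); a trusted report revises the receiver's belief.

from dataclasses import dataclass, field
from typing import Dict, Set


@dataclass
class Agent:
    name: str
    beliefs: Dict[str, bool] = field(default_factory=dict)
    # perceived_expertise[other] = atoms this agent believes `other` is expert on
    perceived_expertise: Dict[str, Set[str]] = field(default_factory=dict)
    neighbours: Set[str] = field(default_factory=set)


def trusts(receiver: Agent, reporter: str, atom: str,
           agents: Dict[str, Agent], threshold: int = 1) -> bool:
    """Trust the reporter on `atom` directly or via neighbour endorsement."""
    if atom in receiver.perceived_expertise.get(reporter, set()):
        return True
    endorsements = sum(
        1 for n in receiver.neighbours
        if atom in agents[n].perceived_expertise.get(reporter, set())
    )
    return endorsements >= threshold


def report(receiver: Agent, reporter: str, atom: str, value: bool,
           agents: Dict[str, Agent]) -> None:
    """Belief change on a report: revise only when the reporter is trusted."""
    if trusts(receiver, reporter, atom, agents):
        receiver.beliefs[atom] = value  # untrusted reports are simply ignored


if __name__ == "__main__":
    a = Agent("a", beliefs={"p": False}, neighbours={"b"})
    b = Agent("b", perceived_expertise={"c": {"p"}}, neighbours={"a"})
    c = Agent("c")
    agents = {x.name: x for x in (a, b, c)}

    # a does not trust c on p directly, but a's neighbour b perceives c as an
    # expert on p, so the report is accepted and a's belief about p is revised.
    report(a, "c", "p", True, agents)
    print(a.beliefs)  # {'p': True}
```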
Pages: 783-801 (19 pages)