Preference-based query tuning through refinement/enlargement in a formal context

Cited by: 0
Authors
Spyratos, N. [1]
Meghini, C. [2]
Affiliations
[1] Univ Paris 11, Lab Rech Informat, Orsay, France
[2] CNR, Ist Sci & Tecnol Informaz, I-56100 Pisa, Italy
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The user of an information system rarely knows exactly what he is looking for, but once shown a piece of information he can quickly tell whether it is what he needs. Query tuning is the process of searching for the query that best approximates the information need of the user. Typically, navigation and querying are two completely separate processes, and the user usually has to switch often from one to the other, a painstaking process that makes for a frustrating experience. In this paper, we propose an approach to query tuning that integrates navigation and querying into a single process, thus leading to a more flexible and more user-friendly method of query tuning. The proposed approach is based on formal concept analysis, and models the directory of an information source as a formal context in which the underlying concept lattice serves for navigation and the attributes of the formal context serve for query formulation. In order to support the user in coping with a possibly overwhelming number of alternative query tunings, preferences are introduced.
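As a concrete illustration of the machinery the abstract describes, here is a minimal sketch, not taken from the paper, of a formal context over a toy document directory. The object and attribute names are hypothetical; the two derivation operators of formal concept analysis are implemented directly, and a tune step closes a user query to its formal concept and enumerates the attributes by which it can be refined (moving down the concept lattice) or enlarged (moving up). Preference handling over the resulting alternatives is omitted.

# A minimal sketch of the formal-context idea behind refinement/enlargement-based
# query tuning. The toy directory below is hypothetical, not from the paper.

# Formal context: each object (document) is described by a set of attributes (index terms).
CONTEXT = {
    "d1": {"database", "query", "tuning"},
    "d2": {"database", "query", "preference"},
    "d3": {"lattice", "query", "preference"},
    "d4": {"lattice", "navigation"},
}

ATTRIBUTES = set().union(*CONTEXT.values())


def extent(attrs):
    """Derivation operator A': objects possessing all attributes in attrs."""
    return {o for o, oa in CONTEXT.items() if attrs <= oa}


def intent(objects):
    """Derivation operator O': attributes shared by all objects in the set."""
    if not objects:
        return set(ATTRIBUTES)
    shared = set(ATTRIBUTES)
    for o in objects:
        shared &= CONTEXT[o]
    return shared


def tune(query):
    """Close a query (attribute set) to its formal concept and list one-step moves."""
    concept_intent = intent(extent(query))    # closure of the query
    concept_extent = extent(concept_intent)   # its answer set
    # Refinements: attributes that can be added without emptying the answer (move down).
    refinements = {a for a in ATTRIBUTES - concept_intent
                   if extent(concept_intent | {a})}
    # Enlargements: attributes that can be dropped to broaden the answer (move up).
    enlargements = set(concept_intent)
    return concept_extent, refinements, enlargements


if __name__ == "__main__":
    answer, down, up = tune({"query"})
    print("answer:", sorted(answer))           # documents matching the tuned query
    print("refine with:", sorted(down))        # candidate terms that narrow the query
    print("enlarge by dropping:", sorted(up))  # terms whose removal broadens it

On this toy context, tune({"query"}) answers with d1, d2 and d3 and offers database, lattice, preference and tuning as refinement terms; navigation is never offered because adding it would empty the answer. The paper's approach additionally uses preferences to rank such alternatives when they become too numerous to inspect.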
Pages: 278-293 (16 pages)
Related papers (50 in total)
  • [1] Preference-Based Query Answering in Probabilistic Datalog+/- Ontologies
    Lukasiewicz, Thomas
    Martinez, Maria Vanina
    Simari, Gerardo I.
    Tifrea-Marciuska, Oana
    JOURNAL ON DATA SEMANTICS, 2015, 4 (02): 81-101
  • [2] Preference-Based Query Answering in Probabilistic Datalog+/- Ontologies
    Lukasiewicz, Thomas
    Martinez, Maria Vanina
    Simari, Gerardo I.
    ON THE MOVE TO MEANINGFUL INTERNET SYSTEMS: OTM 2013 CONFERENCES, 2013, 8185: 501-518
  • [3] Context-aware, preference-based vehicle routing
    Guo, Chenjuan
    Yang, Bin
    Hu, Jilin
    Jensen, Christian S.
    Chen, Lu
    VLDB JOURNAL, 2020, 29 (05): 1149-1170
  • [4] Preference-based reinforcement learning: a formal framework and a policy iteration algorithm
    Fürnkranz, Johannes
    Hüllermeier, Eyke
    Cheng, Weiwei
    Park, Sang-Hyeun
    MACHINE LEARNING, 2012, 89 (1-2): 123-156
  • [5] Preference-Based Inconsistency Assessment in Multi-Context Systems
    Eiter, Thomas
    Fink, Michael
    Weinzierl, Antonius
    LOGICS IN ARTIFICIAL INTELLIGENCE, JELIA 2010, 2010, 6341: 143-155
  • [6] Preference-Based Inconsistency Management in Multi-Context Systems
    Eiter, Thomas
    Weinzierl, Antonius
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2017, 60: 347-424
  • [7] Preference-based inconsistency-tolerant query answering under existential rules
    Calautti, Marco
    Greco, Sergio
    Molinaro, Cristian
    Trubitsyna, Irina
    ARTIFICIAL INTELLIGENCE, 2022, 312
  • [8] Team Formation Through Preference-Based Behavior Composition
    Barati, Masoud
    St-Denis, Richard
    MULTIAGENT SYSTEM TECHNOLOGIES, MATES 2017, 2017, 10413: 54-71