Using Markov Decision Process for Recommendations Based on Aggregated Decision Data Models

Cited by: 0
Authors
Petrusel, Razvan [1 ]
Affiliations
[1] Univ Babes Bolyai, Fac Econ Sci & Business Adm, Cluj Napoca 400591, Romania
Source
Keywords
Decision Process Recommendation; Decision Data Model; Markov Decision Process;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
Our research is set in the context of business decision-making processes. We view decision making as a workflow of (mostly mental) activities directed at choosing one decision alternative. Our goal is to guide the flow of decision activities so that the relevant alternatives are properly evaluated; recommending which alternative should be chosen is outside our scope. Since business decision making is data-centric, we use a Decision Data Model (DDM), which is automatically mined from a log of the decision maker's actions while interacting with business software. The recommendation is based on an aggregated DDM that shows what many decision makers have done in the same decision situation. In our previous work we created algorithms that seek a local optimum. In this paper we show how the DDM-based recommendation problem can be mapped to a Markov Decision Process (MDP), with the aim of using the MDP to find a globally optimal decision-making strategy.
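The record does not spell out the paper's actual DDM-to-MDP mapping, but the general idea of solving such an MDP for a globally optimal activity-selection policy can be illustrated with value iteration. The following Python sketch uses a toy, made-up graph of decision activities standing in for an aggregated DDM; the state names, transition probabilities, rewards, and discount factor are illustrative assumptions, not the paper's construction.

    # Minimal value-iteration sketch: an aggregated-DDM-like graph of decision
    # activities recast as a finite MDP. States are decision activities/milestones,
    # actions pick the next activity; all numbers below are hypothetical.

    # transitions[state][action] -> list of (next_state, probability)
    transitions = {
        "start": {
            "gather_data":   [("data_collected", 1.0)],
        },
        "data_collected": {
            "derive_costs":  [("costs_known", 0.8), ("data_collected", 0.2)],
            "derive_income": [("income_known", 0.9), ("data_collected", 0.1)],
        },
        "costs_known": {
            "compare_alternatives": [("alternatives_evaluated", 1.0)],
        },
        "income_known": {
            "compare_alternatives": [("alternatives_evaluated", 1.0)],
        },
        "alternatives_evaluated": {},  # absorbing goal state
    }

    # assumed reward scheme: bonus for reaching the goal, small step cost otherwise
    def reward(next_state):
        return 10.0 if next_state == "alternatives_evaluated" else -1.0

    GAMMA = 0.95   # discount factor
    THETA = 1e-6   # convergence threshold

    values = {s: 0.0 for s in transitions}

    # Value iteration: repeatedly apply the Bellman optimality backup
    while True:
        delta = 0.0
        for state, actions in transitions.items():
            if not actions:      # absorbing state keeps value 0
                continue
            best = max(
                sum(p * (reward(ns) + GAMMA * values[ns]) for ns, p in outcomes)
                for outcomes in actions.values()
            )
            delta = max(delta, abs(best - values[state]))
            values[state] = best
        if delta < THETA:
            break

    # Greedy policy: in each state, recommend the action with the highest expected value
    policy = {
        state: max(
            actions,
            key=lambda a: sum(p * (reward(ns) + GAMMA * values[ns])
                              for ns, p in actions[a]),
        )
        for state, actions in transitions.items() if actions
    }

    print(values)
    print(policy)

Running the sketch prints a state-value table and a greedy policy over the toy activity graph; in the paper's setting the policy would correspond to recommending the next decision activity, not the final alternative.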
Pages: 125-137 (13 pages)
Related papers (50 records in total; items [31]-[40] shown)
• [31] Thai Duong; Thinh Nguyen. Fast Markov Decision Process for Data Collection in Sensor Networks. 2014 23rd International Conference on Computer Communication and Networks (ICCCN), 2014.
• [32] Sezer, Volkan. Intelligent decision making for overtaking maneuver using mixed observable Markov decision process. Journal of Intelligent Transportation Systems, 2018, 22(3): 201-217.
• [33] Valeev, Sagit; Kondratyeva, Natalya. Large Scale System Management Based on Markov Decision Process and Big Data Concept. 2016 IEEE 10th International Conference on Application of Information and Communication Technologies (AICT), 2016: 6-9.
• [34] Li, Yanjie; Wu, Xinyu. A unified approach to time-aggregated Markov decision processes. Automatica, 2016, 67: 77-84.
• [35] Bouchekir, R.; Boukala, M. C. Learning-based symbolic assume-guarantee reasoning for Markov decision process by using interval Markov process. Innovations in Systems and Software Engineering, 2018, 14(3): 229-244.
• [36] Sancho, Luis C. B.; Braga, Joaquim A. P.; Andrade, Antonio R. Optimizing Maintenance Decision in Rails: A Markov Decision Process Approach. ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems Part A: Civil Engineering, 2021, 7(1).
• [37] Petrusel, Razvan; Vanderfeesten, Irene; Dolean, Cristina Claudia; Mican, Daniel. Making Decision Process Knowledge Explicit Using the Decision Data Model. Business Information Systems, 2011, 87: 172+.
• [38] Tschoepe, Constanze; Wolff, Matthias. Automatic decision making in SHM using hidden Markov models. DEXA 2007: 18th International Conference on Database and Expert Systems Applications, Proceedings, 2007: 307+.
• [39] Srisuma, Sorawoot. Identification in Discrete Markov Decision Models. Econometric Theory, 2015, 31(3): 521-538.
• [40] Waldmann, Karl-Heinz. On Markov decision models with an absorbing set. Decision Theory and Multi-Agent Planning, 2006, (482): 145-163.