FSMI: Fast computation of Shannon Mutual Information for information-theoretic mapping

Cited by: 2
Authors
Zhang, Zhengdong [1]
Henderson, Trevor [1]
Sze, Vivienne [1]
Karaman, Sertac [1]
Affiliations
[1] MIT, Cambridge, MA 02139 USA
Keywords
EXPLORATION;
DOI
10.1109/icra.2019.8793541
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline code
0812 ;
Abstract
Information-based mapping algorithms are critical to robot exploration tasks in several applications ranging from disaster response to space exploration. Unfortunately, most existing information-based mapping algorithms are plagued by the computational difficulty of evaluating the Shannon mutual information between potential future sensor measurements and the map. This has led researchers to develop approximate methods, such as Cauchy-Schwarz Quadratic Mutual Information (CSQMI). In this paper, we propose a new algorithm, called Fast Shannon Mutual Information (FSMI), which is significantly faster than existing methods at computing the exact Shannon mutual information. The key insight behind FSMI is recognizing that the integral over the sensor beam can be evaluated analytically, removing an expensive numerical integration. In addition, we provide a number of approximation techniques for FSMI, which significantly improve computation time. Equipped with these approximation techniques, the FSMI algorithm is more than three orders of magnitude faster than the existing computation for Shannon mutual information; it also significantly outperforms the CSQMI algorithm, being roughly twice as fast in our experiments.
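To make the quantity in question concrete, the following is a minimal sketch of the beam-wise Shannon mutual information I(M; Z) between an occupancy map M and a range measurement Z. It is not the FSMI algorithm from the paper: it assumes a simplified noiseless sensor, under which Z is a deterministic function of M and I(M; Z) reduces to the entropy H(Z) of the beam-termination distribution. All function names are illustrative.

```python
import math


def beam_hit_distribution(occ):
    """P(beam terminates at cell i), for independent occupancy
    probabilities occ[i] of the cells along one sensor beam.
    The beam stops at the first occupied cell it meets."""
    probs = []
    free_so_far = 1.0  # probability all earlier cells are free
    for o in occ:
        probs.append(free_so_far * o)
        free_so_far *= 1.0 - o
    probs.append(free_so_far)  # beam passes every cell (max-range reading)
    return probs


def shannon_mi_noiseless_beam(occ):
    """I(M; Z) in bits for a noiseless beam: since Z is then a
    deterministic function of M, I(M; Z) = H(Z)."""
    return -sum(p * math.log2(p) for p in beam_hit_distribution(occ) if p > 0.0)
```

For two cells each occupied with probability 0.5, the termination distribution is (0.5, 0.25, 0.25), giving 1.5 bits. A noisy beam model, as treated in the paper, instead requires the double sum over cell pairs whose inner integral FSMI evaluates analytically.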
Pages: 6912 - 6918
Page count: 7
Related papers
50 records in total
  • [21] Information-Theoretic Incompleteness
    Chaitin, G. J.
    APPLIED MATHEMATICS AND COMPUTATION, 1992, 52 (01) : 83 - 101
  • [22] Information-Theoretic Adverbialism
    Gert, Joshua
    AUSTRALASIAN JOURNAL OF PHILOSOPHY, 2021, 99 (04) : 696 - 715
  • [23] Information-Theoretic Logic
    Corcoran, J.
    TRUTH IN PERSPECTIVE: RECENT ISSUES IN LOGIC, REPRESENTATION AND ONTOLOGY, 1998 : 113 - 135
  • [24] The Information-Theoretic Turn
    Blevins, James P.
    PSIHOLOGIJA, 2013, 46 (04) : 355 - 375
  • [25] Information-Theoretic Lower Bound on Energy Cost of Stochastic Computation
    Wiesner, Karoline
    Gu, Mile
    Rieper, Elisabeth
    Vedral, Vlatko
    PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, 2012, 468 (2148) : 4058 - 4066
  • [26] An Information-Theoretic Approach to Time-Bounds for Online Computation
    Paul, W. J.
    Seiferas, J. I.
    Simon, J.
    JOURNAL OF COMPUTER AND SYSTEM SCIENCES, 1981, 23 (02) : 108 - 126
  • [27] Information-Theoretic Bounds for Multiround Function Computation in Collocated Networks
    Ma, Nan
    Ishwar, Prakash
    Gupta, Piyush
    2009 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, VOLS 1-4, 2009 : 2306 - +
  • [28] A New Information-Theoretic Lower Bound for Distributed Function Computation
    Xu, Aolin
    Raginsky, Maxim
    2014 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2014 : 2227 - 2231
  • [29] Energy Reduction in VLSI Computation Modules: An Information-Theoretic Approach
    Sotiriadis, P. P.
    Tarokh, V.
    Chandrakasan, A. R.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2003, 49 (04) : 790 - 808
  • [30] Applying Information-Theoretic Measures to Computation and Communication in Neural Ensembles
    Carmena, Jose M.
    Canolty, Ryan T.
    So, Kelvin
    Gastpar, Michael C.
    2010 CONFERENCE RECORD OF THE FORTY FOURTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS AND COMPUTERS (ASILOMAR), 2010 : 2169 - 2171