Bayesian Sensor Fusion Methods for Dynamic Object Tracking - A Comparative Study

Cited by: 5
Authors
Markovic, Ivan [1 ]
Petrovic, Ivan [1 ]
Affiliations
[1] Univ Zagreb, Fac Elect Engn & Comp, Dept Control & Comp Engn, HR-10000 Zagreb, Croatia
Keywords
Bayesian sensor fusion; Information filter; Particle filter; Renyi entropy; MULTISENSOR; SYSTEM;
DOI
10.7305/automatika.2014.09.847
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
In this paper we study the problem of Bayesian sensor fusion for dynamic object tracking. The benefits of utilizing measurements from several sensors to infer a system state are manifold, ranging from increased estimate accuracy to more reliable and robust estimates. Sensor measurements may be combined, or fused, at a variety of levels: from the raw data level to the state vector level, or at the decision level. In this paper we focus mainly on Bayesian fusion at the likelihood and state vector levels. We analyze two groups of data fusion methods: centralized independent likelihood fusion, where each sensor reports only its measurement to the fusion center, and hierarchical fusion, where each sensor computes its own local estimate, which is then communicated to the fusion center along with the corresponding uncertainty. We compare the merits of both approaches and present explicit solutions in the form of the extended information filter, the unscented information filter, and the particle filter. Furthermore, we propose a solution for the fusion of arbitrary filters and test it on a hierarchical fusion example combining two of the aforementioned filters. Hence, the main contributions of this paper are a systematic comparative study of Bayesian fusion methods and a method for hierarchical fusion of arbitrary filters. The fusion methods are tested on synthetic data generated by multiple Monte Carlo runs for tracking a dynamic object with several sensors of different accuracies, analyzed via the quadratic Renyi entropy and root-mean-square error.
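For the linear-Gaussian special case, the centralized independent-likelihood fusion described in the abstract takes a particularly simple form: each sensor's contribution adds in information (inverse-covariance) space, which is why information-form filters are natural here. The sketch below is illustrative only (the matrices, sensor setup, and function names are assumptions, not taken from the paper); it also evaluates the quadratic Renyi entropy of a Gaussian, one of the performance measures the paper uses.

```python
import numpy as np

def fuse_information(y_prior, Y_prior, measurements):
    """Centralized independent-likelihood fusion in information form.

    y_prior : prior information vector (Y @ mean)
    Y_prior : prior information matrix (inverse covariance)
    measurements : list of (z, H, R) tuples, one per sensor
    """
    y, Y = y_prior.copy(), Y_prior.copy()
    for z, H, R in measurements:
        R_inv = np.linalg.inv(R)
        y += H.T @ R_inv @ z   # additive information contribution of this sensor
        Y += H.T @ R_inv @ H
    return y, Y

def quadratic_renyi_entropy(Sigma):
    """Quadratic (order-2) Renyi entropy of a d-dim Gaussian:
    H_2 = 0.5 * ln((4*pi)^d * |Sigma|)."""
    d = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * (d * np.log(4.0 * np.pi) + logdet)

# Two sensors of different accuracy directly observing a 2-D position
Sigma0 = 4.0 * np.eye(2)                 # prior covariance
mu0 = np.zeros(2)
Y0 = np.linalg.inv(Sigma0)
y0 = Y0 @ mu0
H = np.eye(2)
sensors = [(np.array([1.0, 0.9]), H, 1.0 * np.eye(2)),   # accurate sensor
           (np.array([1.5, 0.5]), H, 9.0 * np.eye(2))]   # noisy sensor

y, Y = fuse_information(y0, Y0, sensors)
Sigma = np.linalg.inv(Y)                 # posterior covariance
mu = Sigma @ y                           # posterior mean
print(mu)
# Fusing informative measurements shrinks the covariance, so the
# quadratic Renyi entropy of the posterior is lower than the prior's.
print(quadratic_renyi_entropy(Sigma) < quadratic_renyi_entropy(Sigma0))
```

Note that the posterior mean is pulled much more strongly toward the accurate sensor's measurement, since its information contribution `H.T @ inv(R) @ z` is weighted by the smaller noise covariance.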
Pages: 386-398 (13 pages)