Efficient Spatio-Temporal Information Fusion in Sensor Networks

Cited by: 3
Authors
Chejerla, Brijesh Kashyap [1 ]
Madria, Sanjay K. [1 ]
Affiliation
[1] Missouri Univ Sci & Technol, Dept Comp Sci, Rolla, MO 65401 USA
Keywords
DOI
10.1109/MDM.2013.26
CLC number
TN [Electronic technology; communication technology]
Discipline code
0809
Abstract
The primary goal of sensor data fusion is to make sensor data a more meaningful representation of an observed entity. Because sensors are energy-constrained, algorithms are needed that minimize the fusion cost while maintaining the validity of the data sent to the base station. Maintaining validity is even more difficult when we have only limited knowledge of the factors that govern an observed sensor entity. To achieve this goal, we modeled the uncertainties in sensor data and fed them into the system, employing recursive data estimation. By doing so, we accounted for the dynamically changing environmental parameters affecting the network so as to produce the most accurate representation of the observed system state. We propose here a spatio-temporal, correlation-based estimation procedure to corroborate the detection of an event in a sensor field. From a networking perspective, the number of in-network communications plays a major role, because communication consumes several times more power than computation. To reduce it, our algorithm ensures that communication occurs only during an event. At all other times, the sensor motes maintain an updated global estimate, without communicating, by using a prediction algorithm; this reduces the need for frequent sensor synchronization. We conducted experiments using our distributed fusion architecture that show our algorithm's effectiveness through a reduction in power consumption.
Pages: 157-166
Page count: 10
Related papers
50 in total
  • [31] Spatio-temporal correlation: theory and applications for wireless sensor networks
    Vuran, MC
    Akan, ÖB
    Akyildiz, IF
    COMPUTER NETWORKS, 2004, 45 (03) : 245 - 259
  • [32] A new sensor bias-driven spatio-temporal fusion model based on convolutional neural networks
    Li, Yunfei
    Li, Jun
    He, Lin
    Chen, Jin
    Plaza, Antonio
    SCIENCE CHINA-INFORMATION SCIENCES, 2020, 63 (04)
  • [35] Synchronization and information transmission in spatio-temporal networks of deformable units
    Kakmeni, F. M. Moukam
    Baptista, M. S.
    PRAMANA-JOURNAL OF PHYSICS, 2008, 70 (06): : 1063 - 1076
  • [36] Dynamic spatio-temporal pruning for efficient spiking neural networks
    Gou, Shuiping
    Fu, Jiahui
    Sha, Yu
    Cao, Zhen
    Guo, Zhang
    Eshraghian, Jason K.
    Li, Ruimin
    Jiao, Licheng
    FRONTIERS IN NEUROSCIENCE, 2025, 19
  • [38] Efficient Spatio-Temporal Graph Neural Networks for Traffic Forecasting
    Lubarsky, Yackov
    Gaissinski, Alexei
    Kisilev, Pavel
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2023, PT II, 2023, 676 : 109 - 120
  • [39] Rainfall Forecasting Based on Spatio-Temporal Information Fusion Using Informer
    Qiu, Chao
    Qiu, Ying-jie
    Wang, Bei
    Zhang, Zhuo-fan
    Chen, Qi
    2023 11TH INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY: IOT AND SMART CITY, ITIOTSC 2023, 2023, : 73 - 78
  • [40] Spatio-temporal dynamics of vehicles: Fusion of traffic data and context information
    Bolanos-Martinez, Daniel
    Bermudez-Edo, Maria
    Garrido, Jose Luis
    Delgado-Marquez, Blanca L.
    DATA IN BRIEF, 2024, 53