Information-theoretic bounds on average signal transition activity

Cited by: 25
Authors
Ramprasad, S [1 ]
Shanbhag, NR [1 ]
Hajj, IN [1 ]
Affiliation
[1] Univ Illinois, Coordinated Sci Lab, Urbana, IL 61801 USA
Funding
U.S. National Science Foundation
Keywords
achievable bounds; busses; CMOS circuits; information theory; low power; switching activity;
DOI
10.1109/92.784097
Chinese Library Classification (CLC) code
TP3 [computing technology; computer technology]
Discipline classification code
0812
Abstract
Transitions on high-capacitance busses in very large scale integration systems result in considerable system power dissipation. Various coding schemes have therefore been proposed in the literature to encode the input signal in order to reduce the number of transitions. In this paper, we derive lower and upper bounds on the average signal transition activity via an information-theoretic approach in which symbols generated by a (possibly correlated) process with entropy rate H are coded with an average of R bits per symbol. The bounds are asymptotically achievable if the process is stationary and ergodic. We also present a coding algorithm, based on the Lempel-Ziv data-compression algorithm, that achieves the bounds. Bounds are also obtained on the expected number of ones (or zeros). These results are applied to determine the activity-reducing efficiency of coding algorithms such as entropy coding, transition signaling, and bus-invert coding, and to determine the lower bound on the power-delay product given H and R. Two examples are provided in which transition activity within 4% and 9% of the lower bound is achieved when blocks of 8 and 13 symbols, respectively, are coded at a time.
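As a concrete illustration of the transition-activity comparisons discussed in the abstract, the sketch below implements bus-invert coding (one of the schemes whose activity-reducing efficiency the bounds are applied to) and counts bus-line transitions against an uncoded baseline. The 8-bit bus width, the random data, and all function names are illustrative assumptions; this is not the paper's Lempel-Ziv-based coding algorithm, nor its bound derivation.

```python
# Minimal sketch of bus-invert coding, one of the transition-reducing
# schemes whose efficiency the paper's bounds are used to assess.
# Bus width, data, and names are illustrative assumptions only; this is
# NOT the paper's Lempel-Ziv-based coding algorithm.

import random

BUS_WIDTH = 8  # data lines; bus-invert adds one extra "invert" line


def hamming(a: int, b: int) -> int:
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")


def bus_invert_transitions(words):
    """Count line transitions when sending `words` with bus-invert coding.

    Each cycle the word is driven either as-is or bitwise inverted,
    whichever differs from the previously driven bus value in fewer
    positions; the choice is signalled on the extra invert line.
    """
    mask = (1 << BUS_WIDTH) - 1
    prev_bus, prev_inv = 0, 0          # previously driven bus and invert line
    transitions = 0
    for w in words:
        plain = hamming(prev_bus, w)
        inverted = hamming(prev_bus, w ^ mask)
        if inverted < plain:
            bus, inv = w ^ mask, 1
        else:
            bus, inv = w, 0
        transitions += hamming(prev_bus, bus) + (prev_inv ^ inv)
        prev_bus, prev_inv = bus, inv
    return transitions


def uncoded_transitions(words):
    """Transitions when the same words are driven without any coding."""
    prev, transitions = 0, 0
    for w in words:
        transitions += hamming(prev, w)
        prev = w
    return transitions


if __name__ == "__main__":
    random.seed(0)
    data = [random.randrange(1 << BUS_WIDTH) for _ in range(10_000)]
    print("uncoded   :", uncoded_transitions(data))
    print("bus-invert:", bus_invert_transitions(data))
```

Comparing the two totals makes concrete what "average signal transition activity" measures; the paper's bounds characterize how low that average can be pushed for a given entropy rate H and coding rate R.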
Pages: 359-368
Page count: 10