Bounds on mutual information for simple codes using information combining

Citations: 0
Authors
Land, I
Huettinger, S
Hoeher, PA
Huber, J
Affiliations
[1] Aalborg Univ, Dept Commun Technol, Digital Commun Div, DK-9220 Aalborg, Denmark
[2] Univ Kiel, Fac Engn, Informat & Coding Theory Lab, D-24143 Kiel, Germany
[3] Univ Erlangen Nurnberg, Inst Informat Transmiss, D-91058 Erlangen, Germany
Keywords
information theory; information measure; error correcting code; concatenation; iteration; memoryless channel; binary channel;
DOI
Not available
Chinese Library Classification (CLC)
TN [Electronic technology, communication technology];
Subject Classification Code
0809;
Abstract
For coded transmission over a memoryless channel, two kinds of mutual information are considered: the mutual information between a code symbol and its noisy observation and the overall mutual information between encoder input and decoder output. The overall mutual information is interpreted as a combination of the mutual informations associated with the individual code symbols. Thus, exploiting code constraints in the decoding procedure is interpreted as combining mutual informations. For single parity check codes and repetition codes, we present bounds on the overall mutual information, which are based only on the mutual informations associated with the individual code symbols. Using these mutual information bounds, we compute bounds on extrinsic information transfer (EXIT) functions and bounds on information processing characteristics (IPC) for these codes.
Pages: 184-214
Number of pages: 31
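The abstract describes bounding the combined mutual information of a code bit using only the per-symbol mutual informations. As a rough numerical illustration of the underlying two-observation building block (not the authors' results for general code lengths), the Python sketch below evaluates the standard BEC/BSC extremal formulas from the information-combining literature (cf. reference [3] in the related papers); the function names and the SciPy-based inversion of the binary entropy are choices of this sketch, not part of the paper.

```python
import numpy as np
from scipy.optimize import brentq

def h2(p):
    """Binary entropy function in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def h2_inv(y):
    """Crossover probability p in [0, 0.5] of a BSC whose binary entropy is y."""
    if y <= 0.0:
        return 0.0
    if y >= 1.0:
        return 0.5
    return brentq(lambda p: h2(p) - y, 0.0, 0.5)

def repetition_combining_bounds(I1, I2):
    """Bounds on I(X; Y1, Y2) when one bit X is observed through two independent
    binary-input symmetric channels with mutual informations I1 and I2
    (repetition code / variable node): lower bound attained when both channels
    are BSCs, upper bound when both are BECs."""
    p1, p2 = h2_inv(1.0 - I1), h2_inv(1.0 - I2)
    p_ser = p1 * (1 - p2) + p2 * (1 - p1)   # crossover of the serial concatenation
    lower = I1 + I2 - 1.0 + h2(p_ser)       # BSC-extremal value
    upper = 1.0 - (1.0 - I1) * (1.0 - I2)   # BEC-extremal value
    return lower, upper

def spc_combining_bounds(I1, I2):
    """Bounds on the extrinsic information about X0 = X1 xor X2 given noisy
    observations of X1 and X2 (single parity-check code / check node):
    lower bound attained when both channels are BECs, upper bound when both
    are BSCs."""
    p1, p2 = h2_inv(1.0 - I1), h2_inv(1.0 - I2)
    p_ser = p1 * (1 - p2) + p2 * (1 - p1)
    lower = I1 * I2                          # BEC-extremal value
    upper = 1.0 - h2(p_ser)                  # BSC-extremal value
    return lower, upper

if __name__ == "__main__":
    I1, I2 = 0.6, 0.8
    print("repetition / variable node:", repetition_combining_bounds(I1, I2))
    print("single parity check / check node:", spc_combining_bounds(I1, I2))
```

Run as a script with I1 = 0.6 and I2 = 0.8, the bounds come out to roughly [0.89, 0.92] for the repetition case and [0.48, 0.51] for the single parity-check case, illustrating how the per-symbol mutual informations alone pin the combined information into an interval.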
Related Papers
50 entries in total
  • [1] Bounds on mutual information for simple codes using information combining
    Ingmar Land
    Simon Huettinger
    Peter A. Hoeher
    Johannes Huber
    Annales Des Télécommunications, 2005, 60 (1-2) : 184 - 214
  • [2] Bounds on the mutual information for bit linear linear-dispersion codes
    Jin, Xianglan
    Yang, Jae-Dong
    Song, Kyoung-Young
    No, Jong-Seon
    Shin, Dong-Joon
    2007 IEEE 18TH INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS, VOLS 1-9, 2007, : 3353 - +
  • [3] Bounds on information combining
    Land, I
    Huettinger, S
    Hoeher, PA
    Huber, JB
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2005, 51 (02) : 612 - 619
  • [4] On Variational Bounds of Mutual Information
    Poole, Ben
    Ozair, Sherjil
    van den Oord, Aaron
    Alemi, Alexander A.
    Tucker, George
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [5] Mutual Information Rate and Bounds for It
    Baptista, Murilo S.
    Rubinger, Rero M.
    Viana, Emilson R.
    Sartorelli, Jose C.
    Parlitz, Ulrich
    Grebogi, Celso
    PLOS ONE, 2012, 7 (10):
  • [6] Lower bounds on mutual information
    Foster, David V.
    Grassberger, Peter
    PHYSICAL REVIEW E, 2011, 83 (01)
  • [7] Bounds on Information Combining With Quantum Side Information
    Hirche, Christoph
    Reeb, David
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2018, 64 (07) : 4739 - 4757
  • [8] Bounds on Information Combining With Quantum Side Information
    Hirche, Christoph
    Reeb, David
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 896 - 900
  • [9] Renyi Bounds on Information Combining
    Hirche, Christoph
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020, : 2297 - 2302
  • [10] Mutual information superadditivity and unitarity bounds
    Horacio Casini
    Eduardo Testé
    Gonzalo Torroba
    Journal of High Energy Physics, 2021