Capacity Definitions for General Channels with Receiver Side Information - Computer Science > Information Theory





Abstract: We consider three capacity definitions for general channels with channel side information at the receiver, where the channel is modeled as a sequence of finite-dimensional conditional distributions, not necessarily stationary, ergodic, or information stable. The Shannon capacity is the highest rate asymptotically achievable with arbitrarily small error probability. The capacity versus outage is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest average rate asymptotically achievable with a single encoder and multiple decoders, where the channel side information determines the decoder in use. As a special case of channel codes for expected rate, the code for capacity versus outage has two decoders: one operates in the non-outage states and decodes all transmitted information, and the other operates in the outage states and decodes nothing. Expected capacity equals Shannon capacity for channels governed by a stationary ergodic random process but is typically greater for general channels. These alternative capacity definitions essentially relax the constraint that all transmitted information must be decoded at the receiver. We derive capacity theorems for these capacity definitions through information density. Numerical examples are provided to demonstrate their connections and differences. We also discuss the implications of these alternative capacity definitions for end-to-end distortion, source-channel coding, and separation.
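The capacity theorems mentioned above are stated in terms of the information density, i(x; y) = log p(y|x) / p(y), whose limiting behavior characterizes achievable rates for general (non-ergodic) channels. As a minimal illustration not taken from the paper, the sketch below computes the information density for a binary symmetric channel with crossover probability eps and uniform input (an assumed example channel), and checks that averaging it over the joint distribution recovers the mutual information 1 - H2(eps):

```python
import math

def information_density(x, y, eps):
    """i(x; y) = log2( p(y|x) / p(y) ) for a BSC(eps) with uniform input.

    With a uniform input, the output is also uniform, so p(y) = 1/2.
    """
    p_y_given_x = (1 - eps) if x == y else eps
    return math.log2(p_y_given_x / 0.5)

def mutual_information(eps):
    """I(X; Y) = 1 - H2(eps) for the BSC with uniform input."""
    h2 = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
    return 1 - h2

eps = 0.1
# Average the information density over the joint distribution
# p(x, y) = p(x) p(y|x); the result is the mutual information.
avg = sum(0.5 * ((1 - eps) * information_density(x, x, eps)
                 + eps * information_density(x, 1 - x, eps))
          for x in (0, 1))
assert abs(avg - mutual_information(eps)) < 1e-12
```

For stationary ergodic channels the information density concentrates around the mutual information, so all three capacity definitions coincide; for general channels its distribution can have mass at several values, which is what the outage and expected-capacity formulations exploit.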



Authors: Michelle Effros, Andrea Goldsmith, Yifan Liang

Source: https://arxiv.org/






