Information Inequalities for Joint Distributions, with Interpretations and Applications - Computer Science > Information Theory





Abstract: Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as new determinantal inequalities in matrix theory. A new inequality for relative entropies is also developed, along with interpretations in terms of hypothesis testing. Finally, revealing connections of the results to literature in economics, computer science, and physics are explored.
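As a concrete illustration of the kind of subset-entropy bound the abstract describes, the following sketch numerically checks Shearer's inequality on a toy joint distribution: if every coordinate of (X1, ..., Xn) is covered by at least k of the chosen subsets, then H(X1, ..., Xn) <= (1/k) * sum of the subset joint entropies. The distribution and the cover below are hypothetical choices made for illustration, not taken from the paper.

```python
import itertools
import math

# Toy joint pmf over three binary variables (X1, X2, X3).
# Start uniform, then skew two atoms so the marginals are non-trivial.
pmf = {x: 1 / 8 for x in itertools.product((0, 1), repeat=3)}
pmf[(0, 0, 0)] = 1 / 4
pmf[(1, 1, 1)] = 0.0  # total mass still sums to 1

def subset_entropy(pmf, idx):
    """Entropy (in bits) of the marginal on the coordinates in idx."""
    marginal = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

# Cover by all 2-element subsets: each coordinate appears in k = 2 of them.
cover = [(0, 1), (1, 2), (0, 2)]
k = 2

h_full = subset_entropy(pmf, (0, 1, 2))
h_bound = sum(subset_entropy(pmf, s) for s in cover) / k

# Shearer's inequality: joint entropy is at most the normalized subset sum.
print(h_full, h_bound)
print(h_full <= h_bound + 1e-12)  # True
```

For this pmf the joint entropy is 2.75 bits against an upper bound of roughly 2.86 bits; replacing the cover with singletons {(0,), (1,), (2,)} and k = 1 recovers the familiar subadditivity bound H(X1, X2, X3) <= H(X1) + H(X2) + H(X3).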



Authors: Mokshay Madiman, Prasad Tetali

Source: https://arxiv.org/






