Elements of Information Theory

by Thomas M. Cover and Joy A. Thomas

Synopsis

Following a brief introduction and overview, the early chapters cover the basic algebraic relationships among entropy, relative entropy, and mutual information; the asymptotic equipartition property (AEP); entropy rates of stochastic processes; data compression; and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The ...
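For orientation, here is a quick refresher on the three central quantities the synopsis names, stated for discrete random variables. These are the standard textbook definitions; the notation below is supplied here and not taken from the listing itself.

    \begin{align*}
    H(X)    &= -\sum_{x} p(x)\log p(x), \\
    D(p\|q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}, \\
    I(X;Y)  &= \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}
             = D\!\left(p(x,y)\,\middle\|\,p(x)\,p(y)\right).
    \end{align*}

Mutual information is thus the relative entropy between the joint distribution and the product of the marginals, which is the algebraic relationship the early chapters build on.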


Customer Reviews


cantbelieveit

Dec 9, 2015

not good

This book was very poorly printed. In many places, the fraction line was missing.

reovalis

Aug 9, 2007

Asymptotic and exhaustive

One of the problems in the literature is that "Information Theory" is not well defined. It started off meaning what Shannon accomplished in communication theory, and then evolved into what this book defines as IT. The problem I have with this is that the theory still relies on asymptotic arguments, and we do not do asymptotic experiments. Also, while the book mentions Maximum Entropy methods, its treatment is out of date: the method has since evolved to the point where it is fused with Bayesian theory (Giffin 07, arxiv.org). All of that said, this is an invaluable book. It is the most comprehensive work that I know of on the asymptotic view of IT.
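A note on what "asymptotic" means here: the AEP says that for an i.i.d. source, -(1/n) log p(X_1, ..., X_n) converges to the entropy H(X) as n grows. The following minimal simulation sketch is supplied here (it is not from the book or the reviewer, and it assumes NumPy is available); it makes the convergence concrete for a Bernoulli source.

    import numpy as np

    # Sketch: empirically check the AEP for a Bernoulli(0.3) source.
    # By the AEP, -(1/n) log2 p(X_1, ..., X_n) -> H(X) as n grows.
    rng = np.random.default_rng(0)
    p = 0.3
    H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # source entropy in bits

    for n in (10, 100, 10_000):
        x = rng.random(n) < p                     # i.i.d. Bernoulli(p) draws
        k = x.sum()                               # number of ones observed
        log_prob = k * np.log2(p) + (n - k) * np.log2(1 - p)
        print(f"n={n:6d}: -(1/n) log2 p = {-log_prob / n:.4f}  (H = {H:.4f})")

For small n the per-symbol log-probability fluctuates noticeably, while at n = 10,000 it sits close to H; this is exactly the sense in which the theory's guarantees are asymptotic.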




