This book provides an up-to-date introduction to information theory. In addition to the classical topics, it offers the first comprehensive treatment of the theory of the I-Measure, network coding theory, Shannon-type and non-Shannon-type information inequalities, and the relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, this book is well suited as a textbook for a senior or graduate-level course on the subject and as a reference for researchers in related fields.
Contents:
1. The Science of Information
2. Information Measures
3. Zero-Error Data Compression
4. Weak Typicality
5. Strong Typicality
6. The I-Measure
7. Markov Structures
8. Channel Capacity
9. Rate Distortion Theory
10. The Blahut-Arimoto Algorithms
11. Single-Source Network Coding
12. Information Inequalities
13. Shannon-Type Inequalities
Appendix 13A: The Basic Inequalities and the Polymatroidal Axioms
14. Beyond Shannon-Type Inequalities
15. Multi-Source Network Coding
Appendix 15A: Approximation of Random Variables with Infinite Alphabets
16. Entropy and Groups
Bibliography
Index