Shipping costs: EUR 2.47 (within U.S.A.)
Book Description Condition: New. Seller Inventory # 439071-n
Book Description Condition: New. Buy with confidence! Book is in new, never-used condition. Seller Inventory # bk0486604349xvz189zvxnew
Book Description Condition: New. New! This book is in the same immaculate condition as when it was published. Seller Inventory # 353-0486604349-new
Book Description Paperback or Softback. Condition: New. Mathematical Foundations of Information Theory 0.32. Book. Seller Inventory # BBS-9780486604343
Book Description Condition: New. Brand New! Not overstocks or low-quality book club editions! Direct from the publisher! We're not a giant, faceless warehouse organization! We're a small-town bookstore that loves books and loves its customers! Buy from Lakeside Books! Seller Inventory # OTF-S-9780486604343
Book Description Condition: New. Brand New. Seller Inventory # 0486604349
Book Description Paperback. Condition: New. Brand New! This item is printed on demand. Seller Inventory # 0486604349
Book Description Condition: New. Seller Inventory # ABLING22Oct2018170014604
Book Description Condition: New. Seller Inventory # I-9780486604343
Book Description Paperback. Condition: New. The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field. In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite "scheme," and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts "to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory."
Partial Contents: I. The Entropy Concept in Probability Theory: Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov Chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory: Two Generalizations of Shannon's Inequality. Three Inequalities of Feinstein. Concept of a Source. Stationarity. Entropy. Ergodic Sources. The E Property. The Martingale Concept. Noise. Anticipation and Memory. Connection of the Channel to the Source. Feinstein's Fundamental Lemma. Coding. The First Shannon Theorem. The Second Shannon Theorem.
First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition. Shipping may be from multiple locations in the US or from the UK, depending on stock availability. Seller Inventory # 9780486604343
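For reference, the "entropy of a finite scheme" named in the description above is the standard Shannon entropy; the notation below is the usual one and is not quoted from the book. For a finite scheme whose n mutually exclusive outcomes have probabilities p_1, ..., p_n:

$$H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log p_i, \qquad 0 \log 0 := 0.$$

H vanishes when one outcome is certain and attains its maximum, $\log n$, for the uniform scheme, which is precisely the "measure of uncertainty" behavior the blurb describes.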