Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in information theory. It will be of interest to students, practitioners and researchers working in information theory.
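To give a flavor of the nonparametric estimators the blurb refers to, here is a minimal sketch (not taken from the book) of the classical Kozachenko-Leonenko nearest-neighbor estimate of differential entropy for one-dimensional samples, with k = 1. It assumes distinct sample values and a smooth underlying density; the function name and test distributions are illustrative choices.

```python
import math
import random

def kl_entropy_1d(samples):
    """Kozachenko-Leonenko nearest-neighbor estimate of differential
    entropy (in nats) for 1-D samples, k = 1.  Sketch only: assumes
    distinct sample values and a smooth underlying density."""
    n = len(samples)
    xs = sorted(samples)
    # After sorting, each point's nearest neighbor is one of its two
    # adjacent points, so NN distances come from adjacent gaps.
    eps = []
    for i, x in enumerate(xs):
        gaps = []
        if i > 0:
            gaps.append(x - xs[i - 1])
        if i < n - 1:
            gaps.append(xs[i + 1] - x)
        eps.append(min(gaps))
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    # H ~ psi(n) - psi(1) + ln V_1 + (1/n) * sum ln eps_i,
    # with psi(n) ~ ln(n - 1), psi(1) = -gamma, and V_1 = 2
    # (the volume of the 1-D unit ball).
    return math.log(n - 1) + gamma + math.log(2) + sum(math.log(e) for e in eps) / n

random.seed(0)
uniform = [random.random() for _ in range(4000)]        # true h = 0 nats
normal = [random.gauss(0.0, 1.0) for _ in range(4000)]  # true h = 0.5*ln(2*pi*e) ~ 1.419 nats
print(kl_entropy_1d(uniform), kl_entropy_1d(normal))
```

The estimator is "universal" in the sense the blurb describes: it makes no parametric assumption about the density, only that nearest-neighbor distances shrink at a rate governed by the local density.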
The information in the "Summary" section may refer to different editions of this title.
EUR 5.30 shipping to the U.S.A.
EUR 3.52 shipping to the U.S.A.
From: Second Story Books, ABAA, Rockville, MD, U.S.A.
Softcover. Octavo; Fair+; Paperback; Spine, green with black print; Cover has slight edgewear, puckering to top spine corner, else clean and bright; Text block has faint moisture stain and puckering to top spine corner throughout, else clean and tight; ix, 93 pages, illustrated (b&w diagrams). FP New Rockville Stock. Item code 1351585
Quantity: 1 available
From: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New. Item code ABLIING23Mar2811580106030
Quantity: More than 20 available
From: PBShop.store UK, Fairford, GLOS, United Kingdom
PAP. Condition: New. New Book. Delivered from our UK warehouse in 4 to 14 business days. THIS BOOK IS PRINTED ON DEMAND. Established seller since 2000. Item code IQ-9781601982308
Quantity: 15 available
From: Phatpocket Limited, Waltham Abbey, HERTS, United Kingdom
Condition: Like New. Used - Like New. Book is new and unread but may have minor shelf wear. Your purchase helps support Sri Lankan Children's Charity 'The Rainbow Centre'. Our donations to The Rainbow Centre have helped provide an education and a safe haven to hundreds of children who live in appalling conditions. Item code Z1-J-018-01066
Quantity: 1 available
From: THE SAINT BOOKSTORE, Southport, United Kingdom
Paperback / softback. Condition: New. This item is printed on demand. New copy - Usually dispatched within 5-9 working days 189. Item code C9781601982308
Quantity: More than 20 available
From: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 104 pages. 9.00x6.10x0.30 inches. In Stock. Item code x-1601982305
Quantity: 2 available
From: PBShop.store US, Wood Dale, IL, U.S.A.
PAP. Condition: New. New Book. Shipped from UK. THIS BOOK IS PRINTED ON DEMAND. Established seller since 2000. Item code L0-9781601982308
Quantity: More than 20 available
From: Chiron Media, Wallingford, United Kingdom
PF. Condition: New. Item code 6666-IUK-9781601982308
Quantity: 10 available
From: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. In. Item code ria9781601982308_new
Quantity: More than 20 available
From: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Printed after ordering (new stock). Publisher's description as given above. Item code 9781601982308
Quantity: 1 available