Ensemble methods, which train multiple learners and then combine them for prediction, with Boosting and Bagging as representative examples, are well-known machine learning approaches. It is widely accepted that an ensemble is usually significantly more accurate than a single learner, and ensemble methods have already achieved great success in many real-world tasks.
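For a concrete but purely illustrative sense of that comparison (not material from the book), the short sketch below trains a single decision tree, a Bagging ensemble, and an AdaBoost ensemble on a synthetic dataset using scikit-learn; the library, dataset, and parameter choices are assumptions made for the sketch.

# Illustrative sketch (assumption: scikit-learn is available); compares a single
# decision tree with Bagging and AdaBoost ensembles on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier

# Synthetic binary classification problem (made up for the example).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           random_state=0)

models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    "Bagging (50 trees)": BaggingClassifier(n_estimators=50, random_state=0),
    "AdaBoost (50 stumps)": AdaBoostClassifier(n_estimators=50, random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy for each learner.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")

On data like this the two ensembles typically score noticeably higher cross-validated accuracy than the lone tree, which is the practical point behind the claim above.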
Twelve years have passed since the publication of the first edition of this book in 2012 (Japanese and Chinese translations were published in 2017 and 2020, respectively), and many significant advances have been made in the field since then. First, many theoretical issues have been tackled; for example, the fundamental question of why AdaBoost seems resistant to overfitting has been addressed, so we now understand much more about the essence of ensemble methods. Second, ensemble methods have been further developed in more areas of machine learning, e.g., isolation forest in anomaly detection, so we now have powerful ensemble methods for tasks beyond conventional supervised learning.
Third, ensemble mechanisms have also been found helpful in emerging areas such as deep learning and online learning. This edition expands on the previous one with additional content that reflects these significant advances, and it is written in a concise but comprehensive style that remains approachable to readers new to the subject.
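As a similarly hedged illustration of the anomaly-detection example mentioned above (again an assumption about tooling, not content from the book), the sketch below fits scikit-learn's IsolationForest to made-up two-dimensional data and flags the points it isolates most easily as anomalies.

# Illustrative sketch (assumption: scikit-learn); isolation forest for anomaly detection.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Mostly "normal" points around the origin, plus a few scattered outliers (made-up data).
normal = rng.normal(loc=0.0, scale=1.0, size=(300, 2))
outliers = rng.uniform(low=-6.0, high=6.0, size=(15, 2))
X = np.vstack([normal, outliers])

# An ensemble of random isolation trees; points with shorter average path length
# across the trees are easier to isolate and score as more anomalous.
iso = IsolationForest(n_estimators=100, contamination=0.05, random_state=0)
labels = iso.fit_predict(X)  # -1 = flagged as anomaly, 1 = inlier
print("points flagged as anomalies:", int((labels == -1).sum()))

The ensemble aspect is that many randomized isolation trees are built and a point's anomaly score is aggregated over all of them, which is what places this anomaly-detection method within the scope of the book.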
Zhi-Hua Zhou is a Professor of Computer Science and Artificial Intelligence at Nanjing University, President of the IJCAI Board of Trustees, a Fellow of the ACM, AAAI, AAAS, and IEEE, and a recipient of the IEEE Computer Society Edward J. McCluskey Technical Achievement Award and the CCF-ACM Artificial Intelligence Award.
EUR 17.05 shipping from the U.S.A. to Italy
EUR 8.06 shipping from the United Kingdom to Italy
From: Speedyhen, London, United Kingdom
Condition: New. Item code NW9781032960609
Quantity: 2 available
From: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Item code ria9781032960609_new
Quantity: More than 20 available
From: THE SAINT BOOKSTORE, Southport, United Kingdom
Hardback. Condition: New. New copy - usually dispatched within 4 working days. Item code B9781032960609
Quantity: 1 available
From: Majestic Books, Hounslow, United Kingdom
Condition: New. Item code 410463306
Quantity: 3 available
From: Books Puddle, New York, NY, U.S.A.
Condition: New. Item code 26402690965
Quantity: 3 available
From: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Item code 48281270-n
Quantity: 3 available
From: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New. Item code 48281270-n
Quantity: 3 available
From: Chiron Media, Wallingford, United Kingdom
Hardcover. Condition: New. Item code 6666-GRD-9781032960609
Quantity: 1 available
From: Biblios, Frankfurt am Main, Hesse, Germany
Condition: New. Item code 18402690975
Quantity: 3 available
From: AussieBookSeller, Truganina, VIC, Australia
Hardcover. Condition: New. Shipping may be from our Sydney, NSW warehouse or from our UK or US warehouse, depending on stock availability. Item code 9781032960609
Quantity: 1 available