This work reports critical analyses of complexity issues in the continuum setting and of generalization to new examples, which are two basic milestones in learning from examples in connectionist models. The problem of loading the weights of neural networks, which is often framed as continuous optimization, has been the target of many criticisms, since the potential solution of any learning problem is limited by the presence of local minima in the error function. The notion of efficient solution needs to be formalized so as to provide useful comparisons with the traditional theory of computational complexity in the discrete setting. It also covers up-to-date developments in computational mathematics.
The information in the "Summary" section may refer to different editions of this title.
This book reports critical analyses of complexity issues in the continuum setting and of generalization to new examples, which are two basic milestones in learning from examples in connectionist models. The problem of loading the weights of neural networks, which is often framed as continuous optimization, has been the target of many criticisms, since the potential solution of any learning problem is severely limited by the presence of local minima in the error function. The maturity of the field requires converting the quest for a general solution to all learning problems into an understanding of which learning problems are likely to be solved efficiently. Likewise, the notion of efficient solution needs to be formalized so as to provide useful comparisons with the traditional theory of computational complexity in the discrete setting. The book covers these topics, focusing also on recent developments in computational mathematics, where interesting notions of computational complexity emerge in the continuum setting.
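The local-minima point made above can be illustrated with a minimal sketch: plain gradient descent on a non-convex, one-dimensional error function ends up in different minima depending only on the starting weight. The polynomial used here is a toy stand-in for a network's error surface and is not taken from the book.

```python
# Toy illustration of the "loading problem" viewed as continuous optimization:
# gradient descent on a non-convex error function may stop at a local minimum,
# and which minimum it reaches depends only on the initial weight.
# E(w) below is an assumed 1-D stand-in for a network's error surface.

def error(w):
    return w**4 - 3.0 * w**2 + w          # non-convex: one global, one local minimum

def grad(w):
    return 4.0 * w**3 - 6.0 * w + 1.0     # dE/dw

def gradient_descent(w, lr=0.01, steps=5000):
    for _ in range(steps):
        w -= lr * grad(w)                  # standard gradient step
    return w

for w0 in (-2.0, 2.0):                     # two different initializations
    w_final = gradient_descent(w0)
    print(f"start {w0:+.1f} -> w = {w_final:+.4f}, E(w) = {error(w_final):+.4f}")

# The run started at -2.0 reaches the deeper (global) minimum near w = -1.30,
# while the run started at +2.0 settles in the shallower local minimum near w = 1.13.
```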
The information in the "About this book" section may refer to different editions of this title.
EUR 3.41 for shipping within the U.S.A.
From: BookOrders, Russell, IA, U.S.A.
Hard Cover. Condition: Good. No Jacket. Usual ex-library features. The interior is clean and tight. Binding is good. Cover shows slight wear. 245. Ex-Library. Item code 113279
Quantity: 1 available
From: bmyguest books, Toronto, ON, Canada
Hardcover. Condition: Very Good. 243 pages with the index. In fine condition. Name inscription on the front endpaper. Books are NOT signed unless we state so in the description; signed copies are confirmed via email or noted in the description box. Specializing in academic, collectible, and historically significant books, providing the utmost quality and customer service satisfaction. For any questions feel free to email us. Item code 038048
Quantity: 1 available
From: Hay-on-Wye Booksellers, Hay-on-Wye, HEREF, United Kingdom
Condition: Fine. Item code 077826-2
Quantity: 1 available
From: Hay-on-Wye Booksellers, Hay-on-Wye, HEREF, United Kingdom
Condition: Very Good. Item code 011433-1
Quantity: 1 available