From: GreatBookPrices, Columbia, MD, U.S.A.
EUR 57,08
Quantity: More than 20 available
Condition: New.
From: GreatBookPrices, Columbia, MD, U.S.A.
EUR 63,50
Quantity: More than 20 available
Condition: As New. Unread book in perfect condition.
From: California Books, Miami, FL, U.S.A.
EUR 66,14
Quantity: More than 20 available
Condition: New.
From: Ria Christie Collections, Uxbridge, United Kingdom
EUR 58,63
Quantity: More than 20 available
Condition: New. In English.
From: GreatBookPricesUK, Woodford Green, United Kingdom
EUR 56,12
Quantity: More than 20 available
Condition: New.
From: GreatBookPricesUK, Woodford Green, United Kingdom
EUR 66,51
Quantity: More than 20 available
Condition: As New. Unread book in perfect condition.
Condition: New. 1st edition NO-PA16APR2015-KAP.
From: Majestic Books, Hounslow, United Kingdom
EUR 86,45
Quantity: 1 available
Condition: New.
Language: English
Publisher: Springer International Publishing, 2020
ISBN 10: 3031010493 ISBN 13: 9783031010491
From: AHA-BUCH GmbH, Einbeck, Germany
EUR 58,84
Quantity: 1 available
Paperback. Condition: New. Print on demand, new stock - printed after ordering. Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable in modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but the attention soon started to shift to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature.
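The description above centers on word embeddings such as Word2Vec and GloVe, which place related words close together in vector space. As a minimal illustration of that idea (toy hand-made 3-dimensional vectors, not trained embeddings from any actual model), similarity between embedded words is typically measured with cosine similarity:

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: dot(u, v) / (|u| * |v|)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "embeddings" for illustration only; real models like Word2Vec
# learn vectors with hundreds of dimensions from large corpora.
king  = [0.9, 0.8, 0.1]
queen = [0.9, 0.7, 0.2]
apple = [0.1, 0.2, 0.9]

print(cosine_similarity(king, queen))  # high: semantically related words
print(cosine_similarity(king, apple))  # lower: unrelated words
```

With these toy vectors, "king" and "queen" score far closer to 1.0 than "king" and "apple", mirroring how trained embeddings encode semantic relatedness.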
From: preigu, Osnabrück, Germany
EUR 53,50
Quantity: 5 available
Paperback. Condition: New. Embeddings in Natural Language Processing | Theory and Advances in Vector Representations of Meaning | Jose Camacho-Collados (et al.) | Paperback | xviii | English | 2020 | Springer | EAN 9783031010491 | Responsible person for the EU: Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg, juergen[dot]hartmann[at]springer[dot]com | Seller: preigu.
Language: English
Publisher: Springer International Publishing, Nov 2020
ISBN 10: 3031010493 ISBN 13: 9783031010491
From: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
EUR 58,84
Quantity: 2 available
Paperback. Condition: New. This item is printed on demand - it takes 3-4 days longer - new stock. (Publisher's description as above.) 176 pp. English.
Language: English
Publisher: Springer, Berlin | Springer International Publishing | Morgan & Claypool | Springer, 2020
ISBN 10: 3031010493 ISBN 13: 9783031010491
From: moluna, Greven, Germany
EUR 51,51
Quantity: More than 20 available
Condition: New. This is a print-on-demand item and will be printed for you after ordering. (Publisher's description as above.)
From: Books Puddle, New York, NY, U.S.A.
Condition: New. Print on Demand. pp. 176.
Language: English
Publisher: Springer International Publishing, Nov 2020
ISBN 10: 3031010493 ISBN 13: 9783031010491
From: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
EUR 58,84
Quantity: 1 available
Paperback. Condition: New. This item is printed on demand - print-on-demand title. New stock. (Publisher's description as above.) Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 176 pp. English.