Over the next few decades, millions of people with varying backgrounds and levels of technical expertise will have to interact effectively with robotic technologies on a daily basis. This means it must become possible to modify robot behavior without explicitly writing code, and instead via a small number of wearable devices or visual demonstrations. At the same time, robots will need to infer and predict humans' intentions and internal objectives on the basis of past interactions in order to provide assistance before it is explicitly requested; this is the basis of imitation learning for robotics.
This book introduces readers to robotic imitation learning based on human demonstration with wearable devices. It presents an advanced calibration method for wearable sensors, fusion approaches under the Kalman filter framework, and a novel wearable device for capturing gestures and other motions. Furthermore, it describes wearable-device-based and vision-based imitation learning methods for robotic manipulation, making the book a valuable reference guide for graduate students with a basic knowledge of machine learning, and for researchers interested in wearable computing and robotic learning.
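The book's own algorithms are not reproduced in this description; purely as a generic illustration of the kind of Kalman-filter sensor fusion the blurb refers to, a minimal one-dimensional sketch follows (the function name, noise variances, and sample readings are illustrative assumptions, not taken from the book):

```python
def kalman_1d(measurements, q=1e-3, r=0.1):
    """Minimal 1-D Kalman filter: fuse noisy scalar readings into one estimate.

    q: process noise variance, r: measurement noise variance (illustrative).
    """
    x, p = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict: constant-state model, so only the uncertainty grows.
        p += q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

# Example: noisy readings around a true value of 1.0
readings = [1.2, 0.9, 1.1, 0.95, 1.05]
est = kalman_1d(readings)
```

Real wearable-sensor fusion (e.g. fusing accelerometer and gyroscope data) uses multivariate state and measurement models, but the predict/update structure is the same.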
The information in the "Summary" section may refer to different editions of this title.
Bin Fang is an Assistant Researcher at the Department of Computer Science and Technology, Tsinghua University. His main research interests include wearable devices and human-robot interaction. He is a lead guest editor for a number of journals, including Frontiers in Neurorobotics and Frontiers in Robotics and AI, and has served as an associate editor for various journals and conferences, e.g. the International Journal of Advanced Robotic Systems and the IEEE International Conference on Advanced Robotics and Mechatronics.
Fuchun Sun is a Full Professor at the Department of Computer Science and Technology, Tsinghua University. A recipient of the National Science Fund for Distinguished Young Scholars, his main research interests include intelligent control and robotics. He serves as an associate editor for a number of international journals, including IEEE Transactions on Systems, Man and Cybernetics: Systems, IEEE Transactions on Fuzzy Systems, and Mechatronics, Robotics and Autonomous Systems.
Huaping Liu is an Associate Professor at the Department of Computer Science and Technology, Tsinghua University. His main research interests include robotic perception and learning. He serves as an associate editor for various journals, including IEEE Transactions on Automation Science and Engineering, IEEE Transactions on Industrial Informatics, IEEE Robotics & Automation Letters, Neurocomputing, and Cognitive Computation.
Chunfang Liu is an Assistant Professor at the Department of Artificial Intelligence and Automation, Beijing University of Technology. Her research interests include intelligent robotics and vision.
Di Guo received her Ph.D. degree from the Department of Computer Science and Technology, Tsinghua University, Beijing, in 2017. Her research interests include robotic manipulation and sensor fusion.
FREE shipping from Germany to Italy
EUR 9.70 shipping from Germany to Italy
From: Buchpark, Trebbin, Germany
Condition: Good | Pages: 232 | Language: English | Product type: Books. Item code 36259454/3
Quantity: 1 available
From: moluna, Greven, Germany
Hardcover. Condition: New. This is a print-on-demand item and will be printed for you after you order. Provides a systematic and comprehensive introduction to robotic imitation learning. Introduces demonstration-learning-based solutions for robotic manipulation. Showcases the applications of wearable devices in robotic learning. Item code 362613533
Quantity: More than 20 available
From: Brook Bookstore On Demand, Naples, NA, Italy
Condition: New. This is a print-on-demand item. Item code 7bd9de395088341e551226ab281787bf
Quantity: More than 20 available
From: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Item code ria9789811551239_new
Quantity: More than 20 available
From: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Hardcover. Condition: New. This item is printed on demand; it takes 3-4 days longer. New stock. 232 pp. English. Item code 9789811551239
Quantity: 2 available
From: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
Hardcover. Condition: New. New stock. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 232 pp. English. Item code 9789811551239
Quantity: 2 available
From: AHA-BUCH GmbH, Einbeck, Germany
Hardcover. Condition: New. Print on demand; printed after ordering. Item code 9789811551239
Quantity: 1 available
From: California Books, Miami, FL, U.S.A.
Condition: New. Item code I-9789811551239
Quantity: More than 20 available
From: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New. Item code ABLIING23Apr0412070089290
Quantity: More than 20 available
From: Revaluation Books, Exeter, United Kingdom
Hardcover. Condition: Brand New. 232 pages. 9.25 x 6.10 x 0.75 inches. In stock. Item code x-9811551235
Quantity: 2 available