The Azure Data Lakehouse Toolkit (Paperback)
Ron L'Esteve
Sold by AussieBookSeller, Truganina, VIC, Australia
AbeBooks seller since 22 June 2007
New - Softcover
Condition: New
Quantity: 1 available
Paperback. Design and implement a modern data lakehouse on the Azure Data Platform using Delta Lake, Apache Spark, Azure Databricks, Azure Synapse Analytics, and Snowflake. This book teaches you the intricate details of the Data Lakehouse Paradigm and how to efficiently design a cloud-based data lakehouse using highly performant, cutting-edge Apache Spark capabilities in Azure Databricks, Azure Synapse Analytics, and Snowflake. You will learn to write efficient PySpark code for batch and streaming ELT jobs on Azure, and you will follow along with practical, scenario-based examples showing how to apply the capabilities of Delta Lake and Apache Spark to optimize performance and to secure, share, and manage a high volume, high velocity, and high variety of data in your lakehouse with ease.

The patterns of success that you acquire from reading this book will help you hone your skills to build high-performing and scalable ACID-compliant lakehouses using flexible and cost-efficient decoupled storage and compute capabilities. Extensive coverage of Delta Lake ensures that you are aware of, and can benefit from, all that this new, open source storage layer can offer. In addition to the deep examples on Databricks in the book, there is coverage of alternative platforms such as Synapse Analytics and Snowflake so that you can make the right platform choice for your needs.

After reading this book, you will be able to implement Delta Lake capabilities, including Schema Evolution, Change Feed, Live Tables, Sharing, and Clones, to enable better business intelligence and advanced analytics on your data within the Azure Data Platform.

What You Will Learn
- Implement the Data Lakehouse Paradigm on Microsoft's Azure cloud platform
- Benefit from the new Delta Lake open-source storage layer for data lakehouses
- Take advantage of schema evolution, change feeds, live tables, and more
- Write functional PySpark code for data lakehouse ELT jobs
- Optimize Apache Spark performance through partitioning, indexing, and other tuning options
- Choose between alternatives such as Databricks, Synapse Analytics, and Snowflake

Who This Book Is For
Data, analytics, and AI professionals at all levels, including data architect and data engineer practitioners. Also for data professionals seeking patterns of success by which to remain relevant as they learn to build scalable data lakehouses for their organizations and customers who are migrating into the modern Azure Data Platform. Intermediate user level.

Shipping may be from our Sydney, NSW warehouse or from our UK or US warehouse, depending on stock availability.
Item number 9781484282328
Ron holds several Azure Data, AI, and Lakehouse certifications and has been a go-to technical advisor for some of the largest and most impactful Azure implementation projects on the planet. He has been responsible for scaling key data architectures and defining the roadmap and strategy for future data and business intelligence needs. He challenges customers to grow by thoroughly understanding fluid business opportunities and translating them into high-quality, sustainable technical solutions that solve the most complex challenges and promote digital innovation and transformation.
Ron is a gifted presenter and trainer, known for his innate ability to clearly articulate and explain complex topics to audiences of all skill levels. He applies a practical and business-oriented approach by taking transformational ideas from concept to scale. He is a true enabler of positive and impactful change by championing a growth mindset.
The information in the "About this book" section may refer to different editions of this title.
We guarantee the condition of every book as it's described on the AbeBooks websites. If you're dissatisfied with your purchase (Incorrect Book/Not as Described/Damaged) or if the order hasn't arrived, you're eligible for a refund within 30 days of the estimated delivery date. If you've changed your mind about a book that you've ordered, please use the "Ask bookseller a question" link to contact us and we'll respond within 2 business days.
Please note that titles are dispatched from our UK and NZ warehouses. Delivery times are specified in the shipping terms. Orders ship within 2 business days; delivery to your door then takes 8-15 days.
Order quantity | 25 to 60 business days | 8 to 59 business days
---|---|---
First item | EUR 31.77 | EUR 37.78
Delivery times are set by sellers and vary by carrier and country. Orders that must clear customs may be delayed, and buyers are responsible for paying any associated tariffs or duties. Sellers may contact you about additional charges resulting from increases in the shipping costs for your items.