A well-architected data lakehouse provides an open data platform that combines streaming with data warehousing, data engineering, data science, and ML. This opens a world beyond streaming, letting teams solve business problems in real time with analytics and AI. See how companies like Albertsons have used Confluent and Databricks together, combining Kafka streaming with the lakehouse, to power their digital transformation.
In this talk, you will learn:
- The built-in streaming capabilities of a lakehouse
- Best practices for integrating Kafka with Spark Structured Streaming
- How Albertsons architected their data platform for real-time data processing and real-time analytics
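As background for the Kafka integration topic above, a common first step when Spark Structured Streaming reads from Kafka is handling the record payload: the Kafka source exposes each record's key and value as binary columns, which are typically cast to strings and parsed (e.g. with `from_json`). This plain-Python sketch mirrors that decode-and-parse step for a single record outside of Spark; the connection options and field names (`store_id`, `sku`, `qty`) are hypothetical examples, not from the talk.

```python
import json

# Kafka source options as they would be passed to
# spark.readStream.format("kafka").option(...); values are placeholders.
kafka_options = {
    "kafka.bootstrap.servers": "broker:9092",
    "subscribe": "orders",           # topic to consume
    "startingOffsets": "latest",
}

def parse_kafka_value(raw_value: bytes) -> dict:
    """Decode a Kafka record's binary value and parse its JSON payload,
    mirroring what a cast-to-string plus from_json does in Spark."""
    return json.loads(raw_value.decode("utf-8"))

# A Kafka record value arrives as bytes; parse it into a usable dict.
record_value = b'{"store_id": "S42", "sku": "MILK-1L", "qty": 3}'
event = parse_kafka_value(record_value)
print(event["sku"])  # -> MILK-1L
```

In an actual Structured Streaming job, the same transformation is expressed declaratively over the streaming DataFrame rather than per record, so Spark can manage offsets, checkpointing, and exactly-once semantics for you.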
Speaker
Emma Liu
Speaker
Nitin Saksena
At Albertsons, Nitin leads the Enterprise Architecture Team across eCommerce, Digital Shopping Experience, Marketing and Media Collective, Merchandising, and Pharmacy. He has 19+ years of industry experience, predominantly in retail, and has architected several key initiatives in loyalty, offer execution and redemption, order management, and customer support. He is passionate about solving complex business problems with simple solutions.
Speaker
Ram Dhakne
Ram works as a Staff Solutions Engineer at Confluent. He has a wide array of experience in NoSQL databases, filesystems, distributed systems, and Apache Kafka. His current focus is helping customers adopt real-time event streaming technologies using Kafka. He supports industry verticals ranging from large retailers to healthcare, telecom, and utilities companies on their digital modernization journeys.