Michelin, the world’s leading mobility company, was struggling with unreliable, legacy batch reporting across their global supply chain, with stale inventory and reporting data hampering global logistics operations. They needed to modernize their operations with real-time inventory data to remove these blockers across their entire supply chain.
This is why they chose Apache Kafka®, the de facto standard for data streaming, for their new event-driven architecture. However, self-managing open source Kafka on-prem led to frequent outages for mission-critical applications, requiring Michelin to dedicate valuable engineers to ongoing Kafka management and maintenance.
This was obviously not a sustainable solution. To free their engineers to focus on building mission-critical applications rather than the underlying infrastructure, Michelin turned to Confluent Cloud, a cloud-native and complete data streaming platform, fully supported by the Kafka experts.
In this interactive webinar, Paul Amar, Delivery Manager, and Sebastien Viale, Technical Kafka Expert, at Michelin, will dive into how the organization transitioned to an event-driven architecture, explain why they chose Confluent, and share best practices for realizing the benefits it has provided for teams across the company.
You’ll learn how Michelin: