Available On-demand
Learn how BigCommerce migrated 1.6 billion events a day from self-managed Kafka to Confluent in just five months, saving 20+ hours a week of Kafka management.
BigCommerce's batch-based ETL system meant their merchants had to wait eight hours before pulling any analytics and insights reporting (e.g., who was visiting their sites, adding to carts, which products were popular). The system consisted of some 30 MapReduce jobs, which would sometimes fail and require manual intervention.
So what did they do?
This online talk showcases how BigCommerce transitioned their legacy batch-based ETL system to an event-driven architecture in four phases.
Register now to learn how BigCommerce used Confluent to:
- Build a data streaming platform with Google Cloud Platform to deliver real-time features like product recommendations
- Provide real-time analytics and insights to their merchants via an open architecture
- Elastically scale their Kafka deployment to support traffic spikes from Black Friday
- Increase system performance and reduce maintenance work such as software patches and system updates, while closing blind spots in their data infrastructure
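The core shift behind these outcomes is moving from recomputing merchant analytics in long-running batch jobs to updating them incrementally as each event arrives. The sketch below illustrates that idea in miniature; it is not BigCommerce's or Confluent's actual code, and the event shape and counter logic are assumptions — in the real pipeline, events would be consumed continuously from a Kafka topic rather than iterated from a list.

```python
from collections import Counter

# Illustrative storefront events; a real consumer would poll these
# from a Kafka topic instead of reading a list.
events = [
    {"merchant": "store-1", "type": "page_view", "product": "mug"},
    {"merchant": "store-1", "type": "add_to_cart", "product": "mug"},
    {"merchant": "store-2", "type": "page_view", "product": "hat"},
    {"merchant": "store-1", "type": "add_to_cart", "product": "tee"},
]

def process(event, stats):
    """Update per-merchant counts incrementally, the way a stream
    processor would on each consumed record."""
    stats.setdefault(event["merchant"], Counter())[event["type"]] += 1

stats = {}
for event in events:  # stand-in for a consumer poll loop
    process(event, stats)

print(stats["store-1"]["add_to_cart"])  # → 2
```

Because counts are current after every event, merchants can query them immediately instead of waiting hours for the next batch run.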
Presenters
Sophia Jiang
Sr. Product Marketing Manager, Confluent
Sophia Jiang is a Senior Product Marketing Manager at Confluent, where she is responsible for messaging and go-to-market activities for ksqlDB. Prior to Confluent, Sophia led the retail, CPG, and manufacturing GTM at MuleSoft.
Mahendra Kumar
VP Data and Software Engineering, BigCommerce
Mahendra Kumar is the VP of Data and Software Engineering at BigCommerce, where he is responsible for Data Analytics, Search, and ML strategy and execution. Mahendra has built massive-scale real-time distributed data platforms at multiple organizations. He is passionate about building great teams and great products!