
From Outages to On-Time Delivery: How Confluent Cloud Transformed a Delivery Company's Data Infrastructure


Today’s consumers expect their orders to be fulfilled rapidly. Retail giants like Amazon, Target, and Walmart all offer same-day delivery, and consumers now demand the same from other retailers; according to a recent study by McKinsey, 30% of consumers consider same-day delivery table stakes when making purchases online.

To enable a reliable same-day delivery service, however, delivery companies must implement a data infrastructure that seamlessly connects disparate systems – customer orders, vendor stock levels, and last-mile delivery platforms all need to work in tandem to ensure that items reach the consumer on time.

In this blog, we’ll explain how one grocery delivery company guarantees same-day delivery with data streaming and Confluent as the backbone of its data infrastructure. We’ll outline the challenges the company faced before partnering with Confluent, then delve into how its current event-driven architecture enables it to meet the lightning-fast demands of customers.

Challenges with Traditional Approaches

Before turning to Confluent Cloud, this company had been relying on Apache Kafka on Azure HDInsight for core operations; it powered everything from order and merchant updates to communication with delivery drivers. Despite Kafka’s central role in the business, however, this implementation presented a number of challenges.

The main issue was that their team lacked the expertise required to maintain Kafka on Azure HDInsight. Without a fully managed Kafka service, the team spent a significant amount of time troubleshooting; software upgrades and security patches were missed, and streaming pipelines failed frequently. This led to outages in the grocery delivery service, causing significant inconvenience to customers.

In addition, their previous Kafka implementation consumed a substantial proportion of their operating budget while also preventing them from innovating further with data streaming. In their previous setup on Azure HDInsight (shown below), the company lacked managed connectors – this not only increased the maintenance burden on the infrastructure team, but also severely limited their ability to sync data to other datastores used in the organization. On top of this, the platform didn’t offer any form of stream processing, making it harder to use their data streams for real-time use cases. Lastly, as Azure HDInsight is Hadoop-based, the team had to stand up, run, and pay for an entire Hadoop cluster just to use Kafka – an extra burden and cost.

Previous architecture

The Solution: A Cloud-Native Kafka Service 

To significantly reduce the risk of downtime and the burden of continual maintenance, the company needed a fully managed data streaming platform. They decided to implement Confluent Cloud – a complete, cloud-native Kafka service powered by the Kora engine. With Confluent at the center of their streaming architecture, they could ensure the high availability and reliability of their Kafka deployment and avoid disruption to their same-day delivery service.

Same-Day Delivery: Example Architecture

In addition to being fully managed, Confluent Cloud offers a number of features that underpin the grocery delivery company’s same-day delivery service:

  • Multi-AZ dedicated cluster – A multi-AZ dedicated cluster with VNet peering on Azure ensures the high availability and reliability of their streaming data pipelines.

  • Pre-built connectors – Confluent offers a library of over 120 pre-built connectors, 70 of which are fully managed. This company leverages Debezium MongoDB and SQL Server connectors to stream customer, transaction, and shipping data (with an average throughput of 10 MB/s) to Confluent, as well as a JDBC Sink connector to stream data from Confluent into other databases (a configuration sketch follows this list).

  • Schema Registry – A key component of Stream Governance, Schema Registry ensures the consistency of this company’s data: it validates schemas between producers and consumers, provides compatibility checks and versioning, and facilitates schema evolution.

  • Supported client libraries – Confluent supports a number of popular client libraries, including Java and Python, as well as a REST API, enabling a broad range of developers to interact with Kafka. This company took advantage of Confluent’s .NET client library to integrate Kafka into their .NET applications (see the Python example after this list).
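To make the connector pattern concrete, here is a minimal sketch of registering a Debezium SQL Server source and a JDBC sink against a self-managed Kafka Connect REST endpoint. Confluent Cloud’s fully managed connectors are configured through the Cloud console or API instead, and exact property names vary by connector version; all hostnames, credentials, topics, and table names below are hypothetical placeholders, not the company’s actual configuration.

```python
import json
import requests

# Hypothetical Kafka Connect REST endpoint.
CONNECT_URL = "http://connect.example.internal:8083/connectors"

# Debezium SQL Server source: streams change events from transactional
# tables into Kafka topics. Property names follow Debezium 2.x; older
# versions use e.g. database.server.name instead of topic.prefix.
sqlserver_source = {
    "name": "orders-sqlserver-source",
    "config": {
        "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
        "database.hostname": "sqlserver.example.internal",
        "database.port": "1433",
        "database.user": "cdc_user",
        "database.password": "********",
        "database.names": "orders_db",
        "topic.prefix": "delivery",
        "table.include.list": "dbo.orders,dbo.shipments",
        "schema.history.internal.kafka.bootstrap.servers": "broker.example.internal:9092",
        "schema.history.internal.kafka.topic": "schema-history.orders",
    },
}

# JDBC sink: replays a change topic into a downstream database. The
# unwrap SMT flattens Debezium's change-event envelope into plain rows.
jdbc_sink = {
    "name": "orders-jdbc-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "topics": "delivery.orders_db.dbo.orders",
        "connection.url": "jdbc:postgresql://analytics.example.internal:5432/analytics",
        "connection.user": "sink_user",
        "connection.password": "********",
        "transforms": "unwrap",
        "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
        "insert.mode": "upsert",
        "pk.mode": "record_key",
        "auto.create": "true",
    },
}

# Register both connectors with the Connect cluster.
for connector in (sqlserver_source, jdbc_sink):
    resp = requests.post(CONNECT_URL, json=connector)
    resp.raise_for_status()
    print(json.dumps(resp.json(), indent=2))
```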
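While the company itself integrates through the .NET client, the same producer pattern is easy to illustrate with Confluent’s Python client (confluent-kafka): a producer that serializes order events with an Avro schema managed by Schema Registry, so incompatible schema changes are rejected at registration rather than surfacing downstream. The endpoints, API keys, topic, and schema below are illustrative placeholders.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

# Illustrative Avro schema for an order event; a real schema would be
# agreed between teams and evolved through Schema Registry.
ORDER_SCHEMA = """
{
  "type": "record",
  "name": "OrderEvent",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "status", "type": "string"},
    {"name": "placed_at", "type": "long"}
  ]
}
"""

# Placeholder Confluent Cloud Schema Registry endpoint and credentials.
schema_registry = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.westeurope.azure.confluent.cloud",
    "basic.auth.user.info": "SR_API_KEY:SR_API_SECRET",
})
avro_serializer = AvroSerializer(schema_registry, ORDER_SCHEMA)

# Placeholder Confluent Cloud cluster bootstrap endpoint and API keys.
producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.westeurope.azure.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "CLUSTER_API_KEY",
    "sasl.password": "CLUSTER_API_SECRET",
})

def publish_order(order: dict, topic: str = "orders") -> None:
    """Serialize the order against the registered schema and produce it."""
    payload = avro_serializer(order, SerializationContext(topic, MessageField.VALUE))
    producer.produce(topic=topic, key=order["order_id"].encode("utf-8"), value=payload)
    producer.flush()

publish_order({"order_id": "o-123", "status": "PLACED", "placed_at": 1700000000000})
```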

Result: Seamless Kafka Ops and More Innovation!

With Confluent Cloud, this company has navigated a number of technical challenges and operational pain points to provide a reliable, secure data infrastructure for their same-day delivery service. Moving off their Kafka implementation on Azure HDInsight, they’ve ensured the high availability and reliability of their streaming data pipelines, preventing outages and time-intensive maintenance windows. Furthermore, having offloaded the burden of managing Kafka clusters to Confluent, the company’s data team has been freed up to improve existing applications and deliver new features – ultimately improving the customer experience.

Ready to try Confluent Cloud?
