
Real-time Order Notifications: How ACERTUS Drives Customer Updates with Confluent


Today’s consumers expect to be kept updated on the status of their orders 24/7. Whether they’re requesting a ride, receiving a package, or purchasing items online, they want to know the current status without having to contact sellers directly.

At ACERTUS, we know that today’s car buyers expect that, too. If you’ve ever used a marketplace car buying company like Driveway, Vroom, Carvana, TrueCar, or CarGurus, odds are you’ve used our service without even knowing it.

As the only vehicle logistics company in the U.S. that handles transportation, storage, reconditioning, and registration services for car buying and selling, we make sure customers get that seamless experience they’re used to with a real-time view of their order status across different business lines.

ACERTUS’s goal is that everyone buying a car online has a transparent, accessible experience on our app platforms. But for a long time, we were operating far from our full potential. At the center of the problem was that we didn’t have access to the real-time data from our apps to make that happen. Each of our units was operating with its own siloed, monolithic approach to data and relied on batch processing to communicate, a legacy due largely to acquisitions the company made in the 2010s.

At a time when data needed to be shared quickly and efficiently across business units, our teams were relying on manual processes, which were inefficient and error-prone. By the time data made it into a report, it was either stale or arrived too late to act on.

We knew we could do better, and that’s where Confluent stepped in.

Data cohesion built the foundation for real-time notifications

Prior to adopting Confluent, ACERTUS had three separate apps for each of its business units: one each for Car Haul, Drive Away, and Title and Registration. Each had its own systems and processes, with virtually no database coordination among the thousands of vehicles sold monthly—a cumbersome process that was bad for business. 

Applications depended on various databases (Postgres, MySQL, and Microsoft SQL Server) that didn’t communicate, so customers couldn’t get a real-time view of orders across lines of business. Without real-time integration between databases, we relied on inefficient, time-intensive, and error-prone processes to update customers on their orders. Resources were wasted, the customer experience was suffering, and internally we couldn’t even retain data indefinitely, which prevented us from developing new microservices down the road.

Building a modern microservices architecture on data streaming

We knew that we had to integrate these databases to move our infrastructure forward. The first step was building a centralized data warehouse that pulled that data together. 

Although this gave us the visibility we needed across our different databases, it didn’t help us deliver real-time use cases. This was because data was being sent to the warehouse via batch ETL, which meant that by the time reports were generated, the data was already out of date. 

So adopting event streaming was the key to real-time use cases and overall infrastructure modernization. Though we’d evaluated Amazon Kinesis and Amazon MSK (we’re generally an AWS shop), Confluent’s data streaming platform stood out through a number of features and offerings, including:

  1. Infinite storage and retention: Confluent can store data in perpetuity with no storage limits. This contrasts with Amazon Kinesis, which has a 24-hour retention by default and 365-day maximum, and Amazon MSK serverless, which has a one-day maximum retention.

  2. ksqlDB: Confluent offers an accessible way of processing data streams via ksqlDB. This facilitated faster development of microservices. (Check out the ksqlDB code snippet below to see how we handle real-time order updates.)

  3. Stream governance: We use various features included in Confluent’s Stream Governance package, including Schema Registry (for evolving and validating schemas) and Stream Lineage (to visualize and communicate data flows).

  4. Pre-built, fully managed connectors: We use Confluent’s CDC connectors for Postgres, MySQL, and SQL Server. Amazon MSK does not offer managed connectors.

  5. An outstanding support team: Amazon MSK was essentially the same as self-managing Kafka. We wanted the fully managed service Confluent was offering.

  6. Data in motion: We appreciated that Confluent has a clear vision and roadmap for data in motion, beyond just managed Kafka.
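On the connector point, Confluent Cloud’s CDC source connectors can even be created declaratively from ksqlDB. The sketch below is hypothetical: the hostname, credentials, and table names are placeholders, and the exact connector class and properties depend on the Confluent Cloud connector version in use.

```sql
-- Hypothetical fully managed Postgres CDC source connector declared via ksqlDB
-- (all connection details below are placeholders, not real endpoints)
CREATE SOURCE CONNECTOR pg_orders_cdc WITH (
  'connector.class'    = 'PostgresCdcSource',
  'database.hostname'  = 'orders-db.example.internal',
  'database.port'      = '5432',
  'database.user'      = 'replicator',
  'database.password'  = '<secret>',
  'database.dbname'    = 'orders',
  'table.include.list' = 'public.orders',
  'output.data.format' = 'AVRO'
);
```

Once a connector like this is running, each committed change to the source table lands on a Kafka topic that downstream ksqlDB streams can read.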

All these things together allowed us to integrate data across our databases in real time to easily implement several new event-driven microservices, including real-time order notifications—it immediately made a difference for us.

Once the warehouse was built, accessing the right data in the first place was simple. We plugged Confluent into each of the three separate legacy databases, and it was almost immediately available centrally without having to go through any chokepoints.

ksqlDB code snippet that facilitates real-time order updates from the ACERTUS database
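The snippet itself appears as an image in the original post, so here is a minimal illustrative sketch of the pattern it describes, assuming hypothetical topic, stream, and column names rather than ACERTUS’s actual schema:

```sql
-- Hypothetical stream over a CDC topic for an orders table
-- (topic, stream, and column names are illustrative)
CREATE STREAM orders_cdc (
  order_id VARCHAR KEY,
  customer_id VARCHAR,
  status VARCHAR,
  updated_at BIGINT
) WITH (
  KAFKA_TOPIC = 'postgres.public.orders',  -- assumed CDC topic name
  VALUE_FORMAT = 'AVRO'
);

-- Continuous query that turns every status change into a notification event
CREATE STREAM order_notifications WITH (KAFKA_TOPIC = 'order-notifications') AS
  SELECT
    order_id,
    customer_id,
    status,
    TIMESTAMPTOSTRING(updated_at, 'yyyy-MM-dd HH:mm:ss') AS status_changed_at
  FROM orders_cdc
  WHERE status IS NOT NULL
  EMIT CHANGES;
```

A notification microservice can then consume the `order-notifications` topic and push updates to customers as each event arrives.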

Building a better experience for the business and customers

The outcomes of onboarding Confluent have been substantial. The customer experience has improved greatly, as have operational efficiencies. And, our developer team can deploy microservices much faster than before, thanks to ksqlDB and Stream Governance. 

Because we can receive and process orders in real time, we save costs as well as time and resources by not needing to manually request info from disparate databases. We also reduced costs by using the data to make offers on cars in real time. Instead of negotiating to find the right price, Confluent lets us use AI to draw on our data and surface a price the customer will accept.

Customer outcomes are equally impressive. When an order comes into our system, our process immediately analyzes the customer’s needs and profile, connects the order with a carrier or service provider, and notifies the customer that a driver is on the way to pick up the car. That experience is invaluable to a user.

Here’s a look at the architecture we’re running today with Confluent:

Confluent Cloud has become the core of all our data storage, connecting Kafka externally with the marketplace gateway and consumers, and internally with Streaming ETL, Salesforce, and our data. —Jeffrey Jonathan Jennings, VP of Data and Integration at ACERTUS

Real-time data drives real-time outcomes

We’ve seen solid business results across all the units that used to be siloed, and we’ve even earned new business. For example, with our customer CarGurus we’ve integrated systems so that all data moving in and out is tracked on both sides. CarGurus chose to work with us because of our Confluent use, knowing it would be easy to integrate and get real-time order updates.

This shared system acts as a fail-safe, too: if anything goes down on either end, we can retrieve data in Confluent and process it without missing a beat. Since we’re using Confluent for real-time data streaming, partners like CarGurus no longer need daily, monthly, or weekly reports—notifications are instant.

Confluent helped ACERTUS drive revenue growth and improve the user experience by using the power of data to give customers real-time order notifications.

If you’re looking to boost your order tracking, see how Confluent helps customers track order shipments in real time, or how it has supported other data use cases like real-time inventory management with retailers like Walmart.
