As the largest online marketplace in Switzerland, Ricardo is accustomed to tackling large-scale IT initiatives and completing them on time. So, when the company recently decided to migrate from its data center to the cloud, it made a bold move: canceling its data center contract in advance. Tobias Kaymak, Senior Data Engineer at Ricardo, recalls, “At that point, there was no going back; there would be no half-migration. Everyone was aware that we would be in the cloud by the target date, no excuses.”
The move was made simpler, in part, because Ricardo was already using Apache Kafka® for event streaming in its data center. Working with Confluent engineers, Kaymak and his data intelligence team transitioned smoothly from on-prem Kafka to Confluent Cloud and completed the overall cloud migration on schedule. As he puts it, “We met our aggressive deadline, with no machines left behind.” Today, the team is deploying new streaming pipelines in the cloud with Confluent Cloud, Google BigQuery, Apache Beam, and Apache Flink.
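To make that pipeline stack concrete, here is a minimal Apache Beam sketch of the pattern the team describes: reading events from a Kafka topic and appending them to a BigQuery table. It is illustrative only; the bootstrap server, topic name, table reference, and field names are hypothetical placeholders, not Ricardo's actual configuration.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaToBigQueryPipeline {
  public static void main(String[] args) {
    Pipeline pipeline =
        Pipeline.create(PipelineOptionsFactory.fromArgs(args).withValidation().create());

    pipeline
        // Consume records from a Kafka topic; broker address and topic are placeholders.
        .apply("ReadFromKafka",
            KafkaIO.<String, String>read()
                .withBootstrapServers("broker.example.com:9092") // hypothetical endpoint
                .withTopic("article-events")                     // hypothetical topic
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withoutMetadata())
        // Convert each key/value pair into a BigQuery TableRow.
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((KV<String, String> record) ->
                    new TableRow()
                        .set("event_key", record.getKey())
                        .set("payload", record.getValue())))
        // Append rows to an existing BigQuery table; with an unbounded Kafka source,
        // BigQueryIO defaults to streaming inserts.
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-project:marketplace.article_events") // hypothetical table reference
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}
```

The appeal of this shape of pipeline is that the same Beam code can run on different runners, such as Dataflow or Flink, which fits the mixed Beam-and-Flink toolset the team mentions.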
The transformational event streaming and cloud journey underway at Ricardo is intertwined with Kaymak’s own journey with Kafka, which began when he was still a student. In his blog post, Kaymak shares his perspective on how far event streaming has come since he started with a single broker and Kafka 0.6 eight years ago, and how his individual journey connects with those of Ricardo, of other companies he has worked for, and of the wider Kafka community.