If your data platform is powered only by batch processing, you know you are always trailing your customers. Your databases are never fully up to date, the lack of a synchronized data flow across systems leads to operational inefficiencies, and your dreams of running advanced real-time AI and ML applications go unfulfilled. Yet you might be wary of the implications of turning your product into an event-driven one.

In this presentation we'll share our experience transforming our CDP-based marketing orchestration engine into a real-time, highly scalable system built on the Kafka ecosystem. We will look at how we saved resources with Kafka Connect when ingesting and syncing data with NoSQL databases, data warehouses, and third-party platforms; how we turned ksqlDB into our hub for data transformation, aggregation, and querying, reducing both latency and costs; and how Kafka Streams helps us run multiple real-time applications, such as building identity graphs, updating materialized views at high frequency for efficient real-time lookups, and serving machine learning model inference. Finally, we will show how Confluent Cloud answered our pre-rollout sizing and scaling questions, significantly reducing time-to-market.
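To give a taste of the Kafka Streams piece, here is a minimal sketch of the materialized-view pattern: a topology that keeps a continuously updated per-customer aggregate in a local state store, ready for low-latency lookups. The topic name, application id, and the choice of a simple event count are illustrative assumptions, not our production logic.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

import java.util.Properties;

public class CustomerViewApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-view-materializer");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Consume customer events keyed by customer id (hypothetical topic name).
        // Counting events per customer stands in for whatever aggregation the
        // view actually needs; the KTable is updated on every incoming record.
        KTable<String, Long> perCustomer = builder
                .stream("customer-events", Consumed.with(Serdes.String(), Serdes.String()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as(
                        "customer-event-counts"));

        // The backing state store ("customer-event-counts") can be read via
        // Kafka Streams interactive queries to serve real-time lookups.
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the view lives in a state store co-located with the application, lookups avoid a round trip to an external database, which is what makes this pattern attractive for high-frequency reads.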