Citigroup Uses Event Streaming to Connect Banking Data at Scale in a Highly Regulated Environment

"You have real-time data: stock prices, things that are ticking. You have data that's pretty static: terms and conditions, things like that. And then you have data that's updating periodically: things like position updates. If you can use a tool like Kafka to pull all that data together and combine it in ways that you display to end users—whether they be traders, salespeople, managers—and you provide them analytics across that data, it's extremely powerful."

See the case study


Gone are the days of end-of-day batch processes and static data locked in silos. Financial institutions and fintech companies need to see their business in near real time, with the ability to respond in milliseconds—not hours. They have to integrate, aggregate, curate, and disseminate data and events within and across production environments, which is why Apache Kafka® is at the epicenter of seismic change in the way they think about data. In this keynote from Kafka Summit 2020, hear from Leon Stiel, Director, Citigroup, on how the company is tackling its data challenges and using event streaming to drive efficiencies and improve customer experiences.
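The quote above describes three kinds of data moving at different speeds: real-time ticks, static reference data, and periodic position updates. As a rough illustration of that enrichment pattern (not Citigroup's actual implementation; all names and figures are hypothetical), the sketch below joins a ticking price stream against two lookup tables, much as a Kafka Streams application would join a KStream against KTables:

```python
# Illustrative sketch of stream-table enrichment. In Kafka terms, each
# structure below would correspond to a topic: ticks arrive on a stream,
# while the two dicts play the role of compacted changelog tables.

static_terms = {"AAPL": "T+2 settlement"}   # static reference data, rarely changes
positions = {"AAPL": 100}                   # updated periodically, not per tick

def on_position_update(symbol: str, qty: int) -> None:
    """Periodic update: overwrite the latest known position for a symbol."""
    positions[symbol] = qty

def enrich(tick: dict) -> dict:
    """Join one real-time tick against the static and periodic tables."""
    symbol = tick["symbol"]
    qty = positions.get(symbol, 0)
    return {
        **tick,
        "terms": static_terms.get(symbol),
        "position": qty,
        "exposure": tick["price"] * qty,  # analytics across combined data
    }

# A position update lands, then a tick arrives and is enriched in place.
on_position_update("AAPL", 150)
view = enrich({"symbol": "AAPL", "price": 190.0})
```

In a real deployment the enriched view would be published back to a Kafka topic for traders, salespeople, and managers to consume; the point here is only the shape of the join, not the infrastructure around it.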

More Customer Stories


How DATEV Used Confluent to Unlock Its Mainframe Data to Build Cloud-Native Applications

DKV Mobility Accelerates Innovation with Confluent Cloud


Demonware Works with Confluent and Apache Kafka to Provide a Platform for Real-Time Data Access