Gone are the days of end-of-day batch processes and static data locked in silos. Financial institutions and fintech companies need to see their business in near real time, with the ability to respond in milliseconds—not hours. They have to integrate, aggregate, curate, and disseminate data and events within and across production environments, which is why Apache Kafka® is at the epicenter of seismic change in the way they think about data. In this keynote from Kafka Summit 2020, hear from Leon Stiel, Director, Citigroup, on how the company is tackling its data challenges and using event streaming to drive efficiencies and improve customer experiences.
You have real-time data: stock prices, things that are ticking. You have data that's pretty static: terms and conditions, things like that. And then you have data that's updating periodically: things like position updates. If you can use a tool like Kafka to pull all that data together and combine it in ways that you display to end users—whether they be traders, salespeople, managers—and you provide them analytics across that data, it's extremely powerful.
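The combination Stiel describes, a fast-moving stream joined against slower-moving tables, maps to Kafka's stream-table join pattern. Below is a minimal Python sketch of that idea; all symbols, quantities, and prices are hypothetical, and a production system would use Kafka Streams or ksqlDB with compacted topics rather than in-memory dicts:

```python
# Static reference data (terms and conditions per instrument),
# loaded once -- the analogue of a compacted Kafka topic.
reference = {
    "AAPL": {"currency": "USD", "lot_size": 100},
    "MSFT": {"currency": "USD", "lot_size": 100},
}

# Periodically updated state: latest position per instrument.
positions = {"AAPL": 250, "MSFT": 0}

def on_position_update(symbol, quantity):
    """Apply a periodic position update (upsert, last write wins)."""
    positions[symbol] = quantity

def on_price_tick(symbol, price):
    """Join a real-time price tick against current positions and static
    reference data, producing one enriched record for display/analytics."""
    qty = positions.get(symbol, 0)
    ref = reference.get(symbol, {})
    return {
        "symbol": symbol,
        "price": price,
        "position": qty,
        "market_value": qty * price,
        "currency": ref.get("currency", "?"),
    }

# Events arriving out of band: a position update, then a price tick.
on_position_update("MSFT", 40)
enriched = on_price_tick("MSFT", 410.0)
print(enriched["market_value"])  # 16400.0
```

The point of the pattern is that the three data speeds never have to be reconciled in batch: the slow tables are simply the latest state, and every fast tick is enriched against them at arrival time.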