
Presentation

Consistent, High-throughput, Real-time Calculation Engines Using Kafka Streams

Kafka Summit London 2023

Building financial-grade applications involves performing complex calculations over a wide range of data from different domains, with challenges including stringent accuracy requirements, latency constraints, and the need to share state across distributed services.

In this session, I will cover how, at Morgan Stanley, we built a real-time, microservices-based Liquidity Management platform using event streaming with the Kafka Streams API to handle high volumes of data and to perform calculations on cross-domain events spanning wide time windows over both the past and the future.
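To make the windowed-calculation idea concrete, here is a minimal sketch of a tumbling-window aggregation in plain Java, conceptually similar to what a windowed KStream aggregation does; the class and method names are illustrative, not from the Morgan Stanley platform.

```java
import java.util.*;

// Simplified sketch of a tumbling-window aggregation: events carrying a
// timestamp and a value are bucketed into fixed-size windows and summed,
// keyed by the window's start time. Kafka Streams maintains the same kind
// of per-window aggregate in a windowed state store.
public class WindowedSum {
    // events: each entry is {timestampMs, value}
    public static Map<Long, Double> aggregate(List<long[]> events, long windowSizeMs) {
        Map<Long, Double> sums = new TreeMap<>();
        for (long[] e : events) {
            long windowStart = (e[0] / windowSizeMs) * windowSizeMs; // tumbling bucket
            sums.merge(windowStart, (double) e[1], Double::sum);
        }
        return sums;
    }
}
```

In the real system these windows span past and future business dates rather than wall-clock milliseconds, but the bucketing-and-accumulating shape is the same.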

I will demonstrate how we used Kafka Streams and state stores, along with patterns like Saga, to achieve eventual data consistency, and how we used state-enriched events to decouple services as events flow through multiple business domains. I will also cover mechanisms to ensure accuracy and transparency, with idempotency at their heart, alongside error-detection and replay strategies.
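The idempotency-plus-replay idea can be sketched as an event handler backed by a processed-ID store (in Kafka Streams this would typically be a persistent state store); all names here are illustrative assumptions, not the platform's actual API.

```java
import java.util.*;

// Sketch of idempotent event handling: each event carries a unique ID, and
// an event is applied to the state only the first time that ID is seen.
// Replaying a stream after an error is then safe, because duplicates are
// detected and skipped rather than double-counted.
public class IdempotentProcessor {
    private final Set<String> processedIds = new HashSet<>(); // stand-in for a state store
    private double balance = 0.0;

    // Returns true if the event was applied, false if it was a duplicate.
    public boolean apply(String eventId, double amount) {
        if (!processedIds.add(eventId)) return false; // already processed: skip
        balance += amount;
        return true;
    }

    public double balance() { return balance; }
}
```

With this shape, a replay strategy can simply re-deliver a whole range of events: the net effect on the state is the same as processing each event exactly once.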

Finally, I will look at how we used a high-performance in-memory cache to stage the results of cascaded KStream-based calculation engines, which powered our high-speed, ticking, stateful data visualisations.
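As a rough illustration of the staging pattern, here is a minimal last-value cache in plain Java that downstream visualisations could poll; the real platform used a dedicated high-performance cache product, and everything here is a hypothetical simplification.

```java
import java.util.*;
import java.util.concurrent.*;

// Sketch of staging calculation results for a ticking UI: each cascaded
// calculation engine publishes its newest figure per key, later writes
// overwrite earlier ones, and readers take point-in-time snapshots, so the
// display always renders the latest consistent tick.
public class ResultCache {
    private final ConcurrentMap<String, Double> latest = new ConcurrentHashMap<>();

    public void publish(String key, double value) { latest.put(key, value); }

    // Immutable snapshot for a single render pass.
    public Map<String, Double> snapshot() { return Map.copyOf(latest); }
}
```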
