
Using Kafka as a Database For Real-Time Transaction Processing

You have learned about Kafka event sourcing with streams and using Kafka as a database, but you may be having a tough time wrapping your head around what that means and what challenges you will face. Kafka's exactly-once semantics, data retention rules, and Streams DSL make it a great database for real-time transaction processing. This talk will focus on how to use Kafka events as a database. We will talk about using KTables vs. GlobalKTables and how to apply them to patterns we use with traditional databases. We will walk through a real-world example of joining events against existing data and some issues to be aware of. We will finish by covering some important things to remember about state stores, partitions, and streams to help you avoid problems when your data sets become large.
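To give a concrete sense of the join pattern the talk covers, here is a minimal sketch in the Kafka Java Streams DSL of enriching a stream of events against existing reference data held in a GlobalKTable. The topic names (transactions, customers, enriched-transactions), the String serdes, and the join logic are illustrative assumptions, not the talk's actual code.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class EnrichmentExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Stream of incoming transaction events, keyed by customer id (hypothetical topic).
        KStream<String, String> transactions =
                builder.stream("transactions", Consumed.with(Serdes.String(), Serdes.String()));

        // GlobalKTable: every application instance materializes the full "customers" topic,
        // so this join does not require co-partitioning with the stream.
        GlobalKTable<String, String> customers =
                builder.globalTable("customers", Consumed.with(Serdes.String(), Serdes.String()));

        // Join each event against the existing reference data.
        KStream<String, String> enriched = transactions.join(
                customers,
                (txnKey, txnValue) -> txnKey,                 // map the stream record to the table key
                (txnValue, customer) -> customer + "|" + txnValue);

        enriched.to("enriched-transactions", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "enrichment-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

A KTable join would look similar but requires the stream and table topics to be co-partitioned (same keys and partition count), whereas a GlobalKTable is fully replicated to every instance; that trade-off is part of why the choice between the two matters as data sets grow.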

Moderator

Chad Preisler

Chad has been developing applications in Java for over 20 years. When he isn't writing code, he enjoys spending time with his wife and three kids. Biking, hiking, and camping are some of his favorite things. He is currently working at Northwestern Mutual. Over the last three years he has been designing and building the eSignature system at Northwestern Mutual using Kafka as an event-sourcing backbone. The system comprises more than 30 stream applications written with the Kafka Java Streams DSL and uses Kafka as its transactional database. The system has end-to-end encryption and processes millions of transactions a year.