

Handling eventual consistency in a transactional world

Kafka Summit London 2022

Change data capture (CDC) is a widely used solution for offloading data in real time from legacy systems to Kafka, making it available to all downstream consumer applications. Unlike other solutions, CDC can guarantee both low latency and a very small footprint on the source system. However, when data is moved from a relational database to a distributed streaming platform, what is gained in throughput and latency is lost in strong consistency, and not all consumers are able to manage this loss by themselves. Different upstream solutions can be implemented to mitigate this problem while preserving different levels of consistency.
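As a minimal sketch of how that consistency is lost (all names here are hypothetical, not from the talk): a typical CDC pipeline emits one change event per table, each to its own topic, so a single atomic source transaction that touches two tables becomes two independent events with no cross-topic ordering guarantee. Plain in-memory queues are enough to illustrate the effect:

```python
from queue import Queue

# Hypothetical CDC output: one topic per captured table.
customers_topic: Queue = Queue()
orders_topic: Queue = Queue()

# One source transaction: create a customer and their first order atomically.
# After capture, the atomicity is gone; these are just two separate events.
customers_topic.put({"op": "INSERT", "id": "c-1"})
orders_topic.put({"op": "INSERT", "id": "o-1", "customer_id": "c-1"})

# A consumer of orders_topic may process the order before any consumer of
# customers_topic has seen the customer: downstream, the order references
# a customer that does not exist yet (eventual consistency).
order = orders_topic.get()
print(order["customer_id"])  # "c-1", even though the customer may lag behind
```

In a real deployment the lag between topics depends on connector batching, partition assignment, and consumer speed; the point is only that no component enforces the original transaction boundary.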

In this talk we'll:

  • see what eventual consistency is and where strong consistency is lost when moving data from a database to Kafka
  • describe different solutions to preserve consistency: working at the source level (i.e. the outbox pattern and the callback pattern), on the Kafka topology, or on an external storage (i.e. an integration hub)
  • analyze the pros and cons of all the presented solutions in terms of consistency guarantees and latency cost
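Of the source-level solutions listed above, the outbox pattern is the easiest to sketch. The idea is to write the business change and its event into the same database, in the same transaction, and let CDC capture only the outbox table, so published events always correspond to committed changes. A hedged, self-contained sketch using SQLite to stand in for the source database (table and column names are illustrative, not from the talk):

```python
import json
import sqlite3
import uuid

# In-memory database standing in for the transactional source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, status TEXT)")
conn.execute("CREATE TABLE outbox (id TEXT PRIMARY KEY, topic TEXT, payload TEXT)")

def place_order(order_id: str, status: str) -> None:
    # Both inserts commit atomically: either the order and its event exist,
    # or neither does. A CDC connector tailing only the outbox table would
    # therefore never publish an event for an uncommitted change.
    with conn:
        conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, status))
        conn.execute(
            "INSERT INTO outbox VALUES (?, ?, ?)",
            (
                str(uuid.uuid4()),
                "orders-events",  # hypothetical destination topic name
                json.dumps({"id": order_id, "status": status}),
            ),
        )

place_order("o-1", "CREATED")
events = conn.execute("SELECT topic, payload FROM outbox").fetchall()
print(events)
```

The trade-off discussed in the talk applies here: consistency between state and event is preserved at the source, but every producing service must adopt the extra table and write path.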
