
Presentation

Data Contracts In Practice With Debezium and Apache Flink

Kafka Summit London 2024

Log-based change data capture (CDC) is an invaluable part of the data engineering toolbox: it enables a variety of use cases such as real-time analytics, full-text search, or cache invalidation by publishing data change events from your database. But when publishing change event streams across context or team boundaries, aren’t you tying external consumers to your application’s data model, thus limiting your ability to evolve it?
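
To make that coupling concrete, here is a minimal Java sketch of the shape of a raw, table-level change event. The customers columns are hypothetical, while the before/after/op envelope follows Debezium’s change event format:

```java
// Hypothetical raw change event for a "customers" table, modeled after the
// Debezium envelope (before/after/op/ts_ms). Every internal column leaks into
// the event: rename or split a column, and every consumer of the stream is affected.
public record RawCustomerChangeEvent(
        CustomerRow before,   // row state before the change (null for inserts)
        CustomerRow after,    // row state after the change (null for deletes)
        String op,            // "c" = create, "u" = update, "d" = delete, "r" = snapshot read
        long tsMs             // capture timestamp in milliseconds
) {
    // Mirrors the internal table layout one-to-one; this is the coupling in question.
    public record CustomerRow(long id, String email, String billingAddressJson) {}
}
```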

Enter data contracts—consciously designed abstractions between your internal data model and the outside world. Come and join us for this session to learn about:

  • Challenges you may encounter when exposing table level change event streams and how data contracts can mitigate them
  • Implementation strategies for data contracts, such as the outbox pattern and stream processing (see the sketch after this list)
  • Evolving your data model and the corresponding data contracts without breaking existing consumers
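
As a taste of the first of those strategies, here is a minimal sketch of the outbox pattern using plain JDBC. The orders and outbox tables, column names, and OrderPlaced payload are hypothetical; the point is that the public event is written in the same transaction as the internal state and is shaped by the data contract rather than by the internal table layout. Debezium is then set up to capture the outbox table, so downstream topics see contract-shaped events instead of raw row changes.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.UUID;

// Hypothetical order service demonstrating the outbox pattern: internal state
// and the public "OrderPlaced" event are written atomically in one transaction.
public class OrderService {

    public void placeOrder(Connection conn, String customerId, String itemsJson) throws Exception {
        conn.setAutoCommit(false);
        String orderId = UUID.randomUUID().toString();
        try (PreparedStatement insertOrder = conn.prepareStatement(
                     "INSERT INTO orders (id, customer_id, items) VALUES (?, ?, ?)");
             PreparedStatement insertOutbox = conn.prepareStatement(
                     "INSERT INTO outbox (id, aggregate_type, aggregate_id, event_type, payload) "
                             + "VALUES (?, ?, ?, ?, ?)")) {

            // 1. Write the internal state, using whatever schema suits the service.
            insertOrder.setString(1, orderId);
            insertOrder.setString(2, customerId);
            insertOrder.setString(3, itemsJson);
            insertOrder.executeUpdate();

            // 2. Write the public event into the outbox table, shaped by the data contract.
            insertOutbox.setString(1, UUID.randomUUID().toString());
            insertOutbox.setString(2, "order");
            insertOutbox.setString(3, orderId);
            insertOutbox.setString(4, "OrderPlaced");
            insertOutbox.setString(5, "{\"orderId\":\"" + orderId + "\",\"customerId\":\""
                    + customerId + "\",\"items\":" + itemsJson + "}");
            insertOutbox.executeUpdate();

            // 3. Commit both writes atomically; Debezium picks up the outbox insert.
            conn.commit();
        } catch (Exception e) {
            conn.rollback();
            throw e;
        }
    }
}
```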

We’ll also touch on advanced topics at the intersection of CDC and stream processing, such as hydrating partial change events using the popular change stream processing duo of Debezium and Apache Flink.
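
For a flavor of what hydration can look like, here is a minimal Flink DataStream sketch. It assumes partial change events arrive as maps containing the primary key plus only the changed columns (an illustrative shape, not Debezium’s actual classes): keyed state remembers the last known full row per key, and each partial event is merged into it before a fully populated row is emitted downstream.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

// Hydrates partial change events: merges the changed columns of each event
// into the last known full row for that key and emits the complete row.
public class HydrateChanges
        extends KeyedProcessFunction<String, Map<String, String>, Map<String, String>> {

    private transient ValueState<Map<String, String>> lastKnownRow;

    @Override
    public void open(Configuration parameters) {
        lastKnownRow = getRuntimeContext().getState(
                new ValueStateDescriptor<>("lastKnownRow", Types.MAP(Types.STRING, Types.STRING)));
    }

    @Override
    public void processElement(Map<String, String> partialChange,
                               Context ctx,
                               Collector<Map<String, String>> out) throws Exception {
        // Start from the last full row seen for this key, if any.
        Map<String, String> stored = lastKnownRow.value();
        Map<String, String> merged = stored == null ? new HashMap<>() : new HashMap<>(stored);

        // Overlay only the columns present in the partial event, then remember the result.
        merged.putAll(partialChange);
        lastKnownRow.update(merged);

        // Downstream consumers receive a fully hydrated row.
        out.collect(merged);
    }
}
```

Applied to a stream keyed by the primary key, e.g. changes.keyBy(e -> e.get("id")).process(new HydrateChanges()), this turns a stream of partial updates into a stream of full row images.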
