
Presentation

Change Data Capture Pipelines with Debezium and Kafka Streams

Kafka Summit 2020

Change data capture (CDC) via Debezium is liberation for your data: by capturing changes from the log files of the database, it enables a wide range of use cases such as reliable data exchange between microservices, the creation of audit logs, cache invalidation, and much more.
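To get a feel for what these change events look like on the wire, here is a minimal sketch (not from the talk) of a plain Java consumer reading a Debezium change event topic. The broker address and the topic name `dbserver1.inventory.customers` are assumptions, following Debezium's usual `<server>.<schema>.<table>` naming.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ChangeEventConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "cdc-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Debezium publishes one topic per captured table; the name below is an example.
            consumer.subscribe(List.of("dbserver1.inventory.customers"));

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Each value is a change event envelope carrying "before", "after",
                    // "op" (c/u/d) and "source" metadata fields.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```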

In this talk we're taking CDC to the next level by exploring the benefits of integrating Debezium with streaming queries via Kafka Streams. Come and join us to learn:
• How to run low-latency, time-windowed queries on your operational data (see the sketch after this list)
• How to enrich audit logs with application-provided metadata
• How to materialize aggregate views based on multiple change data streams while preserving the transactional boundaries of the source database
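As an illustration of the first point, here is a minimal Kafka Streams sketch (not from the talk) that counts change events per key in one-minute windows. The topic name `dbserver1.inventory.orders` and the string serdes are assumptions; a real application would deserialize the Debezium envelope with a JSON or Avro Serde.

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.TimeWindows;

public class OrderCountTopology {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "cdc-windowed-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical Debezium topic; change events are treated as opaque strings here.
        builder.stream("dbserver1.inventory.orders", Consumed.with(Serdes.String(), Serdes.String()))
                .groupByKey()
                .windowedBy(TimeWindows.of(Duration.ofMinutes(1)))
                .count()
                .toStream()
                .foreach((window, count) ->
                        System.out.printf("window=%s count=%d%n", window, count));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```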

We'll also show how to leverage the Quarkus stack for running your Kafka Streams applications on the JVM as well as natively via GraalVM, with many goodies included, such as its live coding feature for instant feedback during development, health checks, metrics, and more.
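As a rough sketch of how this can look with the quarkus-kafka-streams extension (assuming Quarkus 3.x with the `jakarta.*` namespace; the topic names are made up), the application only exposes its Topology as a CDI bean and Quarkus manages the KafkaStreams lifecycle:

```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;

@ApplicationScoped
public class TopologyProducer {

    // With the quarkus-kafka-streams extension, Quarkus starts and stops KafkaStreams;
    // the application only supplies the processing topology via this CDI producer.
    @Produces
    public Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical Debezium topic; forward change events to another topic as-is.
        builder.stream("dbserver1.inventory.customers",
                        Consumed.with(Serdes.String(), Serdes.String()))
                .to("customers-processed");

        return builder.build();
    }
}
```

Application id, bootstrap servers, and the topics to wait for are then supplied via configuration properties such as `quarkus.kafka-streams.application-id` rather than in code.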
