KSQL in Action: Enriching CSV Events with Data from RDBMS into AWS
Life would be simple if data lived in one place: one single solitary database to rule them all. Anything that needed to be joined to anything could be with a
One of the most frequent questions and topics that I see come up on community resources such as StackOverflow, the Confluent Platform mailing list, and the Confluent Community Slack group,
In this post I’m going to show what streaming ETL looks like in practice. We’re replacing batch extracts with event streams, and batch transformation with in-flight transformation. But first, a
KSQL is the streaming SQL engine for Apache Kafka®. It lets you do sophisticated stream processing on Kafka topics, easily, using a simple and interactive SQL interface. In this short
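To give a flavour of that interactive SQL interface, here is a minimal sketch of a KSQL session; the `orders` topic name, its columns, and the filter predicate are assumptions made up for illustration.

```sql
-- Declare a stream over an existing Kafka topic
-- (topic name and schema here are illustrative assumptions)
CREATE STREAM orders (order_id INT, product VARCHAR, price DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- A continuous query: every new event on the topic that matches
-- the predicate is emitted as it arrives
SELECT order_id, product, price
  FROM orders
 WHERE price > 100;
```

Unlike a query against a relational database, the `SELECT` above does not terminate with a result set; it keeps running and streams matching events as they are produced to the topic.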
We saw in the earlier articles (part 1, part 2) in this series how to use the Kafka Connect API to build out a very simple, but powerful and scalable, streaming
In the previous article in this blog series I showed how easy it is to stream data out of a database into Apache Kafka®, using the Kafka Connect API. I
This short series of articles is going to show you how to stream data from a database (MySQL) into Apache Kafka® and from Kafka into both a text file and Elasticsearch—all
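As a rough sketch of the database-to-Kafka half of that pipeline, a Kafka Connect JDBC source connector can be configured with a properties file along these lines; the connection URL, table name, and column name are placeholder assumptions for the example.

```properties
# Illustrative JDBC source connector config (hostname, database,
# table, and column names are assumptions, not from the article)
name=mysql-source-demo
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/demo
# Poll for new rows based on a monotonically increasing id column
mode=incrementing
incrementing.column.name=id
table.whitelist=customers
# Each table's rows land on a topic named <prefix><table>
topic.prefix=mysql-
```

With this in place, new rows in the `customers` table would appear as events on the `mysql-customers` topic, ready to be consumed by a file sink or an Elasticsearch sink connector.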
Note As of February 2021, Confluent has launched its own Oracle CDC connector. Read this blog post for the latest information.