Getting Started with the Kafka Streams API using Confluent Docker Images
Introduction

What’s great about the Kafka Streams API is not just how fast your application can process data with it, but also how fast you can get up and running.
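To give a sense of how little code a first Streams application needs, here is a minimal sketch in Java. It assumes a Kafka broker started from the Confluent Docker images (e.g. confluentinc/cp-kafka) is reachable at localhost:9092, and that the topics input-topic and output-topic exist; the class name, topic names, and application id are illustrative, not from the original post.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class StreamsStarterApp {
    public static void main(String[] args) {
        // Minimal configuration: an application id and the broker address.
        // localhost:9092 assumes a broker from the Confluent Docker images is
        // exposed on the host; adjust to your environment.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-starter-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // A trivial topology: read records from an input topic, upper-case each
        // value, and write the results to an output topic.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic");
        input.mapValues(value -> value.toUpperCase())
             .to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the Streams instance cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the Streams API is just a library, this runs as a plain Java application alongside the Dockerized broker; there is no separate processing cluster to provision.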