

From Bytes to Objects: Describing Your Events

Kafka Summit Americas 2021

Events stored in Kafka are just bytes; this is one of the reasons Kafka is so flexible. But when developing a producer or consumer, you want objects, not bytes. Documenting and defining events gives teams a common way to discuss and agree on an approach to using Kafka. It also lets developers consume events without needing access to the developers responsible for producing them.
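To make the bytes-versus-objects point concrete, here is a minimal sketch (not from the talk) of the convention a producer and consumer must share. The event shape and field names are hypothetical, and UTF-8 encoded JSON stands in for whichever serialization format you choose:

```python
import json

# Kafka itself stores and transports only bytes. Producers and consumers
# must agree on how to turn objects into bytes and back.
# A hypothetical event as a Python dict:
event = {"type": "user.signup", "userId": "u-123"}

# Producer side: object -> bytes (here, UTF-8 encoded JSON).
payload: bytes = json.dumps(event).encode("utf-8")

# Consumer side: bytes -> object. This round-trip only works because both
# sides share the same convention for the byte format, which is exactly
# what documenting your events pins down.
decoded = json.loads(payload.decode("utf-8"))
assert decoded == event
```

Any of the formats discussed below can play the role JSON plays here; what matters is that the convention is written down and shared.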

This talk will introduce the most popular formats for documenting events that flow through Kafka, such as AsyncAPI, Avro, CloudEvents, JSON Schema, and Protobuf.
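As a flavor of what such documentation looks like, here is a minimal Avro-style record schema for a hypothetical "OrderCreated" event (the event name and fields are illustrative, not from the talk). Avro schemas are themselves plain JSON, so they can live next to the code or in a schema registry shared between teams:

```python
import json

# A minimal Avro record schema sketch for a hypothetical event.
# Field names and types are assumptions for illustration only.
order_created_schema = {
    "type": "record",
    "name": "OrderCreated",
    "namespace": "com.example.orders",
    "fields": [
        {"name": "orderId", "type": "string"},
        {"name": "amountCents", "type": "long"},
        {"name": "currency", "type": "string", "default": "USD"},
    ],
}

# Because the schema is plain JSON, both producer and consumer teams can
# read it without any access to each other's codebases.
print(json.dumps(order_created_schema, indent=2))
```

The other formats differ in scope: Avro, Protobuf, and JSON Schema describe the payload, while AsyncAPI and CloudEvents also describe the channels and metadata around it.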

It will discuss the differences between the approaches and how to decide on the documentation strategy that is right for you. Alongside the formats, this session will also look at the tooling available for the different approaches. Tools for testing and code generation can make a big difference to your day-to-day developer experience. If you aren't already documenting your events, or want to see other approaches, then this is the talk for you.
