

Why you should have a Schema Registry

Kafka Summit Europe 2021

Kafka moves blobs of data from one place to another. That's its job. Kafka doesn't care what the blob is or what it looks like. This can be a boon: it keeps Kafka simple and enables a multitude of use cases. It can also be a curse in those cases when you DO want control over what that blob looks like. Especially when you share a topic with another team, it is important to have clear-cut rules for what you allow on that topic and what you don't. In other words, you need a clearly defined interface contract. In the RESTful world the case is clear: you define an OpenAPI spec and hand it to the other team. Done. What about the event streaming case, though? Would you treat your topic like an API? If you're not sure about the answer, this talk is for you. You'll learn about the Schema Registry, a centralized data governance tool that allows you to define, and more importantly enforce, interface contracts among Kafka clients.
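To make the idea concrete, here is a minimal, self-contained sketch of what "a registry that defines and enforces a contract" means. This is NOT the real Confluent Schema Registry API; the names (`ToySchemaRegistry`, `validate`) and the simplified type mapping are hypothetical, purely to illustrate the concept of storing versioned schemas per subject and rejecting records that break the contract:

```python
# Illustrative sketch only -- the real Schema Registry is a networked
# service and clients talk to it over REST; this toy keeps everything
# in memory to show the contract-enforcement idea.

class ToySchemaRegistry:
    """Stores versioned schemas per subject (e.g. '<topic>-value')."""

    def __init__(self):
        self._subjects = {}  # subject -> list of schema dicts

    def register(self, subject, schema):
        versions = self._subjects.setdefault(subject, [])
        versions.append(schema)
        return len(versions)  # 1-based version number

    def latest(self, subject):
        return self._subjects[subject][-1]


def validate(record, schema):
    """Reject records whose fields don't match the registered contract."""
    expected = {f["name"]: f["type"] for f in schema["fields"]}
    if set(record) != set(expected):
        return False
    py_types = {"string": str, "int": int}  # toy subset of Avro types
    return all(isinstance(record[k], py_types[t]) for k, t in expected.items())


registry = ToySchemaRegistry()
registry.register("orders-value", {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "id", "type": "int"},
        {"name": "customer", "type": "string"},
    ],
})

schema = registry.latest("orders-value")
print(validate({"id": 1, "customer": "alice"}, schema))       # True
print(validate({"id": "oops", "customer": "alice"}, schema))  # False
```

In the real system the producer's serializer performs this step: it fetches the schema for the topic's subject, serializes the record against it (failing if the record doesn't conform), and the registry additionally checks compatibility rules when a new schema version is registered.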
