The concept of the Data Mesh is making headway in enterprise data design, fueled by core principles of contextual data domains, local governance, and decentralized integration. Kafka makes the data mesh scalable and resilient through event sourcing and replication. But how do you join multiple data domains on a single node of your mesh, so that they all stay consistent on the same data changes, without falling back to the central data store?
In this session, we’ll introduce the concept of the Canonical Stream: an ordered, declarative event stream of information about a thing that exists in the real world, with its own context and governance. The Canon is technology-agnostic and data-context-agnostic. Events on the Canon provide updates about the thing itself, and must be consumed and interpreted differently by each data domain.
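To make the idea concrete, here is a minimal sketch (not from the session itself; the CanonicalEvent type and the domain handler names are illustrative assumptions) of how a single declarative event about a real-world entity might be interpreted differently by two consuming domains. In practice, the canon would be a Kafka topic keyed by entity ID; this sketch uses a plain in-memory list to keep the example self-contained:

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;

public class CanonicalStreamSketch {

    // A declarative event on the Canon: it states a fact about the entity
    // itself, carrying no domain-specific instructions.
    record CanonicalEvent(String entityId, Instant occurredAt,
                          Map<String, String> attributes) {}

    // Each data domain consumes the same events, but applies its own meaning.
    interface DomainHandler {
        void interpret(CanonicalEvent event);
    }

    // Billing domain: reads an address change as an invoicing update.
    static class BillingHandler implements DomainHandler {
        @Override
        public void interpret(CanonicalEvent event) {
            String address = event.attributes().get("address");
            if (address != null) {
                System.out.printf("billing: update invoice address for %s -> %s%n",
                        event.entityId(), address);
            }
        }
    }

    // Shipping domain: reads the same attribute as a delivery destination.
    static class ShippingHandler implements DomainHandler {
        @Override
        public void interpret(CanonicalEvent event) {
            String address = event.attributes().get("address");
            if (address != null) {
                System.out.printf("shipping: reroute open deliveries for %s -> %s%n",
                        event.entityId(), address);
            }
        }
    }

    public static void main(String[] args) {
        // One ordered stream of canonical events (a stand-in for the topic).
        List<CanonicalEvent> canon = List.of(
                new CanonicalEvent("customer-42", Instant.now(),
                        Map.of("address", "221B Baker Street")));

        List<DomainHandler> domains =
                List.of(new BillingHandler(), new ShippingHandler());

        // Every domain sees every event, in order, and stays consistent on
        // the same change without a round trip to a central data store.
        for (CanonicalEvent event : canon) {
            domains.forEach(domain -> domain.interpret(event));
        }
    }
}
```

The design point the sketch illustrates: consistency comes from every domain reading the same ordered stream, while interpretation stays local to each domain.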
You’ll learn: