

Practical Data Mesh: Building Decentralized Data Architectures with Event Streams

Data Mesh Architectures with Event Streams

Why a data mesh?

Predicated on delivering data as a first-class product, data mesh focuses on making it easy to publish and access important data across your organization. An event-driven data mesh combines the scale and performance of data in motion with product-focused rigor and self-service capabilities, putting data front and center in both operational and analytical use cases.

Underpinned by four major principles, data mesh combines a renegotiation of social responsibilities with modern event streaming technologies. The result is a network of continually updating event streams that provide both fresh information and a historical record, enabling consumers to choose and use the data as they see fit.
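That "fresh information plus historical record" property comes from the append-only log abstraction underpinning event streams. As a rough, broker-free sketch (plain Python standing in for a Kafka topic, not the actual client API), a consumer can replay the full history from the start and then keep reading from the tail as new events arrive:

```python
from dataclasses import dataclass, field

@dataclass
class EventStream:
    """Toy append-only log. Illustrative only -- a real Kafka topic
    adds partitions, retention, consumer groups, etc."""
    events: list = field(default_factory=list)

    def publish(self, event: dict) -> None:
        self.events.append(event)      # producers only ever append

    def read_from(self, offset: int) -> list:
        return self.events[offset:]    # each consumer picks its own start point

orders = EventStream()
orders.publish({"order_id": 1, "status": "created"})
orders.publish({"order_id": 1, "status": "shipped"})

# A new consumer replays the full historical record...
history = orders.read_from(0)

# ...then resumes from its last offset to pick up fresh events.
orders.publish({"order_id": 2, "status": "created"})
fresh = orders.read_from(len(history))
```

The point of the sketch: because the log is durable and ordered, "catching up on history" and "following live updates" are the same read operation started at different offsets.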

Author Adam Bellemare explains how an event-driven data mesh built on top of Apache Kafka® provides the optimal way to access important business data and unify the operational and analytical planes. He also walks you through a proof-of-concept self-service platform, built using Confluent Cloud, that ties the data mesh principles to real-world technical tradeoffs.

Don’t just take it from us: a real-world case study illustrates Saxo Bank’s implementation of an event-driven data mesh, including their challenges, technology choices, and application of the data mesh principles.

  • Review a brief history of data problems, including those faced by data warehouses and data lakes
  • Get a full overview of data mesh, including the four main principles and how they apply to event streams
  • Learn principles for building a data product, including data product alignment, integration with sources, and building for consumer use cases
  • Adopt guidelines for creating a self-service platform for data product owners and consumers, streamlining the creation, management, discovery, and usage of data products
  • Explore how an event-driven data mesh reduces barriers between operational and analytical use cases, while simultaneously enabling both real-time applications and batch-based jobs
  • Learn about Saxo Bank’s data mesh implementation and technical challenges, as well as their recommendations for success
  • Explore the Confluent data mesh proof-of-concept self-service platform, check out the source code, and try it for yourself
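To make the operational/analytical point above concrete: the same stream of events can drive a real-time application and a batch-style aggregation, with no separate export pipeline between the two planes. The sketch below is plain Python with a hypothetical event shape; a real implementation would use Kafka consumers or a stream processor such as Kafka Streams or Flink:

```python
from collections import Counter

# One shared stream of purchase events (hypothetical shape, for illustration).
events = [
    {"user": "alice", "amount": 30},
    {"user": "bob", "amount": 75},
    {"user": "alice", "amount": 120},
]

# Operational use case: react to each event as it arrives,
# e.g. flag large purchases in real time.
alerts = []
for event in events:
    if event["amount"] > 100:
        alerts.append(event["user"])

# Analytical use case: aggregate the very same events in a batch pass.
purchases_per_user = Counter(e["user"] for e in events)
total_revenue = sum(e["amount"] for e in events)
```

Both consumers read the same data product; neither needs the other to reshape or re-export the data first, which is the barrier the event-driven mesh removes.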


Adam Bellemare

Staff Technologist

Adam is a technologist in the Technology Strategy Group at Confluent. He has worked on a wide range of projects, including event-driven data mesh theory and proofs of concept, event-driven microservice strategies, and event and event stream design principles.

Before Confluent, Adam worked at several e-commerce companies as a big data platform engineer, focused on building event-driven solutions with Apache Kafka. Adam is the author of Building Event-Driven Microservices (O'Reilly, 2020).

Get the eBook

Additional Resources

Confluent Cloud demo
Kafka microservices
