eBook

Practical Data Mesh: Building Decentralized Data Architectures with Event Streams

Why a data mesh?

Predicated on delivering data as a first-class product, data mesh focuses on making it easy to publish and access important data across your organization. An event-driven data mesh combines the scale and performance of data in motion with product-focused rigor and self-service capabilities, putting data front and center for both operational and analytical use cases.

Underpinned by four major principles, data mesh is a renegotiation of social responsibilities in combination with modern event streaming technologies. The result is a network of continually updating event streams that provide both fresh information and a historical record, enabling consumers to choose and use the data as they see fit.
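The dual role of an event stream described above can be illustrated with a minimal sketch. This is a plain-Python stand-in for a Kafka topic (not the eBook's actual platform code): an append-only log lets one consumer replay the full history from offset 0 while another tails only the newest events.

```python
from dataclasses import dataclass, field

@dataclass
class EventStream:
    """A toy append-only log standing in for a Kafka topic (illustrative only)."""
    events: list = field(default_factory=list)

    def publish(self, event: dict) -> int:
        # Producers append events; the returned offset orders the log.
        self.events.append(event)
        return len(self.events) - 1

    def read_from(self, offset: int = 0) -> list:
        # Consumers pick their starting point: offset 0 replays all history;
        # the latest offset yields only fresh events.
        return self.events[offset:]

# A data product team publishes domain events...
orders = EventStream()
orders.publish({"order_id": 1, "status": "created"})
orders.publish({"order_id": 1, "status": "shipped"})

# ...an analytical consumer rebuilds state from the full historical record,
history = orders.read_from(0)
# ...while an operational consumer subscribes near the head for fresh data only.
latest = orders.read_from(len(orders.events) - 1)
```

In Kafka itself, the same choice is expressed through consumer group offsets: a new consumer can start from the earliest offset to replay history or from the latest to receive only new events.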

Author Adam Bellemare explains how an event-driven data mesh built on top of Apache Kafka® provides the optimal way to access important business data and unify the operational and analytical planes. He also walks you through a proof-of-concept self-service platform built using Confluent Cloud that ties together the data mesh principles with real-world technical tradeoffs.

Don’t just take it from us: a real-world case study illustrates the implementation of an event-driven data mesh at Saxo Bank, including their challenges, technology choices, and application of the data mesh principles.

  • Get a brief history of data problems, including those encountered in data warehouses and data lakes
  • Get a full overview of data mesh, including the four main principles and how they apply to event streams
  • Learn principles for building a data product, including data product alignment, integrating with sources, and building for consumer use cases
  • Adopt guidelines for creating a self-service platform for data product owners and consumers, streamlining the creation and management of data products as well as their discovery and usage
  • Explore how an event-driven data mesh reduces barriers between operational and analytical use cases, while simultaneously enabling both real-time applications and batch-based jobs
  • Learn about Saxo Bank’s data mesh implementation and technical challenges, as well as their recommendations for success
  • Explore the Confluent data mesh proof-of-concept self-service platform, check out the source code, and try it for yourself

Author

Adam Bellemare

Staff Technologist, Office of the CTO

Adam Bellemare is a staff technologist at Confluent, and formerly a data platform engineer at Shopify, Flipp, and BlackBerry. He has worked in the data space for over a decade, with a successful history in event-driven microservices, distributed data architectures, and integrating streaming data across organizations. He is also the author of the O’Reilly titles “Building Event-Driven Microservices” and “Building an Event-Driven Data Mesh.”

Get the eBook

Additional Resources

