
eBook

Practical Data Mesh: Building Decentralized Data Architectures with Event Streams

Data Mesh Architectures with Event Streams

Why a data mesh?

Predicated on delivering data as a first-class product, data mesh focuses on making it easy to publish and access important data across your organization. An event-driven data mesh combines the scale and performance of data in motion with product-focused rigor and self-service capabilities, putting data front and center in both operational and analytical use cases.

Underpinned by four major principles, data mesh pairs a renegotiation of social responsibilities with modern event streaming technologies. The result is a network of continually updating event streams that provide both fresh information and a historical record, enabling consumers to choose and use the data as they see fit.

Author Adam Bellemare explains how an event-driven data mesh built on top of Apache Kafka® provides the optimal way to access important business data and unify the operational and analytical planes. He also walks you through a proof-of-concept self-service platform built using Confluent Cloud that ties the data mesh principles together with real-world technical tradeoffs.

Don’t just take it from us: a real-world case study illustrates how Saxo Bank implemented an event-driven data mesh, including their challenges, technology choices, and application of the data mesh principles.

  • Review a brief history of data problems, including those faced in data warehouses and data lakes
  • Get a full overview of data mesh, including the four main principles and how they apply to event streams
  • Learn principles for building a data product, including data product alignment, integration with sources, and building for consumer use cases
  • Adopt guidelines for creating a self-service platform for data product owners and consumers, streamlining the creation, management, discovery, and usage of data products
  • Explore how an event-driven data mesh reduces barriers between operational and analytical use cases, while enabling both real-time applications and batch-based jobs
  • Learn about Saxo Bank’s data mesh implementation and technical challenges, as well as their recommendations for success
  • Explore the Confluent data mesh proof-of-concept self-service platform, check out the source code, and try it for yourself

Author

Adam Bellemare

Staff Technologist

Adam is a technologist in the Office of the CTO at Confluent. He has worked on numerous projects, including event-driven data mesh theory and proofs of concept, event-driven microservices strategies, and principles for designing events and event streams.

Before joining Confluent, Adam was a Big Data platform engineer at several e-commerce companies, specializing in designing event-driven solutions with Apache Kafka. Adam is the author of Building Event-Driven Microservices (O'Reilly, 2020).

Download the eBook

