Build a Data Mesh with Event Streams

Data mesh architectures simplify how teams publish and access important data across their organization.

The data mesh concept redefines how teams share responsibilities for data management and take advantage of event streaming technologies. Organizations that implement this architecture can stand up networks of event streams that deliver data as a first-class product—providing fresh information, preserving a historical record, and enabling consumers to use the data as they see fit.

In this ebook, Adam Bellemare explains how building an event-driven data mesh on top of Apache Kafka® can unify the operational and analytical planes. He also walks you through:

  • Four principles that underpin data mesh architecture
  • Guidelines for building a self-service platform that lets data product owners and consumers create, manage, discover, and use data products
  • A proof-of-concept self-service platform built using Confluent Cloud, including its source code
  • A real-world case study from Saxo Bank covering its challenges, technology choices, data mesh implementation, and recommendations for success

Get the ebook