

Building an Event Driven Global Data Fabric with Apache Kafka

Agencies are grappling with the growing challenge of distributing data across a geographically diverse set of locations, both in the US and globally. To ensure mission success, data needs to flow to all of these locations rapidly, yet the latency, bandwidth, and reliability of communication links can prove to be a challenge. A global data fabric is an emerging approach to connecting missions to data across multiple locations while delivering uniformity and consistency at scale.

This on-demand webinar will cover:

  • An overview of Apache Kafka and how an event streaming platform can support your agency's mission
  • Considerations around handling varying quality communication links
  • Synchronous vs asynchronous data replication
  • New multi-region capabilities in Confluent Platform for building a global data fabric
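The synchronous-versus-asynchronous replication trade-off above maps directly onto Confluent Platform's multi-region cluster feature: full replicas join the in-sync replica set and acknowledge writes synchronously, while observer replicas copy data asynchronously without delaying producer acknowledgements. As a rough sketch, a replica placement policy for a topic might look like the following (the rack names `us-east` and `us-west` are assumed for illustration):

```json
{
  "version": 1,
  "replicas": [
    {"count": 2, "constraints": {"rack": "us-east"}}
  ],
  "observers": [
    {"count": 2, "constraints": {"rack": "us-west"}}
  ]
}
```

With a placement like this, a producer using `acks=all` waits only on the two `us-east` replicas, so a slow or degraded link to `us-west` does not stall writes, while the observers still keep the remote region's copy asynchronously up to date.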


Jeffrey Levy

Deputy Chief of Digital Services

Will LaForest

Chief Technology Officer, US Public Sector

Watch Now