Created by the original creators of Apache Kafka, Confluent brings a wide array of talks with leading technologists and industry experts.
Maygol will walk us through the new streaming data pipeline demo, which showcases a financial services use case: building streaming data pipelines from on-prem Oracle Database and RabbitMQ systems to migrate data to MongoDB in the cloud.
Every aspect of the financial services industry is undergoing some form of transformation. By leveraging the power of real-time data streaming, financial firms can drive personalized customer experiences, proactively mitigate cyber risk, and ensure regulatory compliance.
What is data mesh and why is it gaining rapid traction among data teams? Join us on February 13 to talk with Michele Goetz, VP, Principal Analyst at Forrester for a deep dive into the business benefits of a data mesh architecture.
In this two-part series, you’ll get an overview of what Kafka is, what it's used for, and the core concepts that enable it to power a highly scalable, available, and resilient real-time event streaming platform.
Learn how Apache Kafka® on Confluent Cloud streams massive data volumes to time series collections via the MongoDB Connector for Apache Kafka®.
Join Ryan James, Chief Data Officer of Vitality Group, to learn how Vitality Group future-proofed its event-driven microservices with Confluent and AWS.
This fireside chat will cover Suman’s learnings from implementing two key use cases at Walmart that continue to play a critical role in customer satisfaction: real-time inventory and real-time replenishment.
During the pandemic the Asia Pacific region (APAC) fared relatively well compared to its US and European counterparts. But is the pressure of global inflation and the threat of recession finally catching up?
Join this webinar to find out how a data mesh can bring much-needed order to a system in both cases, resulting in a more mature, manageable, and evolvable data architecture.