Founded by the original creators of Apache Kafka, Confluent brings you a wide array of talks with leading technologists and industry experts.
This three-part online talk series introduces key concepts, use cases, and best practices for getting started with microservices.
In this session, we'll explore how Confluent helps companies modernize their database strategy with Confluent Cloud and Azure data services such as Cosmos DB. Confluent accelerates moving data to the cloud and reduces costs by implementing a central-pipeline architecture built on Apache Kafka.
Join us for a session on August 23rd during which we’ll break down the concept of event-driven microservices and how streaming data pipelines play a critical role in interservice communication.
In this three-part series, you'll get an overview of what Kafka is, what it's used for, and the core concepts that enable it to power a highly scalable, available, and resilient real-time event streaming platform.
This forum will walk through the story of a bank that uses an Oracle database to store sensitive customer information and RabbitMQ as the message broker for credit card transaction events.
Subscribe to the content categories of your choice and be auto-registered for our next session.