Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Confluent Cloud Freight clusters are now Generally Available on AWS. In this blog, learn how Freight clusters can save you up to 90% at GBps+ scale.
Build event-driven agents on Apache Flink® with Streaming Agents on Confluent Cloud—fresh context, MCP tool calling, real-time embeddings, and enterprise governance.
Explore the hidden costs of real-time streaming—compare infrastructure, ops, and ROI between Confluent Cloud and self-managed Apache Kafka®. Learn how auto-scaling and governance lower TCO.
Learn how to build a real-time compliance and audit logging pipeline using Apache Kafka® or the Confluent data streaming platform with architecture details and best practices including schemas, immutability, retention, and more.
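The immutability idea behind such an audit pipeline can be illustrated outside Kafka itself. The following is a minimal, stand-alone sketch of a tamper-evident, append-only log in which each record carries the hash of its predecessor; the `AuditEvent` fields and helper names here are hypothetical, not part of any Confluent API.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)  # frozen=True makes each record immutable once created
class AuditEvent:
    actor: str
    action: str
    resource: str
    prev_hash: str  # digest of the previous event, forming a tamper-evident chain

    def digest(self) -> str:
        # Canonical JSON serialization so the hash is stable across runs
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

def append(log: list, actor: str, action: str, resource: str) -> None:
    # Link the new event to the digest of the most recent one
    prev = log[-1].digest() if log else "genesis"
    log.append(AuditEvent(actor, action, resource, prev_hash=prev))

def verify(log: list) -> bool:
    # Walk the chain; any edited or reordered record breaks a link
    prev = "genesis"
    for event in log:
        if event.prev_hash != prev:
            return False
        prev = event.digest()
    return True
```

In a real deployment the same property comes from Kafka's append-only, retention-governed topics rather than application code; the sketch only shows why rewriting history is detectable.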
Use Confluent’s open source Connect migration utility to discover, assess, map, validate, and cut over self-managed Kafka connectors to fully managed ones in minutes.
Learn how software engineer Akshatha has grown as a strategic thinker and added new technical skills since joining Confluent’s Developer Productivity team.
Learn how to manage connectors in Confluent Cloud as code using the Confluent Terraform Provider—complete with the role bindings and access controls needed to integrate external systems with Apache Kafka.
Confluent, powered by Kafka, is the real-time backbone for agentic systems built with Google Cloud. It enables agents to access fresh data (MCP) and communicate seamlessly (A2A) via a decoupled architecture. This ensures scalability, resilience, and observability for complex, intelligent workflows.
AWS Lambda's Kafka Event Source Mapping now supports Confluent Schema Registry. This update simplifies building event-driven applications by eliminating the need for custom code to deserialize Avro/Protobuf data. The integration makes it easier and more efficient to leverage Confluent Cloud.
Confluent’s Cluster Linking enables fully managed, offset-preserving Kafka replication across clouds. It supports public and private networking, enabling use cases like disaster recovery, data sharing, and analytics across AWS, Azure, Google Cloud, and on-premises clusters.
Confluent Cloud now offers native Kafka Streams health monitoring to simplify troubleshooting. The new UI provides at-a-glance application state, performance ratios to pinpoint bottlenecks (code vs. cluster), and state store metrics.
At Current 2025 in New Orleans (Oct 29–30), developers, data engineers, operators, architects, and tech execs will unlock real-time data and AI insights.
Confluent provides customers and prospects with a full package to build trust and innovate securely, including technical documentation, foundational principles, and a new level of transparency.
Powering analytics and AI requires reliable, consistent, and easily discoverable data to reach the data lake. To meet these needs, strong and holistic governance is an important part of building better platforms for getting from raw data to valuable insights and actions.
See how Confluent and its partner ecosystem are making it easier to use real-time data streaming as the fuel for your agentic AI and advanced analytics applications.
Learn how to scale Kafka Streams applications to handle massive throughput with partitioning, scaling strategies, tuning, and monitoring.
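As a taste of the tuning side, a Kafka Streams application exposes its parallelism and commit behavior through standard configuration properties; the values below are illustrative starting points, not recommendations for any particular workload.

```properties
# Identity and brokers (placeholder values)
application.id=orders-aggregator
bootstrap.servers=broker:9092

# Scaling: threads within one instance; total parallelism is
# still capped by the input topics' partition count
num.stream.threads=4

# Tuning: how often offsets and state are committed
commit.interval.ms=100

# Monitoring: record finer-grained metrics for per-task visibility
metrics.recording.level=DEBUG
```

Horizontal scaling then comes from running more instances with the same `application.id`, letting the group rebalance tasks across them.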