
Context-Driven AI Reigned Supreme at Current New Orleans

Written by Mekhala Roy and Zion Samuel

AI is redefining what it means to build a data-driven business. It’s no longer just about mining insights from data—it’s about creating intelligent systems that can understand the state of the business and act on its behalf in real time. And in this new era, context data is king.

That was the recurring theme at Current New Orleans, the data streaming event that drew thousands of attendees, both in person and online.

Product & Partner News + Can't-Miss Customer Stories From Current New Orleans

Current New Orleans attendees heard news from across the Confluent ecosystem, including the launches of Confluent Intelligence, Confluent Private Cloud, and Unified Stream Manager for Confluent Platform; production-ready Tableflow and Cluster Linking releases for Confluent Cloud; and a welcome for the Airy team. Read the recap to learn what’s next for data streaming and AI.

Want to see what happened at Current New Orleans for yourself? Explore all the recorded sessions on demand.

How Real-Time Data Drives Intelligent Systems

Attendees took home insights into why AI agents must understand, learn from, and act on real business data that’s context-rich and constantly evolving. Without that, no intelligent system can truly perform effectively for real-world use cases—like agentic customer service, outcome-based billing, and even software development. And they heard firsthand from technical leaders at Anthropic, Metronome, and Marriott why accessing and harnessing that data mandates a real-time streaming foundation.

“The big problem is this data is being processed in batch…and for feeding context to a system that’s making real-time decisions, that’s going to be interacting as part of the business—this is a complete non-starter,” said Confluent CEO and Co-Founder Jay Kreps during the opening keynote at Current New Orleans. 

Jay Kreps Gives the Day 1 Keynote at Current New Orleans

While the success of these intelligent systems depends on both the models and the context data, most organizations have little control over advancing the foundation models themselves, he said. But what they do have is control over the data that feeds them.

“The context data that is spread all across the organization that you want to harness for these problems, that’s where you can make things iteratively better. That’s where you can take a system from 90% good enough to 100% good enough. That’s how you take a system from demo to production by getting that right,” he told the audience.  

Watch this video to see how Confluent is making it easy for organizations to serve trustworthy, structured context to any AI app or agent via Model Context Protocol (MCP).
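To make that concrete, here is a minimal sketch of the pattern—an MCP server exposing stream-derived context as a tool an agent can call—written with the official MCP Python SDK. The server name, tool, and customer fields are illustrative assumptions, not Confluent’s Real-Time Context Engine:

```python
# Minimal sketch: an MCP server that serves context to an agent.
# Assumes the official MCP Python SDK (pip install "mcp[cli]").
# All names and data here are illustrative, not a real Confluent API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("customer-context")

# Stand-in for a view continuously materialized from a Kafka topic;
# in a real system a stream processor would keep this fresh.
CONTEXT_VIEW = {
    "cust-42": {"tier": "gold", "open_tickets": 2, "last_order": "2025-10-28"},
}

@mcp.tool()
def get_customer_context(customer_id: str) -> dict:
    """Return the freshest known context for a customer."""
    return CONTEXT_VIEW.get(customer_id, {"error": "unknown customer"})

if __name__ == "__main__":
    mcp.run()  # serves over stdio, so any MCP-capable agent can connect
```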

The opening keynote was packed with demos, partner announcements, and product launches that reinforced Confluent’s mission, as Chief Product Officer Shaun Clowes put it: “We want to make streaming ubiquitous, get all your data moving from the simplest applications to the AI agents that are transforming your business. When streaming is ubiquitous, we can find a better way to build. We can shift governing and processing left—closer toward the source—do that work once, and use that data to power AI agents, analytics, and real-time customer experiences. Stream once, use everywhere.”

Anthropic, Salesforce, Marriott & Metronome: Why Context Data Is Key for AI Success

Rachel Lo, head of applied AI enterprise at Anthropic, shared how AI is rapidly evolving from simple chatbots to sophisticated multi-agent systems capable of reasoning, planning, and acting across an enterprise. But making this leap from experimentation to real business impact requires a strong data foundation. 

“While building agents is becoming easier now, it’s the data foundation that’s the critical piece of the conversation. No matter how intelligent the model is, if we don’t have all of the data architecture that’s available it’s much more difficult for businesses to be able to build these powerful agents,” Lo said.

These systems rely on both structured and unstructured data, historical and real-time, all tied to the unique context of the business. Technologies like MCP play a critical role in grounding AI in this data foundation—ensuring agents stay accurate, context-aware, and ready for real operational use. “This [Real-Time Context Engine] release from Confluent allows all of these live context streams directly into Claude and all other frontier models to be able to power the agents themselves,” she added.

To help unlock a new generation of AI-driven, personalized customer experience, Confluent also announced its partnership with Salesforce on Day 1 of Current New Orleans. This partnership connects the world’s leading data streaming platform with the world’s leading CRM.

“There’s a growing divide between the type of agentic applications that deliver outcomes and those that never really achieve the potential that they set out to achieve. Context is really the key to bridging this divide…especially real-time, governed, personalized context makes all the difference there,” said Gunther Hagleitner, SVP of engineering, DATA 360, at Salesforce.

Confluent customers Marriott and Metronome reinforced why a real-time streaming foundation is key to driving business value.

Rajesh Kandasamy, VP of application development and architecture at Marriott International, shared how Marriott Bonvoy’s data modernization journey, powered by data streaming, has transformed guest experiences—from shop to book to stay—by making real-time data core to its business.

Beyond enabling 900+ critical business transactions and 2,000+ consumers to work with the company’s most valuable data in real time, data streaming has cut partner integration time from six to nine months down to six weeks.

Kandasamy shared a few key recommendations: “Make your business-critical transactions real-time and operationalize them; once you have real-time data, enrich it and make it into a more mature data product using technology like Apache Flink®; apply AI on top of that to provide hyper-personalized experiences to the most valuable customers; and develop ecosystems and tools for your developers and business communities.” 
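As a rough illustration of the enrichment step Kandasamy describes, here is a small PyFlink sketch that joins a stream of booking events with guest reference data to produce a reusable enriched stream. Table names, fields, and connectors (datagen and print standing in for Kafka) are illustrative assumptions, not Marriott’s actual pipeline:

```python
# Sketch: enrich a raw event stream into a reusable "data product" with Flink.
# Assumes PyFlink is installed (pip install apache-flink); schemas are made up.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Raw booking events; in production this would be a Kafka source instead.
t_env.execute_sql("""
    CREATE TABLE bookings (
        guest_id INT, hotel_id INT, amount DOUBLE
    ) WITH (
        'connector' = 'datagen', 'rows-per-second' = '1',
        'fields.guest_id.min' = '1', 'fields.guest_id.max' = '2'
    )
""")

# Guest reference data; in production, a compacted topic or lookup table.
t_env.execute_sql("""
    CREATE TEMPORARY VIEW guests AS
    SELECT * FROM (VALUES (1, 'gold'), (2, 'silver')) AS t(guest_id, tier)
""")

t_env.execute_sql("""
    CREATE TABLE enriched_bookings (
        guest_id INT, hotel_id INT, amount DOUBLE, tier STRING
    ) WITH ('connector' = 'print')
""")

# Join once, close to the source; every downstream consumer gets the
# enriched stream instead of redoing this work. Runs until interrupted.
t_env.execute_sql("""
    INSERT INTO enriched_bookings
    SELECT b.guest_id, b.hotel_id, b.amount, g.tier
    FROM bookings AS b
    JOIN guests AS g ON b.guest_id = g.guest_id
""").wait()
```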

Cosmo Wolfe, chief technology officer at Metronome, a San Francisco-based platform that provides usage-based billing for software companies, shared how the company wouldn’t be able to power modern monetization infrastructure without data streaming.

“At Metronome, we have to run a highly available, incredibly low latency, and exactly end-to-end correct transactional observability system—and we have to run that at an extreme scale…Our product wouldn’t have worked without that core data streaming architecture,” Wolfe shared.
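Metronome’s internal architecture wasn’t detailed on stage, but one common building block for that kind of end-to-end correctness in Kafka-based pipelines is transactional, idempotent production. Here is a generic sketch using the confluent-kafka Python client; the broker address, topic, and payloads are illustrative assumptions, not Metronome’s code:

```python
# Sketch: exactly-once writes with Kafka transactions.
# Assumes the confluent-kafka client (pip install confluent-kafka).
import json

from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "localhost:9092",        # assumption: local broker
    "transactional.id": "billing-events-writer",  # enables transactional writes
    "enable.idempotence": True,
})
producer.init_transactions()

usage_events = [
    {"customer": "acme", "metric": "api_calls", "value": 1200},
    {"customer": "globex", "metric": "api_calls", "value": 310},
]

producer.begin_transaction()
try:
    for event in usage_events:
        # Records in a transaction commit atomically or not at all, so a
        # downstream consumer reading with isolation.level=read_committed
        # never sees partial or duplicated batches.
        producer.produce("usage-events", json.dumps(event).encode("utf-8"))
    producer.commit_transaction()  # flushes outstanding records, then commits
except Exception:
    producer.abort_transaction()
    raise
```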

Building Systems That Are Always Ready for What’s Next – Engineering in the Age of AI

Confluent’s Adi Polak opened the Day 2 keynote with a demo walking the audience through how the AI hype cycle has rapidly transformed what’s required to build useful, trustworthy business systems and applications. And, while interviewing special guests, Tim Berglund explored how developers, data engineers, software engineers, and architects are becoming “data streaming engineers”—driving the innovation that’s turning business intelligence and black-box AI into event-driven, context-powered intelligent systems.

"Without that trustworthy system, up-to-date context, all AI produces for us is actually slop [and] inconsistent decisions. Things we just cannot explain with a straight face." 

— Adi Polak, Director of Advocacy and Developer Experience Engineering

"We are collectively data streaming engineers...you build systems that are always ready for what's next, right? That's the deep structure of the systems we build. They're designed with change in mind."

— Tim Berglund, Vice President, Developer Relations

There was something for everyone at Current New Orleans—from lightning talks and breakout sessions to trainings and certifications. Attendees gained real-world insights into how businesses are leveraging technologies like Apache Kafka®, Apache Flink®, and Apache Iceberg™ as the foundation of their data backbone and to power innovation and build intelligent systems. 

8 Session Highlights From 2 Days of Learning & 100+ Expert Speakers

  • From Queues to Intelligence: The Evolution of Streaming Infrastructure for AI: Aravind Suresh shared how streaming infrastructure has evolved inside OpenAI: from simple durable queues to a sophisticated architecture powering various products and research. He highlighted how streaming has become a foundational layer in their AI product stack.

  • Unlocking Inter-Agent Collaboration: Confluent Powers Scalable AI with Google Cloud's A2A: The next frontier in AI is intelligent agentic systems, where agents collaborate to achieve complex goals. Google Cloud’s Merlin Yamssi and Confluent’s Dustin Shammo and Pascal Vantrepote shared how Confluent's platform capabilities empower AI agents for intelligent automation and dynamic orchestration within the Google Cloud environment. 

  • Powering Real-Time Vehicle Intelligence at Rivian with Apache Flink: Rivian shared how they are using Flink and Kafka to process massive volumes of telemetry data to power instant alerts, proactive maintenance, and intelligent customer experiences. Rupesh More and Guruguha Sreenivasa showcased how their scalable, cloud-native architecture balances low-latency responsiveness with long-term analytical insight, enabling continuous innovation across Rivian’s connected fleet (a generic sketch of this alerting pattern appears after this list).

  • Scaling Agentic AI Delivery: How Infosys Leverages the Confluent OEM Program: Infosys’ Paresh Oswal shared how evolving customer demands and the rise of agentic AI are shaping their technology strategy, and why Infosys chose to partner with Confluent to accelerate innovation and deliver differentiated solutions. 

  • Robinhood’s Use of WarpStream for Logging: Robinhood software engineers Ethan Chen and Renan Rueda walked through a tiered approach to migrating Kafka workloads to WarpStream while preventing downtime, data loss, and unnecessary data duplication. They explained how WarpStream's zero-disk architecture both keeps Robinhood's data in its own cloud environment and provides the elastic scaling needed to match daily trading platform traffic patterns—dramatically reducing both the team's on-call burden and its infrastructure costs.

  • The Evolution of Notion’s Event Logging Stack: Notion’s Adam Hudson detailed the evolution of their event logging stack, built to handle billions of daily events from more than 100 million users. Notion transitioned from expensive, inflexible third-party SaaS tools to a flexible architecture centered on Confluent Cloud, allowing for dynamic event routing and robust privacy controls. By combining Kafka with Snowpipe Streaming and Apache Pinot, Notion now powers real-time, user-facing analytics and has eliminated slow production queries, achieving massive gains in performance and flexibility.

  • Orchestrating a Successful Kafka Migration: Jack Burns, a data platform engineer at Nordstrom, shared how his team migrated the company's "Tier 0" data platform from a self-managed Confluent Platform to Confluent Cloud. This platform supports thousands of applications and hundreds of engineering teams, which made coordinating producer downtime across teams a complex undertaking. To manage the complexity, his team built a topic migration readiness checklist and a metrics dashboard, which proved essential for orchestrating the multi-stage migration successfully.

  • Future of Streaming: Emerging Trends for Event-Driven Architecture: In a standing-room-only session, Matthew Walker from JP Morgan Chase explored the future of event-driven architecture in the age of AI. He detailed how JPMC's Fusion platform evolved from historical model training to the company's AI sharing and data governance platform. This shift mirrors what he's seeing across the industry: namely, the problem of feeding multi-agent AI systems the real-time data they need to understand the current state of the business. He argued that before organizations can build agentic systems, they must first build and maintain data products at scale. The new "autonomous data stack," powered by technologies like Kafka, Flink, and MCP, is what he expects to form "the intelligence layer" needed to make AI accountable, effective, and context-aware.
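Here is the generic sketch of the telemetry-alerting pattern referenced in the Rivian highlight above: a tumbling-window aggregation in PyFlink that flags vehicles whose average battery temperature crosses a threshold. Fields, thresholds, and connectors are illustrative assumptions, not Rivian’s actual pipeline:

```python
# Sketch: windowed telemetry alerts with Flink SQL (pip install apache-flink).
# Schema, threshold, and connectors are illustrative, not Rivian's real system.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Synthetic vehicle telemetry; in production this would be a Kafka source.
t_env.execute_sql("""
    CREATE TABLE telemetry (
        vehicle_id INT,
        battery_temp_c DOUBLE,
        ts AS PROCTIME()
    ) WITH (
        'connector' = 'datagen', 'rows-per-second' = '10',
        'fields.vehicle_id.min' = '1', 'fields.vehicle_id.max' = '5',
        'fields.battery_temp_c.min' = '20', 'fields.battery_temp_c.max' = '70'
    )
""")

t_env.execute_sql("""
    CREATE TABLE alerts (
        vehicle_id INT, avg_temp DOUBLE
    ) WITH ('connector' = 'print')
""")

# Average each vehicle's battery temperature over 10-second tumbling windows
# and emit an alert row whenever the (made-up) 55 C threshold is exceeded.
t_env.execute_sql("""
    INSERT INTO alerts
    SELECT vehicle_id, AVG(battery_temp_c) AS avg_temp
    FROM TABLE(TUMBLE(TABLE telemetry, DESCRIPTOR(ts), INTERVAL '10' SECOND))
    GROUP BY vehicle_id, window_start, window_end
    HAVING AVG(battery_temp_c) > 55
""").wait()
```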

And for those who missed out on our exclusive Executive Summit this year, held on Day 0 of Current: 148 attendees spent an immersive day learning how to harness the power of data in the age of AI. Our hand-picked panelists shared their data streaming journeys, key use cases, best practices, and business outcomes—how they are getting ready for (and already using) agentic AI, and why they are shifting left when it comes to data governance and data processing.

Until Next Time

It wasn’t all learning—attendees soaked up New Orleans vibes with great food, music, and a little magic at the Current Party. Fun and insight came together for an unforgettable experience.

Celebrating Day 1 at Current New Orleans

Missed this one? Don’t worry—check out the full Day 1 keynote and all the recorded sessions, or join us at Current Bengaluru 2026 on April 22, where you can look forward to an action-packed day of sessions presented by data streaming and AI experts on every topic imaginable.

Get Early Access to Registration


Apache®, Apache Kafka®, Kafka®, Apache Flink®, Flink®, Apache Iceberg™, and Iceberg™ are either registered trademarks or trademarks of the Apache Software Foundation. No endorsement by the Apache Software Foundation is implied by the use of these marks.

  • Mekhala Roy is a senior writer on the Brand Marketing team at Confluent. Prior to Confluent, Mekhala worked in the cybersecurity industry and spent several years as a tech journalist.

  • Zion Samuel is a writer and content strategist at Confluent. Working closely with subject matter experts, she creates content that helps practitioners and business leaders learn new skills, solve technical challenges, and work strategically. Prior to Confluent, Zion spent several years writing and developing messaging in technology & SaaS, healthcare & life sciences, and biotech.

Did you enjoy this blog post? Share it!