
Kafka Summit NYC Systems Track: What to Expect

Written By
  • Jun Rao, Co-founder of Confluent and original co-creator of Apache Kafka®
  • Rajini Sivaram, Principal Engineer I

In our previous post on the Streaming Pipelines track, we highlighted some of the sessions not to be missed at Kafka Summit NYC. Following on from that, let’s talk about the Systems track at the event. The Systems track was designed specifically for those who want to learn how to get the most out of Apache Kafka®. In this track, community and industry experts will share the latest and greatest Kafka features and best practices for running and scaling Kafka in production. There’s a little something for everyone in this section of the agenda.

Speakers in this track come from companies like LinkedIn, Goldman Sachs, Confluent, Target, Pivotal, and Heroku. They will address the challenges they faced running Kafka at scale, the lessons they learned, the new Kafka features they used, and more. Don’t miss these thought-provoking sessions designed to inspire and guide you as you work with Kafka at your own organization.

Here are some of our notable sessions for this track:

Kafka in the Enterprise: What if it Fails? 
Anton Gorshkov, Managing Director, Goldman Sachs
In this talk, Anton will explore the key questions that Goldman Sachs answered in their journey to getting comfortable running critical parts of their infrastructure on Kafka, and offer their solutions along the way.


Introducing Exactly Once Semantics in Apache Kafka
Apurva Mehta, Engineer, Confluent
Exactly-once semantics is one of the most anticipated upcoming features for Apache Kafka. Come hear all the details in this session, along with the new use cases it will enable.
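
For a concrete sense of what exactly-once looks like from the application side, here is a minimal Java sketch of the producer transactions API proposed in KIP-98. The broker address, topic names, and transactional.id are placeholders rather than details from Apurva’s talk.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ExactlyOnceSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");        // placeholder broker address
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("enable.idempotence", "true");                  // de-duplicates producer retries
        props.put("transactional.id", "payments-producer-1");     // placeholder id; enables transactions

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.initTransactions();                              // registers the transactional.id with the broker
        try {
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("payments", "order-42", "debit"));
            producer.send(new ProducerRecord<>("balances", "order-42", "credit"));
            producer.commitTransaction();                         // both writes become visible atomically
        } catch (Exception e) {
            producer.abortTransaction();                          // neither write becomes visible
        } finally {
            producer.close();
        }
    }
}
```

With this configuration, transactional (read_committed) consumers see either both records or neither, even if the producer retries or fails mid-batch.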


How to Lock Down Apache Kafka and Keep Your Streams Safe
Rajini Sivaram, Principal Software Engineer, Pivotal
Security is a critical requirement for many Kafka deployments, especially in the cloud. In this talk, Rajini will look at Kafka security features that protect data streams and ensure the safety and integrity of data stored in Kafka topics.
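
As a rough illustration of what locking down a client involves, here is a minimal Java sketch of a consumer configured for SASL/SCRAM authentication over TLS. The broker host, credentials, truststore path, and topic are placeholders rather than details from Rajini’s talk.

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SecureConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093"); // placeholder TLS listener
        props.put("group.id", "secure-demo");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        // Encrypt traffic with TLS and authenticate the client with SASL/SCRAM.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        props.put("sasl.jaas.config",
                  "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"alice\" password=\"alice-secret\";");  // placeholder credentials
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks"); // placeholder path
        props.put("ssl.truststore.password", "changeit");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("payments"));  // topic ACLs govern whether reads succeed
        consumer.close();
    }
}
```

For this to work end to end, the brokers need matching listener, keystore, and ACL configuration; the talk covers the server side of that picture.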


Running Hundreds of Kafka Clusters with 5 People
Thomas Crayford, Infrastructure Engineer, Heroku
Running hundreds of Kafka clusters requires automation, but it’s still very ops-friendly! In this talk, Thomas will cover key lessons learned in running many clusters with different workloads, including war stories and ways Kafka could be more friendly towards automation.


For anyone running Kafka at scale, this track is a must: it offers plenty of practical advice for avoiding common pitfalls.

Kafka Summit will be the largest gathering of Kafka experts across a wide range of industries, and we hope to see you there. Like last year’s event, this one is on track to sell out, so be sure to register ASAP.

  • Jun Rao is the co-founder of Confluent, a company that provides a stream data platform on top of Apache Kafka. Before Confluent, he was a senior staff engineer at LinkedIn, where he led the development of Kafka. Before LinkedIn, he was a researcher at IBM’s Almaden Research Center, where he conducted research on databases and distributed systems. Jun is the PMC chair of Apache Kafka and a committer on Apache Cassandra.

  • Rajini Sivaram is a principal engineer at Confluent, designing and developing geo-replication and security features for Confluent Platform and Confluent Cloud. She is an Apache Kafka committer and a member of the Apache Kafka Project Management Committee (PMC).
