Join us for Kafka Summit Hackathon in New York City

I’m happy to announce that Confluent will be hosting another Kafka Summit Hackathon on May 7th in New York City! The free hackathon will take place the day before Kafka Summit NYC and is designed to help the community learn how to build streaming applications with Apache Kafka®.

Whether you’re a beginner or a seasoned expert, join us to create cool stream processing applications, chat with fellow Kafka committers and developers, and maybe win a prize in the process!

Here are the key details:

Kafka Summit Hackathon New York City

When: Sunday, May 7, 2017 from 6:00 PM to 10:00 PM (EDT)

Where: New York Hilton Midtown (Sutton South) – 1335 Avenue of the Americas, New York, NY 10019

How to join: Hackathon Registration

We’ll have food, drinks and prizes for participants, along with Kafka developers on hand to help you with any questions. Interested? Then register for the hackathon.

Want to know more? Read on for the full details.

About Last Year’s Hackathon

Last year, we hosted the Kafka Summit Hackathon to create new connectors for the Kafka community. More than 100 people participated, and it was so much fun that many stayed long after the 10pm close.

The winners of the 2016 Kafka Summit Hackathon were:

  • First place: Aravind Yarram of Equifax, for creating a Jenkins connector
  • Second place (tied): Ashish Singh of Cloudera, for an end-to-end Twitter sentiment analysis application built with Kafka’s Streams API
  • Second place (tied): the Silicon Valley Data Science team, for a connector for a brain-monitoring device

This Year’s Hackathon Theme: Microservices and Kafka

This year’s theme is microservices – specifically, microservices implemented with the Kafka Streams API or with Kafka producer and consumer clients for programming languages such as Java, C/C++, Python, .NET, or Go.

The goal: to combine the power of stream processing with the agility and composability of microservices.

We’ll be looking for applications that demonstrate how you can quickly build services that would be challenging to write without Kafka. They might leverage Streams API features such as interactive queries, session windows and streaming joins.

About the Kafka Streams API

The Streams API in Apache Kafka, introduced in version 0.10, is the easiest way to process data in Kafka. Since its initial release in summer 2016, more and more companies across industries such as finance, travel, advertising, gaming, cybersecurity, and social media have been leveraging the Streams API to build mission-critical, real-time applications that power their core business – all the way from small deployments to large-scale use cases that handle millions of events per second.

Technically, the Streams API is a powerful yet easy-to-use Java library that lets you build “normal” applications (e.g. in the form of microservices) for your stream processing needs – applications that you can package, deploy, and operate however you want, rather than forcing you to build out separate processing clusters or similar special-purpose infrastructure. Even so, your streams applications with Kafka will be elastic, scalable, distributed, fault-tolerant, and secure – think: “a cluster to go.”

On top of that, the Streams API provides your applications with database functionality such as first-class abstractions for streams and tables, fast and fault-tolerant state management, and interactive queries to access the latest processing results – think: “a database to go.” You can leverage features such as windowing support, sessionization, event-time processing, and highly performant joins for streams and tables. And of course, the Streams API is 100% compatible and integrated with Kafka.
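
To give a concrete feel for what such a “normal” Java application looks like, here is a minimal sketch of a word-count style Streams application, written against the 0.10.x Streams API; the application id, broker address, and topic names are placeholders chosen for this example, not part of any official sample.

```java
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;

public class WordCountApp {

  public static void main(String[] args) {
    Properties props = new Properties();
    // Application id, broker address, and topic names below are placeholders.
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-app");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

    KStreamBuilder builder = new KStreamBuilder();

    // Read a stream of text lines, split each line into words, and maintain a
    // continuously updated count per word as a table.
    KStream<String, String> textLines = builder.stream("text-input");
    KTable<String, Long> wordCounts = textLines
        .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
        .map((key, word) -> new KeyValue<>(word, word))
        .groupByKey()
        .count("word-counts-store");

    // Write the changelog of the word-count table back to a Kafka topic.
    wordCounts.to(Serdes.String(), Serdes.Long(), "word-counts-output");

    KafkaStreams streams = new KafkaStreams(builder, props);
    streams.start();

    // Gracefully shut down the application on exit.
    Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
  }
}
```

You run this like any other Java application – no separate processing cluster required; scaling out simply means starting additional instances with the same application id.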

About Kafka clients for your favorite programming language

In addition to the Kafka Streams API described above, there’s a vibrant ecosystem and community that provide application developers with Kafka clients for essentially all popular programming languages, including Java (shipped with Apache Kafka), C/C++, Python, Go, .NET, Node.js, and more – see the Kafka Clients overview in the Apache Kafka wiki.
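
As a minimal illustration of the plain client APIs, here is a sketch that uses the Java producer shipped with Apache Kafka to publish a handful of messages; the broker address and topic name are placeholders, and the non-Java clients expose very similar produce and consume operations.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {

  public static void main(String[] args) {
    Properties props = new Properties();
    // Broker address and topic name below are placeholders.
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (Producer<String, String> producer = new KafkaProducer<>(props)) {
      // Send a few keyed messages; the callback logs any send failure.
      for (int i = 0; i < 10; i++) {
        producer.send(
            new ProducerRecord<>("hackathon-events", "key-" + i, "hello kafka " + i),
            (metadata, exception) -> {
              if (exception != null) {
                exception.printStackTrace();
              }
            });
      }
    }
  }
}
```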

Project Ideas

Interested in the hackathon but not sure what to build? Here are a few ideas to jumpstart your creativity:

  • Non-Java clients can do anything the Java clients can, as many implementations achieve feature parity with the clients that ship with Apache Kafka. Maybe you want to analyze tweets using Go? Or hack out a chat app using Python?
  • Perhaps you want to create a simple but generally useful microservice with the Kafka Streams API: a de-duplication or event-filtering service (see the sketch after this list), a CSV-to-Avro conversion service, a service that pushes data from compacted to non-compacted topics, or even a multi-stage event-driven architecture for your domain of choice.
  • You might bring the power of Streaming Materialized Views to your JavaScript platform of choice through Kafka’s interactive queries, exposed via a REST API layer. There’s a demo application called Kafka Music in the Confluent Examples repository to get you started.
  • Maybe you want to hack a full-fledged stream processing app:
    • Monitor network traffic for attacks
    • Monitor Kafka’s internal metrics for anomalies
    • Calculate “surge pricing” for an online shop
    • Find the landing pages on a website that lead to the longest sessions or to eventual sales, using the new session-windows feature of the Kafka Streams API
    • Create a real-time dashboard displaying IoT sensor data using the new Interactive Queries feature
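
As a concrete starting point for the event-filtering idea above, here is a minimal Kafka Streams sketch (again written against the 0.10.x API) that forwards only error-level events from one topic to another; the topic names and the string match on the message value are assumptions made for this example.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class EventFilterService {

  public static void main(String[] args) {
    Properties props = new Properties();
    // Application id, broker address, and topic names below are placeholders.
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-filter-service");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

    KStreamBuilder builder = new KStreamBuilder();

    // Read raw events, keep only those that look like errors, and forward them
    // to a dedicated topic that downstream services can consume.
    KStream<String, String> events = builder.stream("raw-events");
    events
        .filter((key, value) -> value != null && value.contains("\"level\":\"ERROR\""))
        .to("error-events");

    KafkaStreams streams = new KafkaStreams(builder, props);
    streams.start();

    // Gracefully shut down the application on exit.
    Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
  }
}
```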

Still not sure what to build? During registration we’ll help connect you with other participants so you can work as a team to come up with and implement a project.

Prizes

Entries will be evaluated by a panel of judges at the end of the hackathon based on creativity, features, and completeness.

  • 1st place: Nintendo Switch
  • 2nd and 3rd place: $100 gift cards for Amazon or iTunes
  • Everyone: T-Shirt and stickers

Prizes will only be awarded to entries that open-source their code and make it available on a code-sharing site such as GitHub or Bitbucket.

FAQs

Is attendance restricted to Kafka Summit attendees?
No, this is a community event and anyone is welcome to register and participate – it’s free.

Do I need to be familiar with the Kafka Streams API?
No previous experience with the Streams API is required, but we encourage you to review some of the resources listed below to get some basic familiarity with the framework. This will let you focus on designing and writing your streaming applications during the hackathon.

Do I need to know what I’m going to build before I arrive?
No, although it will help you get up and running more quickly if you come with a few ideas. We’ve provided some examples of possible projects in the “Project Ideas” section above to give you an idea of the types of applications you might build with the Kafka Streams API.

Can I work in a team?
Absolutely, and we encourage it! To help form teams, you can include projects you are interested in building with your registration. We’ll connect you with other participants with similar interests at the beginning of the event.

What type of food will be provided?
Light dinner and drinks.

Am I required to submit my code or open source it?
You are not required to do either, but to be eligible for the prizes you must publish your code under an open source license. We recommend the Apache License 2.0, but other popular open source licenses are acceptable.

How complete are projects expected to be at the end of the hackathon?
The hackathon is just one evening, but that’s enough time to get a prototype up and running. We hope this will motivate you to get started on a fully featured Kafka Streams application, but the expectation is only to have a prototype by the end of the night.

Will a skeleton or demo application be provided to help get started?
Yes, example applications for Kafka’s Streams API (in Java and in Scala) can be found in the Confluent Examples repository. Examples for non-Java clients are typically available on the respective project websites or GitHub repositories. We encourage starting from such a demo application so you can make the most of your time during the hackathon.

Who will be available to provide help with Kafka’s Streams API or with non-Java clients (e.g. C/C++, Python)?
Kafka committers, Confluent engineers, and community members will attend the event to help you go from design to implementation of your application.

How will projects be judged?
Near the end of the hackathon we’ll ask you to give a brief overview of what you’ve built and provide us a link to the repository. No need for a fancy demo, just a quick summary. A small panel of judges will select the most outstanding project, based on creativity, features, and completeness.

Registration

The Kafka Summit Hackathon is a free event, but all attendees must register. Head over to the Kafka Summit Hackathon NYC website for more details and to complete your registration.

Resources

The hackathon will be most productive if you’ve done a bit of prep work so you can get straight to coding. Here are some resources you might find useful.

  • The Streams API in Kafka
  • Kafka clients for Java, C/C++, Python, etc.

Interested? Register now.

Michael is a principal technologist in the Office of the CTO at Confluent, the company founded by the creators of Apache Kafka. He focuses on longer-term product and technology strategy. Previously, Michael was the lead product manager for stream processing at Confluent, where his team created Kafka Streams and the streaming database ksqlDB. He is a well-known technology blogger in the big data community (www.michael-noll.com) and a committer/contributor to open source projects such as Apache Storm and Apache Kafka.
