
Succeeding at 100 Days Of Code for Apache Kafka

Some call it a challenge. Others call it a community. Whatever you call it, 100 Days Of Code is a lot of fun and a great learning experience: it helps developers build strong coding habits, with a community of people doing it alongside you and supporting you through social accountability.

The 100 Days of Code challenge boils down to two general rules:

  1. Start hands-on learning for at least one hour every day for the next 100 days
  2. Share your progress every day on social media (Twitter, LinkedIn—wherever you want) and include #100DaysOfCode @apachekafka

If you’re a developer interested in event streaming and would like to use the 100 Days Of Code model as a way to learn about Apache Kafka®, the 100 Days Of Code page on Confluent Developer provides a curated list of resources to give you some inspiration. The content spans:

  • Introductory material
  • Languages and CLIs
  • Schemas
  • Stream processing
  • Microservices
  • Data pipelines
  • Event sourcing
  • Performance and observability
  • Operations

On your mark…

Perhaps you’re motivated to join 100 Days Of Code because there is a specific set of hot new technologies you want to tinker with. Think about what that set of tech looks like and how the pieces relate to each other. Or you might be driven to join the challenge because you have a specific project in mind, like building a real-time app that checks for hard-to-get products in an online store (hey there, global supply chain!), creating an adventure game that streams game events, evaluating network usage on your home Wi-Fi, etc. Think about what tech stack you need to learn to accomplish your project. Knowing what your goals are will help you create a good plan.
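At its core, a project like the product-availability checker above is an event stream: one piece of code produces events, another consumes and reacts to them. Here is a broker-free sketch of that loop in plain Python, purely to illustrate the shape of the idea (the queue stands in for a Kafka topic, and all names are made up for this example; a real version would produce to and consume from Kafka):

```python
import json
import queue

# Stand-in for a Kafka topic; a real project would use a Kafka producer and consumer.
topic = queue.Queue()

def produce(event: dict) -> None:
    """Serialize an event and put it on the 'topic'."""
    topic.put(json.dumps(event))

def consume() -> list:
    """Drain the 'topic' and collect products that came back in stock."""
    restocked = []
    while not topic.empty():
        event = json.loads(topic.get())
        if event["in_stock"]:
            restocked.append(event["product"])
    return restocked

produce({"product": "ps5", "in_stock": False})
produce({"product": "gpu", "in_stock": True})
print(consume())  # ['gpu']
```

Swapping the in-memory queue for real Kafka topics, producers, and consumers is exactly the kind of step-by-step upgrade a 100-day project can be built around.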

Once you’ve figured out your goal, the original 100 Days Of Code website suggests that you publicly commit to the challenge to hold yourself accountable. If you decide to commit to the 100 Days Of Code challenge for Apache Kafka, tweet your commitment to the world, making yourself accountable to the community.

100 Days Of Code – Apache Kafka

Get set…

You should be able to cover a lot of ground in 100 days, even at just one hour per day. You don’t need to plan out each and every day ahead of time—in fact, some people recommend that you do not do this because it’s too rigid, whereas 100 Days Of Code should be a creative journey. But you should still identify a set of resources to aid your learning effort.

If your goal is to become more proficient with Kafka, visit Confluent Developer to build your own self-directed learning journey. Kafka may be your entire focus for the 100 days, but more likely than not, you also want to become more proficient in the broader ecosystem. So make sure to incorporate other resources from your tech stack to complement your Kafka learning: microservices, serverless, event sourcing, programming languages like Python, Go, or .NET, analytics, web development, databases, SQL, GraphQL, and more.

Additionally, create a project space, both literally and online. Literally: Think about where you are physically going to sit and code, especially if you are doing this outside of work hours. Online: Figure out what your online platform looks like. Plenty of cloud environments can serve as a sandbox for learning. For example, in Confluent Cloud, you can create separate environments or Kafka clusters for different projects you’re working on, and then tear them down when you’re done. Go set up a cool space in Confluent Cloud for your learning—and get $100 of free usage with the promo code 100DAYSKAFKA!

Go!

Pick a super easy task for Day 1. Nothing is more of a buzzkill than choosing something dishearteningly difficult for the first day. For Kafka, get going with the short and easy quick start (as a bonus, the quick start helps you set up the cloud sandbox you’ll use for working with Kafka throughout the rest of the challenge).
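The quick start walks you through producing your first records to a topic, and under the hood each Kafka record boils down to a key plus a serialized value, both as bytes. A minimal, broker-free sketch of that shape (the helper and field names here are illustrative, not part of the quick start itself):

```python
import json

def make_record(day: int, note: str) -> tuple[bytes, bytes]:
    """Build a Kafka-style record: a key and a JSON-serialized value, both as bytes."""
    key = str(day).encode("utf-8")
    value = json.dumps({"day": day, "note": note}).encode("utf-8")
    return key, value

key, value = make_record(1, "finished the quick start")
print(key)                         # b'1'
print(json.loads(value)["note"])   # finished the quick start
```

Keying records by something meaningful (here, the day number) matters later in the challenge, because Kafka uses the key to decide which partition a record lands on.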

Throughout the 100 days, try to work on the challenge daily, but don’t worry if you miss a day. Take a day off when you need it; just try not to miss two days in a row, or you could start to lose the habit you’re working to form.

Good, lasting habits are hard to build, and it helps to have support from others. This is where the community aspect of the challenge comes into play. Join the Confluent community, ask your Kafka-related questions, find answers, and encourage others. Keep the community updated on your daily progress and share links to or screenshots of any applications you write. For other ways to connect with the 100 Days Of Code community, see the Connect page on the official challenge website.

Let us know how it goes

We on the Confluent DevRel team are especially excited to see what you build with Kafka, so please tweet to us @confluentinc or post in the Community Forum and let us know how it goes. Good luck!


Yeva Byzek is an integration architect at Confluent designing solutions and building demos for developers and operators of Apache Kafka. She has many years of experience validating and optimizing end-to-end solutions for distributed software systems and networks.

