
How to Work with Apache Kafka in Your Spring Boot Application

Choosing the right messaging system during your architectural planning is always a challenge, yet one of the most important considerations to nail. As a developer, I write applications daily that need to serve lots of users and process huge amounts of data in real time.

Usually, I use Java with the Spring Framework (Spring Boot, Spring Data, Spring Cloud, Spring Caching, etc.) for this. Spring Boot lets me move through the development process much faster and more easily than before, and it has come to play a crucial role in my organization. As our user base grew quickly, we realized we needed something that could process as many as 1,000,000 events per second.

When we found Apache Kafka®, we saw that it met our needs and could handle millions of messages quickly. That’s why we decided to try it, and since that moment, Kafka has been a vital tool in my toolbox. Why did I choose it, you ask?

Apache Kafka is:

  • Scalable
  • Fault tolerant
  • A great publish-subscribe messaging system
  • Capable of higher throughput compared with most messaging systems
  • Highly durable
  • Highly reliable
  • Highly performant

That’s why I decided to use it in my projects. Based on my experience, I provide here a step-by-step guide on how to include Apache Kafka in your Spring Boot application so that you can start leveraging its benefits too.

Prerequisites

  • This article requires you to have Confluent Platform installed locally
  • Manual install using ZIP and TAR archives:
    • Download the archive
    • Unzip it
    • Follow the step-by-step instructions, and you’ll have Kafka up and running in your local environment

I recommend using the Confluent CLI for your development to have Apache Kafka and other components of a streaming platform up and running.
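
With the Confluent CLI, a single command brings up ZooKeeper, Kafka, and the other platform components locally. The exact command depends on your CLI version: recent releases use confluent local services start, while older releases used confluent start.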

What you’ll get out of this guide

After reading this guide, you will have a Spring Boot application with a Kafka producer to publish messages to your Kafka topic, as well as with a Kafka consumer to read those messages.

And with that, let’s get started!

Table of contents

Step 1: Generate our project
Step 2: Publish/read messages from the Kafka topic
Step 3: Configure Kafka through application.yml configuration file
Step 4: Create a producer
Step 5: Create a consumer
Step 6: Create a REST controller

Step 1: Generate our project

First, let’s go to Spring Initializr to generate our project. Our project will have Spring MVC/web support (spring-boot-starter-web) and Apache Kafka support (spring-kafka).

Generate a Spring Boot project with Spring Initializr

Once you have unzipped the project, you’ll have a very simple structure. I’ll show you what the project will look like at the end of this article so you can easily follow the same structure. I’m going to use IntelliJ IDEA, but you can use any Java IDE.

Step 2: Publish/read messages from the Kafka topic

Now that you can see what the project looks like, let’s move on to publishing and reading messages from a Kafka topic.

Start by creating a simple Java class, which we will use for our example:

package com.demo.models;

public class User {

    private String name;
    private int age;

    public User(String name, int age) {
        this.name = name;
        this.age = age;
    }

    // Getters let JSON serializers such as Jackson read the fields
    public String getName() {
        return name;
    }

    public int getAge() {
        return age;
    }
}

Spring Boot with Kafka – IntelliJ IDEA

Step 3: Configure Kafka through application.yml configuration file

Next, we need to create the configuration file. We need to configure our Kafka producer and consumer so they can publish messages to and read messages from the topic. Instead of creating a Java class and marking it with the @Configuration annotation, we can use either the application.properties file or application.yml. Spring Boot allows us to avoid all the boilerplate code we used to write in the past and provides us with a much more intelligent way of configuring our application, like this:

server:
  port: 9000
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer

If you want to learn more about Spring Boot auto-configuration, you can read this short and useful article. For a full list of available configuration properties, refer to the official documentation.
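
To see what that YAML saves us, here is a sketch of the equivalent hand-written Java configuration for just the producer side. You don’t need this class if you use the YAML above, and the class name KafkaProducerConfig is purely illustrative:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        // The same settings as in application.yml, written out by hand
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}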

Step 4: Create a producer

Next, we create the producer, which will write our messages to the topic.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class Producer {

    private static final Logger logger = LoggerFactory.getLogger(Producer.class);
    private static final String TOPIC = "users";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String message) {
        logger.info(String.format("#### -> Producing message -> %s", message));
        this.kafkaTemplate.send(TOPIC, message);
    }
}

We just auto-wired KafkaTemplate and will use this instance to publish messages to the topic. That’s it for the producer!
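
Because we configured String serializers, the producer works with plain strings. If you want to publish the User objects from Step 2 instead, one common approach is to serialize them to JSON first. Here is a minimal sketch of an extra method you could add to the Producer service, assuming Jackson’s ObjectMapper (already on the classpath via spring-boot-starter-web); the sendUser name is just illustrative:

import com.demo.models.User;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

    private final ObjectMapper objectMapper = new ObjectMapper();

    public void sendUser(User user) throws JsonProcessingException {
        // Serialize the User to a JSON string so the configured StringSerializer can handle it
        String json = objectMapper.writeValueAsString(user);
        logger.info(String.format("#### -> Producing user -> %s", json));
        this.kafkaTemplate.send(TOPIC, json);
    }

Note that Jackson relies on the getters we added to the User class to read its fields.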

Step 5: Create a consumer

The consumer is the service that will be responsible for reading messages and processing them according to the needs of your own business logic. To set it up, enter the following:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class Consumer {

    private final Logger logger = LoggerFactory.getLogger(Consumer.class);

    @KafkaListener(topics = "users", groupId = "group_id")
    public void consume(String message) {
        logger.info(String.format("#### -> Consumed message -> %s", message));
    }
}

Here, we told our consume(String message) method to subscribe to the users topic and simply log every message it receives. In your real application, you can handle messages the way your business requires you to.
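
If your business logic needs more than the payload, Spring for Apache Kafka also lets a listener method receive the whole ConsumerRecord, which carries metadata such as the key, partition, and offset. A minimal sketch, as an illustrative alternative to the consume method above (not part of the final project):

import org.apache.kafka.clients.consumer.ConsumerRecord;

    @KafkaListener(topics = "users", groupId = "group_id")
    public void consumeRecord(ConsumerRecord<String, String> record) {
        // Besides the value, the record exposes key, partition, offset, and timestamp
        logger.info(String.format("#### -> Consumed '%s' from partition %d at offset %d",
                record.value(), record.partition(), record.offset()));
    }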

Step 6: Create a REST controller

With the consumer in place, we already have everything we need to read Kafka messages.

To fully show how everything we created works, we need to create a controller with a single endpoint. We’ll send a message to this endpoint via an HTTP request, and our producer will then publish it to the Kafka topic.

Then, our consumer will catch and handle it the way we set it up: by logging it to the console.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping(value = "/kafka")
public class KafkaController {

    private final Producer producer;

    @Autowired
    KafkaController(Producer producer) {
        this.producer = producer;
    }

    @PostMapping(value = "/publish")
    public void sendMessageToKafkaTopic(@RequestParam("message") String message) {
        this.producer.sendMessage(message);
    }
}

Let’s send our message to Kafka using cURL:

curl -X POST -F 'message=test' http://localhost:9000/kafka/publish
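
If everything is wired up correctly, you should see both sides of the round trip in the application log: the producer’s “#### -> Producing message -> test” line followed by the consumer’s “#### -> Consumed message -> test” line.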


Basically, that’s it! In just six steps, you learned how easy it is to add Apache Kafka to your Spring Boot project. If you followed this guide, you now know how to integrate Kafka into your Spring Boot application, and you are ready to go with this super tool!

Interested in more?

If you’d like to know more, you can download the Confluent Platform, the leading distribution of Apache Kafka. You can also find all the code in this article on GitHub.


This is a guest post by Igor Kosandyak, a Java software engineer at Oril, with extensive experience in various development areas.
