
Kafka Streams API

Stream processing designed for developers

The Streams API in Kafka is an open-source library that you can integrate into your application to build and execute powerful stream processing functions. If you already use Kafka for stream data transport, the Streams API can add stream processing capabilities to your application immediately, without the burden of operating a separate distributed processing cluster for another stream processing framework. Whether you are building an IoT application, a monitoring function, or a complex continuous query, or you are tracking inventory changes, the Streams API in Kafka lets you build your application with ease.

The Streams API in Kafka is included with the Apache Kafka release v0.10 as well as with Confluent Enterprise v3.0.
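
To make this concrete, here is a minimal sketch of a Streams application embedded in an ordinary Java program. It assumes a recent Kafka Streams release (the v0.10 API used KStreamBuilder rather than StreamsBuilder, but is otherwise similar), a broker at localhost:9092, and hypothetical topic names input-events and output-events:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class UppercaseService {
        public static void main(String[] args) {
            // Standard Streams configuration: the application id names the consumer group
            // and prefixes any internal topics; a Kafka cluster is the only dependency.
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-service");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            // A simple topology: read a topic, transform each record, write the result out.
            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> input = builder.stream("input-events");
            input.mapValues(value -> value.toUpperCase())
                 .to("output-events");

            // The topology runs inside this ordinary JVM process -- no separate cluster.
            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

Because the library runs inside the application itself, scaling out is a matter of starting more instances of the same program; Kafka rebalances the topic partitions among them automatically.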


Today's Stream Processing Environments Are Complex

Stream processing APIs are very powerful tools. With this immense capability, however, comes complexity: stream processing frameworks typically need their own dedicated cluster of machines, and they often rely on a distributed database to perform look-ups and aggregations. On top of that, your application will likely have to handle re-processing of old data separately.

The outcome? A lot of moving parts that must be kept in sync with one another to make the application work.

Meet the Streams API in Kafka.

Read our blog on the Streams API in Kafka

Kafka Streams API: The Power without the Weight

  • Powerful

    • Highly scalable, elastic, fault-tolerant
    • Stateful and stateless processing
    • Event-time processing
  • Lightweight

    • No dedicated cluster required
    • No message translation layer
    • No external dependencies
  • Fully Integrated

    • 100% compatible with Kafka v0.10
    • Easy to integrate into existing applications
    • No artificial rules for deploying applications
  • Real Time

    • Millisecond processing latency
    • Does not microbatch messages
    • Windowing with out-of-order data
    • Allows for late arrival of data (see the windowing sketch after this list)
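
As a sketch of the event-time windowing and late-arrival handling listed above, the fragment below counts page views per user in one-minute windows while still accepting records that arrive up to five minutes out of order. The topic name page-views, the window and grace durations, and the use of TimeWindows.ofSizeAndGrace (the exact windowing method names vary across Streams versions) are assumptions for illustration:

    import java.time.Duration;
    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class PageViewCounts {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "page-view-counts");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Group page views by their key (the user id) into 1-minute event-time windows.
            // The grace period keeps windows open so records that arrive up to 5 minutes
            // late are still counted in the window they belong to.
            builder.<String, String>stream("page-views")
                   .groupByKey()
                   .windowedBy(TimeWindows.ofSizeAndGrace(Duration.ofMinutes(1), Duration.ofMinutes(5)))
                   .count()
                   .toStream()
                   .foreach((windowedUserId, count) ->
                           System.out.println(windowedUserId + " -> " + count));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }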

Use Cases Well-suited for the Streams API in Kafka

  • Stream-based Microservices

    Microservices are often built upon a Kafka-based data stream. Instead of a large monolithic application, microservices deliver small, decoupled processes that execute a constrained feature set against a stream. These are ideal for the Streams API because they work on real-time streams, depend on reliable message delivery, and might not warrant the cost of deploying a separate stream processing framework.

  • Continuous Queries

    One of the best ways to take advantage of streaming data is to compare or correlate data across different streams, or to join streams together to create more meaningful information. Continuous queries automate real-time intelligence at scale across an organization. The Streams API allows functions of this scale to be implemented with a low infrastructure footprint and entirely within the Kafka environment (a join sketch follows this list).

  • Continuous Transformations

    Continuous transformations modify or aggregate the data in a stream. These humble but critical processes are low-level services that prepare data for real-time analysis. Their small scale is well suited to a lightweight solution like the Streams API in Kafka, which supports both stateful and stateless processing.

  • Event-Triggered Processes

    The search for potentially meaningful anomalies in organizational data is moving into real time. Where data pipelines already exist, implementing event-triggered processes against them with the Streams API in Kafka is a simple extension of the streaming infrastructure Kafka already provides (see the alerting sketch after this list).
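
For the continuous-query use case above, here is a minimal sketch of a standing join that enriches a stream of clicks with the latest profile for each user. The topic names (clicks, user-profiles, clicks-enriched) and the string-concatenation "enrichment" are placeholder assumptions:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class ClickEnricher {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "click-enricher");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // A stream of click events and a table of user profiles, both keyed by user id.
            KStream<String, String> clicks = builder.stream("clicks");
            KTable<String, String> profiles = builder.table("user-profiles");

            // A standing query: each click is joined against the current profile for that
            // user, and the enriched record is published as soon as the click arrives.
            clicks.join(profiles, (click, profile) -> click + " | " + profile)
                  .to("clicks-enriched");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }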
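
And for the event-triggered use case, a sketch of an alerting process layered on an existing pipeline. The sensor-readings and sensor-alerts topics, the plain numeric string values, and the 100.0 threshold are all assumptions made for illustration:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    public class TemperatureAlerts {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "temperature-alerts");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Watch an existing pipeline of sensor readings (the value is assumed to be a
            // plain numeric string for simplicity) and publish anything over the threshold
            // to a dedicated alerts topic.
            builder.<String, String>stream("sensor-readings")
                   .filter((sensorId, reading) -> Double.parseDouble(reading) > 100.0)
                   .to("sensor-alerts");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

A downstream consumer of the alerts topic can then page an operator, open a ticket, or trigger any other action the event warrants.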

