Online Talk

Stream Processing Fundamentals: Why, When, and How

Part 1 - Monday, September 12, 2022
Part 2 - Monday, September 19, 2022
Part 3 - Monday, September 26, 2022

10 AM PDT | 1 PM EDT | 10 AM BST | 10:30 AM IST | 1 PM SGT | 3 PM AEST

Stream processing is a data processing technology used to collect, store, and manage continuous streams of data as they are produced or received. Also known as data streaming or event streaming, stream processing has grown exponentially in recent years thanks to its ability to simplify data architectures, deliver real-time insights and analytics, and react to the time-sensitive data required by IoT, multiplayer video games, and location-based applications.

Businesses use stream processing as the backend process for everything from billing, fulfillment and fraud detection to Netflix recommendations and ride-share apps like Lyft.
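The core idea behind use cases like fraud detection is reacting to each event the moment it arrives, rather than waiting for a periodic batch job. A minimal, dependency-free sketch of that per-event model (the event shape and the $1,000 threshold are hypothetical, not from any real system):

```python
# Conceptual sketch (not Kafka code): stream processing reacts to each
# event as it arrives, instead of waiting for a nightly batch.

def detect_fraud(events):
    """Flag any payment over a hypothetical $1,000 threshold as it arrives."""
    alerts = []
    for event in events:  # in a real system this stream is unbounded
        if event["type"] == "payment" and event["amount"] > 1000:
            alerts.append(event["id"])
    return alerts

stream = [
    {"id": "t1", "type": "payment", "amount": 250},
    {"id": "t2", "type": "payment", "amount": 4200},
    {"id": "t3", "type": "refund", "amount": 4200},
]
print(detect_fraud(stream))  # ['t2']
```

In production, the loop body would be a consumer reading from a stream processor rather than a Python list, but the reaction-per-event logic is the same.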

In this three-part online talk series, you’ll find everything you need to know about stream processing and how you can get started, including:

  • The benefits of event stream processing with Confluent Cloud and how it works
  • How stream processing helps businesses succeed in today's data-driven world
  • An introduction to the two most common ways to get started with stream processing in Kafka
    • Confluent ksqlDB, an event streaming database purpose-built to help developers create stream processing applications on top of Apache Kafka
    • The Kafka Streams API, an open source client library for building stream processing applications

Part 1: How Stream Processing Works: Basic Concepts of Streaming

The event-driven model behind data streaming provides many benefits: It decouples dependencies between services, provides some level of pluggability to the architecture, and enables services to evolve independently.
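The decoupling described above can be sketched in a few lines: producers append events to a shared log, and each consuming service derives its own view without knowing about the others. This is an illustrative toy, not Kafka's API; the service names are hypothetical:

```python
# Conceptual sketch: an event log decouples producers from consumers.
# Neither service knows about the other; both only know the log.

log = []  # stands in for a Kafka topic

def publish(event):
    log.append(event)

# Two independent consumers that can evolve separately:
def billing_view():
    return [e for e in log if e["type"] == "order"]

def analytics_view():
    return len(log)

publish({"type": "order", "sku": "A1"})
publish({"type": "page_view", "url": "/home"})
print(billing_view())    # [{'type': 'order', 'sku': 'A1'}]
print(analytics_view())  # 2
```

Because both services read from the log rather than calling each other, either one can be replaced, rewritten, or scaled without touching the other, which is the pluggability the event-driven model provides.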

Apache Kafka is often the foundation of a data streaming architecture, with Confluent Cloud as the managed service that brings Kafka to enterprise readiness. Kafka acts as a central data plane that holds shared events and keeps services in sync. Its distributed cluster technology provides the availability, resiliency, and performance properties that strengthen the architecture, so you can focus on writing and deploying load-balanced, highly available client applications.
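The "central data plane that keeps services in sync" boils down to one abstraction: an append-only log where each consumer group tracks its own read position. A simplified sketch of that idea (offsets, groups, and the `poll` name mirror Kafka's concepts, but this is not the Kafka client API):

```python
# Conceptual sketch of the core Kafka abstraction: an append-only log
# that each consumer group reads at its own offset, so services stay
# in sync without blocking one another.

class Log:
    def __init__(self):
        self.records = []
        self.offsets = {}  # one read position per consumer group

    def append(self, record):
        self.records.append(record)

    def poll(self, group):
        start = self.offsets.get(group, 0)
        batch = self.records[start:]
        self.offsets[group] = len(self.records)  # commit the new offset
        return batch

topic = Log()
topic.append("order-1")
topic.append("order-2")
print(topic.poll("billing"))   # ['order-1', 'order-2']
topic.append("order-3")
print(topic.poll("billing"))   # ['order-3'], resumes from its offset
print(topic.poll("shipping"))  # ['order-1', 'order-2', 'order-3']
```

Real Kafka adds partitioning, replication, and retention on top of this, which is where the availability and resiliency properties come from.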

In this webinar session, you’ll learn about the use of Apache Kafka as a platform for streaming data and how stream processing can make your data systems more flexible and less complex.

Register now to learn:

  • Advantages of event stream processing over batch processing
  • Common stream processing use cases
  • High-level differences between Kafka Streams and ksqlDB

Part 2: Stream Processing with Kafka Streams

An event streaming platform would not be complete without the ability to manipulate data as it arrives. The Streams API within Apache Kafka is a powerful, lightweight library that allows for on-the-fly processing, letting you aggregate data, create windowing parameters, perform joins of data within a stream, and more. Perhaps best of all, it is built as a Java library that runs inside your application on top of Kafka, keeping your workflow intact with no extra clusters to maintain.
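To make the windowing idea concrete, here is a plain-Python sketch (not the Kafka Streams DSL) of a tumbling-window aggregation: counting events per key in fixed 60-second windows. The window size and event data are illustrative assumptions:

```python
# Conceptual sketch of a tumbling-window aggregation, the kind of
# operation the Kafka Streams DSL expresses declaratively.

from collections import defaultdict

WINDOW_SECONDS = 60  # hypothetical window size

def windowed_counts(events):
    """Count (key, timestamp) events per key within fixed windows."""
    counts = defaultdict(int)
    for key, timestamp in events:
        # Each event falls into the window that starts at the nearest
        # lower multiple of the window size.
        window_start = timestamp - (timestamp % WINDOW_SECONDS)
        counts[(key, window_start)] += 1
    return dict(counts)

events = [("page_view", 5), ("page_view", 42), ("click", 50), ("page_view", 65)]
print(windowed_counts(events))
# {('page_view', 0): 2, ('click', 0): 1, ('page_view', 60): 1}
```

In Kafka Streams this would be a few chained DSL calls over a stream rather than a loop over a list, with the library handling state, fault tolerance, and out-of-order events.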

Register now to learn:

  • How Kafka Streams integrates with your applications
  • The purpose and features of Kafka Streams
  • What an application using the Streams DSL (Domain-Specific Language) looks like

Part 3: Introduction to ksqlDB

You’ve got streams of data that you want to process and store? You’ve got events from which you’d like to derive state or build aggregates? And you want to do all of this in a scalable and fault-tolerant manner? You’re in luck.

ksqlDB enables you to build event streaming applications with the same ease and familiarity of building traditional applications on a relational database. It also simplifies the underlying architecture for these applications so you can build powerful, real-time systems with just a few SQL statements.

In this online talk, you’ll learn the concepts and capabilities of ksqlDB. You’ll see how you can apply transformations to a stream of events from one Kafka topic to another, as well as how to use ksqlDB connectors to bring in data from other systems and use that data to join and enrich streams.
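The "derive state from events" idea rests on ksqlDB's stream/table duality: a table is the latest value per key obtained by replaying a stream, and a pull query is a point-in-time lookup against that materialized state. A dependency-free sketch of those semantics (the keys and values are made up; real ksqlDB does this with SQL statements):

```python
# Conceptual sketch of stream/table duality: replaying a stream of
# (key, value) events materializes a table of the latest value per key.

def materialize(stream):
    table = {}
    for key, value in stream:  # later events overwrite earlier ones
        table[key] = value
    return table

stream = [("user-1", "bronze"), ("user-2", "silver"), ("user-1", "gold")]
table = materialize(stream)

# A "pull query" is a lookup of the current state for one key:
print(table["user-1"])  # gold
```

A push query, by contrast, would keep emitting results as new events arrive, i.e., it subscribes to the stream itself rather than reading the snapshot.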

Register now to learn:

  • What is ksqlDB and how does it work?
  • ksqlDB use cases, architecture and components
  • How to process streams of events
  • The semantics of streams and tables and of push and pull queries
  • How to use the ksqlDB API to get state directly from the materialized store
  • What makes ksqlDB elastically scalable and fault-tolerant

Related Links

How Confluent Completes Apache Kafka eBook

Leverage a cloud-native service 10x better than Apache Kafka

Confluent Developer Center

Spend less on Kafka with Confluent, come see how