
Getting Started with Python for Apache Kafka



It's becoming increasingly important for companies to be able to respond to the events that affect their business in real time. Apache Kafka® has become the de facto standard for meeting the need for real-time data processing.

If you're a Python developer, our free Apache Kafka for Python Developers course will show you how to harness the power of Kafka in your applications. You will learn how to build Kafka producer and consumer applications, how to work with event schemas and take advantage of Confluent Schema Registry, and more. Follow along in each module as Dave Klein, Senior Developer Advocate at Confluent, covers all of these topics in detail. Hands-on exercises occur throughout the course to solidify concepts as they are presented. By the end of the course, you will have the knowledge you need to begin developing Python applications that stream data to and from Kafka clusters.

Introduction to Apache Kafka for Python

The course begins with an introduction that explains why Python is becoming such a popular language for developing Kafka client applications. You will learn about several benefits that Kafka developers gain by using the Python language. This module will also introduce you to the Python classes you will use during the rest of the course as you develop Kafka producer and consumer client applications.

Hands on: set up the exercise environment for Confluent Cloud and Python

Throughout this course, we’ll introduce you to developing Apache Kafka event streaming apps with Python through hands-on exercises that will have you produce data to and consume data from Confluent Cloud. This first exercise gets you set up for those that follow by completing these tasks:

  • Sign up for Confluent Cloud if you haven’t already done so

  • Set up a Kafka cluster in Confluent Cloud that you will stream data to and from

  • Set up Confluent Streams Governance in support of producing and consuming data using schemas

  • Set up a Python dictionary that defines the connection details required by your client applications to connect with the Kafka cluster in Confluent Cloud

  • Install the Confluent Kafka Python libraries on your local machine (a sketch of these last two steps follows this list)
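For reference, a minimal sketch of the installation and connection dictionary might look like the following. The placeholder values assume a Confluent Cloud cluster reached over SASL_SSL with an API key and secret; substitute the values generated for your own cluster.

```python
# Install the client library (run in a terminal, not in Python):
#   pip install confluent-kafka

# Hypothetical connection settings for a Confluent Cloud cluster.
# Replace the placeholders with your bootstrap server address and the
# API key/secret created for the cluster.
config = {
    'bootstrap.servers': '<BOOTSTRAP_SERVER>',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': '<CLUSTER_API_KEY>',
    'sasl.password': '<CLUSTER_API_SECRET>',
}
```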

The Python Producer class

In this module, you will learn how to send events to Kafka topics using the Python Producer class.
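As a rough sketch of what that looks like, the snippet below reuses the hypothetical config dictionary from the setup sketch and produces a single string event. The topic name hello_topic and the delivery callback are illustrative, not part of the course material.

```python
from confluent_kafka import Producer

producer = Producer(config)  # config: the connection dictionary from the setup sketch

def delivery_callback(err, msg):
    # Called once per message to report delivery success or failure.
    if err is not None:
        print(f'Delivery failed: {err}')
    else:
        print(f'Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}')

producer.produce('hello_topic', key='key_1', value='hello world', callback=delivery_callback)
producer.poll(0)   # serve any queued delivery callbacks
producer.flush()   # wait for outstanding messages before exiting
```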

Hands on: use the Python Producer class

In this exercise, you will use the Producer class to write events to a Kafka topic in Confluent Cloud.

The Python Consumer class

In this module, you will learn how to read events from Kafka topics using the Python Consumer class.
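A minimal consumer loop, again assuming the hypothetical config dictionary and hello_topic topic from the earlier sketches, might look like this. Note that a consumer additionally needs a group.id and, typically, an auto.offset.reset setting.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    **config,                        # the connection dictionary from the setup sketch
    'group.id': 'hello_group',       # consumers with the same group.id share the work
    'auto.offset.reset': 'earliest'  # start from the beginning if no committed offset exists
})
consumer.subscribe(['hello_topic'])

try:
    while True:
        msg = consumer.poll(1.0)     # wait up to one second for a record
        if msg is None:
            continue
        if msg.error():
            print(f'Consumer error: {msg.error()}')
            continue
        print(f'{msg.key()}: {msg.value().decode("utf-8")}')
except KeyboardInterrupt:
    pass
finally:
    consumer.close()                 # commit offsets and leave the group cleanly
```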

Hands on: use the Python Consumer class

In this exercise, you will use the Consumer class to read events from a Kafka topic in Confluent Cloud.

Integrate Python clients with Schema Registry

In this module, you will learn how to integrate applications that use the Python Producer and Consumer classes with Confluent Schema Registry.

Hands on: use the Python Producer class with Schemas

In this exercise, you will define a JSON schema and then produce events using a Producer, a JSONSerializer, and Schema Registry.
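The snippet below is a sketch of that flow. The Temperature schema, the temp_readings topic, and the Schema Registry credentials are placeholders for illustration, not the schema used in the course.

```python
from confluent_kafka import Producer
from confluent_kafka.serialization import SerializationContext, MessageField
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.json_schema import JSONSerializer

# Hypothetical JSON schema describing the event value.
schema_str = """
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Temperature",
  "type": "object",
  "properties": {
    "city": {"type": "string"},
    "reading": {"type": "number"}
  },
  "required": ["city", "reading"]
}
"""

schema_registry_client = SchemaRegistryClient({
    'url': '<SCHEMA_REGISTRY_ENDPOINT>',
    'basic.auth.user.info': '<SR_API_KEY>:<SR_API_SECRET>',
})
json_serializer = JSONSerializer(schema_str, schema_registry_client)

producer = Producer(config)  # the connection dictionary from the setup sketch
value = {'city': 'Austin', 'reading': 27.5}
producer.produce(
    topic='temp_readings',
    key='Austin',
    value=json_serializer(value, SerializationContext('temp_readings', MessageField.VALUE)),
)
producer.flush()
```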

Hands on: use the Python Consumer class with Schemas

In this exercise, you will consume the events you produced during the previous exercise and use the JSONDeserializer to turn those events into objects you can work with in your Python application.
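As a sketch, assuming the same hypothetical schema, topic, and config dictionary as in the producer sketch above, the deserializing side might look like this:

```python
from confluent_kafka import Consumer
from confluent_kafka.serialization import SerializationContext, MessageField
from confluent_kafka.schema_registry.json_schema import JSONDeserializer

json_deserializer = JSONDeserializer(schema_str)  # same schema string as the producer sketch

consumer = Consumer({**config, 'group.id': 'temp_group', 'auto.offset.reset': 'earliest'})
consumer.subscribe(['temp_readings'])

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue
    # Turn the raw bytes back into a Python dict validated against the schema.
    reading = json_deserializer(msg.value(), SerializationContext(msg.topic(), MessageField.VALUE))
    if reading is not None:
        print(f"{reading['city']}: {reading['reading']}")
```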

The Python AdminClient class

In this module, you will learn how to complete simple Kafka cluster administrative tasks using the Python AdminClient class.

Hands on: use AdminClient to create a topic and alter its configuration

In this exercise, you will use the AdminClient class to create a new Kafka topic and alter one of its configuration properties.
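A sketch of both operations, using an illustrative topic name, partition count, and retention setting, might look like the following; create_topics and alter_configs each return a dict of futures, so calling result() makes the operation synchronous.

```python
from confluent_kafka.admin import AdminClient, NewTopic, ConfigResource

admin = AdminClient(config)  # the connection dictionary from the setup sketch

# Create a topic with an illustrative partition count and replication factor.
new_topic = NewTopic('hello_topic', num_partitions=6, replication_factor=3)
created = admin.create_topics([new_topic])
created['hello_topic'].result()  # block until the topic exists (or raise on failure)

# Alter one configuration property, e.g. lower the retention period.
resource = ConfigResource(ConfigResource.Type.TOPIC, 'hello_topic',
                          set_config={'retention.ms': '259200000'})
altered = admin.alter_configs([resource])
altered[resource].result()       # block until the change is applied
```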

Beyond simple Python apps

In this module, you will learn how you can use Python to satisfy the requirements of more complex Kafka event streaming use cases. Follow along as Dave Klein covers what your next steps might be for applying the Python development skills you learned in this course.

Next steps

Learn more about Apache Kafka for Python developers in this free course on Confluent Developer!

  • Dave Shook is a senior curriculum developer at Confluent. He previously worked as an instructor for Confluent and as a curriculum developer and instructor at CA Technologies. Most recently, Dave collaborated with Jun Rao in writing the Apache Kafka Internal Architecture course. In his spare time, Dave enjoys many outdoor activities including hiking, cycling, and kayaking as well as spending time with his grandchildren.
