Version 1.3.2
Plugin type: Source
Enterprise support: Confluent supported
Verification: Confluent built
Author: Confluent, Inc.

Kafka Connect Azure Blob Storage Source

The Azure Blob Storage Source connector integrates Azure Blob Storage with Apache Kafka. It reads data exported to Azure Blob Storage by the Kafka Connect Azure Blob Storage Sink connector and publishes it back to a Kafka topic in Avro, JSON, or ByteArray format. Depending on your environment, the connector can guarantee at-least-once delivery semantics to consumers of the Kafka topics it produces.

The Azure Blob Storage Source connector periodically polls data from Azure Blob Storage and pushes it to Kafka. Depending on the format and partitioner used to write the data to Azure Blob Storage, the connector can write to the destination topic using the same partitions as the original messages and can maintain the same message order. The connector selects folders based on the partitioner configuration and reads each folder's Azure Blob Storage objects in alphabetical order.
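For example, data written by the sink connector with its default partitioner is typically laid out as one folder per Kafka partition, and the source connector walks those folders and reads the objects in each one in order. The container listing below is purely illustrative (the topic name, offsets, and topics directory are assumptions, based on the sink's default naming):

topics/pageviews/partition=0/pageviews+0+0000000000.avro
topics/pageviews/partition=0/pageviews+0+0000001000.avro
topics/pageviews/partition=1/pageviews+1+0000000000.avro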


Installation

Confluent Hub CLI installation

Use the Confluent Hub client to install this connector with:
confluent-hub install confluentinc/kafka-connect-azure-blob-storage-source:1.3.2

Download installation

Or download the ZIP file and extract it into one of the directories listed in the Connect worker's plugin.path configuration property. This must be done on every machine where Connect will run.
By downloading you agree to the terms of use and software license agreement.
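As a rough sketch (the ZIP file name and directory below are illustrative, not prescriptive), the manual installation amounts to extracting the archive into a plugin directory and pointing the worker at it:

unzip confluentinc-kafka-connect-azure-blob-storage-source-1.3.2.zip -d /usr/local/share/kafka/plugins/
# worker configuration (connect-distributed.properties or connect-standalone.properties):
# plugin.path=/usr/local/share/kafka/plugins

Restart the Connect worker after extracting so it picks up the new plugin.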

Configure an instance of your connector

Once the connector is installed, create a connector configuration file with the connector's settings and deploy it to a Connect worker.
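For example, with a distributed Connect worker listening on the default REST port 8083, the configuration can be submitted as JSON to the Connect REST API. This is a minimal sketch: the connector class and azblob.* property names are recalled from the connector's configuration reference and should be verified against the documentation for your version, and the placeholder credentials, container, and bootstrap servers are assumptions to replace with your own values.

curl -X POST -H "Content-Type: application/json" \
  --data '{
    "name": "azure-blob-storage-source",
    "config": {
      "connector.class": "io.confluent.connect.azure.blob.storage.AzureBlobStorageSourceConnector",
      "tasks.max": "1",
      "azblob.account.name": "<storage-account-name>",
      "azblob.account.key": "<storage-account-key>",
      "azblob.container.name": "<container-name>",
      "format.class": "io.confluent.connect.azure.blob.storage.format.avro.AvroFormat",
      "confluent.topic.bootstrap.servers": "localhost:9092"
    }
  }' \
  http://localhost:8083/connectors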

Support

This connector is supported by Confluent as part of a Confluent Platform subscription.