Kinetica Joins Confluent Partner Program and Releases Confluent Certified Connector for Apache Kafka™

Chris Prendergast

This guest post is written by Chris Prendergast, VP of Business Development and Alliances at Kinetica.

Today, we’re excited to announce that we have joined the Confluent Partner Program and completed development and certification of our Apache Kafka™ connector. The connector lets you read and write data directly between Kafka and Kinetica’s GPU-accelerated, in-memory analytics database, so you can ingest real-time data streams from Apache Kafka and take immediate action on incoming data.

Joint customers can now ingest streaming data from sensors, mobile apps, connected devices, and social media into Kinetica, combine it with data at rest, and analyze it in real time to improve customer experience, deliver targeted marketing offers, and improve operational efficiency.

The Certified Kinetica Connector enables you to:

  • Easily leverage Kinetica’s GPU-accelerated, in-memory analytics database with Kafka for streaming analytics, so you can power real-time decision making.
  • Gain powerful insights by using Kinetica for machine learning, deep learning, and OLAP on real-time, streaming data.
  • Develop robust data integration based on Kafka’s Connect API.
  • Build stream processing applications with the Streams API in Kafka.

The source code for the connector is available here: https://github.com/kineticadb/kinetica-connector-kafka

The connector contains two classes that integrate the Kinetica database with Kafka:

  • KineticaSourceConnector: A Kafka source connector, which receives a data stream from the Kinetica database via a table monitor. Data is streamed in flat Kafka Connect “Struct” format, with one field per table column. A separate Kafka topic is created for each configured database table.
  • KineticaSinkConnector: A Kafka sink connector, which receives a data stream from a Kafka source connector and writes it to the Kinetica database. Streamed data must be in a flat Kafka Connect “Struct” that uses only supported data types for its fields (BYTES, FLOAT64, FLOAT32, INT32, INT64, and STRING). No translation is performed on the data; it is streamed directly into a table. The target table and collection are created if they do not exist.
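The sink’s flat-Struct requirement can be illustrated with a small sketch. Plain Python dicts stand in here for Kafka Connect Structs, and the field names are hypothetical; the point is simply that every field must map to one of the supported Connect types, with no nesting.

```python
# Sketch: check that a flat record uses only the Connect types the sink
# accepts (BYTES, FLOAT64, FLOAT32, INT32, INT64, STRING). Python dicts
# stand in for Kafka Connect Structs; field names are hypothetical.

SUPPORTED = {
    bytes: "BYTES",
    float: "FLOAT64",  # Python floats are double precision (FLOAT64)
    int: "INT64",      # Python ints map to INT64 here; INT32/FLOAT32 need explicit schemas
    str: "STRING",
}

def connect_schema(record: dict) -> dict:
    """Map each field to its Connect type, rejecting nested or unsupported values."""
    schema = {}
    for field, value in record.items():
        if isinstance(value, bool) or type(value) not in SUPPORTED:
            raise TypeError(f"field {field!r}: unsupported type {type(value).__name__}")
        schema[field] = SUPPORTED[type(value)]
    return schema

# A flat sensor reading (hypothetical fields) passes:
print(connect_schema({"sensor_id": "s-42", "reading": 21.5, "ts": 1700000000}))
# -> {'sensor_id': 'STRING', 'reading': 'FLOAT64', 'ts': 'INT64'}

# A nested value would be rejected, since the Struct must be flat:
# connect_schema({"sensor_id": "s-42", "meta": {"lat": 1.0}})  # raises TypeError
```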

The Kinetica Connector can be deployed into any Confluent cluster from the Control Center GUI or from the command line using the Kafka Connect REST API. The Kafka Connect API ensures fault-tolerant integration between the Kafka topic stream and Kinetica. For example, retailers can use the Kinetica connector to capture real-time, streaming geospatial data from shoppers’ mobile phones as Kafka streams, combine it with customer loyalty data in Kinetica, and push out targeted, personalized, location-based offers through mobile apps.
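Command-line deployment through the Kafka Connect REST API amounts to POSTing a JSON payload to a Connect worker (port 8083 by default). A minimal sketch of building that payload follows; note that the connector class name and the Kinetica-specific property names below are illustrative assumptions, so check the connector’s README for the actual configuration keys.

```python
import json

# Sketch of a deployment payload for the Kafka Connect REST API.
# The connector class name and the kinetica.* property names are
# illustrative assumptions, not taken from the connector's docs.
payload = {
    "name": "kinetica-sink",
    "config": {
        "connector.class": "com.kinetica.kafka.KineticaSinkConnector",
        "tasks.max": "1",
        "topics": "sensor-readings",
        "kinetica.url": "http://localhost:9191",
        "kinetica.table_name": "sensor_readings",
    },
}

body = json.dumps(payload)
print(body)

# With a Connect worker running, the payload would be POSTed to it, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://localhost:8083/connectors
```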

You can now seamlessly add Kinetica to your scalable and secure stream data pipelines. Kinetica’s GPU-accelerated, distributed, in-memory analytics database provides truly real-time response to queries on large, complex and streaming data sets.

Combined with the Confluent enterprise-grade streaming data platform, this powerful solution will help you capitalize on streaming data to power real-time decision making and drive your business results.
