May Preview Release: Advancing KSQL and Schema Registry

Rohan Desai and Ran Ma

We are very excited to announce the Confluent Platform May 2018 Preview release! The May Preview introduces powerful new capabilities for KSQL and the Schema Registry UI. Read on to learn more, and remember to share your feedback and help shape Confluent software! You can do that by visiting the Confluent Community Slack channel (particularly the #ksql and #control-center channels) or by contributing to the KSQL project on GitHub, where you can file issues, submit pull requests, and contribute to discussions.

Download The Preview Release

 

Confluent Control Center

Schema Registry

Schema Registry management has been one of the most requested features from our customers. In this preview release, we’re introducing the new Schema Registry UI, which allows users to see the schema per topic, along with its version history, and easily compare between a previous schema and the current one.

The new UI is designed to help the operations team with schema management and allow beginners to learn about Schema Registry.

To access a topic’s schema, simply navigate to the new SCHEMA tab on the topic details page, or click “•••” and select “Schema”.

Notice the “Value” and “Key” tabs, which show the schemas for both the key and the value of the messages in the topic. You can view all versions of the schema by clicking the “Version History” button and compare any previous version against the current one. You can also download the schema by clicking the “Download” button at the top right.

KSQL Editor Supports Autocompletion

We’ve added autocomplete to the KSQL query editor to help you compose queries faster. No more guessing at available streams, tables, or specific KSQL syntax: autocomplete assists you as you type.

KSQL

INSERT INTO

INSERT INTO is a new statement that lets you write query output into an existing stream. It is not currently supported for tables. You can use INSERT INTO to merge the output of multiple queries into a single output stream.

For example, suppose you are a retailer with separate streams for online and in-store sales. You want to compute your daily total sales for different items. You can use INSERT INTO to populate a stream for all sales and aggregate that stream:
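A sketch of this pattern, where the stream and column names (`online_sales`, `instore_sales`, `sale_amount`, and so on) are illustrative rather than taken from a real deployment:

```sql
-- Create the combined stream from one source...
CREATE STREAM all_sales AS
  SELECT item_id, sale_amount FROM online_sales;

-- ...then merge the other source into it with INSERT INTO.
INSERT INTO all_sales
  SELECT item_id, sale_amount FROM instore_sales;

-- Aggregate the combined stream into daily per-item totals.
CREATE TABLE daily_sales AS
  SELECT item_id, SUM(sale_amount) AS total_sales
  FROM all_sales
  WINDOW TUMBLING (SIZE 24 HOURS)
  GROUP BY item_id;
```

Note that the target of INSERT INTO must already exist and have a schema compatible with the query's result.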

CP Docker Images for KSQL

Confluent Platform Docker images are now available for the preview versions of both the KSQL server and KSQL CLI. You can use the confluentinc/cp-ksql-server image to deploy KSQL servers in interactive (default) or headless mode. You can use the confluentinc/cp-ksql-cli image to start a KSQL CLI session inside a Docker container.
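A minimal sketch of running both images, assuming a reachable Kafka broker at `kafka:9092`; the image tag is a placeholder for the preview version, and the `KSQL_`-prefixed environment variables follow Confluent's Docker image convention of mapping to server configuration properties:

```shell
# Start a KSQL server in interactive mode (the default)
docker run -d \
  -p 8088:8088 \
  -e KSQL_BOOTSTRAP_SERVERS=kafka:9092 \
  -e KSQL_LISTENERS=http://0.0.0.0:8088/ \
  confluentinc/cp-ksql-server:<preview-version>

# Attach a CLI session to that server
docker run -it confluentinc/cp-ksql-cli:<preview-version> http://localhost:8088
```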

Documentation for these images can be found at docs.confluent.io.

Going forward, we’ll continue to release these Docker images for each preview release as well as for each Confluent Platform stable release.

Topic and Schema Cleanup

The DROP statement for streams and tables now supports an option for also deleting the underlying Kafka topic and, for streams and tables in Avro format, the registered Avro schema. To have DROP clean up topics and schemas, add the DELETE TOPIC clause to your DROP statement:
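For example (the stream and table names here are illustrative):

```sql
-- Drops the stream, deletes its backing Kafka topic, and, for an
-- Avro stream, removes the registered schema from Schema Registry
DROP STREAM pageviews_enriched DELETE TOPIC;

-- The same clause works for tables
DROP TABLE user_counts DELETE TOPIC;
```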

This ensures you don’t leave stale topics and schemas behind as you create and drop streams and tables, which is particularly helpful during iterative development and testing. If you do want to keep your topics and schemas, say for consumption by some other system, then simply omit the DELETE TOPIC clause from your statement:
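For example (again with an illustrative stream name):

```sql
-- Drops only the stream; the Kafka topic and any registered
-- schema remain available for other consumers
DROP STREAM pageviews_enriched;
```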

Where to go from here

Try out the new Confluent Platform May 2018 Preview release and share your feedback! Here’s what you can do to get started:

Download The Preview Release


