
Announcing ksqlDB 0.18.0

We’re pleased to announce ksqlDB 0.18.0! This release includes pull queries on table-table joins and support for variable substitution in the Java client and ksqlDB’s migration tool. We’ll step through the most notable changes, but check out the changelog for a complete list of features and fixes.

Pull queries on even more tables

In ksqlDB 0.17.0, we released pull queries on materialized views created using the CREATE TABLE AS SELECT statement. We are now extending pull queries to materialized views created using table-table joins. Here is an example that builds a materialized view, INNER_JOIN, from a table-table join:

CREATE TABLE LEFT_TABLE (id BIGINT PRIMARY KEY, name VARCHAR, value BIGINT)
 WITH (kafka_topic='left_topic', value_format='json', partitions=4);

CREATE TABLE RIGHT_TABLE (id BIGINT PRIMARY KEY, f1 VARCHAR, f2 BIGINT)
 WITH (kafka_topic='right_topic', value_format='json', partitions=4);

CREATE TABLE INNER_JOIN AS SELECT L.id, name, value, f1, f2 
FROM LEFT_TABLE AS L JOIN RIGHT_TABLE AS R ON L.id = R.id;

You can then fetch the current state of your materialized view INNER_JOIN by using a pull query:

SELECT * FROM INNER_JOIN [ WHERE where_condition ];

For more information on pull queries, read the documentation.
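Pull queries can also be issued programmatically with ksqlDB's Java client. The following is a minimal sketch, assuming a ksqlDB server reachable at localhost on port 8088 (placeholder values for your deployment):

```java
import io.confluent.ksql.api.client.BatchedQueryResult;
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;
import io.confluent.ksql.api.client.Row;

public class PullQueryExample {
    public static void main(String[] args) throws Exception {
        // Connect to a ksqlDB server; host and port are placeholders.
        ClientOptions options = ClientOptions.create()
            .setHost("localhost")
            .setPort(8088);
        Client client = Client.create(options);

        // Issue a pull query against the materialized view built
        // from the table-table join above.
        BatchedQueryResult result =
            client.executeQuery("SELECT * FROM INNER_JOIN WHERE id = 1;");
        for (Row row : result.get()) {
            System.out.println(row.values());
        }

        client.close();
    }
}
```

Because a pull query returns the current state rather than a continuous stream, executeQuery delivers a finite batch of rows that the client can iterate over once.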

More ways to substitute variables

You can now use variable substitution in the Java client and the ksqlDB migration tool.

The Java client now contains two methods, define and undefine, which work similarly to ksqlDB’s DEFINE and UNDEFINE statements. Here is an example of how they can be used:

// Define some variables
client.define("topic", "people");
client.define("format", "json");

// Execute the statement with all the variables replaced. This is equivalent to executing
// CREATE STREAM S (NAME STRING, AGE INTEGER) WITH (kafka_topic='people', value_format='json');
client.executeStatement("CREATE STREAM S (NAME STRING, AGE INTEGER) WITH (kafka_topic='${topic}', value_format='${format}');");

// Undefine a variable
client.undefine("topic");

// Retrieve a map of all currently defined variables. After undefining "topic" above,
// this returns {format=json}
Map<String, Object> variables = client.getVariables();
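Conceptually, variable substitution is a textual replacement of ${name} placeholders before the statement reaches the engine. The following is a simplified illustration of that behavior (not ksqlDB's actual implementation); the class and method names are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VarSubstitutionSketch {
    // Replaces each ${name} placeholder with the matching value from the map.
    // In this sketch, placeholders with no defined value are left untouched.
    static String substitute(String sql, Map<String, String> vars) {
        Pattern p = Pattern.compile("\\$\\{(\\w+)\\}");
        Matcher m = p.matcher(sql);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String replacement = vars.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> vars = new HashMap<>();
        vars.put("topic", "people");
        vars.put("format", "json");
        // Prints: CREATE STREAM S (NAME STRING, AGE INTEGER)
        //         WITH (kafka_topic='people', value_format='json');
        System.out.println(substitute(
            "CREATE STREAM S (NAME STRING, AGE INTEGER) "
                + "WITH (kafka_topic='${topic}', value_format='${format}');",
            vars));
    }
}
```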

To define variables in a migration, you can either add DEFINE statements in your migration file, or you can pass the --define flag when running the migration tool’s apply command.

Consider the following migration file:

DEFINE FIELD='NAME';
CREATE STREAM S (${FIELD} STRING, AGE INTEGER) WITH (kafka_topic='${topic}', value_format='${format}');

To apply the above migration, run:

ksql-migrations --config-file /my/migrations/project/ksql-migrations.properties apply --next --define topic=people --define format=json

The tool will then execute the resulting statement:

CREATE STREAM S (NAME STRING, AGE INTEGER) WITH (kafka_topic='people', value_format='json');

Get started with ksqlDB

ksqlDB 0.18.0 widens support for pull queries and variable substitution. For a complete list of changes, refer to the changelog.

Get started with ksqlDB today, via the standalone distribution or with Confluent, and join the community to ask a question and find new resources.


Zara Lim is an engineer on the ksqlDB team at Confluent. She joined in 2020 after completing her bachelor’s degree in engineering physics at the University of British Columbia.
