Data Serialization

Changing Data Serialization Format from JSON to Avro

Data comes in many formats, and it is common to need the contents of a JSON topic available in Avro. KSQL provides a powerful way for developers to reserialize the data in a Kafka topic: define a new stream with the desired serialization, backed by a new topic, and populate it with the streaming events of the original topic.

The fantastic thing here is that these are streaming transformations: not only is the data already on the source topic converted, but so is every message that subsequently arrives on it.
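One caveat worth noting: by default, KSQL queries read from the latest offset, so a newly created derived stream only picks up new messages. To have it include the data already on the source topic, set the offset reset property in the KSQL CLI before creating the stream:

```sql
-- Process the source topic from the beginning, not just new arrivals
SET 'auto.offset.reset' = 'earliest';
```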

Directions

Convert a topic from JSON to Avro.

Here is the source data, in a topic called mysql_users:

{"uid":1,"name":"Cliff","locale":"en_US","address_city":"St Louis","elite":"P"}

1. Define the source topic’s schema:

CREATE STREAM source_json (uid INT, name VARCHAR, locale VARCHAR, address_city VARCHAR, elite VARCHAR)
  WITH (KAFKA_TOPIC='mysql_users', VALUE_FORMAT='JSON');
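Before deriving the Avro stream, it can be worth confirming that the declared schema maps onto the JSON data correctly. One way is to describe the stream and sample a few events from it (depending on your KSQL/ksqlDB version, the SELECT may also require `EMIT CHANGES` before the LIMIT):

```sql
-- Show the declared columns and their types
DESCRIBE source_json;

-- Sample a few events to confirm the fields deserialize as expected
SELECT uid, name, locale, address_city, elite FROM source_json LIMIT 3;
```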

2. Create a derived topic in Avro format:

CREATE STREAM target_avro
  WITH (KAFKA_TOPIC='mysql_users_avro', VALUE_FORMAT='AVRO') AS
  SELECT * FROM source_json;
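To verify the reserialized output, you can print the new topic from the KSQL CLI; PRINT deserializes the Avro values using the Schema Registry that KSQL is configured with:

```sql
-- Inspect the Avro-encoded output topic
PRINT 'mysql_users_avro' FROM BEGINNING;
```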