Developers writing event streaming applications can use Kafka Connect to capture events from source systems and then use the Kafka Streams API to transform that data. In building these pipelines, they need to consider the format and serialization of the data stream. For example, some Kafka Streams methods require record keys to be non-null, so either the connector or the application may need to add keys to the original event stream. Another consideration is how record keys and values are serialized: you must use the appropriate serializer/deserializer (SerDes) to convert data in each direction.
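To make those two considerations concrete, here is a minimal Kafka Streams sketch that derives a non-null key for each record and declares the SerDes explicitly on both the input and output topics. The topic names and the key-derivation logic are hypothetical placeholders, not taken from the white paper examples.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class KeyedPipelineSketch {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Read the source topic, stating the SerDes explicitly rather than
        // relying on the application's default SerDes configuration.
        KStream<String, String> events =
            builder.stream("source-events", Consumed.with(Serdes.String(), Serdes.String()));

        // The connector may have produced records with null keys, so derive a
        // non-null key from the value before any key-based processing.
        // (The comma-separated value layout here is purely illustrative.)
        KStream<String, String> keyed =
            events.selectKey((ignoredKey, value) -> value.split(",")[0]);

        // Write out with matching SerDes for the key and the value.
        keyed.to("keyed-events", Produced.with(Serdes.String(), Serdes.String()));

        // builder.build() would then be passed to new KafkaStreams(topology, props).
    }
}
```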
Download the white paper to dive into full Kafka examples, with connector configurations and Kafka Streams code, that demonstrate different data formats and SerDes combinations for building event streaming pipelines:
Example 1: Confluent CLI Producer with String
Example 2: JDBC source connector with JSON
Example 3: JDBC source connector with SpecificAvro
Example 4: JDBC source connector with GenericAvro
Example 5: Java producer with SpecificAvro
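As a taste of the SpecificAvro examples above, here is a minimal sketch of a Java producer configured with the Confluent Avro serializer and Schema Registry. The topic name, the generated `Location` class, and the local bootstrap and Schema Registry URLs are assumptions for illustration; the white paper contains the full, working versions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SpecificAvroProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.LongSerializer");
        // The Confluent Avro serializer registers the value schema in Schema Registry
        // and writes records in the wire format the matching deserializer expects.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        // Location is assumed to be a SpecificRecord class generated from an Avro schema.
        try (KafkaProducer<Long, Location> producer = new KafkaProducer<>(props)) {
            Location value = Location.newBuilder().setId(1L).setCity("Sydney").build();
            producer.send(new ProducerRecord<>("locations", value.getId(), value));
        }
    }
}
```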