
Presentation

Lowering the Barrier to Stream Processing

Current 2022

Providing an easy way for developers to use event streams without having to learn all the complexity of how to build a streaming application from scratch was key to the adoption of an event-driven architecture at Babylon.

Java/Scala engineers could be well catered for by KStreams and the like. But we wanted a super simple way for all of our Python developers to be able to write code that does simple things (like: react to an event, do some processing, and then optionally output some more events). They should only have to write a single (typed) function, and not have to worry about consumers, producers, serdes, or any of the other things that we Kafka folk love to harp on about.

We found that by using the "agent" concept in Faust we could provide our engineers with a "Function as a Service"-like experience specifically for processing events on Kafka streams. By ruthlessly focusing on the developer experience, we massively increased the number of developers using the data streams available at Babylon and reduced the time to deployment for a new stream processing app by an order of magnitude.
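To make the idea concrete, here is a minimal sketch of what that "single typed function" experience can look like with a Faust agent. The topic names, event schema, and broker address are illustrative placeholders, not Babylon's actual setup:

```python
import faust

# A typed event schema -- the only "contract" the developer has to define.
# (Field names here are hypothetical.)
class UserEvent(faust.Record, serializer='json'):
    user_id: str
    action: str

app = faust.App('demo-app', broker='kafka://localhost:9092')

# Illustrative input and output topics.
user_events = app.topic('user-events', value_type=UserEvent)
processed_events = app.topic('processed-events', value_type=UserEvent)

# The agent: react to an event, do some processing, optionally emit more events.
# Consumers, producers, and serdes are all handled by Faust behind the scenes.
@app.agent(user_events)
async def process(events):
    async for event in events:
        if event.action == 'signup':
            await processed_events.send(
                value=UserEvent(user_id=event.user_id, action='welcomed')
            )
```

From the developer's point of view, the body of `process` is the whole application; Faust manages the consumer group, offsets, and serialization around it.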
