What is one of the most popular patterns for using Kafka with AWS? Sending topic data to S3.
S3 is the ultimate data sink for long-term storage and analytics, which makes it a natural fit for storing Kafka topic data too.
Since storing data in S3 is so cheap and popular, enabling it for Kafka data opens up many use-cases:
- Drive analytics
- Enable Disaster Recovery (DR) for Kafka
- Migrate Kafka from on-prem to the cloud
- Develop safely in a separate dev Kafka cluster
- Save costs
Can you do it today? Yes, but it’s painful. You can use a connector and battle with a poor developer experience. And it can be very expensive if you use a non-open-source connector.
Can we do better? Enter AWS with Lenses.
In this talk we will show you:
- The latest feature-packed Lenses open-source S3 Source & Sink connectors
- Patterns & best practices for backing up to and restoring from S3
- How to get a seamless, one-click experience to back up and restore in Lenses 5.3
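To give a flavour of what the connectors look like in practice, here is a minimal sketch of a Kafka Connect configuration for an S3 sink. The connector class and property names follow the open-source Stream Reactor conventions, but treat the exact names, bucket, and topic as illustrative assumptions and check the connector documentation for your version:

```properties
# Hypothetical example config for the Lenses (Stream Reactor) S3 sink connector
name=s3-sink-backup
connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
tasks.max=1
topics=payments

# KCQL describes which topic to read and where/how to write it in S3
# (bucket "my-backup-bucket" and topic "payments" are placeholders)
connect.s3.kcql=INSERT INTO my-backup-bucket:payments SELECT * FROM payments STOREAS `JSON`
connect.s3.aws.region=eu-west-1

# Use the AWS default credentials chain (env vars, instance profile, etc.)
connect.s3.aws.auth.mode=Default
```

A matching source connector can then replay the stored objects back into a topic, which is the basis of the backup/restore patterns covered in the talk.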
Presenter
Adamos Loizou
Lenses.io

Adamos is a technologist at Lenses.io, currently working as a Product Manager. With a background in distributed systems and an interest in human-centric design, he works on taming complex, high-tech systems to solve human problems simply and holistically. He has been fortunate to work on software of many forms, languages and paradigms: from coding nuclear-turbine design tools in Java and IoT eHealth bracelets, all the way to petabyte-scale platforms-as-a-service in Scala and Python, his path has landed him in the world of streaming and Kafka. He spends a lot of time thinking about software principles, traces of which you can find in his blog at https://adamosloizou.github.io/. He hails from Athens, Greece, lives in London, UK, and can play a mean hand of Jungle Speed.