Do you want to know what streaming ETL actually looks like in practice? Or what you can REALLY do with Apache Kafka once you get going, using config and SQL alone?
This project integrates live data from the UK rail network via ActiveMQ, along with data from other sources, to build a fully functioning platform. It includes analytics through Elasticsearch, graph relationship exploration in Neo4j, and real-time alerts delivered through Telegram.
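To give a flavour of the "config alone" part: ingesting JMS messages from ActiveMQ into a Kafka topic is a declarative Kafka Connect job. The sketch below assumes Confluent's ActiveMQ source connector; the broker URL, credentials, and topic names are illustrative placeholders, not the talk's actual configuration.

```json
{
  "name": "source-activemq-rail",
  "config": {
    "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
    "activemq.url": "tcp://broker.example.net:61619",
    "activemq.username": "${file:/data/secrets.properties:user}",
    "activemq.password": "${file:/data/secrets.properties:password}",
    "jms.destination.type": "topic",
    "jms.destination.name": "TRAIN_MOVEMENTS",
    "kafka.topic": "rail_train_movements",
    "tasks.max": "1"
  }
}
```

No custom code is involved: POST this JSON to the Kafka Connect REST API and the connector streams each JMS message into the named Kafka topic.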
This talk will show how I built the system, and will include live demos and code samples of the salient integration points in ksqlDB and Kafka Connect.
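As a taste of the ksqlDB side, stream processing of the kind the talk demonstrates is expressed in SQL alone. The stream name, field names, and predicate below are hypothetical, a sketch of the pattern rather than the talk's actual code.

```sql
-- Declare a stream over a raw Kafka topic of train movement events
-- (hypothetical schema and topic name)
CREATE STREAM TRAIN_MOVEMENTS (
    TRAIN_ID         VARCHAR,
    EVENT_TYPE       VARCHAR,
    VARIATION_STATUS VARCHAR,
    LOCATION         VARCHAR
  ) WITH (KAFKA_TOPIC='rail_train_movements',
          VALUE_FORMAT='JSON');

-- Persistent query: continuously filter late-running trains into a new
-- topic, which a sink connector could then push on to Telegram,
-- Elasticsearch, or Neo4j
CREATE STREAM TRAINS_RUNNING_LATE AS
  SELECT TRAIN_ID, LOCATION, EVENT_TYPE
    FROM TRAIN_MOVEMENTS
   WHERE VARIATION_STATUS = 'LATE'
  EMIT CHANGES;
```

The second statement runs continuously on the ksqlDB server, so downstream consumers see late-train events as they happen rather than on a batch schedule.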
The data may be domain-specific, but the challenges of handling batch and stream data to drive both applications and analytics are encountered by many. This talk will give attendees plenty of concrete examples of patterns and techniques for integration and stream processing.