
Low-latency Real-time Data Processing at Giga-scale with Kafka

Data volumes continue to grow, demanding new, more scalable solutions for low-latency data processing. Previously, the default approach to deploying such systems was to throw a ton of hardware at the problem. That is no longer necessary: newer technologies are efficient enough to handle extreme workloads on smaller, more manageable clusters. Processing billions of events per second on Kafka can now be done with a modest investment in compute resources. In this session, you will learn how to architect and build data processing applications that scale linearly and deliver very low latency, and how to combine streaming data and reference data (data-in-motion and data-at-rest) with machine learning. We will take you through an end-to-end framework and example application, built on the Hazelcast Platform, an open source software engine designed for ultra-fast performance. We will also show how you can leverage SQL to explore the operational data in the solution, including querying Kafka topics and key-value data in the in-memory data store. Attendees will also get access to the GitHub sample application shown.
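To give a flavor of the SQL capability mentioned above, here is a minimal sketch of how Hazelcast SQL can expose a Kafka topic as a queryable table. The topic name `trades`, the column layout, and the broker address are illustrative assumptions, not taken from the sample application:

```sql
-- Map a Kafka topic into Hazelcast SQL
-- (topic name, columns, and broker address are illustrative)
CREATE MAPPING trades (
    symbol VARCHAR,
    price  DECIMAL,
    qty    BIGINT
)
TYPE Kafka
OPTIONS (
    'keyFormat'         = 'varchar',
    'valueFormat'       = 'json-flat',
    'bootstrap.servers' = 'localhost:9092'
);

-- Query the live stream directly, as if it were a table
SELECT symbol, price, qty
FROM trades
WHERE price > 100;
```

Key-value data held in Hazelcast's in-memory maps can be queried with the same `SELECT` syntax, which is what allows streaming records and reference data to be joined in one place.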

Moderator

John DesJardins

John DesJardins is Chief Technology Officer at Hazelcast, setting technology strategy for the Real-time Intelligent Applications Platform. His expertise in large-scale computing spans Big Data, the Internet of Things, Machine Learning, and Cloud. He is an active speaker and writer. John brings over 25 years of experience in architecting and implementing global-scale computing solutions with top Global 2000 companies at Hazelcast, Cloudera, Software AG, and webMethods. He holds a BS in Economics from George Mason University, where he first built predictive models, long before that was considered cool.