The business systems of an organization are a continuous source of events, and each system also needs to know about events happening in the other systems. Exchanging these events through direct API calls creates a web of interdependencies that is fragile and fails to scale. We examine how this problem can be solved by applying the right integration patterns, implemented as a lightweight event hub that leverages the power of Kafka and Confluent to operate at enterprise scale. We demonstrate how JavaScript, with its event-driven programming model, is a good fit for implementing an event hub that guarantees message delivery in the face of failures within individual subscriber systems.
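To illustrate the guaranteed-delivery idea, here is a minimal sketch of an event-hub subscriber in Node.js that commits Kafka offsets only after an event has been successfully handed to the downstream system, so an uncommitted event is redelivered after a failure. It assumes the kafkajs client library; the broker address, topic, group id, and deliverToSubscriber helper are hypothetical placeholders, not the talk's actual implementation.

```javascript
const { Kafka } = require('kafkajs');

const kafka = new Kafka({ clientId: 'event-hub', brokers: ['broker1:9092'] });
const consumer = kafka.consumer({ groupId: 'billing-subscriber' });

// Hypothetical hand-off to a downstream subscriber system.
async function deliverToSubscriber(message) {
  console.log(`delivering ${message.value.toString()}`);
}

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topic: 'orders', fromBeginning: false });

  await consumer.run({
    autoCommit: false, // commit only after the subscriber has processed the event
    eachMessage: async ({ topic, partition, message }) => {
      await deliverToSubscriber(message);
      // Committing (offset + 1) after successful processing gives at-least-once
      // delivery: if the subscriber fails first, the event is redelivered on restart.
      await consumer.commitOffsets([
        { topic, partition, offset: (BigInt(message.offset) + 1n).toString() },
      ]);
    },
  });
}

run().catch(console.error);
```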
Many organizations have large engineering teams skilled in Node.js and a multitude of Node.js applications. We show how these teams can easily leverage the power of Kafka and scale their applications with the right architectural building blocks. We also offer insights from our own experience of building Node.js-based Kafka applications.