In an integrated business environment where heterogeneous database technologies are deployed, Kafka Connect offers source and sink connectors that enable seamless integration, abstracting the details of data exchange away from senders and receivers. However, challenges arise as the integration grows in complexity.
Recently, the Enterprise Business Information Systems division at Jet Propulsion Laboratory was tasked with delivering an event-driven data exchange between two of its major systems. Delivering this solution successfully required resolving complex data dependencies across tables while respecting business and atomicity requirements. Beyond that, an enterprise data exchange also requires a robust feedback loop to identify, disseminate, and remediate errors in the process, maintaining data integrity and user trust.
In this talk, we will discuss how we overcame these challenges and delivered a fully automated and robust data exchange solution by extending Kafka Connect, leveraging ksqlDB streams/tables and aggregations, and developing custom microservices.
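To make the ksqlDB building blocks concrete, here is a minimal sketch of the pattern the talk refers to: a stream over exchange events and an aggregated table that supports a feedback loop. All topic, stream, and column names below are hypothetical illustrations, not the actual JPL implementation.

```sql
-- Hypothetical sketch: a stream over exchange events (e.g. produced by a
-- Kafka Connect connector or a microservice) ...
CREATE STREAM exchange_events (
  record_id VARCHAR KEY,
  status VARCHAR,
  error_detail VARCHAR
) WITH (KAFKA_TOPIC = 'exchange-events', VALUE_FORMAT = 'JSON');

-- ... and an aggregation that tracks the latest status and attempt count
-- per record, giving operators a queryable view for error remediation.
CREATE TABLE exchange_status AS
  SELECT record_id,
         LATEST_BY_OFFSET(status) AS latest_status,
         COUNT(*) AS attempts
  FROM exchange_events
  GROUP BY record_id
  EMIT CHANGES;
```

A pattern along these lines lets downstream services and dashboards query per-record exchange state instead of replaying the raw event stream.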