White Paper

Mainframe Integration - Use Case Implementation


Mainframes are fundamental to mission-critical applications across a range of industries. They’re reliable, secure, and able to manage huge volumes of concurrent transactions.

Despite their usefulness, mainframes have proven difficult to integrate into more modern, cloud-based architectures. This is because accessing mainframe data can be:

  1. Difficult – due to complex, legacy COBOL code developed over decades.
  2. Risky – due to the sensitivity of making changes to business-critical applications.
  3. Expensive – due to consumption-based and network billing models.

Ultimately, this makes innovating with mainframe data more challenging. This is where streaming data pipelines with Confluent come in.

By building streaming data pipelines from the mainframe, you unlock mainframe data for use in real-time applications across cloud-native, distributed systems, without causing any disruption to existing mission-critical workloads. By creating a forward cache of your mainframe data, you also significantly reduce mainframe costs.
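The forward-cache idea above can be sketched in a few lines: change events streamed off the mainframe are folded into a key-value cache, and reads are then served from the cache instead of the mainframe. This is a minimal illustration using in-memory dicts in place of a real Kafka consumer and CDC connector; the event shape, keys, and values are hypothetical.

```python
# Minimal sketch of a "forward cache" fed by change events.
# In production, events would arrive on a Kafka topic populated by a
# mainframe CDC connector; here they are plain dicts for illustration.

def apply_event(cache, event):
    """Apply one change event (insert/update/delete) to the cache."""
    key = event["key"]
    if event["op"] == "delete":
        cache.pop(key, None)
    else:
        # Insert or update: keep only the latest value per key.
        cache[key] = event["value"]
    return cache

def build_cache(events):
    """Fold a stream of change events into a key -> value cache."""
    cache = {}
    for event in events:
        apply_event(cache, event)
    return cache

# Hypothetical account-balance changes streamed off the mainframe.
events = [
    {"op": "insert", "key": "acct-001", "value": {"balance": 100}},
    {"op": "update", "key": "acct-001", "value": {"balance": 250}},
    {"op": "insert", "key": "acct-002", "value": {"balance": 75}},
    {"op": "delete", "key": "acct-002", "value": None},
]
cache = build_cache(events)
# Reads now hit the cache, not the mainframe, reducing consumption costs.
```

Because each read is answered from the cache rather than by a query against the mainframe, the mainframe only pays the cost of emitting each change once, regardless of how many downstream consumers read the data.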

In this paper, we’ll detail the most common patterns for integrating your mainframe with Confluent. 

