
Presentation

Modern Data Flow: A Better Way of Building Data Pipelines

Current 2022

Data pipelines perform much of the undifferentiated heavy lifting in any data-intensive application. Yet despite being critical to the data value stream, pipelines haven't always served us well. What can we learn from recent trends in the data platform space to help us build better pipelines?

Recent trends, from Data Mesh to the explosion of new data platform products and services that make up the modern data stack, have unlocked the ability for all parts of an organization to contribute reusable data capabilities to the data value stream. Practices that have emerged over the last twenty years in the adjacent areas of software architecture and development are now influencing the ways in which organizations manage, share, and derive value from their data using data pipelines. From an analysis of these trends and modern practices, we've derived five principles for building better pipelines, which we call Modern Data Flow: a better way of building data pipelines.

In this session, we'll review the Modern Data Flow principles and discuss them in the context of trends in the data landscape and modern software engineering practices.
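
To make the discussion concrete, here is a minimal sketch of the kind of pipeline the session is concerned with: a small stream-processing job that reads raw events, cleans and transforms them, and publishes a derived stream that other teams can reuse. It assumes Apache Kafka Streams; the topic names (orders, orders.enriched) and the trivial transformation are hypothetical placeholders, not details from the talk.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PipelineSketch {
    public static void main(String[] args) {
        // Basic Streams configuration; broker address is a local placeholder.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "pipeline-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop empty records, apply a trivial transformation,
        // and publish the result to a downstream topic for reuse elsewhere.
        KStream<String, String> raw = builder.stream("orders");
        raw.filter((key, value) -> value != null && !value.isEmpty())
           .mapValues(value -> value.toUpperCase())
           .to("orders.enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}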

Related Links

How Confluent Completes Apache Kafka eBook

Confluent Developer Center