
Moving Up the Curve: 5 Tips For Enabling Enterprise-Wide Data Streaming


Confluent recently released its 2023 Data Streaming Report, “Moving Up the Maturity Curve.” The report found that data streaming is delivering business value, with IT leaders reporting up to 10x return on investment.

That said, as companies continue to expand their data streaming use cases, many struggle with non-technical hurdles: scaling, setting up operations, and breaking down organizational silos.

In fact, 74% of respondents reported fragmented projects and uncoordinated teams and budgets as the major challenges. But despite these hurdles, 89% of IT leaders still indicated that investments in data streaming are important, with 44% citing it as a top strategic priority.

Background: The 5 stages of data streaming adoption

The data streaming report leverages the concept of the Confluent maturity curve, which includes stages 1-5 of adoption, outlined below. 

The Confluent Maturity Curve - created in 2017

I conceived this model in 2017 while first working with customers implementing Confluent Platform. Back then, most organizations were early adopters of data streaming. We observed individual project teams solving for specific use cases or solutions. Data streaming adoption tended to be tech-led and driven from the bottom up; it was perhaps more tactical than strategic.

Even today, a number of related factors work in favor of bottom-up adoption of data streaming technologies like Apache Kafka®:

  • Kafka is open source, meaning teams can download it and experiment at will. Smart developers and operators are often self-taught. The business doesn’t necessarily need to be made aware of the tech.  

  • Again, because Kafka is open source, it is not necessary to justify a significant budget or complete an extensive selection or procurement process. 

  • As a result, Kafka adoption flies below the radar with little reason to escalate or gain C-suite sponsorship.

While these factors help drive rapid, early adoption of Kafka from levels 1 to 3, the very same factors act as blockers when moving from level 3 to level 4 of the adoption curve. As organizations approach level 4, we often see:

  • Lack of general business awareness

  • Lack of allocated budget for Kafka or data streaming and lack of any formal selection or procurement process

  • No senior-level or C-suite sponsorship

Challenges at later stages of adoption

Level 4 of the adoption journey begins when the disparate teams using Kafka and data streaming start to coordinate and combine forces. These teams typically coalesce to implement some sort of Kafka-as-a-Service offering or a Center of Excellence, building a data streaming platform that serves the entire enterprise and leverages economies of scale.

To pull disparate projects and teams together into a top-down strategic motion, companies require the very things that are missing during early adoption:

  • Education and awareness from the business for wider buy-in

  • A budget or solid business case outlining real, enterprise-wide business benefits 

  • Senior-level sponsorship from people who care enough to drive change

Because most organizations get to level 3 adoption without these factors, teams often continue to operate in a siloed manner, using data streaming tactically per project. They hit the ‘level 4 barrier’ and stall at level 3.

Knowing this challenge, we should perhaps re-draw the Confluent maturity curve to emphasize the step from level 3 to level 4.

So, how do organizations accelerate this step-change, which involves a pivot from a bottom-up, tech-led motion to one that is joined by a top-down, business-led motion? 

Let's take a deep dive into my five key recommendations to make this happen.

1. Show what a level 4 & 5 data streaming platform looks like

We recommend writing a short strategy or internal proposal to describe an enterprise-wide data streaming platform, highlighting the:

  • Current situation: Multiple data streaming teams and a lack of common standards to support data sharing. 

  • Target state: An enterprise-wide data streaming platform that allows for managing data as a product.  

  • Required capabilities: A new team with an enterprise-wide mandate for data streaming. This requires budget, sponsorship, and a focus on building skills, expertise, and change management to drive consumption of the service. 

  • Metrics and ROI: Determine what will be measured to test success and track cost savings and opportunities to increase the strategic value of data.

  • Proof-points: Show where this has been done before.

Note that moving to level 4 of adoption does not have to mean “centralization” of all data. Many organizations aim to implement domain ownership of data and maintain a federated model for data ownership. This is possible with level 4, but some level of central control over the processes and standards for managing data is critical.

2. Make the business case (specifically for level 4+)

Organizations need to explain why the level 4 data streaming platform is important. The benefits include below-the-line consolidation and standardization savings. More significantly, the above-the-line benefits include driving real business value from data: improved service levels mean less risk, with a more reliable and resilient service. We also see a network effect (Metcalfe’s Law), where more consumers of a data streaming platform write more data back to the platform, which in turn attracts more consumers, and so on. Organizations can start to fully leverage the strategic value of shared data by “managing data like a product.”
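The network effect is easy to make concrete with a toy calculation. Assuming Metcalfe’s Law, the number of potential data-sharing connections grows roughly with the square of the number of teams on the platform (the team counts below are hypothetical, purely for illustration):

```python
# Toy illustration of Metcalfe's Law applied to a data streaming platform:
# potential team-to-team data-sharing pairs grow quadratically with the
# number of connected teams. Team counts are hypothetical.

def potential_connections(n_teams: int) -> int:
    """Distinct team-to-team sharing pairs: n * (n - 1) / 2."""
    return n_teams * (n_teams - 1) // 2

# Doubling adoption from 10 to 20 teams roughly quadruples potential value.
print(potential_connections(10))  # 45 possible data-sharing pairs
print(potential_connections(20))  # 190 possible pairs -- ~4x, not 2x
```

This is why each new producer or consumer added to a shared platform benefits every existing tenant, not just itself.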

To help outline the strategic business case for level 4, we can focus on the following two areas:

1. The value of managing data like a product. McKinsey states: 

  • New business use cases can be delivered as much as 90 percent faster.

  • Total cost of ownership, including technology, development, and maintenance costs, can decline by 30 percent.

  • The risk and data governance burden can be reduced.

2. Reducing the cost of managing data. Level 4 can help create a much more efficient data foundation, in addition to enabling near-term, bottom-line impact. 

  • McKinsey estimates that a midsize institution with $5 billion of operating costs spends more than $250 million on managing data, so savings in this area can be significant.
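A back-of-envelope sketch shows why these figures matter. Note that applying the 30 percent TCO-reduction figure to the data-management spend is my simplification for illustration, not a calculation from the McKinsey source:

```python
# Back-of-envelope savings math using the McKinsey figures quoted above.
# Combining the two figures this way is an illustrative simplification.
operating_costs = 5_000_000_000   # midsize institution: $5B operating costs
data_mgmt_costs = 250_000_000     # >$250M of that spent managing data
tco_reduction = 0.30              # up to 30% decline in total cost of ownership

potential_savings = data_mgmt_costs * tco_reduction
print(f"Potential annual savings: ${potential_savings:,.0f}")
print(f"Share of operating costs spent on data: {data_mgmt_costs / operating_costs:.0%}")
```

Even under conservative assumptions, the addressable savings run into the tens of millions of dollars per year for a midsize institution.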

3. Show how to implement enterprise-wide data streaming with a Target Operating Model

There are multiple definitions of what an operating model is, but put simply, it translates strategic intent into operational capabilities, describing how people, process, and technology will deliver value. It is worth noting that when designing a Target Operating Model (TOM), there is no one-size-fits-all.

People Layer

Perhaps the hardest, yet most important, task is getting the right people in place. At levels 1, 2, and 3 of the maturity curve, we typically see teams creating a data streaming capability so they can use it. At level 4, the ‘delivery’ and ‘consumption’ functions of data streaming split. So, the data streaming operating model needs to cover the ways of working between delivering the service and consuming the service: 

1. Delivery of the data streaming service (Service perspective)
  • Team structure, talent, and ways of working.

  • Efficient backend automated operations. 

  • Driving common standards, processes, and procedures to ensure a common, shared technology.

  • Providing SLAs and explaining how they will impact the business.

This may require a completely new team or an evolution of the existing team.
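Driving common standards often starts with something as concrete as an enforced topic naming convention. A minimal sketch, assuming a hypothetical `<domain>.<dataset>.<version>` convention (the pattern and topic names below are illustrative, not a Confluent standard):

```python
import re

# Hypothetical enterprise topic naming convention: <domain>.<dataset>.<version>
# e.g. "payments.transactions.v1". Pattern and examples are illustrative only.
TOPIC_PATTERN = re.compile(r"^[a-z][a-z0-9-]*\.[a-z][a-z0-9-]*\.v\d+$")

def is_valid_topic(name: str) -> bool:
    """Return True if the topic name follows the shared convention."""
    return TOPIC_PATTERN.fullmatch(name) is not None

print(is_valid_topic("payments.transactions.v1"))  # True
print(is_valid_topic("MyTempTopic"))               # False
```

A check like this can run in CI or in a self-service provisioning workflow, so the standard is enforced automatically rather than by review meetings.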

2. Consumption of data streaming (Consumer perspective)
  • Access to a standard data streaming service—how to leverage the full capabilities of data streaming.

  • How application developer teams will operate. 

  • A significant internal marketing campaign to get teams to switch to the service. 

  • Change management.

Think Tech Talks, internal comms, success stories, etc. 

Process Layer

Create the team charter (tied to the data streaming strategy), which includes:

  • Key activities: Outline of main service offerings

  • Governance framework: This includes team members/reporting lines 

  • Engagement model: This includes service providers/recipients/ways of working

  • Internal marketing: Communication, education, training, change management

  • Budget: Chargeback, business value attribution, results, managing risks, and reporting success metrics

  • Planning and next steps: Link to Kafka/data streaming strategy

Tasks of the centralized service/dedicated team should include providing oversight, coordinating other teams, allocating resources, and setting standards/agreeing to processes and ways of working, including engagement with business units and IT.
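The chargeback item in the charter above can start simply: allocate the platform’s cost to tenant teams in proportion to their usage. A minimal sketch with hypothetical team names and figures:

```python
# Minimal chargeback sketch: allocate the platform's monthly cost to tenant
# teams in proportion to their share of usage (e.g. bytes produced + consumed).
# Team names and all figures are hypothetical.

def chargeback(total_cost: float, usage_by_team: dict[str, float]) -> dict[str, float]:
    """Split total_cost across teams proportionally to their usage."""
    total_usage = sum(usage_by_team.values())
    return {team: total_cost * usage / total_usage
            for team, usage in usage_by_team.items()}

bills = chargeback(10_000.0, {"payments": 600.0, "logistics": 300.0, "marketing": 100.0})
print(bills)  # {'payments': 6000.0, 'logistics': 3000.0, 'marketing': 1000.0}
```

Real chargeback models usually add fixed platform overhead and per-tenant minimums, but even this simple proportional split makes the cost of the shared service visible to its consumers.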

Technology Layer

Questions when designing the technology layer include: 

  • Should the organization aspire to a single, central Kafka cluster, or federated Kafka clusters across LoBs, geographies, or functions? 

  • Similarly, what should be shared within a multi-tenant environment vs. when should applications or teams have their own dedicated, single-tenancy platforms?

  • What is on-prem vs. in the cloud?

  • What should the overall architecture look like in terms of disaster recovery?

The answers to these questions ultimately depend on a number of business and technical considerations. Our professional services team is always happy to discuss these situations and provide customers with recommendations.

4. List all stakeholders involved

Breaking into level 4 of data streaming adoption is a cross-organization strategic shift: a pivot from bottom-up, tech-led adoption to top-down, strategic adoption. Any transformation of this type requires ‘change management’; critical to this is identifying all stakeholders involved, including those providing senior-level sponsorship. 

Change management can take the form of formal training, education sessions, tech talks, setting up communities of practice, and new performance metrics. 

Many organizations are exploring concepts around data mesh, which allows for federated data ownership: domain data owners are held accountable for providing their data as products, while communication is facilitated across different BUs or locations. The stakeholders for data streaming are likely the same as for data mesh-type concepts, which federate data across BUs while still requiring some level of central coordination.

5. Confirm when to implement this—specifically at level 3

From our experience, it is important to note two things when looking at the organization design and timing for moving to level 4:

  1. This isn’t a single project—this data streaming setup should be a cross-organizational, strategic function that is set up over time.  

  2. While we suggest a central team to oversee data streaming across the enterprise, the overall model does NOT have to be centralized. It also requires some level of federation (and domain ownership of data). This will help speed up decision making and bring people together in teams that are relentlessly focused on delivering solutions built on data streaming. 

For organizations that have had poor experiences implementing central services or shared teams, and prefer their operations to align more closely with BUs, it may be worth slowing down the central team and setting up more of a virtual team to set standards. 

As a minor but important point, give the project a name.  

Implementing these recommendations should help organizations transition from siloed project teams to a horizontally interconnected service for data streaming. 

These actions will provide the foundation for realizing the business and technology benefits of enterprise-wide sharing of real-time data.

Download your copy of the 2023 Data Streaming Report, “Moving Up the Maturity Curve,” today to see how data streaming is delivering massive real-world benefits.

  • Lyndon is a Director of Customer Solutions at Confluent. He helps organizations model how digital, and specifically event-centric thinking, can enable new business models and improve company performance. Prior to joining Confluent, Lyndon was a Director of Digital Strategy at Acquia (2014-2017), Head of Customer Success at a UK digital start-up (2009-2014), and an IT Strategist and Enterprise Architect at Accenture (1996-2008). Lyndon holds an MSc in Neuroscience, incl. Neural Networks & AI, from Oxford University. He runs his own blog on all things Digital & Data here: https://lyndon.london
