Real-Time or Real Value? Assessing the Benefits of Event Streaming


As a bitter, cynical, battle-hardened technologist, I’ve survived my fair share of platform architecture redesigns and tech stack changes. I’ve suppressed my gag reflex and tried to earnestly hear the team out many times when I’ve been pitched a program increment plan showing a team spending the next quarter refactoring with the shiny new framework, coding paradigm, or database. Our problems will continue unless we switch to React, Go microservices, CockroachDB, GitLab CI/CD, and k8s, they lament! I anticipate such a conversation whenever a new technology seems to prove the Barabási-Albert scale-free property: everyone adopts it because everyone else already has. Usually, the team offers some lame rationalization around improving build times by 20%, reducing cloud costs by 10%, or halving the lines of code for new features, but I can’t help but wonder if it’s just a fear of missing out.

Sometimes these investments pay off big, overcoming obstacles holding the company back. The right tech initiatives unlock scale and resiliency, which may be the difference between the business becoming viable or not. Far more frequently, the juice isn’t worth the squeeze. One commonly cited estimate suggests that 70% of tech projects fail. While those aren’t all software development projects, something like a two-thirds failure rate feels right to me. These experiences make us skeptical of the marketing claims around new technologies. How often have we taken significant risks to incorporate new tech, only to find the integration far more challenging than expected and the improvements far less than promised? Often, we regret these changes. We would have been better off spending the lost quarter bolting new features onto the old systems. At least then, the sales team would have been much happier.

Given the risks and expense, how do tech leaders and architects reason about the ROI of event streaming? Product and technology organizations are notoriously bad at rationalizing return on investment. Future development acceleration is among the most commonly promised benefits of new technology, but it’s hard to quantify and even harder to predict. Even benefits the CFO assures us can be calculated in the abstract, such as total cost of ownership (TCO), become difficult to measure in real organizations. Sometimes, ROI is justified in terms of promoting agency within the teams. At least once, I’ll admit, I’ve authorized a new tech stack as a kind of excuse removal: I wouldn’t accept another busted sprint that the team blamed squarely on the legacy application.

Event streaming may have a lot of hype, but one can’t compellingly argue that it hasn’t changed the world. Activities most of us do daily, like ordering packages from Amazon, using a credit card, or calling an Uber, have us interacting with systems that depend on event messaging. That doesn’t necessarily mean event streaming makes sense for every use case.

So, how do you decide if it makes sense in your enterprise? Practitioners often cite “real-time” as streaming’s chief virtue. Yet event streaming may still be worth the investment even when reducing the latency between data generation and insight isn’t that important. I’ve met many who dismiss streaming as just a faster “batch” that should only be used when low latency is required. I’ve come to believe this is the same mistake Microsoft made when it saw mobile as simply a desktop experience with a small screen. Apple instead saw mobile as a new paradigm that would transform how people interacted with technology, and it won. Similarly, many incumbents didn’t recognize the internet’s potential, and they were displaced: from newspapers that saw the internet as just high-margin distribution to shops that saw it only as a new channel for selling existing inventory. Paradigm changes don’t offer a faster, better world of yesterday but a fundamentally new way of doing things. History is full of companies that embraced the new paradigm and beat those that saw only faster horses and smaller devices.

Event-driven architectures are a similar enabling capability. They create new opportunities and approaches to building software and structuring technology teams. In particular, streaming’s reduced time-to-market for new insights is transformational. As every company becomes a software company, we can’t anticipate every use of data in our products a priori. With event streaming, a new product or insight is often a matter of building a new consumer group rather than breaking apart an existing data model. The bigger data gets, the more specialized one must be in how it is organized and queried; in event streaming, that specialization means data is optimized for consumption rather than for any particular query pattern. More than other paradigms, it supports new ideas and product features. This creates another important effect: by decoupling applications from bespoke point-to-point communication paths, dependencies between teams become better specified and more scalable. Event messaging lets the teams that produce events be entirely separate from the teams that consume them.
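To make that concrete, here is a minimal sketch using Kafka’s Java consumer client. The topic name (payments), group id (fraud-signals-v1), and broker address are hypothetical; the point is that a new insight needs only a new consumer group with its own offsets over an existing stream, while producers and existing consumers remain untouched.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class NewInsightConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
        // A new group.id gives this application its own cursor over the existing
        // stream; nothing upstream has to change to support it.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "fraud-signals-v1");
        // A brand-new group starts from the beginning of the retained history.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments")); // hypothetical existing topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Derive the new insight here; the producing teams are unaware of us.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

Because the consumer group tracks its own offsets, this new application can replay history or keep pace with live events without coordinating with the teams that own the producers.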

In my experience, the technology bets that pay off are the ones that foster process improvements, better aligning the product and technology organizations to respond to business realities. Those yield far better outcomes than better mousetraps that incrementally improve TCO, development velocity, or application performance. And that may be the most important ROI consideration when deciding whether event streaming makes sense in your organization.

  • Andrew Sellers leads Confluent’s Technology Strategy Group, a team supporting strategy development, competitive analysis, and thought leadership.
