In our previous blog post, we explored how Confluent has evolved into a comprehensive data streaming platform (DSP). Now that we understand what a DSP is, let's address a key question: How does it deliver business value?
When business stakeholders assess value, they typically look at how a solution might help drive benefits across three key areas:
Making money (driving top-line revenues)
Saving money (driving the bottom line by reducing expenses that erode profit)
Protecting money (mitigating risk)
So how does a DSP support these key areas? How can we join the dots by applying reasoning, approximate estimates, and sound assumptions to arrive at quantified value calculations for a DSP?
For the purpose of this article, we’ll explore some hypothetical examples to explain how a DSP can help drive revenue, save money, and protect money.
First, let’s start with a basic premise: Software, including Software-as-a-Service, is becoming increasingly critical to how businesses make money. In 2019, Jay Kreps wrote a blog post, Every Company is Becoming Software, which outlines how businesses are not only using more software but are software. Their core products or services are specified, monitored, and executed in software. In 2024, Jay went further to say that companies are becoming software and AI. This idea is mirrored in Scott Galloway's 2025 Predictions, in which Galloway predicts that all software will have a “thick layer of AI” applied to it. Galloway introduces the term “Service-as-a-Software” and projects that this market will be worth $10 trillion, compared with a software-only market of $650 billion.
The message is clear: Software, including AI, is more important to businesses than ever. So how does the DSP drive revenue in this context? We see two distinct areas of value:
Unlocking new applications acting on data by supporting new business solutions that would be hard to deliver with traditional software methods that rely on data at rest.
Expediting application development with real-time data accessibility and easier integration, shortening development cycles and speeding up time-to-market, thereby accelerating revenue and profit.
In summary, the DSP is both an enabler and an accelerator of new solutions and use cases. The value of the DSP in this context is inextricably linked to the use case or solution it supports. This is a divergence from Confluent’s previous measures of value, as outlined in Part 1 of this blog series. During Confluent’s Act 1 and Act 2, we calculated value by comparing Confluent with “the next best alternative.” This meant we compared running Confluent Platform with self-managing open source Apache Kafka® or compared Confluent Cloud with using an alternative managed Kafka service. Crucially, we separated the underlying value of Kafka from any value calculations.
For the DSP, this is no longer the case. Confluent’s DSP is now a platform play. The “value” gloves are off. To assess the value of the complete platform, including Connect, Stream, Process, and Govern, we’re closer to the whole solution and use case, and we can claim a larger part of the value pie.
Let’s take a high-level, hypothetical example of a DSP unlocking and accelerating application development, which expedites a business’s ability to make money. We can aim to quantify this value as outlined in Example 1.
Imagine a financial technology (fintech) company that creates a new real-time payments platform with a DSP at the core. The platform drives annual revenue of $50M. To be successful, the platform has to complete payment processing within milliseconds of a transaction’s occurrence.
Prior to the DSP, most enterprise architectures would manage data across two distinct estates: the operational estate, which serves the transactional and real-time systems that run the business, and the analytical estate, used for after-the-fact analysis and business decision-making. In a legacy data world, data is collected and processed in batches or through APIs, creating time gaps between the occurrence of an event and its reflection in other systems. The result is inconsistent information across the two estates.
In contrast, the DSP powers payment processing with immediate updates to adjacent solutions and systems. A stream-first platform processes data at speeds and scale that were not possible with previous technologies. As a result, the team can create discoverable, contextualized, trustworthy, and reusable data products that are “bottled at the source,” within milliseconds of a business event that triggers the data. Data products, such as a “customer,” “payment,” and “location,” become foundational data assets that can be reused across a number of new operational and analytical add-ons. These can include new functionalities such as fraud prevention, real-time notifications, and cross-sell and upsell next-best actions—driving further revenue and creating additional opportunities to assess value.
The DSP also drives agility in the development process. The data architecture team is able to shift left to improve response times to events and increase the operational readiness of data, minimizing the load on downstream architectures. Instead of repeatedly wrangling and cleansing data downstream, application development teams can access and process data in real time with ease. The DSP bridges the analytical and operational divide.
Before the DSP, application development teams would spend a large amount of time wrestling with data, including finding the source of truth for trustworthy data and integrating this across operational and analytical systems. Now, the DSP helps reduce time to market and accelerate the first-year revenue streams significantly.
So how do we quantify all this value?
In this example for Value Lever 1, we can look at the overall value of the solution—the payments platform—and see that it drives $50M annual revenue. The DSP is clearly not solely responsible for all of this revenue. It’s simply a core component of the solution. We can therefore assert that the DSP significantly influences this value by having a contributing benefit. So the question is: If we are quantifying value, what proportion can we claim? Answer: It depends. One guideline is to use the proportion of development and/or operations costs to attribute value.
If, for example, the DSP accounts for 20% of the ongoing run costs, we can use 20% as a factor for value attribution.
20% of $50M annual revenue = $10M annual benefit, which can be claimed by the DSP. It’s not perfect, but it’s one approach to attributing value.
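To spell out the arithmetic, here is a minimal sketch of this attribution logic in Python, using only the hypothetical figures from Example 1 (the 20% run-cost share is the assumption stated above, not a benchmark):

```python
# Illustrative sketch of the cost-proportion attribution described above.
# All figures are the hypothetical numbers from Example 1, not real data.

annual_solution_revenue = 50_000_000  # payments platform revenue ($50M/year)
dsp_share_of_run_costs = 0.20         # assumed DSP share of ongoing run costs

# Attribute value to the DSP in proportion to its share of run costs.
dsp_attributed_benefit = annual_solution_revenue * dsp_share_of_run_costs
print(f"Annual benefit attributable to the DSP: ${dsp_attributed_benefit:,.0f}")
# -> Annual benefit attributable to the DSP: $10,000,000
```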
Value Lever 2—expedited application development—represents a hard benefit, as the time-to-market advantage can be directly attributed to the DSP’s ability to accelerate development. In our example, this equates to a $25M benefit.
There are other benefits too. The DSP significantly drives agility and reduces the cost of development, resulting in capital expenditure savings (e.g., ~20% of the $250M total cost to implement the complete platform). We’ll cover savings in the next section.
Of course, this is a high-level, top-down illustrative example. In the real world, we would likely break down this analysis, assessing more granular variables, assumptions, estimates, and proof points.
When summed up, the benefits of driving revenue can amount to tens of millions of dollars. These amounts may appear large, but they’re generally corroborated by external data points:
Forrester estimates that a modest 10% enhancement in data accessibility could drive more than $65 million in additional net income for typical Fortune 1000 businesses.
Harvard Business Review states that companies that treat their data like a product can reduce the time it takes to implement data in new use cases by as much as 90%, which will accelerate time to market and help drive revenue.
In the context of these two external data points, the potential value of a DSP in Example 1 feels conservative. Now let’s look at how the DSP can help save money.
As we’ve seen, Confluent has consistently demonstrated that it offers the most cost-effective solution when compared to self-managing Kafka or using alternative managed Kafka services. Confluent can help reduce:
Infrastructure (including the need to over-provision capacity)
Development and operations personnel costs
Risk and hidden costs, including intangibles associated with security incidents and downtime
But the DSP goes beyond this simple total cost of ownership model of savings. It also reduces the costs tied to managing enterprise data as a whole, including architecture, governance, and data consumption costs. In our Act 1 and Act 2, we centered on Kafka costs. In Act 3, we’re widening across the enterprise data landscape.
McKinsey estimates that a midsize institution with $5B of operating costs spends more than $250M on overall data management. The DSP drives significant savings across several of these data management categories, but most notably the following.
Architecture: By shifting data processing and governance left, the DSP improves the operational efficiency of managing data, including costly infrastructure and downstream data stores. It eliminates the need to duplicate data and process it through compute-heavy, expensive batch jobs, streamlining the entire data pipeline. The common medallion architecture, which refines data from bronze to silver to gold tiers for analytics, can be simplified with quality data processing upstream.
Governance and consumption through standardization, consolidation, and simplification: Like any great platform, the DSP introduces efficiencies and supports transformation and modernization initiatives through standardization. With connectors linking your existing applications and systems to Confluent, you can easily start rewriting or migrating your producer and consumer applications and move away from legacy applications, future-proofing your architecture. Reusable, real-time, trustworthy data products managed within a DSP can be game-changing. We can also help you migrate from expensive legacy integration tools that are no longer fit for purpose.
Overall, the DSP can drive massive savings in addition to the traditional savings from a fully managed or supported solution. We quantify this value as outlined in Example 2 below.
Let’s take a hypothetical retail organization that has a core enterprise resource planning (ERP) system and significant legacy estate. This company is in the process of building new capabilities and modernizing its data architecture to take advantage of real-time insights, AI, and more. It has recently invested in implementing a modern data lake.
Currently, the retailer spends approximately $260M a year managing data. This is a rounded number because the costs are spread across different business units and silos and can be hard to calculate.
The retailer aims to model information from multiple sources and apply insights in-store and online. It has to manage the data, ensuring that it’s clean, that privacy is protected, and that compliance responsibilities are met.
The DSP helps eliminate costs across the legacy estate in the following areas.
Architecture simplification and consolidation: After adopting the DSP, the retailer replaced costly point-to-point integrations with an enterprise-wide streaming architecture. Data can be offloaded from mainframe systems, then processed and streamed into modern reporting tools, shifting work left and away from costly static data stores and reprocessing in the data lake. The simplified data environment enables the retailer to fully decommission data repositories, streamline data management, and improve data quality and consistency in the lakehouse, accelerating system and process updates.
Governance: The retailer was able to transition from fragmented data repositories to standardization of data management protocols, easing data ingestion, data cleansing, and data streaming with common standards. This enabled the creation of reusable, real-time data products that drastically reduced governance overhead.
Consumption: Prior to the DSP, data consumers spent about 25% of their time searching for data and even more time cleansing it. With the DSP—including its data catalog, lineage, and data quality capabilities—productivity and performance improved significantly.
Overall, the retailer significantly reduced waste and manual effort, cutting its annual data spend by 20%. This results in annual savings of about $52M across the enterprise.
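As a minimal sketch of how a top-down number like this might roll up from category-level assumptions, consider the following; the per-category savings rates are hypothetical and chosen only so that they sum to the 20% used above:

```python
# Illustrative sketch of Example 2's savings calculation. The per-category
# rates are hypothetical assumptions, not measured results.

annual_data_spend = 260_000_000  # retailer's annual data management spend

savings_rate_by_category = {
    "architecture": 0.10,  # simplification and consolidation
    "governance": 0.05,    # standardized protocols, reusable data products
    "consumption": 0.05,   # less time searching for and cleansing data
}

total_savings = annual_data_spend * sum(savings_rate_by_category.values())
print(f"Estimated annual savings: ${total_savings:,.0f}")
# -> Estimated annual savings: $52,000,000
```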
Again, this example is high-level, top-down, and illustrative only. In the real world, we would complete a more granular and detailed analysis.
As in Example 1, the dollar savings here may appear high, but the numbers are backed up by external data points.
Harvard Business Review states that companies that treat data like a product can decrease their total cost of ownership (technology, development, and maintenance) by up to 30%, which would result in $75M in annual savings for a company with a $250M annual data management expense. In one example, it describes a telecommunications company that tracked the impact of creating a single data product across 150 use cases. The company estimated hundreds of millions of dollars in cost savings and new revenue within three years and a cumulative financial impact of $5B over 10 years.
Managing data comes with risk. Consequences of data breaches, downtime, or even performance degradation can range from a relatively low impact to severe reputational damage that could negatively affect shareholder value. Confluent’s DSP is highly scalable, resilient, and fault-tolerant. It reduces the threat profile and the chances of downtime, data loss, corruption, read-write errors, and security breaches.
Despite the link between Confluent’s DSP and mitigation against risk costs, assigning a dollar value to the DSP can be contentious. We can measure risk using the formula Risk = Likelihood x Impact and show that the DSP can reduce risk by decreasing the probability of an event occurring (Likelihood), limiting the potential severity of an event’s consequences (Impact), or both. If the result of this calculation is an overall reduction to risk cost, it demonstrates the DSP value.
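To make this concrete, here is a minimal, hypothetical sketch of the calculation; the 5% and 1% likelihoods and the $10M impact are illustrative assumptions, not benchmarks:

```python
# Illustrative sketch of the Risk = Likelihood x Impact calculation.
# All inputs below are hypothetical assumptions for illustration.

def expected_risk_cost(likelihood: float, impact: float) -> float:
    """Expected annual cost of an adverse event."""
    return likelihood * impact

impact = 10_000_000  # assumed cost of a major outage or breach ($10M)

before = expected_risk_cost(likelihood=0.05, impact=impact)  # 5% annual chance
after = expected_risk_cost(likelihood=0.01, impact=impact)   # reduced to 1%

print(f"Annual risk-cost reduction: ${before - after:,.0f}")
# -> Annual risk-cost reduction: $400,000
```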
However, our customers mostly steer us toward unquantified value in the risk mitigation bracket. For many, the value here is simply the peace of mind that comes with knowing their data systems are reliable. Sometimes it’s just about sleeping better at night.
There is a final point. Many IT organizations will care about future-proofing architecture—reducing technical and data debt and building a scalable architecture that can adapt to changing market conditions. This is especially true in the era of artificial intelligence (AI) and machine learning (ML), which is estimated to have an impact of trillions of dollars on the wider economy. Most organizations are now realizing that every AI challenge is actually a data challenge and that access to trustworthy data remains one of the main barriers to value creation from generative AI.
We believe the DSP and real-time data products will be fundamental to unlocking the massive value potential in fueling generative AI applications. Again, this is backed up by the 2024 Data Streaming Report in which 90% of IT leaders surveyed say that DSPs can lead to greater innovation in AI and ML development. What is the quantified value here? Again, this is an open area where we may choose to assign dollar amounts—or, more likely, we will simply highlight this as an area of massive business benefit without attributing a quantified dollar amount.
Unlock the potential of real-time data for your business. Explore our Tech Exec Hub to discover how industry leaders are achieving 3x faster decision-making with Confluent's DSP. Start your transformation journey today.
Our mission is to build a DSP and put it at the heart of every company so that you can build faster, save money, and reduce risk. What can you do if you’ve read this far and you’re intrigued by the opportunity ahead for the DSP in your organization? We’re so confident in the potential value of the enterprise-wide DSP that we offer a free-of-charge custom service that includes detailed discovery and discussions to create a business case for the DSP. Contact our Business Value Consulting team (bvc-team@confluent.io) or your account executive to start the conversation today.
Apache®, Apache Kafka®, Kafka®, Apache Flink®, and Flink® are registered trademarks of the Apache Software Foundation. No endorsement by the Apache Software Foundation is implied by the use of these marks.