
Unleashing the Potential of Demand Forecasting with Data Streaming

Author: Sanvy Sabapathee

The Need for Demand Forecasting

Data is a crucial asset for businesses in the digital age, serving as a foundation for informed decision-making and driving strategic growth. In today's highly competitive landscape, companies that harness and analyze data effectively gain a significant competitive advantage. Data provides valuable insights into customer behavior, preferences, and trends, enabling businesses to tailor their products and services to meet the specific needs of their target audience. By understanding their customers better, companies can improve customer experiences, increase customer loyalty, and foster long-term relationships.

Furthermore, data plays a pivotal role in optimizing operational efficiency and resource allocation within organizations. By analyzing internal data related to processes, supply chain management, and employee performance, businesses can identify inefficiencies and areas for improvement. Leveraging data-driven insights, businesses can also identify untapped markets, launch new products, and position themselves strategically to capitalize on emerging trends, ensuring sustainable growth and long-term success.

As such, demand forecasting has transformed the way businesses make data-driven decisions: by harnessing data to predict future customer demand accurately, companies can not only find major efficiency savings throughout the supply chain but also tap into market opportunities ready to be seized. Let’s look at this in action by examining Confluent customer Acme Inc. (a pseudonym). Acme’s business model revolves around providing a comprehensive and accurate platform for forecasting demand based on location-based intelligence for events. The company aggregates, enriches, and standardizes event data from numerous sources worldwide, offering businesses an unprecedented understanding of how events impact their operations. Whether it's concerts, conferences, sports events, holidays, or weather-related occurrences, the company collates data from multiple streams, curates it, and provides real-time insights that allow its customers to predict demand with a high degree of certainty.

Acme recently underwent a transformative journey by leveraging the power of data streaming with Confluent Cloud, a fully managed, cloud-native data streaming platform. This blog explores the business value of data streaming for demand forecasting: with curated rather than uncurated data, organizations can accurately anticipate fluctuations in demand for their products, services, and staffing.

How Data Streaming Addresses the Challenges with Traditional Approaches

A few years back, Acme faced challenges with the technologies they were using at the time. Their business depended on real-time event data, yet their operations team had to manage infrastructure for batch pipelines that moved data between their core systems. The team spent an inordinate number of hours each week maintaining these systems rather than developing solutions for their clients, an inefficiency that hampered the growth of the business.

To address these challenges, Acme turned to data streaming with open source Apache Kafka®. However, running Kafka themselves introduced additional technical challenges, which ultimately called for a cloud-native data streaming platform such as Confluent Cloud. Let’s take a closer look at those challenges.

Figure: Challenges building connectors with open source Kafka to connect to monitoring and cloud data warehouse systems.

Scalability and reliability

Acme's demand forecasting venture faced a critical challenge concerning scalability and reliability with their open source Kafka infrastructure. As their data volumes grew, maintaining and managing Kafka clusters became increasingly complex and resource-intensive. The migration to Confluent Cloud provided a solution to these issues. By adopting a fully managed service, Acme could effortlessly scale their operations based on fluctuating data demands. For example, major sports events in a city increase demand for hotels, restaurants, and taxis, while earthquakes and hurricanes contract demand for those same services. Confluent Cloud offers a robust and reliable platform that ensures minimal downtime and uninterrupted data streaming.

Simplified operations

Managing an on-premises Kafka infrastructure requires dedicated resources and expertise. In Acme's case, a failed node in a cluster would take months to recover due to the large amount of data behind each Kafka node. With open source Kafka alone, you’re on the hook to build and maintain foundational tooling and infrastructure, such as connectors, data governance and security, disaster recovery capabilities, and more. By moving to Confluent Cloud, Acme no longer had to burden themselves with infrastructure management tasks. The platform offers a user-friendly interface and automates various administrative processes. This allowed Acme's team to focus on developing and deploying applications rather than dealing with the complexities of maintaining Kafka clusters.

Eliminating the need for patching and upgrading

In an open source Kafka environment, regular patching and upgrades are essential to ensure data security and performance. However, this task can be time-consuming and occasionally prone to human error. Confluent Cloud, as a managed service, takes over the responsibility of patching and upgrading Kafka clusters, providing customers with the latest features and security enhancements seamlessly, and without downtime. 

Deep Dive Into the Architecture

The final deployment of the solution is depicted below, with the light blue components in the architecture backed by Confluent Cloud. (Note: Topic data and stream data have been renamed to maintain customer anonymity.)

Terabytes of event data flow through a continuous data pipeline composed of the following ecosystem:

  • Elasticsearch

  • Snowflake

  • MongoDB

  • Snowpipe

  • Confluent

Some examples of event data are included below:

  • Attendance peaks

  • Airport peak landing times

  • Hotel attendance

  • Anticipated public transport usage

  • Anticipated taxi/Uber/hire car usage

  • Restaurant venue attendance

  • Delivery optimization

    • Real-time traffic and route optimization

  • Workforce optimization

    • Rostering based on forecasted demand

  • Inventory tracking in real time

Here's how the tools work together (a producer sketch follows this list):

1. Confluent:

  • Confluent Cloud is a cloud-native data streaming platform that acts as a central hub for data ingestion and distribution.

  • It can collect and publish data from various sources, such as applications, sensors, or databases.

  • Confluent's publish-subscribe model ensures that data producers and consumers can work independently and at their own pace.

  • It provides high availability through multi-region clusters with a 99.95% uptime SLA.

2. Elasticsearch: Confluent Sink

  • Elasticsearch is a powerful search and analytics engine designed for real-time data exploration and analysis.

  • It can be used to index and search structured and unstructured data, making it suitable for log analysis, full-text search, and more.

  • Confluent can send data to Elasticsearch in real time, allowing you to search and visualize data as it streams in.

3. MongoDB: Confluent Sink

  • MongoDB is a NoSQL database that is often used for storing and managing semi-structured or unstructured data.

  • It provides flexibility in data modeling and can handle large volumes of data.

  • You can use Kafka to stream data into MongoDB, enabling real-time updates and analytics.

4. Snowflake: Confluent Sink

  • Snowflake is a cloud-based data warehousing platform that provides a scalable and secure environment for data storage and processing.

  • It allows you to store structured data, perform complex queries, and run analytics at scale.

  • Confluent can be used to feed data into Snowflake for further analysis and reporting.

5. Snowpipe: Confluent Sink

  • Snowpipe is a component of Snowflake that automates the loading of data from external sources into Snowflake data warehouses.

  • Confluent can act as a source for Snowpipe, allowing it to ingest real-time data from Confluent topics and load it into Snowflake without manual intervention.
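To make the first hop of this pipeline concrete, here is a minimal sketch of publishing one event record to a topic with the confluent-kafka Python client. The topic name, record fields, and placeholder credentials are illustrative assumptions rather than Acme's actual setup.

```python
# Minimal producer sketch: publish an event record to a Kafka topic on
# Confluent Cloud. Topic name, schema, and credentials are illustrative.
import json

from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",  # from the cluster settings
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

event = {  # hypothetical event record
    "event_id": "evt-12345",
    "category": "concert",
    "city": "Sydney",
    "starts_at": "2024-03-01T19:00:00Z",
    "expected_attendance": 45000,
}

def on_delivery(err, msg):
    """Report whether the broker acknowledged the record."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

producer.produce("events.enriched", key=event["event_id"],
                 value=json.dumps(event), on_delivery=on_delivery)
producer.flush()
```

Each sink below then consumes from topics like this independently and at its own pace, which is exactly the decoupling the publish-subscribe model provides.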

How they work together in a curated data ecosystem (a connector sketch follows this list):

  • Data is collected from various sources, such as web applications, IoT devices, and databases, and ingested into Confluent topics.

  • Confluent serves as a central data pipeline, routing data to different destinations based on your requirements.

  • Real-time data, such as logs or event streams, can be indexed and analyzed in Elasticsearch.

  • Structured data can be stored in Snowflake data warehouses for more in-depth analysis, reporting, and data sharing.

  • MongoDB can store semi-structured or unstructured data that doesn't fit neatly into a relational model.

  • Snowpipe automatically loads data from Kafka into Snowflake, keeping information up to date for analytics and reporting.
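As one illustration of how a sink is wired up, the sketch below registers an Elasticsearch sink connector through a self-managed Kafka Connect cluster's REST API; the MongoDB and Snowflake sink connectors follow the same pattern. On Confluent Cloud, the equivalent fully managed connector would be configured through the UI or CLI instead. The connector name, topic, and host names are assumptions for illustration.

```python
# Sketch: register an Elasticsearch sink connector via the Kafka Connect
# REST API. Host names, topic, and connector name are illustrative.
import requests

connector = {
    "name": "elasticsearch-events-sink",
    "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "topics": "events.enriched",        # hypothetical topic from the producer sketch
        "connection.url": "http://elasticsearch:9200",
        "key.ignore": "false",
        "schema.ignore": "true",            # index raw JSON documents as-is
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
    },
}

# POST /connectors creates the connector; Connect then streams every new
# record on the topic into Elasticsearch with no further intervention.
resp = requests.post("http://connect:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```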

In summary, this curated data ecosystem leverages the strengths of each tool to create a comprehensive data pipeline, from ingestion through storage to analysis, enabling organizations to make data-driven decisions and gain insights from various types of data in real time. In this instance, Confluent Cloud provides curated data to downstream systems, which Acme’s customers then consume.

Outcomes with Confluent Cloud

With the help of Confluent Cloud, Acme was able to achieve the following outcomes to ultimately better serve their customers:

Data aggregation and enrichment

Acme uses cutting-edge technology to gather data from a diverse range of sources. This includes partnerships with ticketing platforms, event websites, social media, and government databases. By pulling data from multiple streams, the business creates a comprehensive database of events worldwide. Moreover, the company enriches this raw data with relevant context, such as the event's location, time, and expected attendance, providing clients with a deeper understanding of its potential impact on their businesses.
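The enrichment step described above can be pictured as a small stream processor that consumes raw event records, attaches context, and republishes the result downstream. The sketch below is a hedged illustration under assumed topic names and fields, not Acme's actual implementation; a production pipeline could equally express this with stream processing tooling such as Kafka Streams or ksqlDB.

```python
# Sketch of a consume-enrich-produce loop. Topic names, fields, and the
# enrichment logic are illustrative assumptions, not Acme's implementation.
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "group.id": "event-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "<BOOTSTRAP_SERVER>"})
consumer.subscribe(["events.raw"])

def enrich(raw: dict) -> dict:
    """Attach context; a real pipeline would call out to lookup services."""
    raw.setdefault("venue_city", "unknown")
    raw.setdefault("expected_attendance", 0)
    return raw

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        enriched = enrich(json.loads(msg.value()))
        producer.produce("events.enriched", value=json.dumps(enriched))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```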

Standardized and scalable data delivery

One of the key strengths of the business model lies in Acme’s ability to standardize event data. By establishing a consistent format for all event entries, Acme ensures seamless integration into their clients' existing systems. This standardized approach allows businesses to access, analyze, and utilize event intelligence without the burden of data conversion or manipulation. Additionally, their scalable data delivery system ensures that clients receive real-time updates, enabling them to respond to events with agility and precision.
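As a minimal sketch of what that standardization can look like, the function below maps differently shaped raw records from different sources onto one consistent schema. The schema and field names are invented for illustration.

```python
# Sketch: normalize heterogeneous raw event records onto one standard
# schema. The schema and source field names are illustrative assumptions.
from datetime import datetime, timezone

def standardize(raw: dict) -> dict:
    """Map a raw event record from any source into the standard shape."""
    return {
        "event_id": str(raw.get("id") or raw.get("event_id")),
        "category": (raw.get("type") or raw.get("category") or "unknown").lower(),
        "city": raw.get("city") or raw.get("location", {}).get("city"),
        # Normalize timestamps to UTC ISO-8601, whatever the source used.
        "starts_at": (
            datetime.fromtimestamp(raw["start_epoch"], tz=timezone.utc).isoformat()
            if "start_epoch" in raw else raw.get("starts_at")
        ),
        "expected_attendance": int(raw.get("expected_attendance", 0)),
    }

# Two differently shaped source records reduce to the same schema.
print(standardize({"id": 1, "type": "Concert", "city": "Sydney",
                   "start_epoch": 1709319600}))
print(standardize({"event_id": "2", "category": "sports",
                   "starts_at": "2024-03-02T10:00:00Z",
                   "location": {"city": "Melbourne"},
                   "expected_attendance": 45000}))
```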

Applications of Demand Forecasting Services

The demand forecasting and event intelligence services offered to Acme’s end users apply to diverse use cases across industries, helping companies around the globe improve operations, optimize resources, and enhance customer experiences. Here are a few examples of their customers’ use cases:

Travel and hospitality

In the travel and hospitality sector, real-time event intelligence is invaluable for airlines, hotels, and other travel-related businesses. Acme’s data allows them to anticipate surges in demand during major events, plan promotions, and optimize pricing strategies accordingly. This enables companies to maximize revenue and provide better services to their customers.

Retail and e-commerce

Retailers can harness the data that is offered to identify potential demand spikes and tailor marketing campaigns around major events. By understanding the correlation between events and consumer behavior, businesses can strategically align their promotions, stock levels, and staffing to capitalize on increased footfall and online traffic.

Finance and investment

In the finance and investment sector, real-time event intelligence plays a pivotal role in risk assessment and portfolio management. By analyzing how specific events may impact industries or regions, investors can make informed decisions and minimize exposure to unforeseen market fluctuations. 

Value Proposition and Future Prospects

Acme’s unique business model offers several key value propositions to its customers:

Enhanced decision-making

By providing real-time event intelligence, they empower businesses with accurate and up-to-date insights, enabling them to make informed decisions promptly.

Risk mitigation

The ability to anticipate and respond to events in real time allows businesses to mitigate potential risks and capitalize on opportunities.

Improved customer experience

With the data offering, businesses can align their services with customer demands, leading to enhanced customer satisfaction and loyalty.

The future prospects for Acme look promising. As the company continues to expand its data sources and refine its enrichment processes, the depth and accuracy of its event intelligence will increase. This will lead to even more diverse applications across industries and the potential for Acme to become a central player in shaping data-driven business strategies.

Business models based on providing real-time event intelligence have revolutionized how companies perform demand forecasting and make data-driven decisions across industries. By aggregating, enriching, and standardizing event data, Acme has become a trailblazer in the realm of event intelligence services. Their platform empowers businesses to anticipate events, optimize resources, and enhance customer experiences, unlocking new possibilities for success. As companies increasingly recognize the value of real-time data insights, the businesses that offer valuable data will lead the charge in the era of data-driven decision-making.

What’s Next

It has been a transformative journey for Acme and its predictive analytics use case. By embracing a cloud-native data streaming service, Acme has been able to deliver differentiated customer experiences to clients across industries. As the world becomes increasingly data-centric, demand forecasting will continue to be a catalyst for business success.

  • Sanvy Sabapathee comes from a DevOps and automation background and brings that specialization to Confluent's products and customer engagements, increasing the value Confluent delivers to its customers.

