
US Foods Builds Digital-First Ecommerce Platform with Data Streaming

Author: Zion Samuel

Headquartered just outside of Chicago, US Foods is a food distribution company that supports companies across all industries and sectors. Operating more than 70 brick-and-mortar retail locations, the company not only focuses on B2B distribution but now runs a thriving B2C business as well.

In a recent fireside chat, Confluent’s Scott Heller sat down with Brett Eschbach, Senior Director of Digital Platform Services at US Foods, to talk about how the company has been using data streaming to drive its event-driven vision for digital food services and future-proof the organization’s data stack.

Learn why US Foods turned first to Apache Kafka®—and then Confluent Cloud—to bring its data mesh strategy to life in this Q&A recap.

Why US Foods Needed Apache Kafka for Better Digital Platform Services

Scott: To start, can you tell us about the role you and your team play in shaping US Foods’ technical direction?

Brett: As the senior director of digital platform services, I built this department and team from the ground up. Ultimately, our goal was to prepare for the significant digital transformation that US Foods has undergone over the last two years.

Our team focuses on building digital platform services that many of our product teams rely on to provide new features and capabilities to our customers and third-party partners.

Within my department, we have three to four engineering teams—across cloud, DevOps, and cyber reliability, as well as the data integration team.

Together, it’s our responsibility to figure out how we can actually scale out these services, and that involves technical engineering as well as the operations that ensure we can deploy, scale, and manage the platform as needed.

Turning US Foods into a Digital-First Food Service With Data Streaming

Scott: What made you think Kafka was the right technology for US Foods?

Brett: Around two and a half years ago, there were five or six of us sitting in a conference room looking at a blank whiteboard. We were trying to figure out a more digital-friendly way to deploy a new ecommerce system—one that would allow us to deploy features faster to our customers and, at the same time, focus on scale and reliability.

Once we had determined what we wanted the overall architecture to look like, data streaming and data mesh architecture were at the forefront of our minds.

Many parts of US Foods’ backend systems were supporting low-latency transactions. We have all this downstream data that we heavily rely on, so the question was: how do we take all that data and bring it as close as possible to the microservices we’ve deployed, so that we get real-time reliability, resiliency, and consistent performance?

So we knew that having a data mesh architecture, built with Kafka, would be critical to how our services operate. It empowers us as a team and as an organization to deliver better features to the market, faster.
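To make the event-driven pattern Brett describes concrete, here is a minimal sketch of how a microservice in a Kafka-based data mesh might shape a domain event before publishing it. The topic name, event type, and field names are hypothetical, not US Foods' actual schema:

```python
import json
import uuid
from datetime import datetime, timezone

def make_order_event(order_id: str, status: str) -> tuple[bytes, bytes]:
    """Build a (key, value) pair for a hypothetical `orders.status` topic.

    The key is the order ID, so every event for one order lands on the
    same partition and is consumed in order. The value is a
    self-describing JSON envelope; all field names are illustrative.
    """
    envelope = {
        "event_id": str(uuid.uuid4()),
        "event_type": "OrderStatusChanged",
        "occurred_at": datetime.now(timezone.utc).isoformat(),
        "payload": {"order_id": order_id, "status": status},
    }
    return order_id.encode("utf-8"), json.dumps(envelope).encode("utf-8")

# A Kafka producer client would then hand this pair to the topic, e.g.:
# producer.produce("orders.status", key=key, value=value)
key, value = make_order_event("ord-1001", "PICKED")
```

Keying by the domain identifier is what lets downstream services rebuild per-order state without cross-partition coordination.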

Scott: So now that you’ve begun implementing a data mesh architecture, tell us—what are some of the problems you’re trying to solve with data streaming at a high level?

Brett: We’ve mainly been focused on delivering the new ecommerce platform that I mentioned earlier. The platform we built is called MOXe, and it allowed us to completely transform our previous ecommerce systems. We’re now in the process of migrating existing customers to this new platform.

Many of our internal conversations are focused on how we can transform US Foods into a “digital food company.” Although that idea of “digital-first” is often watered-down and overused, the way that we’re approaching it is that we’re challenging ourselves to build systems that can holistically scale on demand, while remaining highly resilient and performant.

Why US Foods Chose Confluent to Take Data Streaming to the Next Level

Scott: So with that vision in mind, what made you want to adopt Confluent Cloud versus going forward with self-managing open-source Kafka?

Brett: When I think about our digital journey over the last couple of years, we started out very small. At first, we were really trying to understand the architecture, and we focused on building out proofs of concept (PoCs) with open-source Kafka and handling all the operations and management ourselves.

Later, we moved into hosted services like Amazon MSK. Those steps helped us move the needle at the start, but we began looking at Confluent as our next step because I wanted to ensure my team could focus its energy on the right things.

Our dedicated eight-person team manages a pretty significant platform with a number of technologies running on top of it. It didn’t make sense to have that entire team focused only on managing Kafka.

Instead, we wanted just one person on that team overseeing the Kafka platform services. Rather than managing Kafka itself, that person focuses on building services and integrating with upstream and downstream connections.

What allowed us to really push forward were the enterprise capabilities, scalability, and reliability that Confluent Cloud offers.

Scott: You touched on briefly how many other technologies are already in your tech and data stack. Can you tell us more about that?

Brett: Honestly, it’s a little bit of everything. We have two mainframe systems that handle core and backend data processing for everything we do. We also run a hybrid environment of Microsoft SQL and Oracle databases, and US Foods is a huge Snowflake customer as well.

So, being able to collect data from all of these systems and move it into Confluent Cloud is critical. And eventually, that data makes its way into our MongoDB Atlas environment, where a lot of the data stores live for our domain-based APIs.

US Foods relies on all kinds of source and sink connectors to get data into Confluent.
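As a sketch of what that looks like in practice, a fully managed sink connector that lands topic data in MongoDB Atlas is configured declaratively rather than coded by hand. The property names below follow Confluent's managed MongoDB Atlas Sink connector, but every value (and the topic, database, and collection names) is a placeholder, not US Foods' actual setup:

```json
{
  "connector.class": "MongoDbAtlasSink",
  "name": "orders-to-atlas",
  "kafka.api.key": "<API_KEY>",
  "kafka.api.secret": "<API_SECRET>",
  "topics": "orders.status",
  "input.data.format": "JSON",
  "connection.host": "<cluster>.mongodb.net",
  "connection.user": "<USER>",
  "connection.password": "<PASSWORD>",
  "database": "ecommerce",
  "collection": "order_events",
  "tasks.max": "1"
}
```

With a managed connector, Confluent runs and scales the connector tasks; the team only owns this configuration.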

Scott: It sounds like a pretty diverse technology stack. What’s interesting is seeing how these all can tie together through Kafka and Confluent. 

When looking at new or future technologies you want to adopt, it’s easy to think, “If only I could start all over.” But the reality is there is no starting over. It’s all about figuring out how to bring new and old technologies together in a way that moves the business forward while it’s still running.

Brett: I think where Confluent really shines is being able to connect different downstream systems and help us get that event streaming architecture built out. 

Benefits of Standing Up a Digital Platform With Confluent Cloud

Scott: Which specific features has US Foods gotten the most value from?

Brett: The managed connectors are probably the most important feature we use from Confluent. In the past, we had to write our own connectors and build our own services on top of ZooKeeper.

Now, there are so many backend processes we just don’t have to handle anymore. Leveraging Kafka truly as a service removes so much labor from my team and makes our integration so much easier, especially with the ecosystem of connectors that Confluent provides.

Stream Governance is also really important to us. From a security perspective, being able to use role-based access (RBAC) to control who has access to what has been critical. 
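As a concrete illustration of the RBAC model Brett mentions, access in Confluent Cloud can be granted per principal and per resource. The command below is a sketch only: the IDs are placeholders and exact flag names can vary by Confluent CLI version:

```shell
# Grant a service account read-only access to a single topic
# (all IDs here are placeholders).
confluent iam rbac role-binding create \
  --principal User:sa-abc123 \
  --role DeveloperRead \
  --environment env-xyz789 \
  --cloud-cluster lkc-123456 \
  --resource Topic:orders.status
```

Scoping roles like `DeveloperRead` to individual topics is what lets a central team keep "who has access to what" auditable as more product teams come onto the platform.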

Additionally, we’re striving to be a very DevOps-driven organization, so realizing real-time, enterprise observability through DevOps pipelines is super important. We want to know when there are scaling events or potential issues happening—or about to happen—on the backend.

All of that data gets shoved into our downstream observability platform and gives us reliable visibility into what’s happening across all of those systems.

What’s great is that we don’t really have to do anything management-wise, but we still know what’s happening. Confluent allows us to get both the hands-off experience and continuous visibility we need.

Scott: What are the most impactful technology and business benefits you’ve gotten from using Confluent Cloud?

Brett: When we embarked on this journey, we set goals around the levels of elasticity, reliability, and performance we wanted to realize. Confluent really helps us excel in those three areas because of its on-demand scaling.

US Foods relies on a microservices architecture that is fairly large and robust right now. Confluent can really handle that scale because of its Infinite Storage. It takes a lot of that worry away because we know the underlying platform is highly fault-tolerant and resilient across all of our cloud systems.

That trust is vital to our ability to scale because, in turn, we can deliver on the service-level agreements (SLAs) that we promise to our customers.

Distributed architectures like Kafka look great on paper, but they’re really hard to actually manage—it takes a lot of methodical planning. Having a partner like Confluent has really helped us get to what I would call an enterprise-class state of data streaming.

Glean Data Streaming Insights from Technology and Industry Experts 

Across industries, companies like US Foods are using Confluent-managed Kafka to transform their real-time operations and retail services. Want to hear from the experts and business leaders making this transformation happen?

  • Zion Samuel is a writer on the Brand Marketing team at Confluent. Prior to Confluent, Zion spent several years writing for organizations in various technology and healthcare sectors.

