"With Confluent, the overarching win is that we’re able to spend more time implementing design and better workflows, making improvements within our current systems, and working on product enhancements. We now have the ability to be driven by business needs."
Burton Williams
Principal Data Platform Engineer, Vimeo
Vimeo, a leading SaaS video platform, is on a mission to simplify what it takes to make, manage, and share videos for its customers. Launched in 2004, Vimeo now claims over 260 million users around the world, from enterprises to small businesses and individual content creators—and has 1.7 million-plus paid subscribers and 100 billion-plus video views.
As Vimeo continues to grow, a top business priority for the company is understanding user experience and behavior so it can add ever greater value by providing optimized, adaptive, real-time experiences and the most relevant products.
To this end, access to real-time data has become mission critical for Vimeo.
That’s where Confluent’s data streaming platform and streaming pipelines come into play.
When it comes to video platforms, success largely depends on delivering stellar user experiences.
And for Vimeo—a company that’s earned the reputation of being a high-quality video platform—maintaining and improving on this recognition is nothing short of a business imperative.
As a result, understanding user behavior to help improve and optimize user experience—and inform product decisions—is crucial for Vimeo.
This is where data streaming provides value.
Key business drivers for Vimeo to implement a data streaming platform include:
Deriving real-time insights: Understanding user experience and behaviors in real-time
Driving agile decision-making: Being able to quickly pivot on product, campaigns, A/B tests, experiments, and more
Delivering best-in-class user experiences: Providing optimized, adaptive user experiences at scale, with zero buffering or other issues
Gaining faster iterations: Enabling different teams to utilize data quickly and leverage a faster feedback loop to deliver the best video quality, user experience, and features (with new AI capabilities)
“Data streaming is critical to Vimeo because now we’re able to understand user behaviors and experiences in real-time—and provide our users with better quality video (sans buffering) and a better experience,” Bashiri said. “It allows us to adjust and pivot in-product messaging, online and email marketing campaigns, and more. Especially for product and campaign launches, where we are monitoring the performance among millions of users, real-time visibility helps immensely in making timely decisions such as campaign adjustments or pivoting where needed.”
“Data streaming also allows for optimizing the video quality to ensure smooth playback even in fluctuating network conditions,” he added.
Pre-Streaming Data Hurdles
Prior to data streaming, Vimeo relied on traditional data warehousing solutions with daily batch ETL processes. This resulted in delayed insights into user experience, performance, and behaviors—long after users moved on to the next thing.
“There was a one-day delay before insights would reach our analytics teams,” Bashiri explained. “This delay ultimately limited our decision-making, and we weren’t able to make real-time decisions or quickly pivot after a launch or campaign.”
But the challenges didn’t end there. Here are some of the other problems that stemmed from relying on point-to-point batch data pipelines:
They slowed down the rate at which Vimeo could iterate
Data teams had to re-run batch jobs if something failed
Business stakeholders didn’t have the latest insights due to stale data
Different teams (e.g., Product Management, Product Marketing, Business Analysts) couldn’t readily use real-time data in a self-serve way
Limited Vimeo to mid- and long-term decision-making, as opposed to real-time decision-making with quick feedback loops
The good news?
Embracing data streaming meant Vimeo was able to unlock new real-time use cases and empower internal teams with access to the data and insights they needed.
Amplifying Customer and Business Impact with Confluent Cloud
“Today, having access to real-time data has become mission-critical for Vimeo. This means Vimeo is moving more toward real-time data processing,” Bashiri said.
Vimeo’s data streaming journey started with self-managed open-source Apache Kafka®.
“This resulted in several challenges related to broker version upgrades, scaling, and data observability, with unnecessary engineering bandwidth spent on managing the infrastructure instead of working on the product,” said Burton Williams, Principal Data Platform Engineer at Vimeo.
Vimeo’s infrastructure primarily runs on Google Cloud, which made the move to Confluent Cloud in that same environment a no-brainer. Vimeo was able to leverage Google Cloud’s networking layer to localize its data and achieve lower latencies, which means data can be transmitted faster—ultimately resulting in a better user experience with fewer outages. Additionally, moving to Confluent meant that Vimeo teams could focus on more value-added activities with improved observability and ease of managing infrastructure and resources.
Plus, Confluent helps ensure data quality and security with Stream Governance, allowing Vimeo to safely scale and share data streams across its business.
“With Confluent, the overarching win is that we’re able to spend more time implementing design and better workflows, making improvements within our current systems, and working on product enhancements,” Williams said. “We now have the ability to be driven by business needs.”
Another reason for choosing Confluent is its large ecosystem of pre-built, fully managed connectors. Confluent connectors make it easy to instantly stream between popular data sources and sinks without the hassle of building and self-managing connectors. Vimeo is looking to take advantage of fully managed Confluent connectors for Snowflake, S3, MySQL, and more.
According to Bashiri, with Confluent, Vimeo is now able to:
Gain real-time insights: “Real-time analytics enables Vimeo to build powerful Customer 360 views to understand user behavior in real-time and help product teams with their roadmap. For example, when Vimeo is developing various products and features, this real-time insight into how users are interacting with the product helps guide and inform Vimeo on product strategy. We can also use these insights to help users with product discovery, so they can get the most out of Vimeo as a SaaS.”
Drive agile decision-making: “Timely decision-making is very important for Vimeo as a SaaS company. Our Product Managers, Product Marketers, and Data Science and Analytics teams can now understand how a campaign is performing, monitor product launch performance, or see how an A/B test is doing in real-time. This enables them to pivot if they need to and make decisions quickly. Overall, that drives growth and optimizes costs, which impacts Vimeo’s profitability and bottom line.”
Deliver best-in-class user experiences: “With our adaptive bitrate streaming use case, for example, we are able to look at the network connection of the user or any information from the user platform or the infrastructure in general and then adapt the video quality to provide that seamless video experience, be it live streaming or video playbacks, to our customers.”
Power faster iterations: “By helping us accelerate our speed of iteration, Confluent has helped us improve our workflow and productivity, enabling teams to work faster and more effectively than ever.”
Here’s a look at the business impact Confluent helps Vimeo achieve:
User growth driven by improved customer experiences from using real-time analytics.
Faster time to market for data products (from weeks or days to hours) for internal stakeholders. For example, various teams can now use self-service data streaming to create a dashboard to perform advanced analytics if needed.
Lower total cost of ownership (TCO) with Confluent Cloud—by allowing FTEs to focus on product development versus managing infrastructure.
“If we were to do what Confluent is doing for us today, we would have needed three more people,” Williams said.
Powering Business-Critical Use Cases with Streaming Data Pipelines
Streaming data pipelines with Confluent enable real-time data flows across the organization—getting Vimeo’s data to the right place, in the right format, at the right time.
And with hundreds of millions of users, and a hundred billion views to date, Vimeo must harness unprecedented volumes of data—and data pipelines are vital to ingesting and processing that data in real-time.
Streaming pipelines also allow Vimeo to process data streams in flight, govern data, and securely share data faster, unlocking endless real-time use cases—all while reducing their total cost of ownership.
1. Real-time data warehousing for real-time analytics
Real-time analytics powers timely decision-making, allowing businesses to analyze data as it flows into the business.
With Confluent, Vimeo built a streaming pipeline that collects user clickstream data from viewing devices and delivers it to Vimeo’s cloud data warehouse, Snowflake. Along the way, Vimeo can process the data, apply business logic, and stream it to other downstream analytics and SaaS platforms, including Amplitude, to get out-of-the-box dashboards and reports in real-time.
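To make the shape of that pipeline concrete, here is a minimal sketch of the client-facing leg using the confluent-kafka Python client. It assumes a hypothetical clickstream topic and event schema (the topic name, fields, and connection settings are placeholders, not Vimeo’s actual implementation); the Snowflake and Amplitude legs would sit downstream, for example behind sink connectors.

```python
import json
import time

from confluent_kafka import Producer

# Connection details are placeholders; a real Confluent Cloud cluster also
# needs API-key authentication settings in this config.
producer = Producer({"bootstrap.servers": "<confluent-cloud-broker>:9092"})


def delivery_report(err, msg):
    """Report whether each clickstream event reached the cluster."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")


# A hypothetical clickstream event emitted by a viewing device.
event = {
    "user_id": "u-12345",
    "video_id": "v-67890",
    "action": "play",
    "device": "ios",
    "ts": int(time.time() * 1000),
}

# Keying by user keeps each user's events ordered within a partition.
producer.produce(
    "clickstream.playback",  # hypothetical topic name
    key=event["user_id"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()
```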
While Vimeo is moving more toward real-time analytics, it hasn’t done away with batch-based processing altogether. Confluent makes a Kappa architecture a reality for Vimeo: the same event streams power real-time processing, while batch-style reprocessing is handled by replaying the log, so teams can derive the insights they need from a single pipeline.
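One practical consequence of a Kappa-style setup is that “batch” reprocessing is simply a replay of the same stream. The sketch below (using the same hypothetical topic and placeholder settings as above) shows the idea: a fresh consumer group with auto.offset.reset set to earliest rebuilds a historical aggregate from the retained log instead of re-running a separate batch job.

```python
import json

from confluent_kafka import Consumer

# A new group.id plus auto.offset.reset=earliest replays the topic from the
# start of retention, which is the Kappa-style equivalent of a batch backfill.
consumer = Consumer({
    "bootstrap.servers": "<confluent-cloud-broker>:9092",  # placeholder
    "group.id": "clickstream-backfill",                    # hypothetical group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["clickstream.playback"])

plays_per_video = {}
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        if event.get("action") == "play":
            plays_per_video[event["video_id"]] = plays_per_video.get(event["video_id"], 0) + 1
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
    print(plays_per_video)
```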
“We’re able to do real-time data enrichment and take load away from the application to improve app performance and create real-time impact,” Bashiri said. “Ultimately, having this data in real-time allows us to understand what is happening and how certain things are performing, and then use all of those insights to make changes very quickly.”
How does Vimeo benefit as a business from access to real-time analytics? There are three different ways to look at it: “First is user experience, second is improving revenue and growth by providing more avenues for us for better monetization, and third is reducing costs,” Bashiri said.
2. Adaptive bitrate streaming for optimizing video playback
Adaptive bitrate streaming is a game-changer for video streaming.
With adaptive bitrate streaming, Vimeo is able to dynamically and instantly adjust video quality for its users based on real-time network conditions. This is done by collecting and using real-time data (e.g., network latency, average buffering rate, playback time) to minimize buffering and interruptions, making playback as smooth as possible for viewers.
The result? Seamless user experiences and the best viewing quality at all times—regardless of device, location, or internet speed.
With Confluent, Vimeo built a streaming pipeline from video clients to custom back-end processes that run Vimeo’s adaptive bitrate optimization model. An observability pipeline ingests data from application logs to identify issues in real-time and take action immediately.
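A highly simplified sketch of what the back end of such a loop could look like follows: consume quality-of-experience metrics from a playback topic and pick a bitrate rung when buffering climbs. The topic name, metric fields, bitrate ladder, and thresholds here are all hypothetical; Vimeo’s actual optimization model is a custom back-end process.

```python
import json

from confluent_kafka import Consumer

# Hypothetical bitrate ladder in kbps; a production model would be far richer.
BITRATE_LADDER = [600, 1500, 3000, 6000]


def choose_bitrate(metrics: dict) -> int:
    """Pick a ladder rung from observed buffering and throughput (illustrative logic only)."""
    if metrics["buffer_ratio"] > 0.05:  # frequent stalls: drop to the lowest rung
        return BITRATE_LADDER[0]
    affordable = [b for b in BITRATE_LADDER if b <= metrics["throughput_kbps"] * 0.8]
    return affordable[-1] if affordable else BITRATE_LADDER[0]


consumer = Consumer({
    "bootstrap.servers": "<confluent-cloud-broker>:9092",  # placeholder
    "group.id": "abr-optimizer",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["playback.qoe"])  # hypothetical QoE metrics topic

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    metrics = json.loads(msg.value())
    target = choose_bitrate(metrics)
    # In practice the decision would be published back to the player or CDN,
    # not just printed.
    print(f"session {metrics['session_id']}: target bitrate {target} kbps")
```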
Future Forward: Exploring Flink, AI & ML
For Vimeo, improving video experiences for its users means continuously innovating.
Vimeo is looking to leverage real-time user data to automate hyper-personalization, where Vimeo can feed data and analytics back into the product as well as their operational systems to ultimately surface recommendations to users in real-time.
Vimeo is also exploring the potential to enhance their streaming use cases with Flink.
“Flink can open up more opportunities in the future. Our philosophy is we don’t want to solve problems that have been solved in an elegant, scalable way. We want to leverage stream processing while freeing up bandwidth of engineers and SRE functions—and Flink is part of that strategy,” Bashiri said.
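To give a flavor of what that could look like, here is a hypothetical PyFlink sketch that tallies rebuffering events per video over one-minute windows directly from a Kafka topic. The table definition, field names, and connection properties are illustrative placeholders (authentication and connector dependencies are omitted), not a description of Vimeo’s deployment.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming Table API environment; comparable SQL could also run as a managed Flink statement.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical source table over a playback QoE topic (placeholder settings).
t_env.execute_sql("""
    CREATE TABLE playback_qoe (
        video_id STRING,
        rebuffer_count INT,
        event_time TIMESTAMP(3),
        WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'playback.qoe',
        'properties.bootstrap.servers' = '<confluent-cloud-broker>:9092',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# Rebuffering per video over one-minute tumbling windows, ready to alert on
# or write to a sink table.
result = t_env.sql_query("""
    SELECT
        video_id,
        TUMBLE_START(event_time, INTERVAL '1' MINUTE) AS window_start,
        SUM(rebuffer_count) AS rebuffers
    FROM playback_qoe
    GROUP BY video_id, TUMBLE(event_time, INTERVAL '1' MINUTE)
""")
result.execute().print()
```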
Plus, with AI and machine learning (ML) capabilities becoming table stakes for companies today, Vimeo has invested in a specialized AI team that focuses on the integration of AI/ML into the video experience itself, according to Matthew Shump, VP of Data at Vimeo.
Vimeo has three different use case categories when it comes to how they are implementing AI, Shump said.
“First is what’s core to our product—it’s the video tech. We (Vimeo) are always exploring ways we can use AI or machine learning, or other data science capabilities, or streaming technology to help enable that,” Shump said. “Second is how can we be more efficient and effective when it comes to our operational capabilities, and how can we automate and scale that with modern technologies like AI. Third is hyper-personalization within customer experience, and bringing AI and machine learning data science to that.”
And without data streaming, everything becomes delayed. That becomes a limiting factor, especially when tailoring experiences for customers—be it a guided onboarding process or hyper-personalized recommendations based on a user-specific project, for example.
“Confluent is critical for how Vimeo instruments our product and how we make that information available across a multitude of different surface areas,” Shump explained. “That could mean data fed back into the product, into a pipeline for ML and AI teams, or into the product team’s self-service tools to better understand how our customers are experiencing the product. There are always opportunities for improvement, where analysts and data scientists can curate, measure, and identify usage patterns along the user journey.”
“Streaming allows us to be in the moment: identify new opportunities with real-time data enrichment and real-time personalization as the user is engaging with our product or through other customer touchpoints, quickly operationalize and scale that, and ultimately drive greater activation, engagement, longer-term retention, and success at Vimeo,” Shump said.