"Confluent Cloud has become a vital service for us—the 100% availability and uptime have significantly benefited both our organization and our valued customers."
Ambarish Kumar
Manager - Data Platform, OutSystems
Bringing one of the first low-code platforms to market in 2001, OutSystems revolutionized the high-performance application development platform space. Today, its platform serves customers in 87 countries across 22 industries.
As the application platform market has grown more crowded and competitive, the company has kept pace by continually modernizing its approach to providing observability for its developer users, which requires integrating data and delivering real-time data to the right place. To meet customer demand for the ability to monitor and troubleshoot in real time, OutSystems needed to make data streaming a central part of its data strategy.
Find out how building streaming data pipelines on Confluent Cloud has helped OutSystems cut operational costs, improve the customer experience, and unlock innovation and developer productivity.
Opportunity: Enhance Speed to Market and Reliability
Businesses and enterprise companies rely on the OutSystems low-code app development platform to quickly create and launch apps and portals. But speed to market alone isn’t what makes the platform so valuable for today’s businesses.
Having a reliable AppDev platform—one with minimal downtime and built-in real-time observability—is crucial to maintaining user trust and protecting the bottom line.
Without the ability to see and react to platform performance at a moment’s notice, app developers who rely on the platform’s troubleshooting capabilities would be unable to bring apps back online quickly. Reducing mean time to recovery (MTTR) would not only enhance the developer experience but also allow businesses to seamlessly accommodate rapid growth in their user base and engagement levels.
OutSystems needed to build data pipelines that could give internal teams, as well as its developer user base, visibility into operational and product usage data.
Pipelines Delay Real-Time Data Access and Agile App Development
From artificial intelligence (AI) and machine learning (ML) to real-time ingestion and processing of telemetry data, there were numerous data-driven capabilities that operations and product teams at OutSystems needed to develop. But before migrating to Confluent Cloud, the data pipelines OutSystems had in place could not deliver the business agility and reliability needed to support these capabilities.
During the initial stages of development, the team dealt with substantial operational burdens while attempting to operate a scalable, reliable service. Data transfers to downstream consumers failed on a weekly basis. This instability resulted in high latency and frequent periods of downtime that the operations team had to address.
As a result, the data platform struggled to meet the 99.95% uptime service-level agreement (SLA) needed to make new features ready for its stakeholders. Failing to meet that SLA could have disastrous impacts on the business. For example, the dashboards and APIs that customers relied on would be unable to return data, hindering developers’ ongoing app development and issue resolution.
Why OutSystems Chose Confluent for Streaming Data Pipelines
OutSystems’ technical leadership knew the company needed a managed data streaming platform that could ensure stability and eliminate operational burden. That’s why OutSystems chose Confluent Cloud—the only fully managed, cloud-native Apache Kafka® service.
“Kafka stands as the cornerstone of our data platform, infusing stability into our system and serving as the pivotal link that binds our operations. Since transitioning to Confluent Cloud, we have experienced uninterrupted operations and minimal developer intervention. It’s become a vital service for us—the 100% availability and uptime have significantly benefited both our organization and our valued customers.” — Ambarish Kumar, Manager - Data Platform, OutSystems
Solution: Streaming Data Pipelines Built on Confluent Cloud
OutSystems needed a data streaming platform that would make it easy to produce, share, consume, and trust real-time data across the organization without the instability previous solutions had introduced.
Business Savings and Innovation Unlocked With Confluent Cloud on AWS
Once OutSystems migrated to Confluent, its data platform team could build decentralized, declarative, developer-oriented, and governed data pipelines instead of having its data tightly bound to centralized systems. This approach has helped OutSystems maximize both the value of its streaming data and the reliability of its AppDev platform for customers.
Using Confluent Cloud on AWS to build streaming data pipelines allowed OutSystems to:
See an approximate 20% reduction in operational costs
Free up the time of up to two full-time employees who would otherwise be managing Kafka operations
Meet its data platform’s internal 99.95% SLA and experience an actual uptime of 100%
Deliver best-in-class developer experiences, with ease of troubleshooting for accelerated delivery as well as the ability to build new features around data (e.g., AI, machine learning, analytics) while ensuring best practices and compliance
Power critical use cases, including real-time observability and customer 360 for understanding product usage and making data-driven decisions on how to engage and support users
Today, the company’s data platform manages billions of daily records through a data processing pipeline that spans Confluent Cloud and a suite of Kafka Streams microservices deployed on Amazon Elastic Kubernetes Service (EKS). This pipeline processes data that fuels numerous essential services serving both internal operations and customer-facing functionality.
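To make the shape of such a microservice concrete, below is a minimal Kafka Streams sketch that reads raw log events, drops empty records, tags each event with a processing timestamp, and writes the result downstream. The topic names, bootstrap address, and JSON handling are illustrative assumptions, not details published by OutSystems, and Confluent Cloud credentials are omitted.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class LogProcessingApp {
    public static void main(String[] args) {
        // Connection settings for Confluent Cloud would also include SASL/SSL
        // credentials; they are omitted here for brevity.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "log-processing-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "<confluent-cloud-bootstrap>:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical input topic carrying raw platform log events as JSON strings.
        KStream<String, String> logs = builder.stream("platform-logs");

        // Drop empty records and wrap each event with a processing timestamp
        // before publishing it for downstream consumers.
        logs.filter((key, value) -> value != null && !value.isEmpty())
            .mapValues(value ->
                "{\"processedAt\":" + System.currentTimeMillis() + ",\"event\":" + value + "}")
            .to("platform-logs-enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```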
Taking Advantage of Confluent Cloud’s Data Governance Capabilities
OutSystems uses Stream Governance to create centralized standards for data observability, security, and compliance without compromising developer agility.
Schema Registry simplifies how OutSystems teams manage and validate schemas for topic message data, as well as how they define, enforce, update, and deploy standards in the company’s data catalog. Using these features has helped eliminate friction between producing and consuming events from shared pipelines.
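As a minimal sketch of how a producer typically validates events against Schema Registry, the example below uses the Confluent Avro serializer, which looks up and registers the record’s schema before publishing. The schema, topic name, and endpoints are illustrative assumptions rather than OutSystems’ actual definitions, and Schema Registry credentials are omitted.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SchemaValidatedProducer {
    // Hypothetical Avro schema for a product usage event.
    private static final String USAGE_EVENT_SCHEMA =
        "{\"type\":\"record\",\"name\":\"UsageEvent\",\"fields\":["
        + "{\"name\":\"userId\",\"type\":\"string\"},"
        + "{\"name\":\"action\",\"type\":\"string\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<confluent-cloud-bootstrap>:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
            "org.apache.kafka.common.serialization.StringSerializer");
        // The Confluent Avro serializer registers and checks schemas in Schema Registry,
        // so producers cannot publish data that breaks the agreed contract.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "<schema-registry-url>");

        Schema schema = new Schema.Parser().parse(USAGE_EVENT_SCHEMA);
        GenericRecord event = new GenericData.Record(schema);
        event.put("userId", "user-123");
        event.put("action", "app_published");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("product-usage", "user-123", event));
        }
    }
}
```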
Stream Lineage features a graphical user interface that shows event streams and data relationships, giving OutSystems both holistic and granular views of data sources, sinks, and transformations. Using Stream Lineage allows data streams to become more trusted, shared, and adopted by teams across the organization.
“The visual representation of data sources and topics with Stream Lineage is very helpful because a lot of time, the idea of streaming is new and this makes it possible to have discussions about where data is coming from and how it’s moving. I can’t stress enough how nice it is to have it be stable, always work, and always be there.” — Bryan Jacobs, Principal Product Architect, OutSystems
Streaming Pipelines Unlock Real-Time Observability and Customer 360
Combined with self-service data access, centralized data governance capabilities have made maintaining compliance simpler for the entire OutSystems organization. As a result, engineers and other technical teams can dedicate more time to strategic work that drives revenue and efficiency.
Now, streaming data pipelines built on Confluent Cloud are powering real-time observability and customer 360 at OutSystems.
Use Case #1: Building Observability Pipelines on Confluent Cloud
Developers using OutSystems to write new applications need to be able to monitor and troubleshoot when something is not working as planned (e.g., bottlenecks in the app, code-related issues). If they aren’t able to access log data, they will not be able to quickly identify and rectify the issue.
Streaming data pipelines provide the real-time observability necessary for developers to understand what is happening in their applications, identify issues in real time, and take immediate action.
At OutSystems, various system components produce log data—for example, the service center, subscription consoles, and customer interaction data from the runtime. This data is streamed into Confluent Cloud; from there, it can be ingested into a data lake built on AWS using Amazon Simple Storage Service (Amazon S3), processed in flight, and fed back into the platform console so that both internal users and external customers can access it.
Engineers at OutSystems have built observability pipelines to stream, process, govern, and share this data in real time, powering real-time telemetry and network tracing capabilities.
The real-time telemetry pipeline—which sends application logs and telemetry data to Amazon S3, Elasticsearch, and Amazon OpenSearch—is used internally to understand customer product usage. And the network tracing pipeline streams data on cross-application network calls to enable network performance tracking and troubleshooting.
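One way to picture these two pipelines is a small routing topology that splits incoming log events into separate telemetry and network-tracing topics, which sink connectors (for example, to Amazon S3 or OpenSearch) could then read. This is a sketch under assumed topic names and a hypothetical "type" field; it is not OutSystems’ actual topology.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Branched;
import org.apache.kafka.streams.kstream.KStream;

public class ObservabilityRoutingApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "observability-routing-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "<confluent-cloud-bootstrap>:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> logs = builder.stream("platform-logs");

        // Route each log event by its (hypothetical) "type" field: telemetry events
        // feed the data lake / search sinks, network traces feed performance tracking.
        logs.split()
            .branch((key, value) -> value != null && value.contains("\"type\":\"telemetry\""),
                Branched.withConsumer(stream -> stream.to("observability.telemetry")))
            .branch((key, value) -> value != null && value.contains("\"type\":\"network-trace\""),
                Branched.withConsumer(stream -> stream.to("observability.network-tracing")))
            .noDefaultBranch();

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```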
Use Case #2: Customer 360 Done Right With Streaming Data Pipelines
OutSystems has also built an analytical pipeline to supply teams with platform user insights. In this pipeline, product usage data is streamed into Confluent Cloud, processed, and converted into the right structure before landing in Snowflake.
Internal teams can access and analyze this data in a real-time, self-serve way to understand product usage and consumption patterns (e.g., whether developers are using certain paid features). This information, updated in real time, helps account teams make faster, data-driven decisions about which customers to engage with or when to reach out with personalized offers or support.
In the past, this type of log data was stored in databases, which resulted in slow analysis and even slower responses to customer needs. With data streaming from Confluent Cloud to OpenSearch or other analytical sinks, OutSystems can now query large amounts of data with low latency and minimize the time from data ingestion to analysis and action.
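A minimal sketch of this kind of processing step is shown below: it keeps only paid-feature usage events and counts them per customer, producing a compact topic that an analytical sink such as a Snowflake sink connector could read. The topic names, the "feature_tier" field, and the keying by customer ID are assumptions for illustration only.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class UsageAnalyticsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "usage-analytics-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "<confluent-cloud-bootstrap>:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Raw product usage events, keyed by customer ID (hypothetical topic).
        KStream<String, String> usage = builder.stream("product-usage-raw");

        // Keep only events for paid features and count them per customer; the
        // resulting topic can then be loaded into the analytics warehouse.
        usage.filter((customerId, event) ->
                 event != null && event.contains("\"feature_tier\":\"paid\""))
             .groupByKey()
             .count()
             .toStream()
             .to("product-usage-paid-feature-counts",
                 Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```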
“When you use a database, you don’t think about if your database is running—it just runs. It’s always there and it always works. Confluent does that for streaming data, ensuring it’s always there, always running, always stable. We don’t have to worry about anything and instead can focus on building applications while the data streams driving our business run without interruption.” — Bryan Jacobs, Principal Product Architect, OutSystems
Looking Forward: What’s Next for OutSystems With Confluent Cloud
Confluent Cloud has given OutSystems the ability to build streaming pipelines that connect any data source or destination and make trusted, enriched data accessible across the business. That freedom of choice and extensibility means OutSystems can continue saving on cost, increasing visibility in operations and customer activity, and enabling developers and business teams with new real-time or event-driven capabilities.
Migrating to Confluent has already unlocked valuable use cases like real-time observability and customer 360, and the team at OutSystems has its sights set on additional data streaming use cases that will benefit the business. Next, the data platform team plans to implement real-time audit logging. This use case will allow the company to better support customer apps by tracking every change that services make (e.g., adding or removing a role, creating or deleting apps, adding domains) within its infrastructure.
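As a rough sketch of what publishing such an audit event could look like, the snippet below sends a small JSON record describing a role change to an audit topic. The topic name, event fields, and values are purely hypothetical; the actual design is something the OutSystems team is still planning.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AuditEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<confluent-cloud-bootstrap>:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // A hypothetical audit event recording a role being added to an app.
        String auditEvent = "{"
            + "\"service\":\"identity\","
            + "\"action\":\"role_added\","
            + "\"target\":\"app-42\","
            + "\"actor\":\"user-123\","
            + "\"timestamp\":" + System.currentTimeMillis()
            + "}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("platform-audit-log", "app-42", auditEvent));
        }
    }
}
```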
As streaming data pipelines are leveraged by more teams and across more systems in the organization, the resulting network of data streams connected to Confluent Cloud will expand what’s possible for OutSystems.
Get Started With Confluent Today
Get $400 of free credit to use within 30 days of signing up