
The Role of Service Partners with Confluent Cloud


"What is our role with Confluent Cloud?” is a valid question frequently asked by service and delivery partners who have traditionally made money from services related to the installation and upkeep of on-prem applications. To answer this, there are two key prerequisites that need to be understood: 

  1. What offerings does Confluent Cloud provide?

  2. As a service partner, what additional activities need to be performed to support Confluent Cloud customers?

The aim of this blog is to provide guidelines, in the form of a checklist, that help service partners fully utilize Confluent Cloud services and eliminate any gray areas around the scope of work they need to perform in a SaaS world.

Confluent Cloud (CC) is a fully managed, cloud-native Kafka service for connecting and processing all of your data, everywhere it's needed. We provide the following functions as a service:

  • Connectors

  • Kafka

  • ksqlDB

  • Schema Registry

Other enterprise-ready features include, but are not limited to:

  • Monitoring (Control Center)

  • RBAC

  • Audit Logs

  • Data Flow for Data Lineage

  • Enterprise Support 

  • Data Governance

All services can be self-provisioned based on customer needs and are available on all major cloud providers (AWS, Azure, and GCP), with a consumption-based pricing model known as Usage-Based Billing (UBB).

Confluent Cloud and Service Partners

If you’re asking, “What is a fully managed service?”, here is what we take care of:

  • Installation and management

  • At-rest & in-transit data encryption

  • Uptime SLAs

  • Throughput guarantees

  • Upgrades

  • And much more!

Let me draw an analogy for better understanding: the engine of a Formula 1 car is the most critical element and is typically built by a specific engine manufacturer. In turn, it’s sold to and used by multiple racing teams.

Consider a scenario where engines built by Mercedes are put into Mercedes, McLaren, Aston Martin, and Williams cars.

As an engine expert and manufacturer, Mercedes ensures the performance of the engine alone and takes responsibility for the engine, while the racing car companies take ownership of the rest of the car like design, weight, wheels, aerodynamics, etc.

This is exactly what Confluent Cloud offers: our technology is the engine that racing car companies (service partners) use to build multiple cars (applications) around the same engine.

Confluent Cloud provides the engine while the race cars are built by Service Partners!

The benefit to the end customer is the ability to build multiple data streaming applications quickly, all while benefiting from running on Confluent Cloud. 

That brings us to my curious reader’s key question: “What activities do I need to perform as a service partner on Confluent Cloud?”

I’d like to break up activities into three different personas:

  • Developers

  • Operators

  • Architects

The Developer Persona

This involves developing applications on Confluent Cloud as well as configuring Confluent Cloud to support those applications and use cases optimally. This typically entails:

  • Connectors

    • Choosing the right connector

    • Optimizing the number of tasks

    • Configuring single message transforms (SMTs)

    • Metadata configuration

  • ksqlDB

    • Scoping the number of CSUs required

    • Developing ksqlDB pipelines

  • Kafka

    • Estimating the number of topics

    • Topic creation and partitioning strategy (see the sketch after this list)

    • Data retention/data replication

    • Data format

  • Schema Registry

    • Creating schemas with compatibility modes

All of this must be designed and built by our service partners.
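
To make the Kafka items above concrete, the sketch below creates a topic with an explicit partition count, replication factor, and retention setting against a Confluent Cloud cluster. The topic name, partition count, retention period, and the placeholder bootstrap server and API key/secret are illustrative assumptions, not values from this post:

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateOrdersTopic {
  public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    // Placeholder Confluent Cloud endpoint and API key/secret -- substitute your own.
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
        "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username='<API_KEY>' password='<API_SECRET>';");

    try (AdminClient admin = AdminClient.create(props)) {
      // 6 partitions, replication factor 3, and 7-day retention are illustrative sizing
      // choices, normally derived from the throughput and consumer-parallelism estimates.
      NewTopic orders = new NewTopic("orders", 6, (short) 3)
          .configs(Map.of(TopicConfig.RETENTION_MS_CONFIG,
              String.valueOf(7L * 24 * 60 * 60 * 1000)));
      admin.createTopics(List.of(orders)).all().get();
      System.out.println("Topic 'orders' created");
    }
  }
}
```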

As you can see, the developer’s activities remain the same whether they target self-managed Confluent Platform or Confluent Cloud.
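
Schema work is a good illustration of that point: registering a schema and pinning a compatibility mode uses the same Schema Registry REST calls whether the registry is self-managed or the one hosted in Confluent Cloud; only the endpoint and credentials change. Below is a minimal sketch assuming a hypothetical orders-value subject and a placeholder Schema Registry endpoint and API key/secret:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class RegisterOrdersSchema {
  public static void main(String[] args) throws Exception {
    // Placeholder Confluent Cloud Schema Registry endpoint and credentials.
    String baseUrl = "https://psrc-xxxxx.us-east-1.aws.confluent.cloud";
    String auth = Base64.getEncoder()
        .encodeToString("<SR_API_KEY>:<SR_API_SECRET>".getBytes());
    HttpClient client = HttpClient.newHttpClient();

    // Register a small illustrative Avro schema under the orders-value subject.
    String avroSchema = "{\"type\":\"record\",\"name\":\"Order\","
        + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}";
    String payload = "{\"schema\": \"" + avroSchema.replace("\"", "\\\"") + "\"}";
    HttpRequest register = HttpRequest.newBuilder()
        .uri(URI.create(baseUrl + "/subjects/orders-value/versions"))
        .header("Content-Type", "application/vnd.schemaregistry.v1+json")
        .header("Authorization", "Basic " + auth)
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build();
    System.out.println(client.send(register, HttpResponse.BodyHandlers.ofString()).body());

    // Pin the subject to BACKWARD compatibility so existing consumers keep working
    // as the schema evolves.
    HttpRequest compat = HttpRequest.newBuilder()
        .uri(URI.create(baseUrl + "/config/orders-value"))
        .header("Content-Type", "application/vnd.schemaregistry.v1+json")
        .header("Authorization", "Basic " + auth)
        .PUT(HttpRequest.BodyPublishers.ofString("{\"compatibility\": \"BACKWARD\"}"))
        .build();
    System.out.println(client.send(compat, HttpResponse.BodyHandlers.ofString()).body());
  }
}
```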

The Operator Persona

As you might have expected, some tasks generally associated with on-prem operations, such as software installation, configuration, and support, go away when using SaaS software like Confluent Cloud; they are provided out of the box. What remains is still important:

  • Monitoring - Control Center for Confluent Cloud provides a vast collection of metrics related to the health of the environment. For use cases where additional metrics need to be monitored, Confluent Cloud exposes the Metrics API, which can be integrated with the third-party monitoring tool of your choice (a sketch follows this list).

  • Application Deployment - Promoting the applications built on Confluent Cloud into higher environments, managing the application change management lifecycle, etc.

  • Application Support - Applications built by service partners on top of Confluent Cloud have to be supported by those partners for their lifetimes.
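
As a rough illustration of the monitoring point above, the snippet below polls the Confluent Cloud Metrics API for the bytes a cluster received over the last hour, so the result can be forwarded to an external monitoring tool. The endpoint, metric name, and payload shape follow the v2 Metrics API as I understand it and should be verified against the current documentation; the cluster ID and Cloud API key/secret are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Base64;

public class PollClusterIngress {
  public static void main(String[] args) throws Exception {
    // Placeholder Cloud API key/secret authorized to read metrics.
    String auth = Base64.getEncoder()
        .encodeToString("<CLOUD_API_KEY>:<CLOUD_API_SECRET>".getBytes());

    Instant now = Instant.now();
    // Hourly ingress for one cluster; lkc-xxxxx is a placeholder cluster ID.
    String query = "{"
        + "\"aggregations\": [{\"metric\": \"io.confluent.kafka.server/received_bytes\"}],"
        + "\"filter\": {\"field\": \"resource.kafka.id\", \"op\": \"EQ\", \"value\": \"lkc-xxxxx\"},"
        + "\"granularity\": \"PT1H\","
        + "\"intervals\": [\"" + now.minus(1, ChronoUnit.HOURS) + "/" + now + "\"]"
        + "}";

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.telemetry.confluent.cloud/v2/metrics/cloud/query"))
        .header("Content-Type", "application/json")
        .header("Authorization", "Basic " + auth)
        .POST(HttpRequest.BodyPublishers.ofString(query))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
    // Ship this payload to the monitoring tool of choice (Prometheus, Datadog, etc.).
    System.out.println(response.body());
  }
}
```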

The Architect Persona

The majority of Architect activities remain the same; tasks like cluster sizing and security setup are not provided out of the box by Confluent Cloud and stay with the architect. These remaining tasks are vitally important:

  • Architecting the overall solution by defining the type of clusters required, replication and HA strategies, capacity, DR considerations, Schema Registry, etc.

  • Security - Setting up security, i.e., authentication and authorization activities such as single sign-on, adding users, defining roles, ACLs, API key generation and management, etc. (see the sketch after this list).

  • Performance Benchmarking - Confluent Cloud guarantees maximum read and write throughput at the cluster level, but each cluster holds more than one topic, and it is the topics that actually carry the events. Topic design therefore becomes critical to ensure individual topics can scale to their own throughput requirements.
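
For the security item above, one recurring task is granting a service account access to a topic via ACLs. Here is a minimal sketch using the Kafka AdminClient; the service account ID, topic name, and the placeholder bootstrap server and API key/secret are illustrative assumptions:

```java
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class GrantReadAccess {
  public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    // Placeholder Confluent Cloud endpoint and an API key/secret with admin rights.
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
        "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username='<API_KEY>' password='<API_SECRET>';");

    try (AdminClient admin = AdminClient.create(props)) {
      // Allow a hypothetical service account (sa-123456) to read the orders topic.
      AclBinding readOrders = new AclBinding(
          new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
          new AccessControlEntry("User:sa-123456", "*",
              AclOperation.READ, AclPermissionType.ALLOW));
      admin.createAcls(List.of(readOrders)).all().get();
      System.out.println("Read ACL created for User:sa-123456 on topic 'orders'");
    }
  }
}
```

In practice a consumer also needs a matching ACL on its consumer group; the same createAcls call covers that case with a GROUP resource pattern.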

With the to-do list covered, let me come to the easy part and tell you what doesn’t need to be done on Confluent Cloud:

  • Identifying the infrastructure, i.e., defining the number of brokers, ZooKeeper nodes, memory, CPU, etc.

  • Installation and upgrades of software services such as Kafka, connectors, ksqlDB, etc.

  • Infrastructure support of software services

To summarize, there is minimal impact on the scope of activities for the Developer and Architect, while some of the Operator activities have been automated and are provided out of the box by Confluent Cloud.

Conclusion

Confluent Cloud provides a platform whose stability is underpinned by years of operational expertise from the founders and major contributors of Apache Kafka, while also being:

  • ELASTIC - massive scale when required

  • GLOBAL - built for hybrid and multi-cloud environments in all major clouds

  • INFINITE - infinite data storage

  • COMPLETE - not only Kafka: connectors, ksqlDB, Schema Registry, and much more!

  • HIGHLY AVAILABLE - 99.9% Kafka SLA

  • SECURE - Enterprise-ready encryption and security features out of the box

So while we provide best-in-class technology, service partners have a major role to play in bringing in:

  • Deep understanding of customer landscape

  • Domain knowledge

  • Application development expertise

  • Integration with other technologies

All of this helps customers realize the full business value of Confluent Cloud quickly.

Venky is a Senior Solutions Architect at Confluent based out of Bengaluru, India. He primarily works with GSIs in the region in a pre-sales role and has 13 years of experience working at GSIs as well as implementing Confluent at large customers.

