Interoperability is the ability of systems, apps, and services to communicate with one another to accomplish a task. While the term is used in many contexts, it matters most when disparate software and hardware systems are composed to solve a single problem. The higher the interoperability of the individual components, the less custom work needs to be done.
In every scenario where interoperability is needed, the solution is to make the systems conform to a common communication standard. Within a single company or project, the components are well known and can be made to conform to that standard. However, when products are built at different times or by different companies, a global standard is required.
Confluent provides a global standard for interoperability by decoupling systems from rigid interfaces and automating data collection, processing, and integration, unlocking seamless, real-time data streaming, interoperability, and insights across your entire infrastructure.
An everyday example of interoperability comes from observability and business intelligence. Does the visualization platform support all the data sources you need to report from? Most support database access via ODBC/JDBC, as well as HTTP calls, but can they support live data pushed in via something like Kafka? Can they support authentication mechanisms like OAuth? Consider a harder problem: can the visualization tool query the data source by forwarding the logged-in user's credentials to the data source, instead of using a single system-defined account for all users? This level of interoperability is usually provided through some form of impersonation or a mechanism, such as a token, that forwards credentials to downstream systems.
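As a minimal sketch of the token-forwarding idea (the endpoint, payload, and token variable here are hypothetical, and a real deployment would use a proper OAuth flow such as token exchange), the downstream query simply carries the logged-in user's credential rather than a shared service account:

```python
import requests

def query_as_user(user_token: str, sql: str) -> dict:
    """Run a query against a downstream data source on behalf of the
    logged-in user by forwarding their bearer token instead of a
    shared, system-wide credential.  URL and payload are illustrative."""
    response = requests.post(
        "https://data-source.example.com/api/query",   # hypothetical endpoint
        headers={"Authorization": f"Bearer {user_token}"},
        json={"query": sql},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```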
Another example of interoperability is authentication in enterprise software. Does each application an employee needs access to require a separate account, or can the employee authenticate via single sign-on against a central directory (e.g., Active Directory or an X.500/LDAP directory)? Can the groups of employees defined in the directory be used as roles in the applications for role-based access control (RBAC)?
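To make the group-to-role idea concrete, here is a hypothetical sketch; it assumes the single sign-on flow has already produced a validated set of claims containing the user's directory groups, and the group and role names are placeholders:

```python
# Hypothetical mapping from directory groups (as delivered in an SSO
# token's claims) to application roles used for RBAC checks.
GROUP_TO_ROLE = {
    "eng-data-platform": "editor",
    "finance-analysts": "viewer",
    "it-admins": "admin",
}

def roles_for_user(claims: dict) -> set[str]:
    """Derive application roles from the 'groups' claim of a validated
    SSO token.  The claim name and group names are illustrative."""
    return {GROUP_TO_ROLE[g] for g in claims.get("groups", []) if g in GROUP_TO_ROLE}

def can_edit(claims: dict) -> bool:
    """Allow editing only for users whose directory groups map to an
    editing-capable role."""
    return bool(roles_for_user(claims) & {"editor", "admin"})
```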
An important consideration is that interoperability of systems depends on interoperability at every level. For a browser to display data, many pieces need to work together, from the network protocols and data formats all the way up to the rendering of the page.
Interoperability is a requirement in many contexts and across industries.
The key insight is that every system evolves. A system may need to scale part of its functionality; if it were built with microservices, the appropriate service could simply be scaled out horizontally. It may need to support another client application; a standardized interface that constrains neither the system nor the client reduces the amount of custom glue code and improves time to market. Components also need to be swapped out or upgraded, and sometimes the change is not backwards compatible; ensuring that each piece conforms to a global standard makes those changes far easier.
NASA reported that the Mars Climate Orbiter was lost in 1999 because two modules exchanged a number while each assumed it was in different units (pound-force seconds versus newton-seconds). When systems interoperate, it is critical that they exchange not bare messages or packets but events: an event carries all the context the destination needs, in this case the units, to process it safely.
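As a minimal illustration of the difference (the field names and values here are hypothetical), an event carries the context a bare number lacks:

```python
# A bare message: the consumer has to guess what the number means.
bare_message = 1042.7

# An event: the payload carries the context needed to interpret it safely,
# including the units, so no implicit agreement between sender and
# receiver is required.  Field names and values are illustrative.
thrust_event = {
    "event_type": "thruster_impulse_recorded",
    "source": "ground-software",
    "value": 1042.7,
    "unit": "newton_seconds",
    "recorded_at": "1999-09-23T09:00:00Z",
}
```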
As with any computing system, it is easy to build interoperability on day one; keeping it working across multiple versions of different systems is hard. Given that upgrades of different components usually happen on different timelines, it takes deliberate effort to keep them compatible. Here again, technologies like schema registries, with an intrinsic ability to enforce backward and forward compatibility of messages, are crucial to a long-lasting interoperability solution.
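As a sketch of how this looks in practice with Confluent Schema Registry (the registry URL, subject name, and schema are placeholders), a subject can be configured so that incompatible schema changes are rejected before they ever reach consumers:

```python
import json
import requests

REGISTRY = "http://localhost:8081"   # placeholder Schema Registry URL
SUBJECT = "orders-value"             # placeholder subject name
HEADERS = {"Content-Type": "application/vnd.schemaregistry.v1+json"}

# Require that every new schema version can still be read by consumers
# that use the previous version (BACKWARD compatibility).
resp = requests.put(f"{REGISTRY}/config/{SUBJECT}", headers=HEADERS,
                    json={"compatibility": "BACKWARD"})
resp.raise_for_status()

order_schema_v1 = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "currency", "type": "string"},   # context travels with the event
    ],
}

# Registering a new version fails with an HTTP error if it would break
# the configured compatibility guarantee.
resp = requests.post(f"{REGISTRY}/subjects/{SUBJECT}/versions", headers=HEADERS,
                     json={"schema": json.dumps(order_schema_v1)})
resp.raise_for_status()
print("registered schema id:", resp.json()["id"])
```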
A common path for systems to interoperate is to export data from one system in a known file format like CSV and import it into another. While ubiquitous, this approach is flawed on several fronts. First, it implies a batch processing model, which leaves information stale until the next batch is processed. Second, when multiple systems need to communicate with each other, they all have to agree on the file format, file naming conventions, file locations, and the archiving of old files, and all of that infrastructure has to be created and maintained. Solutions like managed Kafka make all of this go away and provide real-time communication on top.
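As a minimal sketch of what replaces the nightly CSV export (the brokers, topic name, and record fields are placeholders, using the confluent-kafka Python client), each system publishes records the moment they occur and any number of downstream systems can read them in real time:

```python
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker-1:9092"})  # placeholder brokers

def on_order_created(order: dict) -> None:
    """Publish each order as it happens instead of appending it to a
    CSV file that is exported and imported in nightly batches."""
    producer.produce(
        topic="orders",              # placeholder topic name
        key=order["order_id"],
        value=json.dumps(order),
    )
    producer.poll(0)  # serve delivery callbacks

# Illustrative record; in a real system the message schema would be
# managed through the Schema Registry shown above.
on_order_created({"order_id": "o-123", "amount": 42.5, "currency": "EUR"})
producer.flush()
```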
Inevitably, systems add and remove components and need to scale up as traffic increases. Systems that communicate synchronously with request-response messages have a difficult time adding new components, because every other system then needs to know about the new endpoints; if a component needs to scale horizontally, every other component needs to know about its instances. These are solved problems when using a technology like Kafka, which provides this decoupling as part of its core functionality.
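A short sketch of why this is a solved problem (the brokers, topic, and group id are placeholders): consumers in the same consumer group share the partitions of a topic, so scaling a component out is just starting another identical instance, and producers never need to know how many instances exist.

```python
import json
from confluent_kafka import Consumer

# Start any number of identical instances of this process; Kafka assigns
# each one a share of the topic's partitions automatically, and producers
# are unaware of how many instances are running.
consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",   # placeholder brokers
    "group.id": "order-processing",         # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])              # placeholder topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        order = json.loads(msg.value())
        print("processing", order["order_id"])
finally:
    consumer.close()
```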
Interoperability should be a key deliverable in the design of any system. It facilitates scalability, increases adoption, improves maintainability, and reduces time to market for solutions built on the system.
Confluent’s complete suite of products facilitates building interoperable systems. The managed Kafka offering, powered by the enhanced Kora engine, allows teams to decouple their systems into event-based, asynchronous ones without the burden of administering their own clusters. The Data Streaming Platform provides the Schema Registry, along with a Data Portal, allowing teams to publish well-defined interfaces for other teams to leverage. Lastly, the managed stream processing building blocks, Kafka Connect, Kafka Streams, and Flink, make it easy not only to create interoperable systems but also to enhance the value of existing, closed systems by refactoring them into ones that interoperate with others across the firm.