Confluent Platform offers a rich pre-built ecosystem of over 120 Kafka connectors and a schema registry to rapidly and reliably build modern, real-time applications with data in motion.
Confluent and its partners develop enterprise-ready connectors based on the Kafka Connect framework; each connector is supported by either Confluent or the partner that built it. A portion of them are also available as fully managed connectors in Confluent Cloud.
Confluent Hub is an online marketplace where you can easily browse, search, and filter for the connectors and other plugins that best fit your Kafka data movement needs.
Schema Registry is a central repository with a RESTful interface where developers can define standard schemas and register applications to achieve compatibility. Schema Registry is available as a software component of Confluent Platform or as a managed component of Confluent Cloud.
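As an illustration, here is a minimal sketch of registering an Avro schema through the client API, using the confluent-kafka Python library; the localhost URL, the payments-value subject, and the Payment record are assumptions for the example:

```python
# A minimal sketch: register an Avro schema with Schema Registry using the
# confluent-kafka Python client. The URL, subject name, and Payment record
# are assumptions for illustration.
from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

sr = SchemaRegistryClient({"url": "http://localhost:8081"})

payment_schema = Schema(
    '{"type": "record", "name": "Payment", "fields": ['
    '{"name": "id", "type": "string"},'
    '{"name": "amount", "type": "double"}]}',
    schema_type="AVRO",
)

# Subjects are versioned: registering a changed schema under the same
# subject creates a new version rather than overwriting the old one.
schema_id = sr.register_schema("payments-value", payment_schema)
print(f"registered with global schema id {schema_id}")
```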
MQTT Proxy delivers Kafka-native connectivity to IoT devices without the need for intermediate MQTT brokers, eliminating their additional cost and latency. It lets IoT data flow into Kafka without adding extra layers of complexity, expanding the platform to new enterprise data sources and business applications.
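From a device's point of view, the proxy looks like an ordinary MQTT broker: a standard MQTT client publishes to it, and the proxy produces the message into Kafka. A minimal sketch with the paho-mqtt client; the host, port, and topic are assumptions, and how MQTT topics map to Kafka topics depends on the proxy's configured topic mappings:

```python
# Device-side sketch: publish a reading through MQTT Proxy with the
# paho-mqtt client. The proxy is assumed to listen on localhost:1883,
# and "temperature/device1" is a hypothetical MQTT topic.
from paho.mqtt import publish

publish.single(
    "temperature/device1",  # MQTT topic as seen by the proxy
    payload=b"21.3",
    qos=1,                  # delivery into Kafka is handled by the proxy
    hostname="localhost",
    port=1883,
)
```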
The catalog of fully supported connectors includes JDBC, HDFS, Amazon S3, Elasticsearch, MongoDB, Salesforce, Debezium, MQTT, and many more. Some connectors are also available as managed components of Confluent Cloud, such as Amazon S3, Google Cloud Storage, Google BigQuery, and Azure Blob Storage.
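For example, a self-managed connector is typically created by POSTing a JSON configuration to the Kafka Connect REST API. The sketch below assumes a Connect worker on localhost:8083 with the JDBC source connector plugin installed; the connector name and Postgres connection details are hypothetical:

```python
# A sketch of creating a self-managed JDBC source connector through the
# Kafka Connect REST API. Worker address, connector name, and database
# details are hypothetical.
import requests

connector = {
    "name": "jdbc-source-orders",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db.example.com:5432/shop",
        "connection.user": "connect",
        "connection.password": "secret",
        "mode": "incrementing",           # stream new rows by an increasing column
        "incrementing.column.name": "id",
        "topic.prefix": "shop-",          # table "orders" lands in topic "shop-orders"
        "tasks.max": "1",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
```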
Store and share a versioned history of all standard schemas, and validate data compatibility at the client level. Schema Registry supports Avro, JSON Schema, and Protobuf serialization formats.
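As a sketch of what client-level validation looks like, the confluent-kafka Python client can serialize records against the registered Avro schema before producing, so malformed records fail in the application instead of reaching Kafka. The topic, schema, and URLs below are assumptions carried over from the earlier example:

```python
# Client-level validation sketch: records are serialized against the
# registered Avro schema before being produced.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

schema_str = (
    '{"type": "record", "name": "Payment", "fields": ['
    '{"name": "id", "type": "string"},'
    '{"name": "amount", "type": "double"}]}'
)

sr = SchemaRegistryClient({"url": "http://localhost:8081"})
serialize = AvroSerializer(sr, schema_str)
producer = Producer({"bootstrap.servers": "localhost:9092"})

# Serialization fails if the record does not conform to the schema, so
# bad data is rejected in the client before it ever reaches the broker.
value = serialize(
    {"id": "p-001", "amount": 9.99},
    SerializationContext("payments", MessageField.VALUE),
)
producer.produce("payments", value=value)
producer.flush()
```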
Schema Registry reduces operational complexity in the application development cycle because it eliminates the need for complex coordination among developers: producers and consumers can evolve independently as long as their schemas remain compatible. Need to add a new column to a downstream database? You don’t need an involved change process and five meetings to coordinate 20 teams.
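Instead of meetings, a producer team can ask Schema Registry up front whether a proposed change is compatible. A minimal sketch, assuming the payments-value subject from the earlier example and Avro compatibility rules, under which adding a field with a default is a backward-compatible change:

```python
# Check a proposed schema change against the latest registered version
# before deploying it. Subject name and URL are assumptions.
from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

sr = SchemaRegistryClient({"url": "http://localhost:8081"})

# The evolved schema adds a "status" field with a default, so existing
# consumers reading old data remain unaffected.
evolved = Schema(
    '{"type": "record", "name": "Payment", "fields": ['
    '{"name": "id", "type": "string"},'
    '{"name": "amount", "type": "double"},'
    '{"name": "status", "type": "string", "default": "pending"}]}',
    schema_type="AVRO",
)

if sr.test_compatibility("payments-value", evolved):
    print("safe to register and deploy")
else:
    print("breaking change: coordinate with consumers first")
```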
Schema Validation delivers a programmatic way of validating and enforcing Schema Registry schemas directly on the Kafka broker and with topic-level granularity. It provides greater control over data quality, which increases the reliability of the entire Kafka ecosystem.
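Schema Validation is switched on per topic through a topic-level configuration. A sketch using the confluent-kafka AdminClient; the topic name is hypothetical, and the confluent.value.schema.validation property takes effect only on Confluent Server brokers that are configured with a Schema Registry URL:

```python
# A sketch of enabling broker-side Schema Validation when creating a topic.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

topic = NewTopic(
    "payments",
    num_partitions=3,
    replication_factor=3,
    # Broker rejects produce requests whose values lack a valid schema ID.
    config={"confluent.value.schema.validation": "true"},
)

# create_topics() returns a dict of topic name -> future; result() raises
# if the broker rejects the request.
admin.create_topics([topic])["payments"].result()
```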
Simplify management of connectors, topics, and schemas in production environments with Confluent Control Center as the GUI.