Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
Since its inception, change data capture (CDC) technology has significantly evolved, transitioning from a tool primarily used for database replication and migration to a cornerstone of real-time streaming. Its pivotal role in modern data architectures enables businesses to harness real-time data...
Confluent has helped thousands migrate to KRaft, Kafka’s new consensus protocol that replaces ZooKeeper for metadata management. Kafka users can migrate to KRaft quickly and with ease by using automated tools like Confluent for Kubernetes (CFK) and Ansible Playbooks.
In this edition, we’ll have a look at creating Kafka Streams topologies—exploring the dependency injection and design principles with Spring Framework, while also highlighting some syntactic sugar of Kotlin that makes for more concise and legible topologies.
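As a rough illustration of the pattern that edition explores, here is a minimal sketch, not taken from the post itself, of a Kafka Streams topology registered as a Spring bean in Kotlin; the topic names and the word-count logic are placeholders:

```kotlin
// Minimal sketch (assumed example): a Kafka Streams topology defined as a Spring bean in Kotlin.
// Assumes @EnableKafkaStreams is configured elsewhere so Spring can inject the StreamsBuilder.
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.kstream.Consumed
import org.apache.kafka.streams.kstream.Grouped
import org.apache.kafka.streams.kstream.KStream
import org.apache.kafka.streams.kstream.Produced
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration

@Configuration
class WordCountTopology {

    // Spring injects the StreamsBuilder, so the topology stays a plain, testable function of it.
    @Bean
    fun wordCountStream(builder: StreamsBuilder): KStream<String, String> {
        val sentences = builder.stream("sentences", Consumed.with(Serdes.String(), Serdes.String()))

        sentences
            .flatMapValues { sentence -> sentence.lowercase().split("\\W+".toRegex()) }
            .groupBy({ _, word -> word }, Grouped.with(Serdes.String(), Serdes.String()))
            .count()
            .toStream()
            .to("word-counts", Produced.with(Serdes.String(), Serdes.Long()))

        return sentences
    }
}
```

The trailing lambdas and expression-style chaining here are the kind of Kotlin syntactic sugar the edition highlights for keeping topologies concise and legible.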
This blog post covers Confluent’s newest enhancement to its fully managed connectors: the ability to assume IAM roles.
The Q3 Cloud Bundle Launch comes to you from Current 2024, where data streaming industry experts have come together to show you why data streaming is critical today, especially in the age of AI, and how it will become even more important in shaping tomorrow’s businesses...
If you are a developer looking for an easier way to test your apps on topics with schemas, this is for you! Now you can easily create a message with a topic schema directly from the Confluent Cloud Console, with built-in validation and error checking.
Since launching our first cloud connector in 2019, Confluent’s fully managed connectors have handled hundreds of petabytes of data, and the portfolio has expanded to include over 80 fully managed connectors, custom connectors, and private networking. Discover popular connectors, SMTs, and use cases on Confluent Cloud...
Been searching far and wide for examples of Spring Boot with Kotlin integrated with Apache Kafka®? You’ve found it. But not just an example with unstructured data or no schema management. Not here! We’re going all the way with Stream Governance in Confluent Cloud. Let’s get into it.
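To give a flavor of what that looks like, here is a minimal, hypothetical sketch of a Spring Boot producer in Kotlin; the Order type, topic name, and wiring are illustrative rather than the post's actual code, and the schema-aware serialization would be configured in application properties:

```kotlin
// Hypothetical sketch, not the article's code: a Spring Boot service in Kotlin that publishes
// a typed event to a Kafka topic. Schema validation comes from configuring the value serializer
// to a Schema Registry-aware serializer (e.g., Avro or JSON Schema) in application properties;
// with Stream Governance, the registered schema lives in Confluent Cloud.
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Service

// Placeholder payload; with Avro this class would typically be generated from the schema.
data class Order(val orderId: String, val amount: Double)

@Service
class OrderProducer(private val template: KafkaTemplate<String, Order>) {

    fun publish(order: Order) {
        // Keyed by order ID so all events for a given order land on the same partition.
        template.send("orders", order.orderId, order)
    }
}
```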
We are excited to announce the release of a new Confluent Cloud Homepage UI, inspired by many conversations and feature requests from our customer and field teams. In the past, many users bypassed the Homepage as just another click in the way of what they were trying to accomplish. This redesign...
Learn how Confluent Cloud and BigQuery Continuous Queries work together to enable real-time data processing, including the benefits of the integration and a step-by-step guide to setting up BigQuery Continuous Queries with Confluent Cloud to capture data...
The Q2 2024 Confluent Cloud launch introduces a suite of enhancements across the four key pillars of a Data Streaming Platform (Stream, Connect, Process, and Govern), alongside some significant work we have been doing with our partner ecosystem to help customers unlock new possibilities.
In the dynamic landscape of conversational AI, the fusion of Amazon Bedrock and Confluent Cloud offers a groundbreaking solution for businesses seeking to create scalable and perpetually evolving generative AI (GenAI) chatbots. This collaboration ensures real-time data freshness, intelligent con...
Learn why data at rest sometimes needs to be transformed into a stream of data, and learn how to turn a REST API into a data stream using Apache Flink and Apache Kafka. Along the way, you’ll also find out the advantages of having your REST API data in streaming format.
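The post walks through the details; purely as a sketch of the idea, with a made-up endpoint, topic name, and polling interval, here is one way to poll a REST API and forward each response to Kafka, where Flink (or any other consumer) can process it as a stream:

```kotlin
// Hypothetical sketch: poll a REST endpoint on an interval and forward each response body
// to a Kafka topic. The URL, topic name, and cadence are illustrative only.
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord

fun main() {
    val http = HttpClient.newHttpClient()
    val request = HttpRequest.newBuilder(URI.create("https://api.example.com/quotes")).GET().build()

    val props = Properties().apply {
        put("bootstrap.servers", "localhost:9092")
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }

    KafkaProducer<String, String>(props).use { producer ->
        while (true) {
            // Each poll turns one REST response into one Kafka record; Flink consumes the topic downstream.
            val body = http.send(request, HttpResponse.BodyHandlers.ofString()).body()
            producer.send(ProducerRecord("rest-api-events", body))
            Thread.sleep(10_000)
        }
    }
}
```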