Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Confluent Cloud Freight clusters are now Generally Available on AWS. In this blog, learn how Freight clusters can save you up to 90% at GBps+ scale.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
We built an AI-powered tool to automate LinkedIn post creation for podcasts, using Kafka, Flink, and OpenAI models. With an event-driven design, it’s scalable, modular, and future-proof. Learn how this system works and explore the code on GitHub in our latest blog.
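As a rough illustration of that event-driven shape (not the authors' actual implementation), the Kotlin sketch below consumes podcast-episode events, calls a placeholder function standing in for the OpenAI/Flink steps, and publishes the generated draft to a downstream topic. The topic names, consumer group, and generateDraft() are hypothetical stand-ins.

```kotlin
import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord

// Placeholder for the model-inference step described in the post.
fun generateDraft(episodeSummary: String): String =
    "LinkedIn draft for: $episodeSummary"

fun main() {
    val props = Properties().apply {
        put("bootstrap.servers", "localhost:9092")
        put("group.id", "post-generator")
        put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }
    KafkaConsumer<String, String>(props).use { consumer ->
        KafkaProducer<String, String>(props).use { producer ->
            consumer.subscribe(listOf("podcast-episodes"))
            while (true) {
                // Poll for new episode events, generate a draft, and publish it downstream.
                for (record in consumer.poll(Duration.ofSeconds(1))) {
                    val draft = generateDraft(record.value())
                    producer.send(ProducerRecord("linkedin-drafts", record.key(), draft))
                }
            }
        }
    }
}
```

Because each stage only reads from and writes to topics, the generation step can be swapped out or scaled independently, which is the modularity the post highlights.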
Learn how an e-commerce company integrates the data from its Stripe system with the Pinecone vector database using the new fully managed HTTP Source V2 and HTTP Sink V2 Connectors along with Flink AI model inference in Confluent Cloud to enhance its real-time fraud detection.
Confluent's advanced security and connectivity features allow you to protect your data and innovate confidently. Features like Mutual TLS (mTLS), Private Link for Schema Registry, and Private Link for Flink not only bolster security but also streamline network architecture and improve performance.
Confluent Cloud 2024 Q4 adds private networking and mTLS authentication, follower fetching, Flink updates, WarpStream features to support migration and governance, and more!
Continuing our discussion of JVM microservices frameworks used with Apache Kafka, we introduce Micronaut. Let’s integrate a Micronaut microservice with Confluent Cloud—using Stream Governance—and test the Kafka integration with TestContainers.
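For a feel of what the integration looks like before reading the post, here is a minimal, hypothetical Micronaut Kafka producer/listener pair in Kotlin. It assumes the micronaut-kafka module is on the classpath, and the "orders" topic and type names are illustrative rather than taken from the article.

```kotlin
import io.micronaut.configuration.kafka.annotation.KafkaClient
import io.micronaut.configuration.kafka.annotation.KafkaKey
import io.micronaut.configuration.kafka.annotation.KafkaListener
import io.micronaut.configuration.kafka.annotation.OffsetReset
import io.micronaut.configuration.kafka.annotation.Topic

@KafkaClient
interface OrderProducer {
    // Publishes an order event; key and value use the configured serializers.
    @Topic("orders")
    fun send(@KafkaKey orderId: String, payload: String)
}

@KafkaListener(offsetReset = OffsetReset.EARLIEST)
class OrderListener {
    // Invoked for every record on the "orders" topic.
    @Topic("orders")
    fun receive(@KafkaKey orderId: String, payload: String) {
        println("Received order $orderId: $payload")
    }
}
```

In a Testcontainers-based test, these same beans can be exercised against a throwaway broker by pointing kafka.bootstrap.servers at the container's advertised address instead of Confluent Cloud.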
With both Confluent and Amazon Redshift supporting mTLS, streaming developers and architects can take advantage of a native integration that lets Amazon Redshift query Confluent Cloud topics.
Since its inception, change data capture (CDC) technology has significantly evolved, transitioning from a tool primarily used for database replication and migration to a cornerstone of real-time streaming. Its pivotal role in modern data architectures enables businesses to harness real-time data...
Confluent has helped thousands migrate to KRaft, Kafka’s new consensus protocol that replaces ZooKeeper for metadata management. Kafka users can migrate to KRaft quickly and with ease by using automated tools like Confluent for Kubernetes (CFK) and Ansible Playbooks.
In this edition, we’ll have a look at creating Kafka Streams topologies—exploring dependency injection and design principles with the Spring Framework, while also highlighting some of Kotlin’s syntactic sugar that makes for more concise and legible topologies.
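As a taste of the pattern, here is a minimal, hypothetical Kotlin topology declared as a Spring bean. It assumes Spring Kafka's @EnableKafkaStreams auto-configuration supplies the StreamsBuilder, and the topic names and uppercase transformation are illustrative only.

```kotlin
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.kstream.KStream
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration

@Configuration
class TopologyConfig {

    // Spring injects the auto-configured StreamsBuilder when @EnableKafkaStreams is active.
    @Bean
    fun upperCaseTopology(builder: StreamsBuilder): KStream<String, String> {
        val events: KStream<String, String> = builder.stream("input-events")
        events
            .mapValues { value -> value.uppercase() } // Kotlin lambda keeps the mapper concise
            .to("output-events")                      // write transformed records downstream
        return events
    }
}
```

Declaring the topology as a bean keeps it testable in isolation: the same function can be driven by a TopologyTestDriver without starting a broker.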
This blog post covers Confluent’s newest enhancement to its fully managed connectors: the ability to assume IAM roles.
The Q3 Cloud Bundle Launch comes to you from Current 2024, where data streaming industry experts have come together to show you why data streaming is critical today, especially in the age of AI, and how it will become even more important in shaping tomorrow’s businesses...
If you are a developer looking for an easier way to test your apps on topics with schemas, this is for you! Now you can easily create a message with a topic schema directly from the Confluent Cloud Console, with built-in validation and error checking.