Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
Learn about Confluent Platform 7.5 and its latest key features: enhancing security with SSO for Control Center, improving developer efficiency with Confluent REST Proxy API v3, and strengthening disaster recovery capabilities with bidirectional Cluster Linking.
Apache Flink supports a wide range of stream processing use cases. In this post, we show how developers can use Flink to build real-time applications, run analytical workloads, or create real-time pipelines.
We're pleased to announce our new, expanded partnership with Google Cloud and that we've been named a Google Cloud Technology Partner of the Year for Marketplace–Data and Analytics. This collaboration creates an even easier path forward for companies looking to connect data across...
How Confluent’s new Cloud SQL Google Cloud Ready designation can help you accelerate business transformation
Senior Software Engineer Yash Mayya shares his path to Confluent, his work on Kafka Connect, and how he plans to keep growing his career.
Versioned key-value state stores, introduced in Kafka Streams 3.5, enhance stateful processing capabilities by allowing users to store multiple record versions per key, rather than only the single latest version per key as with existing key-value stores...
See how Powerledger uses data streaming to facilitate peer-to-peer trading of renewable electricity.
Capturing and using streaming data in real time is essential today. Industry expert Sumit Pal explains why, and where managed services fit in.
Most people are not jumping for joy at the prospect of taking out a loan, or even going to the bank. If anything, banking is a chore (and not a very exciting one). But what if banking was fast, simple, and easier to understand?
Learn why stream processing is such a critical component of the data streaming stack, why developers are choosing Apache Flink as their stream processing framework of choice, and how to use Flink with Kafka.
For Niki Kapsi, commercial account executive at Confluent, it’s the “entrepreneurial” aspect of her role that she’s the most excited about.
Let’s learn more about how Niki got to Confluent—and how the company fosters a culture of learning and growth that keeps her driven and motivated.
Confluent Cloud has chosen Let’s Encrypt as its Certificate Authority, leveraging its automation features to spend less time managing certificates and more time building private networking features.
When I say summer, you say… beach, popsicles, and camping.
And this summer, I went camping (albeit virtually) with Camp Confluent, a three-week (July 10–28) immersive learning experience focused on all things data streaming.
I’m thrilled to announce that Confluent has received the Financial Services Competency from AWS. The AWS Competency Program recognizes and promotes AWS Partners who exhibit technical expertise and customer success, enabling them to market and differentiate their businesses to AWS customers...