Save 25% (or even more) on your Kafka costs | Take the Confluent Kafka savings challenge
Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
The insurance industry, like other traditional sectors, is no stranger to rapid, technology-driven change in consumer expectations. If these companies don’t keep pace, they risk an eroding customer base and losing revenue to more nimble and innovative competitors.
Apache Kafka (the basis for Confluent Platform) delivers advanced stream processing for data streaming at scale across AWS, GCP, and Azure, and is used by thousands of companies. Amazon...
Get a high-level overview of source connector tuning: what can and cannot be tuned, and a tuning methodology that applies to any source connector.
In today’s hyper-connected era, customers demand nothing less than instant access to tailor-made, meaningful content that resonates with their individual needs and preferences. This post details how organizations can leverage the power of cloud-native data streaming in Confluent Cloud...
The Very Large Data Bases (VLDB) conference is a premier conference for data management systems and is renowned for showcasing cutting-edge research and industry systems. We are delighted to share that our paper, titled "Kora: A Cloud-Native Event Streaming Platform For Kafka"...
Today, use of data streaming technologies has become table stakes for businesses. But with data streaming technologies, patterns, and best practices continuing to mature, it’s imperative for businesses to stay on top of what’s new and next in the world of data streaming.
Learn about Confluent Platform 7.5 and its latest key features: enhancing security with SSO for Control Center, improving developer efficacy with Confluent REST Proxy API v3, and improving disaster recovery capabilities with bidirectional Cluster Linking.
Apache Flink can be used for many stream processing use cases. In this post, we show how developers can use Flink to build real-time applications and pipelines, or run analytical workloads.
We're pleased to announce our new, expanded partnership with Google Cloud, and that we've been named a Google Cloud Technology Partner of the Year for Marketplace–Data and Analytics. Together, our collaboration helps create an even easier path forward for companies looking to connect data across...
How Confluent’s new Cloud SQL Google Cloud Ready designation can help you accelerate business transformation
Senior Software Engineer Yash Mayya shares his path to Confluent, his work on Kafka Connect, and how he plans to keep growing his career.
Versioned key-value state stores, introduced in Kafka Streams 3.5, enhance stateful processing by allowing users to store multiple record versions per key, rather than only the single latest version per key, as is the case for existing key-value stores today...
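The semantics described above can be modeled compactly. The sketch below is not the Kafka Streams API (which is Java, via `Stores.persistentVersionedKeyValueStore`); it is a minimal, hypothetical Python model of what a versioned store adds over a plain key-value store: each key keeps a timestamp-ordered history bounded by a retention window, so reads can ask for either the latest value or the value as of an earlier timestamp.

```python
import bisect


class VersionedKeyValueStore:
    """Minimal conceptual model of a versioned key-value store:
    each key maps to a timestamp-ordered version history, bounded
    by a history retention window (the latest version is always kept)."""

    def __init__(self, history_retention_ms):
        self.history_retention_ms = history_retention_ms
        # key -> (sorted list of timestamps, parallel list of values)
        self._data = {}

    def put(self, key, value, timestamp_ms):
        ts_list, val_list = self._data.setdefault(key, ([], []))
        # Insert in timestamp order (handles out-of-order records).
        i = bisect.bisect_right(ts_list, timestamp_ms)
        ts_list.insert(i, timestamp_ms)
        val_list.insert(i, value)
        # Expire versions older than the retention window,
        # always keeping at least the latest version per key.
        cutoff = ts_list[-1] - self.history_retention_ms
        while len(ts_list) > 1 and ts_list[0] < cutoff:
            ts_list.pop(0)
            val_list.pop(0)

    def get(self, key):
        """Return the latest value for key, or None if absent."""
        entry = self._data.get(key)
        return entry[1][-1] if entry else None

    def get_as_of(self, key, as_of_ms):
        """Return the value that was current as of the given
        timestamp, or None if no version existed yet."""
        entry = self._data.get(key)
        if not entry:
            return None
        ts_list, val_list = entry
        i = bisect.bisect_right(ts_list, as_of_ms)
        return val_list[i - 1] if i > 0 else None


store = VersionedKeyValueStore(history_retention_ms=1000)
store.put("k", "v1", 100)
store.put("k", "v2", 200)
print(store.get("k"))            # latest value: v2
print(store.get_as_of("k", 150)) # value as of t=150: v1
```

A plain key-value store supports only the `get(key)` path; the `get_as_of` path is what versioning enables, at the cost of retaining history within the configured window.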
See how Powerledger uses data streaming to facilitate peer-to-peer trading of renewable electricity.
Capturing and using streaming data in real time is essential today. Industry expert Sumit Pal explains why, and where managed services fit in.