Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
In just a few months since it became widely available, generative AI has swiftly captivated the attention of organizations across industries. In March 2023, IDC polled organizations and found that 61% were already doing something with generative AI (GenAI).
Let’s delve into Leandro’s career so far and what he’s learnt in his 4 years at Confluent, helping to scale the EMEA region.
It’s that time of year again: Kafka Summit London is right around the corner. On March 19 and 20, the Kafka community will gather at ExCeL London to talk about all things new and exciting in the world of Apache Kafka®.
Apache Kafka 3.7 introduces updates to the consumer rebalance protocol, an official Apache Kafka Docker image, JBOD support in KRaft-based clusters, and more!
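As a taste of what the release post covers, here is a minimal sketch of a consumer that opts into the new KIP-848 rebalance protocol via the `group.protocol` setting, which is in early access in 3.7; the bootstrap address, group ID, and topic name are placeholders, so treat this as illustrative rather than a recommended production configuration.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class NewProtocolConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // placeholder group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Opt in to the next-generation consumer rebalance protocol (KIP-848),
        // available as early access in Apache Kafka 3.7.
        props.put("group.protocol", "consumer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic")); // placeholder topic
            consumer.poll(Duration.ofSeconds(1))
                    .forEach(record -> System.out.println(record.value()));
        }
    }
}
```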
Keep reading to learn more about Sergio’s experience at Confluent and what he appreciates most about his role, his team, and his managers.
Several key new features have been added to Confluent Cloud for Apache Flink this year, including Topic Actions, Terraform support, and expansion into GCP and Azure. Let's take a look at these enhancements and how they empower users to harness the full potential of streaming data.
Over the last two years, we have periodically announced Confluent Platform releases, each building on top of the innovative feature set from the previous release. Today, we will talk about some of these core features that make hybrid and on-premises data streaming simple, secure, and resilient.
Confluent's new Migration Accelerator program will jump-start your data streaming journey by seamlessly migrating from any version of Apache Kafka® or traditional messaging systems to Confluent.
Learn how to use Confluent for Kubernetes to enable GitOps with a CI/CD pipeline and delegate resource creation to teams without distributing admin credentials across the organization.
I’ve worked with artificial intelligence for nearly 20 years, applying technologies spanning predictive modeling, knowledge engineering, and symbolic reasoning. AI’s tremendous potential has always felt evident, but its widespread application always seemed to be just a few more years away.
What’s the key to delivering a stellar user experience? Deok Choi, staff product designer at Confluent, will tell you it’s all about truly understanding what your users need and then creating simple, effective solutions to enhance their experience with your product.
Confluent’s mission is to help our customers set data in motion with a complete data streaming platform. We've played a pivotal role in establishing Apache Kafka® as the de facto standard for data streaming while making streaming simpler, cheaper, and faster with the Kora engine.