Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
If you are a developer looking for an easier way to test your apps on topics with schemas, this is for you! Now you can easily create a message that conforms to a topic's schema directly from the Confluent Cloud Console, with built-in validation and error checking.
While generative AI is driving the need for stronger data governance, it can also help to meet that need.
Confluent has acquired WarpStream, an innovative Kafka-compatible streaming solution. Read the full statement by Jay Kreps, co-founder and CEO of Confluent.
This is the Q3 CwC announcement blog, a quarterly installment introducing new entrants into the Connect with Confluent technology partner program. Each installment has its own theme; this one celebrates the program's one-year anniversary.
The beauty of Kafka as a technology is that it can do a lot with little effort on your part. In effect, it’s a black box. But what if you need to see into the black box to debug something? This post shows what the producer does behind the scenes to help prepare your raw event data for the broker.
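As a rough illustration of the work the post refers to, here is a minimal Kotlin producer sketch. The bootstrap address, topic name, and event payload are hypothetical, and the comments call out the steps the producer handles behind the scenes (serialization, partition selection, batching) before a record ever reaches the broker; the full post covers these internals in detail.

```kotlin
import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

fun main() {
    // Hypothetical connection settings; replace with your cluster's values.
    val props = Properties().apply {
        put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
        put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer::class.java.name)
        put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer::class.java.name)
        // Batching and compression are examples of the "behind the scenes" work:
        // records are accumulated per partition and shipped to the broker in batches.
        put(ProducerConfig.LINGER_MS_CONFIG, "20")
        put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4")
    }

    KafkaProducer<String, String>(props).use { producer ->
        // The producer serializes the key and value, picks a partition from the key,
        // appends the record to an in-memory batch, and sends it to the leader broker.
        val record = ProducerRecord("raw-events", "user-42", """{"action":"click"}""")
        producer.send(record) { metadata, error ->
            if (error != null) error.printStackTrace()
            else println("Wrote to ${metadata.topic()}-${metadata.partition()}@${metadata.offset()}")
        }
    }
}
```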
With AI model inference in Flink SQL, Confluent allows you to simplify the development and deployment of RAG-enabled GenAI applications by providing a unified platform for both data processing and AI tasks. Learn how you can use it to build a RAG-enabled Q&A chatbot using real-time airline data.
Transportation providers have long relied on manual processes to keep their operations flowing, but those same processes have also held them back from scaling. Busie enables ground transportation providers to provide this kind of...
Apna has become India's leading job site with more than 50 million users by leveraging a modern data infrastructure built on Confluent Cloud and Onehouse's universal data lakehouse. This architecture shift has improved their data integration, reduced costs, and enabled rapid innovation, including...
62% of Confluent Cloud clusters run on AWS, and hundreds of thousands of customers use DynamoDB. This blog explains how the DynamoDB connector helps customers integrate the two platforms.
For Staff Security GRC Program Manager Janis Hom, it's working alongside a team of highly motivated individuals—who help create an atmosphere of excellence and progress—that pushes her to perform at her best.
Let’s learn more about Janis and how she’s always learning something new at Confluent.
Since launching our first cloud connector in 2019, Confluent has handled hundreds of petabytes of data with fully managed connectors and expanded the portfolio to over 80 fully managed connectors, plus custom connectors and private networking. Discover popular connectors, SMTs, and use cases on Confluent Cloud...
Been searching far and wide for examples of Spring Boot with Kotlin integrated with Apache Kafka®? You’ve found it. But not just an example with unstructured data or no schema management. Not here! We’re going all the way with Stream Governance in Confluent Cloud. Let’s get into it.
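Before diving into the full walkthrough, here is a minimal Kotlin sketch of the shape such a Spring Boot producer can take. The OrderPlaced type, topic name, and configuration hints are illustrative assumptions, not the post's actual code; the post itself wires in Stream Governance on Confluent Cloud.

```kotlin
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Service

// Hypothetical event payload. With a Schema Registry-aware value serializer configured
// (e.g., via spring.kafka.producer properties pointing at your schema.registry.url),
// the payload's schema is registered and enforced rather than sent as loose JSON.
data class OrderPlaced(val orderId: String, val amount: Double)

@Service
class OrderProducer(private val kafkaTemplate: KafkaTemplate<String, OrderPlaced>) {

    fun publish(order: OrderPlaced) {
        // Topic name is illustrative; Spring Kafka handles serialization and delivery.
        kafkaTemplate.send("orders", order.orderId, order)
    }
}
```

The key design point the post explores is keeping schema management in the platform (Stream Governance) rather than in application code, so the producer itself stays this small.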
On September 17-18, the data streaming world will descend upon Austin, Texas for Current 2024, bringing the community together to discuss all things Apache Kafka® and Apache Flink®. You'll hear from tech leaders, industry giants, and startups as they drop a seemingly endless supply of knowledge...
Skai completely revamped its interactive ad-campaign dashboard by adding Apache Kafka and an in-memory database, eventually moving the solution to Confluent Cloud. Once there, the team devised an ingenious architecture for reducing the number of topics it needed.