Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
For Staff Security GRC Program Manager Janis Hom, it's working alongside a team of highly motivated individuals—who help create an atmosphere of excellence and progress—that pushes her to perform at her best.
Let’s learn more about Janis and how she’s always learning something new at Confluent.
Since launching our first cloud connector in 2019, Confluent’s fully managed connectors have handled hundreds of petabytes of data and expanded to include over 80 fully managed connectors, custom connectors, and private networking. Discover popular connectors, SMTs, and use cases on Confluent Cloud...
Been searching far and wide for examples of Spring Boot with Kotlin integrated with Apache Kafka®? You’ve found it. But not just an example with unstructured data or no schema management. Not here! We’re going all the way with Stream Governance in Confluent Cloud. Let’s get into it.
On September 17-18, the data streaming world will descend upon Austin, Texas for Current 2024, bringing the community together to discuss all things Apache Kafka® and Apache Flink®. You’ll hear from tech leaders, industry giants, and startups as they drop a seemingly endless supply of knowledge...
Skai completely revamped its interactive, ad-campaign dashboard by adding Apache Kafka and an in-memory database—eventually moving the solution to Confluent Cloud. Once on the Cloud, they devised an ingenious architecture for reducing the number of topics they needed.
We are excited to announce the release of a new Confluent Cloud Homepage UI, inspired by many conversations and feature requests from our customer and field teams. In the past, many users bypassed the Homepage as just another click in the way of what they were trying to accomplish. This redesign...
Learn how Confluent Cloud and BigQuery Continuous Queries work together to enable real-time data processing, including the benefits of the integration and a step-by-step guide to connecting BigQuery Continuous Queries and Confluent Cloud to capture data...
BT Group, a large telecoms company based in the UK, has been on a journey to create a ‘Smart Event Mesh’ with Confluent, enabling the availability of well-governed, real-time streams of data across a hybrid cloud environment. Learn how they’ve navigated this journey so far.
The Apache Flink® community released Apache Flink 1.20 this week. In this blog post, we highlight some of the most interesting additions and improvements.
This blog announces the general availability of Confluent Platform 7.7 and its latest key features: enhanced security with OAuth support, Confluent Platform for Apache Flink® (LA), a new connector, and more.
We are proud to announce the release of Apache Kafka 3.8.0. This release contains many new features and improvements. This blog post highlights some of the more prominent features. For a full list of changes, be sure to check the release notes.
Confluent Cloud for Apache Flink® supports AI model inference and enables the use of models as resources in Flink SQL, just like tables and functions. You can use a SQL statement to create a model resource and invoke it for inference in streaming queries.
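As a rough sketch of that pattern (the model name, input/output columns, table name, and connection options below are hypothetical placeholders, not verified configuration), a model resource might be declared and invoked in Flink SQL along these lines:

-- Declare a remote model as a first-class Flink SQL resource (illustrative options).
CREATE MODEL sentiment_model
INPUT (review_text STRING)
OUTPUT (sentiment STRING)
WITH (
  'provider' = 'openai',                -- assumed model provider
  'task' = 'classification',            -- assumed task type
  'openai.connection' = 'my-connection' -- assumed pre-created connection resource
);

-- Invoke the model for inference inside a streaming query.
SELECT review_text, sentiment
FROM reviews,
     LATERAL TABLE(ML_PREDICT('sentiment_model', review_text));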
The Q2 2024 Confluent Cloud launch introduces a suite of enhancements across the four key pillars of a Data Streaming Platform (Stream, Connect, Process, and Govern), alongside some significant work we have been doing with our partner ecosystem to help customers unlock new possibilities.
As one of the largest cancer research and treatment organizations in the United States, City of Hope’s mission is to transform cancer care. Advancing this mission requires an array of cutting-edge technologies to fuel innovative treatments and services tailored for patients’ specific needs.