Change data capture (CDC) is a popular method for connecting database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Confluent Cloud Freight clusters are now Generally Available on AWS. In this blog, learn how Freight clusters can save you up to 90% at GBps+ scale.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
Most AI projects fail not because of bad models, but because of bad data. Siloed, stale, and locked in batch pipelines, enterprise data isn’t AI-ready. This post breaks down the data liberation problem and how streaming solves it—freeing real-time data so AI can actually deliver value.
The concept of “shift left” in building data pipelines involves applying stream governance close to the source of events. Let’s discuss some tools (like Terraform and Gradle) and practices used by data streaming engineers to build and maintain those data contracts.
Airy helps developers build copilots as a new interface to explore and work with streaming data – turning natural language into Flink jobs that act as agents.
This article explores how event-driven design—a proven approach in microservices—can address the chaos, creating scalable, efficient multi-agent systems. If you’re leading teams toward the future of AI, understanding these patterns is critical. We’ll demonstrate how they can be implemented.
Real-time data streaming and GenAI are advancing Singapore's Smart Nation vision. As AI adoption grows, challenges from data silos to legacy infrastructure can slow progress, but Confluent, through IMDA's Tech Acceleration Lab, is helping organizations overcome these hurdles and develop smarter citizen services.
Learn how Flink enables developers to connect real-time data to external models through remote inference, allowing seamless coordination between data processing and AI/ML workflows.
Learn how to use the recently launched Provisioned Mode for Lambda’s Kafka event source mapping (ESM) to build high-throughput Kafka applications with Confluent Cloud’s Kafka platform. This blog also walks through a sample scenario for activating and testing Provisioned Mode for the ESM, and outlines best practices.
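As a minimal sketch of the activation step described above: Provisioned Mode is configured on an existing Kafka event source mapping via its `ProvisionedPollerConfig`. The UUID and poller counts below are placeholders, not values from the post.

```shell
# Enable Provisioned Mode on an existing Kafka event source mapping.
# The UUID is a placeholder; MinimumPollers/MaximumPollers are illustrative.
aws lambda update-event-source-mapping \
  --uuid 11111111-2222-3333-4444-555555555555 \
  --provisioned-poller-config MinimumPollers=3,MaximumPollers=10

# Verify that the provisioned poller configuration took effect.
aws lambda get-event-source-mapping \
  --uuid 11111111-2222-3333-4444-555555555555 \
  --query ProvisionedPollerConfig
```

Tuning the minimum and maximum poller counts is how you trade steady-state cost against headroom for throughput spikes.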
Learn how Confluent Champion Suguna motivates her team of engineers to solve complex problems for customers—while challenging herself to keep growing as a manager.
Confluent has achieved FedRAMP Ready status for its Confluent Cloud for Government offering, an essential milestone in providing secure data streaming services to government agencies and a demonstration of its commitment to rigorous security standards. This designation is a key step towards full...
An expanded partnership between Confluent and Databricks will dramatically simplify the integration between analytical and operational systems, so enterprises spend less time fussing over siloed data and governance and more time creating value for their customers.
Confluent is thrilled to announce the newest cohort of early-stage startups joining the Confluent for Startups AI Accelerator program. This 10-week virtual program is designed to support the next generation of AI innovators that are developing real-time generative AI (GenAI) applications.
We built an AI-powered tool to automate LinkedIn post creation for podcasts, using Kafka, Flink, and OpenAI models. With an event-driven design, it’s scalable, modular, and future-proof. Learn how this system works and explore the code on GitHub in our latest blog.
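The event-driven design above can be sketched as a small consume-transform-produce loop. This is an illustrative reconstruction, not the code from the GitHub repo: the topic names (`podcast-episodes`, `linkedin-drafts`), the prompt wording, and the model name are all assumptions, and the wiring uses the `confluent-kafka` and `openai` Python clients.

```python
# Sketch of an event-driven LinkedIn-post generator: consume podcast-episode
# events from Kafka, draft a post with an LLM, and produce the draft to a
# downstream topic. Topic names, prompt, and model are hypothetical.
import json


def build_linkedin_prompt(episode: dict) -> str:
    """Turn a podcast-episode event into an LLM prompt for a LinkedIn post."""
    return (
        "Write a short LinkedIn post announcing a podcast episode.\n"
        f"Title: {episode['title']}\n"
        f"Guests: {', '.join(episode.get('guests', []))}\n"
        f"Summary: {episode['summary']}"
    )


def run_pipeline():
    # Requires a reachable Kafka cluster and an OpenAI API key; call this
    # only in a deployed environment.
    from confluent_kafka import Consumer, Producer
    from openai import OpenAI

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "post-generator",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["podcast-episodes"])  # hypothetical input topic
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    llm = OpenAI()

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        episode = json.loads(msg.value())
        reply = llm.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user",
                       "content": build_linkedin_prompt(episode)}],
        )
        draft = reply.choices[0].message.content
        # Emit the drafted post downstream for human review/publishing.
        producer.produce("linkedin-drafts", json.dumps({"post": draft}))
        producer.flush()
```

Keeping the LLM step behind Kafka topics is what makes the system modular: the drafting stage can be scaled, replayed, or swapped for a different model without touching the ingest or publishing sides.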
FLIP-304 lets you customize and enrich Flink failure messaging: assign types to failures, emit custom metrics per type, and expose your failure data to other tools.
Discover how to unlock the full potential of data streaming with Confluent's "Ultimate Data Streaming Guide." This comprehensive resource maps the journey to becoming a Data Streaming Organization (DSO), with best practices, industry success stories, and insights to scale your data streaming strategy.