Change data capture is a popular method for connecting database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
We are proud to announce the release of Apache Kafka 3.8.0. This release contains many new features and improvements. This blog post highlights some of the more prominent features. For a full list of changes, be sure to check the release notes.
Confluent Cloud for Apache Flink® supports AI model inference and enables the use of models as resources in Flink SQL, just like tables and functions. You can use a SQL statement to create a model resource and invoke it for inference in streaming queries.
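For a flavor of the pattern, here is a minimal sketch of a model definition and a streaming inference call. The model name, connection name, table name, and WITH options are illustrative assumptions; check the Confluent Cloud docs for the exact syntax your provider requires.

```sql
-- Sketch only: option keys and connection setup vary by provider.
-- 'sentiment_model' and 'my-openai-connection' are hypothetical names.
CREATE MODEL sentiment_model
INPUT (text STRING)
OUTPUT (sentiment STRING)
WITH (
  'provider' = 'openai',
  'task' = 'classification',
  'openai.connection' = 'my-openai-connection'
);

-- Invoke the model on a stream, one prediction per row.
SELECT text, sentiment
FROM reviews,
  LATERAL TABLE(ML_PREDICT('sentiment_model', text));
```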
The Q2 2024 Confluent Cloud launch introduces a suite of enhancements across the four key pillars of a Data Streaming Platform (Stream, Connect, Process, and Govern), alongside significant work with our partner ecosystem to help customers unlock new possibilities.
A well-known debate: tabs or spaces? Let’s settle the debate, Kafka-style. We’ll use the new confluent-kafka-javascript client to build an app that produces the current state of the vote counts to a Kafka topic and consumes from that same topic to surface them to a JavaScript frontend.
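As a taste of the client, here is a minimal sketch using its KafkaJS-compatible API. The topic name, group ID, and local broker address are assumptions for illustration.

```javascript
// Minimal sketch with @confluentinc/kafka-javascript's KafkaJS-style API.
const { Kafka } = require("@confluentinc/kafka-javascript").KafkaJS;

const kafka = new Kafka({ kafkaJS: { brokers: ["localhost:9092"] } });

async function run() {
  // Produce the current tally of the tabs-vs-spaces vote.
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "votes",
    messages: [{ key: "tally", value: JSON.stringify({ tabs: 1, spaces: 2 }) }],
  });
  await producer.disconnect();

  // Consume the same topic to surface counts to the frontend.
  const consumer = kafka.consumer({
    kafkaJS: { groupId: "vote-ui", fromBeginning: true },
  });
  await consumer.connect();
  await consumer.subscribe({ topics: ["votes"] });
  await consumer.run({
    eachMessage: async ({ message }) => {
      console.log(message.key.toString(), message.value.toString());
    },
  });
}

run().catch(console.error);
```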
Deploying Apache Kafka at the edge brings significant challenges related to scalability, remote monitoring, and high management costs. Confluent’s enterprise-grade data streaming platform addresses these issues by providing comprehensive features that enhance Kafka’s capabilities, ensuring effici...
Learn how to use OpenSearch Ingestion to integrate Confluent with Amazon OpenSearch Service.
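Conceptually, an ingestion pipeline pairs a Kafka source with an OpenSearch sink. The sketch below follows Data Prepper conventions, but treat every key name and value as an assumption and confirm against the AWS documentation.

```yaml
# Sketch of an OpenSearch Ingestion (Data Prepper) pipeline definition.
# All endpoints, credentials, and names are placeholders.
version: "2"
confluent-to-opensearch:
  source:
    kafka:
      bootstrap_servers: ["pkc-XXXXX.us-east-1.aws.confluent.cloud:9092"]
      topics:
        - name: "orders"
          group_id: "osi-consumer"
      authentication:
        sasl:
          plain:
            username: "CONFLUENT_API_KEY"
            password: "CONFLUENT_API_SECRET"
  sink:
    - opensearch:
        hosts: ["https://my-domain.us-east-1.es.amazonaws.com"]
        index: "orders"
```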
In the dynamic landscape of conversational AI, the fusion of Amazon Bedrock and Confluent Cloud offers a groundbreaking solution for businesses seeking to create scalable and perpetually evolving generative AI (GenAI) chatbots. This collaboration ensures real-time data freshness, intelligent con...
Learn why data at rest sometimes needs to be transformed into a stream of data, and how to turn a REST API into a data stream using Apache Flink and Apache Kafka. Along the way, you'll also find out the advantages of having your REST API data in streaming format.
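The core idea can be sketched outside Flink as a simple poll-and-produce loop; the article's actual pipeline uses Flink, and the URL, topic, and interval below are placeholder assumptions.

```python
import time

import requests
from confluent_kafka import Producer

# Sketch: poll a REST endpoint on an interval and publish each response
# to a Kafka topic, turning request/response data into a stream.
API_URL = "https://api.example.com/quotes"  # hypothetical endpoint
producer = Producer({"bootstrap.servers": "localhost:9092"})

while True:
    resp = requests.get(API_URL, timeout=5)
    resp.raise_for_status()
    # Each poll becomes an event; downstream consumers (e.g., Flink)
    # can then process the data continuously instead of on demand.
    producer.produce("quotes", value=resp.text)
    producer.poll(0)  # serve delivery callbacks
    time.sleep(10)
```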
Part two in the series on using Flink SQL, Kafka, and Streamlit dives into asyncio, Flink SQL syntax, and the structure of Streamlit's bar chart component.
In part one of this series, we'll build an app, powered by Kafka and Flink SQL in Confluent Cloud and visualized with Streamlit, that lets a user select a stock, in this case SPY (the SPDR S&P 500 ETF Trust). Upon selection, a live chart of the stock's bid prices, calculated every five seconds...
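To make the series' setup concrete, here is a minimal, self-contained Streamlit sketch of the live chart loop; the data source is stubbed with random values where the articles use a Flink SQL query over a Kafka topic in Confluent Cloud.

```python
import random
import time
from datetime import datetime

import pandas as pd
import streamlit as st

symbol = st.selectbox("Choose a ticker", ["SPY"])

# Stub for the real source (a Flink SQL query over a Kafka topic);
# random values keep the sketch runnable on its own.
def fetch_latest_bid(symbol: str) -> dict:
    return {"time": datetime.now(), "bid_price": 500 + random.random()}

chart = st.empty()
rows = []
while True:
    rows.append(fetch_latest_bid(symbol))
    df = pd.DataFrame(rows).set_index("time")
    chart.line_chart(df)  # redraw the chart in place on each refresh
    time.sleep(5)         # bids are recalculated every five seconds
```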
The post discusses the Dual-Write Problem in distributed systems, where atomic updates across multiple systems, such as databases and messaging systems (e.g., Apache Kafka), are challenging and can lead to inconsistencies. It outlines common anti-patterns that fail to address the issue...
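The failure mode is easy to see in code. This sketch shows the naive dual write itself, not any of the post's remedies; table, topic, and broker address are illustrative.

```python
import sqlite3

from confluent_kafka import Producer

# Anti-pattern sketch: two independent writes with no shared transaction.
db = sqlite3.connect("orders.db")
db.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount REAL)")
producer = Producer({"bootstrap.servers": "localhost:9092"})

def place_order(order_id: str, amount: float) -> None:
    with db:  # commits the database write on success...
        db.execute("INSERT INTO orders VALUES (?, ?)", (order_id, amount))
    # ...but a crash right here leaves the event below unsent, so the
    # database and Kafka disagree forever: the dual-write problem.
    producer.produce("orders", key=order_id, value=str(amount))
    producer.flush()

place_order("o-1", 9.99)
```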
The blog post delves into best practices and recommendations for utilizing the Confluent Terraform Provider. It offers insights on efficiently provisioning resources within Confluent Cloud infrastructure while ensuring adherence to industry standards. Additionally, it provides a GitHub repository...
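As a starting point, a minimal configuration with the provider might look like the sketch below; the resource names follow the provider's quickstart shape, while all values are placeholders.

```hcl
# Minimal sketch assuming the confluentinc/confluent provider.
terraform {
  required_providers {
    confluent = {
      source  = "confluentinc/confluent"
      version = "~> 1.0"
    }
  }
}

provider "confluent" {
  # Prefer environment variables (CONFLUENT_CLOUD_API_KEY / _SECRET)
  # over hardcoded credentials.
}

resource "confluent_environment" "staging" {
  display_name = "staging"
}

resource "confluent_kafka_cluster" "basic" {
  display_name = "inventory"
  availability = "SINGLE_ZONE"
  cloud        = "AWS"
  region       = "us-east-2"
  basic {}

  environment {
    id = confluent_environment.staging.id
  }
}
```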
Analyzing Confluent Cloud audit logs is good, but being proactively informed when something suspicious happens is better. This article provides a conceptual guide for developing a pipeline that transfers Confluent Cloud audit logs into Splunk and defines automatic alerts based on certain events.
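One way to sketch such a pipeline is a small consumer that forwards each audit record to Splunk's HTTP Event Collector. The endpoint, token, and connection settings below are placeholders.

```python
import json

import requests
from confluent_kafka import Consumer

# Conceptual sketch of the pipeline described above; Confluent Cloud
# audit logs are delivered on the confluent-audit-log-events topic.
SPLUNK_HEC = "https://splunk.example.com:8088/services/collector/event"
SPLUNK_TOKEN = "REPLACE_ME"

consumer = Consumer({
    "bootstrap.servers": "AUDIT_LOG_CLUSTER:9092",
    "group.id": "audit-to-splunk",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["confluent-audit-log-events"])

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    # Forward every audit record; Splunk alerting rules then flag
    # suspicious events (e.g., failed authentications) downstream.
    requests.post(
        SPLUNK_HEC,
        headers={"Authorization": f"Splunk {SPLUNK_TOKEN}"},
        json={"event": event},
        timeout=5,
    )
```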
Apache Kafka® has become the de facto standard for data streaming, used by organizations everywhere to anchor event-driven architectures and power mission-critical real-time applications.