Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Confluent Cloud Freight clusters are now Generally Available on AWS. In this blog, learn how Freight clusters can save you up to 90% at GBps+ scale.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
As the Senior Marketing Manager for the Central European region, Evi Schneider has been responsible for the entire marketing mix, from events to online campaigns to partner marketing, as well as localised assets for the German-speaking market.
Stream processing has long forced an uncomfortable trade-off: choose a framework for its power, or choose one in your preferred programming language. GraalVM may offer a way to avoid having to choose.
Boston is a city of many firsts. The first public park, the first public school, the first UFO sighting in America. And we just added one more to the list: the first stop in North America for our Data in Motion Tour this year.
The big data revolution of the early 2000s saw rapid growth in data creation, storage, and processing. A new set of architectures, tools, and technologies emerged to meet the demand. But what of big data today? You seldom hear of it anymore. Where has it gone?
Use the Confluent CLI and API to create Stream Designer pipelines from SQL source code.
Experienced technology leaders know that adopting a new technology can be risky. Often, we are unable to distinguish between the investments that will be transformational and those that won't be worthwhile. This post examines how to decide whether event streaming makes sense for your organization.
Learn how modern data management approaches like data mesh and event-driven architecture (EDA) can be used to manage data platforms and how to take advantage of them.
When Jade Bowen joined Confluent as an account executive for the enterprise market and 11th employee in the ANZ region, there was only one client for her to work with. Three years later, she’s heading up the entire APAC Customer Success team.
Perhaps the largest challenge for modern data teams is gaining and retaining trust. The challenge of Big Data has come and gone; now we face the challenge of Untrustworthy Data, which will be one of the core focal points of the data space in 2023 and beyond.
Get an introduction to why Python is becoming a popular language for developing Apache Kafka client applications. You will learn about several benefits that Kafka developers gain by using the Python language.
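For a feel of what a Python Kafka client looks like, here is a minimal producer sketch using the confluent-kafka package; the broker address and topic name are illustrative assumptions, not details taken from the article.

```python
# Minimal sketch of a Kafka producer in Python using the confluent-kafka package.
# The broker address and the "orders" topic below are assumptions for illustration.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    """Report whether each message was delivered or failed."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

# Produce a single JSON-encoded record, then flush any outstanding messages.
producer.produce(
    "orders",
    key="order-1",
    value='{"item": "book", "qty": 2}',
    callback=delivery_report,
)
producer.flush()
```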
Three years in, Marcus Greer is still excited about the work he does at Confluent. As a software engineer in the Cloud Manageability organization, Marcus helps make customers’ lives easier – giving them insight into the complex systems their businesses depend on.
Discover tools, practices, and patterns for planning geo-replicated Apache Kafka deployments to build reliable, scalable, secure, and globally distributed data pipelines that meet your business needs.
An approach to combining Change Data Capture (CDC) messages from a relational database into transactional messages using Kafka Streams.
This post details how to minimize internal messaging within Confluent Platform clusters. Service mesh and containerized applications have popularized the idea of separate control and data planes. This post applies that idea to Confluent Platform clusters and highlights its use in Confluent Cloud.