At Current 2022 this week, Confluent CEO Jay Kreps welcomed attendees to the brand-new conference. “Where’s Kafka Summit?” he said. “It’s still right here.” As the next generation of Kafka Summit, Current 2022 focused on the broader data streaming ecosystem, bringing together practitioners, influencers, and other industry leaders to share best practices and use cases, as well as explore the vision and future of data streaming.
“We’re at a turning point of how we think about data and technology,” said Kreps in his opening keynote. “Until recently, the goal of data infrastructure was, essentially, to write down data at a fixed point in time with the intent of simply storing it. That got us here—but it won’t get our systems and businesses to what’s next. Reality isn’t some fixed static thing,” Kreps continued. “The systems we build have to reflect that.” Databases, SaaS apps, and analytics tools—the bread and butter of most modern data tech stacks—are all predicated upon data at rest.
Big numbers coming from Pinterest’s Kafka ecosystem #Current22 pic.twitter.com/Icp8GKVzBy
— Ben Stopford (@benstopford) October 4, 2022
What’s next is a way to make sense of the truly enormous volume of data that’s being produced and interacted with every day. USPS CIO Pritha Mehra described to Confluent co-founder Jun Rao how the postal service agency responded immediately to a White House directive in late 2021 to send Covid test kits to every American, free of charge. At the peak of fulfilling that request, they processed 8.7 million test kits per hour with help from Kafka—far exceeding the goal of 1 million kits per hour they had set initially. “The numbers are staggering,” said Mehra, “but that is the speed of life.”
USPS CIO Pritha Mehra talks to Confluent co-founder Jun Rao at Current 2022
Data streaming is authentic to how life really works, according to keynote guest Gian Merlino of Imply. “Stream vs. batch processing is as big a shift as the move to mobile was,” he said. “If you do data streaming correctly, it should be deeply hooked into how the business operates.”
Learn more in the keynote videos:
What’s New: Stream Designer, Stream Governance Advanced, and Confluent for Startups
As the data streaming industry matures, the technology continues to advance to meet ever-increasing needs for safe and secure expansion of data streaming without adding a lot of time or toil. Confluent Cloud’s newest additions include Stream Designer and Stream Governance Advanced, both designed to help businesses innovate faster and more confidently with Apache Kafka. Stream Designer accelerates real-time initiatives by simplifying the streaming data pipeline development process, and Stream Governance Advanced introduces enterprise-grade governance and data visibility for production workloads.
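To make the idea of a streaming data pipeline concrete, here is a minimal hand-written sketch of the kind of flow Stream Designer lets you assemble visually: read events from one Kafka topic, filter them, and write the result to another topic. This is an illustration only, not Stream Designer's output; the topic names, the "priority" JSON field, and the broker address are assumptions.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrdersPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-pipeline");
        // Placeholder broker address; point this at your own cluster.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Source: consume raw order events from the (hypothetical) "orders" topic.
        KStream<String, String> orders = builder.stream("orders");
        // Transform: keep only records whose illustrative JSON payload marks them as priority.
        orders.filter((key, value) -> value != null && value.contains("\"priority\":true"))
              // Sink: write the filtered stream to a downstream topic.
              .to("priority-orders");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The same source-filter-sink shape applies whether the pipeline is written by hand as above or composed from connectors, topics, and queries on a canvas.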
Kreps also announced the launch of Confluent for Startups, a program offering a full year of Confluent Cloud for free, among other benefits, all with the goal of getting new companies off on the right foot with a streaming data architecture.
Learn more:
Rolling out the Real-Time Red Carpet: Winners of the First Data Streaming Awards
The first-ever data streaming awards ceremony took place at Current 2022, with winners taking home a custom guitar. Winners were recognized for innovative data streaming use cases, like building a smart stethoscope, connecting astronomers around the world, and improving race car performance.
Walmart won Best Individual Data Streaming System Award at Current 2022
Where are you in your data streaming journey? See what you missed at Current and learn what you need in order to take your next steps. And a piece of parting advice: “Think really big,” said Merlino, of Imply. “Don’t do streaming as a half measure.”