Imagine getting into your car to head to work on a hot day. Your car already knows and sets the temperature, the ambient lighting, and the music you prefer. Not only that, it optimizes your route, and with Level 3 autonomy, it can even drive you there.
But what does the automotive industry have to do on the backend in order to achieve this kind of personalization?
That’s the question Seema Acharya, software engineering manager at Mercedes-Benz R&D North America, asked 4,280 attendees (both in-person and online) during the recent opening keynote at Current 2024.
“Think of a car… as a very complex IoT device,” Acharya explained. “We bring in data from multiple sources and collect them into our data streaming platform, then we distribute it out and create a very well defined and curated set of data products. Once the data products are ready, our AI and ML teams can take these on and make real-time inferences and real-time recommendations back to our customers … to create these very hyper personalized experiences.”
Not just that, the data streaming platform (DSP), which currently handles 800 terabytes of data per month, provides the business a comprehensive view into what’s happening with its customers and fleet, Acharya added. The result? They don't have to wait for months to get insight into their customers’ needs.
“It has helped us drastically reduce time to market,” she said. “Imagine each of our data teams building their own [data] pipelines. It would take somewhere between four to six months to do this from scratch every single time, now we have reduced this time to four to six weeks.”
Throughout the two days at Current, many such incredible stories echoed through the halls of the Austin Convention Center. Industry experts shared real-world examples of how a DSP can help businesses go from data mess to data value and ultimately help organizations deliver the use cases of today and tomorrow, across every industry.
Take the media industry, for example. Kushal Khandelwal, head of data platforms at Viacom18, took the stage to share how a DSP is integral to enabling JioCinema—one of India's leading digital OTT platforms, streaming more than 100,000 hours of content in over 19 languages to more than 600 million customers—to build at India scale.
“With a data streaming platform we have been able to convert our heartbeat data and clickstream data into high quality data assets that enable us to deliver that seamless entertainment experience on our platform,” Khandelwal said. “With this dataset we were also able to build a much more effective targeted advertising on our platform, do better recommendations, better personalizations, and a lot more for our customers.”
Catch up on all the insights and expert discussions by watching our recorded sessions from Current 2024.
And the learnings didn't just end there. Experts outlined how a data streaming platform—along with a change in mindset—is imperative to business success in the AI era.
Before we get into the details, let’s consider this question: what does AI need? It’s access to the right data, at the right time. And our CEO and co-founder Jay Kreps will tell you, “What AI actually needs is a data streaming platform—it needs an infrastructure that can work with this kind of real-time data at scale across the organization so you can get the right data and you can get them in a way that’s continuously up to date and maintainable.”
This increasing demand for high-quality, reliable, safe-to-use, and discoverable data is what’s driving the mindset shift we hinted at earlier—pushing more companies to move processing and governance closer to the point where data is generated, a strategy we call shifting left. Shifting your data processing and governance “left,” or upstream, allows you to eliminate duplicate pipelines, reduce the risk and impact of bad data, and leverage high-quality data products for both operational and analytical use cases. It ensures that the data downstream is always fresh, trustworthy, reliable, discoverable, and instantly usable, so your teams can build new applications more easily.
Embracing a shift-left strategy ensures frictionless access to data, Adam Tybor, Chief Data and AI Architect at Accenture Global IT, reinforced during the opening keynote. It has also opened up innumerable new opportunities for the business—ultimately helping drive innovation at scale, he added.
“In the past all of our projects used to start with data integration and it would take weeks and months to get that data and copies of that data off the ground. Now, where we have the data streams in place, those new experiments tend to go really quick,” Tybor explained. “With data products and having data ready we have been able to take advantage of new technologies like GenAI. It offers this decoupling platform where as new technologies come out it gives us the ability to experiment without a lot of disruption to the day to day work.”
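To make the shift-left idea a bit more concrete, here is a minimal sketch written against the open source Apache Flink SQL interface from Java. The topic names, fields, connection details, and validation rule are hypothetical placeholders; the point is simply that cleansing and contract enforcement run once, upstream, and everything downstream reuses the resulting curated topic.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ShiftLeftSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Raw clickstream events, straight from producers (hypothetical topic and fields).
        tEnv.executeSql(
                "CREATE TABLE clicks_raw (user_id STRING, url STRING, ts TIMESTAMP(3)) " +
                "WITH ('connector' = 'kafka', 'topic' = 'clicks.raw', " +
                "      'properties.bootstrap.servers' = 'broker:9092', " +
                "      'format' = 'json', 'scan.startup.mode' = 'earliest-offset')");

        // The curated data product that downstream teams consume instead of the raw feed.
        tEnv.executeSql(
                "CREATE TABLE clicks_curated (user_id STRING, url STRING, ts TIMESTAMP(3)) " +
                "WITH ('connector' = 'kafka', 'topic' = 'clicks.curated', " +
                "      'properties.bootstrap.servers' = 'broker:9092', 'format' = 'json')");

        // Cleansing and basic contract enforcement run once, upstream, close to where
        // the data is produced; every consumer after this point gets validated events.
        tEnv.executeSql(
                "INSERT INTO clicks_curated " +
                "SELECT user_id, url, ts FROM clicks_raw " +
                "WHERE user_id IS NOT NULL AND url IS NOT NULL");
    }
}
```

Because the curated topic is produced once, operational services and analytics teams read the same validated data product instead of each re-deriving it in their own pipelines.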
Now, if you are wondering what this rise of data streaming means for software and data professionals like yourself, then we have you covered. Our Day 2 keynote shed light on how this paradigm shift is giving rise to a brand new role: data streaming engineers. Not just that, our in-house experts also delved into all things Apache Flink® and key use cases. Interested in learning more? Read our blog or watch the recording.
At Confluent, our focus has always been on helping businesses unlock the full value of their data. To do that, we have made it our mission to make data streaming simple and powerful—so you can quickly harness business data to deliver business value. To this end, we made some key announcements at Current. Here’s a quick rundown:
Client-side field-level encryption (general availability) allows you to democratize access to streams and share them widely while encrypting the most sensitive fields end-to-end.
Table API (open preview) brings Flink closer to all Java and Python developers by letting you access all of the capabilities of the Flink engine directly from within the languages and tools you are already using (see the short sketch after this list).
Flexible schema management enables you to put Flink to work on all of your data whether or not that data was originally serialized with an attached schema.
AI model inference (open preview) helps you query your AI models with Flink SQL.
Private networking for dedicated clusters on AWS enables you to put Flink to work on all of your data throughout your enterprise.
Tableflow went into early access earlier this year but will go into open preview very shortly. A new feature on Confluent Cloud, Tableflow turns topics and schemas into Apache Iceberg tables in one click to feed any data warehouse, data lake, or analytics engine for real-time or batch processing use cases.
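To give a flavor of the Table API item above, here is a minimal sketch using the open source Apache Flink Table API in Java; the table, fields, and filter are made up for illustration, and Confluent's managed offering layers its own connection setup on top, which isn't shown here.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import static org.apache.flink.table.api.Expressions.$;

public class TableApiSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // A throwaway 'datagen' source so the sketch runs locally without a cluster
        // (the schema is made up for illustration).
        tEnv.executeSql(
                "CREATE TABLE orders (order_id STRING, amount DOUBLE, currency STRING) " +
                "WITH ('connector' = 'datagen', 'rows-per-second' = '5')");

        // The same filter-and-project logic you might write in Flink SQL,
        // expressed as fluent Java calls instead of an embedded SQL string.
        Table bigOrders = tEnv.from("orders")
                .filter($("amount").isGreater(100))
                .select($("order_id"), $("amount"), $("currency"));

        bigOrders.execute().print();
    }
}
```

The 'datagen' connector is just a stand-in so the example runs on its own; the takeaway is that streaming logic becomes ordinary Java (or Python) code that slots into the build, test, and IDE tooling you already use.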
Most importantly, to make that data streaming journey even simpler for you, we announced the acquisition of WarpStream in September. With WarpStream on board, we now have a data streaming offering for every company, no matter the need—whether that’s fully managed with Confluent Cloud, self-managed with Confluent Platform, or bring-your-own-cloud (BYOC) with WarpStream.
But how do you do BYOC right? Watch this clip to see what WarpStream co-founder Richard Artoul has to say.
For those of you who weren’t able to join us in person this time, the good news is that we have session recordings available for you. Here are a few highlights from our breakout sessions:
Empowering Teams: Learning Strategies That Fuel Kafka Adoption: Learn how Cardinal Health devised their enablement strategy and proceeded to build an ecosystem of information and learning to entice leaders, architects, and developers into embracing data streaming.
Powering Real-Time Generative AI With Amazon Bedrock and Confluent Streaming: Learn how to integrate Confluent's scalable data streaming platform with Amazon Bedrock to build innovative GenAI solutions.
Introducing Apache Flink: 5 Things You Need to Know to Get Started: Learn the Flink fundamentals you need to get started with your first streaming application.
Modernized Trading: Managing Market Data With Confluent Cloud at Vanguard: Learn how Vanguard went from a proof-of-concept win to hosting mission-critical applications, and the role Confluent plays in improving their digital client experience.
Trust and Safety at Indeed: Detecting and Preventing Fraud in Real-Time: Learn how Indeed built and grew a robust data streaming platform that ensures every employer has a pool of qualified applicants to meet their hiring needs.
And for those who missed out on our exclusive Executive Summit, held the day before Current: 151 attendees across 20 industries spent an immersive day learning how to harness the power of data in creative and innovative ways. Our hand-picked panelists shared their data streaming journeys, key use cases, best practices, business outcomes, how they are getting ready for—and using—GenAI, and why they are shifting left to make it right.
Check out this video to hear from attendees about their favorite moments at Current 2024.
On Day 1 of Current, our "Women in Data: Building Your Personal Brand" panel featured inspiring leaders who shared insights on defining and growing personal brands in tech. Key themes included authenticity, resilience, and the power of mindful actions. The discussion encouraged women to celebrate wins, find mentors, and build supportive communities. Read our LinkedIn post for details.
The Day 2 lunch session, “How Financial Services Extract Value From Data at Scale”, featured insights from finserv leaders who highlighted key data streaming use cases, including fraud detection and hyper-personalization. And with GenAI adoption accelerating across the finserv industry, experts reminded the audience that the GenAI journey isn’t one-size-fits-all. They stressed the importance of laying a strong foundation with a clear vision, focusing on specific use cases, demonstrating early wins, and assessing and addressing risks early in the process. Read our LinkedIn post for details.
Recognizing and celebrating the achievements of individuals and teams is a highlight that brings energy and inspiration to any event. And Current was no exception. Our third Data Streaming Awards recognized organizations that are harnessing the power of this revolutionary technology to drive business and customer experience transformation. And the winners are...
Solinftec, a global leader in artificial intelligence and robotics for agribusiness, won the award for innovation.
Netflix won the best company-wide implementation award.
Video analytics company Conviva won the case study of the year award.
And what's an award ceremony without prizes? Winners took home mechanical keyboards. Read our LinkedIn post to learn more.
While the curtains have closed on this year’s event, the excitement doesn't stop here. We’re already looking ahead to welcoming you in New Orleans next year (October 28-30) for Current 2025. We’ll be back with even more insightful sessions, innovative showcases, and opportunities to connect with the brightest minds in the data streaming industry. Stay tuned!
For now, catch up on all the insights and expert discussions by watching our recorded sessions from Current 2024. Explore the full library and stay ahead with the latest trends in data streaming, AI, and more.