Confluent Confab Co-Hosted by Google Detroit

Apache Kafka & AI/ML: Connecting the Dots…

Building fully managed AI/ML use cases in the cloud, from inception to production

Apache Kafka® has become the de facto standard for reliable and scalable streaming infrastructure. Machine learning and the Apache Kafka ecosystem are a great combination for training and deploying analytic models at scale. AI, machine learning, and deep learning are showing up in more and more projects, yet often still feel like buzzwords confined to science projects. This session connects the dots: How are Kafka and machine learning related? How can they be combined to productionize machine learning models in mission-critical, scalable, real-time applications?

When
Wednesday, November 20th, 2019
11:30 am - 3:30 pm

Venue
Google Detroit
52 Henry St
Detroit, MI 48201

Agenda
11:30 - 12:15 PM – Lunch & Networking, Hosted by Google
12:15 - 1:00 PM – Welcome: Multi/Hybrid Cloud Strategy for AI/ML: Confluent & Google
1:00 - 3:10 PM – Apache Kafka and AI/Machine Learning in the Cloud – Let’s Connect the Dots
3:10 - 3:30 PM – Wrap Up: White Board Ideas, Q&A

Agenda Details

  • See how to converge the best-of-breed tools used by your data science teams, which often live in silos throughout your enterprise, into a central nervous system running in the cloud that can be fully managed and automated to deliver your data sources, enabling real-time AI/ML/DL use cases with Apache Kafka.
  • A deep dive into the process, with hands-on guidance, for citizen data roles in your organization to train and deploy AI/ML models using notebooks, Python, machine learning / deep learning frameworks such as TensorFlow, Kubeflow, DeepLearning4J, and H2O, and the Apache Kafka data pipeline ecosystem for Cloud / Kubernetes.
  • A live demo in GCP that shows how to build a mission-critical Machine Learning environment leveraging different Kafka components:
    • Kafka messaging and Kafka Connect for data movement from and into different sources and sinks
    • Kafka Streams for model deployment, pre-processing, and inference in real time (see the sketch after this list)
    • KSQL for real time predictions and alerts
  • Showcase of production use case examples in automotive, financial services/insurance, retail, and healthcare
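
To make the Kafka Streams portion of the demo more concrete, below is a minimal sketch of embedding a model inside a Kafka Streams application for real-time inference. The topic names (sensor-events, predictions) and the score() stub are hypothetical placeholders, not the actual model or topics used in the live demo.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class ModelInferenceStream {

    // Hypothetical scoring stub standing in for a real model
    // (e.g. a TensorFlow SavedModel or H2O MOJO loaded at startup).
    private static double score(String event) {
        return Math.floorMod(event.hashCode(), 100) / 100.0;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-inference-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, score each one with the embedded model,
        // and write the prediction to an output topic for downstream alerting.
        KStream<String, String> events = builder.stream("sensor-events");
        events
            .mapValues(value -> String.valueOf(score(value)))
            .to("predictions");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Because the model runs inside the stream processor itself, predictions are made event by event with no remote model-serving call on the hot path; the same pattern extends to KSQL by exposing the scored stream as a queryable topic.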

Hosted by:

Kai Waehner is Field CTO at Confluent. He works with customers across the globe and with internal teams like engineering and marketing. Kai’s main area of expertise lies within the fields of Data Streaming, Analytics, Hybrid Cloud Architectures, Internet of Things, and Blockchain. Kai is a regular speaker at international conferences such as Devoxx, ApacheCon and Kafka Summit, writes articles for professional journals, and shares his experiences with new technologies on his blog: www.kai-waehner.de. Contact: kai.waehner@confluent.io / @KaiWaehner / linkedin.com/in/kaiwaehner.

With over 20 years of experience in sales engineering, Steve loves engaging with customers, helping them shift their view of data from historic databases and transient messaging to real-time streaming. Previously, he was a Principal Architect at Express and Database Systems Architect at OCLC, Inc.

Additional Resources

Confluent Cloud Demo

Join us for a live demo of Confluent Cloud, the industry’s only fully managed, cloud-native event streaming platform powered by Apache Kafka

Kafka Microservices

In this online talk series, learn key concepts, use cases and best practices to harness the power of real-time streams for microservices architectures

e-book: Microservices Customer Stories

See how five organizations across a wide range of industries leveraged Confluent to build a new class of event-driven microservices