Today, Confluent, the data streaming pioneer, is excited to announce its entrance into MongoDB’s new AI Applications Program (MAAP). MAAP is designed to help organizations rapidly build and deploy modern generative AI (GenAI) applications at enterprise scale. Enterprises will be able to utilize MAAP to more easily and quickly leverage Confluent’s industry-leading data streaming platform, with unified Apache Kafka® and Apache Flink®, to bring real-time data from every corner of the business to MongoDB and beyond.
"Enterprise AI strategy is inextricably dependent upon fresh, trusted data about the business. Without real-time datasets, even the most advanced AI solutions will fail to deliver value,” said Shaun Clowes, Chief Product Officer, Confluent. “Seamlessly integrated with MongoDB and Atlas Vector Search, Confluent’s fully managed data streaming platform enables businesses to build the trusted, always-up-to-date data foundation essential for powering GenAI applications.”
Confluent fuels MongoDB Atlas Vector Search with highly contextualized, AI-ready data streams sourced from anywhere across a customer’s business. By leveraging Kafka and Flink as a unified platform with Confluent, teams can clean and enrich data streams on the fly and deliver them to MongoDB in real time as instantly usable inputs. Paired with a portfolio of 120+ pre-built connectors, our fully managed Flink service enables users to process data in flight and create high-quality data streams to power the most impactful GenAI applications. Teams can easily access governed, fully managed data streams directly within Atlas Vector Search, making it even easier to fuel highly sophisticated GenAI applications with trusted, real-time data.
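To make that flow concrete, here is a minimal sketch of the producer side: an application publishes enriched, embedding-carrying records to a Confluent Cloud topic, and a fully managed MongoDB Atlas Sink Connector (configured separately) delivers them into a collection backed by an Atlas Vector Search index. The bootstrap servers, credentials, topic, and field names below are placeholders, not values from the quickstart.

```python
# Minimal sketch: publish enriched, AI-ready documents to a Confluent Cloud topic.
# A fully managed MongoDB Atlas Sink Connector (configured separately) can then
# deliver these records into a collection indexed by Atlas Vector Search.
# Bootstrap servers, credentials, topic, and field names are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<CONFLUENT_BOOTSTRAP_SERVERS>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CONFLUENT_API_KEY>",
    "sasl.password": "<CONFLUENT_API_SECRET>",
})

# An enriched record: raw text plus an embedding computed upstream (e.g., by Flink).
record = {
    "doc_id": "txn-1001",
    "text": "Customer disputed a duplicate charge on 2024-11-02.",
    "embedding": [0.012, -0.087, 0.334],  # truncated placeholder vector
}

producer.produce("enriched_documents", key=record["doc_id"], value=json.dumps(record))
producer.flush()
```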
To help developers get started with building advanced AI applications, Confluent recently launched a new quickstart guide that provides a step-by-step approach to developing a GenAI chatbot tailored for financial services. This quickstart will guide you through the process of setting up Confluent Cloud and MongoDB Atlas clusters within your Amazon Web Services (AWS) account and leveraging large language models (LLMs) from Anthropic—all fully automated to make deployment and testing easy.
With this quickstart, users can rapidly launch an environment designed for testing out GenAI chatbot use cases. This setup gives you access to the combined power of Confluent’s real-time data streaming, MongoDB’s scalable database platform, and Anthropic’s advanced AI models, creating an ecosystem where high-quality, relevant data flows seamlessly into AI applications—all running reliably on AWS cloud infrastructure.
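As an illustration of how those pieces could fit together at query time, the sketch below retrieves relevant documents from MongoDB with Atlas Vector Search and passes them as context to an Anthropic model. This is not the quickstart's exact implementation; the connection string, index name, model ID, and field names are placeholders, and the question embedding is assumed to be produced elsewhere (for example, by the streaming pipeline).

```python
# Sketch of a retrieval-augmented answer path: Atlas Vector Search for context,
# then an Anthropic model for generation. Placeholders throughout; not the
# quickstart's actual code.
from anthropic import Anthropic
from pymongo import MongoClient

mongo = MongoClient("<ATLAS_CONNECTION_STRING>")
collection = mongo["genai"]["documents"]
claude = Anthropic(api_key="<ANTHROPIC_API_KEY>")

def answer(question: str, question_embedding: list[float]) -> str:
    # Retrieve the most relevant documents with Atlas Vector Search.
    hits = collection.aggregate([
        {"$vectorSearch": {
            "index": "vector_index",          # assumed index name
            "path": "embedding",
            "queryVector": question_embedding,
            "numCandidates": 100,
            "limit": 3,
        }},
        {"$project": {"_id": 0, "text": 1}},
    ])
    context = "\n".join(doc["text"] for doc in hits)

    # Ask the model to answer using only the retrieved, real-time context.
    response = claude.messages.create(
        model="claude-3-5-sonnet-20240620",   # placeholder model ID
        max_tokens=512,
        messages=[{
            "role": "user",
            "content": f"Answer using this context:\n{context}\n\nQuestion: {question}",
        }],
    )
    return response.content[0].text
```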
Ready to dive in? Follow the instructions right within this blog. The quickstart will walk you through each necessary step, and by the end, you’ll have a fully operational GenAI chatbot powered by Confluent, MongoDB, and Anthropic.
Docker
The deploy script builds everything for you; the only required software is Docker.
Follow the Get Docker instructions to install it on your computer.
Access Keys for Cloud Service Providers
Once you have Docker installed, you just need keys to authenticate to the various cloud service providers (CSPs).
MongoDB Atlas API key: follow Grant Programmatic Access to an Organization or MongoDB Atlas API Keys (part of a tutorial on Terraform with Atlas)
At the end of these steps, you should have:
A key and a secret for Confluent Cloud
A public key and a private key for MongoDB Atlas
A key, a secret, and a token for AWS.
And finally, get your Atlas Organization ID from the Atlas UI.
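Before deploying, a hypothetical pre-flight check like the one below can confirm that all of these credentials are in place. The environment variable names are illustrative assumptions, not the quickstart's actual configuration.

```python
# Hypothetical pre-flight check: verify the credentials gathered above are exported
# before running the deploy script. Variable names are illustrative and may not
# match the quickstart's actual configuration.
import os
import sys

REQUIRED = [
    "CONFLUENT_CLOUD_API_KEY", "CONFLUENT_CLOUD_API_SECRET",
    "MONGODB_ATLAS_PUBLIC_KEY", "MONGODB_ATLAS_PRIVATE_KEY",
    "MONGODB_ATLAS_ORG_ID",
    "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN",
]

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    sys.exit(f"Missing credentials: {', '.join(missing)}")
print("All credentials found; ready to bring up the infrastructure.")
```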
1. Bring up the infrastructure
2. Bring down the infrastructure
🚀 Project Structure
GenAI Chatbot Architecture
For a more detailed review of the quickstart, including a breakdown of the architecture and its components, please open the GitHub repository:
Additionally, check out the webinar “How to Build RAG Using Confluent with Flink AI Model Inference and MongoDB” to learn more about this use case and gain implementation guidance from Confluent and MongoDB’s experts.
We’re excited to expand our partnership with MongoDB as part of their AI program, enabling Confluent to bring powerful AI-driven data streaming solutions to businesses worldwide. Confluent seamlessly integrates with major cloud providers and on-prem environments, ensuring enterprises can harness real-time data streaming with the flexibility to meet regulatory and privacy requirements. Together with MongoDB, we aim to be a trusted AI partner for enterprises, empowering them to build advanced applications that prioritize data security and privacy—ideal for organizations that require high levels of security for proprietary and sensitive data.
Ready to get going? Check out the quickstart and start building a GenAI chatbot today.
Not yet a Confluent customer? Start your free trial of Confluent Cloud today. New users receive $400 to spend during their first 30 days.
Easily stream processed, governed datasets to MongoDB with Confluent’s fully managed sink connector for MongoDB Atlas.
Apache®, Apache Kafka, Kafka®, Apache Flink, and Flink® are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries.