
Confluent Introduces Enterprise Data Streaming to MongoDB’s AI Applications Program (MAAP)

Written by: Pascal Vantrepote

Today, Confluent, the data streaming pioneer, is excited to announce its entrance into MongoDB’s new AI Applications Program (MAAP). MAAP is designed to help organizations rapidly build and deploy modern generative AI (GenAI) applications at enterprise scale. Enterprises will be able to utilize MAAP to more easily and quickly leverage Confluent’s industry-leading data streaming platform, with unified Apache Kafka® and Apache Flink®, to bring real-time data from every corner of the business to MongoDB and beyond.

"Enterprise AI strategy is inextricably dependent upon fresh, trusted data about the business. Without real-time datasets, even the most advanced AI solutions will fail to deliver value,” said Shaun Clowes, Chief Product Officer, Confluent. “Seamlessly integrated with MongoDB and Atlas Vector Search, Confluent’s fully managed data streaming platform enables businesses to build the trusted, always-up-to-date data foundation essential for powering GenAI applications.”

Build a real-time data foundation for GenAI

Confluent fuels MongoDB Atlas Vector Search with highly contextualized, AI-ready data streams sourced from anywhere in a customer’s business. By leveraging Kafka and Flink as a unified platform with Confluent, teams can clean and enrich data streams on the fly and deliver them to MongoDB in real time as instantly usable inputs. Paired with a portfolio of 120+ pre-built connectors, our fully managed Flink service enables users to process data in flight and create high-quality data streams to power the most impactful GenAI applications. Teams can access governed, fully managed data streams directly within Atlas Vector Search, making it even easier to fuel sophisticated GenAI applications with trusted, real-time data.
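To make the in-flight enrichment step concrete, here is a minimal sketch of the kind of transformation a stream processing job applies before delivering events to MongoDB, written as a plain Python function. The event fields and enrichment rules are hypothetical; in the quickstart this logic is expressed in Flink SQL.

```python
# Hypothetical in-flight enrichment: join a raw event with customer context
# and normalize fields so the record is immediately usable downstream.

def enrich_event(event: dict, customer_profiles: dict) -> dict:
    """Return a cleaned, enriched copy of a raw transaction event."""
    profile = customer_profiles.get(event["customer_id"], {})
    return {
        "customer_id": event["customer_id"],
        # Normalize the amount to a float rounded to cents.
        "amount_usd": round(float(event["amount"]), 2),
        # Enrich with customer segment from a lookup table.
        "segment": profile.get("segment", "unknown"),
        # Clean free-text fields for consistent downstream matching.
        "description": event.get("description", "").strip().lower(),
    }

raw = {"customer_id": "c-42", "amount": "199.999", "description": "  Wire Transfer "}
profiles = {"c-42": {"segment": "premium"}}
print(enrich_event(raw, profiles))
```

In the quickstart, the equivalent logic lives in the SQL statements registered against the Flink cluster, so it runs continuously on the stream rather than per record in application code.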

With Confluent, you can build a real-time, contextualized, and trustworthy knowledge base to fuel GenAI applications running on MongoDB.

Quickstart: Deploy a GenAI chatbot with Confluent, MongoDB, and Anthropic on AWS

To help developers get started with building advanced AI applications, Confluent recently launched a new quickstart guide that provides a step-by-step approach to developing a GenAI chatbot tailored for financial services. This quickstart will guide you through the process of setting up a Confluent and MongoDB Atlas cluster within your Amazon Web Services (AWS) account and leveraging large language models (LLMs) from Anthropic—all fully automated to make deployment and testing easy.

With this quickstart, users can rapidly launch an environment designed for testing out GenAI chatbot use cases. This setup gives you access to the combined power of Confluent’s real-time data streaming, MongoDB’s scalable database platform, and Anthropic’s advanced AI models, creating an ecosystem where high-quality, relevant data flows seamlessly into AI applications—all running reliably on AWS cloud infrastructure.

Ready to dive in? Follow the instructions right within this blog. The quickstart will walk you through each necessary step, and by the end, you’ll have a fully operational GenAI chatbot powered by Confluent, MongoDB, and Anthropic.

Quickstart Requirements

Docker

The deploy script builds everything for you; the only required software is Docker.

Follow the Get Docker instructions to install it on your computer.

Access Keys for Cloud Service Providers

Once you have Docker installed, you just need keys to authenticate to the various cloud service providers (CSPs).

At the end of these steps, you should have:

  • A key and a secret for Confluent Cloud

  • A public key and a private key for MongoDB Atlas

  • A key, a secret, and a token for AWS

And finally, get your Atlas Organization ID from the Atlas UI.
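A quick pre-flight check can catch a missing credential before the deploy script runs. The sketch below assumes the credentials are exported as environment variables; the variable names here are illustrative, so check the quickstart’s deploy script for the names it actually expects.

```python
# Hypothetical pre-flight check for the credentials listed above.
# The environment variable names are illustrative placeholders.
import os

REQUIRED_VARS = [
    "CONFLUENT_CLOUD_API_KEY", "CONFLUENT_CLOUD_API_SECRET",
    "MONGODB_ATLAS_PUBLIC_KEY", "MONGODB_ATLAS_PRIVATE_KEY",
    "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN",
    "MONGODB_ATLAS_ORG_ID",
]

def missing_credentials(env: dict) -> list:
    """Return the names of any required credentials absent or empty in env."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_credentials(dict(os.environ))
    if missing:
        print("Missing credentials:", ", ".join(missing))
    else:
        print("All credentials present.")
```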

Run the Quickstart

1. Bring up the infrastructure

# Follow the prompts to enter your API keys
./deploy.sh

2. Bring down the infrastructure

./destroy.sh

🚀 Project Structure

. # root of the project
├── frontend # Frontend project for the chatbot. This is what will be deployed to s3 and exposed via cloudfront
└── infrastructure # terraform to deploy the infrastructure
    ├── modules
    │     ├── backend # websocket backend & lambdas for the chatbot
    │     │     └── functions # lambda functions
    │     ├── confluent-cloud-cluster # confluent cloud infra. i.e. kafka, flink, schema registry, etc.
    │     └── frontend # s3 bucket and cloudfront distribution
    │         └── scripts # scripts to assist with building and deploying the frontend
    ├── scripts # scripts to assist with deploying the infrastructure
    └── statements # sql statements to register against a flink cluster
        ├── create-models
        ├── create-tables
        └── insert

GenAI Chatbot Architecture

Architecture for handling document indexing and chatbot functionality using a combination of AWS services, Anthropic Claude, MongoDB Atlas and Confluent Cloud.
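The retrieval step of the chatbot’s RAG flow can be sketched as follows: embed the user’s question, run an Atlas Vector Search against the indexed documents, and pass the retrieved text to the LLM as context. The index, field, and collection names below are hypothetical; the quickstart repository contains the actual wiring.

```python
# Sketch of the retrieval step using MongoDB Atlas Vector Search.
# Index and field names are placeholders for illustration.

def vector_search_pipeline(query_vector, limit=3,
                           index="vector_index", path="embedding"):
    """Build a MongoDB aggregation pipeline using the $vectorSearch stage."""
    return [
        {
            "$vectorSearch": {
                "index": index,
                "path": path,
                "queryVector": query_vector,
                # Common practice: consider more candidates than you return.
                "numCandidates": limit * 10,
                "limit": limit,
            }
        },
        # Keep only the text that will be stuffed into the prompt.
        {"$project": {"_id": 0, "text": 1}},
    ]

# Against a live Atlas cluster this would run as (requires pymongo):
#   docs = collection.aggregate(vector_search_pipeline(question_embedding))
#   context = "\n".join(d["text"] for d in docs)
```

The retrieved context is then included in the prompt sent to Anthropic Claude, grounding the chatbot’s answers in the freshest documents Confluent has streamed into Atlas.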

For a more detailed review of the quickstart, including a breakdown of the architecture and its components, please see the quickstart’s GitHub repository.

Additionally, check out the webinar “How to Build RAG Using Confluent with Flink AI Model Inference and MongoDB” to learn more about this use case and gain implementation guidance from Confluent and MongoDB’s experts.

Get started with Confluent

We’re excited to expand our partnership with MongoDB as part of their AI program, enabling Confluent to bring powerful AI-driven data streaming solutions to businesses worldwide. Confluent seamlessly integrates with major cloud providers and on-prem environments, ensuring enterprises can harness real-time data streaming with the flexibility to meet regulatory and privacy requirements. Together with MongoDB, we aim to be a trusted AI partner for enterprises, empowering them to build advanced applications that prioritize data security and privacy—ideal for organizations that require high levels of security for proprietary and sensitive data.

Ready to get going? Check out the quickstart and start building a GenAI chatbot today.

Not yet a Confluent customer? Start your free trial of Confluent Cloud today. New users receive $400 to spend during their first 30 days.

Easily stream processed, governed datasets to MongoDB with Confluent’s fully managed sink connector for MongoDB Atlas. 
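As an illustration, a fully managed MongoDB Atlas Sink connector is configured with a small JSON payload. The key names below follow the managed connector’s documented settings but should be verified against the current Confluent Cloud docs; the topic, host, and credential values are placeholders.

```python
# Illustrative configuration payload for Confluent's fully managed
# MongoDB Atlas Sink connector. Values in angle brackets are placeholders.
import json

sink_config = {
    "name": "mongodb-atlas-sink",
    "config": {
        "connector.class": "MongoDbAtlasSink",
        "topics": "enriched-transactions",
        "connection.host": "<atlas-host>.mongodb.net",
        "connection.user": "<atlas-user>",
        "connection.password": "<atlas-password>",
        "database": "genai",
        "collection": "documents",
        "input.data.format": "JSON",
        "tasks.max": "1",
    },
}

print(json.dumps(sink_config, indent=2))
```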

Apache®, Apache Kafka, Kafka®, Apache Flink, and Flink® are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries.

  • Pascal Vantrepote is a seasoned technology leader with over 20 years of experience in software architecture, AI, and early-stage startups. Currently the Senior Director of Partner Innovation at Confluent, he drives initiatives to accelerate data streaming technology adoption. Pascal has held key roles at organizations like Confluent, TD, Scotiabank, and Achieve3000, delivering innovative solutions and building high-performance teams. He has been recognized with honors such as the President's Club and Technology Innovation Star Award and is fluent in French and English.
