Confluent provides a powerful solution for connecting multiple sources of data in real time. For any system (or even a person) to access data from another system, it typically needs to authenticate with secret credentials to keep data access secure. For Confluent, and Kafka Connect in particular, secret management is about configuring Confluent with the secret credentials it needs to pull data from your data sources.
Because of their sensitive nature, you will want to store these credentials securely and grant Confluent only the minimum access to them. HashiCorp Vault together with Kubernetes creates a powerful combination for securely storing credentials and supplying them to Confluent in a way that minimizes both operational toil and the potential exposure of those credentials. With Confluent Operator, teams deploying Confluent can leverage that powerful combination.
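One common pattern for wiring this together, sketched below under assumed paths and names: the Vault Agent Injector writes a secret into a file mounted inside the Connect worker pod (via pod annotations such as `vault.hashicorp.com/agent-inject`), and Kafka Connect's built-in `FileConfigProvider` resolves placeholders in connector configs from that file, so the plaintext secret never appears in the connector configuration itself. The file path and property names here are illustrative, not taken from the demo.

```properties
# Kafka Connect worker config sketch (assumed setup):
# register the built-in FileConfigProvider so connector configs
# can reference secrets as ${file:<path>:<key>} instead of plaintext.
config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider
```

With this in place, a connector config can reference a key from a Vault-injected file (e.g., `${file:/vault/secrets/github:token}`), and only the Connect worker process ever reads the resolved value.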
In this episode of Livestreams, Amit Gupta, group product manager at Confluent, will join your host, Viktor Gamov. They’ll walk you through an end-to-end demo of Confluent Operator deploying Confluent to Kubernetes. They will also deploy the Kafka Connect GitHub Source Connector, which watches for commits to a GitHub repository and writes those commit messages to an Apache Kafka® topic. The connector will authenticate with GitHub using secret credentials stored in HashiCorp Vault and accessed by the connector securely and automatically.
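To make the flow concrete, here is a hypothetical connector submission showing how the GitHub token could be referenced indirectly rather than embedded in plaintext. The connector class, property names, repository, and file path are all assumptions for illustration; the actual demo's connector config may differ.

```json
{
  "name": "github-source",
  "config": {
    "connector.class": "com.simplesteph.kafka.GitHubSourceConnector",
    "github.repo": "apache/kafka",
    "topic": "github-commits",
    "auth.access.token": "${file:/vault/secrets/github:token}"
  }
}
```

Because the token appears only as a `${file:...}` placeholder, it is resolved by the worker at runtime and is not exposed in the Connect REST API or stored in the config topic in plaintext.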