Enhancing Security with IAM Roles in Confluent Managed Connectors

As cloud environments evolve, so must the security measures that protect them. With Confluent’s latest enhancement—AWS IAM role integration for managed connectors—you can now adopt temporary security credentials, reducing both the risk of long-term credential exposure and the operational burden of key management. This feature tightens security and simplifies access management for your data flows between AWS and Confluent Cloud.

A quick introduction to IAM roles

AWS Identity and Access Management (IAM) securely manages identities and access to AWS services and resources, and scales workload and workforce access. IAM roles provide a way to access AWS by relying on temporary security credentials. Each role has a set of permissions for making AWS service requests, and a role is not associated with a specific user or group. Instead, trusted entities such as identity providers or AWS services assume roles. 

You can use roles to delegate access to users, applications, or services that don't normally have access to your AWS resources; instead of long-term keys, they rely on short-term credentials. Authorized identities, which can be AWS services or users from your identity provider, can assume roles to make AWS requests.
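To make this concrete, here is a minimal sketch of role assumption using the AWS CLI; the role ARN, session name, and external ID below are placeholder values. The call returns temporary credentials (an access key, secret key, and session token) that expire automatically:

# Assume a role and receive temporary credentials (all values are placeholders)
aws sts assume-role \
 --role-arn arn:aws:iam::000000000000:role/my-test-aws-role \
 --role-session-name demo-session \
 --external-id my-external-id \
 --duration-seconds 3600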

Introducing Confluent Cloud Provider Integration for managed connectors

Confluent Cloud Provider Integration offers identity and access management (IAM) role-based authorization that lets you adopt the temporary security credentials of an IAM role, which acts as a set of permission policies. Trusted entities, such as IAM users, applications, or cloud services, can assume this role. Using this approach, you can create a secure access connection between source or sink resources on AWS and Confluent Cloud for data ingestion or transfer.

As of today, the Confluent Cloud Provider Integration is available as a part of the Confluent Cloud API. Using the REST API, you can map AWS Identity and Access Management roles in Confluent through the provider integration setup.

Benefits

  • Reduced attack surface – The connector previously used IAM user access keys. Now, the Confluent Cloud Provider Integration uses an integration ID mapped to an IAM role, eliminating the risk of leaked access keys.

  • Reduced overhead – A best practice for access keys, also known as long-term credentials, is to rotate the keys every 90 days. This acts as a stop-gap by limiting the amount of time bad actors have to act with compromised keys. With IAM roles, there are no long-term keys to rotate.

  • Enhanced security – The temporary security credentials issued for an IAM role have a limited lifetime, so you do not have to rotate them or explicitly revoke them when they're no longer needed. After temporary security credentials expire, they cannot be reused. You can specify how long the credentials are valid, up to a maximum limit.

How it works

To explain how the integration works, the following walks through how to set up Confluent Cloud’s source connector for Amazon DynamoDB CDC to assume an IAM role. 

Prerequisites

  • Authorized access to Confluent Cloud with the OrganizationAdmin or EnvironmentAdmin role to set up provider integration. If you do not have the appropriate role, reach out to your OrganizationAdmin or EnvironmentAdmin.

  • cURL and jq installed to use the API request examples in this document.

  • A Confluent Cloud API key to authenticate with the Confluent Provider Integration API. For information about how to create a Confluent Cloud API key, see Manage API Keys.

  • An existing DynamoDB table (or multiple tables) in the us-east-1 region.

  • DynamoDB Streams enabled on the DynamoDB table. You can follow this guide to enable it if it is not already set; a CLI sketch also follows this list.
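If the stream is not yet enabled, here is a quick sketch, assuming a table named transactions (replace with your own); NEW_AND_OLD_IMAGES is one of several valid stream view types:

# Enable DynamoDB Streams on an existing table (table name is a placeholder)
aws dynamodb update-table \
 --table-name transactions \
 --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES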

Create an IAM role 

This is one of the roles that Confluent Cloud’s source connector for Amazon DynamoDB CDC uses to retrieve data from your Amazon DynamoDB table. For ease, we’ve attached the AWS Managed policy AmazonDynamoDBFullAccess; however, when implementing in your own environment, it is strongly recommended that the policy you attach to this role be scoped to the minimum set of resources and actions the connector needs, as sketched below.
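As an illustration only, a scoped-down permission policy might look like the following. The action list is an assumption based on what a DynamoDB CDC connector typically needs (table reads plus stream reads); the connector also needs rights to create and write its KCL checkpointing tables, so treat the connector documentation as the authoritative list, and replace the account ID and table name with your own:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "dynamodb:DescribeTable",
            "dynamodb:Scan",
            "dynamodb:DescribeStream",
            "dynamodb:GetRecords",
            "dynamodb:GetShardIterator",
            "dynamodb:ListStreams"
          ],
          "Resource": [
            "arn:aws:dynamodb:us-east-1:000000000000:table/transactions",
            "arn:aws:dynamodb:us-east-1:000000000000:table/transactions/stream/*"
          ]
        }
      ]
    }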

  1. Navigate to the IAM Service within the AWS console.

  2. Select Roles, and then click Create role.

  3. In the Trusted entity type screen, select Custom trust policy. In the Custom trust policy editor, copy and paste the trust policy below. Notice that we don’t fully set the Principal and ExternalID values yet. For clarity, this is where we define what entities (a user, a role, or a service) can assume this role and, if appropriate, which entities to deny the ability to assume the role. An entity that assumes this role will be able to access and act according to what the attached IAM policies dictate. We will go back and fill these in properly after setting up the provider integration in Confluent Cloud via Confluent APIs.

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Deny",
          "Principal": {
            "AWS": "*"
          },
          "Action": "sts:AssumeRole",
          "Condition": {
            "StringEquals": {
              "sts:ExternalId": "dummy-externalId"
            }
          }
        }
      ]
    }

  4. Click Next to review the permissions, and then click Add permissions. Select the permission policy that you created or one of the AWS Managed policies for Amazon DynamoDB. For quick reference, here are the permissions required for the connector to function properly.

  5. In the Name, review, and create screen, enter a Role name and a Description.

  6. Click Create role to save your new IAM role.

  7. Copy the ARN of the IAM role you just created for use in the provider integration setup in Confluent; a CLI sketch for retrieving the ARN follows this list.
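If you prefer the CLI, here is a quick sketch; the role name ConfluentConnectorRole is a placeholder for whatever you entered in step 5:

# Print the ARN of the role (role name is a placeholder)
aws iam get-role --role-name ConfluentConnectorRole --query 'Role.Arn' --output text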

Create IAM role mapping in Confluent

With the role created, we are ready to create a provider integration. This process will: 

  1. Register the role within Confluent Cloud.

  2. Generate an integration ID, which will be used in the connector configuration.

  3. Generate the external ID used in the IAM role trust policy.

Note: Before proceeding, ensure that you created a Confluent Cloud resource management key (not a Kafka API key) and that you base64-encode your Confluent Cloud resource management key and secret. You can visit the documentation to learn how to do this. You will use this same base64-encoded string throughout the following commands.
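As a minimal sketch, assuming your key and secret are in hypothetical environment variables CONFLUENT_KEY and CONFLUENT_SECRET, the Basic auth string is the base64 encoding of key:secret:

# Produce the <base64-encoded-key-and-secret> value used in the headers below
echo -n "$CONFLUENT_KEY:$CONFLUENT_SECRET" | base64

With the encoded string in hand, create the provider integration: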

curl --request POST \
 --url https://api.confluent.cloud/pim/v1/integrations \
 --header 'Authorization: Basic <base64-encoded-key-and-secret>' \
 --header 'content-type: application/json' \
 --data '{
   "display_name": "dynamodb_provider_integration",
   "provider": "AWS",
   "config": {
      "customer_iam_role_arn": "arn:aws:iam::000000000000:role/my-test-aws-role",
      "kind": "AwsIntegrationConfig"
   },
   "environment": {
      "id": "env-00000"
   }
 }'

Below is an example output after creating a provider integration. Make a note of the following as each is used in upcoming configurations:

  • iam_role_arn – used in the IAM role trust policy for role chaining (i.e., the connector will assume this role first and then assume the role that you created earlier). 

  • id – the ID of the provider integration you just created.

  • external_id – used in the IAM role trust policy.

{
   "api_version": "pim/v1",
   "config": {
      "customer_iam_role_arn": "arn:aws:iam::635910096382:role/ConfluentConnectorRole",
      "external_id": "0ebb08b8-5dc8-45fa-af7c-76c4083533ba",
      "iam_role_arn": "arn:aws:iam::851725421142:role/cspi-gl5g4",
      "kind": "AwsIntegrationConfig"
   },
   "display_name": "dynamodb_provider_integration",
   "environment": {
      "id": "env-oxd00p",
      "related": "https://confluent.cloud/org/v3/environments/env-oxd00p",
      "resource_name": "crn://confluent.cloud/organization=f0914313-9a78-4475-8793-bcf211076166/environment=env-oxd00p"
   },
   "id": "cspi-gl5g4",
   "kind": "Integration",
   "provider": "aws",
   "usages": null
}
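If you are scripting the setup, here is a small sketch for pulling out those three values with jq, assuming you saved the response above to a file named integration.json (a hypothetical filename):

# Extract the id, iam_role_arn, and external_id from the saved response
jq '{id, iam_role_arn: .config.iam_role_arn, external_id: .config.external_id}' integration.json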

Update the IAM role trust policy

Follow the steps below to update the trust policy with Confluent IAM role configurations in the AWS account. This allows Confluent to assume the role in your AWS account.

  1. Open the AWS console at https://console.aws.amazon.com/iam/

  2. Navigate to Roles, and then open the IAM role you created in the "Create an IAM role" section.

  3. In the Trust relationships tab, click Edit trust policy and update the following configurations:

    1. Change Effect to Allow.

    2. Under Principal, add the iam_role_arn from the Create IAM role mapping section. Essentially, this grants the role listed under Principal permission to assume the connector role you created; in this case, the entity assuming the connector role is the Confluent Cloud connector.

    3. Under Condition, add the external_id from the Create IAM role mapping section.

{
   "Version": "2012-10-17",
   "Statement": [
      {
         "Effect": "Allow",
         "Principal": {
            "AWS": "arn:aws:iam::851725421142:role/cspi-gl5g4"
         },
         "Action": "sts:AssumeRole",
         "Condition": {
            "StringEquals": {
               "sts:ExternalId": "0ebb08b8-5dc8-45fa-af7c-76c4083533ba"
            }
         }
      }
   ]
}
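If you script this step rather than using the console, here is a hedged sketch, assuming the updated policy above is saved as trust-policy.json and the role is named ConfluentConnectorRole (both placeholders):

# Replace the role's trust policy with the updated document
aws iam update-assume-role-policy \
 --role-name ConfluentConnectorRole \
 --policy-document file://trust-policy.json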

Create a connector with the IAM role

Deploy a Confluent Cloud source connector for Amazon DynamoDB CDC using the following command. You will need to create Kafka cluster API keys (these are different from Confluent Cloud API keys). Insert the Kafka cluster API key and secret, as well as the provider integration ID, into the command before running it.
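The command below references $ENVID and $CLUSTERID; as a sketch, export them first with your own environment and Kafka cluster IDs (the values shown are placeholders):

# Placeholders: substitute your own environment and cluster IDs
export ENVID=env-00000
export CLUSTERID=lkc-00000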

curl --request POST \
--url "https://api.confluent.cloud/connect/v1/environments/$ENVID/clusters/$CLUSTERID/connectors" \
--header "Authorization: Basic <base64-encoded-key-and-secret>" \
--header "content-type: application/json" \
--data '{
    "name": "DynamoDbCdcSourceConnector_0",
    "config": {
        "connector.class": "DynamoDbCdcSource",
        "dynamodb.cdc.checkpointing.table.prefix": "connect-KCL-",
        "dynamodb.cdc.checkpointing.table.read.capacity": "50",
        "dynamodb.cdc.checkpointing.table.write.capacity": "50",
        "dynamodb.cdc.max.poll.records": "5000",
        "dynamodb.cdc.table.billing.mode": "PROVISIONED",
        "dynamodb.snapshot.max.poll.records": "1000",
        "dynamodb.snapshot.max.rcu.percentage": "50",
        "dynamodb.table.discovery.mode": "INCLUDELIST",
        "dynamodb.table.includelist": "transactions",
        "dynamodb.table.sync.mode": "SNAPSHOT_CDC",
        "kafka.api.key": "<REPLACE_THIS_VALUE>",
        "kafka.api.secret": "<REPLACE_THIS_VALUE>",
        "kafka.auth.mode": "KAFKA_API_KEY",
        "key.converter.reference.subject.name.strategy": "DefaultReferenceSubjectNameStrategy",
        "key.subject.name.strategy": "TopicNameStrategy",
        "max.batch.size": "1000",
        "output.data.format": "AVRO",
        "output.data.key.format": "AVRO",
        "poll.linger.ms": "5000",
        "schema.context.name": "default",
        "tasks.max": "1",
        "name": "DynamoDbCdcSourceConnector_0",
        "value.converter.reference.subject.name.strategy": "DefaultReferenceSubjectNameStrategy",
        "value.subject.name.strategy": "TopicNameStrategy",
        "authentication.method": "IAM Roles",
        "provider.integration.id": "<REPLACE_THIS_VALUE>",
        "dynamodb.service.endpoint": "dynamodb.us-east-1.amazonaws.com"
    }
}' | jq

Verify incoming messages

Once your connector has a Running status, you can view the data coming in from DynamoDB. Navigate to Topics in your Confluent Cloud cluster; you will see a topic for every table you chose to import. Click into a topic, select the Messages tab, and choose From beginning in the dropdown to view the data that has been imported into your Confluent Cloud cluster.
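You can also check on the connector from the command line. Here is a sketch against the Connect API status endpoint, reusing the same placeholders as the creation command:

curl --request GET \
 --url "https://api.confluent.cloud/connect/v1/environments/$ENVID/clusters/$CLUSTERID/connectors/DynamoDbCdcSourceConnector_0/status" \
 --header "Authorization: Basic <base64-encoded-key-and-secret>" | jq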

Clean up

Navigate to the DynamoDB connector and click the Settings tab. Scroll down to the bottom of the page and click Delete Connector.

Now navigate to Topics and delete the topics created by the connector. These are named after the tables you gave the DynamoDB connector access to; a CLI sketch follows.
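If you have the Confluent CLI installed, here is a quick sketch for deleting an example topic; transactions matches the table name used earlier, and lkc-00000 is a placeholder cluster ID:

# Select the cluster, then delete the topic created by the connector
confluent kafka cluster use lkc-00000
confluent kafka topic delete transactions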

Last, delete the provider integration using the following command (be sure to replace the placeholders with your own data):

curl --request DELETE \
 --url 'https://api.confluent.cloud/pim/v1/integrations/{id}?environment={environment-id}' \
 --header 'Authorization: Basic <base64-encoded-key-and-secret>' | jq

Conclusion

Confluent Cloud’s Provider Integration allows connectors to assume an IAM role to securely source from or sink to targets within your cloud environments. This eliminates the risk of credential leakage and reduces the overhead of managing access key rotation. The feature is available across all major clouds; you can start using it by visiting the connector documentation or deploying directly from the AWS Marketplace.

  • Braeden Quirante began his career as a software consultant where he worked on a wide array of technical solutions including web development, cloud architecture, microservices, automation, and data warehousing. Following these experiences, he joined Amazon Web Services as a partner solutions architect working with AWS partners in scaled motions such as go-to-market activities and partner differentiation programs. Braeden currently serves as a partner solutions engineer for Confluent and an AWS evangelist.

  • Weifan Liang is a Senior Partner Solutions Architect at AWS. He works closely with AWS top strategic data analytics software partners to drive product integration, build optimized architecture, develop long-term strategy, and provide thought leadership. Innovating together with partners, Weifan strives to help customers accelerate business outcomes with cloud powered digital transformation.

  • Jai Vignesh began his career as a developer before transitioning to a product manager (PM). As a PM, he has led the roadmap for multiple products, delighting customers and driving product success. Currently, Jai is part of the Confluent Connect PM team. In this role, he focuses on ensuring seamless connectivity between Confluent and various databases, data warehouses, and SaaS applications.
