Raw data from IoT devices, like GPS trackers or electronic logging devices (ELDs), often lacks meaning on its own. However, when combined with information from other business systems, such as inventory management or customer relationship management (CRM), this data can provide a richer, more complete picture for more effective decision-making. For example, combining GPS data with inventory levels can optimize logistics and delivery routes. Similarly, integrating sensor data with customer purchase history enables personalized recommendations and targeted marketing campaigns. This contextualization of IoT data, especially when done in real time, can improve operational efficiency, enhance customer experiences, and unlock new revenue streams.
An excellent way to make this happen is to pair AWS IoT Core with Confluent Cloud.
Confluent Cloud’s extensive offering of SDKs and connectors (many of which are fully managed) means data from virtually any application, database, and object storage can be sourced into a central location. This broadens the pool of data available for transforming and combining with IoT data, resulting in enriched streams of contextualized data. Ultimately, this leads to a more comprehensive and insightful view of the data, enabling more effective decision-making.
AWS IoT, on the other hand, plays its part by providing the cloud services that connect your IoT devices to other devices and AWS cloud services. It provides device software that can help you integrate your IoT devices into AWS IoT-based solutions. If your devices can connect to AWS IoT, AWS IoT can connect them to the cloud services that AWS provides. One of the largest benefits of IoT Core is the bundle of solutions that abstract difficult aspects of managing IoT solutions. Some of these solutions are device provisioning and registry, device shadows, fleet indexing, over-the-air (OTA) updates, and message routing rules. In fact, the message routing rules are exactly how we are able to integrate.
The following will walk you through how to configure the integration between IoT Core and Confluent Cloud. A quick distinction: there are both IoT Core topics and Confluent Cloud topics. In IoT Core, the topics are MQTT topics that receive MQTT messages from devices, while Confluent Cloud has Kafka topics that can receive data from anywhere, including AWS IoT Core. There are further differences and similarities, but for the purposes of this blog, it's enough to know that the two are distinct and used in different places.
Navigate to Confluent Cloud (you can sign up here) and create a cluster.
Create new Kafka API keys. Select “My Account” when provided the option. Store the key and secret for later use. The IoT Core rules will use these keys to authenticate to Confluent Cloud and push data to your cluster’s topics.
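If you prefer the command line, the Confluent CLI can create the keys as well. This is a minimal sketch; the cluster ID lkc-abc123 is a placeholder you'd replace with your own (listed by confluent kafka cluster list):
confluent login
confluent api-key create --resource lkc-abc123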
Create a new topic in your cluster called device_data. Skip the data contract option for now if prompted.
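The same topic can also be created from the Confluent CLI, again assuming the placeholder cluster ID:
confluent kafka topic create device_data --cluster lkc-abc123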
Navigate to AWS Secrets Manager in the us-east-1 region and select “Store new secret.”
Choose the “Other type of secret” option.
For the key/value pairs, use the following values:
confluent_key: <your_kafka_api_key>
confluent_secret: <your_kafka_api_secret>
Click Next and name the secret iot-demo-confluent-secret.
Finish the secret setup with the default settings. IoT Core will later reference these keys in a secure fashion to publish messages to Confluent Cloud.
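If you'd rather script this step, the equivalent AWS CLI call looks like the following sketch, with placeholder values for the key and secret:
aws secretsmanager create-secret \
  --region us-east-1 \
  --name iot-demo-confluent-secret \
  --secret-string '{"confluent_key":"<your_kafka_api_key>","confluent_secret":"<your_kafka_api_secret>"}'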
Navigate to AWS IoT Core.
Find the “Message Routing” section of the menu and select Destinations.
Select “Create VPC destination.”
If you’ve already created a basic VPC with a public subnet, you can use that. If you have not, use the Create Destination flow to create those. You should still be in the us-east-1 region.
For the IAM role, use a role with the following attached permissions:
AWSIoTLogging
AWSIoTRuleActions
AWSIoTThingsRegistration
Note: You may need to create this role and attach these permissions; one way to do so from the CLI is sketched below.
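Here is one such sketch using the AWS CLI, with a hypothetical role name of iot-confluent-demo-role. The trust policy lets AWS IoT assume the role, and the loop attaches the three managed policies listed above (the ARNs assume AWS's standard service-role path):
aws iam create-role --role-name iot-confluent-demo-role \
  --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"iot.amazonaws.com"},"Action":"sts:AssumeRole"}]}'
for policy in AWSIoTLogging AWSIoTRuleActions AWSIoTThingsRegistration; do
  aws iam attach-role-policy --role-name iot-confluent-demo-role \
    --policy-arn "arn:aws:iam::aws:policy/service-role/${policy}"
done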
Click Create and wait until the Destination Status shows Active.
Navigate to AWS IoT Core.
Find the “Message Routing” section of the menu and select Rules.
Give your rule a name and click Next.
Provide the following SQL statement to specify what data we want IoT Core to send to Confluent Cloud. In this case, we want all of the data published to the IoT Core topic named device_data.
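A minimal rule statement that selects everything published to that topic looks like this:
SELECT * FROM 'device_data'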
Note: Normally you would end a SQL statement with a semicolon. In this case, omit the semicolon to avoid a parsing error upon trying to create the rule.
In the rule actions section, select the Apache Kafka® Cluster option.
Select the VPC destination you created in the previous section.
Provide the name of the Kafka topic we created earlier: device_data.
Provide the bootstrap servers using the following format: SASL_SSL://<your_bootstrap_servers>
Use the following configurations for the corresponding fields.
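As a sketch, the SASL/PLAIN fields for Confluent Cloud generally look like the following. The get_secret substitution template lets the rule read credentials from Secrets Manager at publish time; the role ARN is a placeholder and must reference a role permitted to call secretsmanager:GetSecretValue on the secret:
Key: (leave blank)
security.protocol: SASL_SSL
sasl.mechanism: PLAIN
sasl.plain.username: ${get_secret('iot-demo-confluent-secret', 'SecretString', 'confluent_key', '<your_iam_role_arn>')}
sasl.plain.password: ${get_secret('iot-demo-confluent-secret', 'SecretString', 'confluent_secret', '<your_iam_role_arn>')}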
Save your rule.
Go to the MQTT test client on the left-hand side.
Add the topic name device_data where prompted and click Publish. This will send a simple message to the IoT Core topic device_data, causing the IoT Core rule to trigger and ultimately publish the message into Confluent Cloud.
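Any valid JSON works as the message body; a purely illustrative telemetry payload might look like:
{
  "device_id": "truck-042",
  "latitude": 47.6062,
  "longitude": -122.3321,
  "speed_mph": 54
}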
Navigate to the cluster you previously created in Confluent Cloud and check the device_data topic for the message forwarded from IoT Core.
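If you prefer the CLI to the Confluent Cloud UI, you can tail the topic directly:
confluent kafka topic consume device_data --from-beginning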
If you run into issues, you can set up an error action in the IoT Core rule which will republish the error to an AWS IoT topic:
Go back to your IoT Core rule and select “Edit.”
Toward the bottom of your rule, you’ll see an “Error Action” section. Select Republish to AWS IoT Topic.
Provide the name iot_rule_errors.
Add an IAM role that allows the IoT Core rule to republish to AWS IoT topics. If you select an existing role, ideally one you created earlier in this walkthrough, the necessary permissions will be added automatically.
Select Save.
Back in the MQTT test client tab, select “Subscribe.”
Enter the topic iot_rule_errors. Now, when you publish to device_data, any errors encountered will be forwarded to the iot_rule_errors AWS IoT topic.
For even further details, you can configure AWS IoT Core logging and set the log level to DEBUG.
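Assuming a logging role already exists (the hypothetical role created earlier, with AWSIoTLogging attached, would work), one way to do this from the AWS CLI is:
aws iot set-v2-logging-options \
  --role-arn arn:aws:iam::<your_account_id>:role/iot-confluent-demo-role \
  --default-log-level DEBUG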
By following this guide, you have now set up a data streaming platform that can connect your edge devices with data from other parts of the business. To expand further, consider deploying a connector to source data from a database or object storage, then leveraging Apache Flink® to join and transform that data in real time before it is synced to a data warehouse for analytics or consumed by a downstream application for event-driven automation. By leveraging AWS IoT Core and Confluent Cloud, messages from IoT devices can drive analytics and event-driven automation.
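As a sketch of that expansion, assuming device_data has been registered as a Flink table and a hypothetical inventory table has been sourced from an inventory system, a real-time enrichment join in Flink SQL might look like:
SELECT
  d.device_id,
  d.latitude,
  d.longitude,
  i.stock_level
FROM device_data AS d
JOIN inventory AS i
  ON d.device_id = i.device_id;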
Ready to get started with Confluent Cloud on AWS Marketplace? New sign-ups receive $1,000 in free credits for their first 30 days! Subscribe through AWS Marketplace, and your credits will be instantly applied to your Confluent account.