Fleet management has played a fundamental role in the operation of different businesses for decades. Haulage companies, construction firms, taxi operators, and rental businesses have all relied on some form of fleet management system to maintain their fleet of vehicles and increase the efficiency of their operations.
What’s changed dramatically is the technology (both hardware and software) behind these systems. What started with relatively simple GPS technologies has exploded into a universe of Internet of Things (IoT) devices, all specialized to measure different critical vehicle metrics (e.g., fuel consumption, component vibration, and device temperature, to name a few).
This has enabled companies to collect an incredible amount of telemetry data and gain a more holistic view of their fleets. At the same time, however, it has also presented a number of challenges—principally, how to implement a data architecture that can handle the ever-increasing amount of data emitted from vehicle IoT sensors in real time.
This blog delves into some of those technical challenges, explaining how organizations use data streaming to address them and why it’s so important to building a real-time fleet management system.
Many IoT applications, including fleet management, rely on Message Queuing Telemetry Transport (MQTT) as the client-server messaging protocol. This is because it’s lightweight (i.e., it requires minimal resources to run on constrained devices) and designed specifically for use in low-bandwidth, high-latency, or unreliable networks. To transfer data between MQTT clients, organizations use message brokers such as HiveMQ, RabbitMQ, or Mosquitto.
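To make this concrete, here is a minimal sketch of a vehicle-side client publishing a single telemetry reading over MQTT, using the paho-mqtt Python client (1.x API). The broker host, topic layout, and payload fields are illustrative placeholders rather than part of any specific product.

```python
import json
import time

import paho.mqtt.client as mqtt  # assumes the paho-mqtt 1.x client API

# Hypothetical broker host and topic layout; adjust for your environment.
BROKER_HOST = "mqtt.example.com"
TELEMETRY_TOPIC = "fleet/telemetry/vehicle-1042"

client = mqtt.Client(client_id="vehicle-1042")
client.connect(BROKER_HOST, port=1883, keepalive=60)
client.loop_start()  # background network loop handles the socket I/O

# A single telemetry reading, serialized as JSON.
reading = {
    "vehicle_id": "1042",
    "fuel_level_pct": 62.0,
    "engine_temp_c": 88.4,
    "ts_ms": int(time.time() * 1000),
}
info = client.publish(TELEMETRY_TOPIC, json.dumps(reading))
info.wait_for_publish()  # make sure the packet has actually left the client

client.loop_stop()
client.disconnect()
```

A topic hierarchy like fleet/telemetry/<vehicle_id> keeps each vehicle’s stream individually addressable, while downstream consumers can subscribe to all vehicles at once with a wildcard such as fleet/telemetry/#.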
Despite its ubiquity in IoT applications, MQTT poses a number of limitations, including:
No stream processing – MQTT follows a simple pub/sub pattern and has no built-in stream processing. This prevents organizations from integrating data streams from a large number of vehicle components (i.e., millions of devices) in real time to power a holistic fleet management solution.
No data reprocessing – Many MQTT deployments rely on a “fire-and-forget” communication pattern (i.e., a sender publishes a message without expecting an acknowledgment, which corresponds to MQTT QoS 0; see the sketch below). If a message isn’t delivered due to network issues or message queue failure (a common occurrence with hundreds of thousands of distributed vehicles), this can result in lost data.
Business impact: While MQTT is ideal for connecting large numbers of vehicles across low-bandwidth networks, it’s not designed to stream and process large quantities of complex IoT device data in real time. Depending on the broker, this can lead to lost, delayed, or incomplete data. Ultimately, this prevents companies from gaining a holistic, real-time view of their fleets (with all of the operational inefficiencies that entails).
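To illustrate the “fire-and-forget” point: in MQTT, the delivery guarantee is chosen per message via its QoS level. The short sketch below (again using a hypothetical broker and the paho-mqtt 1.x client) contrasts QoS 0, where an unacknowledged message is simply gone, with QoS 1, where the client retransmits until the broker acknowledges it.

```python
import json

import paho.mqtt.client as mqtt  # paho-mqtt 1.x API assumed

client = mqtt.Client(client_id="vehicle-1042")
client.connect("mqtt.example.com", port=1883, keepalive=60)  # hypothetical broker
client.loop_start()

payload = json.dumps({"vehicle_id": "1042", "fuel_level_pct": 62.0})

# QoS 0 – "fire and forget": the packet is sent once with no acknowledgment.
# If it's dropped on a flaky cellular link, the reading is simply gone.
client.publish("fleet/telemetry/vehicle-1042", payload, qos=0)

# QoS 1 – at-least-once: the client retransmits until the broker replies with
# PUBACK, so a dropped packet doesn't silently become lost telemetry.
info = client.publish("fleet/telemetry/vehicle-1042", payload, qos=1)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```

Even with QoS 1 or 2, the guarantee only covers the hop between client and broker; it doesn’t give you durable storage, replay, or processing of the stream once it reaches the broker.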
In order to address these challenges for fleet management, many organizations combine MQTT with Apache Kafka®, the leading data streaming technology used by over 70% of Fortune 500 companies.
Unlike MQTT, Kafka wasn’t designed specifically for IoT applications (i.e., it lacks IoT-specific features such as Keep Alive and Last Will and Testament). It was, however, designed to handle incredibly high throughput in a reliable, scalable way across distributed systems, and to process data as it’s generated. As a distributed event log with infinite storage, Kafka integrates with MQTT to enable companies to stream large volumes of IoT data from thousands (or hundreds of thousands) of IoT devices and power real-time applications.
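For a feel of the Kafka side, here is a minimal producer sketch using the confluent-kafka Python client. The bootstrap server, credentials, and topic name are placeholders; the point is that delivery reports make failures visible, and keying by vehicle ID preserves per-vehicle ordering while Kafka spreads the load across partitions.

```python
from confluent_kafka import Producer

# Placeholder bootstrap server and credentials for a Kafka/Confluent cluster.
producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

def on_delivery(err, msg):
    # Delivery reports make message loss visible instead of silent.
    if err is not None:
        print(f"Delivery failed for key {msg.key()}: {err}")

# Keying by vehicle ID routes all of a vehicle's events to the same partition,
# preserving per-vehicle ordering as the fleet scales out.
producer.produce(
    "fleet.telemetry",
    key="vehicle-1042",
    value='{"vehicle_id": "1042", "fuel_level_pct": 62.0, "engine_temp_c": 88.4}',
    on_delivery=on_delivery,
)
producer.flush()
```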
There are a number of ways to integrate MQTT and Kafka (please refer to this post for further details). In our experience, one of the most common is to leverage Kafka Connect with Confluent’s MQTT Source and Sink Connectors.
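As a rough sketch of that pattern, the snippet below registers a self-managed Confluent MQTT Source connector through the Kafka Connect REST API. The host names, topic names, and values shown are placeholders, and the connector documentation should be consulted for the full set of required properties for your deployment.

```python
import json

import requests  # pip install requests

# Sketch: register a self-managed Confluent MQTT Source connector via the
# Kafka Connect REST API. All hosts, topics, and values are placeholders.
connector_config = {
    "name": "fleet-mqtt-source",
    "config": {
        "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
        "tasks.max": "1",
        "mqtt.server.uri": "tcp://mqtt.example.com:1883",
        "mqtt.topics": "fleet/telemetry/#",  # wildcard over all vehicles
        "kafka.topic": "fleet.telemetry",    # destination Kafka topic
        "mqtt.qos": "1",
    },
}

resp = requests.post(
    "http://connect.example.com:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector_config),
)
resp.raise_for_status()
```

A sink connector (for example, toward MongoDB Atlas, as described below) is registered the same way, so the bridge in both directions stays configuration rather than custom code.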
Confluent, based on Apache Kafka and powered by the Kora engine, is a complete, cloud-native data streaming platform, used by businesses across different industries to stream IoT data from thousands of vehicles and maintain a real-time view of fleets.
Here’s a high-level architecture of a fleet management solution using the integration pattern mentioned above:
In this pattern, Confluent’s MQTT Source and Sink connectors enable the bi-directional flow of data between MQTT devices and the Confluent cluster. Telemetry data is streamed and processed in real time, then synced downstream to MongoDB Atlas, where it powers the fleet management solution. This particular pattern takes advantage of:
Pre-built connectors – Confluent provides over 120 pre-built connectors, 70 of which are fully managed on Confluent Cloud. This enables development teams to deploy event-driven applications faster.
Stream Processing – Confluent enables organizations to join, filter, and aggregate streams of data in a reliable, scalable way. This allows companies to build a holistic, real-time view of the status of their vehicle fleets.
Schema Registry – A key component of Stream Governance, Schema Registry ensures the consistency of data streams and validates data schemas between producers and consumers, providing compatibility checks, versioning, and a path for schemas to evolve safely.
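As a small illustration of the producer side of that governance, the sketch below uses the confluent-kafka Python client with Schema Registry to serialize telemetry events as Avro before producing them. The endpoints, credentials, and schema are illustrative placeholders.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Illustrative Avro schema for a telemetry event.
SCHEMA_STR = """
{
  "type": "record",
  "name": "VehicleTelemetry",
  "fields": [
    {"name": "vehicle_id", "type": "string"},
    {"name": "fuel_level_pct", "type": "double"},
    {"name": "engine_temp_c", "type": "double"}
  ]
}
"""

# Placeholder Schema Registry and Kafka endpoints/credentials.
sr_client = SchemaRegistryClient({
    "url": "https://<SCHEMA_REGISTRY_ENDPOINT>",
    "basic.auth.user.info": "<SR_API_KEY>:<SR_API_SECRET>",
})
serializer = AvroSerializer(sr_client, SCHEMA_STR)

producer = Producer({"bootstrap.servers": "<BOOTSTRAP_SERVER>"})
event = {"vehicle_id": "1042", "fuel_level_pct": 62.0, "engine_temp_c": 88.4}

# Serialization registers/validates the schema for the topic's value subject,
# so producers and consumers agree on the shape of every record.
value_bytes = serializer(event, SerializationContext("fleet.telemetry", MessageField.VALUE))
producer.produce("fleet.telemetry", key="vehicle-1042", value=value_bytes)
producer.flush()
```

On the consumer side, an AvroDeserializer configured against the same Schema Registry benefits from the compatibility checks mentioned above, so evolving the schema (for example, adding an optional field) doesn’t break existing consumers.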
When used together, MQTT and Kafka enable organizations to stream, process, and govern large volumes of vehicle telemetry data in real time. Regardless of industry (e.g., construction, vehicle rental, or haulage), this facilitates the delivery of a fleet management system that can:
Increase efficiency, by optimizing routes, reducing idle time, improving fuel economy, and reducing maintenance time/costs.
Improve fleet visibility, by providing the real-time locations of vehicles alongside streaming analytics on driver behavior and vehicle condition.
Provide a better service, by giving customers accurate arrival/delivery information, and removing a reliance on customer service teams.
If you’d like to see how data streaming with Confluent works in practice, try Confluent Cloud for free today.