
Building a Data-Centric Force

Written by Will LaForest

The Department of Defense (DOD) has set a clear strategy to become a data-centric agency, driven by the recognition that data is a strategic asset. To realize this vision, mission-critical data needs to be interoperable and accessible across all strategic and tactical programs and environments, including disrupted, disconnected, intermittent, and low-bandwidth (DDIL) conditions, the tactical edge, and the enterprise cloud. An emerging approach to becoming data-centric is to implement a data fabric strategy. Doing so will enable the DOD to remain connected in DDIL conditions, fully utilize edge computing, and apply AI to tactical and operational activities.

Data fabric defined

Gartner defines data fabric as a design concept that serves as an integrated layer (fabric) of data and connecting processes. This layer utilizes continuous analytics over existing, discoverable, and curated metadata assets to support the design, deployment, and utilization of integrated and reusable data across all environments. 

This approach gives the DOD the means to access and use important data across the enterprise and in multiple environments. It enables the data architecture to scale both technologically and organizationally, eliminating ad hoc point-to-point connections in data pipelines.
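To make the contrast with point-to-point integration concrete, here is a minimal, hypothetical sketch using the open-source Apache Kafka producer client (the engine underneath Confluent's platform): a source system publishes each event once to a shared topic, and any number of downstream systems can subscribe without a new bespoke interface. The topic name, key, and payload are illustrative only and are not drawn from any Navy program.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class MaintenanceEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish each maintenance event once to a shared topic; any number of
            // downstream consumers can subscribe without a new point-to-point feed.
            // Topic, key, and payload below are hypothetical.
            producer.send(new ProducerRecord<>(
                    "maintenance-events",
                    "hull-1234",
                    "{\"event\":\"work_order_opened\"}"));
        }
    }
}

The design point is that producers and consumers never know about each other; adding a new consumer is a configuration change, not a new pipeline.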

Data fabric in action

The application of data fabric is not theoretical. It is happening today across the DOD, and specifically within the Navy in multiple programs. The Logistics Information Technology (LOGIT) program is a multi-year architectural plan the Navy is using to integrate and automate supply and logistics data for the entire naval fleet. The goal is full visibility across three key operational systems (N-MRO, N-SEM, and N-PLM) so the Navy can proactively coordinate the fastest possible maintenance schedule.

Additionally, the Logistics Information Naval Connector (LINC) program is an operational environment that provides additional platforms with a standardized Platform-as-a-Service (PaaS) offering for hosting the Navy's portfolio of logistics applications.

Each of these programs requires creating ship-to-ship, ship-to-shore, and ship-to-edge connectivity with data as a service. This connectivity will build a modern logistics and supply chain management practice that improves maintenance efficiency, informs mission planning, and drives toward predictive logistics.

Confluent is proud to serve as the data streaming platform and data broker for these Navy programs, creating an integrated solution that pulls data from siloed and disparate systems into a singular view of maintenance records, supply levels, and tactical assignments. With data streaming, integrating systems takes minutes, not weeks, resulting in tens of millions of dollars in cost savings.
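As a hypothetical sketch of what that singular view can look like in practice, the Kafka Streams snippet below joins two changelog tables, keyed by a common asset identifier, into one continuously updated readiness view. The topic names, keys, and JSON payloads are assumptions for illustration, not the actual LOGIT or LINC schemas.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KTable;

import java.util.Properties;

public class ReadinessViewApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "readiness-view");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical topics, each keyed by an asset identifier.
        KTable<String, String> maintenance = builder.table("maintenance-records");
        KTable<String, String> supply = builder.table("supply-levels");

        // Join the two tables into a single, continuously updated view per asset.
        KTable<String, String> readiness = maintenance.join(
                supply,
                (maint, sup) -> "{\"maintenance\":" + maint + ",\"supply\":" + sup + "}");

        readiness.toStream().to("asset-readiness-view");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Because the view is computed from the streams themselves, each new source system becomes one more topic to join rather than another bespoke integration project.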

Our team will be at AFCEA WEST, February 13-15 in San Diego, to talk about how data streaming powers a data-centric approach in support of edge and AI mission goals. To learn more, visit us at Booth 2920. Not going to the show? Contact us today.

  • Will LaForest is Field CTO for Confluent. In this role, LaForest works with customers across a broad spectrum of industries and government, enabling them to realize the benefits of a data-in-motion architecture with event streaming. He is passionate about data technology innovation and has spent 26 years helping customers wrangle data at massive scale. His technical career spans software engineering, NoSQL, data science, cloud computing, machine learning, and building statistical visualization software, but it began with code slinging at DARPA as a teenager. LaForest holds degrees in mathematics and physics from the University of Virginia.
