The Department of Defense (DOD) has set a clear strategy to become a data-centric agency. This direction is driven by the recognition that data is a strategic asset. To realize this vision, mission-critical data needs to be interoperable and accessible for all strategic and tactical programs, including disrupted, disconnected, intermittent, and low-bandwidth (DDIL), tactical edge, and enterprise cloud environments. An emerging approach to becoming data-centric is to implement a data fabric strategy. Doing so will enable the DOD to remain connected within DDIL environments, fully utilize edge computing, and apply AI to tactical and operational activities.
Gartner defines data fabric as a design concept that serves as an integrated layer (fabric) of data and connecting processes. This layer utilizes continuous analytics over existing, discoverable, and curated metadata assets to support the design, deployment, and utilization of integrated and reusable data across all environments.
This approach gives DOD the means to access and use important data across the enterprise and in multiple environments. It enables scalability of the data architecture both technologically and organizationally, eliminating ad hoc point-to-point connections in data pipelines.
The application of data fabric is not theoretical. It is happening today across the DOD, and specifically within the Navy in multiple programs. The Logistics Information Technology (LOGIT) program is a multi-year architectural plan the Navy is using to integrate and automate supply and logistics data for the entire naval fleet. The goal is to have full visibility across three key operational systems (N-MRO, N-SEM, and N-PLM) so the Navy can proactively coordinate the fastest possible maintenance schedules.
Additionally, the Logistics Information Naval Connector (LINC) program is an operational environment that provides additional platforms with a standardized Platform-as-a-Service (PaaS) for hosting the Navy's portfolio of logistics applications.
Each of these programs requires creating ship-to-ship, ship-to-shore, and ship-to-edge connectivity with data as a service. This connectivity will build a modern logistics and supply chain management practice that will improve maintenance efficiency, inform mission planning, and drive toward predictive logistics.
Confluent is proud to serve as the data streaming platform and data broker for Navy programs, creating an integrated solution that pulls data from siloed and disparate systems for a singular view of maintenance records, supply levels, and tactical assignments. The application of data streaming means that integrating systems takes minutes, not weeks, resulting in tens of millions in cost savings.
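To make the integration pattern concrete, here is a minimal, illustrative sketch of the idea behind that singular view: events from separate source systems are folded into one materialized view keyed by asset. This is not the Navy's or Confluent's actual implementation; the system names, field names, and function below are invented for illustration, and a real deployment would apply the same fold continuously over Kafka topics rather than an in-memory list.

```python
from collections import defaultdict

def build_asset_view(events):
    """Fold a stream of (asset_id, source_system, record) events into
    a single per-asset view spanning all source systems."""
    view = defaultdict(dict)
    for asset_id, source, record in events:
        # Later events from the same source overwrite earlier ones,
        # so the view always reflects the latest known state.
        view[asset_id][source] = record
    return dict(view)

# Hypothetical events from siloed maintenance and supply systems.
events = [
    ("hull-0042", "maintenance", {"status": "inspection due"}),
    ("hull-0042", "supply", {"spares_on_hand": 3}),
    ("hull-0042", "maintenance", {"status": "in repair"}),
]

view = build_asset_view(events)
print(view["hull-0042"])
# {'maintenance': {'status': 'in repair'}, 'supply': {'spares_on_hand': 3}}
```

In a streaming platform this fold never terminates: each new event updates the view incrementally, which is why adding another source system is a configuration change rather than a weeks-long point-to-point integration project.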
Our team will be at the AFCEA WEST show February 13-15 in San Diego to talk about how data streaming feeds a data-centric approach to mission-supporting edge and AI goals. To learn more, visit us at Booth 2920 at AFCEA West. Not going to the show? Contact us today.