A second can be the difference between success and failure. Are you performing cyber and dynamic threat detection? Handling analytics in a denied, degraded, intermittent, or limited (DDIL) environment? Predicting weather for unmanned aircraft, ground, or water vehicles? When it comes to making instant decisions to increase a warfighter's advantage, data is the backbone of intelligence and decision-making. In such scenarios, high-speed data alone is not enough.
Mission-critical situations require real-time data. A fully equipped database can perform 200 million reads per second. Rapid, real-time decision-making provides the advantage needed to take the quickest and most efficient action.
The role of AI
The United States supports the responsible military use of artificial intelligence and autonomous systems. The database combines historical and streaming data analytics to ingest and analyze large datasets, whether running at the tactical edge or across the public cloud. Data feeds AI and machine learning (ML) algorithms quickly and accurately during times of crisis. Powering mission-critical applications with more, faster, and better data enables continuous learning and real-time decision-making. Both quality and quantity are essential for sound decision-making, and this is where ML comes into play.
AI/ML models perform better with more data and more iterations. The more training, tuning, and validation you do, the better the results. The challenge lies in preparing the data, creating the model, and tuning it so that the model is constantly evolving. This becomes even more complex when an online system has persistent signal data ingestion from disparate sources and still needs to make inferences in milliseconds. Modern data architectures must support both online and offline data feeds for machine learning systems. The increased data feed reduces model iteration intervals and improves model accuracy.
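As a rough illustration of that online/offline split (a minimal sketch using only the standard library, not any specific product API; the model and names are invented for illustration), a system can run millisecond inference on the hot path while accumulating a buffer that periodically re-fits the model, so shorter iteration intervals keep the model fresh:

```python
import random
from collections import deque

class OnlineModel:
    """Toy threshold model standing in for any AI/ML engine."""
    def __init__(self):
        self.threshold = 0.5

    def infer(self, signal: float) -> bool:
        # Online path: millisecond-scale inference on each streamed signal.
        return signal > self.threshold

    def retrain(self, history) -> None:
        # Offline path: re-fit the threshold from accumulated data.
        self.threshold = sum(history) / len(history)

model = OnlineModel()
buffer = deque(maxlen=10_000)  # offline feed: accumulated training data

random.seed(42)
alerts = 0
for step in range(5_000):
    signal = random.random()   # online feed: one streaming signal
    if model.infer(signal):    # real-time inference
        alerts += 1
    buffer.append(signal)      # persist the signal for later training
    if step % 1_000 == 999:    # shorter iteration interval, fresher model
        model.retrain(buffer)
```

The design point is that ingestion never waits on training: inference reads the current model while retraining happens on a separate cadence against the buffered data.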
Move data from the edge to the tactical cloud despite challenges
The database must be able to process data from any AI/ML engine. It should also run anywhere and everywhere: real-time data should be available on-premises or in a hybrid cloud, multicloud, or cross-cloud environment.
Data processing typically takes place far from where decisions are made. High-speed processing occurs from the tactical edge to the tactical cloud, for example from an unmanned vehicle in one country to a command center in another, generating a common operating picture (COP) for any area of responsibility (AoR) with a global perspective.
Mission teams need to transmit data despite the constraints of low bandwidth and DDIL edge environments. There should be virtually zero latency during replication between the edge and the tactical cloud, whether processing gigabytes or petabytes of changed or compressed data.
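One way to respect such bandwidth constraints, sketched here with standard-library tools rather than any particular replication protocol, is to ship only the changed records and compress them before they cross the constrained link:

```python
import json
import zlib

def prepare_replication_batch(changed_records: list) -> bytes:
    """Serialize only changed records and compress them for a narrow link."""
    payload = json.dumps(changed_records).encode("utf-8")
    return zlib.compress(payload, level=9)

def apply_replication_batch(blob: bytes) -> list:
    """Reverse the process on the tactical-cloud side."""
    return json.loads(zlib.decompress(blob).decode("utf-8"))

# Illustrative change set: 500 updated track records from an edge node.
changes = [
    {"id": i, "track": "uav-7", "lat": 34.05, "lon": -118.24}
    for i in range(500)
]
blob = prepare_replication_batch(changes)
restored = apply_replication_batch(blob)
```

Because changed records tend to share structure, compression of the delta alone can shrink the transmitted payload dramatically compared with re-sending full state.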
Modern data architecture supports the depth and diversity of real-time data
The data itself is not tidy. It comes from tens of thousands of sources around the world, so modern data architectures must support multi-model capabilities. What is needed is a multi-model database that ingests and analyzes large amounts of structured, semi-structured, and unstructured data (streaming signals from wireless networks, wireless access points, SATCOM/MILSAT, tactical data links, and hidden links). A single database then meshes disparate database models into one unified database engine that can service a broader range of data processing and data retrieval tasks and use cases.
Most database management systems are organized around a single data model (such as relational, document, or graph) that determines how data is organized, stored, and manipulated. Multi-model databases can accommodate key-value, document (such as JSON documents with complex data types like maps and lists), GeoJSON, graph, and object-oriented models. This means a single database can store a wide range of data types, making it suitable for an equally wide range of use cases.
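The models listed above can be made concrete with hypothetical record shapes; every name and field below is invented for illustration, not drawn from any particular database's API:

```python
# Key-value model: an opaque payload addressed by key.
key_value = ("sensor:42", b"raw-signal-bytes")

# Document model: nested maps and lists, as in a JSON document.
document = {
    "track_id": "uav-7",
    "contacts": [
        {"type": "radar", "rssi": -71},
        {"type": "satcom", "rssi": -80},
    ],
}

# GeoJSON model: standard location geometry for location intelligence.
geo = {"type": "Point", "coordinates": [-118.24, 34.05]}

# Graph model: a relationship between two entities.
graph_edge = {"from": "uav-7", "to": "cop-west", "relation": "reports_to"}
```

In a multi-model store, records like these live side by side and can be queried through one engine instead of four separate systems.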
Apply real-time data to real-world context
Processing all this data requires a next-generation common operating database (COD) that can ingest and analyze large datasets, performing historical and streaming data analysis combined with powerful location intelligence. The COD must support data observability to track and monitor the provenance and lineage of data, and must be able to tag data to differentiate corporate data, mission data, and contracts at multiple security levels (unclassified, classified, confidential, and top secret). The COD is a shared-nothing, multi-threaded, multi-model data platform designed to run efficiently on clusters of server nodes, leveraging the latest hardware and networking technology to aggregate petabytes of data while sustaining sub-millisecond performance.
The COD should allow the mission, not the architecture, to dictate the deployment form factor (Kubernetes, virtualization, bare metal, cloud). Processing data at the point of need delivers data-driven insights that enable decisions at competitive speed.
Edge computing, along with 5G and streaming IoT data, has created opportunities for more efficient decision-making at the point of event. Regardless of physical location, milliseconds matter at the edge. By collecting and processing data closer to the network edge, less data needs to travel on the network, less data is at risk, and less network bandwidth is required. Data is continuously ingested and can be analyzed at real-time speeds, speeding decision-making that impacts mission outcomes and ultimately providing an advantage.
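As a minimal sketch of that bandwidth saving (the function and field names are illustrative, not a real edge framework), an edge node can collapse a raw sensor window into a compact summary before anything travels on the network:

```python
from statistics import mean

def edge_summarize(window: list) -> dict:
    """Reduce a raw reading window to a compact summary for transmission."""
    return {
        "count": len(window),      # how many readings were collapsed
        "mean": mean(window),      # central tendency of the window
        "peak": max(window),       # worst-case reading, often what matters
    }

# Illustrative input: one minute of 10 Hz sensor readings.
raw = [0.1 * i for i in range(600)]
summary = edge_summarize(raw)  # three numbers cross the link, not 600
```

Shipping the summary instead of the raw window means less data in transit, less data at risk, and less bandwidth consumed, while the full-resolution data can still be retained locally for later forensic analysis.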
Modern data architectures are complex in themselves, but their result is simple: seamless decision-making that leverages real-time data to improve military strategy. Bottom line: when fast enough isn't enough, real-time data processing delivers in the critical milliseconds.
Mr. Cuong Nguyen is Vice President of Public Sector at Aerospike.