particular set of satellites. The primary LLM identifies the satellites and, equipped with historical data, knows that the last 40 times that constellation passed through that orbit in that configuration, the satellites moved slightly closer to a U.S. Space Command satellite. Because the primary LLM is networked with the other models, it can query those LLMs about the behavior as well. As a result, the primary LLM can inform the operator that this constellation has recently been flagged for moving out of its orbit and that its speed is increasing just enough to create a devastating conjunction. And thanks to information from the LLM trained on adversarial threats, the primary LLM can also queue up details for the operator, identifying tactics that could be at play.

Going one step further, we can imagine that one of the LLMs is trained in the physics of the problem. The primary LLM could use this capability to extrapolate possible scenarios and courses of action, perhaps recommending a maneuver that uses less fuel or is less disruptive to the orbits of other satellites. The human remains in charge and is empowered to make a more confident decision, faster.

Knowledge gained from this encounter feeds into the primary LLM's learning process. And because it is linked, nested with the other LLMs to provide hierarchical, context-based learning, it gains knowledge from them at a massive rate. Its information is continuously updated as the nested LLMs are trained and validated on the latest feeds in their vast databases. Every event helps the system become steadily more intelligent, creating an ever more perceptive space domain awareness capability.

Adding Classified Data—and What It Takes to Do It

The crowding of space demands higher accuracy in tracking and predicting movement on orbit, which in turn requires data from all sources, especially classified data that has traditionally been difficult to share.
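The specialist-query pattern described in the scenario above can be sketched in a few lines of plain Python. This is a minimal, illustrative sketch only: the class names, the canned responses, and the "constellation-42" identifier are hypothetical stand-ins, and a real system would route these queries to live model endpoints rather than lookup tables.

```python
# Illustrative sketch of a primary LLM fanning a question out to
# specialist LLMs and assembling one report for the operator.
# All names and responses are hypothetical examples.

class SpecialistModel:
    """A specialist model that answers queries about one data domain."""
    def __init__(self, domain, responses):
        self.domain = domain
        self._responses = responses  # canned answers keyed by topic

    def query(self, topic):
        return self._responses.get(topic)

class PrimaryModel:
    """The mission-focused model: queries each specialist and
    merges their findings into a single report."""
    def __init__(self, specialists):
        self.specialists = specialists

    def assess(self, topic):
        findings = {}
        for specialist in self.specialists:
            answer = specialist.query(topic)
            if answer:
                findings[specialist.domain] = answer
        return findings

threat_model = SpecialistModel(
    "adversarial-threats",
    {"constellation-42": "Flagged for out-of-orbit maneuvers; speed increasing."})
physics_model = SpecialistModel(
    "orbital-physics",
    {"constellation-42": "Lower-fuel avoidance maneuver available."})

primary = PrimaryModel([threat_model, physics_model])
report = primary.assess("constellation-42")
for domain, finding in sorted(report.items()):
    print(f"{domain}: {finding}")
```

The design point is the fan-out: the primary model does not need to hold every domain's knowledge itself, only to know which specialist to ask.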
A critical aspect of the linked approach is that networking LLMs has been demonstrated in a secure enclave using both classified and unclassified data. The innovation can be brought to space organizations by teams with a development environment built on open architectures; infrastructure that leverages government-owned technologies; zero trust architecture; and experience providing flexible modernization for government missions.

For example, networking LLMs requires the same granular security policies as military initiatives such as Joint All-Domain Command and Control (JADC2). Automated DataOps ensures the onboarding of diverse feeds, standardizing formats and enforcing data standards and policies while providing granular security. Common tools and interoperable technologies simplify development. And cross-domain solutions enable automated workflows with modular elements that can be adapted to mission demands.
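As one illustration of what automated onboarding with granular security can look like, here is a small Python sketch: heterogeneous feed records are normalized onto a common schema, and a per-record classification label gates what is releasable at a given clearance. The schema fields, the sample records, and the three-level classification ordering are assumptions made for the example, not any actual system's interface.

```python
# Sketch of automated DataOps onboarding: normalize records from diverse
# feeds into one schema and enforce per-record security labels.
# Field names and classification levels are illustrative assumptions.

CLEARANCE_ORDER = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

def normalize(record):
    """Map a raw feed record onto a common schema."""
    return {
        "object_id": str(record.get("id") or record.get("norad_id")),
        "source": record.get("source", "unknown"),
        "classification": record.get("classification", "UNCLASSIFIED"),
    }

def releasable(record, clearance):
    """Granular policy check: release only records at or below clearance."""
    return CLEARANCE_ORDER[record["classification"]] <= CLEARANCE_ORDER[clearance]

raw_feeds = [
    {"id": 25544, "source": "radar", "classification": "UNCLASSIFIED"},
    {"norad_id": 43013, "source": "intel", "classification": "SECRET"},
]

onboarded = [normalize(r) for r in raw_feeds]
visible = [r for r in onboarded if releasable(r, "UNCLASSIFIED")]
print(f"{len(visible)} of {len(onboarded)} records releasable at UNCLASSIFIED")
```

Standardizing first and filtering second keeps the security policy in one place instead of scattered across feed-specific parsers.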
Some of the most critical datasets are classified, requiring laborious manual processes to share across domains. Although technical cross-domain solutions exist, they often can't keep pace with new file types and data structures, and they lack the resiliency to keep operating under stress or adversarial attack.

Modeling threat scenarios requires vast amounts of data to train algorithms. Because space defense is a relatively new area, data is scarce and, in some cases, doesn't yet exist. Generating synthetic data is therefore a necessary step, with attendant responsibilities such as ensuring algorithms are free of bias.

Tracking the Future: The BRAVO Hackathon

LLMs—deep learning algorithms that generate content and perform other complex functions using very large datasets—exploded in popularity following the launch of OpenAI's ChatGPT in the fall of 2022. While most users have been experimenting with creating poetry, writing essays, or paraphrasing information, Booz Allen has been exploring new uses for LLMs across space-domain applications. The capability to network LLMs was demonstrated in spring 2023 at the Air Force's BRAVO hackathon, a multi-classification event drawing over 400 experts
to compete in prototyping data solutions for pressing problems. The award for best data visualization and the award for best user interface went to a team that linked two LLMs in a classified environment using zero trust protocols.

The hackathon gave the team a chance to give ensemble modeling—a process for combining multiple diverse algorithms to arrive at an outcome—new power by networking LLMs rather than individual algorithms. This opened a new path to generating the fast, comprehensive answers required to conduct space operations with speed and accuracy. It also provided two-way communication between specialized LLMs to amplify space operators' awareness.

After rapidly deploying a user interface, the team deployed two LLMs and wrote an app that allowed them to talk with each other (see Figure 1). The first LLM was trained on radar sensor data, while the other was trained on Earth observation (EO) imagery. The team executed a scenario in which a team member, acting as operator, asked the first LLM to watch a certain area in Asia and send an alert if anything of interest was found. No special codes were needed; the operator simply typed the request as if texting a colleague. In practice, the request could have been activated another way, according to operator preference; for example, via voice recognition.

The first LLM, designated as moderator, located a radar image and asked the second model if it had any data. The second LLM, trained on EO imagery, responded that it did and sent the image along. The first LLM then delivered both images to the operator along with a message saying, essentially, "I found a radar image at that location and retrieved an EO image at that same location." The process was simple and streamlined for the human partner.

"Software building conferences like BRAVO allowed us to push, and sometimes stumble, on some interesting solutions," said Collin Paran, the AI solutions architect who led the Booz Allen team.
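The exchange the team demonstrated can be approximated as a simple message-routing sketch: the moderator model finds its own radar data, polls its peer for more, and returns one combined reply to the operator. Everything below, including function names, message fields, and responses, is a hypothetical reconstruction of the pattern, not the team's actual code.

```python
# Sketch of the BRAVO-style exchange: an app routes messages between the
# operator, a radar-trained model (moderator), and an EO-trained model.
# All message shapes and handler logic are illustrative assumptions.

def eo_llm(message):
    """Peer model trained on Earth observation imagery."""
    return {"type": "eo", "location": message["location"]}

def radar_llm(message, peers):
    """Moderator model: finds radar data, then polls peers for more."""
    radar_hit = {"type": "radar", "location": message["location"]}
    extras = [peer({"location": message["location"]}) for peer in peers]
    found = [radar_hit] + [e for e in extras if e]
    return {
        "to": "operator",
        "text": f"I found {len(found)} images at {message['location']}",
        "attachments": found,
    }

# Operator's plain-text request, reduced here to a location field
reply = radar_llm({"location": "area-of-interest"}, peers=[eo_llm])
print(reply["text"])
```

The moderator role matters: the operator converses with one model, and that model takes responsibility for fanning the request out and merging the answers.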
“Linked, multimodal, and networked AI with two-way communication will certainly unlock more insights for different organizations.”

Why Networking LLMs Dramatically Increases Space Awareness

The synergy of LLMs working together and delivering ever more detailed, insightful results is what makes this approach significant. Say you have a mission-focused LLM trained on Space Command's catalog of objects in orbit, the Unified Data Library. Imagine you network it with an LLM trained in avoiding collisions, known as conjunctions; one trained on radar data; and another trained on a military intelligence database of adversarial threats. You've conducted skilled training and testing, and you've been using the system for increasingly critical tasks. Now the system is deployed on a mission where the operator wants to understand the behavior of a
The complexities can be compared to a game of chess—a game computers have become famously good at—played in multiple dimensions, with decisions made at split-second speed. Among the factors:

Objects need to be tracked in multiple orbits. Most commercial satellites, human space missions, and the International Space Station are in low Earth orbit (LEO), a regime that extends to about 2,000 km. Higher orbits, such as medium Earth orbit (MEO) and geostationary orbit (GEO), host navigation, weather, communications, and national security satellites. NASA's Artemis program and robotic adjuncts from multiple nations generate more traffic between Earth and the Moon, and operating safely in this region of cislunar space requires new technologies and tactics for object detection, forecasting, and collision avoidance.

Data pours in from multiple sensors and multiple sources. The result is a profusion of siloed datasets in multiple formats that must be ingested and processed, with granular security applied. Although the growing number and diversity of data sources improve our ability to perform space domain awareness, they also introduce a data fusion challenge: multiple data sources with different formats and reference frames must be integrated in real time.
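The multi-orbit bookkeeping described in the first factor can be illustrated with a small helper that buckets objects into regimes by altitude. The LEO boundary (about 2,000 km) matches the figure in the text, and geostationary altitude is approximately 35,786 km; the tolerance band used to identify near-GEO objects is an assumption made for the example.

```python
# Illustrative classifier for orbital regimes by altitude above Earth.
# LEO boundary (~2,000 km) is from the text; geostationary altitude is
# ~35,786 km; the +/-100 km GEO band is an assumption for this sketch.

def orbit_regime(altitude_km):
    if altitude_km <= 2_000:
        return "LEO"
    if abs(altitude_km - 35_786) <= 100:   # near geostationary altitude
        return "GEO"
    if altitude_km < 35_786:
        return "MEO"
    return "beyond GEO / cislunar"

# ISS, GPS, a geostationary satellite, and lunar distance, respectively
for alt in (400, 20_200, 35_786, 384_400):
    print(alt, orbit_regime(alt))
```

Real tracking systems reason over full orbital elements and reference frames rather than a single altitude number; this sketch only shows why regime boundaries matter for routing objects to the right models and sensors.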
Figure 1: Networking LLMs demonstrated at the BRAVO hackathon. Two specially trained models shared information and seamlessly delivered a report to the operator.
Operator to LLM #1: “Monitor this area on the ground and alert me if you find anything.”
LLM #1 to LLM #2: “LLM #2, do you have anything else at this location?”
LLM #2 to LLM #1: “I have an image of an object at that location.”
LLM #1 to operator: “I found a radar image at this location... and I also retrieved an Earth Observation image of the same location.”
VELOCITY | © 2023 BOOZ ALLEN HAMILTON