decades typically requires using remote servers and computational resources, which may not be available to the average researcher. In addition, researchers and scientists may not always have a deep background in information technology (IT) and can find it challenging to work with advanced IT systems. Breaking down these barriers and allowing subject matter experts to harness the power of advanced technologies will be key to unlocking large-scale scientific collaboration. Agencies can take several steps to address these issues. For example, edge computing, which pairs deployable algorithms and powerful, distributed graphics processing units (GPUs) with advanced compression and pruning techniques that streamline models for greater efficiency, can limit the need to backhaul Earth observation satellite data through the internet into cloud-based repositories. “For AI to reach its full potential, processing needs to occur where the data is being collected. However, most light form factor hardware like drones can’t run standard AI models that tend to be large and power intensive,” shared Jags Kandasamy, chief executive officer and co-founder of Latent AI. “By compressing models while simultaneously improving inference, processing can take place on the edge and deliver real-time reliable results and extended mission capabilities.” Performing data processing and analytics at the edge, whether in space or at ground stations, and combining it with federated learning approaches reduces latency and unlocks options for the critical decision-making needed at the speed of relevance, such as in response to disaster threats and recovery.
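As a rough, hedged illustration of the model compression and pruning idea above (a minimal sketch, not any vendor's or agency's actual pipeline), the following Python snippet assumes PyTorch is available and applies magnitude-based pruning plus dynamic int8 quantization to a toy model; the architecture and sparsity level are arbitrary placeholders.

```python
# Minimal sketch: shrink a small PyTorch model for edge inference.
# The architecture, sparsity level, and quantization choices are
# illustrative placeholders, not a specific vendor's pipeline.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy classifier standing in for an Earth-observation model.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# 1. Magnitude-based pruning: zero out the 50% smallest weights
#    in each Linear layer, then make the pruning permanent.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")

# 2. Dynamic quantization: store Linear weights as int8 to cut
#    memory footprint and speed up CPU inference at the edge.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# 3. Run inference on a dummy input, as a deployed edge node would.
sample = torch.randn(1, 64)
with torch.no_grad():
    print(quantized(sample))
```

The trade-off to validate in practice is accuracy versus footprint: pruning and quantization shrink the model and speed up inference on constrained hardware, but retained accuracy must be checked against mission requirements.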
Additionally, open-source data platforms can be made available and leveraged by scientists and citizen scientists alike. These platforms provide access to various data sources that are translated into more universally recognized formats on authoritative data repositories. AI can also be used to conduct topical analyses that surface novel data associations for intuitive search and discoverability. Ultimately, the pace of scientific discovery and its application to the climate mission hinge on how well collaboration can be fostered and how deeply the relevance of scientific products can be measured. This is no easy feat, because scientific information and research reside in large-scale, unstructured data, such as journals and web repositories. Applying AI to that data can identify patterns within scientific information, ranging from large-scale analysis of citations and mining of web and social data to identify interrelationships, to trends around the usage and relevance of scientific analyses. Generative AI systems, for instance, can be trained on a vast corpus of scientific climate journals and academic papers to aid with summarization and intuitive queries.
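As a hedged illustration of the summarization idea above, and not a description of any agency's production system, the sketch below uses the open-source Hugging Face transformers library; the model name and sample abstract are assumptions chosen for the example.

```python
# Minimal sketch: summarize a climate-science abstract with an
# off-the-shelf transformer. The model choice and input text are
# illustrative assumptions, not an agency's production system.
from transformers import pipeline

# Load a general-purpose summarization model (assumed downloadable
# from the Hugging Face hub or cached locally).
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

abstract = (
    "Observed warming over recent decades has intensified extreme "
    "precipitation events in many regions, and downscaled climate "
    "projections suggest further increases in both frequency and "
    "intensity under high-emission scenarios, with implications for "
    "urban drainage design and flood-risk planning."
)

summary = summarizer(abstract, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```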
DELIVERING AI-READY DATA
A project within a space agency gathers Earth-observation data and parameters and serves them to the public through several free, easy-to-access, easy-to-use methods. It helps communities become resilient amid observed climate variability by improving data accessibility, aiding research in renewable energy development and building energy efficiency, and supporting agriculture projects. All parameters and outputs are provided in commonly used community formats, naming conventions, and units. The project also provides thoroughly documented application programming interfaces (APIs), a catalog of geospatially enabled analysis-ready data (ARD), and the capability to explore data visually via geospatial services, serving over 100,000 unique users with more than 50 million data requests in a year.
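To make the sidebar concrete, the sketch below shows roughly what a request to such a documented geospatial API could look like; the endpoint URL, parameter names, and response structure are hypothetical placeholders, not the project's actual interface.

```python
# Hypothetical sketch of querying a documented Earth-observation API
# for point-based parameters. The endpoint and parameter names are
# placeholders, not the actual project's interface.
import requests

BASE_URL = "https://example-agency.gov/api/v1/point"  # placeholder endpoint

params = {
    "parameters": "T2M,PRECTOT",   # e.g., 2 m temperature, precipitation
    "latitude": 38.90,
    "longitude": -77.04,
    "start": "20200101",
    "end": "20201231",
    "format": "JSON",
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
data = response.json()

# Downstream analysis would read the returned time series, e.g.:
# daily_temps = data["properties"]["parameter"]["T2M"]
print(list(data)[:5])  # inspect top-level keys of the (hypothetical) payload
```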
Additionally, natural language processing (NLP), a subfield of AI, can help with the extraction of specific patterns for decision-making around key issues. For example, an AI solution deployed for the Department of Defense (DoD) searches massive troves of context-rich text to uncover critical policy documents that discuss the policies of interest and to highlight groups tackling similar problems. This solution is changing the policy game for DoD and is open sourced for broader access and benefit. This is the kind of democratization of AI that will provide pathways to improve scientific collaboration and empower the shift to open data and open science paradigms, a first step toward the broader operationalization we envision.
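A scaled-down analogue of that kind of document search, assuming scikit-learn and a tiny in-memory corpus rather than the open-sourced DoD solution itself, might look like the following sketch.

```python
# Minimal sketch: rank policy documents against a query with TF-IDF
# and cosine similarity. The tiny in-memory corpus is illustrative;
# a real system would index far larger, context-rich document troves.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Directive on coastal infrastructure resilience and flood mitigation.",
    "Policy memo on renewable energy procurement for federal facilities.",
    "Guidance for wildfire risk reduction and emergency response planning.",
]

query = "flood resilience planning for coastal cities"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, doc_matrix).ravel()
for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.2f}  {doc}")
```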
Climate science deals with information presented across large time frames (decades and centuries) and spatial scales (multi-kilometer grid boxes), which can make its relevance difficult to grasp. Resilience planners and disaster response personnel may labor to translate climate information from computer models into actionable planning for climate resilience or to deploy it for disaster prediction, mitigation, and recovery. For example, if climate models predict that a city will see a 50% increase in annual precipitation by 2100, what changes should be made to allow the city to become more resilient to this increase? How does this environmental threat interrelate with economic, societal, and technological conditions? As a result of this difficulty in translating information into action, there is limited infusion of valuable scientific information and products into daily operations, which further hinders crucial long-term planning and management of localities. Agencies can take steps to address this disconnect, including using advanced technologies to generate analyses and outputs that are relevant to specific industries or communities. For example, AI, specifically predictive analytics, is increasingly being used to identify demand-supply equations and characterize trends related to natural resources, such as water, food, energy, and critical minerals. AI could also be used to report livability metrics for cities now and to predict them in the future by integrating data sources such as local weather patterns, documented hazard sites, locations of critical infrastructure, city walkability, and reliability of public transport.
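As a hedged sketch of the livability-metric idea, the snippet below combines a few normalized indicators into a weighted composite score using pandas; the city names, indicator values, and weights are illustrative placeholders, not real measurements or an agency methodology.

```python
# Minimal sketch: a weighted composite "livability" score built from
# normalized indicators. City names, indicator values, and weights are
# illustrative placeholders, not real measurements or an agency method.
import pandas as pd

indicators = pd.DataFrame(
    {
        "walkability": [0.62, 0.81, 0.45],
        "transit_reliability": [0.70, 0.88, 0.52],
        "hazard_exposure": [0.30, 0.15, 0.55],  # higher = more exposed
    },
    index=["City A", "City B", "City C"],
)

# Invert hazard exposure so that higher always means "more livable".
scored = indicators.copy()
scored["hazard_exposure"] = 1.0 - scored["hazard_exposure"]

# Min-max normalize each indicator, then apply illustrative weights.
normalized = (scored - scored.min()) / (scored.max() - scored.min())
weights = {"walkability": 0.4, "transit_reliability": 0.4, "hazard_exposure": 0.2}
livability = sum(normalized[col] * w for col, w in weights.items())

print(livability.sort_values(ascending=False))
```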
[Figure 1 (diagram): interconnected dynamic conditions, comprising Environment (extreme weather events and degradation of natural ecosystems), Society (vulnerable populations, health, interest groups/alliances), Government (legislative/regulatory changes, energy independence, national security), Economics (damage to properties and infrastructure, economic growth opportunities), and Technology (application of advanced technologies, innovation ecosystems, and partnerships), connect to scientific information and products across three stages: Understanding Dynamics and Conditions, Operationalizing the Science, and Making Climate Resilience a Reality. Associated outcomes include improving understanding, scientific integrity and transparency, better access and discoverability, improved scale and relevance, infusion of scientific products into daily lives, efficient risk management and collaboration, and efficient, integrated outcomes.]

Figure 1: Climate resilience requires an understanding of interconnected dynamic conditions along with broader application and access to information.