Introduction

Disasters are sudden, tumultuous events that disrupt communities and cause human, material, economic, and ecological losses, as well as exacerbate preexisting sociopolitical problems. When invoked, the word “disaster” conjures up images of the 2010 Haitian earthquake or the 2004 Indian Ocean tsunami, each with more than 200,000 casualties and bleak landscapes of devastated communities, infrastructure, and ecology. Disasters are varied, caused by natural or man-made hazards and diverse in the degree and type of their impact, yet all are united by overwhelming destruction, which the United Nations International Strategy for Disaster Reduction (UNISDR) defines as “exceeding the ability of the affected community or society to cope using its own resources” [1]. Integral to the practice of global public health is an acknowledgement of the inevitability of such events, of their potential impact on communities and infrastructure, and of the means by which to mitigate, prepare for, respond to, and recover from them.

Disaster management, disaster medicine, and the multiple technologies that have evolved to contribute to these efforts constitute a vast body of knowledge that cannot be thoroughly discussed within this chapter. However, beginning to gain an understanding of disaster management and the technologies that are being employed throughout its cycle is integral to the discussion of technology within global health.

During the past two decades, disaster response has grown from isolated post hoc reactions to hazards into a broad field that recognizes a common cyclical pattern in disaster timelines. This cycle is disaggregated into four stages: preparedness, response, recovery, and mitigation. Although awareness of all phases is necessary during programmatic design and impact evaluation, each phase requires unique perspectives, data, initiatives, and, therefore, technology.

In its nascence, the field of disaster response technologies focused on tactical and medical objects such as tourniquets, radio equipment, and extrication devices. With a broader scope, a more sophisticated understanding of disasters, and the emergence of powerful yet readily available technology, disaster management science can now leverage computational sciences, mobile communication devices, and the internet for a more holistic, coordinated, and useful response. At the leading edge of disaster management technology is an emphasis on data collection, analysis, and dissemination. Tools such as geoanalytics, modeling and optimization, simulation, information and communication technology (ICT), and crowdsourcing have been developed to create faster real-time information streams, bolster the agency of affected communities, produce more sophisticated and useful initiatives, and improve disaster management coordination.

Disaster Preparedness

Early integration of geospatial information into disaster preparedness came in the form of simple maps depicting background reference information, including landforms, political boundaries, and roads. These “basemaps” were later superimposed with post-event hazard data to understand the degree of impact, as exemplified by the 1906 San Francisco fire map. Basemaps continue to be invaluable in disaster management, and particularly in disaster preparedness and mitigation, as they are canvases for the assessment of vulnerabilities and for response planning.

Mapping “vulnerability,” that is, the complex sociopolitical and human–environment interactions that place communities at risk from a hazard, confers true insight into potential impact and anticipated post-disaster needs [2]. Historically, vulnerability mapping included basic demographic (e.g., percentage of the population under five years of age, gender), economic, and structural considerations. However, the concept of vulnerability is evolving. Since Hurricane Katrina, for example, American public health experts have acknowledged the impact of chronic health conditions, such as diabetes, asthma, and physical disability, on post-disaster needs, morbidity, and mortality. Recommendations now suggest that disaster preparedness organizations maintain population basemaps that include chronic health and disability data [3].

However, in many developing countries there is a scarcity of accurate, up-to-date spatial data regarding the population, let alone critical facilities, healthcare infrastructure, or population vulnerabilities. Remote sensing, utilizing satellite or other aerial imagery platforms, is now crucial in creating and augmenting basemaps by combining layers of infrastructure, hydrography, elevation, and forest cover into an orthoimage file. These high-resolution aerial images combine the visual attributes of photographs with the spatial accuracy of maps, such that feature displacement and scale variations caused by terrain and technology are rectified to create accurate disaster planning basemaps.

Along with hazard and response modeling, as is discussed in the next section, geoanalytics allows a more sophisticated understanding of geographic susceptibility, and community and infrastructural vulnerability, that is critical to developing safe access routes, adequate resource inventories, and other disaster response algorithms.

Modeling and Simulation in Disaster Preparedness

Disasters originate from diverse geophysical, climatological, technological, biological, and sociopolitical variables that impact populations not only through the magnitude of the disaster itself, but also secondarily through the infrastructural foundation and inherent vulnerabilities of the affected population. Effective preparedness requires an accurate understanding of both the potential hazard and the preexisting demographic, infrastructural, sociopolitical, economic, and medical characteristics of the communities it may affect, in order to (1) amass appropriate food and water resources, medical inventories, volunteer and non-volunteer response personnel, and equipment, and (2) create warning, evacuation, search and rescue, and other response and recovery plans.

Geographic Information System (GIS) modeling and simulation is a valuable tool for emergency preparedness and management. Such modeling can simulate emergency scenarios in various situations with temporal attributes, and it can aid emergency managers in making critical decisions by using spatial analysis for (1) risk and threat assessment, (2) scenario modeling, (3) situational awareness, and (4) resource allocation.

Hazard modeling imports multivariate geospatial, climatological, meteorological, temporal, and infrastructural data into a computational model or simulation, superimposed upon a GIS platform, to determine the likelihood of disaster impact. Specifically, modeling can be used to predict flood inundation, coastal surge heights, seismic vulnerability, landslide susceptibility, smoke plume distribution, and infectious disease transmission. Vulnerability, that is, the capacity to anticipate, cope with, resist, and recover from a hazard, can also be mapped [4]. When hazard predictions and vulnerability demographics are simulated together, risk can be estimated. Quantitative methods, including complex analytical and measurement methods, causal inference, structural equation modeling, and decision theory, are commonly used in public health research to create these predictions [2].
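The core of this hazard-plus-vulnerability logic can be illustrated with a toy grid calculation. The sketch below, in which all grid values and the multiplicative risk formula are illustrative simplifications rather than any operational model, combines a per-cell hazard likelihood with a per-cell vulnerability index to rank preparedness priorities:

```python
# Minimal sketch of grid-based risk estimation: risk = hazard x vulnerability.
# All values here are invented for illustration; real models use far richer inputs.

def estimate_risk(hazard_grid, vulnerability_grid):
    """Combine per-cell hazard probability (0-1) with a vulnerability
    index (0-1) into a simple multiplicative risk score per cell."""
    return [
        [h * v for h, v in zip(h_row, v_row)]
        for h_row, v_row in zip(hazard_grid, vulnerability_grid)
    ]

# Hypothetical 2x3 grids: flood likelihood and population vulnerability.
hazard = [[0.9, 0.4, 0.1],
          [0.7, 0.2, 0.0]]
vulnerability = [[0.5, 0.8, 0.3],
                 [0.9, 0.1, 0.6]]

risk = estimate_risk(hazard, vulnerability)

# The highest-scoring cell is the first priority for preparedness planning.
hotspot = max((score, (i, j))
              for i, row in enumerate(risk)
              for j, score in enumerate(row))
```

Here the cell with moderate hazard but very high vulnerability outranks the cell with the highest raw hazard, which is exactly the insight that motivates mapping vulnerability alongside hazard likelihood.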

One example of hazard modeling is the U.S. SLOSH (Sea, Lake and Overland Surges from Hurricanes) program, which is used not only to forecast weather but also to determine which communities are susceptible to surge flooding and to inform disaster preparedness programs. Systems such as Sahana Asia have been developed to manage disaster data that support early disaster warning and mitigation activities in advance of an event [5]. Interdisciplinary efforts, as evidenced by the U.N. Development Program’s Disaster Hotspots project, now combine hazard modeling and operations research to provide decision support for evacuation routing, resource allocation, medical response infrastructure, and supply chains in response to earthquakes, floods, wildfires, and landslides.

Virtual Reality Simulation

The chaotic nature of disaster response requires sufficient preparation and familiarity with the sequence of events, communication, and skills to respond effectively and efficiently. New technologies, programs, and applications are making high-fidelity simulation within disaster preparedness more accurate, and hence more reliable for training.

Virtual reality (VR) simulation incorporates real event elements (large crowds, infrastructure, visual/auditory cues) into training scenarios to prepare personnel to respond within the high acuity, chaotic, and stressful environment of a disaster or public health emergency. Mass casualty triage training can incorporate VR-based scenarios that respond to user inputs and provide instant feedback through the use of avatars. A recent study comparing triage performances during disaster simulation exercises using VR and standardized patient drills evidenced the efficacy and feasibility of VR as an alternative for training in disaster triage [6]. Furthermore, VR-based training can be cost-effective when compared to more traditional training models if one considers the number of learners that it can potentially reach, the range of applications, and the ease of repeated use [7].

Programs such as SimulationDeck, a secure web portal that replicates online communication tools used during an acute disaster, allow professionals to simulate mass information campaigns as they would after a disaster, to evaluate their efficacy, and to address challenges before the true event. Because of the extensive use of social media for communication and emergency management, SimulationDeck is becoming more important for organizations to emulate the use of the medium for responder training. Social networks such as Facebook and Twitter, as well as agency websites and blogs, are simulated to train responders in disseminating accurate and coordinated public information during an emergency [8].

Disaster Response and Recovery

During the response phase of a disaster, geoanalytics and remote sensing are increasingly critical in impact assessment and response support. It is not just formal or planned systems that provide this information: the rapidly increasing penetration of mobile technology and internet access puts social media such as Twitter and Facebook among the earliest primary sources of information following a disaster. Capitalizing on this flood of information arriving directly from the ground and quickly curating it into actionable and reliable information is a key function in disaster response. Crowdsourcing has emerged as one mechanism to perform this and can be defined as an online-distributed, problem-solving production model that leverages the collective intelligence of online communities for specific purposes [9].
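The curation step described above can be sketched as a simple filtering pipeline. In this illustrative example, the keyword list, field names, and two-author corroboration rule are all invented assumptions standing in for the far more sophisticated filtering that real crowdsourcing platforms perform:

```python
# Illustrative sketch of curating a raw social media stream into actionable
# reports: keep geotagged posts matching hazard keywords, and require
# corroboration from multiple distinct authors before acting on a location.
from collections import defaultdict

KEYWORDS = {"flood", "collapse", "trapped", "fire"}  # assumed watchlist

def curate(posts, min_reports=2):
    """posts: dicts with 'author', 'text', and 'geotag' (a cell id or None)."""
    by_cell = defaultdict(set)
    for p in posts:
        if p["geotag"] is None:
            continue  # posts that cannot be located cannot be mapped
        if KEYWORDS & set(p["text"].lower().split()):
            by_cell[p["geotag"]].add(p["author"])  # count distinct authors only
    # Only locations reported by several independent users are actionable.
    return {cell for cell, authors in by_cell.items()
            if len(authors) >= min_reports}

posts = [
    {"author": "a", "text": "bridge collapse near the market", "geotag": "cell_7"},
    {"author": "b", "text": "collapse confirmed people trapped", "geotag": "cell_7"},
    {"author": "c", "text": "beautiful sunset tonight", "geotag": "cell_2"},
    {"author": "d", "text": "flood rising fast", "geotag": None},
]
actionable = curate(posts)
```

The deliberate trade-off is visible even at this scale: the ungeotagged flood report is real but unusable, while the corroborated collapse report becomes actionable intelligence.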

Understanding spatial relationships through geoanalytics enables public health and disaster management personnel to assess risk as a factor of distance, create flow-through networks for response programs, better disseminate needs assessment data, coordinate multi-team, cross-departmental response initiatives, and perform impact analysis. Geoanalytics contributes significantly to each of the phases of disaster management.

A GIS comprises the hardware, software, and management systems that gather, store, manipulate, and display geographically referenced data to create an accessible visual database. “Geoanalytics,” or “geospatial analysis,” is a computational science that applies inferential statistics to geospatially tagged data to reveal patterns and relationships that would otherwise go unobserved if spatial characteristics were not appreciated. Both tools emphasize the “where” variable of any question, but geospatial analytics goes beyond simple mapping, generating spatial hypotheses and making valid scientific inferences. In the context of disaster management, geospatial methodologies transcend the individual phases, serving both to understand the pre-disaster state of a community and to assess the evolving impact of the hazard.

One of the first major events in which remote sensing imagery was utilized for day-of-response efforts occurred on September 11, 2001, after the attacks on the World Trade Center in New York City. Satellite and aerial imagery, LIDAR (light detection and ranging) imagery, and data from handheld electronic devices that combined Global Positioning System (GPS), internet, and GIS databases were combined to support search-and-rescue efforts, response planning, and information dissemination [10]. After the 2004 Indian Ocean tsunami, the U.N. Humanitarian Information Center utilized ArcGIS to augment incomplete basemaps with thematic layers characterizing infrastructural damage, displaced populations, and response organization distribution in an effort to improve supply chains, monitor for communicable diseases, determine field hospital and mobile resource locations, and inform disaster recovery planning [11]. Within 5 days of the 2010 Haiti earthquake, more than 15 organizations from eight countries had uploaded satellite data to the freely accessible United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER). The Haitian Earthquake Data Portal was launched and began publishing GIS data within 48 h, and more than 3 million Haitian cell phone users were helping to rebuild maps and contributing to needs assessment efforts.

Mobile Technology and Phones

Information and communications technology (ICT) has been identified by the United Nations as useful in every phase of disaster management. ICT includes traditional media, such as radio and television, and new media, such as cellular, internet, and satellite communications [12]. Traditional media represented a significant improvement in the speed of information dissemination compared with their print predecessors, but new media are rapidly displacing all prior forms of mass media as the preferred source of current news. Mobile technology, internet access, and the rise of social media are changing the way information in disasters is gathered, analyzed, and disseminated [13].

Global mobile phone penetration was 67% in 2019 with 5.2 billion subscribers, and this is expected to rise to 70% with 5.8 billion subscribers in 2025 [14]. The gap in internet access between developing and advanced countries has been closing rapidly over the past decade. In 2015, the global median of internet access was 67%, but this is composed of medians of 54% in developing countries and 87% in advanced countries. Smartphone ownership in developing countries is rising extremely rapidly, nearly doubling from 2013 to 2015 to 37%. This growth is fueling the increased use of social media in these regions [15].

However, mobile devices are no longer only used for communication. Given their growing affordability and accessibility, cell phones, tablets, and laptops are being used across the world to (1) collect and share data, (2) implement programs, (3) aid in the provision of medications and healthcare, and (4) assist in acute disaster settings. Applications such as stock management, disease surveillance, identification and tracking of patients, awareness campaigns, and their use as medical devices have transformed health delivery and evaluation [16].

Mobile data collection systems allow for the customization of surveys to collect specific data with photographs, information from a list selection, voice recordings, and GPS coordinates. Free and open tools such as Kobo Toolbox allow the user to build forms and collect, analyze, and manage data. Because of its offline capabilities, Kobo Toolbox has been adopted as the preferred electronic data collection tool by the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) to improve standardization, reliability, and ease of access during humanitarian crises [17].
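The essential pattern behind such tools, a typed form definition, local validation, and offline JSON storage with an attached GPS fix, can be sketched in a few lines. This is a hypothetical illustration in the general style of mobile data collection; the field names, form schema, and validation rules below are invented and do not reflect Kobo Toolbox's actual internals:

```python
# Hedged sketch of a mobile survey record: a form definition with typed
# fields, simple validation, and offline storage as JSON. All field names
# and options are illustrative assumptions.
import json

FORM = {
    "shelter_damage": {"type": "choice", "options": ["none", "partial", "destroyed"]},
    "household_size": {"type": "int"},
    "notes": {"type": "text"},
}

def validate(record):
    """Check a collected record against the form definition."""
    for field, spec in FORM.items():
        value = record.get(field)
        if spec["type"] == "choice" and value not in spec["options"]:
            raise ValueError(f"{field}: invalid choice {value!r}")
        if spec["type"] == "int" and not isinstance(value, int):
            raise ValueError(f"{field}: expected an integer")
    return record

record = validate({
    "shelter_damage": "partial",
    "household_size": 5,
    "notes": "roof gone, family staying with neighbors",
    "gps": (11.244, 125.004),  # lat/lon fix captured by the device
})
stored = json.dumps(record)  # queued locally, synced when connectivity returns
```

Validating at the point of collection, before any network is available, is what makes offline-first tools reliable in the field: malformed records are caught by the enumerator, not weeks later by an analyst.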

Social Media

Persons affected by a disaster now have increased access to social media, which has become a tool for early information gathering and dissemination. Typically, the first reports from an area affected by a disaster come from posts to Twitter or Facebook by the affected population. This information can be overwhelming and conflicting, and verification may be difficult or impossible. Best practices have been developed to guide response planning by mapping social media posts from Twitter, Facebook, and other platforms onto heat maps and interconnected diagrams. Facebook rolled out its “marked safe” feature in 2014 to allow individuals to communicate their status after a disaster, thereby easing some of the pressure on cellular networks during the acute phase.

Realizing the key role of social media in disasters, some public health agencies have integrated a formal Social Media Monitoring Team into their Incident Command System with some success [18]. Larger organizations have begun building out operations in this sector as well, with examples including the American Red Cross’s Digital Operations Center (DigiDOC) and the Federal Emergency Management Agency (FEMA) “Watch Centers.” [19, 20].

Early Warning Systems

The broad penetration of mobile technology, paired with the GIS capability built into smartphones, has opened new avenues for building early warning systems. Systems relying on minimal user input and passive analysis of existing data can be applied to disaster-related functions such as natural hazard detection and infectious disease surveillance.

An example of a minimal user input system is the U.S. Geological Survey (USGS) “Did You Feel It?” earthquake mapping application. This application combines user responses and their location with actual seismic data to provide more detailed earthquake impact maps than were previously available. MyShake is a crowdsourced early earthquake warning program that detects shaking on a user’s smartphone, sends this information to a processing center, and uses algorithms to determine whether an earthquake is under way. Using the GPS information from the phones and the intensity of shaking, the app can then send early warning alerts to an area’s users that provide the precious seconds to minutes needed to reach safety. Furthermore, these data can be routed to public transportation systems and utilities to activate emergency shutdowns before the seismic waves arrive [21].
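The decision logic of such a crowdsourced detector can be reduced to a threshold-plus-quorum rule: each phone triggers locally, and an alert fires only when enough distinct devices in one area trigger within a short window. The sketch below illustrates that rule only; the threshold, quorum, and window values are invented and bear no relation to MyShake's actual algorithms:

```python
# Hedged sketch of crowdsourced shake detection: alert only when a quorum
# of distinct devices in the same geographic cell report above-threshold
# acceleration within a short time window. All constants are illustrative.

SHAKE_THRESHOLD_G = 0.1   # per-device trigger level, in g (assumed)
QUORUM = 3                # independent devices required (assumed)
WINDOW_S = 5.0            # corroboration window, in seconds (assumed)

def should_alert(reports, now):
    """reports: (device_id, peak_accel_g, timestamp) tuples from one cell.
    True if a quorum of distinct devices triggered within the window
    ending at `now`; a single phone being dropped never fires an alert."""
    recent = {dev for dev, accel, t in reports
              if accel >= SHAKE_THRESHOLD_G and now - t <= WINDOW_S}
    return len(recent) >= QUORUM

reports = [("p1", 0.25, 100.0), ("p2", 0.18, 101.5),
           ("p3", 0.05, 102.0), ("p4", 0.31, 103.0)]
alert = should_alert(reports, now=104.0)   # three strong, recent devices
quiet = should_alert(reports, now=120.0)   # same reports, but now stale
```

Requiring a quorum of independent phones is what separates an earthquake from one phone falling off a table, and the short window is what keeps the alert ahead of the arriving seismic waves.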

Google Flu Trends analyzed users’ search terms that might be flu related and mapped them geographically and temporally to track flu outbreaks. Comparisons with actual Centers for Disease Control and Prevention (CDC) data initially correlated well for influenza-like illness, but later estimates showed the algorithm overestimating laboratory-confirmed influenza rates by 50% and missing smaller outbreak peaks [22, 23]. Despite this failure, surveillance systems that use large datasets for historical mapping or prediction hold substantial promise for developing the early warning systems of the future. One successful example is the Short-term Inundation Forecasting for Tsunamis (SIFT) system, developed by the National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research, which predicts the amplitude and travel time of a tsunami following an earthquake based on combined historical and real-time numerical models [24].

Flood, fire, and volcano early warning systems are also burgeoning, as evidenced by the Dartmouth Flood Observatory (DFO) River Watch flood detection and measurement system, Sentinel Asia early flood warning, and the Moderate Resolution Imaging Spectroradiometer (MODIS) data used for many international fire SensorWeb early warning systems. South Africa’s Advanced Fire Information System (AFIS) integrates these data with vector information to calculate the fire trajectory and automatically warn the population of fire migration via mobile phone messaging.

Mapping

GIS developed through crowdsourcing is useful in non-disaster settings, with programs such as the navigation application Waze; in disaster settings it can be a real-time, dynamic source of on-the-ground information, made possible by the GPS functionality built into modern mobile devices. Ushahidi, an open source mapping platform, was used to create crowdsourced maps after the 2010 Haitian earthquake and the Thailand floods. Another open source platform, GeoChat, has been used to track influenza outbreaks and alert exposed populations in Cambodia [25]. These applications utilize human input and can also harvest information from social networks and other sources, such as city infrastructure and utility maps.

During the 2013 Typhoon Haiyan in the Philippines, the online open source map program OpenStreetMap was used to rapidly develop street maps of affected regions. This source provided maps of previously unmapped regions for humanitarian workers and also classified structural damage to buildings through a tagging system. However, later ground surveys showed this approach of using satellite data to determine structural damage is less reliable than traditional methods. Ongoing work in this area is focusing on training volunteers to better recognize damage and providing access to pre-disaster images for comparisons [26].

Geoanalytical integration into disaster management can be even more tangible than needs assessment and humanitarian coordination. GIS-linked data now inform computer-aided dispatch (CAD), a system that utilizes geospatially tagged data regarding infrastructure, incident distribution, mobile and immobile resources, and real-time response unit information to improve disaster response logistics [27]. Through advanced vehicle location, responder interfaces, and real-time tracking of the event, CAD identifies the most appropriate and available resources, reducing response times, increasing responder safety, and enhancing situational awareness. These systems have been utilized in coordinating firefighting efforts, hazardous spill response, and emergency medical service dispatch [28].
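The resource selection step at the heart of CAD can be illustrated with a nearest-available-unit query over great-circle distances. This is a deliberately simplified sketch, with invented unit data and straight-line (haversine) distance standing in for the road-network routing and status logic a real dispatch system uses:

```python
# Simplified sketch of CAD resource selection: choose the nearest
# *available* unit to an incident by great-circle (haversine) distance.
# Unit names and positions are invented for illustration.
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # mean Earth radius ~6371 km

def dispatch(units, incident):
    """units: dicts with 'id', 'pos' (lat, lon), 'available'."""
    candidates = [u for u in units if u["available"]]
    return min(candidates, key=lambda u: haversine_km(u["pos"], incident))

units = [
    {"id": "medic-1", "pos": (33.45, -70.66), "available": False},  # closest, but busy
    {"id": "medic-2", "pos": (33.47, -70.65), "available": True},
    {"id": "medic-3", "pos": (33.60, -70.70), "available": True},
]
chosen = dispatch(units, incident=(33.46, -70.66))
```

Note that the geometrically closest unit is skipped because it is already committed; combining real-time availability with location is precisely what distinguishes CAD from a static map.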

Disaster Recovery and Information Dissemination

Disaster recovery constitutes the actions required for a community to resume critical functions following a disaster. Disaster recovery is as complicated a task as disaster preparedness and requires a sophisticated understanding of infrastructure, resources, and population casualties combined with community priorities, vulnerability, and future risk. Key facilities and societal functions must be given priority for reconstruction. Pathways between the damage perimeter and those facilities must be identified, and hazard modeling must be employed to mitigate future destruction. Geoanalytical databases are ideally suited for the amount of data and the data manipulation needed for these efforts. Aerial imaging by unmanned aerial vehicles (UAVs) can also be valuable in recovery programs. In the Philippines, Medair utilized two-dimensional (2D) and three-dimensional (3D) terrain models produced from high-resolution aerial images collected by UAVs to inform post-Typhoon Yolanda recovery efforts. In consultation with the local government, low-cost drones were flown over Tacloban and Leyte to provide information to first responders and community leaders regarding areas of greatest need [29, 30].

Notwithstanding their tremendous potential, geoanalytics and remote sensing, whether by drones or satellites, have undeniable pitfalls. Critical to the adoption and optimization of these geospatial technologies is an international consensus regarding international law, shared data dictionaries, functional collaboration, and open but safe information sharing. The overwhelming volume of data, combined with incompatible technological lexicons and a lack of universality surrounding operational datasets, has left the geoanalytical community fragmented. Arguments pertaining to the accessibility and affordability of GIS technologies in austere environments are valid, and the lack of infrastructure within disaster response mechanisms to validate and analyze data of such volume and velocity is a legitimate obstacle.

In the past decade, however, geoanalytics, remote sensing, and the utilization of these data have continued to evolve. The resolution of satellite images and aerial photographs has improved drastically, expediting the routing of escape, rescue, and relief distribution. Public platforms such as Google Maps, along with numerous geoanalytical volunteers facilitated by mobile applications and open source platforms such as Ushahidi and OpenStreetMap, have moved disaster response from the domain of select professional groups to communities, both local and international. International organizations and nongovernmental organizations have launched online dissemination mechanisms for better response coordination, such as HumanitarianResponse.info, the Centre for Research on the Epidemiology of Disasters Emergency Events Database (EM-DAT), the Global Disaster Alert and Coordination System (GDACS) Virtual On-Site Operations Coordination Centre (OSOCC), the MIT/Harvard Data Portal, and the Sahana Free and Open Source Disaster Management System. The disaster response community is beginning to address the complexities and pitfalls of such large amounts of data by forming working groups, such as the Working Group on Applied Technologies, which operates under the auspices of the Humanitarian Action Summit.

Cloud Repository Services

For information systems to be of use, it is necessary to integrate large amounts of data from different sources. Cloud-based services have emerged as a potential platform for disaster settings. This model allows for on-demand, internet-based access to a shared pool of computing resources, including networks, storage, and applications. Resource virtualization, large-scale pattern recognition, high reliability, generality, and scalability, as well as low-cost and on-demand services, allow cloud-based services to link numerous computers and services during a crisis.

Governmental and nongovernmental organizations can now store their information and data across servers around the world. By continuously syncing and maintaining multiple copies of a dataset, cloud computing makes it possible both to track down important information quickly and to recover critical information swiftly. Even when disasters interrupt internet access, which is not infrequent, the loss of connectivity is temporary compared with the complete destruction of local computing infrastructure. As soon as internet access is regained, datasets can be retrieved through cloud-based services. Furthermore, cloud-hosted information usually resides in well-protected data centers, away from the conflict or disaster setting [31]. Within the field of geospatial data, cloud computing offers a potential solution to complex data structures, massive data volumes, and high computing requirements during an acute crisis [32].

Information Dissemination

The compilation and analysis of crowdsourced information is only as effective as its reliability and dissemination. The Sahana Foundation developed an avenue to publish different sets of data sources by means of Linked Open Data (LOD) technology and to interconnect them with other sources on the LOD cloud. Data verification remains a challenge, particularly in the most acute phases of a disaster, but efforts have been made to improve this aspect. There are various approaches to balancing these competing goals: aggregating information and considering only clear patterns, small-scale manual verification of key reports, or crowdsourced fact checking [33]. However, there is an inverse relationship between verification and timeliness. For example, social media reports relevant to a disaster are frequently the most rapid form of reporting but can be incorrect, self-serving, or even malicious. Independent verification is time- and resource-consuming, and both are precious commodities in an evolving disaster.
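The verification-versus-timeliness trade-off can be made concrete with a corroboration rule: accept a claim once k independent sources report it, and measure how long that takes. The sketch below is purely illustrative; the source names, timestamps, and the "k distinct sources" rule are invented assumptions, not any organization's actual verification procedure:

```python
# Illustrative sketch of the verification/timeliness trade-off: a claim
# is accepted once k independent sources corroborate it, and raising k
# buys confidence at the cost of delay. All data are invented.

def time_to_verify(observations, k=3):
    """observations: time-ordered (timestamp_s, source_id) tuples for one
    claim. Returns the timestamp at which the k-th distinct source
    reported it, or None if the claim is never corroborated."""
    seen = set()
    for t, source in observations:
        seen.add(source)
        if len(seen) >= k:
            return t
    return None

# One claim ("bridge down") reported over time; one source repeats itself.
obs = [(0, "tw_user_a"), (40, "tw_user_a"), (95, "fb_user_b"), (310, "radio_c")]
fast = time_to_verify(obs, k=2)  # usable after the 2nd distinct source
slow = time_to_verify(obs, k=3)  # waiting for a 3rd source costs minutes
```

The repeated post from the same account does not advance verification, which is why counting distinct sources, rather than raw report volume, is the key design choice.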

Compiled information should be quickly and openly accessible to all who may need it. The humanitarian information portal ReliefWeb is one example of a disaster-oriented aggregator and disseminator of information from and to responders, planning agencies, funders, and the affected population. It is operated by the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) and updated continuously with situation reports, maps, and even humanitarian-related job postings. Virtual organizations, such as Humanity Road, are also transforming this space. Virtual organizations are geographically distributed organizations whose members, bound by a common interest, communicate and coordinate their work through information technology [34]. Created in 2010, Humanity Road and its volunteers use internet and mobile communications to collect, verify, and route information online during sudden-onset disasters. They have responded to crises all over the world to collect public safety information and to direct the public to government and international aid agencies that can provide assistance during a disaster [35].

The collection and organization of data are important; yet, during a disaster, it is the dispersal of data to those on the ground that becomes crucial. Two emerging technologies are making communication during disasters more reliable and direct. Intellistreets, for example, uses streetlight poles equipped with wireless technology to provide the public with emergency alerts and information during a disaster. Another example is iDAWG, a device being developed to maintain communication among different relief services without relying on cell towers or internet networks [8].

The remote sharing of information among providers is also becoming more accessible. There are several reliable open online data sources (Table 1) that provide a strong foundation for data collection and dispersal during an acute crisis, providing an invaluable resource to responders and researchers alike.

Table 1 Open online data sources

Next Steps

The evolution of technology within the field of disaster management has accelerated tremendously in the past decade, with geoanalytics, modeling and optimization, simulation, ICT, crowdsourcing, and social media completely changing the way data are collected, analyzed, and disseminated. Inevitably, there is much to be done to standardize, integrate, and optimize the utilization of these tools. Geospatial databases must be able to interact with each other, requiring common dictionaries and open application programming interfaces (APIs). Operational algorithms for data collection and analysis should be studied, creating empirically validated best practices and standard operating procedures. Conversations about data ethics in post-disaster environments should take place at all levels of policy and programmatic discussion. The infrastructure needed to ensure reliable and equitable access to technologies and data should be maintained. Data dissemination mechanisms should harness the rapid uptake of smartphones, social media, and organizational dashboards to improve disaster management coordination. From the engineering logistics of electricity and WiFi availability, to the theory behind data ethics, to the academic evaluation of geoanalytical and remote sensing techniques, there is much to be done within disaster management technologies.

Case Study

You work for Disastech International, an NGO that focuses on using technology in disaster preparedness and response. A magnitude 7.9 earthquake has just struck off the coast of Chile, near Santiago. Fortunately, your NGO specializes in these types of natural disasters and is able to use its experience and technology to respond. Because Chile sits on the arc of volcanoes and fault lines circling the Pacific Ocean, the earthquake caused immense damage to a population of more than 6 million. Disastech, however, has been using GIS to create basemaps of the city, including the vulnerable areas that were hardest hit. Your basemaps include basic demographic, economic, and geospatial information that provided the foundation for a hazard modeling map of the city. Furthermore, because of the extensive expertise of your NGO, you have detailed topographic 3D models of the region built from UAV imagery. Using these maps, you can provide the international community with information about high-risk areas where there are likely to be the greatest numbers of victims and survivors. Your previously prepared maps can be used to evaluate the most probable routes for delivering aid and for logistical supply chains. You are also able to provide the locations of previously established health facilities for injury referral.

Because of the scale of the earthquake and the population density of the affected area, social media and mobile technology become the first line of information output after the disaster. Capitalizing on the flood of information posted through Facebook and Twitter, Disastech is able to identify the hotspots where the highest impact occurred. You forward the information you collected to UN-SPIDER. Additionally, hours after the earthquake hit, Disastech deployed a team to complete a rapid assessment using Kobo Toolbox, information that will be forwarded to UN-OCHA, which is coordinating the international response. In addition to the local response, Disastech has been in constant communication with NOAA's SIFT program to help alert neighboring countries in the Pacific. Over the next 24 h, you update your previously prepared basemaps with current information on the neighborhoods most affected, hotspots of victims, the best routes for health referrals, and the regions needing the most aid, and you continue to refresh this information every 24 h. To begin reconstruction planning, 1 week after the earthquake you use satellite imagery, social media, and UAVs to deliver detailed city maps to the Mayor of Santiago that include the hardest hit infrastructure and the safest pathways around the city.