
1 Introduction

Crisis response activities include undertaking measures to protect lives and property immediately before, during, and immediately after the occurrence of a disaster. Such activities may span from a few hours to days or even months, depending upon the magnitude of the event. Disaster management using the World Wide Web is an emergent field that uses technology to enhance users' collaboration around disasters. While there exist a number of dedicated “disaster portals” [6, 12, 13], large social networks such as Twitter, Facebook, and Google+ can facilitate the analysis and sharing of collective intelligence regarding disaster information on a much greater scale. Recent disasters (e.g. Haiti, Australia, Japan, Mexico, etc.) have demonstrated their real potential in providing support to emergency operations for crisis management [38]. Social networks have the potential to increase accessibility to eyewitnesses' information, and exploiting their input to gain awareness of an incident is an important research topic. Reaching populations by means of customised and timely alerts through multiple channels (Internet technologies, hand-held devices, and social networks) can help inform those at risk and reassure those not at risk with messages that accurately reflect the levels of vulnerability of the target population.

From the earthquake in Haiti in 2010 to the terrorist attack in Boston in 2013, Facebook, Twitter and other social media have shown the ability to provide valuable support to civil protection in emergency situations. The earthquake in Haiti is often referred to as the turning point that changed the way social media can be used during disasters. The size and emotional impact of disasters have created the right motivations for the integration of social media in emergency management. In 2005, when Hurricane Katrina devastated the US coast of the Gulf of Mexico, Facebook was one year old, there was no Twitter, and smartphones were not yet common. In 2012, when Hurricane Sandy hit the east coast of the US, social media had become an integral part of the response to disasters: millions of Americans used social media to follow the news, look for people, and send requests to the authorities. Researchers have now begun to publish reports on the use of social media during disasters, and law and security experts have started to evaluate how best to exploit social media in emergency management.

In 2011, the Australian state of Queensland was impacted by floods and a severe tropical cyclone. While there was significant media coverage, social networks were also inundated with posts related to the floods. A government social media outlet, “@QPSmedia”, was created and utilised for community interaction regarding the disaster. In 2012, the Department of Health and Human Services (HHS) in the USA sponsored a challenge for software application developers to design a Facebook application called the Personal Emergency Preparedness Plan (PEPP) [34]. Riskr [31] is a low-technology project that applies a Web 2.0 [46] solution to creating disaster portals fed by social networking messages; the system has been implemented using Twitter and tested by users to determine the advantages of interoperability between social networks and disaster portals. Farber et al. [31] state that almost all Riskr users had no problem with predefined hashtags and were satisfied with the information they received. Processing posts from social networks, however, poses several challenges. Social networks are not optimised or specialised for emergency management, and posts may include sentimental words, emoticons, links, and personal, untrustworthy opinions. Extraction of related posts, categorisation of heterogeneous messages, and determination of the trustworthiness of messages are some of the challenges to face when using social media for managing catastrophic events. Efficient methods for handling subjective information, uncertainty, differing levels of credibility, extraction of exact location (position), and sentiment context should be used to make the utmost use of the data collected through a stream of posts.

In this paper we present a review of existing disaster management systems. In particular, we consider strategies and technologies for the analysis of information collected by mining social networks and information provided by a wireless sensor network, as well as the use of geo-spatial technologies. We claim that the integration of information from these different sources is essential to implement a decision-making system aimed at promptly disseminating alerts, efficiently organising rescue activities and providing effective support. Section 2 defines terminology and phases of emergency management. Section 3 reviews existing disaster management systems. In Sect. 4, we propose an architecture for disaster management that integrates the mining of social networks and the use of sensor networks. Finally, Sect. 5 clarifies which parts of the proposed architecture have already been implemented.

2 Emergency Management

The term emergency, or disaster or crisis, refers to a situation that poses an immediate risk to health, life, property or the environment. Thus, emergency management, or disaster management, encompasses all plans defined by national or local agencies, called Emergency Management Agencies (EMAs), and the consequent activities carried out to tackle an emergency situation, or disaster, with the goal of reducing harm to life, property and the environment, and then returning to a normal functional condition [26]. Emergency management is an interdisciplinary field of study, which involves intertwined social, political, technological, economic and cultural aspects. Moreover, there is a general consensus in identifying four phases in handling disasters [26]:

  • mitigation, often called prevention or risk reduction, aims to reduce the likelihood or consequence of a hazard risk by defining appropriate measures and procedures before the occurrence of a disaster;

  • preparedness consists of actions taken in advance of a disaster, such as the implementation of mitigation measures and procedures, the assignment of roles and responsibilities, and the provisioning of services, to ensure an adequate response to the disaster's impacts and the relief and recovery from its consequences;

  • response: when the disaster occurs, all involved actors promptly contribute to search and rescue, according to their roles and within the confines of their limited funding, resources, ability and time, under the coordination of the appropriate EMAs, which activate and manage measures and procedures;

  • recovery is the process of rebuilding, reconstructing and repairing the damage and destruction caused by the disaster, and finally restoring the normal situation.

Mitigation is often considered the “cornerstone of disaster management”. It was long perceived as a luxury of wealthy countries, but has now started to gain recognition and practical application in the developing world as well. Recently, prevention has been distinguished from mitigation to better characterise pro-active measures designed to provide permanent protection [10]. Response is usually subject to extensive media coverage and is therefore the most visible disaster management function at the international level. The effectiveness of the response phase depends not only on the promptness in carrying out search and rescue actions, but also on the definition, during mitigation, of measures and procedures that facilitate data selection, aggregation, integration and availability, and optimal scheduling of actions, as well as on their efficient implementation during preparedness.

3 Literature Review on Disaster Management Systems

Some of the current disaster management solutions are inadequate for managing disasters due to manual data collection and entry, which result in delayed dissemination of information. Sahana [22] is an open source disaster management system that has been widely used. It has a modular structure for effective communication and information sharing among various stakeholders including government, NGOs and affected people. However, being a traditional database management system, it requires manual data entry. The Global Disaster Information Network (GDIN) [7] is another conventional web-based information system that provides effective communication; however, it lacks effective management of disaster data. Other notable disaster management systems are the Queensland Disaster Management System [6] and the Disaster Management Information System (DMIS) [5]. None of these systems supports automatic information collection, which is a severe limitation in situations where time is precious and even seconds can save lives.

In this section we review a number of disaster management systems that adopt technologies for automatic information collection. We start with systems based on the analysis of social networks. Then we consider systems that use wireless sensor networks. We conclude by discussing the use of Geographical Information Systems (GIS) and the Global Positioning System (GPS).

3.1 Systems Based on Analysis of Social Networks

Recently, Twitter has played an increasing role as a clearinghouse for information related to emergencies and disasters. Sakaki et al. [49] propose an algorithm for tweet monitoring and real-time target event detection. To detect a target event, the authors devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. They then produce a probabilistic spatiotemporal model for the target event that can find the centre of the event location, applying particle filtering over the Twitter users to achieve a better estimate of the target event location. Nguyen et al. [45] define an event by five attributes for each Twitter message: actor, action, object, time and location. They build a collective, readable and intelligence-based web ontology tool that understands the meaning of Japanese text messages and extracts semantic data about incidents. The authors propose a novel approach that automatically builds an earthquake semantic network by mining human activities from Twitter. Using this semantic network, computers can recommend suitable action patterns to victims. The approach automatically generates its own training data and uses a linear-chain conditional random field as its learning model. In the project conducted by Nguyen et al., the text extractor architecture consists of two modules: a “self-supervised learner” and an “activity extractor”. First, the learner uses basic Japanese syntax patterns to select analysable activity sentences; it then uses deep linguistic parsing to extract activity attributes and relationships between activities in these sentences. Second, a “decision module” uses the extracted actions and objects to create search keywords for the Twitter API.
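To make the classification step concrete, the following is a minimal Python sketch of Sakaki-style tweet features (keyword presence, word count, keyword position); the keyword list, the toy decision rule and the example tweet are our own illustrative assumptions, standing in for the trained classifier used in [49]:

```python
# A toy stand-in for the classifier in [49]: keyword, word-count and
# position features over a tweet (keyword list and rule are assumptions).
def tweet_features(text, keywords=("earthquake", "shaking")):
    words = text.lower().split()
    hits = [i for i, w in enumerate(words) if any(k in w for k in keywords)]
    return {
        "num_words": len(words),                    # statistical feature
        "has_keyword": bool(hits),                  # keyword feature
        "first_hit_pos": hits[0] if hits else -1,   # position/context feature
    }

def is_event_tweet(text, min_words=3):
    f = tweet_features(text)
    # Toy decision rule standing in for the trained classifier.
    return f["has_keyword"] and f["num_words"] >= min_words

print(is_event_tweet("Huge earthquake, the whole building is shaking!"))  # True
```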

Although news information is widely available through social media such as Twitter, the credibility of such information may be questionable. To assess the credibility of information propagated through Twitter, Castillo et al. [23] propose a mechanism using features from the content of posts and from citations to external sources. A study conducted by Westerman [53] found a curvilinear pattern between the number of followers and a Twitter user's credibility.

3.2 Systems Based on Wireless Sensor Networks

Sensor networks have the potential to revolutionise the capture, processing and communication of critical data for use in disaster rescue and early-warning systems. The event detection functionality of wireless sensors can be of great help for the real-time detection of, for example, meteorological natural hazards and residential fires. The basic idea of event detection is to define threshold values and have a sensor generate an alarm when its input falls below or rises above a pre-defined threshold. Since disasters cannot be reliably detected by simple pre-defined thresholds [43], the new trend in event detection is to use pattern matching or machine learning techniques. Based on the scale of the network, application requirements and constraints, pattern matching has been proposed for use in the base station [55], locally in the sensor nodes [19], or distributed over the network [42]. The RT-HRLE system [32] uses a wireless sensor network for real-time monitoring, tracking of missing people inside buildings and reporting partial or total destruction of buildings to a central database. Liu et al. [41] present an architectural approach for wireless sensor networks that can proactively self-adapt to changes and evolution occurring in the provision of search and rescue capabilities in a dynamic environment. The proposed model uses the concepts of dynamic workflow management to enable dynamic service integration for reliable and sustainable provision of rescue capabilities. This approach is able to identify evolution, evaluate its impact and self-configure services to adapt to it.
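As a concrete illustration of the difference between naive thresholding and simple pattern matching, here is a minimal Python sketch; the threshold, window size and sample readings are hypothetical:

```python
from collections import deque

THRESHOLD = 60.0   # hypothetical alarm threshold (e.g. temperature in Celsius)
WINDOW = 5         # number of consecutive readings considered

def alarm(readings, window=WINDOW, threshold=THRESHOLD):
    """Raise an alarm only if a whole run of readings stays above the
    threshold, filtering out the single-sample spikes that a naive
    per-sample threshold test would report as events."""
    recent = deque(maxlen=window)
    for r in readings:
        recent.append(r)
        if len(recent) == window and min(recent) > threshold:
            return True
    return False

print(alarm([20, 95, 21, 22, 23]))        # False: isolated spike
print(alarm([58, 62, 66, 71, 75, 80]))    # True: sustained rise
```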

3.3 Systems Based on Geo-Spatial Technologies

Disaster management involves not just crisis-reactive responses to emergencies, but also finding ways to avoid problems in the first place and preparing for those that will undoubtedly occur. Natural disaster management is a complex and critical activity that can be more effectively addressed with the support of geo-spatial technologies and spatial decision support systems [30, 48]. In this respect, spatial data and related technologies such as GIS and GPS have proven crucial for effective disaster management [16, 28]. Throughout the first three phases of disaster management (preparedness, mitigation, and response), a GIS can effectively facilitate the integration of spatially distributed information within a decision support system. The effectiveness and growth of GIS and GPS is, however, dependent on the development of a national disaster management database underlying the varied scope and activities pertaining to national emergency management. A national database provides a common frame of reference for all provincial and local agencies and establishes the framework for managing and organising the data required to support the disaster risk management activities of responsible organisations. Gunes et al. [35] aim to build a database in a GIS frame that helps emergency management officers in decision making, focusing on Douglas County's preparedness, mitigation, and response efforts for its most common disaster: flooding. The system leads to better flood management by automating the task of determining the probable flood-affected areas and integrating the results with other spatially distributed information. This enables emergency management officers to make more informed decisions before, during, and after a flood situation. Pareta and Pareta [47] address the need, the technical structure and the potential solutions facilitated by the creation of an effective database at a national level, drawing upon experience from work in Vietnam and practices from India.

4 Towards an Integrated Server Architecture

In this section we propose a comprehensive server architecture, illustrated in Fig. 1, which consists of a “Command, Control, Communications, Computers, and Intelligence” (C4I) [54] unit communicating with the database that provides a common frame of reference for all local agencies (Agency Data) and with sensor networks, social networks, etc. The incorporated components include mechanisms to model event-level semantic information, a system for implementing multi-sensor fusion, mechanisms for estimating the veracity of information, data cleaning to reduce uncertainty and enhance the accuracy of event detection and notification, and spatiotemporal analyses for pattern and trend detection and higher-level observations. Such a modular architecture makes it possible to upgrade the platform in the future as needs change or new technologies appear. The proposed platform is capable of processing information of various types. Processing can be configured using rules and may include configuration for data loading, pre-processing, aggregating, statistics building, correlating with other events and storing in a database.

The architecture consists of services on which data are ingested, cleaned and analysed to extract information that is customised for emergency services, and hand-held devices on which alerts are visualised. The disaster portal that interfaces with the server leverages core features of the platform such as notification and authorisation dialogues for citizens. Citizens can register themselves and their own socio-economic situation (car ownership, address, disabilities, building condition, etc.) to receive the most appropriate alert messages before and after catastrophic events. Citizens can also register themselves as volunteers and encourage other citizens to participate. The server can send requests to volunteers in a threatened area and volunteers can send their plans to the server. The portal allows registered users to send their real-time location and situation. Unregistered users can communicate with the server through different channels, e.g. phone calls and text messages. Obviously, the analysis and visualisation techniques implemented in such tools need to be customised to the specific disaster management subject domain.

In a disaster management service, deployment scenarios for sensor networks are countless and diverse. For example, sensors may be used for weather forecasting, tsunami detection, pollution detection, and video surveillance. Normally, a disaster management server allows the operator to query a sensor network and retrieve the resulting data. However, some scenarios may require regular queries to be scheduled and automatically dispatched without operator intervention. Furthermore, there is a growing need to share resources among diverse network deployments to aid in critical tasks like decision making. For example, a tsunami warning system may rely on water level information from two geographically distributed sets of sensors developed by competing hardware vendors. This presents significant challenges in resource interoperability, fault tolerance and software reliability. We need to implement a set of uniform operations and a standard representation for different entities, sensor data and web service data, which can fulfil the software needs of a network regardless of the deployment scenario. The proposed platform capabilities include:

  • monitoring of sensors and social media for the relevant information;

  • semantic enrichment of information through multi-modal analysis (tweets, sensor data, etc.) to create an event-level representation;

  • integration with other information sources such as the agency database;

  • querying targeted sensors and rescue agents for recent updates based on their location;

  • making the best decision and generating custom alerts for specific population groups.

Finally, the choice of an adequate data interchange format can have significant consequences on data transmission rates and performance. XML and JSON (JavaScript Object Notation) serialisation are widely used in web application development. Compared to XML, JSON has higher parsing efficiency and is easier to produce. Since JSON is not just a text format, but a serialised data structure, Resource Description Framework (RDF) libraries can support it [15], not just as a format to parse from or output to, but also internally, as a data structure that can be passed to and returned by functions and methods.
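As a small illustration of why JSON is convenient as both a wire format and an in-memory structure, the following Python sketch round-trips a hypothetical event record; the field names are illustrative, not a standard schema:

```python
import json

# A hypothetical event record as it might be exchanged between server
# components (field names are illustrative assumptions).
event = {
    "type": "flood",
    "location": {"lat": 45.07, "lon": 7.69},
    "observed_at": "2015-03-10T14:22:00Z",
    "source": {"kind": "sensor", "id": "water-level-17"},
    "severity": 3,
}

payload = json.dumps(event)       # compact text for transmission
restored = json.loads(payload)    # parsed straight back into dicts/lists
assert restored["source"]["id"] == "water-level-17"
```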

In Sects. 4.1–4.4 we describe the server components, as shown in Fig. 1.

Fig. 1. Server components for data ingestion and enrichment, and alerting tasks.

4.1 Event Collection Module

This module gathers information on crisis-related events from a variety of dispersed sources including voice, text, image, video, and sensors. It builds on integrated distributed systems, data management, and networking systems that enable information to flow seamlessly in real time from sources to collection points. This is done by periodically querying social networks with different incident-related keywords, sending requests to sensors, and waiting for voice/message calls. The received data must be pre-processed and stored according to agreed standards to support data sharing with incident management applications. For instance, sensor data may be labelled with sensor ID and location, while voice calls may be labelled with the phone number and location. Finally, each type of data is converted to an appropriate low-volume format to facilitate analysis.
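A minimal Python sketch of one collection cycle follows; the adapter functions and the keyword list are hypothetical placeholders for the Twitter API, an SMS gateway and the sensor middleware:

```python
# Hypothetical adapters: a real deployment would wrap the Twitter API,
# an SMS gateway and the sensor middleware behind these functions.
def query_social_network(keyword):
    return []    # -> list of raw posts matching the keyword

def poll_sensors():
    return []    # -> list of (sensor_id, location, value) tuples

def store(record):
    print("stored:", record)

KEYWORDS = ["flood", "earthquake", "fire"]   # incident-related query terms

def collection_cycle():
    for kw in KEYWORDS:
        for post in query_social_network(kw):
            store({"kind": "post", "keyword": kw, "body": post})
    for sensor_id, location, value in poll_sensors():
        # Label sensor data with its ID and location, as described above.
        store({"kind": "sensor", "id": sensor_id, "loc": location, "value": value})

collection_cycle()   # a real server would run this periodically on a scheduler
```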

In geographically spread hazards, such as earthquakes, the amount of generated data is voluminous; collecting and analysing such data in real time could overwhelm any computing infrastructure. While dynamically acquired cloud computing alleviates some of the overhead, scaling data collection to such big data requires additional techniques. One possible technique is to prioritise the collection of diverse data by dynamically optimising the overall situational awareness under resource constraints and source restrictions, e.g. network bandwidth and maximum concurrent queries.

One of the goals in data acquisition is to ensure that the data collected is relevant for the event under consideration and to avoid retrieving too much irrelevant data. Precision is obtained by ranking and clustering the different keywords and terms relevant for the event. The retrieval challenge is to get as many relevant messages as possible. The precision challenge is instead tackled by applying context-based filtering techniques [36]. In addition, boosting methods [20] are used to collect the initial set of messages related to the event and then, based on them, to select new frequent keywords as query terms. Such query terms could also be chosen using ontologies such as SWEET [4].
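The keyword-expansion step can be approximated very simply: count the terms that co-occur with the seed keywords and promote the most frequent ones to new query terms. The Python sketch below is a crude stand-in for the boosting-based selection cited above; the seed set and messages are hypothetical:

```python
from collections import Counter
import re

SEED = {"earthquake", "tremor"}   # hypothetical seed query terms

def expand_keywords(messages, seed=SEED, top_n=5):
    """Count terms co-occurring with seed keywords and promote the most
    frequent ones to new query terms (a crude frequency-based stand-in
    for the boosting-based selection in the text)."""
    counts = Counter()
    for msg in messages:
        words = set(re.findall(r"[a-z']+", msg.lower()))
        if words & seed:                  # message matched a seed keyword
            counts.update(words - seed)   # count its other terms
    return [w for w, _ in counts.most_common(top_n)]

msgs = ["earthquake near the bridge", "tremor felt, bridge closed",
        "lovely weather today"]
print(expand_keywords(msgs))   # 'bridge' surfaces as a new query term
```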

The server controls several types of sensors and video-surveillance devices deployed at different locations. Such a variety of sensors requires interfacing applications to perform common operations and transformations on sensor data. As sensor data is time-dependent, the user essentially needs to provide the desired geographic area, time interval, and properties to be observed. With the specifications defined through the Sensor Web Enablement (SWE) [8] initiative of the Open Geospatial Consortium (OGC) [9], flexible integration of sensor data is becoming a reality. The NICTA Open Sensor Web Architecture (NOSA) infrastructure [2] is built upon the SWE standard defined by the OGC, which consists of a set of specifications, including a sensor model language, observations and measurements, a sensor collection service, a sensor planning service and a web notification service. NOSA adopts a Service Oriented Architecture (SOA) approach to describe, discover and invoke services from a heterogeneous platform using XML and SOAP standards. Services are defined for common operations including data aggregation, scheduling, resource allocation and resource discovery. Each sensor is registered as a web service that can be conveniently discovered. Combining sensors and sensor networks with a SOA is an important step forward in presenting sensors as resources that can be discovered, accessed and, where applicable, controlled via the World Wide Web. It offers the opportunity of linking geographically distributed sensors and computational resources into a “sensor-grid”.

The main challenges in sensor networks are the discovery of appropriate sensor information and the real-time fusion of the discovered information [51]. These are key issues in disaster management, where the flow of information is overwhelming and sensor data must be easily accessible to non-experts. By registering every sensor as a web service, sensor discovery and fusion can be carried out by semantically annotating services with terms from a purpose-designed ontology. In doing so, several well-known techniques from the GIS and semantic web worlds can be employed. Semantic annotations of geographically distributed sensors provide an infrastructure with which on-line discovery and integration of sensor data is no more difficult than using standard GIS applications. Service discovery is realised by text search combined with taxonomy browsing [18]. For instance, suppose a number of different types of sensors, e.g. water/air pollution, water level and water temperature, are placed at different points along a river. The highest level of the taxonomy will contain “river” and “sensor”. The second level under the river taxonomy will be the different sectors/points of the river, and the second level under the sensor taxonomy will be the different types of sensors. The third level of the sensor taxonomy will be the individual installed sensors (ID numbers). Each sensor (ID) is semantically connected to a sector/point of the river. The emergency officer can then query the situation of the whole river, or of a specific sector of the river. The server then queries the appropriate sensors, the observed data is integrated and fused together, and the result is finally shown on the GIS map.
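The river example can be rendered as a toy two-level taxonomy in Python; the sector names, sensor IDs and sensor types below are hypothetical:

```python
# A toy taxonomy mirroring the river example (all names are assumptions).
TAXONOMY = {
    "river": {
        "upper-sector": ["wl-01", "temp-01"],
        "lower-sector": ["wl-02", "pol-01"],
    },
}
SENSOR_TYPES = {"wl-01": "water-level", "wl-02": "water-level",
                "temp-01": "water-temperature", "pol-01": "water-pollution"}

def discover(sector=None, sensor_type=None):
    """Taxonomy browsing: narrow first by river sector, then by sensor type."""
    sectors = [sector] if sector else TAXONOMY["river"].keys()
    hits = [s for sec in sectors for s in TAXONOMY["river"][sec]]
    if sensor_type:
        hits = [s for s in hits if SENSOR_TYPES[s] == sensor_type]
    return hits

print(discover())                                   # whole river
print(discover("lower-sector", "water-pollution"))  # ['pol-01']
```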

4.2 Event Extraction and Analysis Module

This module further analyses the pre-processed data stored in the databases of an emergency management system to extract meaningful semantic information. Semantic enrichment takes place in the context of location determination and event representation. In addition, the semantic information extracted from multiple data sources is fused for event classification and disambiguation. If any ambiguity is found in the received data, the system can request further information from citizens in order to obtain a reliable message. The enriched data is compared with similar previous events stored in the local database. One important challenge is to develop technologies and tools to integrate, analyse and visualise multiple information sources to rapidly assess the nature, composition and pattern of threats, and to address public safety practitioner requirements.

One of the crucial data enrichment challenges is text (tweets, SMS) analysis to extract disaster-related information. The sentences retrieved from social media or received through SMS are complex, often structurally varied, syntactically incorrect, and full of user-defined new words. Thus, extracting activities from these sentences can be very difficult. The Event Extraction and Analysis Module exploits some well-known platforms for Natural Language Processing (NLP). GATE [33] is one of the most popular NLP platforms, widely used in industry and academia to extract information from text. The actual processing of the content goes through several steps, starting with tokenisation, sentence splitting and part-of-speech tagging. These processing layers are provided by GATE along with grammars and other standard building blocks for obtaining sophisticated information extraction applications. ANNIE (A Nearly New Information Extraction System), a plug-in of GATE, is used to extract disaster information. ANNIE [27] uses PRs (Processing Resources) that have been developed using the JAPE (Java Annotation Pattern Engine) language [52], a pattern/action rule language based on regular expressions.
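To give a flavour of the pattern/action idea, here is a rough Python analogue of a JAPE rule; real JAPE rules run inside GATE over annotations rather than raw text, so this regex-based sketch (with a made-up "EarthquakeMagnitude" annotation type) is only an illustration:

```python
import re

# Pattern part: a regular expression for earthquake magnitude mentions.
MAGNITUDE = re.compile(r"\bmagnitude[\s-]*(\d+(?:\.\d+)?)\b", re.IGNORECASE)

def annotate(text):
    annotations = []
    for m in MAGNITUDE.finditer(text):
        # Action part: attach a typed annotation to the matched span,
        # roughly what a JAPE rule's right-hand side would do in GATE.
        annotations.append({"type": "EarthquakeMagnitude",
                            "value": float(m.group(1)),
                            "span": m.span()})
    return annotations

print(annotate("USGS reports a magnitude 6.3 quake near the coast."))
```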

Another challenge is speech analysis; there have been several recent announcements surrounding the application of speech-to-text analysis in consumer search settings. Google announced its Political Gadget, enabling visitors to search the spoken content within YouTube Presidential candidates' channels. Adobe plans to include speech-to-text features in future versions of its video authoring applications, such as Premiere. Sites such as WEEI [1] and FOX Sports [3] have been using similar tools to power search and publishing applications for their multimedia archives. The Event Extraction and Analysis Module automatically transcribes speech messages, extracts meaningful keywords, and then analyses the result as a text message.

Detecting the occurrence of an incident-relevant event within a multimedia clip is another task of this module. The representation of an image/video as a set of keywords has been successfully used in many detection and recognition tasks such as object detection [29], scene recognition [39], human action recognition [40] and semantic concept detection [37].

Another crucial analysis technology supported by this module is the geo-localisation of data from sensors, cell-phones and social media sources. This can be done at several levels of complexity, from cellphone and sensor locations, to extraction from tweets, to complex computer vision algorithms that can localise images based on skylines or building facades. Knowing where an event occurs allows for various geo-spatial analyses that can support alert customisation for subscribers. With the advances in location-aware mobile devices, location-based social networking applications have been taking shape at a fast pace. Examples of such applications include Google Buzz Mobile, Loopt, and Microsoft GeoLife. Potential applications of these systems include the possibility for users to receive nearby geo-tagged messages submitted by friends and to find a new facility within a certain area based on friends' opinions; many location-based services, however, still largely ignore the social aspect of social networking. GeoSocialDB [25] provides a holistic framework consisting of three location-based social networking services, namely location-based news feed, location-based news ranking, and location-based recommendation. Even though GeoSocialDB is not specialised for emergency situations, it can certainly be used to answer queries like: “Send me the k most relevant messages submitted by victims with tagged locations within d kilometres of my location”, or “Recommend the best escape street within d kilometres of my location based on emergency officers' opinions”.
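A proximity query of that first kind reduces to a great-circle distance filter plus ranking. Below is a minimal Python sketch in which relevance is just distance (a real ranker would also weigh content and recency); the message data is hypothetical:

```python
from math import radians, sin, cos, asin, sqrt
import heapq

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearby_messages(messages, lat, lon, k=3, d_km=5.0):
    """The k nearest geo-tagged messages within d km of (lat, lon)."""
    scored = [(haversine_km(lat, lon, m["lat"], m["lon"]), m) for m in messages]
    return [m for dist, m in heapq.nsmallest(k, scored, key=lambda t: t[0])
            if dist <= d_km]

msgs = [{"text": "road blocked", "lat": 45.05, "lon": 7.66},
        {"text": "shelter open", "lat": 45.07, "lon": 7.69}]
print(nearby_messages(msgs, 45.07, 7.68))
```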

Event data management is concerned with providing data management or “database-like” capabilities for events. It is important to treat events as objects and provide storage, querying, retrieval and indexing capabilities for them. We are working on several issues in this regard. For example, concerning event modelling, we are developing a semantic data model for events. A large collection of distributed reports is generated during a disaster. In its raw form, this data is of limited use, since users can only apply keyword search to it. It is therefore necessary to extract events and inter-event relationships to produce more structured data, in order to be able to apply most of the existing exploratory and analytical tools.
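As an illustration of what such a structured event object might look like, here is a minimal Python sketch; the attribute set borrows the actor/action/object/time/location scheme from Sect. 3.1, while the relation vocabulary and sample values are our own assumptions:

```python
from dataclasses import dataclass, field

# A minimal semantic event model; relation names are hypothetical.
@dataclass
class Event:
    actor: str
    action: str
    object: str
    time: str
    location: tuple                                  # (lat, lon)
    relations: list = field(default_factory=list)    # (relation, other Event)

quake = Event("USGS", "reported", "earthquake", "2015-03-10T14:20Z", (45.0, 7.6))
fire = Event("citizen", "reported", "fire", "2015-03-10T14:25Z", (45.0, 7.6))
fire.relations.append(("caused-by", quake))          # inter-event relationship
```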

4.3 Decision Support Module

Decision support is an essential functionality of an emergency management system. This module processes the enriched event data to generate targeted alerting. This can be achieved by integrating the structured event representation with other local data such as demographics and resource availability in different areas and organisations. By purposefully utilising the collected information, for instance through a data fusion system, the state of the relevant environment is adequately assessed to support decision-making. Various multimodal data streams and static environmental information (geo-spatial information) are fused together to produce a refined decision. The GIS-enhanced information can enable decision makers to assess on-the-ground situations and determine alerting requirements in different areas. Pre-defined policies are incorporated into a rule base that dictates the kind of guidance and information to be provided in an alert. This is used to generate messages for specific population groups, categorised based on location and physical condition (the elderly, hospital patients). For instance, in the case of an industrial fire, an alert could be sent to people in the neighbourhood, and this alert should be different from the one sent to people living at a safe distance from the fire.
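The rule base can be as simple as an ordered list of predicate/template pairs evaluated per recipient. The Python sketch below illustrates the idea for the industrial-fire example; the rules, distance cut-offs and message templates are all hypothetical:

```python
# A toy rule base for alert customisation: ordered (predicate, template)
# pairs; rules, distances and message texts are illustrative assumptions.
RULES = [
    (lambda p, e: e["type"] == "fire" and p["km_from_event"] < 1.0,
     "Evacuate now via {route}."),
    (lambda p, e: e["type"] == "fire" and p["km_from_event"] < 5.0,
     "Fire nearby: close windows and stay indoors."),
    (lambda p, e: True,   # fallback for everyone else
     "Incident reported in your region. No action required."),
]

def alert_for(profile, event):
    for predicate, template in RULES:   # first matching rule wins
        if predicate(profile, event):
            return template.format(**event)

event = {"type": "fire", "route": "Via Roma northbound"}
print(alert_for({"km_from_event": 0.4}, event))    # evacuation message
print(alert_for({"km_from_event": 12.0}, event))   # reassurance message
```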

Analysis and visualisation are concerned with providing intuitive, visual analysis and querying capabilities over situational information. It is reasonable to expect that managers or field commanders would ultimately like to see patterns and trends in the collected information and get intuitive, visual views of it. In this regard, we propose to develop tools such as a graph-based query algebra and language over events, allowing users to query and analyse events in a graphical manner. The resulting graph-based semantic network can be stored as either a multi-dimensional table or an RDF file. These enable the use of online analytical processing (OLAP) queries [21, 24] and SPARQL (SPARQL Protocol and RDF Query Language) queries [14], respectively.
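For the RDF route, a few lines of Python with the rdflib library show how an event graph could be stored and queried with SPARQL; the vocabulary (the example.org namespace and the ex:type/ex:severity properties) is entirely hypothetical:

```python
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/event/")   # hypothetical event vocabulary
g = Graph()
quake = EX["quake-42"]
g.add((quake, EX.type, Literal("earthquake")))
g.add((quake, EX.severity, Literal(6)))

# SPARQL query over the stored event graph.
q = """
PREFIX ex: <http://example.org/event/>
SELECT ?e ?s WHERE { ?e ex:type "earthquake" ; ex:severity ?s . }
"""
for row in g.query(q):
    print(row.e, row.s)
```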

An estimation of the current location and number of people in a disaster area, and a prediction of the future movement of those people, could provide critical information both to disaster operation command staff responsible for rescue and evacuation and to victims looking for the best way to reach a safe place. One important tool for emergency personnel is a real-time evacuation planning model, which automatically calculates the evacuation time of a user-defined area based upon the transportation network, population data, and behavioural characteristics, giving emergency planners the ability to effectively plan for and manage evacuations. “People forecasting” and real-time occupancy analysis are crucial for predicting freeway traffic and future events. Occupancy analysis could be carried out by extracting information about human behaviour from a variety of sensors such as loop sensors counting cars on a freeway, people counters at the doors of buildings, and GPS devices on cellphones or cars. A “personal traffic assistant” running on mobile devices can help travellers re-plan their journeys when routes are impacted by failures.
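To make the evacuation-time idea concrete, here is a back-of-the-envelope Python sketch; every figure (vehicles per person, route capacity, mobilisation delay) is an illustrative assumption, not calibrated data:

```python
# A back-of-the-envelope evacuation-time estimate for a user-defined area.
def evacuation_time_hours(population, vehicles_per_person=0.4,
                          route_capacity_vph=1800, open_routes=2,
                          mobilisation_h=1.0):
    """Time for all vehicles to clear the area through the available
    routes, plus a fixed mobilisation delay for people to respond."""
    vehicles = population * vehicles_per_person
    clearance = vehicles / (route_capacity_vph * open_routes)
    return mobilisation_h + clearance

print(f"{evacuation_time_hours(30000):.1f} h")   # ~4.3 h under these assumptions
```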

4.4 Alert and Command Dissemination Module

This module focuses on the challenges associated with the timely dissemination of information to entities participating in disaster response activities, to other organisations (e.g. mass media organisations), and to the general public. Empirical social science research has focused on issues related to the dissemination of hazard-related information in both pre- and post-disaster contexts.

A possible strategy for alert dissemination consists in dividing and routing volunteers to different threatened neighbourhoods based on the event propagation, the number of victims, and the volunteers' locations. Without proper planning of each volunteer's route, some places may be visited repeatedly while others may not be visited at all. Furthermore, repeated visits to some places may prolong the time needed to explore the whole disaster area. Consequently, a major limitation of using social networks and broadcast systems to explore disaster areas is the lack of coordination among volunteers. The Google Maps API, MapQuest [17] and the open source Quantum GIS [11] provide powerful and user-friendly GIS platforms for spatial management applications.
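A very simple coordination baseline, not taken from the source but named plainly here, is greedy assignment: repeatedly send the nearest free volunteer to the neediest unvisited neighbourhood. The Python sketch below illustrates this; the coordinates and need scores are hypothetical, and a real planner would use road network distances rather than Euclidean ones:

```python
from math import dist

def assign(volunteers, neighbourhoods):
    """Greedy assignment: volunteers is {name: (x, y)};
    neighbourhoods is {name: ((x, y), need_score)}."""
    plan, free = {}, dict(volunteers)
    for area, (loc, _need) in sorted(neighbourhoods.items(),
                                     key=lambda kv: -kv[1][1]):   # neediest first
        if not free:
            break
        nearest = min(free, key=lambda v: dist(free[v], loc))     # closest volunteer
        plan[area] = nearest
        del free[nearest]   # each volunteer covers one area, avoiding repeat visits
    return plan

print(assign({"ann": (0, 0), "bob": (5, 5)},
             {"north": ((1, 1), 0.9), "south": ((6, 4), 0.7)}))
```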

This module supports customised delivery of alerts to specific sub-populations based on location, the current status of the incident, its expected effects and propagation, the status of various locations, needs, etc. A flexible policy definition mechanism allows customisation based on location, geographical information and type of event [44]. The policy determines how often an individual user may be reached as the event evolves and the protective actions change.

5 Conclusion and Future Work

In this paper we have reviewed a number of technologies and systems for emergency management. To our knowledge, none of the existing disaster management systems supports the integration of data from sensor networks and social networks. We have proposed an architecture that combines a number of existing technologies to achieve such integration.

Some of the functionalities of our proposed architecture have been implemented by the WiLIFE project (Tecnologie WireLess e ICT per un efFiciente e integrato sistema per la prevenzione e gestione delle situazioni di crisi e delle Emergenze, i.e. wireless and ICT technologies for an efficient, integrated system for the prevention and management of crisis and emergency situations — http://www.wilife-project.it/) during 2012–2015 [50]. With respect to the architecture proposed in Sect. 4, the WiLIFE project produced: an implementation of the Event Collection Module that supports the integration of events from a wireless sensor network with data from social networks; some functionalities of the Event Extraction and Analysis Module, including some level of geo-localisation of the extracted data; the data presentation aspects of the Decision Support Module; and, as part of the Alert and Command Dissemination Module, the dissemination of commands to public service operators and of alerts to Twitter users, also through a mobile app available for Android platforms. The implementation of further functionalities of the Alert and Command Dissemination Module and of the prediction aspects of the Decision Support Module is part of our future work, subject to funding availability.