Abstract
After the occurrence of a natural disaster, it is of paramount importance to take efficient measures to reduce the casualties and damage to infrastructure. Resource allocation is a generic problem of assigning available resources to the affected areas to cope with the devastation caused by the disaster. To mitigate the deadly effect of a natural disaster, different resources are essential at the emergency sites. Disaster response activities also need the assignment of various critical tasks to be carried out by different emergency workers at the local level. The individual emergency locations convey their demands for resources and required services to the higher-level authorities. Depending on availability, the higher-level authority allocates resources through successive lower levels to the emergency sites. This paper proposes a model for the hierarchical flow of different resources during disaster management in the Indian context, from the top-level authority to the lower levels. This hierarchical architecture also incorporates the allocation of different essential tasks at the ground level to reduce the effect of a natural disaster locally.
1 Introduction
Due to its climatic, geographical, and geological characteristics, India is one of the countries in South Asia worst affected by various natural disasters. In order to build resilience to natural disasters, the Disaster Management Act was passed in India in 2005 [1]. Under this act, the National Disaster Management Authority (NDMA) and the National Disaster Response Force (NDRF) were established. NDMA, headed by the Prime Minister of India, is responsible for making policies and plans for disaster management to ensure efficient disaster response. NDRF, on the other hand, is a specialized force working under the supervision of NDMA to handle disaster situations. A practical understanding of disaster management in India is provided in [2]. As a country of numerous rivers with tributaries and a prolonged monsoon (June–September) with extreme rainfall in different geographic regions, India's most common natural disaster is flood. After Bangladesh, it is one of the most flood-prone countries in the world: about 40 million hectares of land, around one-eighth of the country's overall geographical area, are prone to floods [3]. The annual average flood-affected area is 7.6 million hectares [4]. Roughly 30 million people in India are affected by floods, and more than 1500 lives are lost each year, which is about one-fifth of the global death count caused by floods [5, 6]. According to official statistics given by the Government of India, about 92,000 people lost their lives, and economic losses amounted to approximately $200 billion, due to flooding between 1953 and 2009 [4, 7]. Several other papers, such as [8, 9], present flood statistics for two important river basins in India. Considering all these facts, an efficient flood management system is essential in India.
In terms of casualties and damages, flood is one of the worst natural disasters. In order to mitigate the effect of floods in flood-prone geographical locations, the disaster management authorities need to implement actions toward flood management. Flood management involves various actions such as risk and vulnerability assessment, early warning systems, loss and damage assessment, and risk mitigation planning. In [10], flood risk management at different levels of planning and action has been discussed. In [11], a framework for less developed countries has been proposed to evaluate policies and strategies adopted to mitigate the effects of floods. The paper [12] covers the different phases involved in flood management, focusing on the flood-prone river basins of India. Traditional flood management using ground-based data such as rainfall and river discharge is neither time- nor cost-effective. Kourgialas and Karatzas [13] present an approach to determine flood-hazard regions by using various relevant geographical data obtained from Geographic Information System (GIS) modelling. In [14], the usage of spaceborne remote sensing for different aspects of flood management has been investigated. Two-stage stochastic programming to account for the randomness in flood risk management is discussed in [15]. The paper [16] presents a survey of the application of computational intelligence-based systems in flood management. Iqbal et al. [17] review articles on computer vision applied to different phases of management and planning after a flood. A few papers, such as [18,19,20], present early warning systems devised for flood management.
The casualties and economic losses caused by a natural disaster necessitate the immediate initiation of response and recovery activities. In a post-disaster scenario, the localities affected by the disaster require various critical resources and services to mitigate the local impact of the disaster. The disaster management authority allocates the available resources among emergency sites based on their requirements, subject to resource availability. Decision-making, during or after a natural disaster, can be very complex, considering its dynamics and severity. To automate the decision-making process, depending on various ground-truth parameters and useful data, a computer-based decision support system (DSS) can be highly effective for a quick and unbiased response to emergency sites. A DSS can support a disaster management authority with limited experience in making quick decisions based on a model developed from previous experience. Decision support systems for disaster management are reported in [21,22,23,24,25,26]. Wallace and De Balogh [26] conceptualize the operational, tactical, and strategic decision-making for different natural and man-made disasters. In [21], the concepts of decision-making in various stages of disaster management have been discussed. Zhou et al. [27] provide an overview of the emergency decision-making theory and methods for natural disasters from a methodological perspective. Sati [25] presents a scalable computing environment for an emergency response DSS to sustain response activities during power and network interruptions. Newman et al. [28] present a detailed review of decision support systems for natural hazard risk reduction. Aifadopoulou et al. [29] develop a web-based, GIS-enabled intelligent DSS to implement protection and management measures that optimally address transport networks and infrastructures.
Zamanifar and Hartmann [30] present a systematic case study of a structured framework to suggest decision attributes for disaster recovery planning of transportation networks. A DSS for enhancing performance in wildfire suppression is presented in [31] using multi-sensor technologies and GIS functionalities. An integrated DSS for risk assessment associated with natural disasters is provided in [32]. Yaşa et al. [33] deal with the clearance of debris in a post-disaster scenario to maximize road network accessibility by using a stochastic mathematical model; meta-heuristic solutions are used to solve this problem for large-scale networks. The paper [34] addresses evacuation modelling in disaster scenarios, using betweenness centrality to minimize the number of people waiting for rescue. Korkou et al. [35] focus on developing a humanitarian logistics framework to minimize the losses due to supply shortage, with meta-heuristic optimization algorithms used to solve the logistics problem. A smartphone-based application for simulating flood disaster evacuation is presented in [36]. In general, decision tools are developed for the different phases of disaster management: prevention, preparedness, response, and recovery. This paper mainly focuses on developing a DSS for resource allocation in the response stage of disaster management.
Decision support systems for resource allocation during a disaster are reported in the literature, e.g., [37, 38]. Kondaveti and Ganz [37] develop a decision support system for resource allocation in three phases: victim clustering, resource allocation, and resource deployment. Hashemipour et al. [39] describe a framework based on a multi-agent coordination simulation-based decision support system; the system helps response managers in a community-based response operation test and evaluate all possible team design configurations and select the highest-performing team. Li et al. [40] and Wang and Zhang [41] use agent-based frameworks for the distribution of resources during a disaster scenario. Othman et al. [42] propose a multi-agent architecture for the management of emergency supply chains (ESC), in which a DSS states and solves the scheduling problem for the delivery of resources from the supply zones to the crisis-affected areas. Sepúlveda et al. [43] and Sepúlveda and Bull [44] report model-driven DSSs for vehicle routing to distribute relief supplies in natural disaster situations.
Resource allocation in an emergency scenario is reported in various works, such as [45,46,47,48], using different optimization methodologies. Wang et al. [47] present a multi-objective cellular genetic algorithm (MOCGA) for resource allocation in a post-disaster scenario. In [49], a project management and scheduling problem to assign personnel to different disaster locations is formulated and solved by a hybrid meta-heuristic algorithm. A vehicle scheduling and routing problem during an emergency is modelled with integer linear programming in [50] to minimize the total transportation cost of conducting the necessary activities. A post-disaster allocation of limited repair crews for the recovery of infrastructure networks is proposed by combining agent-based modelling and reinforcement learning [51]. In [52], agent-based modelling is used for resource allocation, incorporating relief urgency and the behaviour of different aid carriers and providers. A relief urgency index is computed from qualitative and quantitative factors such as time-varying demand, population density, the ratio of frail population, damage condition, and the time elapsed since the last delivery, and this index is used to improve relief distribution after a large-scale disaster. However, this framework does not consider the weighting of the different factors, and the scalability of the approach is not reported. Resource allocation during simultaneous disasters is addressed using stochastic optimization techniques in [53], where insufficient national resources are to be allocated among the disaster regions.
In [54], a resource allocation problem in a limited-resource scenario is formulated as a non-cooperative game with the disaster locations as the players, where the Nash equilibrium gives the solution of the game, and a mathematical analysis shows that a pure-strategy Nash equilibrium always exists for the game. Nagurney et al. [55] present a supply chain for disaster relief operations by multiple non-governmental organizations (NGOs) using a generalized Nash equilibrium-based game-theoretic framework; the NGOs compete with each other for financial funds from donors and then supply relief materials to the disaster victims. Wang et al. [56] propose a multi-period allocation of scarce resources in a post-disaster scenario while maintaining equity. Wang et al. [57] present a hierarchical refugee evacuation scenario using multi-objective optimization, where refugees are classified into different hierarchy levels. In [58], a systematic review of articles focused on disaster relief logistics is provided. Nappi and Souza [59] propose the selection of temporary shelters based on a hierarchy of criteria and sub-criteria.
As most of the administrative structure of the government system is hierarchical in nature, the resource allocation framework should reflect the hierarchical structure of resource allocation and distribution. In general, resources are propagated from the top-level authority to the emergency sites through various intermediate levels. Hierarchical approaches to resource distribution are reported in [60,61,62]. Ghaffari et al. [60] propose resource distribution over a supply chain network with multiple customers using particle swarm optimization and mixed-integer programming. Widener and Horner [62] propose a capacitated-median model hierarchical method for deciding the size and placement of relief locations under the varying demand of the people during hurricane relief distribution. Özdamar and Demir [61] propose a multi-level clustering algorithm-based hierarchical cluster and route procedure (HOGCR) for efficient vehicle routing that aims to minimize travel time; demand nodes are grouped into hierarchical clusters of parent and child nodes, and the routing problem is solved as a capacitated network flow problem. A priority-based hierarchical model is discussed in [63] for the distribution of emergency goods in disaster logistics.
In this paper, we propose a hierarchical resource and task allocation architecture for the allocation of resources among the different disaster control centres during floods. In general, resources are allocated from top-level to bottom-level control centres based on the resource requirements at the various emergency sites. The different resources can be disaster management teams, UAVs, transport vehicles, relief materials, medical teams, etc. Resource allocation at the upper level is performed based on the total available resources, the resource requirements, and the priority of the next-level crisis locations. The crisis locations need various resources to conduct disaster response tasks. The priorities of these crisis locations are decided based on their population density, disaster-affected area, level of disaster, the static and dynamic conditions of the road network, etc. At the bottom level, apart from allocating relief supplies to different emergency sites, many other tasks, such as search and exploration and evacuation operations, are also executed. We develop a task allocation architecture in which all tasks, including resource allocation, can be handled at the bottom level. The task allocation is performed based on the resource requirements of the task, the priority of the task, the task location, and the time required to reach the task location. We consider the administrative structure of India for the development of the hierarchical allocation architecture. The different administrative layers of the Indian government are the Centre, State, District, Block, and Village governments [64, 65]. In the Indian context, in the case of a flood, resources are allocated from the state level to the block level through the district level, and rescue operations are performed directly at the block level as per the needs of the emergency sites (ES).
The main contribution and significance of this work may be summarized as follows:
1. Development of a general framework for a scalable hierarchical resource allocation architecture.

2. Integration of resource and task allocation architectures for developing a decision support system (DSS) for disaster management.
The remainder of this paper is organized as follows. Section 2 describes the general formulation of resource and task allocation architecture. The proposed architecture in the context of the flood is described in Sect. 3. Sections 4 and 5 describe the resource and task allocation algorithm framework. A detailed example of resource and task allocation is presented in Sect. 6. The software framework of the proposed architecture is described in Sect. 7. Concluding remarks are given in Sect. 8.
2 Resource and Task Allocation Architecture
In this section, a detailed resource and task allocation framework for managing the needs during a disaster is discussed. In general, disaster management authorities allocate various resources among the affected regions/units based on the availability of the resources at an upper administrative level and the demand at the lowest administrative level. In the proposed framework, the affected units are the crisis locations, and the different affected units are assumed to be under the jurisdiction of the same administration, that is, under the same state, district, or block. Emergency response is usually a hierarchical process involving interaction between various agencies, as mentioned in [66] among many other similar works. In our work, we identify a three-layer hierarchical framework, as depicted in Fig. 1. The three hierarchical layers considered are the top-level control centre (TCC), the middle-level control centre (MCC), and the low-level control centre (LCC). Higher authorities such as the TCC allocate resources to lower levels based on disaster information such as flood maps/road network maps, resource demands, and resource availability. Resources are distributed to affected people at the lowest level, and information about the status of the road, the water level, and many other parameters is passed to the higher authority. At the LCC level, apart from resource allocation, rescue operations such as surveillance and survivor detection are performed.
A typical resource allocation architecture in a disaster management scenario should be scalable to large-scale operations and flexible enough to accommodate varying demand and supply conditions. The resource allocation framework should also be able to accommodate a multi-layer organizational structure to handle a large-scale disaster scenario. The proposed framework considers the allocation of resources and other tasks related to rescue operations, such as search and rescue, in an integrated hierarchical manner. The hierarchical layers can differ based on the administrative structure of the country. The overall architecture, in a nutshell, takes inputs from the user about the resources, demands, and specifications of the affected units and generates the final resource allocation for each affected unit.
A typical three-tier hierarchical resource allocation scenario proposed in this paper is shown in Figs. 2 and 3. The resource pool is generally possessed by the top-level authority. Resources are distributed from the TCC to the next hierarchical levels (MCC, LCC, emergency sites), forming the supply chain networks. When a natural disaster occurs over a vast area in a state, the authorities of the different affected units/emergency locations (ES) communicate their respective demands for resources and the level of crisis to the immediately higher level, i.e. the LCC. Such resource requests from the different LCCs are aggregated and conveyed to the MCC. Similarly, the resources required by the different MCCs are considered at the TCC. The TCC allocates resources to the MCCs based on factors such as the availability and requirements of resources and the priority of the affected units. The priority of each unit is represented through its weightage. The weightage of each affected unit is decided based on the population density, disaster-affected area, level of disaster, the static and dynamic conditions of the road network, etc. The same principle applies to the allocation from the MCC to the LCC level.
At the LCC, the overall tasks involve relief supply, evacuation, search and rescue, and survivor tracking, among many others. These tasks can be dynamic or static. Tasks such as exploring an area are considered static since the object of interest and its location do not change with time. On the other hand, tasks such as survivor tracking are considered dynamic since the survivors may move and change their location, or the number of survivors can change with time. At the LCC, the overall disaster-related work is performed in a systematic task allocation framework to handle critical tasks in a resource-constrained scenario. The task allocation is performed based on requirement, priority, and task location. In Fig. 2, LCC-\(i\hat{i}\) denotes the \(\hat{i}^{\text {th}}\) LCC under the \(i^{\text {th}}\) MCC. Similarly, ES-\(i \hat{i} \tilde{i}\) denotes the \(\tilde{i}^{\text {th}}\) emergency site that comes under the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC.
3 Flood Management
The proposed hierarchical framework is developed for relief and rescue operations during a flood scenario. The detailed architecture is shown in Fig. 4. In the case of flood scenarios, the different resources include disaster management teams, unmanned aerial vehicles (UAVs), transport vehicles, relief materials, and medical teams. Different tasks, such as search and rescue operations, evacuation, and survivor tracking, are performed at the local levels. The resource requirements of the different units are processed to obtain the total requirement. The authority considers both prior and current information about the immediately lower-level locations to decide the allocation matrix at each level. The predicted information about road networks and the water level is also considered for decision-making. As shown in Fig. 4, in the proposed hierarchical architecture, the information about the crisis locations goes from the bottom level to the top level, whereas the allocation of resources/tasks is performed from the top to the bottom level. In Fig. 4, the arrows marked in red, green, black, blue, and teal indicate current information, resources, prior information, tasks, and forecasted information, respectively.
In the case of the TCC, the prior information about an MCC includes population, disaster handling capability, existing resources, etc., whereas the current information includes the level of devastation, the affected flooded area, and the resource requirements. Disaster handling capability is the capability of a crisis location to handle a disaster, and it depends on the existing infrastructure and demography of an area. As shown in Fig. 4, the resources at the TCC are allocated to two different MCC locations, MCC-1 and MCC-2. At the lower level, resource allocation is performed considering more detailed information about the affected units. At the MCC level, the prior information about LCC locations includes population density, economic level, disaster handling capability, demography, existing resources, and the existing supply chain. At this stage, the level of devastation, the affected area, the status of the road network, and the water level of the different LCCs are considered for arriving at the allocation matrix. In Fig. 4, the resources allocated to MCC-1 are distributed to LCC-11 and LCC-12. Similarly, the allocated resources of MCC-2 are distributed to LCC-21 and LCC-22. At the lowest level, different tasks are performed based on the priority of the task, and the proportion of relief materials is decided based on the population, demography, economic level, and road network of the emergency sites. In Fig. 4, emergency site ES-111 belongs to LCC-11, ES-121 belongs to LCC-12, ES-211 and ES-212 belong to LCC-21, ES-221, ES-222, and ES-223 belong to LCC-22, and TB 11-1 refers to task-1 of LCC-11.
4 Resource Allocation
The resource and task allocation architecture for a three-tier system is discussed in this section. The proposed scheme is scalable to a multi-layer architecture. In the three-tier architecture, the TCC allocates resources to different MCC units, and each MCC unit allocates to different LCC units.
4.1 Resource Allocation Framework
Let \(R_{1}^{T},\ldots , R_{n}^{T}\) be the different resources available at the TCC level for allocation to m different MCC units, say, \(M_{1},\ldots , M_{m}\). The requirement of the \(i^{th}\) MCC \((M_i)\) for the \(j^{th}\) resource \((R_j^T)\) is denoted as \(R_{ij}^{M}\). The importance of allocating resources to an MCC often depends on various factors such as population density \((\rho )\), disaster-affected area (a), and level of disaster \((\gamma )\), among others. To account for such factors, it is assumed that there are \(\eta\) such factors at each \(i^{th}\) MCC, denoted by \(f_{ik}\), where \(k=1, \ldots , \eta\).
To account for the relative importance of these different factors, each of them is assigned a normalized weight \(w_{k}\) such that the weights sum to 1. The weight \(w_k\) is considered to be the same for each MCC. Then, the weight \(p_i^M\) of the \(i^{\text {th}}\) MCC unit is,
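Since the body of Eq. (1) is not reproduced here, the following sketch encodes the weight expression implied by the normalization argument of Sect. 4.2, namely \(p_i^M = \sum_{k=1}^{\eta} w_k\, f_{ik} / \sum_{i=1}^{m} f_{ik}\); the function and variable names are illustrative, not taken from the paper.

```python
# Sketch of the MCC weight computation, reconstructed from the
# normalization argument in Sect. 4.2 (an assumption, not the paper's code).
def mcc_weights(factors, w):
    """factors[i][k]: value f_ik of factor k at MCC i.
    w[k]: normalized factor weight, with sum(w) == 1.
    Returns the weight p_i^M of each MCC unit."""
    m, eta = len(factors), len(w)
    # Column totals: the sum of each factor over all MCC units.
    totals = [sum(factors[i][k] for i in range(m)) for k in range(eta)]
    return [sum(w[k] * factors[i][k] / totals[k] for k in range(eta))
            for i in range(m)]
```

For example, with two MCCs whose factor values (population density, affected area) are (100, 20) and (300, 30), and factor weights (0.6, 0.4), the unit weights come out as 0.31 and 0.69, which sum to 1 as required by Eq. (11).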
A typical example of weight calculation is provided in Sect. 6.1. The resource allocation vector of \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC is as follows:
The quantity of a resource allocated at the TCC level is the minimum of the quantity of available resources and the total requirement from the different MCC units. The allocation of the \(j^{\text {th}}\) resource at the TCC level is calculated as follows:
Then, the maximum possible allocation of the \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC is calculated as follows.
Let us consider that the \(i^{\text {th}}\) MCC has \(l_{i}\) LCC units indexed by \(\hat{i} = 1, \ldots , l_{i}\), and that the requirement of the \(j^{\text {th}}\) resource by the \(\hat{i}^{\text {th}}\) LCC (\(L_{\hat{i}}\)) is \(R_{\hat{i}j}^{M_{i}L}\), where \(\sum _{\hat{i}=1}^{l_{i}} R_{\hat{i}j}^{M_{i}L} =R_{ij}^{M}\). Then, the allocation of the \(j^{\text {th}}\) resource to the \(i^{\text {th}}\) MCC is calculated as follows:
The resources allocated to the MCC level are further distributed to the LCC level. Let us consider the allocation of resources at the \(i^{\text {th}}\) MCC. Let the weight of the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC be \(p_{i\hat{i}}^{L}\), and let there be \(\hat{\eta }\) different factors that affect the allocation at the MCC level. The value of the \(\hat{k}^\text {th}\) factor of the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC unit is \(f_{i\hat{i}\hat{k}}\). The weight given to the \(\hat{k}^\text {th}\) factor is \(w_{\hat{k}}\), with \(\sum _{\hat{k}=1}^{\hat{\eta }} w_{\hat{k}} =1\). Then, the weight of the \(\hat{i}^{\text {th}}\) LCC unit is,
If the resource allocation vector of the \(j^{\text {th}}\) resource for the \(\hat{i}^{\text {th}}\) LCC is
then the allocation of the \(j^{\text {th}}\) resource for the \(\hat{i}^{\text {th}}\) LCC unit of the \(i^{\text {th}}\) MCC is
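The bodies of the allocation equations are not reproduced here, so the following sketch shows one allocation step consistent with the surrounding description: the allocable pool is the minimum of availability and total requirement, each unit receives its share of the pool, and no unit receives more than it requested (consistent with the sensitivity cases in Sect. 4.5). The exact form of the share \(W_{ij}^M\) is an assumption; it is taken as a normalized allocation weight.

```python
# One allocation step of resource j at the TCC level (a sketch; the exact
# share W_ij^M from Eq. (2) is assumed here to be normalized to 1).
def allocate(avail_j, demands_j, shares_j):
    """avail_j: R_j^T; demands_j[i]: R_ij^M; shares_j[i]: W_ij^M.
    Returns the allocation A_ij^M for each MCC unit."""
    pool = min(avail_j, sum(demands_j))   # allocable quantity at the TCC
    # Each unit gets its share of the pool, capped at its own demand.
    return [min(d, pool * s) for d, s in zip(demands_j, shares_j)]
```

For example, with 100 units available, demands of 80 and 60, and equal shares, each MCC receives 50; with 200 units available, the pool shrinks to the total demand of 140, and the units receive 70 and 60. The same step applies unchanged to the MCC-to-LCC layer.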
4.2 Allocation Weightage
The weights (\(w_{k}\) and \(w_{\hat{k}}\)) of the different factors used at the different levels (MCC, LCC) are determined based on the relevant factors and can be decided by the disaster management authority. This decision can benefit from experience with earlier similar disasters. From (1), it can be shown that the sum of the weights of all MCCs is 1. The total weight of the m MCC units is:
Since \(\sum _{i=1}^{m} \frac{ f_{ik} }{\sum _{i=1}^{m} f_{ik}} = 1\), we can write
Because the normalized weights \(w_k,~k=1,\ldots ,\eta\) sum up to 1 by definition, Eq. (10) becomes
Similarly, the sum of the weights of all LCC units can also be shown as
Considering Eqs. (11) and (12), the weight of each crisis location is less than one, and the sum of the weights of all crisis locations is one. The proposed resource allocation framework does not pose any restrictions on the value of each factor (\(f_{ik}\), \(f_{i\hat{i}\hat{k}}\)), the number of factors (\(\eta\), \(\hat{\eta }\)), or the number of crisis locations. The calculated weight of each crisis location is normalized and non-dimensional, and since the weights sum to one, the total resources are fully distributed among the crisis locations. The overall architecture is therefore scalable in terms of the number of locations, the number of factors considered in the allocation process, and the values of these factors.
4.3 Computation of Demand Based on Forecast
The basic demand (\(D_{j}^{b}\)) of the \(j^{\text {th}}\) resource from any unit is the demand placed by the unit based on its resource requirements; it is then adjusted based on the forecast of the disaster level and road network. The disaster level is a measure of the severity of a disaster-affected area. The overall demand (\(D_{j}^{o}\)) of the \(j^{\text {th}}\) resource is calculated from the basic demand and the excess demand derived from the forecast information.
Let there be r forecasted parameters that affect the allocation. The value of the \(n^{\text {th}}\) forecasted parameter of the \(i^{\text {th}}\) MCC is \(E_{in}^{M}\). The relative criticality of the \(n^{\text {th}}\) parameter of the \(i^{\text {th}}\) MCC \((C_{in}^{M})\) is calculated as,
where \(E_{n}^{\text {max} }\) is the maximum value of the \(n^{\text {th}}\) parameter. Let the basic demand of the \(j^{\text {th}}\) resource of the \(i^{\text {th}}\) MCC be \(D_{ij}^{b}\). Then, the excess demand (\(D_{ij}^{e}\)) of a resource is calculated as follows,
where \(\zeta _{n}\) is the relative weight given to each of the r parameters. If at most X per cent of the basic demand is added to accommodate the forecasted values, then,
The overall demand of \(j^{\text {th}}\) resource of \(i^{\text {th}}\) MCC \((D_{ij}^{o})\) is calculated as,
Clearly, the maximum value of the overall demand is \((1+\frac{X}{100})D_{ij}^{b}\), and the minimum value is \(D_{ij}^{b}\). Similarly, the demand modification based on the forecast can be extended to the other layers.
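Since Eqs. (13)–(16) are not reproduced here, the sketch below assumes one plausible combination rule, \(D_{ij}^{e} = \frac{X}{100}\, D_{ij}^{b} \sum_{n} \zeta_n C_{in}^{M}\) with \(D_{ij}^{o} = D_{ij}^{b} + D_{ij}^{e}\), which respects the stated bounds \(D_{ij}^{b} \le D_{ij}^{o} \le (1+\frac{X}{100}) D_{ij}^{b}\) whenever \(\sum_{n} \zeta_n C_{in}^{M} \le 1\). This is an assumption, not necessarily the paper's exact formula.

```python
# Forecast-adjusted demand of one resource for one unit (a sketch under the
# assumed combination rule stated in the lead-in above).
def overall_demand(d_basic, forecasts, maxima, zeta, X):
    """d_basic: D^b; forecasts[n]: E_n; maxima[n]: E_n^max;
    zeta[n]: relative parameter weights; X: max extra demand in per cent."""
    crit = [e / emax for e, emax in zip(forecasts, maxima)]  # C_n = E_n / E_n^max
    excess = (X / 100.0) * d_basic * sum(z * c for z, c in zip(zeta, crit))
    return d_basic + excess                                  # D^o = D^b + D^e
```

With the example values of this section (disaster-level ratio 2 out of a maximum 3, a hypothetical road-status ratio of 3, \(\zeta = (0.15, 0.10)\), X = 25) and a basic demand of 100 units, this rule yields an overall demand of 105.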
Let us consider an example where allocation at the TCC level is performed for two MCC units. Let the ratio of the predicted disaster level to the current disaster level and the ratio of the predicted road network status to the current road network status be used to calculate the excess demand. It is assumed that higher values of these parameters correspond to higher severity, and the minimum value of each ratio is assumed to be 1. The basic demands of the two MCC units and the parameter values are shown in Table 1.
In this case, we have two forecast parameters: the disaster level \((E_{1})\) and the road network status \((E_{2})\). The maximum values of \(E_{1}\) and \(E_{2}\) are \(E_{1}^{\text {max}}\) and \(E_{2}^{\text {max}}\), both equal to 3 here. Therefore, the relative criticality of the disaster level for MCC-1 is \(\frac{2}{3}\). Similarly, the relative criticality of every forecast parameter for each MCC unit is derived. Let the value of X be 25, that is, a maximum of 25% of the basic demand is considered to accommodate the future scenario. The relative weights (\(\zeta _{n}\)) of the parameters are taken as 0.15 and 0.10. Using Eq. (14), the excess demand \((D_{1j}^{e})\) of the \(j^{\text {th}}\) resource for MCC-1 is calculated as follows.
So, the overall demand of \(j^{\text {th}}\) resource for MCC-1 is
Similarly, the overall demand of MCC-2 is calculated. The calculations of overall demand are presented in Table 2.
4.4 Algorithm
The overall algorithm for resource allocation at the TCC level is shown in Algorithm 1. The algorithm is executed iteratively. The algorithm for resource allocation at the MCC level has equivalent steps.
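Algorithm 1 appears only as a figure, so the sketch below illustrates one plausible reading of the iterative procedure: allocate the pool in proportion to the unit weights, cap each grant at the unit's remaining demand, and redistribute any surplus among the still-unsatisfied units in subsequent iterations. The surplus-redistribution rule is an assumption.

```python
# Iterative TCC-level allocation of one resource (a sketch; the surplus
# redistribution rule is an assumption, not taken verbatim from Algorithm 1).
def allocate_iterative(avail_j, demands_j, weights):
    """avail_j: R_j^T; demands_j[i]: R_ij^M; weights[i]: unit weight p_i^M."""
    alloc = [0.0] * len(demands_j)
    pool = min(avail_j, sum(demands_j))          # allocable quantity, Eq. (3)
    active = [i for i, d in enumerate(demands_j) if d > 0]
    while pool > 1e-9 and active:
        wsum = sum(weights[i] for i in active)
        granted = 0.0
        for i in active:
            share = pool * weights[i] / wsum     # weight-proportional share
            grant = min(share, demands_j[i] - alloc[i])  # cap at remaining demand
            alloc[i] += grant
            granted += grant
        pool -= granted
        # Units whose demand is fully met drop out; surplus goes to the rest.
        active = [i for i in active if demands_j[i] - alloc[i] > 1e-9]
        if granted <= 1e-12:
            break
    return alloc
```

For example, with 140 units available, demands of 80 and 60, and equal weights, the first round grants 70 and 60, and the second round redistributes the surplus of 10 to the first unit, fully satisfying both demands.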
4.5 Allocation Sensitivity
In this section, the sensitivity of the resource allocation at the TCC level with respect to the resource requirements at the MCC level and the resource availability at the TCC level is calculated. The sensitivity of the allocation of the \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC at the TCC level with respect to the requirement of the \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC is \(\frac{\partial A_{ij}^{M} }{ \partial R_{ij}^{M}}\). Similarly, the sensitivity of the allocation with respect to the availability of the \(j^{\text {th}}\) resource at the TCC is \(\frac{\partial A_{ij}^{M} }{ \partial R_{j}^{T}}.\)
Case 1
The available quantity of the resource at the TCC is lower than the sum of the individual demands of the different MCCs; i.e. \((R_{j}^{T} \le \sum _{i=1}^{m} R_{ij}^{M} )\) and \((R_{j}^{T} W_{ij}^{M} \le R_{ij}^{M} )\). Then, from (5),
Therefore,
Case 2
The available resources at the TCC are higher than the sum of the individual demands of the different MCCs; i.e. \((R_{j}^{T} \ge \sum _{i=1}^{m} R_{ij}^{M} )\) and \((\sum _{i=1}^{m} R_{ij}^{M} W_{ij}^{M} \le R_{ij}^{M} )\). Then, using (5),
Case 3
\(R_{ij}^M \le \text {min}( R_{j}^{T}, \sum _{i=1}^{m} R_{ij}^{M} ) W_{ij}^{M}\). Then,
In all the cases, the allocation sensitivity of resources with respect to requirements at the higher level is nonzero; therefore, the allocation of resources at the TCC level depends on the demand of resources by the individual MCCs. In Case 3, the allocation sensitivity with respect to requirements is 1, as this represents the situation where the proportional demand of an MCC is comparatively lower than that of the other locations and resources are available at the TCC level. Also, the allocation sensitivity with respect to availability is nonzero only if the available resources at the TCC are not sufficient to cater to all the demands of the individual MCCs. The same results can be extended to allocation at the other layers.
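The three cases can be checked numerically with a small sketch. Eq. (5) is not shown in this excerpt, so the min() form of the allocation rule below, and the use of the weighted demand share for \(W_{ij}^{M}\), are assumptions inferred from the case conditions; all names are illustrative.

```python
# Hedged sketch of the TCC-level allocation rule inferred from the three
# cases above; Eq. (5) is not reproduced here, so both the min() form and
# the weighted demand share W_ij are assumptions.

def allocation(R_T, demands, weights):
    """Allocate R_T units of one resource among MCCs.

    demands[i]: requirement R_ij of MCC i; weights[i]: priority weight p_i.
    W_ij is taken as the weighted demand share (assumption)."""
    total_w = sum(p * r for p, r in zip(weights, demands))
    W = [p * r / total_w for p, r in zip(weights, demands)]
    pool = min(R_T, sum(demands))  # never hand out more than is asked for
    return [min(r, pool * w) for r, w in zip(demands, W)]

# Case 1 (shortage): available < total demand -> proportional shares.
short = allocation(60, [50, 50], [0.5, 0.5])
# Cases 2/3 (surplus): available >= total demand -> demands can be met.
full = allocation(200, [50, 50], [0.5, 0.5])
```

Under this form, each allocation depends on the individual demands (nonzero sensitivity with respect to requirements), and it depends on \(R_{j}^{T}\) only in the shortage case, which is consistent with the discussion above.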
5 Task Allocation at LCC Level
Every task is defined in terms of the different types of resources required to accomplish it. For a flood, it is assumed that each task will require ground vehicles (GV), unmanned aerial vehicles (UAV), and boats (B) to perform operations such as relief supply, surveillance, and survivor tracking. So, a task \(T_{x}\) which requires \(T_{x_{GV}}\) GVs, \(T_{x_{UAV}}\) UAVs, and \(T_{x_{B}}\) boats is defined as,
Similarly, the priority of each task is defined based on the criticality of the sub-tasks involved. In this case, each task is composed of different sub-tasks: normal supply (NS), relief supply (RS), surveillance (SL), survivor tracking (ST), and critical supply (CS). The priority of a task (\(T_{x}^{p}\)) is defined as a vector consisting of the priorities of its sub-tasks,
where \(NS^{p}\), \(RS^{p}\), \(SL^{p}\), \(ST^{p}\), \(CS^{p}\) are the priorities associated with the sub-tasks NS, RS, SL, ST, CS. Let the priority of the \(r^{\text {th}}\) sub-task be \(P_{r}^{s}\). Then, for example, a task consisting of the sub-tasks surveillance and survivor tracking will have the following priority vector,
Let the priorities of the sub-tasks normal supply, relief supply, surveillance, survivor tracking, and critical supply be 1, 10, 50, 200, and 100, respectively. Let task \(T_{x_{1}}\) consist of surveillance and survivor tracking and require 10 UAVs, whereas task \(T_{x_{2}}\) consists of relief supply and requires 5 GVs and 3 boats. Then, tasks \(T_{x_{1}}\) and \(T_{x_{2}}\) are defined as \(T_{x_{1}}=[0,0,10]\) and \(T_{x_{2}}=[5,0,3]\), respectively. The priorities of \(T_{x_{1}}\) and \(T_{x_{2}}\) are defined as \(P_{ T_{x_{1}}}= [0,0, 50, 200, 0]\) and \(P_{ T_{x_{2}}}= [0, 10,0,0,0]\). Resources are allocated from the available resources to the tasks with high priority, and the norm of the priority vector is used to sort the tasks. As the norm of \(P_{ T_{x_{1}}}\) is higher than that of \(P_{ T_{x_{2}}}\), task \(T_{x_{1}}\) has higher priority than \(T_{x_{2}}\). Therefore, if all the resources required to execute \(T_{x_{1}}\) are available, \(T_{x_{1}}\) will be executed prior to \(T_{x_{2}}\). If the priorities are equal, the distance from the resource location to the task location is considered for the resource allocation. This distance is calculated based on the currently available network; it can be derived more accurately using the dynamic condition of the road network, considering the predicted water level and the extent of the damage. Algorithm 2 gives the details of the task allocation algorithm.
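The priority-vector construction and norm-based ordering in the worked example above can be sketched as follows; the dictionary and variable names are illustrative, not from the paper.

```python
import math

# Sketch of the priority ordering described above: each task carries a
# priority vector over the sub-tasks (NS, RS, SL, ST, CS), and tasks are
# served in decreasing order of the vector's Euclidean norm.

SUBTASK_PRIORITY = {"NS": 1, "RS": 10, "SL": 50, "ST": 200, "CS": 100}
ORDER = ["NS", "RS", "SL", "ST", "CS"]

def priority_vector(subtasks):
    """Build the priority vector of a task from its set of sub-tasks."""
    return [SUBTASK_PRIORITY[s] if s in subtasks else 0 for s in ORDER]

P_x1 = priority_vector({"SL", "ST"})  # surveillance + survivor tracking
P_x2 = priority_vector({"RS"})        # relief supply only

# ||P_x1|| = sqrt(50^2 + 200^2) exceeds ||P_x2|| = 10, so T_x1 goes first.
ranked = sorted([("T_x1", P_x1), ("T_x2", P_x2)],
                key=lambda t: math.hypot(*t[1]), reverse=True)
```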
6 Resource and Task Allocation Case Study
We consider an example of allocating resources during a disaster scenario caused by a flood in India. An overview of the administrative structure in India can be found in [64, 67]. The highest administrative layer of India is the central government (national level), headed by the Prime Minister of India. Different state governments, led by their respective chief ministers, fall under the central government. The subsequent layers in the hierarchy are districts, blocks, and villages. In our case study, we assume that the flood occurs in a particular state of India. This example represents the state-, district-, and block-level administrative hierarchy as TCC, MCC, and LCC. We consider the allocation of five distinct resources at the state (TCC) level to two districts denoted as \(D_{1}\) and \(D_{2}\) (MCCs) with different priorities. The allocated resources at each district level are then further distributed to three individual blocks (LCCs) following the principles proposed in Sect. 4. District \(D_{1}\) has three blocks (\(D_{1}\_B_{1}\), \(D_{1}\_B_{2}\), \(D_{1}\_B_{3}\)) and District \(D_{2}\) also has three blocks (\(D_{2}\_B_{1}\), \(D_{2}\_B_{2}\), \(D_{2}\_B_{3}\)).
6.1 Weightage Calculation
For simplicity, let us consider, at the TCC level, the population density (per sq km), disaster-affected area (\(\text {Km}^2\)), and disaster level of the MCC units for the weightage calculation of the different units. The disaster over an area can be classified into different levels; a typical classification adopted here ranges from 1 to 5, with Level 1 being the least severe and Level 5 the most severe. Based on the current water level, road network status, and rainfall information gathered from the lower-level units, the authorities decide the disaster level at the respective levels. The values of population density, disaster-affected area, and disaster level of each district and block are shown in Table 3.
The weights \((w_{k})\) associated with the different factors, namely population density (\(f_{i1}\)), affected area (\(f_{i2}\)), and level of disaster (\(f_{i3}\)), are fixed at 0.4, 0.4, and 0.2, respectively. Higher weights are assigned to population density and disaster-affected area, as these two factors directly drive the relief requirement of a given area. These weights are typically decided by the allocation authority based on suitable judgement. Using (1) and the values given in Table 3, the weights \(p_1^M\) and \(p_2^M\) for districts \(D_{1}\) and \(D_{2}\), respectively, are calculated as
It can be seen that the sum of the weights \(p_1^M\) and \(p_2^M\) is unity, as argued in Subsect. 4.2. Further, like the level of disaster, other factors such as the disaster-handling capability of an area can also be considered in the proposed framework, provided they are mapped to an equivalent quantitative factor satisfying the details in Subsects. 4.1 and 4.2. Similarly, based on the different parameters associated with the blocks given in Table 3, the weightage of each block under districts \(D_{1}\) and \(D_{2}\), respectively \(p_{1\hat{i}}^L\) and \(p_{2\hat{i}}^L\) for \(\hat{i} = 1,\ldots ,3\), is calculated. They are \(p_{11}^L\) = 0.4, \(p_{12}^L\) = 0.25, \(p_{13}^L\) = 0.35, \(p_{21}^L\) = 0.26, \(p_{22}^L\) = 0.48 and \(p_{23}^L\) = 0.26.
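The unit-weight computation can be sketched as below. Eq. (1) itself is not reproduced in this excerpt, so the normalised weighted sum used here (each factor divided by its total across units) is an assumption, chosen so that the unit weights sum to one as the text states; the factor values are placeholders, not those of Table 3.

```python
# Hedged sketch of the unit-weight computation, ASSUMING Eq. (1) has the
# form p_i = sum_k w_k * f_ik / sum_i' f_i'k. This normalised form
# guarantees sum_i p_i = sum_k w_k = 1.

def unit_weights(factors, w):
    """factors[i][k]: value of factor k for unit i; w[k]: factor weights."""
    col_sum = [sum(row[k] for row in factors) for k in range(len(w))]
    return [sum(w[k] * row[k] / col_sum[k] for k in range(len(w)))
            for row in factors]

# Two districts; factors: population density, affected area, disaster level.
# The factor values below are illustrative placeholders.
p = unit_weights([[800, 120, 3], [400, 80, 2]], [0.4, 0.4, 0.2])
```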
6.2 Availability and Requirement of Resources
The available quantities of the five distinct resources \((R_{j}^{T})\) at the state control centre (TCC) are given in Table 4. The requirements for each of the five resources at the different blocks (LCCs) of districts \(D_{1}\) and \(D_{2}\) are provided in Tables 5 and 6, respectively. The resources claimed by the individual units are modified using the excess demand considering the forecast scenario, as detailed in Subsect. 4.3.
The requirements of resources at the block level (LCC) are gathered at the district level (MCC). Further, the resource requirements from each of the districts are accumulated at the state level (TCC). The individual resource requirement of each district is the summation of the resources required by its individual blocks (shown in Tables 7 and 8).
6.3 Resource Allocation
The output of the allocation algorithm given in Algorithm 1 after two iterations is presented here. The allocation of resources to districts from the state at different stages of allocation is shown in Tables 9 and 10.
The allocation of resources to blocks from districts at different stages of allocation is shown in Tables 11 and 12.
The allocations to the different districts and blocks at the different allocation levels are presented through bar diagrams in Figs. 5, 6, 7, and 8. Here, the length of the bar represents the total demand, the blue fraction represents the allocated amount, and the red fraction represents the deficiency in allocation.
6.4 Task Allocation
After the allocation of resources at the TCC and MCC levels, each LCC unit needs to distribute these resources to the emergency sites. A task allocation scenario consisting of five tasks at the LCC level is presented in this section. Let us consider the case of block \(B_{1}\) of district \(D_{1}\) \((D_{1}\_B_{1})\), where the resources \({A_{\hat{i}1}^{M_{1}L}}\), \({A_{\hat{i}2}^{M_{1}L}}\), and \({A_{\hat{i}3}^{M_{1}L}}\) correspond to the numbers of GVs, UAVs, and boats, respectively. Each task related to rescue operations is initially divided into different sub-tasks, and the resources (GVs, UAVs, and boats) required to perform each task are given in Table 13. For example, task \((T_{1})\) is defined as \(T_{1}=[0,2,0]\). The five different sub-tasks are survivor tracking, relief supply, critical supply, surveillance, and normal supply. The priorities associated with the individual sub-tasks are given in Table 14.
The priority vector of each task is decided based on its composition of sub-tasks and the priorities of the individual sub-tasks (shown in Table 15). For example, the priority vector of task \(T_{1}\) is \(P_{T_{1}} = [0, 0, 0, 200, 0]\).
The tasks are sorted using the norm of the priority vector; in this case, the resulting sequence of tasks is \(P_{T_{1}}\), \(P_{T_{3}}\), \(P_{T_{4}}\), \(P_{T_{2}}\), \(P_{T_{5}}\). The actual allocation available at block \(B_{1}\) of district \(D_{1}\) from the resource allocation is shown in Table 16.
So, after the first level of allocation, the resources required to perform \(T_{1}\), \(T_{2}\), and \(T_{3}\) are available, and these tasks are executed. Although \(T_{4}\) has a higher priority than \(T_{2}\), sufficient resources are not available to execute \(T_{4}\): a sufficient number of UAVs (the resource \({A_{\hat{i}2}^{M_{1}L}}\)) is not available to perform \(T_{4}\) along with \(T_{1}\) and \(T_{2}\). After the second level of allocation, sufficient resources are still not available to perform the next high-priority task \(T_{4}\). So, the next high-priority task \(T_{5}\), which can be allocated the desired resources, is performed. In summary, the missions related to \(T_{1}\), \(T_{2}\), and \(T_{3}\) are executed initially, and after the second level of allocation, task \(T_{5}\) is also executed. Task \(T_{4}\) can be executed only after receiving sufficient resources during a future allocation. If the tasks are broken into smaller tasks, the utilization of the allocated resources can be increased.
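The greedy behaviour walked through above (a high-priority task being skipped when its resources are unavailable, while lower-priority tasks proceed) can be sketched as follows. Algorithm 2 is not reproduced in this excerpt, so this is an illustrative reconstruction, and the task data are placeholders, not the values of Tables 13-16.

```python
import math

# Sketch of the greedy task allocation described above: tasks are visited
# in decreasing priority-norm order and executed only if every required
# resource is still in stock; otherwise they are skipped and retried after
# the next allocation round. Task data below are illustrative.

def allocate_tasks(tasks, available):
    """tasks: {name: (requirement vector, priority vector)};
    available: remaining stock of each resource type (GV, UAV, B)."""
    executed = []
    for name, (req, _) in sorted(tasks.items(),
                                 key=lambda t: math.hypot(*t[1][1]),
                                 reverse=True):
        if all(r <= a for r, a in zip(req, available)):
            available = [a - r for r, a in zip(req, available)]
            executed.append(name)
    return executed, available

tasks = {
    "T1": ([0, 2, 0], [0, 0, 0, 200, 0]),   # survivor tracking
    "T4": ([0, 5, 1], [0, 0, 50, 0, 100]),  # surveillance + critical supply
    "T2": ([3, 0, 2], [0, 10, 0, 0, 0]),    # relief supply
}
done, left = allocate_tasks(tasks, [5, 4, 3])
```

Here T4 outranks T2 but is starved of UAVs after T1 executes, so only T1 and T2 run, mirroring the situation described in the text.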
6.5 Needs of Future Development of the Proposed DSS
A few issues in the proposed DSS framework are described below.
1. The task allocation is performed based on the current status of the available resources and task requirements. Resources are utilized to the maximum extent after each allocation cycle and are not stockpiled for future use. Tasks requiring a larger amount of resources might not get executed in this approach.
2. Allocation is performed based on the weights of the individual crisis locations, and the weight applies equally to each of the resource items. The critical demand for a particular resource item at a location with low weight might therefore not get priority in the allocation process.
These points will be addressed in our future work.
7 Software Implementation
The proposed resource allocation architecture is implemented in a software framework. The inputs to the framework are the existing resources, the demands of the different units, and their specifications; it outputs the resources to be allocated to each unit. The resources can be selected from a pre-defined database, and the user can also add a new resource. A typical software framework for resource allocation is shown in Fig. 9. The framework is integrated with a Graphical User Interface (GUI) using MATLAB for easy deployment at the ground level. Currently, the GUI is developed for five crisis locations and six factors; however, it is easily scalable to larger numbers of crisis locations/factors. In the “Home” tab of the GUI (shown in Fig. 10(a)), the backend program is run based on the user-selected crisis locations and number of factors. Currently, a virtual allocation scenario is shown in the GUI for two districts of the Indian state of Kerala, namely Ernakulam and Alapuzha, and the factors considered are affected area, population density, disaster level, and disaster handling capacity (as shown in Fig. 10(a)). In the “Specification” tab (shown in Fig. 10(b)), the factors associated with the crisis locations need to be provided. The overall weights of each district are calculated through this tab, where there is a provision for the user to enter custom weights. The quantities of the different resources available to the resource allocation authority are entered through the “Resources” tab in Fig. 10(c). The demands of the individual crisis locations are entered through the “Demand” tab (shown in Fig. 10(d)). The allocation of individual resources among the crisis locations can be obtained as given in Fig. 10(e). The overall allocation of the individual crisis locations can be obtained through the “Area wise Allocation” tab. The allocations for Ernakulam and Alapuzha in the current scenario are shown in Fig. 10(f) and (g), respectively.
A summary of the allocation as bar diagrams can be obtained through the “Graphs” tab (as shown in Fig. 10(h)).
8 Conclusions
This paper presents a hierarchical architecture for resource and task allocation during a flood-like disaster scenario. The proposed architecture is developed considering the flow of resources through the hierarchical administration of the government in a flood scenario. The allocation is performed based on the priorities of the units, derived from different factors such as population density and disaster level. The proposed framework is scalable to accommodate different factors affecting allocation and different levels of multiple hierarchical units. An example scenario of resource and task allocation for flood management, considering the administrative structure of India, is studied. The proposed framework can be applied to resource allocation during other similar large-scale natural disasters; however, the factors for priority determination will differ. Future work includes the development of a software system deployable for a large-scale natural disaster scenario for resource and task allocation. It will also address the limitations of the current DSS discussed in the paper.
Data Availability Statement
The authors declare that the data supporting the findings of this study are available within the article.
Abbreviations
- \(A_{j}^{T}\): The allocation of the \(j^{\text {th}}\) resource at TCC level
- \(A_{ij}^{M}\): The allocation of the \(j^{\text {th}}\) resource to the \(i^{\text {th}}\) MCC
- \(A_{\hat{i}j}^{M_{i}L}\): The allocation of the \(j^{\text {th}}\) resource for the \(\hat{i}^{\text {th}}\) LCC unit of the \(i^{\text {th}}\) MCC
- \(C_{in}^{M}\): The relative criticality of the \(n^{\text {th}}\) forecasted parameter of the \(i^{\text {th}}\) MCC
- \(D_{ij}^{b}\): Basic demand of the \(j^{\text {th}}\) resource of the \(i^{\text {th}}\) MCC considering forecast
- \(D_{ij}^{e}\): Excess demand of the \(j^{\text {th}}\) resource of the \(i^{\text {th}}\) MCC considering forecast
- \(D_{ij}^{o}\): Overall demand of the \(j^{\text {th}}\) resource of the \(i^{\text {th}}\) MCC considering forecast
- \(\eta\): Allocation factors such as population density and disaster level at TCC level
- \(\hat{\eta }\): Allocation factors at MCC level
- \(E_{in}^{M}\): The value of the \(n^{\text {th}}\) forecasted parameter of the \(i^{\text {th}}\) MCC
- \(E_{n}^{\text {max}}\): The maximum value of the \(n^{\text {th}}\) forecasted parameter
- \(f_{ik}\): The value of the \(k^{\text {th}}\) factor of the \(i^{\text {th}}\) MCC
- \(f_{i\hat{i}\hat{k}}\): The value of the \(\hat{k}^{\text {th}}\) factor of the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC
- \(F_{ij}^{M}\): The maximum possible allocation of the \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC
- \(i\): MCC index
- \(\hat{i}\): LCC index
- \(\tilde{i}\): ES index
- \(j\): Resource item index
- \(k\): Allocation parameter index at TCC level
- \(\hat{k}\): Allocation parameter index at MCC level
- \(l_{i}\): Number of LCC units under the \(i^{\text {th}}\) MCC
- \(L_{\hat{i}}\): \(\hat{i}^{\text {th}}\) LCC
- \(m\): Number of MCC units
- \(M_{i}\): \(i^{\text {th}}\) MCC
- \(n\): Forecasted parameter index; also the number of different resources available at TCC level
- \(p_{i}^{M}\): The allocation weight of the \(i^{\text {th}}\) MCC
- \(p_{i\hat{i}}^L\): The weight of the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC
- \(P_{T_{x}}\): Priority of task \(T_{x}\)
- \(r\): The total number of forecasted parameters
- \(R_{j}^T\): Amount of the \(j^{\text {th}}\) resource available to the TCC
- \(R_{ij}^{M}\): The requirement of the \(i^{\text {th}}\) MCC for the \(j^{\text {th}}\) resource
- \(R_{\hat{i}j}^{M_{i}L}\): The requirement of the \(j^{\text {th}}\) resource by the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC
- \(T_{x}\): Task vector consisting of the required numbers of GVs, UAVs, and boats
- \(w_{k}\): The weight of the \(k^{\text {th}}\) allocation factor provided at TCC level
- \(\hat{w_{k}}\): The weight of the \(\hat{k}^{\text {th}}\) allocation factor provided at MCC level
- \(W_{ij}^{M}\): The resource allocation vector of the \(j^{\text {th}}\) resource for the \(i^{\text {th}}\) MCC
- \(W_{\hat{i}j}^{M_{i}L}\): The resource allocation vector of the \(j^{\text {th}}\) resource for the \(\hat{i}^{\text {th}}\) LCC of the \(i^{\text {th}}\) MCC
- \(\zeta _{n}\): Relative weight of the \(n^{\text {th}}\) forecast parameter
- CS: Critical supply
- ES: Emergency site
- GV: Ground vehicle
- LCC: Low-level control centre
- MCC: Middle-level control centre
- NS: Normal supply
- RS: Relief supply
- SL: Surveillance
- ST: Survivor tracking
- TCC: Top-level control centre
- UAV: Unmanned aerial vehicle
References
GoI-Document (2005) The disaster management act, 2005. https://cdn.s3waas.gov.in/s365658fde58ab3c2b6e5132a39fae7cb9/uploads/2018/04/2018041720.pdf. Date last accessed: 09-04-2022
Dave R (2018) Disaster management in India: Challenges and strategies. Prowess Publishing
Singh O, Kumar M (2017) Flood occurrences, damages, and management challenges in India: a geographical perspective. Arab J Geosci 10(5):1–19
Central-Water-Commission (2019) Water and related statistics. http://www.indiaenvironmentportal.org.in/files/file/water-and-related-statistics-2019.pdf. Date last accessed: 23-03-2022
Gupta S, Javed A, Datt D (2003) Economics of flood protection in India. In: Flood Problem and Management in South Asia, Springer. pp 199–210
Singh O, Kumar M (2013) Flood events, fatalities and damages in India from 1978 to 2006. Nat Hazards 69(3):1815–1834
Central-Water-Commission (2022) Statewise flood damage statistics. http://www.indiaenvironmentportal.org.in/files/file/flood%20damage%20data.pdf. Date last accessed: 23-03-2022
Arora A, Siddiqui MA, Pandey M (2020) Statistical analysis of major flood events during 1980-2015 in middle Ganga plain, Ganga river basin, India. In: Spatial Information Science for Natural Resource Management, IGI Global. pp 225–241
Mahanta R, Das D (2017) Flood induced vulnerability to poverty: Evidence from Brahmaputra valley, Assam, India. International Journal of Disaster Risk Reduction 24:451–461
Plate EJ (2002) Flood risk and flood management. J Hydrol 267(1–2):2–11
Hansson K, Danielson M, Ekenberg L (2008) A framework for evaluation of flood management strategies. J Environ Manage 86(3):465–480
Mohapatra P, Singh R (2003) Flood management in India. In: Flood Problem and Management in South Asia. Springer, pp 131–143
Kourgialas NN, Karatzas GP (2011) Flood management and a GIS modelling method to assess flood-hazard areas-a case study. Hydrological Sciences Journal-Journal des Sciences Hydrologiques 56(2):212–225
Rahman M, Di L et al (2017) The state of the art of spaceborne remote sensing in flood management. Nat Hazards 85(2):1223–1248
Liu Z, Huang G (2009) Dual-interval two-stage optimization for flood management and risk analyses. Water Resour Manage 23(11):2141–2162
Fotovatikhah F, Herrera M, Shamshirband S, Chau K-W, Faizollahzadeh Ardabili S, Piran MJ (2018) Survey of computational intelligence as basis to big flood management: Challenges, research directions and future work. Engineering Applications of Computational Fluid Mechanics 12(1):411–437
Iqbal U, Perez P, Li W, Barthelemy J (2021) How computer vision can facilitate flood management: A systematic review. International Journal of Disaster Risk Reduction 53:102030
Billa L, Mansor S, Mahmud AR (2004) Spatial information technology in flood early warning systems: an overview of theory, application and latest developments in Malaysia. Disaster Prevention and Management: An International Journal
Duncan A, Keedwell E, Djordjevic S, Savic D (2013) Machine learning-based early warning system for urban flood management. In: International Conference on Flood Resilience: Experiences in Asia and Europe, University of Exeter
Krzhizhanovskaya VV, Shirshov G, Melnikova NB, Belleman RG, Rusadi F, Broekhuijsen B, Gouldby BP, Lhomme J, Balis B, Bubak M et al (2011) Flood early warning system: design, implementation and computational modules. Procedia Computer Science 4:106–115
Altay N, Green WG III (2006) OR/MS research in disaster operations management. Eur J Oper Res 175(1):475–493
Horita FE, de Albuquerque JP (2013) An approach to support decision-making in disaster management based on volunteer geographic information (VGI) and spatial decision support systems (SDSS). In: 2013 Tenth ISCRAM, Citeseer
Liashenko O, Kyryichuk D, Krugla N, Lozhkin R (2019) Development of a decision support system for mitigation and elimination the consequences of natural disasters in Ukraine. International Multidisciplinary Scientific GeoConference: SGEM 19(2.1):825–832
Moehrle S, Raskob W (2019) Reusing strategies for decision support in disaster management–a case-based high-level petri net approach. Advances in Artificial Intelligence: Reviews 1
Sati S (2015) Scalable framework for emergency response decision support systems. In: 2015 IEEE International Symposium on Technologies for Homeland Security (HST). IEEE, pp 1–6
Wallace WA, DeBalogh F (1985) Decision support systems for disaster management. Public Administration Review. pp 134–146
Zhou L, Wu X, Xu Z, Fujita H (2018) Emergency decision making for natural disasters: An overview. International Journal of Disaster Risk Reduction 27:567–576
Newman JP, Maier HR, Riddell GA, Zecchin AC, Daniell JE, Schaefer AM, van Delden H, Khazai B, O’Flaherty MJ, Newland CP (2017) Review of literature on decision support systems for natural hazard risk reduction: Current status and future research directions. Environ Model Software 96:378–409
Aifadopoulou G, Chaniotakis E, Stamos I, Mamarikas S, Mitsakis E (2018) An intelligent decision support system for managing natural and man-made disasters. International Journal of Decision Support Systems 3(1–2):91–105
Zamanifar M, Hartmann T (2021) Decision attributes for disaster recovery planning of transportation networks; a case study. Transp Res Part D: Transp Environ 93:102771
Lourenço M, Oliveira LB, Oliveira JP, Mora A, Oliveira H, Santos R (2021) An integrated decision support system for improving wildfire suppression management. ISPRS Int J Geo Inf 10(8):497
Marin G, Modica M, Paleari S, Zoboli R (2021) Assessing disaster risk by integrating natural and socio-economic dimensions: A decision-support tool. Socioecon Plann Sci 77:101032
Yaşa E, Tüzün Aksu D, Özdamar L (2022) Metaheuristics for the stochastic post-disaster debris clearance problem. IISE Transactions, pp 1–22
Vogiatzis C, Pardalos PM (2016) Evacuation modeling and betweenness centrality. In: International Conference on Dynamics of Disasters. Springer, pp 345–359
Korkou T, Souravlias D, Parsopoulos K, Skouri K (2016) Metaheuristic optimization for logistics in natural disasters. In: International Conference on Dynamics of Disasters. Springer, pp 113–134
Matsuno Y, Fukanuma F, Tsuruoka S (2021) Development of flood disaster prevention simulation smartphone application using gamification. In: International Conference on Dynamics of Disasters. Springer, pp 147–159
Kondaveti R, Ganz A (2009) Decision support system for resource allocation in disaster management. In: 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE, pp 3425–3428
Park J, Cullen R, Smith-Jackson T (2014) Designing a decision support system for disaster management and recovery. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, SAGE Publications, Los Angeles, CA 58:1993–1997
Hashemipour M, Stuban SM, Dever JR (2017) A community-based disaster coordination framework for effective disaster preparedness and response. Australian Journal of Emergency Management, The 32(2):41–46
Li X, Pu W, Zhao X (2019) Agent action diagram: Toward a model for emergency management system. Simul Model Pract Theory 94:66–99
Wang Z, Zhang J (2019) Agent-based evaluation of humanitarian relief goods supply capability. International Journal of Disaster Risk Reduction 36:101105
Othman SB, Zgaya H, Dotoli M, Hammadi S (2017) An agent-based decision support system for resources’ scheduling in emergency supply chains. Control Eng Pract 59:27–43
Sepúlveda JM, Arriagada IA, Derpich I (2018) A decision support system for distribution of supplies in natural disaster situations. In: 2018 7th International Conference on Computers Communications and Control (ICCCC). IEEE, pp 295–301
Sepúlveda JM, Bull J (2019) A model-driven decision support system for aid in a natural disaster. In: International Conference on Human Systems Engineering and Design: Future Trends and Applications. Springer, pp 523–528
Arora H, Raghu T, Vinze AS (2010) Resource allocation for demand surge mitigation during disaster response. Decision Support System 50:304–315
Fiedrich F, Gehbauer F, Rickers U (2000) Optimized resource allocation for emergency response after earthquake disasters. Saf Sci 35:41–57
Wang F, Pei Z, jun Dong L, Ma J (2020a) Emergency resource allocation for multi-period post-disaster using multi-objective cellular genetic algorithm. IEEE Access 8:82255–82265
Worby CJ, Chang HH (2020) Face mask use in the general population and optimal resource allocation during the Covid-19 pandemic. Nat Commun 11(1):1–9
Rolland E, Patterson RA, Ward K, Dodin B (2010) Decision support for disaster management. Oper Manag Res 3(1–2):68–79
Faiz TI, Vogiatzis C, Noor-E-Alam M (2019) A column generation algorithm for vehicle scheduling and routing problems. Comput Ind Eng 130:222–236
Sun J, Zhang Z (2020) A post-disaster resource allocation framework for improving resilience of interdependent infrastructure networks. Transp Res Part D: Transp Environ 85:102455
Das R, Hanaoka S (2014) An agent-based model for resource allocation during relief distribution. Journal of Humanitarian Logistics and Supply Chain Management 4(2):265–285
Doan XV, Shaw D (2019) Resource allocation when planning for simultaneous disasters. Eur J Oper Res 274(2):687–709
Majumder R, Warier RR, Ghose D (2019) Game theory-based allocation of critical resources during natural disasters. In: 2019 Sixth Indian Control Conference (ICC). IEEE, pp 514–519
Nagurney A, Flores EA, Soylu C (2016) A generalized Nash equilibrium network model for post-disaster humanitarian relief. Transportation Research Part E: Logistics and Transportation Review 95:1–18
Wang Y, Bier VM, Sun B (2019) Measuring and achieving equity in multi-period emergency material allocation. Risk Anal 39(11):2408–2426
Wang J, Shen D, Yu M (2020b) Multiobjective optimization on hierarchical refugee evacuation and resource allocation for disaster management. Mathematical Problems in Engineering 2020
Lechtenberg S, Widera A, Hellingrath B (2017) Research directions on decision support in disaster relief logistics. In: 2017 4th International Conference on Information and Communication Technologies for Disaster Management (ICT-DM). IEEE, pp 1–8
Nappi MML, Souza JC (2015) Disaster management: Hierarchical structuring criteria for selection and location of temporary shelters. Nat Hazards 75(3):2421–2436
Ghaffari Z, Nasiri MM, Bozorgi-Amiri A, Rahbari A (2020) Emergency supply chain scheduling problem with multiple resources in disaster relief operations. Transportmetrica A: Transport Science 16(3):930–956
Özdamar L, Demir O (2012) A hierarchical clustering and routing procedure for large scale disaster relief logistics planning. Transportation Research Part E: Logistics and Transportation Review 48(3):591–602
Widener MJ, Horner MW (2011) A hierarchical approach to modeling hurricane disaster relief goods distribution. J Transp Geogr 19(4):821–828
Liberatore F, Ortuño MT, Tirado G, Vitoriano B, Scaparra MP (2014) A hierarchical compromise model for the joint optimization of recovery operations and distribution of emergency goods in humanitarian logistics. Comput Oper Res 42:3–13
Ogra A, Donovan A, Adamson G, Viswanathan K, Budimir M (2021) Exploring the gap between policy and action in disaster risk reduction: a case study from India. International Journal of Disaster Risk Reduction 63:102428
Wescoat JL, Bramhankar R, Murty J, Singh R, Verma P (2021) A macrohistorical geography of rural drinking water institutions in India. Water History 13(2):161–188
Salmoral G, Rivas Casado M, Muthusamy M, Butler D, Menon PP, Leinster P (2020) Guidelines for the use of unmanned aerial systems in flood emergency response. Water 12(2):521
GoI-Document (2018) Institutional arrangements for disaster management (DM). https://cdn.s3waas.gov.in/s38d7d8ee069cb0cbbf816bbb65d56947e/uploads/2018/02/2018022462.pdf. Date last accessed: 21-03-2022
Acknowledgements
The research work was supported by the EPSRC grant EP/P02839X/1, “Emergency flood planning and management using unmanned aerial systems”. The authors would like to thank their colleagues in the EPSRC project partner institutes, the University of Exeter, Cranfield University, the Indian Institute of Science, the Indraprastha Institute of Information Technology Delhi, and Tata Consultancy Services, for their help and assistance.
Ethics declarations
Conflicts of Interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
This article is part of the Topical Collection on Dynamics of Disasters.
Cite this article
Jana, S., Majumder, R., Menon, P.P. et al. Decision Support System (DSS) for Hierarchical Allocation of Resources and Tasks for Disaster Management. Oper. Res. Forum 3, 37 (2022). https://doi.org/10.1007/s43069-022-00148-6