1 Introduction

In 2017, the EM-DAT database recorded 335 natural disasters worldwide, which affected over 95.6 million people, killed 9,697, and cost a total of US$335 billion. China, the USA, and India experienced the highest number of disasters, with 25, 20, and 15 events, respectively (CRED 2018). Between 1998 and 2017, 1.3 million people were killed by climate-related and geophysical disasters, and another 4.4 billion were left injured, homeless, displaced, or in need of emergency assistance (Wallemacq 2018). Natural disaster risk analysis is the premise and foundation of natural disaster risk management, natural disaster risk zoning, and disaster loss assessment. Natural disasters are extreme geophysical and biological events, which include geological disasters (earthquakes, volcanic eruptions, landslides, avalanches), atmospheric disasters (tropical cyclones, tornadoes, hail, ice, and snow), hydrological disasters (river floods, coastal floods, and drought), and biological disasters (epidemic diseases and wildfire) (Smith 2003). Some scholars hold that natural disasters are caused by natural events or forces that result in casualties and the loss of social property (Huang 2009). Natural disaster risk is the uncertainty of future loss caused by the evolution of the natural disaster system itself (Ni and Wang 2012), while risk is a future scenario related to certain adverse events (Huang et al. 2010). Natural disaster risk analysis can help reduce losses of property and life. On this view, natural disaster risk has three characteristics, namely uncertainty, disadvantage, and the future.

Disaster risk analysis must focus on what disaster events may occur, how likely they are to occur, and their possible consequences. However, the data available for disaster risk analysis are typically either insufficient (a small sample with poor information) or massive (a large sample, still with poor information). Analyzing such data requires not only conventional mathematical modeling and system analysis methods, but also decision analysis tools for dealing with uncertain information. Traditional mathematical statistics, measurement methods, optimization, and predictive decision-making techniques, as well as the continually evolving gray system theory, uncertainty theory, and intelligent algorithms, provide solid tools for disaster risk analysis modeling. Note that different risk analysis methods may produce different risk assessment results; thus, the nature and applicability of each method should be correctly understood (Liu and Shang 2014). In general, practical risk analysis models can improve the accuracy of risk analysis results. Some researchers have summarized five basic methods of risk analysis: the probability of occurrence calculation method, exposure evaluation method, risk identification method, expected value calculation method, and empirical synthesis method (Huang 2011). In addition, some studies delineate quantitative risk analysis methods into two categories, namely the relative value method and the absolute value method (Liu and Shang 2014).

At present, various methods exist for natural disaster risk analysis, and most related studies combine one or several of them. This paper aims to sort out and summarize the methods of natural disaster risk analysis, so as to provide useful references for researchers of natural disaster risk and for decision-makers in disaster prevention and reduction organizations. Since not all researchers and decision-makers have strong mathematical backgrounds, we organize the discussion around the three characteristics of natural disaster risk to make the article broadly accessible. We then turn to the research methods and models themselves, summarizing the application scope, research results, and emphases of disaster risk analysis methods, and clarifying the advantages, disadvantages, and applicable scope of each method to provide a reference for the application and optimization of future disaster risk analysis methods.

The remainder of this paper is structured as follows. Section 2 summarizes the analysis methods based on risk uncertainty, Sect. 3 summarizes the analysis methods based on risk disadvantage, and Sect. 4 summarizes the analysis methods based on risk future. Section 5 provides the conclusion and prospects.

2 Analysis methods based on risk uncertainty

The uncertainty of natural disaster risk corresponds to the randomness and fuzziness of things (Huang et al. 1994). Therefore, the probability and statistics method is often used in the risk analysis of natural disasters. Subsequently, fuzzy mathematics, gray system theory, and other methods were also introduced into the risk analysis of natural disasters.

2.1 Probability and statistics method

The probability and statistics method employs probability theory to study the regularity of a large number of random phenomena. It is also known as the mathematical statistics method. Its main research objects are random events, random variables, and random processes (Sui and Qu 2014). Given that the uncertainty of natural disasters reflects randomness, the probability and statistics method is more commonly used in risk analysis research. For example, Deng et al. (2001) adopted the probability and statistics method to express a grain yield sequence after processing it as a probability density function and then calculated the risk level according to the statistical properties of the probability density function. Guikema (2009) provided an overview of statistical learning theory methods, discussing their potential for greater use in risk analysis. Xu et al. (2013) evaluated the risk of agricultural drought in Chengde city (Hebei province, China) based on the probability and statistics method. The probability and statistics method refers to diverse methods designed to derive inferences from large and complex data sets, and can provide a basis for risk analysis in data-rich systems. This method is widely used in disaster risk modeling (Michel-Kerjan et al. 2013), meteorological disaster risk analysis (Wu et al. 2014), geological disaster risk zoning (Yin et al. 2017), and so on.
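As a minimal sketch of how this approach proceeds, one can fit a probability density to a loss series and read the risk level off its exceedance probability, in the spirit of Deng et al. (2001); the yield data below and the normality assumption are purely illustrative, not taken from any cited study:

```python
import statistics
from math import erf, sqrt

# Hypothetical annual grain-yield reduction rates (%); illustrative data only.
yield_reductions = [2.1, 5.4, 0.8, 12.3, 7.6, 3.2, 9.1, 1.5, 6.8, 4.4]

mu = statistics.mean(yield_reductions)
sigma = statistics.stdev(yield_reductions)

def exceedance_probability(threshold):
    """P(loss > threshold) under the fitted normal density."""
    z = (threshold - mu) / (sigma * sqrt(2))
    return 0.5 * (1 - erf(z))

# Risk level: probability of a yield reduction worse than 10% in a given year.
p10 = exceedance_probability(10.0)
```

Here the fitted density plays the role of the probability density function from which statistical properties, and hence the risk level, are derived; a real study would test the distributional assumption before relying on it.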

In recent years, many scholars have been improving and optimizing the probability and statistics method. For example, Velásquez et al. (2014) proposed a disaster risk assessment method based on empirical estimation (retrospective disaster assessment) and probabilistic assessment, namely the “hybrid” loss exceedance curve. Kostyuchenko (2015) proposed a method of analysis of multi-source data statistics for extreme forecasting and meteorological disaster risk analysis. Li et al. (2018) revealed the risk distribution of local hydrological disasters of integrated hydrological disasters in Urumqi city, Xinjiang, China, and examined the influencing factors to consider in hydrological disasters based on the risk assessment theory of historical disasters and the statistical analysis method.

The probability and statistics method can be used to study the regularity of a large number of random phenomena of natural disasters, which enables us to determine whether a certain judgment can be guaranteed to be correct with a fairly high probability; likewise, we can control the probability of error from a group of samples. Note that the probability and statistics method must be based on sufficient historical disaster data. Furthermore, research on the fuzzy and gray nature of natural disaster risk is relatively weak (Wang et al. 2006).

2.2 Fuzzy mathematical method

The fuzzy mathematical method studies fuzzy phenomena. In 1978, Zadeh (1999) proposed the theory of possibility based on fuzzy sets, defining a possibility distribution as a fuzzy restriction that acts as an elastic constraint on the values that may be assigned to a variable. This method has been widely used and optimized in natural disaster risk analysis (Goyal and Sharma 2016; Misra and Weber 1990; Ding and Shi 2002; Huang et al. 1994; Zou et al. 2012a, b).

Information diffusion is a fuzzy mathematical method that processes samples as set values to compensate for incomplete information and to optimize the use of fuzzy sample information. Huang et al. (1998) introduced the fuzzy mathematical method into natural disaster risk analysis based on the information diffusion theory. This method is widely used in the risk analysis and assessment of natural disasters such as storm surges (Qi et al. 2010; Zhang et al. 2007), floods (Fang and Gao 2009; Huang et al. 2013), droughts (Liu et al. 2010a, b), and low temperatures (Wu et al. 2016). Some have described flood disaster risk quantitatively by building a compound model based on the information diffusion theory (Zou et al. 2012a, b). The fuzzy mathematics method based on information diffusion overcomes the difficulties of scarce historical disaster data and an unknown probability distribution of disasters, thus improving the rationality of the evaluation results. However, the information diffusion model is applicable to the study of a single disaster and has limitations in comprehensively describing disaster risks (Mao et al. 2012).
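The mechanics of normal information diffusion can be sketched in a few lines; the sample, the monitoring points, and the fixed bandwidth h below are illustrative assumptions (Huang's method derives the diffusion coefficient from the sample range and size):

```python
import math

def info_diffusion_risk(samples, points, h):
    """Normal information diffusion: spread each observation over a set of
    monitoring points with a Gaussian kernel, then estimate exceedance risk."""
    n = len(samples)
    q = [0.0] * len(points)
    for y in samples:
        weights = [math.exp(-(y - u) ** 2 / (2 * h * h)) for u in points]
        total = sum(weights)
        for i, w in enumerate(weights):
            q[i] += w / total          # each sample carries one unit of information
    p = [qi / n for qi in q]           # estimated probability mass at each point
    return [sum(p[i:]) for i in range(len(p))]   # P(X >= u_i)

# Hypothetical flood-magnitude indices observed over 7 years (small sample).
samples = [3.2, 4.1, 2.8, 5.0, 3.6, 4.4, 2.5]
points = [2.0, 3.0, 4.0, 5.0, 6.0]
risks = info_diffusion_risk(samples, points, h=0.8)
```

Each observation's unit of information is diffused over the whole set of monitoring points, which is what allows a small sample to yield a smooth risk estimate.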

The fuzzy mathematics method, especially the fuzzy comprehensive evaluation method, is characterized by clear evaluation results and strong systemicity. It can handle fuzzy, difficult-to-quantify problems well and is suitable for various non-deterministic problems. The fuzzy mathematics method therefore has a great advantage in dealing with the uncertainty of natural disasters. However, it handles complexity poorly, because numerous factors affect the risk of natural disasters (Xie et al. 2011). In addition, it is difficult to describe the relationships between indicators using this method.

2.3 Gray system method

Gray system theory was established by Professor Julong Deng in the 1980s (Deng 1982). It is a method for studying uncertain systems with few data and poor information. Gray system methods include gray correlation analysis, gray clustering, gray prediction, gray decision-making, gray planning, gray input–output, gray games, and gray control systems. After decades of development, the approach has been applied to many fields and has gradually received positive attention in natural disaster risk analysis. For example, Gong and Forrest (2013) introduced the combination of meteorological disaster risk analysis and gray system theory. Based on the gray clustering method, Xie et al. (2013) studied the classification of regional meteorological disaster losses. Based on the gray relational analysis method, Gu and Ma (2016) applied gray system theory to calculate the correlation between factors of drought vulnerability and rural poverty in China.

A new research trend is using the gray system method with other methods in empirical research. Furthermore, the gray system method itself is being continuously optimized and developed. Examples include the gray-random risk rate method (Hu and Xia 2001), tropical cyclone disaster risk assessment model (Liu and Zhang 2012), Gray Hazard-Year Prediction Model (Luo et al. 2017), and so on.

The research object of gray system theory is the uncertain system with "small data" and "poor information" (Liu and Yang 2015). In gray system theory, a random quantity is regarded as a gray quantity that varies within a certain range: the original data are processed in an appropriate way, the gray numbers are transformed into generated numbers, and a generating function with strong regularity is obtained from them. The quantitative basis of gray system theory is the generated number, which breaks through the limitation of probability statistics and makes the result a realistic generation rule rather than an empirical statistical rule based on a large amount of prior data. Some achievements have been made in the uncertainty analysis of natural disaster risk. However, the gray system method is still being improved; for example, the construction of model test criteria and specific quantitative standards, as well as in-depth research on model parameters and mechanisms, still require further work (Liu et al. 2013).
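As a concrete illustration of gray prediction, the classic GM(1,1) model can be sketched as follows; the disaster-loss series is hypothetical, and a real application would add the model test criteria discussed above (residual and ratio checks):

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) gray prediction: fit on series x0 and forecast `steps` ahead."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series
    z1 = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]  # background values
    y = x0[1:]
    # Least-squares solution of x0(k) + a*z1(k) = b via 2x2 normal equations.
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z * yy for z, yy in zip(z1, y))
    sy = sum(y)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):  # fitted response of the accumulated series, k = 0, 1, ...
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # Restore the original series by first-order differencing.
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]

# Hypothetical annual disaster-loss series (arbitrary units).
losses = [10.0, 11.2, 12.4, 13.9, 15.5]
next_year = gm11_forecast(losses)[0]
```

The accumulation step is what turns an irregular small-sample series into one with strong (approximately exponential) regularity, which is the "generated number" idea described above.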

2.4 Comentropy method

Entropy is a physical quantity indicating the degree of molecular-state disorder. In 1948, Shannon proposed the concept of "comentropy" (information entropy) to describe the degree of uncertainty of information sources (Shannon 2001). In the 1980s, the comentropy method was applied to erosion and river basins: the entropy calculation was introduced into geomorphology, and the entropy value became a quantitative indicator of soil erosion and natural disaster intensity (Ai 1987). Based on the concept of "disaster entropy," Ding et al. (2006) classified the risk of regional debris flow disaster. Ma and Xu (2010) introduced comentropy to characterize the degree of damage of a risk source to a receptor, conducting an ecological risk assessment of four meteorological disasters (drought, flood, gale, and hail) in the county-level administrative regions of Qingdao.

Comentropy is a quantitative expression of the degree of uncertainty and disorder of a system. It does not require the distribution of risk variables to be known and can simultaneously consider the impact of multiple factors on natural disaster risk. Using the entropy weight method to determine the weights of disaster risk indicators reflects the degree of disorder of the index system and effectively reduces subjective bias, giving the method a strongly objective character. However, it often ignores the subjective intentions of decision-makers or analysts.
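The entropy weight calculation mentioned above can be sketched directly; the indicator matrix below is hypothetical (rows are regions, columns are risk indicators, all assumed benefit-type for simplicity):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: objective indicator weights from a data matrix.

    matrix[i][j] is the (positive) value of indicator j for region i.
    """
    n, m = len(matrix), len(matrix[0])
    k = 1.0 / math.log(n)
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [v / s for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy of indicator j
        divergences.append(1.0 - e)        # more divergence -> more information
    total = sum(divergences)
    return [d / total for d in divergences]

# Rows: regions; columns: hazard, exposure, vulnerability scores (illustrative).
matrix = [[0.9, 0.2, 0.5],
          [1.0, 0.8, 0.4],
          [1.1, 0.3, 0.6]]
w = entropy_weights(matrix)
```

The near-uniform first column carries little information and receives a small weight, while the widely varying second column dominates, which is precisely the objectivity (and the blindness to decision-makers' intentions) noted above.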

2.5 Comprehensive use of uncertainty analysis methods

Natural disasters often occur in compound, superposed form (Xu et al. 2006). Thus, comprehensive evaluation methods should be explored. To compensate for the defects of single uncertainty analysis methods, Zuo et al. (2003) proposed a risk calculation model based on fuzzy membership (for fuzziness), gray numbers (for grayness), and unascertained numbers (for unascertained information). Liu et al. (2010a, b) proposed a method based on GIS and information diffusion to analyze the spatiotemporal risks of grassland fires to livestock production in northern China.

Multi-method complexes have become a trend in natural disaster risk uncertainty analysis and have been widely used in risk analysis of geological disasters (Rong et al. 2012), ice disasters (Wu et al. 2015), flood disasters (Almeida et al. 2017), and so on (Table 1).

Table 1 Analysis methods based on risk uncertainty

3 Analysis methods based on risk disadvantage

3.1 Qualitative analysis method

The disadvantages of natural disaster risk are more abstract and ambiguous than the uncertainty of natural disaster risk. Some scholars conduct disaster risk analysis from a qualitative perspective based on experience and theoretical knowledge. Eldeen (1980) combined a disaster risk and vulnerability analysis and integrated disaster risk aspects into physical planning. Van Aalst (2006) provided an overview of the relationship between climate change and extreme weather, examining three cases. Donner and Rodríguez (2008) explained the different impacts of disasters such as the Indian Ocean tsunami and Hurricane Katrina, and studied population growth, composition, and distribution in the context of disaster risk and vulnerability. Chen et al. (2010) adopted the system engineering method to form the baseline design of the prototype of the multi-hazard disaster risk analysis system. In the context of disaster management in Brazil, Horita et al. (2017) presented a study using oDMN + (a framework for extending modeling notation and a modeling process to connect decision-making with data sources).

The focus of qualitative research is not on manipulating variables or testing hypotheses, but on observation aimed at description and understanding, which is then distilled into opinions, feelings, and experiences. This enables researchers to combine their knowledge and experience with their understanding of adverse risk; makes the researcher's information richer; provides a larger space for interpretation and creativity; and allows researchers to judge the rules governing the nature, characteristics, and change of research objects. The approach is especially applicable when data are insufficient. However, it also has disadvantages. First, it is more subjective, and its results are more abstract and difficult to quantify. Second, since data collection rests on the personal observations of researchers and the research object is a specific group, it is difficult to generalize the conclusions to a wider range of settings; thus, the objectivity of the conclusions is limited.

3.2 GIS technology method

The geographic information system (GIS) is a computer-based spatial analysis tool, which can analyze and process spatial information and generate relatively intuitive map data. To avoid or mitigate natural disasters, the simplest and most effective method is to carry out risk zoning and upgrade natural disaster management to risk management level (Qin and Jiang 2005). GIS is an optimal method for risk zoning and has been widely used in the risk analysis of geological disasters (Zhu et al. 2002), flood disasters (Ahmad et al. 2013; Büchele et al. 2006; Criado et al. 2019; Qin and Jiang 2005; Xie et al. 2011), and lightning disasters (Zhu et al. 2017).

The combination of GIS and other methods has also been widely used in natural disaster risk analysis and risk zoning. Yonson et al. (2018) combined econometric estimation methods with GIS to systematically assess disaster risks caused by tropical cyclones. Luo et al. (2018), Ji et al. (2018), and Palchaudhuri and Biswas (2016) combined the AHP (analytic hierarchy process), TOPSIS (technique for order of preference by similarity to ideal solution), and other MCDM (multi-criteria decision-making) methods with GIS to conduct risk zoning or assess floods and droughts. Based on ArcGIS, Du et al. (2016) used the information model to quantitatively assess the ecological risk of landslide hazards in the Dali Bai Autonomous Prefecture of Yunnan Province. Huang and Jin (2019) introduced a methodology for a simple 2-D inundation analysis in urban areas using a Storm Water Management Model and GIS.

The GIS technology method has powerful spatial analysis functions and the advantages of graphical, visual output, and it can update information in a timely manner; the content it expresses is thus richer and more flexible. However, this method requires high-quality data and considerable expertise, and it must be combined with other methods when weighting risk analysis indicators.

3.3 Modeling

Modeling is an effective way to quantify disaster risk. Peduzzi et al. (2009) proposed a model of the factors influencing levels of human losses from natural hazards at the global scale from 1980 to 2000, using GIS to model four hazards (droughts, floods, hurricanes, and earthquakes), which were overlaid with a population distribution model to extract human exposure. Tsai and Chen (2011) established a fast risk assessment model suitable for tourism to quickly analyze the characteristics of local disaster formation and risk vulnerability. The construction of disaster risk analysis models has become an important research tool in the risk analysis of natural disasters such as floods (Apel et al. 2004; Koks et al. 2015; Muis et al. 2015; Trigg et al. 2016), droughts (Hao et al. 2011), and earthquakes (Tamura et al. 2000).

The purpose of disaster risk analysis is to serve disaster risk management. In response to the disadvantages of natural disaster risks, Akgün et al. (2014) established a nonlinear p-central disaster relief facility location model with the goal of minimizing maximum risk. Kawasaki et al. (2017) developed the Data Integration and Analysis System to provide results for the crisis management of threats such as global environmental issues and natural disasters. Li et al. (2010) explored how to build and optimize a Bayesian network using domain knowledge and spatial data.

Disaster risk analysis or assessment models built on multi-domain knowledge, integrated multi-source data, or multiple factors can be applied to risk assessment under data scarcity and can quantify the uncertainty and disadvantage characteristics of risk within a unified analysis framework. However, since different disaster risk models are built on different types of disaster characteristics, the scientific validity, applicability, and practicality of each model must be verified. In general, the advantages of building models are as follows. First, a well-established model connects closely with reality and solves problems encountered in actual situations, and thus has strong universality and generalizability. Second, some model algorithms are novel, convenient to compute, and relatively comprehensive; they generate reasonable and accurate simulation results with highly credible factor weights. Third, some models place no strict restrictions on data distribution, sample size, or indicators, making them suitable both for small samples and for large systems with multiple evaluation units and indicators, as well as more flexible and convenient. However, building models also has disadvantages. First, some models involve cumbersome procedures, large amounts of data, heavy computation, and substantial programming and running time. Second, the difficulty of parameter determination in some models makes them relatively hard to popularize and demands more specialized treatment. Third, many factors must be considered in model construction; if they are not fully considered, there will be inconsistencies between the model and reality (Table 2).

Table 2 Analysis methods based on risk disadvantages

4 Analysis methods based on risk future

Natural disaster risk is a future scenario related to a natural disaster (adverse event). The UN International Strategy for Disaster Reduction (ISDR) considers risk to be the relationship between the hazard factor, caused by natural or human factors, and the vulnerability of the hazard-affected body, manifested as the probability of damage or loss of life, property, or economic activity. In other words, risk equals the product of hazard and vulnerability (ISDR 2004). Therefore, natural disaster risk characterizes the possibility of adverse future effects of hazard factors on hazard-affected bodies (casualties, economic losses, crop losses, etc.). In current research, the technical methods of risk characterization, ensemble forecasting, and scenario analysis are used to qualitatively and quantitatively show these adverse future effects.
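Numerically, the ISDR relation Risk = Hazard x Vulnerability can be read as an expected-loss calculation; the figures below are purely illustrative, and the explicit exposure term is an extension that many operational frameworks add to the basic product:

```python
# Hypothetical values for one region, illustrating Risk = Hazard x Vulnerability.
hazard = 0.15          # annual probability of a damaging flood
vulnerability = 0.30   # expected fraction of exposed value lost per event
exposure = 2.0e9       # exposed asset value (USD); explicit exposure is an extension

expected_annual_loss = hazard * vulnerability * exposure   # ~9.0e7 USD per year
```

The product form makes clear that risk vanishes when either the hazard probability or the vulnerability of the hazard-affected body is zero.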

4.1 Risk characterization method

The risk characterization method mainly presents disaster risk research results in the form of risk mapping and risk curve or risk index construction (Ma 2015).

In the area of risk mapping research, Skakun et al. (2014) presented the method of flood hazard mapping and flood risk assessment using a time series of satellite imagery. Wu et al. (2018) introduced a top-down (or downscaling) approach to generate a high spatial resolution asset value map for China in 2015, analyzed the spatial characteristics of exposure, and uncovered the contributions of both natural hazard and disaster physical and social drivers across space and time. Lu et al. (2018) proposed a framework for flood risk mapping.

In the area of risk curve research, the construction of a vulnerability curve can accurately and quantitatively express the future relationship between the intensity of hazard factors and the vulnerability of hazard-affected bodies. Vulnerability is a property that makes the structure and function of a system easy to change, owing to the sensitivity of the system (subsystems, system components) to internal and external disturbances and to a lack of coping ability. It is an intrinsic property of the system that is shown only when the system is disturbed (Li et al. 2008). As an important link in damage estimation and risk assessment, vulnerability measures the extent of damage suffered by the hazard-affected body and acts as a bridge between hazard factors and disaster situations (Zhou and Wang 2012). The vulnerability curve, also known as the damage curve or damage function, measures the relationship between the intensity of different disasters and their corresponding losses, and can be expressed in the form of curves, surfaces, or tables (Shi 2011). White first proposed the vulnerability curve method in 1964 for the vulnerability assessment of flood hazards. Methods to construct a vulnerability curve include a vulnerability curve based on an expert-weighted vulnerability index analysis, a flood risk curve based on frequency analysis, a vulnerability curve based on various statistical methods, and a damage curve based on probability prediction and a simple loss model. The vulnerability curve method has been widely used in natural disaster risk analysis of flood (Domeneghetti et al. 2015; Gissing and Blong 2004; Penning-Rowsell and Chatterton 1977; Godfrey et al. 2015; Bouwer et al. 2010; Papadrakakis et al. 2008; Sultana and Chen 2007, 2009; Ward et al. 2011), geological (Singhal and Kiremidjian 1996, 1998; Shih and Chang 2006; Fuchs 2009; Fuchs et al. 2007), hail (Hohl et al. 2002a, b), and tsunami disasters (Tarbotton et al. 2015).
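A vulnerability (stage-damage) curve of this kind can be applied as a simple lookup with linear interpolation; the depth-damage pairs below are hypothetical and are not taken from any published curve:

```python
def damage_ratio(depth, curve):
    """Piecewise-linear stage-damage (vulnerability) curve lookup.

    curve: sorted (hazard intensity, loss ratio) pairs, e.g. flood depth in
    meters versus the fraction of building value lost.
    """
    if depth <= curve[0][0]:
        return curve[0][1]
    if depth >= curve[-1][0]:
        return curve[-1][1]
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if x0 <= depth <= x1:
            return y0 + (y1 - y0) * (depth - x0) / (x1 - x0)

# Hypothetical residential depth-damage pairs (depth in m, loss ratio).
residential = [(0.0, 0.00), (0.5, 0.10), (1.0, 0.25), (2.0, 0.45), (4.0, 0.70)]
loss_ratio = damage_ratio(1.5, residential)   # ~0.35
```

Multiplying such a loss ratio by the exposed value of the hazard-affected body is the bridge from hazard intensity to loss that the paragraph above describes.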

In the area of risk index research, the construction of a risk index has become an effective assessment tool for conducting regional disaster risk analysis, comparing relative risks between regions, and describing the relative contribution of different factors to overall risks (Dilley et al. 2005; Islam et al. 2013). According to the different types of disasters or research purposes, researchers have established different risk indexes to analyze the risks of disasters such as earthquakes (Davidson and Shah 1997), hurricanes (Davidson and Lambert 2001), drought (Jülich 2014), floods (Giannakidou et al. 2019; Guo et al. 2014; Quan 2014; Rahman et al. 2016), snowfall and freeze (Gao 2016), sea ice (Ning et al. 2018), and landslides (Hong et al. 2019), or established risk composite indexes to measure the relative risks of various natural disasters. For example, the Disaster Risk Index (DRI) established by the United Nations Development Programme (UNDP) in 2004 (Pelling et al. 2004) is used to measure the relative vulnerability of countries to three major natural disasters: earthquakes, tropical cyclones, and floods, and to identify development factors that lead to risks. De Almeida et al. (2016) established the Brazilian Disaster Risk Index based on the Global Risk Index, including exposure and vulnerability indicators, and used statistical analysis and the GIS method to evaluate the disaster risk in Brazilian counties. Chen et al. (2017) designed a nonlinear evaluation model named the “universal risk model,” taking a Chinese disaster risk index as the research object. They provided a disaster risk index and risk map of 31 provinces in China by adopting the nonlinear disaster index model combined with the world risk index structure. Kumar and Bhaduri (2018) used the composite vulnerability index method to analyze and measure the vulnerability of disaster risk in the urban villages of Delhi.
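Structurally, most composite risk indexes normalize heterogeneous indicators and then combine them with weights; a minimal weighted-sum sketch, with hypothetical regions, indicators, and weights, is:

```python
def composite_risk_index(indicators, weights):
    """Weighted-sum composite risk index after min-max normalization.

    indicators: {region: [indicator values]}; weights should sum to 1.
    """
    regions = list(indicators)
    m = len(weights)
    cols = [[indicators[r][j] for r in regions] for j in range(m)]
    index = {}
    for r in regions:
        score = 0.0
        for j in range(m):
            lo, hi = min(cols[j]), max(cols[j])
            norm = (indicators[r][j] - lo) / (hi - lo) if hi > lo else 0.0
            score += weights[j] * norm
        index[r] = score
    return index

# Hypothetical hazard, exposure, and vulnerability indicators for three regions.
regions = {"A": [1.0, 2.0, 3.0], "B": [2.0, 4.0, 6.0], "C": [3.0, 6.0, 9.0]}
index = composite_risk_index(regions, weights=[0.5, 0.3, 0.2])
```

Because the index is normalized across regions, it supports exactly the relative inter-regional comparison described above rather than an absolute loss estimate.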

Risk mapping can express the results of disaster risk analysis and related information clearly on a map according to cartographic specifications, which is the basis of regional disaster risk management. However, further research is needed on the standardization, unity, and diversification of cartographic results. Vulnerability analysis is an important bridge linking disaster and risk studies. Its main purpose is to analyze the interaction between social, economic, natural, and environmental systems, as well as their driving forces, inhibiting mechanisms, and ability to respond to disasters. The quantitative relationship between the intensity of hazard factors and the vulnerability of hazard-affected bodies is the core element of the vulnerability curve. However, vulnerability curve research still has problems, such as the lack of uniform norms and evaluation standards for the definition of vulnerability, index selection, data specification, and curve accuracy within the same disaster type. The disaster risk index can summarize a vast volume of technical information on natural disaster risks in a way that is easily understood by non-experts and can be used to make risk management decisions. Although the risk index does not provide an in-depth, high-resolution risk assessment as a loss estimation model does, it reduces the amount of data and calculation required and is easier to interpret (Davidson and Lambert 2001).

4.2 Ensemble forecast technology

Ensemble forecast technology is a numerical forecast product used in short-term weather forecasting. Its development provides a new and effective tool for improving rainstorm forecasting and the risk analysis of storm floods. Krzysztofowicz (1998) developed a comprehensive probabilistic hydrometeorological forecast system, consisting of a quantitative precipitation probability forecast subsystem, a river water level probability forecast subsystem, and a flood warning decision subsystem, to analyze and quantify the uncertainty of flood forecasting. Deng et al. (2006) estimated future disastrous weather processes by studying the uncertainty of numerical prediction and using a group of numerical prediction results. Zhao et al. (2017) used a reorganized precipitation ensemble forecast to drive a hydrological model and produced probabilistic forecasts of 12 flood processes in three sub-basins.
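The core product of an ensemble system, a probabilistic statement that a flood threshold will be exceeded, can be sketched in a few lines; the ensemble members and the threshold below are hypothetical:

```python
def exceedance_prob(ensemble, threshold):
    """Probability that forecast discharge reaches a flood threshold,
    estimated as the fraction of ensemble members at or above it."""
    return sum(1 for member in ensemble if member >= threshold) / len(ensemble)

# Hypothetical peak-discharge forecasts (m^3/s) from a 10-member ensemble.
members = [820, 910, 1005, 760, 1150, 980, 870, 1230, 940, 1010]
p_flood = exceedance_prob(members, threshold=1000)   # 0.4
```

The spread of the members is itself the estimate of forecast uncertainty; a single deterministic run would give only one of these values with no probability attached.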

Ensemble forecast technology provides not only a single best forecast but also a quantitative estimate of forecast uncertainty. The uncertainty of a weather forecast changes from day to day with the weather situation, and ensemble forecasting estimates this day-to-day variation. However, the method is highly specialized and demands considerable meteorological expertise from researchers. It is currently used mainly for storm flood forecasting and needs to be extended to other disaster forecasts.

4.3 Scenario analysis method

Scenario analysis was first applied in the military and has been in use for decades. Based on key assumptions about how the research object may evolve significantly, the method conceives various possible future solutions through detailed, rigorous reasoning and description of the future. The most basic viewpoint of scenario analysis is that while the future is uncertain, part of it is predictable (Yue and Lai 2006). The motivation for scenario analysis is not to explore every possibility of the future or the expected future, but to describe the most likely futures that policymakers can expect (Yu and Qian 2006).

At present, disaster risk analysis research based on the scenario analysis method is concentrated in the field of flood risk. Ranger et al. (2010) evaluated the future flood risk in Mumbai under different climate scenarios by using the adaptive regional input–output method. Yin et al. (2011) took the rainstorm waterlogging disaster in Jing'an district, Shanghai as an example, and proposed a comprehensive analysis method for the risk assessment of an urban small-scale thunderstorm waterlogging disaster. Hirabayashi et al. (2013) employed a state-of-the-art global river routing model with an inundation scheme to compute the river discharge and inundation area based on the outputs of 11 climate models, determining the relationship between flood frequency and climate warming in different future scenarios. Liu et al. (2012, 2015) and Wang et al. (2013) used the Pearson-III model and ArcGIS spatial analysis tools to conduct empirical studies on three spatial scales (large, medium, and small), including typhoon storm and flood risk scenarios.

According to existing studies, scenario analysis has the following characteristics: (1) It recognizes that future developments are diverse, that many development trends are possible, and that prediction results are therefore multidimensional. (2) It treats the intentions and aspirations of decision-makers as an important aspect of the analysis and maintains an open exchange of information with decision-makers throughout the process. (3) It embeds extensive qualitative analysis within quantitative analysis to guide it, forming a new prediction method that integrates qualitative and quantitative approaches. In general, the greatest advantage of scenario analysis is that it allows risk managers to identify certain trends in future changes and to avoid overestimating or underestimating those changes and their impacts. Scenario analysis thus provides a new method for regional natural disaster risk analysis (Zhao et al. 2012) and a new idea for in-depth exploration of the spatiotemporal scales of natural disaster risk (Zhao 2012). It can comprehensively address various natural disasters; however, the method relies to some extent on the manager's intuition, lacks a standardized procedure, and its analysis process is complicated and difficult to operate. At the same time, the scientific soundness, rationality, complexity, and short-term validity of scenario setting are subject to certain limitations, which determine the reliability and feasibility of the evaluation results (Table 3).
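The integration of qualitative scenario assumptions with quantitative loss estimates can be illustrated with a minimal sketch. All scenario names, probabilities, and loss figures below are hypothetical, chosen only to show the mechanics of comparing futures.

```python
# Hypothetical (annual occurrence probability, loss in million US$) event
# classes per scenario; the qualitative scenario narratives determine which
# set of numbers applies.
scenarios = {
    "baseline":        [(0.10, 5.0), (0.01, 40.0), (0.002, 120.0)],
    "moderate_change": [(0.10, 8.0), (0.01, 60.0), (0.002, 180.0)],
    "severe_change":   [(0.10, 12.0), (0.01, 90.0), (0.002, 260.0)],
}

def expected_annual_loss(events):
    """Approximate expected annual loss as the probability-weighted
    sum over discrete event classes."""
    return sum(p * loss for p, loss in events)

eal = {name: expected_annual_loss(ev) for name, ev in scenarios.items()}
for name, value in eal.items():
    print(f"{name}: EAL = {value:.2f} million US$")
```

Comparing the expected annual loss across scenarios gives decision-makers the bounded range of futures that scenario analysis aims to describe, rather than a single point forecast.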

Table 3 Analysis methods based on the future characteristic of risk

5 Conclusion and prospects

Currently, many methods exist for natural disaster risk analysis and assessment. This paper classified and summarized current research methods according to the three characteristics of natural disaster risk, namely uncertainty, disadvantage, and the future, and analyzed the advantages, disadvantages, and applicable scope of the different methods. From the perspective of method characteristics, different research methods have different applicability to different natural disasters.

  1.

    Uncertainty analysis methods The probability and statistics method studies the regularity of large numbers of random natural disaster phenomena; it can derive inferences from large and complex data sets and provides a basis for risk analysis in data-rich systems. However, it requires sufficient historical disaster data, and its treatment of the fuzzy and gray nature of natural disaster risk is relatively weak. Fuzzy mathematics and grey system methods are better suited to natural disaster risk problems with fuzzy, hard-to-quantify, incomplete, and uncertain data. The fuzzy mathematical method overcomes the lack of historical disaster data and the unknown probability distribution of disasters; it yields clear evaluation results, is strongly systematic, and handles fuzzy, hard-to-quantify problems well. However, because numerous factors affect natural disaster risk, it is weak in controlling complexity, and it is difficult to describe the relationships between indicators with this method. Grey system theory can handle "small data" and "poor information." Its quantitative basis is the generated number, which breaks through the limitation of probability statistics, so the result is not an empirical statistical rule based on a large amount of prior data but a realistic generation rule; however, the model's testing criteria and specific quantitative criteria need further study. Finally, the comentropy (entropy weight) method can reflect the degree of disorder of an index system, effectively reduce subjective bias, impose no requirement on the distribution of risk variables, and simultaneously consider the impact of multiple factors on natural disaster risk, compensating for the shortcomings of fuzzy mathematics; however, it often ignores the subjective intentions of decision-makers or analysts. 
Therefore, comprehensive analysis of natural disaster risk with multiple methods has become an effective way to overcome the single-method problems of insufficient data, unknown disaster probability distributions, and numerous influencing factors.

  2.

    Disadvantage analysis methods Qualitative analysis methods provide basic theories and paradigms for the quantitative analysis of natural disaster risk; they clarify the characteristics, internal factors, and formation mechanisms of disasters; allow researchers to combine their knowledge and experience with their understanding of adverse risk; enrich the researcher's information; offer larger space for interpretation and creativity; and help researchers judge the rules governing the nature, characteristics, and changes of research objects. Such methods are most applicable when data are insufficient. However, being more subjective, their results are abstract and difficult to quantify, and because the research object is usually a specific group, it is difficult to generalize conclusions to a wider range of settings, which limits their objectivity. GIS spatial analysis and overlay functions provide a favorable tool for risk zoning and can update information in a timely manner, making the expressed content richer and more flexible; however, this approach demands high-quality data and strong professional expertise. Risk modeling is an effective way to quantify disaster risk analysis and can be closely connected with reality in order to solve problems posed by actual situations. Some models feature novel algorithms, convenient and comprehensive calculations, reasonable simulation results, high accuracy, and highly credible factor weights. Others impose no strict restrictions on data distribution, sample size, or indicators, making them suitable both for small samples and for large systems with many evaluation units and indicators, and thus more flexible and convenient. Given that different risk analysis models are based on different disaster characteristics, the scientific and practical validity of each model must be verified.

  3.

    Future analysis methods In view of the future characteristic of natural disaster risk, methods such as risk characterization, ensemble forecasting, and scenario analysis are currently used. Risk characterization displays disaster risk intuitively through risk mapping and the construction of risk curves or risk indices, which have the advantages of directness and comparability. However, the subjective influence of evaluators on the selection of indicators and criteria means that the applicability of these methods needs further exploration, and norms and evaluation standards are lacking for the definition of vulnerability or risk indices, index selection, data specification, and curve accuracy, even within the same disaster type. Ensemble forecast technology is currently used in professional meteorology, especially flood forecasting, and provides an estimate of how the uncertainty of the daily weather forecast changes with the weather situation; however, few studies apply ensemble forecasting to other disasters, and it thus needs to be extended. Scenario analysis embeds extensive qualitative analysis within quantitative analysis to guide it, forming a new prediction method that integrates qualitative and quantitative approaches. It can comprehensively address various natural disasters, compensating for the shortcoming that most methods apply only to a single disaster. However, it relies to some extent on the manager's intuition, lacks a standardized procedure, its analysis process is complicated and difficult to operate, and the rationality and complexity of scenario setting need further study.
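As a concrete illustration of the comentropy (entropy weight) method summarized under the uncertainty analysis methods above, the following minimal sketch derives objective indicator weights from a small region-by-indicator matrix. The matrix values are hypothetical, and benefit-type positive indicators are assumed.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: objective indicator weights from an
    (m regions x n indicators) matrix of positive benefit-type values."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # 1. Normalize each indicator column to proportions p_ij.
    P = X / X.sum(axis=0)
    # 2. Information entropy of each indicator (0*log 0 treated as 0).
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(m)
    # 3. Degree of divergence; more dispersed indicators get larger weights.
    d = 1.0 - E
    return d / d.sum()

# Hypothetical hazard matrix: 4 regions x 3 indicators.
X = [[0.9, 10.0, 3.0],
     [0.8, 12.0, 3.1],
     [0.3, 11.0, 2.9],
     [0.5,  9.0, 3.0]]
w = entropy_weights(X)
print(w)  # the first indicator, most dispersed in relative terms, dominates
```

Because the weights depend only on the dispersion of the data, the method reduces subjective bias, which is exactly why it also ignores the analyst's intentions, as noted above.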

To make risk analysis results more scientific and accurate, and to meet management's increasing requirements for the precision and rigor of natural disaster risk analysis, further research is required to continuously optimize and develop risk analysis technologies and methods. Multi-hazard comprehensive risk analysis, dynamic disaster risk assessment, and inter-regional disaster risk grading will become new trends in future research.