1 Introduction

The term seismomatics was born at the homonymous conference that took place in Valparaíso, Chile, from 5 to 9 January 2015 at the Technical University Federico Santa María. The organizers, Emilio Porcu, Patricio A. Catalán and Ronny Vallejos, invited eminent scientists from all around the world, from disciplines as diverse as seismology, oceanography, tsunami science, mathematics, physics, and statistics. The main aim was to enable and promote communication amongst disciplines that have different languages but a common objective: understanding the structure of certain classes of natural and anthropogenic events that, given the circumstances, could become catastrophes.

The main goal of seismomatics is to create a common framework in which mathematics and statistics play a special role at the service of applied branches explicitly interested in certain classes of catastrophes. Special emphasis was put on earthquakes, tsunamis, floods, and air and water pollution. These events are of special importance for Chile. In fact, as it turned out, 2015 would become one of the worst years in Chilean history in terms of disasters due to natural events, with flash floods, volcanic eruptions, and an earthquake and tsunami taking place throughout the year, affecting thousands of lives and causing major economic impact. It is clear that science must supply the backbone to improve disaster preparedness, mitigation and response, and help society become more resilient. However, as we shall see in this special issue, the problem is far from being a local one, and is indeed relevant all over the world.

In particular, the role of statistics in this context is to introduce the paradigm of space-time uncertainty (probabilistic models) into fields that have typically been characterized by deterministic approaches. So far, there seems to have been little interaction between the disciplines mentioned above. In the case of earthquakes, for instance, Kagan points out that theoretical physics has largely failed to explain and predict earthquake occurrences. Among the various points he highlights, the following is especially attractive to space-time statisticians: the intrinsic randomness of earthquake occurrence, which calls for the use of stochastic point processes and appropriately sophisticated statistical techniques.
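To fix ideas on what the point-process paradigm amounts to in its simplest form, the sketch below simulates a toy earthquake catalogue as a homogeneous Poisson process in time with Gutenberg–Richter magnitudes. All parameters are invented for illustration and are unrelated to any specific catalogue or to the models discussed in this issue.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters, for illustration only.
rate_per_year = 12.0   # mean number of events above the magnitude cutoff
years = 10.0
b_value = 1.0          # Gutenberg-Richter b-value
m_min = 4.0            # magnitude of completeness

# Homogeneous Poisson process in time: given the count, event times are uniform.
n_events = rng.poisson(rate_per_year * years)
times = np.sort(rng.uniform(0.0, years, size=n_events))

# Gutenberg-Richter magnitudes: exponential with rate b*ln(10) above m_min.
magnitudes = m_min + rng.exponential(1.0 / (b_value * np.log(10)), size=n_events)

print(f"{n_events} synthetic events; largest magnitude {magnitudes.max():.2f}")
```

Real seismicity is of course clustered rather than Poissonian, which is precisely why more complex space-time point process models, and the statistical machinery around them, are needed.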

Clearly, an improved interaction between these communities will offer the key to a better assessment and prediction of earthquake occurrences, magnitude distributions and focal mechanisms (the last being extremely difficult to theorize about). Moreover, when an earthquake takes place under the ocean surface, it can trigger a tsunami. Here, more details about the earthquake are required: not only where the earthquake occurs, but also how the crust is displaced; see e.g. Power and Downes (2009) for a review. One approach to address this problem is to develop logic trees in order to establish a Probabilistic Tsunami Hazard Assessment (e.g. Geist and Parsons 2006; Annaka et al. 2007; González et al. 2009). However, the concentration of slip can dramatically change the tsunami response; see e.g. Geist and Dmowska (1999), Geist (2002), and Geist and Lynett (2014) for an overview. Much remains to be done to improve the characterization of slip during the event itself and, perhaps more importantly, for future events. These efforts have increased over the last few years, especially when the influence of the slip distribution of the earthquake is taken into account to develop synthetic but realistic tsunami inundation and/or runup scenarios (Løvholt et al. 2012; Power and Downes 2009; Power et al. 2007; Reese et al. 2007; Davies et al. 2015). For example, the detailed information gathered during the Tohoku-Oki earthquake and tsunami has allowed testing of the predictive capabilities of these synthetic models (Goda et al. 2014, 2015; Fukutani et al. 2015).

Nevertheless, despite these advances, operational implementations of these methods are still a matter of research, and a much simpler approach is sometimes used to account for this uncertainty. For example, Power mentions that a simple conservative overestimation (majoration) approach has been used in New Zealand for operational purposes, and a similar approach has been put into practice in the recently developed tsunami warning system in Chile, where a larger-magnitude earthquake is considered when assessing the tsunami hazard. Hence, there is a clear research opportunity. It must be noted that this is true not only for earthquakes and tsunamis, as similar considerations can be made for other fields, such as oceanography (see Combes et al. 2015; Shaffer et al. 1999; Hormazabal et al. 2004; Wainwright et al. 2015). In conclusion, we believe it is time to bring these communities together, as a starting point towards common frameworks, paradigms and methodologies for a better assessment of these phenomena.
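To make the logic-tree idea concrete, the following is a minimal sketch of how branch-specific tsunami hazard curves could be aggregated into a weighted mean hazard curve. The branch weights and exceedance probabilities below are invented for illustration and are not taken from the studies cited above.

```python
import numpy as np

# Hypothetical logic-tree branches for a tsunami source, illustration only.
# Each branch carries a weight and an exceedance-probability curve for the
# wave height at a coastal site over some fixed exposure time.
heights_m = np.array([0.5, 1.0, 2.0, 4.0, 8.0])

branches = [
    # (weight, P[height exceeded] at each threshold)
    (0.5, np.array([0.40, 0.25, 0.10, 0.030, 0.005])),
    (0.3, np.array([0.55, 0.35, 0.18, 0.070, 0.020])),
    (0.2, np.array([0.70, 0.50, 0.30, 0.150, 0.050])),
]

weights = np.array([w for w, _ in branches])
curves = np.vstack([c for _, c in branches])

# Weighted mean hazard curve over the logic tree (weights sum to one).
mean_curve = weights @ curves

for h, p in zip(heights_m, mean_curve):
    print(f"P(height > {h:>3.1f} m) = {p:.3f}")
```

The slip-distribution issue discussed above enters exactly here: each branch (or a finer discretization of branches) must encode a different plausible rupture, and the spread of the resulting curves is part of the uncertainty to be quantified.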

As noted by Cobb and Watson (1980) (see also Mateu and Porcu 2016), statistical analysis of catastrophes may seem a paradoxical term, because statistical models do not, as a rule, contain degenerate singularities, and catastrophe theory is generally perceived as a purely deterministic branch of differential topology. Nevertheless, a stochastic approach can make a non-negligible contribution to a better analysis and assessment of catastrophic events. This direction is taken in the book by Woo (1999), which offers an impressive list of mathematical and probabilistic approaches for several types of catastrophes, certainly including extreme value theory. Last but not least, forecasting catastrophes is a fundamental aspect, but the word forecast should be used with much caution, and we refer to Geller et al. (1997) for a very interesting discussion.

2 Earthquakes in Chile

Chile has been continuously exposed to the consequences of large earthquakes. Their impact is profound: since 1900, earthquakes have been responsible for more than 97% of human casualties (over 50,000) and 87% of the economic damage due to disasters of natural origin. Written records of earthquakes began in the mid-1500s with the arrival of the Spaniards; since then, an earthquake of magnitude 8 or more has taken place, on average, every dozen years. More than ten events with magnitude around 8 or larger have taken place in this part of the world in the last 100 years. Three events with \(\hbox {M}>8\) have taken place in the last six years alone. Historical records of local damage, reports of tsunami heights recorded in Japan and recent paleoseismological studies have provided evidence for several earthquakes in this sequence with magnitudes close to 9 and above. Among them is the 1960 event, the largest earthquake ever recorded since the beginning of instrumental seismology, with a computed magnitude (Mw) of 9.5. This activity is a consequence of the large relative plate convergence rate (6.5 cm/yr) between the Nazca and South American plates.

Several seismogenic zones are recognized in Chile on the basis of analyses of large earthquakes, the hypocentral locations of earthquakes large enough to be recorded at teleseismic distances, and studies of smaller earthquakes carried out with recent permanent and temporary local networks:

a) Nazca-South America coupling region. Large thrust earthquakes at shallow depths, because of their relatively high frequency of occurrence, are responsible for most of the damage recorded in history. They are located along the coast from Arica (\(18{{}^\circ }\hbox {S}\), the northernmost extreme of coastal Chile) to the triple junction at the Taitao Peninsula (\(46{{}^\circ }\hbox {S}\)). These events take place as a result of the convergence of the Nazca plate beneath the South American plate at about 6.5 cm/yr. Farther south, there are no records of large earthquakes resulting from the approximately 1.8 cm/yr subduction of the Antarctic plate beneath the South American plate. A fundamental tectonic difference between the regions north and south of the triple junction is that the Chile Ridge has recently been subducted; therefore, relatively young lithosphere is being subducted south of the triple junction. With magnitudes that can exceed 8, these events are usually accompanied by notable coastal elevation changes and, depending on the amount of seafloor vertical displacement, by catastrophic tsunamis. Their rupture zones extend down to 45–53 km depth (Tichelaar and Ruff 1991) and their lengths can reach well over one thousand kilometers. The hazard due to these large events is well recognized and understood. Return periods for magnitude 8 (and above) events are of the order of 80–130 years for any given region in Chile, but about a dozen years when the country is considered as a whole (see the rough calculation after this list). Mega-thrust earthquakes seem to have much longer return periods, of the order of a few centuries for any given region (Cifuentes 1989; Barrientos and Ward 1990). Recent paleoseismological studies carried out in southern Chile indicate recurrences of the order of 300 yr for these very large earthquakes (Cisternas et al. 2005; Moernaut et al. 2014). The most recent large examples of this type of earthquake are the M \(=\) 8.8 2010 Maule, M \(=\) 8.2 2014 Iquique and M \(=\) 8.4 2015 Illapel earthquakes.

b) Intermediate-depth earthquakes. Large intermediate-depth (60–200 km) tensional as well as compressional events within the subducting Nazca plate are a common occurrence. A suite of large-magnitude events (M around 8) has been reported to occur at depths between 80 and 100 km, all of them associated with extensional faulting: the M \(=\) 7.8 January 1939 (Beck et al. 1998), M \(=\) 8.0 December 1950 (Kausel and Campos 1992) and M \(=\) 7.7 June 13, 2005 (Peyrat et al. 2006) earthquakes. The latter, based on aftershock distribution and GPS deformation data, is reported to have taken place along a 60 km by 30 km sub-horizontal plane with maximum displacements of the order of 6 m. The 1950 and 2005 events took place in northern Chile (near the cities of Calama and Iquique, respectively), at latitudes \(23{{}^\circ }\, \hbox {S}\) and \(21{{}^\circ }\, \hbox {S}\), while the 1939 event, the deadliest earthquake in Chilean history, produced nearly 28,000 fatalities in the region around the city of Chillán \((36.6\, {{}^\circ }\, \hbox {S})\). Malgrange and Madariaga (1983) reported a suite of tensional events of this type within the Nazca plate in the M7+ range, of which the 1965 La Ligua event stands out for the destruction it caused. Kausel and Campos (1992) suggest that this type of event apparently features high stress drops. On 13 September 1945, a magnitude 7.1 earthquake took place beneath the metropolitan region of Santiago; Barrientos et al. (1997) reported a 90-km-deep focus with an extensional type of faulting which generated 0.13g of peak ground acceleration. If this region behaves similarly to the one farther south, Mercalli intensities of the order of VIII or more (considering site amplification factors) would be reached in Santiago, the Chilean capital city, which concentrates nearly 40% of the country’s population, should a magnitude 8 event take place underneath it at around 90 km depth. Additionally, complex stress interaction gives rise to down-dip compressional events at about 60–70 km depth, closer to the coast. These events can reach magnitudes over 7, as reported by Lemoine et al. (2001) and Pardo et al. (2002), in particular for the very damaging Punitaqui earthquake of October 1997.

c) Shallow seismicity. Very shallow seismicity (0–20 km) occurs in a few places within the over-riding plate, such as the cordilleran region of south-central Chile, as a consequence of the oblique convergence of the Nazca plate. Magnitudes up to 7.1 have been reported for earthquakes in this region (21 November 1927). The southern extreme of the continent is tectonically dominated by the Magallanes–Fagnano Fault System, a left-lateral strike-slip fault resulting from the relative horizontal displacement of the Scotia and South American plates at a rate of the order of 7 mm/yr (Thomas et al. 2003). Two earthquakes of magnitude 7 each, separated by 8 hours, were reported in this region on 17 December 1949, most likely associated with the Magallanes–Fagnano Fault System (Klepeis 1994; Smalley et al. 2003, 2007). Another seismogenic region that has become the subject of recent studies is located at shallow depths in the Andean cordillera in the central part of Chile. Godoy et al. (1999) and Barrientos et al. (2004) carried out structural and seismicity studies to understand this shallow active region, in which the largest known earthquake (less than 10 km depth) took place on September 4, 1958 (M \(=\) 6.9; Lomnitz 1960; Alvarado et al. 2009). Also, shallow seismicity \((\hbox {h} < 20\,\hbox {km})\) of relatively large magnitude (\(>\)5.5) has recently been observed beneath the main Andean Cordillera at latitudes \(19.6{{}^\circ }\, \hbox {S}\) (Aroma; July 2001), \(35.8{{}^\circ }\, \hbox {S}\) (Melado River; August 2004), \(38{{}^\circ }\, \hbox {S}\) (Barco Lagoon; December 2006) and \(45{{}^\circ }\, \hbox {S}\) (Aysén Fiord; April 2007). All these events show a significant strike-slip component of displacement.

d) Deeper seismicity occurs farther to the east, reaching up to 650 km depth beneath Bolivia and north-western Argentina. These events usually present an extensional component along the plate down-dip direction. The largest known earthquake in this region was the Mw \(=\) 8.2, 1994 event at 647 km depth beneath northern Bolivia, which was reported to have been felt in Canada and the U.S.

e) Apart from the seismicity associated with the subducting Chile Ridge at approximately latitude \(46{{}^\circ }\, \hbox {S}\), outer-rise earthquakes are also present along the subduction margin. This seismicity (mainly extensional faulting) is observed seaward of the trench. It is particularly evident in south-central Chile, offshore of the rupture zone of the M \(=\) 9.5, 1960 earthquake \((37.5 {{}^\circ } \hbox {-}\, 46\, {{}^\circ }\, \hbox {S})\) and, more recently, offshore of the large co-seismic fault displacement associated with the M \(=\) 8.8, 2010 earthquake \((34{{}^\circ } \hbox {-}37.5{{}^\circ }\, \hbox {S})\). The largest aftershock of the 2010 sequence, Mw \(=\) 7.4, belongs to this type of earthquake. Farther north, an Mw \(=\) 7.0 earthquake of the same type (extensional) took place in April 2001; some aftershocks of the 2015 Mw \(=\) 8.4 Illapel earthquake were also of this type.
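As a rough order-of-magnitude check on the return periods quoted in a) (a back-of-the-envelope illustration, not a formal hazard calculation): if the Chilean margin is idealized as roughly ten independent segments, each producing magnitude 8 and larger earthquakes as a Poisson process with a return period of about 100–120 years, the national rate is simply the sum of the segment rates,
\[
\lambda_{\mathrm{Chile}} \;\approx\; \sum_{i=1}^{10} \lambda_i \;\approx\; 10 \times \frac{1}{120\ \mathrm{yr}} \;\approx\; \frac{1}{12\ \mathrm{yr}},
\]
which recovers a country-wide return period of about a dozen years.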

3 Special issue on seismomatics

The contributions to the special issue on seismomatics can be divided into two classes: four papers are related to earthquake science and the earthquake community, and the remaining three to spatial models and extremes.

3.1 Earthquake contributions

Two papers in the earthquake class are based on volunteer data gathered through web services, while the other two are related to seismic risk mapping.

The contribution of Finazzi and Fassò (2016) is related to what has been termed above the earthquake community. Considering sensor networks from a social point of view, they develop statistical tools for earthquake detection based on a crowdsourced network that uses volunteers’ smartphones. Intelligence is distributed between a smartphone app spread worldwide and a central server, which is capable not only of collecting vibration signals from the community but also of issuing earthquake early warnings (EEW) and establishing a communication channel among the members of the social network. The proposed statistical detector is based on a Poissonian score function, whose performance is analyzed both on realistic Monte Carlo simulations and on three seismic events affecting Santiago (Chile), Iquique (Chile) and Kathmandu (Nepal).
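To give a flavour of the kind of Poissonian detection logic involved, the sketch below scores a time window by how surprising the number of triggering smartphones is under a background (no-earthquake) rate. This is a generic illustration with invented figures, not the detector actually proposed by Finazzi and Fassò (2016).

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical figures, for illustration only.
n_active_phones = 5000       # phones currently monitoring in the area
p_false_trigger = 1e-4       # chance a phone triggers spuriously in a short window
background_rate = n_active_phones * p_false_trigger  # expected spurious triggers

def detection_score(n_triggers: int, mu: float = background_rate) -> float:
    """Poisson surprise: -log10 of the probability of observing at least
    n_triggers under the background (no-earthquake) rate mu."""
    p_tail = poisson.sf(n_triggers - 1, mu)   # P(N >= n_triggers)
    return -np.log10(max(p_tail, 1e-300))

print(detection_score(1))    # small score: consistent with background noise
print(detection_score(40))   # very large score: many phones shaking at once
```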

Cameletti et al. (2016) use crowdsourced questionnaires about earthquakes felt in Italy, and revitalize the use of the Mercalli–Cancani–Sieberg scale by modelling macroseismic intensity as a function of magnitude and distance from the hypocentre. From the statistical point of view, this is done using an ordered probit model, which fully exploits the qualitative and ordinal nature of macroseismic intensity as defined on the Mercalli–Cancani–Sieberg scale. The result is a new intensity prediction equation (IPE) for Italian earthquakes using the macroseismic data available through the HSIT survey.
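For readers less familiar with the model class, here is a minimal ordered-probit sketch fitted by maximum likelihood to synthetic data. The covariates, thresholds and data are invented and have no relation to the HSIT survey or to the estimates of Cameletti et al. (2016).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic data: an ordinal intensity class (0..3) driven by a latent variable
# that increases with magnitude and decreases with log-distance.
n = 2000
mag = rng.uniform(4.0, 7.0, n)
logd = np.log10(rng.uniform(5.0, 200.0, n))
latent = 1.2 * mag - 2.0 * logd + rng.normal(size=n)
cuts_true = np.array([1.0, 3.0, 5.0])
y = np.searchsorted(cuts_true, latent)          # ordinal response in {0,1,2,3}

X = np.column_stack([mag, logd])

def negloglik(params):
    beta, cuts = params[:2], np.sort(params[2:])
    eta = X @ beta
    edges = np.concatenate(([-np.inf], cuts, [np.inf]))
    # Class probabilities are differences of normal CDFs between thresholds.
    prob = norm.cdf(edges[y + 1] - eta) - norm.cdf(edges[y] - eta)
    return -np.sum(np.log(np.clip(prob, 1e-12, None)))

start = np.array([0.5, -0.5, 0.0, 1.0, 2.0])
fit = minimize(negloglik, start, method="Nelder-Mead",
               options={"maxiter": 20000, "fatol": 1e-8, "xatol": 1e-8})
print("slopes (magnitude, log-distance):", fit.x[:2])   # close to (1.2, -2.0)
print("thresholds:", np.sort(fit.x[2:]))                # close to (1, 3, 5)
```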

Leveraging the long experience developed by his research group at UCLA since the 1970s, Kagan (2016) considers long-term global earthquake forecasting. In particular, he discusses the new forecast model nicknamed “GEAR”, for global earthquake activity rate model, a hybrid model that takes into consideration both smoothed seismicity and tectonics. It is based on a high-resolution map of Earth surface displacement which, after conversion to an earthquake rate, is combined with maps based on seismicity smoothing.
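As a schematic illustration of how such a hybrid might be assembled cell by cell (a sketch with an assumed blending rule and invented inputs, not the actual GEAR construction), one can blend a smoothed-seismicity rate map and a tectonically derived rate map through, for instance, a log-linear (multiplicative) combination, and rescale so that a target global rate is preserved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gridded forecasts (events per cell per year), illustration only.
n_cells = 1000
rate_seismicity = rng.lognormal(mean=-8.0, sigma=1.5, size=n_cells)  # smoothed past seismicity
rate_tectonic = rng.lognormal(mean=-8.0, sigma=1.0, size=n_cells)    # from surface deformation

def hybrid(rate_a, rate_b, weight_a=0.6, total_rate=None):
    """Log-linear blend of two rate maps; optionally rescale so that the
    global total matches a target rate."""
    blended = rate_a ** weight_a * rate_b ** (1.0 - weight_a)
    if total_rate is not None:
        blended *= total_rate / blended.sum()
    return blended

rate_hybrid = hybrid(rate_seismicity, rate_tectonic,
                     weight_a=0.6, total_rate=rate_seismicity.sum())
print(f"global rate preserved: {rate_hybrid.sum():.3e} events/yr")
```

The blending weight (0.6 here) is a placeholder; in practice such a weight would be chosen by retrospective testing against held-out seismicity.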

Siino et al. (2016) use hybrids of Gibbs point process models to describe the spatial distribution of earthquake events (with magnitude 4 and above) in the Hellenic area from 2005 to 2014. In particular, they compare a hybrid model based on a kernel-smoothed trend in geographic coordinates with a hybrid parametric model based on a second-order spatial polynomial trend and spatial covariates. Since the area is mainly characterized by microseismic activity, they find that the distance to the nearest volcano is not significant in explaining the spatial intensity, while, as expected, the distance to the plate boundary is negatively related to the conditional intensity.
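The interaction (Gibbs/hybrid) part of such models is beyond a short example, but the first-order trend they discuss can be illustrated with a toy discretized fit: cell counts from an inhomogeneous Poisson intensity with a second-order polynomial trend plus a covariate are regressed with a Poisson GLM. Everything below (window, covariate, coefficients) is invented and unrelated to the Hellenic catalogue.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Toy log-intensity on the unit square: second-order polynomial trend plus the
# effect of a covariate (distance to a notional plate boundary along x + y = 1).
def dist_boundary(x, y):
    return np.abs(x + y - 1.0) / np.sqrt(2.0)

def true_log_intensity(x, y):
    return 7.0 + 0.8 * x - 0.5 * y - 1.0 * x**2 - 3.0 * dist_boundary(x, y)

# Discretize the window into cells and draw Poisson counts from the true model.
nbin = 25
centers = (np.arange(nbin) + 0.5) / nbin
cx, cy = [a.ravel() for a in np.meshgrid(centers, centers, indexing="ij")]
cell_area = 1.0 / nbin**2
counts = rng.poisson(np.exp(true_log_intensity(cx, cy)) * cell_area)

# Poisson regression of the counts on the trend terms and the covariate, with
# log(cell area) as offset: a discretized inhomogeneous-Poisson fit.
design = np.column_stack([np.ones_like(cx), cx, cy, cx**2, cx * cy, cy**2,
                          dist_boundary(cx, cy)])
fit = sm.GLM(counts, design,
             family=sm.families.Poisson(),
             offset=np.full(counts.shape, np.log(cell_area))).fit()
print(fit.params)   # the last coefficient estimates the covariate effect (-3 here)
```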

3.2 Spatial models and extremes

In the second class we have three papers. The contribution by Arroyo and Emery (2016) focuses on the important problem of simulating multivariate random fields with stationary Gaussian increments in a d-dimensional Euclidean space. There are very few contributions in the literature on this issue, and the authors do a remarkable job. They consider a spectral turning-bands algorithm, in which the simulated field is a mixture of basic random fields made of weighted cosine waves with random frequencies and random phases. The weights depend on the spectral density of the direct and cross variogram matrices of the desired random field at the specified frequencies. They also show that the algorithm is computationally efficient and very accurate.
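The spectral idea at the heart of the algorithm can be conveyed with a much simpler univariate sketch (this is not the multivariate, intrinsic algorithm of Arroyo and Emery): for a stationary field with Gaussian covariance, frequencies are drawn from the normalized spectral density, phases are uniform, and the field is a scaled sum of cosine waves.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_gaussian_field(coords, sigma=1.0, length_scale=0.1, n_waves=1000):
    """Approximate simulation of a stationary Gaussian random field with a
    Gaussian (squared-exponential) covariance: frequencies drawn from the
    spectral density, uniform random phases, sum of cosines (the CLT gives
    approximate Gaussianity)."""
    d = coords.shape[1]
    freqs = rng.normal(scale=1.0 / length_scale, size=(n_waves, d))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=n_waves)
    waves = np.cos(coords @ freqs.T + phases)          # (n_points, n_waves)
    return sigma * np.sqrt(2.0 / n_waves) * waves.sum(axis=1)

# Example: simulate on a 60 x 60 grid in the unit square.
grid = np.linspace(0.0, 1.0, 60)
gx, gy = np.meshgrid(grid, grid)
coords = np.column_stack([gx.ravel(), gy.ravel()])
field = simulate_gaussian_field(coords)
print(f"spatial sample variance ~ {field.var():.2f} (marginal variance is 1.0)")
```

In the multivariate setting treated in the paper, the cosine weights are additionally coupled across components so that the prescribed direct and cross variograms are reproduced.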

The contribution by Lenzi et al. (2016) addresses the problem of wind power prediction and uncertainty quantification. They consider two different time scales, using a hierarchical model in which the spatial autocorrelation is introduced through a latent Gaussian field. Specifically, they work within the INLA framework and compare their results with classical geostatistical methods.
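As a stripped-down illustration of the latent-Gaussian-field idea (a toy sketch, not the INLA machinery or the actual model of Lenzi et al.), the observations can be viewed as a spatially correlated latent field plus noise, with prediction at new sites given by the conditional (kriging) mean and variance:

```python
import numpy as np

rng = np.random.default_rng(4)

def exp_cov(a, b, sigma2=1.0, scale=0.3):
    """Exponential covariance between two sets of 2-D locations."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return sigma2 * np.exp(-d / scale)

sites = rng.uniform(size=(50, 2))       # observation locations
new_sites = rng.uniform(size=(5, 2))    # prediction locations
tau2 = 0.1                              # nugget / measurement-error variance

# Simulate from the hierarchical model: latent field plus observation noise.
K = exp_cov(sites, sites)
z = rng.multivariate_normal(np.zeros(len(sites)), K)        # latent field
y = z + rng.normal(scale=np.sqrt(tau2), size=len(sites))    # noisy observations

# Conditional mean and variance of the latent field at the new sites.
K_obs = K + tau2 * np.eye(len(sites))
K_cross = exp_cov(new_sites, sites)
pred_mean = K_cross @ np.linalg.solve(K_obs, y)
pred_var = np.diag(exp_cov(new_sites, new_sites)
                   - K_cross @ np.linalg.solve(K_obs, K_cross.T))
print(np.round(pred_mean, 3))
print(np.round(pred_var, 3))
```

INLA replaces this brute-force linear algebra with fast approximate Bayesian inference over the full hierarchy of hyperparameters, which is what makes such models practical at larger scales.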

Finally, the work by Castro and De Carvalho (2016) considers a density regression model for the spectral density of a bivariate extreme value distribution. The objective is to assess how extremal dependence can change with a covariate. They consider an extension of the Nadaraya–Watson estimator through a double kernel estimator, where the usual scalar responses are replaced by mean-constrained densities on the unit interval. They illustrate their results on both synthetic and real data.
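A generic double-kernel construction in the same spirit (but ignoring the mean constraint that is central to their proposal) smooths in two directions at once: kernel weights in the covariate and a kernel density estimate in the response on the unit interval.

```python
import numpy as np

rng = np.random.default_rng(5)

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def double_kernel_cond_density(w_grid, x0, w_obs, x_obs, h_x=0.5, h_w=0.05):
    """Nadaraya-Watson-style estimate of the conditional density of the
    response w (on the unit interval) at covariate value x0: covariate
    kernel weights times a kernel density estimate in the response."""
    cov_w = gaussian_kernel((x0 - x_obs) / h_x)                    # covariate weights
    resp_k = gaussian_kernel((w_grid[:, None] - w_obs[None, :]) / h_w) / h_w
    return resp_k @ cov_w / cov_w.sum()

# Toy data: the response distribution on [0, 1] shifts with the covariate x.
n = 500
x_obs = rng.uniform(0.0, 10.0, n)
w_obs = rng.beta(2.0 + 0.5 * x_obs, 5.0)

w_grid = np.linspace(0.0, 1.0, 101)
dw = w_grid[1] - w_grid[0]
dens_low = double_kernel_cond_density(w_grid, 1.0, w_obs, x_obs)
dens_high = double_kernel_cond_density(w_grid, 9.0, w_obs, x_obs)
print(f"estimated mean of w given x=1: {(dens_low * w_grid).sum() * dw:.2f}")
print(f"estimated mean of w given x=9: {(dens_high * w_grid).sum() * dw:.2f}")
```

In the extreme-value setting of the paper the estimated densities must additionally satisfy the moment constraint of the spectral measure (mean one half on the unit interval), which is exactly where their estimator departs from this naive version.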