1 Introduction

1.1 A Tribute to Nicholas Ambraseys

Nikos Ambraseys left the Seismological and Engineering community a vast legacy in Engineering Seismology and Earthquake Engineering. The present work has many connections to his contributions. To start, let me mention just a few of the most important ones:

  • The role of Nikos Ambraseys in Engineering Seismology.

  • The contributions to Historical Seismicity in many regions of the world, particularly from Central Europe to the Middle East and the Far East.

  • The countless field missions where he could learn and interpret the observations. Citing Douglas et al. (2014), Ambraseys summarised the importance of field missions: “The site of a damaging earthquake is a full-scale laboratory from which significant discoveries may be made, by seismologists, geologists, engineers, sociologists, or economists, not to mention politicians”.

  • The first steps to organise an Archive for Strong Motion Data at the world level. This was the first initiative to bring together data spread worldwide across various countries and entities.

  • His post at Imperial College and the school of thought created there, which gathered brain power through the launch of Master's courses on Earthquake Engineering and Engineering Seismology.

  • The attenuation curves: his several contributions at different times were essential, and his name is recognised as one of the most well-known worldwide in this field.

  • The sea-bed tsunami interpretation, a topic dealing with the influence of large tsunamis on ships near the epicentral area, which can suffer from the direct acoustic P-wave. The subject has still not been treated in detail to this day.

The present review paper is organised into seven parts as follows:

  1. Introduction

  2. Earthquakes continue to devastate Humanity

  3. Evolution of Seismology and Engineering Construction since the mid-1700s until the mid-1900s

  4. Main achievements in recent periods

  5. Intensity Scales: how can we upgrade them

  6. Changes in paradigm in Seismology and Earthquake Engineering & new lines for future developments

  7. Final Considerations

After this Introduction, which starts with a tribute to N. Ambraseys, we look at the evolution of Seismology and Engineering Construction from the early 1700s till 1950 to understand the significant development of those two fields of knowledge. Then we turn to the new advancements towards the future mitigation of earthquake impacts, looking to society with a clear proposal in the direction of sustainability and the ecological challenges of the twenty-first century. We compare the evolution in time of Seismology and Construction versus earthquakes, how they shaped our present, and anticipate the needs of the decades to come. We spend most of our time analysing the period 1755–1950 because not enough attention has been paid to this exciting period. We then jump to our days to look into a few significant problems that require the involvement of the scientific, technical and political communities to mitigate the real-world impacts of earthquakes.

Finally, we speculate on which initiatives need to be addressed in the future. In particular, (i) we look into the developments Intensity Scales should pursue to reduce uncertainties. More than 20 years have passed since the last upgrade, and today the information is much more extensive and reliable than in the past. Several examples will be presented to illustrate how the “frequency of motion” should be included in the main categories of the Scale and how it could be beneficial to add a few more descriptors, namely the shaking of objects and the sloshing of water in containers. (ii) We analyse the lines of development needed to mitigate earthquake impacts and respond to present and future needs, concentrating on the new scientific developments that are changing Seismology and Earthquake Engineering into a more proactive science, such as:

  • EEWS (Earthquake Early Warning Systems) and health monitoring, driven by powerful low-cost instrumentation (MEMS, Micro-Electromechanical Systems) and complemented by citizen science. The changes that health monitoring is bringing to Earthquake Engineering, both as a precautionary indicator of structural malfunction and as a rapid post-earthquake evaluation system, will also be discussed (a minimal detection sketch follows this list).

  • Performance-based design, a breakthrough in Earthquake Engineering that responds to the significant uncertainties present in all design processes, especially for new structures.

  • Field missions, which, despite all the new technological tools, remain an indispensable means to better understand the performance of structures, infrastructures, and mitigation practices.

  • Finally, “Machine Learning”, which will extract new, reliable information from all these advancements, merging the ingredients into simple recommendations.
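To make the EEWS/MEMS point in the first bullet concrete, below is a minimal sketch of the classic STA/LTA trigger on which many early-warning and low-cost MEMS deployments build; the window lengths, threshold and synthetic signal are illustrative assumptions of ours, not parameters of any specific system:

    import numpy as np

    def sta_lta_trigger(accel, fs, sta_win=0.5, lta_win=10.0, threshold=4.0):
        # Classic STA/LTA detector: flag samples where the ratio of the
        # short-term to the long-term average of |acceleration| exceeds
        # a threshold; window lengths are in seconds (illustrative).
        x = np.abs(accel)
        n_sta = max(1, int(sta_win * fs))
        n_lta = max(1, int(lta_win * fs))
        sta = np.convolve(x, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(x, np.ones(n_lta) / n_lta, mode="same")
        return sta / np.maximum(lta, 1e-12) > threshold

    # Illustrative use: background noise with a synthetic onset at t = 30 s.
    fs = 100.0
    t = np.arange(0.0, 60.0, 1.0 / fs)
    accel = 0.01 * np.random.randn(t.size)
    accel[t > 30.0] += 0.2 * np.sin(2.0 * np.pi * 5.0 * t[t > 30.0])
    triggered = sta_lta_trigger(accel, fs)
    print("first trigger at t =", t[triggered][0] if triggered.any() else None, "s")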

As this is a long text, to facilitate reading and reach the pursued goals, each Section starts with a summary of the topics under discussion and ends with some of the main achievements. This causes repetition here and there, but we think it is essential to convey the ideas. To highlight critical comments or Notes, we enclose the text in Boxes.

As this review paper enters the historical times of earthquake science, the text is supported by many references, selected as necessary for the discussion and the proposals made throughout, and as general references to complete the material covered. A few references with two authors, which became milestones, are cited with both names. Many references are books representing landmarks for the period examined in this work. References not cited in the text are organised in "Appendix 1". A collection of images captured by a few video cameras is also mentioned, as they are an essential tool for supporting some of the facts presented and discussed.

To reduce the size of the written material, some of it is simply enumerated in Tables or Boxes with simplified accompanying text. We often use direct transcriptions of the authors' text, or parts of it, so as not to modify the meaning of the message.

1.2 Point summary of main items

1.2.1 Seismology

  • Tentative explanations of the mechanisms of earthquakes from Chinese times to the mid-1900s.

  • The Great Disasters of history. Statistics since 1700 in the face of population growth.

  • The nineteenth century of field observations—interpretations. First instrumentation.

  • The early twentieth century on Intensity scales.

  • The significant advances of the early 1950s. The first world networks of seismological stations.

  • The digital era—1980.

  • Massive digital instrumentation of high quality.

  • The strong motion networking.

  • EEWS. Problems and future.

  • MEMS and Citizen Science.

  • Arrays and data mining.

1.2.2 Construction and earthquakes

  • The first treatises of Vitruvius and Roman construction.

  • The new technologies in the Reconstruction of Lisbon after 1755.

  • The Casa Baraccata.

  • The Construction Encyclopaedia of Gwilt.

  • The introduction of Steel in the nineteenth century.

  • The outstanding construction accomplishments.

  • The Introduction of Reinforced Concrete in the early twentieth century.

  • Lessons from Great Disasters till the mid-1950s.

  • The codes of practice.

  • The Strong motion advancements.

  • The Material Science.

  • Base Isolation (rubber bearings & magnetic levitation).

  • Conflicts of structural safety with patrimonial values, aesthetics, and comfort necessities.

  • Health Monitoring.

  • The use of Machine Learning and Data Mining.

1.3 A few needs for acting

  • Unification of Scales → shaking, tsunami, acoustics; introduce frequency and amplitude as added parameters to get Intensities.

  • Qualify and better quantify the terms of the descriptors, especially up to Intensity VI, to reduce uncertainties.

  • Introduction of structural dynamics knowledge before assigning Intensities above VI.

  • Use DYFI (Did You Feel It) enquiries as single vibratory data points. Use video cameras as an added value to describe the wave field.

  • Reduction of uncertainties in several items: conversion of Intensities into strong-motion IMs (intensity measures).

  • Codes and quality control. Dignify the Profession of Engineer.

  • Policies for massive retrofitting of non-conforming construction.

  • Education. Public perception.

  • Reverse the paradigm of earthquakes as disasters into earthquakes as a sustained challenge.

2 Disasters keep on causing massive destruction


Summary

Unlike other natural events, earthquakes do not increase their pace, but in some regions energy dissipation takes a very long time to occur.

Different catalogues show similar results in terms of inter-arrival times. It is only recently that economic losses became available, and even nowadays they are very difficult to assess accurately, especially when indirect losses enter the account.

The impact on victims per million inhabitants has been reduced significantly in the last decades, while the economic impact per million inhabitants and GNP is almost constant.

There are various indicators for introducing a metric of impacts. We discuss this topic, considering multiple entries and using the concept of multi-hazard associated with earthquake occurrences.

Earthquakes belong to the class of natural hazards tied to recent (millions of years!) geological evolution; they are probably the oldest of them, contemporary only with meteorites or temperature extremes.

They are a phenomenon of release of energy accumulated inside the Earth's crust. Tectonic plate motion has caused this accumulation for many millions of years, and it will continue as long as the geological processes keep the same pattern as in recent geological epochs. Earthquakes are part of the Earth's evolution, and there is no way to stop their occurrence for millions of years to come. Human generations have accounted for their effects ever since they could transmit this information to younger generations. Earthquakes are marked in the first objects humans produced, from tombs to the ceramic vases used to keep food, and in the ornaments preserving human traces in cemeteries. But large earthquakes also marked the landscape before humankind, and this can be traced with some accuracy with the help of archaeo-seismology. This modern science can analyse periods longer than those covered by historical human witnesses, observing the movements of objects or the “strange deposits” caused by tsunamis that left their “signature” in nature, and linking these phenomena to earthquakes/tsunamis. Nowadays, science can use many tools to obtain information on periods with poor oral or written communication (historical seismicity) or before humankind. Archaeo-seismology has opened great avenues to look into the past, which are very useful for anticipating the future. In fact, in many places, significant events occur with return periods longer than several civilisations, and humanity loses this remembrance. Science has come to understand these rare events of considerable impact much better in the last two centuries.

Even descriptions of recent earthquakes are doubted by many scientists, who do not trust descriptions without an associated number. The 1755 Lisbon earthquake is one of those events where science started changing its attitude towards information of a historical nature, no longer refusing entirely what was said in previous epochs. This earthquake is full of new indications which no one can now ignore: liquefaction a few hundred km away, rivers spilling their water over the margins, seiches observed in lakes at great distances and chandeliers oscillating thousands of km away.

Even nowadays, some phenomena beyond what instruments can record are neglected. In an enquiry of senior people who remember well the effects of the 1969 St. Vincent M7.9 earthquake in SW Iberia, the population was very explicit in saying that the waves were accompanied by strong noises from underneath and that the sky suddenly became illuminated. Another example: in the aftermath of the Sumatra 2004 event, the rescue teams knew when the next aftershock was arriving just because they felt nauseated a few moments before the shaking arrived. This is to say that scientists should be humble enough to use all the signals presented as observations, some already reported in ancient treatises, to understand the complex process of occurrence, wave propagation and the corresponding effects on our environment.

To keep the reader attentive to the real world of the earthquake “drama”, we show in Fig. 1 a collection of moments of great importance that illustrate the various forms in which earthquake events express their effects: (a) the effect of fault rupture right over its trace, and the consequences for four similar buildings built around the fault trace; resorting to simple physics, two of them collapsed in opposite directions, in accordance with the right-lateral fault motion, and a third building placed in the middle of the fault trace was cut vertically. (b) The damage to a late nineteenth-century bridge, reported very well by J. Milne. (c) A landslide, expressing how shaking can have “indirect” tragic consequences due to poor or non-existent land-use policies. (d) Non-structural elements (in-fills), which can be highly disruptive if not adequately treated. (e) Fire, another “indirect” effect that has accompanied shaking for many centuries and may cause more impact than the shaking itself. (f) Tsunami, which cannot be treated separately from shaking; only very recently has the “multi-hazard” concept been considered in this analysis. (g) We leave for the end the images of the recent Haiti event of August 15, 2021, to show how earthquake occurrence carries two different memories. The first is the unique correlation of events occurring in different segments of the same fault, separated by a period of 11 years: the August 15, 2021, Haiti M7.2 earthquake, a repetition of the 2010 Haiti earthquake (which caused 200,000 victims and great destruction), added 2,500 victims, 12,000 injured, and many more homeless. The second memory, attesting that we cannot eradicate the problem, is that we see the same type of damage in both events, denoting that rehabilitation should take place even in developing countries to avoid similar consequences.

Fig. 1 Examples of earthquakes with significant impacts

2.1 Earthquakes as part of the Recent World Disaster Panorama

It is well known that several practical factors exacerbate the poor behaviour of certain types of construction. To mention cases where “doubts” are not present, we describe a few accounts of varied performance:

  • In masonry structures up to 3–4 stories high, walls show a tendency towards an “out-of-plane” mechanism, causing sizeable economic impact but a reduced number of victims inside.

  • In reinforced concrete structures exhibiting soft stories, the tendency is collapse at the first floor, probably leading to a “pancake mechanism”, with significant impact in human and economic terms.

    Furthermore, it is well settled that:

  • Construction that followed codes of practice and was subjected to good inspection tends to perform much better than construction built without any attention to construction rules.

  • Earthquakes keep on causing significant impacts (human and economic) in societies not prepared to deal with earthquakes.

  • In the case of mega-quakes, even societies that are more resilient to earthquakes are very much impacted, especially by indirect and cascade effects.

These examples are good indicators that science and technology know much about earthquakes and how to mitigate their effects.

However, the whole panorama of catastrophes is quite different. “The world lost as much as $232bn US due to natural disasters in 2019, with India leading in casualties with 1,750 deaths”, said a recent report by AON (https://www.aon.com/risk-services/professional-services/default.jsp). According to the report, Weather, Climate Catastrophe Insight for 2019, Cyclone Fani was one of the top 10 disasters of 2019, affecting Andhra Pradesh and Odisha, apart from Bangladesh.

News like this appears every year as a result of natural phenomena with negative impacts on human life, resulting in disasters, and is reported by the various entities that nowadays run disaster databases. We can cite the reinsurance companies, like Munich RE, Swiss RE or MAPFRE RE, and the entities interested in studying the epidemiology of worldwide events, such as UNDRR, USGS or EMDAT, where the CATDAT earthquake damage data resides. GEM is also a partner that has developed many tasks in this area.

Re-insurance companies like those referred to above are very interested in impact data from natural and technological (NaTech) events due to their obligations to reimburse in case of damage. They can only survive because they deal with various types of events and their clients are spread across all continents. This multiform universe allows them to transfer liabilities from one place to another and satisfy their contracts; a single country with non-stationary occurrences could never play this role. However, not all assets are covered by insurance. According to Munich RE (2021), total losses resulting from natural events since 1980 sum up to $5,200 bn US; more than 70% of this total was not insured. The 2011 Tohoku (Japan) earthquake was the largest ($210 bn US). Hurricane Katrina, which hit New Orleans in August 2005, was the costliest insured event, with a total of $60.5bn US (original values).

Other data banks have been gathering data continuously since 1970, and their information is already critical because it covers the last 50 years. Catalogues assembled per event for periods before 1970 will be analysed later on. These catalogues are of the utmost importance for understanding the historical evolution of earthquake events and are an excellent base for extrapolating to the future. Of course, historical catalogues cannot provide information as reliable as the new databases, but they are essential because they cover much more extended periods.

We are interested in understanding if earthquake events in the World are keeping their stationarity or if there is any incremental modification as has been happening in climate-change-related events. “Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the great increase of exposure”. Why is that?

Extending the period of analysis back to 1900, the most significant economic impact is that of the 1923 Great Kanto (Japan) earthquake ($214 bn US), of the same order of value as the 2011 Tohoku (Japan) event (> $300 bn US), both adjusted to 2011 $US market values. The other important events are shown in Fig. 2.

Fig. 2 Most significant events since 1900: magnitude (blue); deaths (orange); losses (grey)

This Figure gives several indications on how to present the impact of earthquakes, showing the magnitudes, the number of victims and the losses inflicted, and their significant variations from event to event. This raises the question of the most adequate indicators for measuring the impact of earthquakes. Are world policies and the various international initiatives helping us lose less? Or, on the contrary, is the problem a random lottery that no one can predict?

This section will look into the problem of occurrence from a Worldwide and European perspective and look for indicators to predict the impact of earthquakes.

Before advancing, we should have a word on Impact Indexes.

Several indexes can be related to disaster activity. Some are local, others regional or global. The most important is the impact on humans (victims, heavy injuries, homeless), followed by the impact on the economy and the social tissue (property losses, indirect losses, intangible impacts). The latter are very difficult to measure; only a qualitative analysis can be made. In almost all of them, the epoch of the event is the crucial parameter for examining the accuracy of the estimated values. In modern times, all numbers are available, even though uncertainties may become more prominent in certain countries or regions depending on the state of development of their societies. The number of casualties is the most accessible parameter, unless the affected region is subject to mass population movements due to social conflicts (wars, migrations, refugees, etc.). The economic parameters are always more challenging to obtain due to the dependence of the currency's value on time.

Market values change over the years due to inflation and other issues. For comparisons among regions and epochs, besides total values, it is essential to normalise the results by the existing population and the GNP (at the time) of the geographical unit where the problem is being analysed. It makes a difference whether an impact is normalised over an entire country or region or only over the affected area. As mentioned before, the normalisation shall be done for the same given time, preferably at an epoch not too distant from the time of occurrence. The economy changes considerably over time, so presenting ratios (costs/GNP) computed at different times is not a correct option, and if we want to accumulate numbers on the economic side, we always have to correct for inflation when adding values. As general information, we can say that only after 1900 can the accounts of losses be considered quantitative, and only after 1950 did these numbers become more accurate. Before 1900, with a few exceptions, only vague descriptions of economic impacts are available.
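As a minimal illustration of the bookkeeping just described (all figures below are hypothetical, chosen only to show the mechanics), a loss is first brought to reference-year dollars with a price index and only then divided by GNP and by population:

    def normalise_loss(loss_usd, cpi_event_year, cpi_reference, gnp_ref_usd, population):
        # Inflation-correct the nominal loss to reference-year dollars,
        # then normalise by GNP (expressed in the same reference-year
        # dollars) and by population, per million inhabitants.
        loss_ref = loss_usd * cpi_reference / cpi_event_year
        return {
            "loss_ref_usd": loss_ref,
            "loss_over_gnp": loss_ref / gnp_ref_usd,
            "loss_per_million_inhab": loss_ref / (population / 1e6),
        }

    # Hypothetical event: $10 bn nominal loss, CPI 80 at the event year
    # vs. 100 at the reference year, GNP $500 bn, 10 million inhabitants.
    print(normalise_loss(10e9, 80.0, 100.0, 500e9, 10e6))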

Besides the total and normalised values, a few other indexes are appropriate to mention.

The “vulnerability ratio”, used to distinguish between “developed” and “developing” countries, can be defined as the number of fatalities (× 10⁶) divided by the $US damage cost (Vranes, personal communication 2009). The vulnerability ratio for developed countries lies in the range 0.01/$–0.03/$, while for developing countries the ratio is three orders of magnitude worse, in the range 2/$–27/$ (Bilham 2009).
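Read literally, the definition above corresponds to the following expression (our transcription of the text, not a formula reproduced from Bilham 2009):

\[ R_V = \frac{10^{6} \times N_{\text{fatalities}}}{\text{damage cost in \$US}} \]

so that, for instance, a developed-country event with 100 fatalities and $10 bn of damage gives \(R_V = 10^{8}/10^{10} = 0.01/\$\), at the lower bound of the range quoted above.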

In recent times, a new composite index was proposed to evaluate the status of a region, the so-called HDI (Human Development Index), considered the leading indicator for the vision of “development as freedom”; it is a composite index based on three dimensions: income, health, and education (Mariano et al. 2021).
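For concreteness, in its current UNDP formulation (which the cited paper may refine), the HDI is the geometric mean of the three normalised dimension indices:

\[ \mathrm{HDI} = \left( I_{\text{health}} \cdot I_{\text{education}} \cdot I_{\text{income}} \right)^{1/3} \]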

In a brief account of the parameters available through the ages, from ancient times to the present day, we can refer to the first observations of earthquake impacts, initiated with archaeo-seismological information gathered from sedimentary tsunami witnesses a few thousand years BC. This information is precious but only gives us, with a significant level of uncertainty, the possible time of the event and how large it might have been; the errors in the sources are enormous. The first historical catalogues show more accuracy on time and less on impact information. After the 1755 Lisbon earthquake, the data were transmitted better and with less uncertainty. It is curious to note that, for that event, the uncertainty in human losses is still significant, between 8,000 and 20,000 according to the most recent studies, reaching a maximum of 10% of the Lisbon population at the time, while the attributed economic impact is in the range of 30–50% of the GNP (at the time). By the end of the nineteenth century, the information became more precise, as the detail of the observations attests: the October 28, 1891, Japan earthquake caused 10,000 deaths, 20,000 injured and 130,000 houses destroyed, as Milne (1911a) corroborates.

With the increasing completeness of catalogues, other parameters came into place over the last 120 years. Since the end of the twentieth century, with the creation of customised data banks, initially pushed by the re-insurance companies and then by agencies dedicated to disaster studies, the information has become more complete in terms of human and economic impacts.

Looking now at the world’s natural disasters from 1970 to 2020 (Fig. 3), we see that earthquakes are not among the most frequent natural events compared to “climate-change” events. Still, they cause more deaths (58%) for a given period of time (Fig. 4). We also see, comparing the two 20-year periods 1980–1999 and 2000–2019, that the total number of global events increased with time (74%), the number of deaths remained almost constant, the affected population increased (23%), and the economic losses increased (82%).

Fig. 3 Natural disasters in the last 50 years (by type) (EMDAT 2020)

Fig. 4 a Number of deaths per type of disaster for the period 2000–2019; b comparison of impacts in the two sequential periods, 1980–1999 and 2000–2019 (from CRED/UNDRR 2020)

Back to Fig. 3, we can say that floods and extreme weather are the most common events; together with the other climate-related events, they are responsible for 80% of the total. Earthquakes take only a 10% share, but with significant impacts. The number of events peaked around 2002 and dropped afterwards.

In Fig. 5, we can see how the number of damaging earthquakes relates to HDI since 1900. Whereas in the early twentieth century most damage occurred in low-HDI countries, towards the end of the century destructive earthquakes increasingly affected countries of moderate-to-high HDI and above (Daniell et al. 2011). Analysis of CATDAT, a thoroughly vetted catalogue prepared by Daniell et al. (2012), shows that damaging earthquakes and their secondary effects (tsunamis, fire, landslides, liquefaction and fault rupture) are responsible for the major share of the total damage inflicted worldwide since the mid-1900s.

Fig. 5 Damage inflicted by earthquake activity since 1900 (HDI: Human Development Index) https://gfzpublic.gfz-potsdam.de/pubman/item/item_245308

Analysing the statistics of the last 120 years on the number of victims and economic losses (Fig. 6), we observe that the number of victims per decade per million inhabitants is steadily decreasing. In contrast, financial losses increase exponentially due to increasing exposure and market values. However, if we normalise by GNP the situation is better, and better still if we also normalise by population. Looking in more detail, we see that the 120 years should be separated into two periods, 1900–1970 and 1970–2020. Over the last 50 years, the normalised economic losses have been almost constant, probably meaning that mitigation policies might be producing some results.

Fig. 6 Decennial numbers for the past 120 years of a deaths and b economic impact; c normalised by inhabitants & world GNP

It is also interesting to verify an almost perfect correlation between Economic Losses/GNP and Human Losses/Inhabitants (Fig. 6d).

In a recent study, Dollet et al. (2021) proposed that the number of victims and total losses inflicted by earthquakes be normalised by the region's total population and GNP enclosed in the isoseismal V (EMS-98). For the last 50 years, the authors conclude that the evolution of the numbers suggests a steady decrease with time. These values agree with the global world numbers of Fig. 6 for the victims but not for the economic losses, which still show a plateau in the last 50 years.

Is worldwide seismicity stationary in time? Based on the excellent data of the last 50 years expressed in Fig. 7, the answer appears to be yes: the evolution during 1970–2020 of the number of events per year, disaggregated by magnitude range (M > 8; 7.9 > M > 7.0; … 4.9 > M > 4.0), is very stable. The (M, λ) graph shows that the fitted curves are almost horizontal and vertically offset by factors of about 10. The shorter periods available for the lower magnitude classes, restricted to more recent epochs, are a consequence of the development of the world seismographic networks and the earlier lack of worldwide detection capability for the lower magnitudes.

Fig. 7 World seismicity 1970–2021 disaggregated by magnitude range (https://en.wikipedia.org/wiki/List_of_earthquakes_in_2021)

Data in this figure can be extended to older epochs, and in that case, the class M > 8 can be disaggregated to other narrower classes like 8.9 > M > 8.0 or even more subdivided.

Before finishing this topic, we present the most significant events in West-Central Europe over the last 60 years, with an average interval of 6 years per event (Table 1). Looking at earlier periods, the geographic pattern is different, indicating that short periods cannot represent the seismicity of a region.

Table 1 List of events in West-Central Europe in the last 60 years

Italian earthquakes dominated the last 60 years of earthquake activity in Europe, even though magnitudes did not surpass M7.0. To appreciate the difficulty in assessing an earthquake’s economic losses, compare Fig. 8 (Dolce and Bucci 2017; Dolce et al. 2021) with Table 1 (author's compilation) and note the differences. The same difficulty applies to significant world earthquakes (Tables 1 and 2). Discrepancies of a factor of 3 or more are present (see Irpinia 1980), so only tendencies can be considered when analysing economic impacts.

Fig. 8 Economic impact of the last 50 years in Italy (Dolce et al. 2021)

Table 2 List of significant earthquakes in the period 1972–1990 in the world with loss values and % of GNP (after Coburn and Spence 2002, in Elnashai 2002)

Keeping in mind all these limitations, the losses as a percentage of GNP range from 0.3% to 20%, but in most cases the value does not surpass 2%. Note that the low values assigned to ex-Yugoslavian earthquakes derive from the fact that we took the GNP of united Yugoslavia.

In summary, among all natural hazards occurring in the European region, “earthquakes lead to the highest number of fatalities and, after severe storms, cause the second-highest annual economic losses. From 2006 to 2015, Europe experienced 21 earthquake-related disasters that resulted in 1049 fatalities, more than 18 billion Euros in economic losses, and affected 284,000 people. Fortunately, in the last years, risk awareness and perception towards seismic threats have increased among the public and policymakers in a few European countries”. However, much more should be done. If citizens become more educated and aware of what natural hazards they are likely to face in their communities, they can press authorities to implement preventive and protective measures.

In 1972–1990, financial losses worldwide went up to 40% of GNP (Coburn and Spence 2002), showing how significant this impact may be for a country’s economy. Again, the numbers are approximate (Table 2) due to the difficulty of simultaneously assessing the losses and the GNP.

2.2 Historical seismicity before 1900

In contrast to the difficulty of advancing explanations for the origin of earthquakes, catalogues recording the date and size of events have been a common practice since ancient times. Many experts have dedicated their lives to looking for old events and organising them in sequential form.

2.2.1 Differences in terminology

The nomenclature for “earthquake” is quite varied: seisms, quakes, “terramotos”, “terremotos”, “tremblement de terre”, séisme, “terræ motu”, “tremor”, “tremuoti”, sismo, “tranblemann tè” and earthshaking. Except for “tremor”, all are synonymous; however, some are more used than others depending on the context.

  • The word “earthquake” (sismo) comes from the Greek (Σεισμός) and is the form most used in science; the term “seismic event” is sometimes associated with it. “Terramoto” is more used in social communication. “Terremoto” is the identical form used in Spanish, Italian and Brazilian Portuguese, and was also the form in Old Portuguese. “Terræ motu” is the Latin form, used in Hungary when the first isoseismal map was presented. There are also dialect forms and translations in various languages.

  • “Tremor de terra” is used for an earthquake with the most significant impact on society, causing victims and damage to the built stock.

  • Tremor is used in conjunction with “seismic” or earthquake shaking.

Other expressions have been created recently, such as “mega-quakes” (high-impact earthquakes) or “seismic swarm” (related to volcanic activity: a massive series of small earthquakes occurring in a localised area). We could also add “micro-earthquakes”, quakes of very small magnitude that may precede a larger earthquake.

Earthquakes are linked to tectonic, volcanic, or “induced” seismicity. The latter comes from anthropogenic loading or unloading of some expression of the Earth's crust, as derived from filling reservoirs or opening underground galleries. An example is the earthquakes of recent years in Groningen, Holland, induced by the extraction of gas from the underground reservoir.

Other vibrations caused by explosions, bursts, major collapses of structures, etc., are not called earthquakes but seismic noise or "cultural noise", always present on the Earth's surface and captured by very sensitive stations. It comes from urban traffic, wind, waves hitting coastal areas or even temporary anthropogenic activities.

The word tsunami comes from the Japanese 津波; the phenomenon is also called “maremoto”, harbour wave or tidal wave. There are several causes of tsunamis beyond fault rupture in the ocean. In southwest Asia, there are several local expressions to designate the tsunami.

2.2.2 Catalogues and their importance

Fortunately, a few earthquake catalogues exist for historical times. They should be carefully analysed, using modern advancements that allow information to be cross-correlated. This information has been gathered through the efforts of many experts in seismology, history, geology, etc., and has served to determine the main characteristics of ancient events.

Historical catalogues have been produced by individuals since antiquity: Aristotle was one of the first to launch a world catalogue, and the Chinese, in the first or second century AD, prepared catalogues of their region and designed the first seismoscope in 132 AD. World catalogues appeared in the eighteenth century. Figure 9 shows one published in 1722 by the Portuguese engineer Manuel de Azevedo Fortes (Repositório 1722). Later, right after the Lisbon earthquake of 1755, the Universal History of Earthquakes was published in 1758 (Moreira de Mendonça 1758) (Fig. 10).

Fig. 9 World catalogue published in 1722, “Repositório Universidade de Coimbra” (eighteenth century)

Fig. 10 G–R plots for the period 1000–2000 AD (Euro-Mediterranean, EMEC)

In the nineteenth century, several new compilations of world earthquakes were made, namely by R. Mallet and his son J.W. Mallet (1858). Mallet estimated that 13 M people were killed by earthquakes over the 4000 years up to 1900, about 0.35 M casualties/century, whereas the twentieth century alone saw 1.5 M casualties/century. To put these numbers in perspective, from 2000 BC to the present day the world population increased from 30 × 10⁶ to 7 × 10⁹.

From 1885 to 1907, Montessus de Ballore (1911) compiled 170,000 earthquakes, an average of about 7,700 events per year. According to our 2021 catalogues, this means that he was capturing events of M > 4.5 worldwide.

A considerable number of catalogues emerged in the late twentieth and early twenty-first centuries, concentrating on specific regions or made country by country. To name a few sources for Europe, we should refer to Karnik (1971), who published the Earthquake Catalogue 1801–1900, to van Gils and Leydecker (1991), and to Stucchi et al. (2017) in the framework of the EU programs NERIES, REAKT, SERIES, NERA and SERA, accessible in the SHEEC Catalogue (Stucchi et al. 2013); for the world, the compilations made in the framework of GEM (Pagani et al. 2015), the ISC-GEM, and the USGS (2021). Catalogues and statistics are also summarised in https://en.wikipedia.org/wiki/List_of_historical_earthquakes. A database of damaging small-to-medium magnitude earthquakes (Nievas et al. 2020), with data since 1900, has been compiled to analyse frequent events responsible for non-negligible impacts. The authors describe all the problems in getting accurate data, especially when disaggregating the information into many categories.


Some statistics


For Europe

Citing Karnik (1971), from the Earthquake Catalogue 1801–1900 complemented with data for 1900–2020, the following statistics can be observed:

  • Last 60 yr—55 events I0 > VII (Europe). One event per year.

  • In 1900–2000, 27 tsunamis of some importance occurred: an inter-arrival time of 3.7 years per event, or 0.27 events per year. In 2000–2020 there were 21, about one per year: quite an increase in the twenty-first century relative to the twentieth.

From other catalogues, it is possible to observe various numbers of importance. The Euro-Mediterranean Earthquake Catalogue for the last millennium (EMEC) (Grünthal et al. 2012) (Fig. 11) shows that the Gutenberg–Richter (1954) law of occurrence (G–R), with a slight downward curvature, predicts for Europe one earthquake with M > 8.5 and 20,000 with M > 4 in a 1000-year period.
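As a back-of-the-envelope consistency check (our own computation, not taken from Grünthal et al. 2012), those two numbers are enough to fix the G–R parameters for Europe over a millennium:

\[ \log_{10} N(\ge M) = a - bM, \qquad b \approx \frac{\log_{10} 20000 - \log_{10} 1}{8.5 - 4} \approx 0.96, \qquad a = 8.5\,b \approx 8.1 \]

which indeed returns \(N(\ge 8.5) = 1\) and \(N(\ge 4) \approx 2 \times 10^{4}\) events per 1000 years.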

Fig. 11 The most important earthquakes in Europe in the twentieth century, Mw > 7.7 (ISC-GEM Global Catalogue: 1904–2017) (in Marreiros et al. 2021)

And for the last 115 years (Fig. 11), we only see 6 events with Mw > 7.7, all in the southern part of Europe. Maximum intensities vary dramatically for similar magnitudes due to the very different epicentral distances involved.


For the World


As already mentioned, there are a few world catalogues.

Table 3 presents the most significant earthquakes of the last millennium, with more than 50,000 victims each.

  • 37 earthquakes of M8.5 + since 1500: an inter-arrival time of 13.5 years, or 0.074 events per year, for mega-earthquakes (see Table 3)

Table 3 Most significant earthquakes of the last millennium (more than 50,000 victims each), ordered by decreasing number of victims

For Mega Earthquakes

  • In the twentieth century, one earthquake M8 + per year.

  • 24 earthquakes since 1000 AD with more than 50,000 deaths each.

Among the initiatives of the most significant impact, we can mention the efforts made in the late 1900s by Gere (1983) for the world and, separately, for the American areas, Japan, China and Taiwan. An excellent example of the study of historical seismicity is Guidoboni and Ebel (2009). One of the most exciting compilations made for the world, covering the last 4000 years, was published by Dunbar et al. (1992) and treated in detail by Bilham (2004, 2009). Among the many statistics that Bilham (2009) presents on earthquake events and population evolution, we show in Fig. 12a the number of fatalities per event in the period 1500–2004. For this period, we can conclude that almost 1,000,000 victims occurred only once, 100,000 victims 35 times, and 10,000 victims 300 times. This figure has some similarity to the G–R plot if we replace fatalities with magnitude (see Fig. 12b).
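Read as a frequency–size law (a rough fit of ours to the three counts just quoted, not a result from Bilham 2009), these numbers suggest a power-law decay of the number of events with the fatality count:

\[ N(\ge F) \propto F^{-\beta}, \qquad \beta \approx \frac{\log_{10}(300/1)}{\log_{10}(10^{6}/10^{4})} \approx 1.2 \]

which is what gives the figure its G–R-like appearance.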

Fig. 12 World Catalogue of Significant Earthquakes (2150 BC–1991 AD): a period 1500–2004; b (USGS) M > 5.5

Figure 13 presents the world statistics of events with M > 8.2 for the last 115 years. It becomes clear that only one event with M > 9.4 has occurred, and the other bins fit the G–R law.

Fig. 13 Distribution of world magnitudes in the last century, per class (data from table in Hayes et al. 2020)

From all the data presented for the last 120 years, it is possible to prepare Table 4 with the world record of occurrences in terms of the number of events/yr or the inter-arrival time. The world rate of occurrence for M < 8.0 is widely known, with information disaggregated by integer magnitude values. With the above numbers we can also disaggregate for M > 8, namely M > 8.3, M > 8.7 and M > 9.4. The estimation of the latter was only possible with the help of the catalogues of the most significant earthquakes since 1000 AD because, as Fig. 12a shows, the prevalence of large magnitudes is not stationary within 1900–2020. It is interesting to note that events like the Valdivia, Chile earthquake of 1960 (M9.4+) occur about once every century.

Table 4 World record of occurrences, twentieth–twenty-first centuries

The values of Table 4 can be predicted from the equation \(T = 10^{-b(M_0 - M)}\,T_0\), where the parameters are taken from the G–R law (Chang 2021).
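A minimal numerical sketch of that relation follows; the values b = 1, M0 = 9.4 and T0 = 100 yr are illustrative assumptions of ours, anchored to the once-per-century Valdivia-class event mentioned above, not the parameters of Chang (2021):

    def inter_arrival_years(m, b=1.0, m0=9.4, t0=100.0):
        # T = 10**(-b*(m0 - m)) * t0: shorter inter-arrival times for
        # smaller magnitudes, scaled from the G-R law (illustrative).
        return 10.0 ** (-b * (m0 - m)) * t0

    for m in (9.4, 8.7, 8.5, 8.3):
        print(f"M >= {m}: one event every {inter_arrival_years(m):.1f} yr")
    # M >= 8.5 comes out at ~12.6 yr, close to the ~13.5 yr inter-arrival
    # quoted earlier for M8.5+ events since 1500.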

The numbers presented in Table 4 differ from the ones discussed by Hough (2013), who, based on the NGDC Catalog (1994), refers to the incompleteness of the record of the most significant events for periods before 1900 (Fig. 14). Missing events that other techniques, including the study of large tsunamis, might identify are critical to hazard studies and may change the estimated odds of occurrence.

Fig. 14 Timeline of Mw > 8.5 world seismicity in the period 1700–2012 and running average (top) with a moving window of 20 years (after Hough 2013)

2.2.3 The role of palaeo-seismology and archaeo-seismology in extending the historical information to older periods

As already mentioned, palaeo-seismology and archaeo-seismology can extend the period of historical information on ancient earthquakes, essentially obtained from tsunami sedimentation, back to 6000 or more years BC with the presently available technology. Silva et al. (2015), studying the ancient earthquakes of the coastal areas of the south and southeast of the Iberian Peninsula, found three possible tsunamis (218 BC, AD 40–60 and AD 1048) and made use of “seismic palaeo-geography” to confirm the approximate dates of those events. Another example is Baptista and Miranda (2009), who estimated tsunami events for the southwest of the Iberian Peninsula back to more than 8000 years ago, leading to a time interval of 400 years between large events (Table 5).

Table 5 Historical Tsunamis with origin SE of the Iberian Peninsula as obtained from archaeo-seismology

Archaeomagnetic data recovered from the study of Celtiberian remains in Central Spain help analyse the fidelity of palaeo-intensity data on ceramic pottery (Gomez-Capera et al. 2016), through information on the reconstruction (orientation) of the magnetic field in the first millennium BC. This may be a good signal for dating archaeo-seismological sites. The above values can constrain the time intervals for the huge events in the SW of Continental Portugal.

In a recent study, Salazar et al. (2022) found geoarchaeological evidence of a tsunamigenic earthquake ≈3800 years ago with origin in northern Chile. They attributed an Mw 9.5 to it due to the enormous perception area. The origin area corresponds to one of the major seismic gaps of the planet, emphasising the necessity of accounting for long temporal scales.

Other relevant information may come from dendro-hydrological analyses of tree rings, which may give clues to earthquake events due to changes in water content after shaking. Trees ranging in age from 300 to 500 years grow in many places and can identify previously unknown seismic disturbances or better define partially known events (Jacoby 1997). A good case study relates to the 1812 New Madrid events.

2.2.4 Resolving uncertainties in the location of ancient ocean events with tsunami modelling

Studies of back-analysis have been of great importance in determining the epicentres of offshore earthquakes that trigger tsunamis. This applies to historical earthquakes, for which intensities alone rarely produce good results. From the beginning of the eighteenth century, historical records are clear enough to give access to the arrival times of tsunami waves and the approximate amplitude of inundations. In addition, information on the polarity (run-up and run-down) is available in many coastal regions. Back-analysis has been performed by backward ray tracing, forward simulation or inversion of tsunami waveforms. Results are promising, approximating the location of the epicentre with errors of at most about 0.5°, depending on how significant the event is and on the number of data points in the coastal regions. These methods permit much better-constrained solutions than working with inland intensity points far from the epicentral area and, most of the time, within a short azimuthal aperture. This is the case for SW Iberia, a region with large seismic activity and a few tsunamis since 1700: using this technique, it is possible to locate the more significant offshore events with great precision, and we should recall 1722, 1755 and 1761 as significant successes (Baptista 2020, Fig. 15).

Fig. 15 Overview of the study area. Beach balls represent the focal mechanisms of the instrumental events (1941, 1969 and 1975); orange stars represent the presumed epicentres of the historical events (1722, 1755 and 1761). White dashed lines follow the main geological lineaments. GS: Gibraltar Strait; CP: Coral Patch Seamount; GB: Gorringe Bank
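The physics behind those back-analyses is the long-wave kinematics of tsunamis. A minimal sketch of the idea (a toy computation of ours with a made-up depth profile, not the procedure of Baptista 2020) estimates travel times from the shallow-water wave speed \(c = \sqrt{gh}\):

    import math

    def travel_time_minutes(depths_m, distances_km):
        # Sum travel times over piecewise-constant depth segments using
        # the shallow-water wave speed c = sqrt(g*h).
        g = 9.81
        seconds = sum(
            (d_km * 1000.0) / math.sqrt(g * h)
            for h, d_km in zip(depths_m, distances_km)
        )
        return seconds / 60.0

    # Hypothetical path from an offshore source to the coast:
    # 150 km over 4000 m depth, 80 km over 1000 m, 20 km over 100 m.
    print(f"{travel_time_minutes([4000, 1000, 100], [150, 80, 20]):.0f} min")

Comparing such computed times with the arrival times reported in historical records, repeated over a grid of candidate sources, is what constrains the epicentre location.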

2.3 Risk matrices

Citing Aven et al. (2017), risk problems are often complicated and multi-faceted, requiring simplifications in how risks are described and used in communication and decision-making processes. The tools used to do this include risk matrices and various ranking, rating and scoring metrics. Risk scoring and ranking systems range from simple risk indicators to more complex characterisations that consider other relevant aspects necessary for decision-making, including costs and ethical concerns.

Depending on the objectives to be reached, risk matrices can be presented in different formats, as in Fig. 16a, where a two-entry table (likelihood or probability vs. severity or impact) leads to a number/colour that expresses the risk: the higher the number, the higher the risk. Figure 16b, which corresponds to a case of floods and landslides (Quaresma and Zêzere 2012), adds the levels of acceptance of risk in the face of probabilities of occurrence. There are several limitations on and methods for designing risk matrices, and caution should be exercised in their application (Cox 2008; Bao et al. 2017).

Fig. 16 a Typical format of a risk matrix; b F–N curves (frequency–consequences) with levels of acceptance of risk (Quaresma et al. 2012)
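A minimal sketch of the two-entry lookup of Fig. 16a follows; the 5 × 5 scoring, the multiplicative rule and the band boundaries are illustrative assumptions of ours, not the published matrix:

    def risk_score(likelihood, severity):
        # Likelihood and severity are scored 1..5; the cell value is their
        # product and the qualitative band (colour) follows from it.
        if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
            raise ValueError("scores must be in 1..5")
        score = likelihood * severity
        band = "low" if score <= 4 else "moderate" if score <= 12 else "high"
        return score, band

    # e.g. a rare (2) but catastrophic (5) event scores 10 -> "moderate".
    print(risk_score(2, 5))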

If the phenomenon under analysis depends on more than a single variable, y = f(x), risk matrices become more complex because both “input” and “output” may be multi-hazard and multi-impact, and the problem easily becomes multi-dimensional.

Figure 17a gives a risk matrix for a single hazard with two entries, for the case of a tsunami impact measured by the height of inundation and the velocity of water flow (Boschetti and Ioualalen 2021). Here again, the higher the number, the higher the risk. Figure 17b, “SIRIUS”, as proposed by Mota de Sá et al. (2012), presents a risk matrix for a region or an urban block to measure the impact of future events, considering the average deficit of resistance of the existing building stock relative to a prescribed code and the concentration of population. Probability is associated with the abscissa.

Fig. 17 Risk matrices for a single hazard with two entries: a tsunami impact measured by the height of inundation and the velocity of water flow (Boschetti and Ioualalen 2021); b shaking impact measured by the deficit of resistance and the concentration of population affected (SIRIUS)

Population density is also used to link fatalities to magnitudes and reduce dispersion, as shown in Fig. 18a with data since 1900. A few other indicators have been presented for multi-hazard phenomena, as in the case of shaking plus tsunami (Fig. 18b). A descriptor is associated with each number.

Fig. 18 a Relation between earthquake magnitude and number of fatalities for all earthquakes since 1900 (Hough et al. 2006), with lines adapted from Samardjieva et al. (2002) (D is population density per km²); b multi-hazard shaking plus tsunami: increase of impact due to tsunami after damaging shaking (sketch after Bonacho et al. 2018)

Another way to analyse risk is proposed by Platt (2017), who measures the resilience of societies by studying the factors affecting the speed and quality of post-earthquake recovery. Based on ten significant events of the twenty-first century and on the time to recover from each one (data from Kates and Pijawka 1977), he obtains a “good” correlation between disaster-management quality and the speed and quality of recovery: the better the management, the faster and the better the recovery.

Finally, from the concepts of “vulnerability ratio” and “size of earthquake”, we can construct a risk matrix (Fig. 19a) as follows:

  • “developed countries” in general have few victims and huge losses (due to inter-dependence and non-structural losses);

  • “developing countries” have many victims but smaller economic losses (because they are already poor and reconstruction is financed through solidarity funds);

  • “Mega earthquakes” (M > 8 or area of perception > 500 km) are rare events that produce huge impacts over a large area of perception;

  • “Local earthquakes” (M < 8) are more circumscribed events; if the epicentre occurs near urban areas, the impact is localised, and developed countries can take care of it.

Fig. 19 Risk matrices for earthquake impact: a from the vulnerability ratio; b from resilience

Local and mega-earthquakes can somehow be associated with a given probability of occurrence. But severity is still missing!

Another way is to use the concept of “resilience” (the capacity to recover) and again the “size of earthquake”, in this case measured by PGA (peak ground acceleration) (Fig. 19b).

Of all the risk matrices presented, SIRIUS (Fig. 17b) seems to be one of the best forms of communicating risk to the population.

2.4 Points to retain

The most important points to retain from this Section are:

  • Catalogues are essential tools for understanding the past and anticipating the future. As we tried to explain, the world's seismicity is not increasing in time, as is happening with other natural events related to climate change. But catalogues are not easy to handle because, so far, we cannot predict occurrence in time, even though all indications point to the repetition of similar events in the same places, at a pace sometimes not commensurate with human generations. Periodicity might be so long that communities forget the perils of unexpected events that have already happened. Palaeo- and archaeo-seismology prolong our knowledge into the past and signal those periodicities for the “black swan” events. It is an absolute error to extrapolate modern seismicity of excellent quality to periods more extended than the observed ones if the period under analysis is not long enough to represent the stationarity of these stochastic series. Citing N. Ambraseys (2009): “Historical earthquake information is invaluable not only in the study of earthquakes per se but also for climate and weather, and can guide the engineers to design structures to resist the forces of nature without being taken by surprise by an anticipated event”.

  • Why are earthquakes still shocking when they occur? Occurrence is not increasing over time; however, exposure is increasing, dragged along by exponential population growth, and so is vulnerability. These two elements of the risk equation (hazard/occurrence, exposure, vulnerability) depend essentially on the cultural level of a society and on the community's perception of perils, which is associated with the frequency of events.

  • Engineering and science know how to prepare a more resilient society. The first signs show that current policies to mitigate earthquake impacts work where measures are actually taken.

  • In this section, we also examined the problem of earthquakes from a worldwide and European perspective and looked for indicators and risk matrices to predict their impact.

Other points should be brought into the discussion of the remaining causes of the high vulnerability of world construction to earthquakes. Contrary to what has happened in the automobile industry, aeronautics, medicine or other sectors, which need ever more high-tech knowledge, the construction industry has not gone through a comparable revolution. Nowadays, to maintain your car, you no longer look for a backyard garage; you have to take the vehicle to a specialised workshop. The construction industry is largely in the hands of people with little training, operating from a socio-economic background that wants to recover the investment as soon as possible. Masonry construction is the paradigm of hard physical labour, even though much knowledge lies behind old traditions. Steel or concrete structures require much more expertise, and new materials appearing nowadays (FRP, Fibre-Reinforced Polymers; CLT, Cross-Laminated Timber) involve other skills. Rehabilitation is another new requirement that experts and societies demand, and that public opinion and mass media claim.

In construction, especially of single homes or dwellings, each owner acts by himself, being architect and engineer and solving all problems without calling on the help of a professional. Of course, things differ much worldwide, from region to region and from culture to culture. However, there is an epistemic uncertainty behind the whole process of the construction industry which we need to reduce dramatically. This concern is briefly stated in the Class Notes of Geography (Hommel and Parry 2015).

Negligence and corruption have been identified by several researchers (e.g. Hough 2020; Bilham 2009) as the most important causes of this alarming situation, and there are indeed many conflicting interests throughout the world around the twin issues of the construction industry and urban planning. Politicians do not care for long-term decisions because their terms are very short, and their common position is silence or negligence, as has happened with climate change. Nonetheless, these social-political reasons are responsible for the very poor performance of several typologies in recent events. Science, both Seismology and Engineering Construction, has a share of these responsibilities, either for requiring too much or for not denouncing what is too little, as will be described in the following sections.

3 Evolution of Seismology and Engineering Construction since the mid-1700s until the mid-1900s

Summary

  • Before 1900, the science for studying seismology and earthquakes was not significantly different from other sciences: very rich in theoretical terms, like mathematical physics, a topic still taught nowadays. But the concepts for building structures safe against earthquakes were very rudimentary. Construction kept the good traditions of ancient times, based on empirical knowledge transmitted from generation to generation.

  • We look in parallel at the developments of these fields, anchored in other disciplines like construction and architecture that are as old as humanity.

  • For each new invention or explanation, we look ahead to the follow-ups in the twentieth century and to how technology has evolved in the 20 years since the turn of the millennium.

  • As earthquakes/tsunamis occur in a stationary way when speaking of the entire world, there is always something happening, and new observations can support new ideas.

  • Seismic-resistant construction was born with great success after the 1755 Lisbon earthquake.

3.1 A first view of the problem

The Lisbon earthquake of 1755 was the first event that awakened science to the earthquake phenomenon. Before that, many advances had been made toward understanding motion and designing structures to survive, but no one got close to the causes of shaking. The first reports on the perception of earthquakes came from China a long time ago, where the first seismoscope was developed to determine the direction of motion and the intensity of the first waves.

Developments in the design of structures were connected to vertical loads. Buttresses (flying buttresses) and ties in churches are ancient techniques used in the Middle Ages, when influential and notable structures were built. In Arabian and Muslim times, we also see slender structures, namely mosques and minarets, that somehow tried to survive earthquake shaking, challenging mother nature.

Earthquakes were described in ancient times as terrible events that would occur from time to time; other natural occurrences were not so critical, except plagues. As medicine was not like today's, contagious diseases such as the “pests” (Black Plague) were even more catastrophic in terms of human impacts. According to Cirillo and Taleb (2020), in the last 2500 years, 72 such events caused significant mortality worldwide.

The use of very particular techniques helped fight the earthquake threat, as in the case of the timber cross-bracing used in housing in a few locations in Europe and Asia, especially in Japan. Other areas with plenty of seismic activity, like the Americas, were not much inhabited until the fifteenth century. With the arrival of more developed cultures, new information started being collected, but, in many cases, seismic-resistant construction was absent. Even though concerns were common, construction not minimally able to survive shaking was built over the centuries. Shaking loads are different from vertical loads, and the materials used did not help increase lateral resistance. Yet human ambition was high, and construction in height without robustness was seen in many locations, even in areas experiencing seismic activity.

In 1755, for the first time, the understanding of waves generated at some point and arriving at different locations with different severity was achieved with the help of an enquiry made in Portugal and Spain to check the violence of shaking through the behaviour of structures and the response of nature (liquefaction, water flow in rivers, etc.). The reconstruction of Downtown Lisbon observed several requirements; the most obvious was the widespread use of the Pombaline cage, a timber frame with diagonals embedded in the interior of masonry walls. The engineers of the time had learned that the flexibility of elements and their connectivity would help buildings survive future events. This knowledge came directly from the wooden naval technology of the Lisbon shipyards, where ocean ships (caravels) were provided with enough resistance to survive strong sea waves.

All advancements until the early 1800s were very empirical, but the tradition of construction passed from civilisation to civilisation.

The first steps toward a better description of damage and the tentative measurement of seismic action started with impactful earthquakes in Japan and Southern Europe. Advancements in physics and the strength of materials opened the possibility of theorising on a few essential topics. Statics, though not dynamics, was already known from the construction of spectacular monuments, which helped make constructions more resistant.

Japan, where earthquakes occur with great frequency, attracted the attention of several physicists, namely John Milne, a scientist from Cambridge University who spent a significant part of his life studying earthquakes. The same happened with Robert Mallet, a member of several English Royal Societies, who visited various sites in Italy struck by earthquakes.

Looking into history, it is evident that the significant developments in the science of earthquakes took place in regions where two requirements held: (1) frequent impact of earthquakes and (2) the existence of a solid culture in physics. This is very clear in "Appendix 2", where we can understand the epochs and the context of the greatest contributions to science. For example, the west coast of the Americas, where seismic activity is significant from Southern Chile to Vancouver, even though with very different mechanisms of fault rupture, developed excellent knowledge in Seismology and Earthquake Engineering (SEE), but at a very late epoch in comparison with Asia (Japan) or Southern Europe, especially Italy, due to the late cultural evolution of those western lands.

The first attempts at understanding earthquakes, their origin and their impact are observed in the mid-1800s with the treatises of the Science of Seismology. From the observed damage, physicists tried to establish a direction to the epicentre and, using trigonometry and ballistics, they obtained the possible distance to the origin of shaking. Intensity scales came at a later stage.

It is exciting to observe the development of Seismology and Earthquake Construction, the two fields of knowledge behind the earthquake problem at the time. For many years they went together, but at a certain point, they followed different paths. Military Engineering was probably the science that led the important constructions, also gathering the leading earthquake expertise. Architecture also played an important role, with the initial pioneering work by Vitruvius in the first century BC. Seismology became more independent with the development of physics, while construction was not progressing much. Only when new materials like iron became available, by the end of the nineteenth century, did construction practice experience a great impulse.

Masonry, together with timber, geological materials like lime, and mixtures of earth with some cement, dominated construction materials from early history (Roman times and before) until the end of the nineteenth century. Other materials (steel and reinforced concrete) progressively replaced masonry construction during the first half of the twentieth century. Only the recent movements towards the rehabilitation of older construction, a sign of cultural development, brought back the interest in old materials.

Intensity scales were a significant new advancement in earthquake science. For the first time, measuring the impact of an earthquake at different locations became possible, an outstanding achievement. The first scale goes back to the mid-1800s. With more refined descriptors, intensity scales are still essential nowadays to relate to strong-motion parameter values and to re-analyse old events for which only descriptions of effects and impacts are available.

Intensity is a simple measure of a complex phenomenon that has become challenging to analyse nowadays. To understand the various interdependences, we must disaggregate the information into more specific elements.

Remarkable advancements in Seismology became possible with the advent of recording instrumentation. This evolution proceeded in rapid jumps, like "quanta", accompanying other technological advances.

Several sciences accompanied the development of earthquake knowledge over time, particularly Seismology, Construction and Architecture. We will try to compare the critical points that most contributed to the development of earthquake knowledge. But many other sciences, besides the ones referred to above, are connected to these developments. Mathematics, with Geometry and Calculus, and then Physics, with Statics and Materials, would complement the experimental observations made locally (field trips) to attempt explanations of the effects of earthquakes.

Science and technology progress is generally achieved by small increments and contributions. It takes time for a new idea to become adopted as the state of the art in the field. In the meantime, discussions and doubts circulate among experts working in the same area. Finally, one name, or a group of names forming a team, gets to the point, wrapping up all the knowledge and publicising an invention. This happened throughout ancient times, when communication was difficult and subject to long delays. But even now, with all the efficiency in communication, the same sometimes happens. Frequently, two groups arrive almost simultaneously at the same theory, algorithm or explanation. Of course, there are exceptions; probably the most peculiar one was Einstein's Theory of Relativity (1905), which no one else was developing and which was so disruptive that it took years to be accepted. In SEE, the same has taken place. Sometimes only the name of the person who finalises the invention or the solution to a given problem appears. The other names, deserving a reference, were kept in the dark.

We will be touching on these topics, not in an orderly way:

  • Military Engineers: Theory, practice—military engineers would be the great “technicians” of large constructions, part as architects and part as civil engineers.

  • Earth Scientists—mathematicians and physicists who tried to understand the laws of physics of Planet Earth.

  • Construction Practitioners (Architects—geometry; materials)—they were the first, together with engineers, to put up the housing and the significant monuments of their epoch. Geometry, and in particular trigonometry, was probably, together with materials, the prior knowledge needed to build construction, the most critical asset for the populations.

  • Material science (earth + masonry + timber + limestone)—the knowledge of the properties of existing natural materials, the sites to get them (Geology) closest to where the building would occur, their size, strength, portability, etc., were among the most critical aspects for the type of construction that we see growing at a particular place.

  • Observational Seismology of construction behaviour (field trips)—external threats were present in many locations, and shaking and tsunamis were probably the most demanding loads to consider. The continuous repetition of disasters always alerted the population and the elites that the territories were not all the same as far as shaking was concerned. But it took much time to understand the phenomena and become accustomed to living with them, even now.

  • Mathematical modelling (Theoreticians)—the development of theoretical modelling always helped in looking ahead of time and promoted the search for better solutions. Experience from past events would be the primary key to understanding the problems.

  • Instrumental Seismology (Physics)—without instrumentation, it was tough to understand the physics of the problem, the origins, the propagation of waves and the interaction with construction. With mathematical modelling, significant steps were made towards what we now understand.

  • Manuals for construction (Codes)—when human evolution turned into exponential growth, construction needs exploded, and there was no way to control the quality of construction other than introducing codes. These codes would reflect the best knowledge of science and practice to mitigate earthquake impacts.

Landmarks in the evolution up to the 18th Century.

  • The Chinese (several centuries BC) already dealt with earthquakes. They were the first communities to compile catalogues of earthquakes, with the first mention in the twenty-third century BC. The first collection of earthquake records appeared in 977 AD. There were 45 earthquake items between the eleventh century BC and 618 AD (Wang 2004). The invention of the first seismoscope is presented in Sect. 3.2.

  • Roman Architecture and the Vitruvius treatises, explained in 10 books during the 1st Century BC—this was probably the first compendium of architecture. Vitruvio (Marcus Vitruvius Pollio) (Edition 1486), the father of modern architecture and construction, wrote with simple language and illustrations, many of which were lost over the centuries. He gave a lot of attention to building construction, to time and the movement of the stars to understand insolation, and to mechanics, and he designed a lot of equipment for erecting buildings and for protection during wars. He based the principles of architecture on "firmitas, utilitas, and venustas" ("strength", "utility", and "beauty") and respected the proportions of the human body.

  • The materials for construction (blocks, rubble masonry, limestone, timber, etc.)—there was not much choice of materials other than the ones Mother Nature would directly provide. Vitruvius already knew the main properties of those materials and the geometric characteristics needed to design arches, roads, bridges, monuments, religious temples, castles, etc. These were essential assets among the Greek, Roman and Arabian civilisations (Fig. 20).

  • The disappearance of big metropolises near the shoreline is also a problem linked to earthquake events. The discovery of "pozzolana", a material extracted in Puzzuoli near Vesuvius, permitted cementitious materials that would harden under water. Several massive constructions were built with this material, conquering infill land near ports and water inlets. Alexandria, the Colossus of Rhodes and the Caesarea maritime urban harbour (Palestine) were among them. They disappeared suddenly in the event of an earthquake: liquefaction sounds like the most obvious explanation for the disappearance of those metropolises. The Colossus of Rhodes, which took 12 years to build (c. 294–282 BC), was toppled by an earthquake about 225/226 BC. The fallen Colossus was left in place until 654 AD, when Arabian forces raided Rhodes and had the statue broken up and the bronze sold for scrap. As for the fall of Caesarea, explanations are still under study. The sudden disappearance of Heraklion (Thonis) in the Nile Delta, Egypt, ten centuries BC, is probably another case of liquefaction following a large earthquake event.

  • The great monuments of the Middle Ages. These were the times of massive cathedrals, bridges, palaces, castles, etc. They already benefited from interesting solutions for carrying the dead loads, which were also good for absorbing lateral shaking. This coincidence was very successful. Figure 21 shows sketches of buttresses in Vitruvius to solve the lateral impulse caused by the upper elements. We understand now that this recommendation, widely used in Middle Ages cathedrals, was very helpful in resisting lateral shaking.

Fig. 20 Arches, roads, monuments, and castles of the Greek and Roman epochs (photos taken from Branco et al. 2017)

Fig. 21 Sketches of buttresses in Vitruvius Treatises (in Maciel, translation 2015)

The attempt at a trial shake-table: the use of a "stage" moving back and forth, with construction built on top of it with some earthquake provisions, was the first attempt to analyse the effect of earthquakes and the usefulness of earthquake provisions. This procedure was described as a probation of the efficiency of the "gaiola" technique (see Sect. 3.2.3) during the reconstruction of Lisbon after the 1755 earthquake.

3.2 Briefing the developments of Seismology and Earthquake Engineering (SEE)

Before entering the eighteenth century, when scientific developments were of great importance, we recall that the first known explanations for the origin of earthquakes go back to Chinese culture two centuries BC, with the idea of a catfish pulling the Earth and provoking shaking and tsunami waves (Fig. 22). The Great Wave off Kanagawa (Katsushika Hokusai 1830) (Fig. 22b) was not made to commemorate the giant waves of a tsunami; still, after its significant impact on the cultural society of Japan, it was adopted as a reference.

Fig. 22 Chinese culture: catfish and tsunami wave, 16th Century BC (https://www.iccrom.org/news/earthquake-heritage-examples-japan)

The first instrument to measure the shaking, intensity and direction of motion was developed in 132 AD by Cheng Hêng, an astronomer and mathematician (Dewey and Byerly 1969). Figure 23 shows an attempted replica of that seismoscope, which was characterised by "dragon mouths" with "balls" and "toads" that would receive them when falling. Needham (1959) and Sleeswyk and Sivin (1983) described this first seismological instrument well.

Fig. 23 Replica of the first seismoscope in China

Until the mid-eighteenth century, the main theories considered as causes of earthquakes were the collapse of underground caves excavated by the ocean and the volcanic theories. Aristotle, Pliny, the Chinese, and Shakespeare, among others, thought the Earth's inner fire played a central role in producing seismic vibrations. Aristotle developed an elaborate version of the volcanism model, in which underground vapours were circulated by the combined action of interior fire and solar radiation. Occasionally, the "winds" produced would escape to the outside of the Earth, creating earthquakes. The Aristotelian theory of earthquakes was widely accepted throughout the Middle Ages. Its influence can still be found in Immanuel Kant's essays of 1756 in the aftermath of the Lisbon earthquake. But the theories of punishment for the "bad behaviour" of populations were in vogue until the 1755 Lisbon earthquake. Rev. Malagrida was a great defender of God's punishment, advocating capital penalties in public places. Pombal, the prime minister after the 1755 Lisbon earthquake, opposed that theory and acted right after the event, asking for "treating the injured and burying the dead". He initiated the reconstruction of Lisbon, creating a team of engineers and urban planners. The opponents of his views were left in jail.

3.2.1 Main seismological achievements in 1800–1900

As referred to before, Seismology and Earthquake Engineering started as a single science in the mid-18th Century. Both looked for explanations of the strange behaviour of Earth shaking and tried to build according to Newton's laws of gravity.

A few strong events in Europe, mainly in Italy, central Europe and Japan, forced people to think about the causes of earthquakes and ways to mitigate their disastrous effects. The first ideas came with the 1755 Lisbon earthquake, which initiated the scientific approach to wave propagation with the theories of John Michell (1760). In the middle of the eighteenth century, the first theories of a source of energy radiating to many points around it were starting, at the same time as the first instrumentation was initiated. In the seventeenth and eighteenth centuries, the early explanations came from Physics, Mathematics, Earth and Planetary observation. But only after the development of the theory of elasticity was it possible to explain seismic phenomena satisfactorily through the laws of Physics. The mathematical formalism of elasticity was developed in the first half of the nineteenth century. It was applied to the study of earthquakes by Robert Mallet, who published in 1848 a pioneering treatise entitled "On the Dynamics of Earthquakes", an attempt to reduce their observed phenomena to the known laws of wave motion in solids and fluids.

In the following Box, a set of points enumerates some of the most exciting "movements" that contributed to the development of these sciences.


Box—Comparisons in the evolution of Seismology and Earthquake Engineering.


The compared vision of Seismology and earthquake construction and the inter-analysis of both fields of knowledge.

•The first Earthquake Catalogue of Modern Ages with Pereira de Mendonça.

•Mallet and the great Neapolitan Earthquake of 1857. Mechanical explanations to derive epicentre and hypocentre.

•Milne and the first instrumentation. The first notion of a strong-motion (SM) record.

•Field Missions to significant events.

•The need to measure earthquakes' effects initiated the first intensity scales. Intensity scales and the first map of isoseismals are contemporary with these significant achievements.

•Turn of 1900 and the first seismographic stations.

•Great advancements in understanding the origin of earthquakes: Wegener's theory and the plate theory at the turn of the twentieth century.

The plate theory came later, after the 1906 San Francisco earthquake. The rupture slip on the fault trace was evident, even though the first signs of fault rupture had already been observed after the Owens Valley (1872) and Nobi (1891) earthquakes. Following the observations of the 1906 earthquake, Reid (1910) proposed the model of the elastic rebound. According to Reid, these ruptures release in a few seconds or minutes the elastic strains accumulated over centuries or millennia by slow processes of deformation of the crust, the Earth's outermost layer. This model for the generation of earthquakes is still essential for understanding the phenomenon today. In 2008, critical studies based on worldwide GPS confirmed the plate movements (Müller et al. 2008).

•After the 2nd World War, the launching of the World-Wide Standardized Seismograph Network (WWSSN) to monitor nuclear explosions created another essential landmark.

•Great disasters are always the origin of new avenues and science opportunities.

 

The first explanation of the origin of earthquakes and the concepts of epicentral location, focus, magnitude, etc., came at a later stage, when American experts entered the seismological process after the 1906 San Francisco earthquake. Also, the South American areas of Chile, the Central American countries, etc., all along the eastern subduction side of the Pacific plate, counted on the expertise of a few people, namely Montessus de Ballore (1911) and the Spanish military engineer Cerero y Sáenz (1890), who was dispatched to remote areas of old colonies in the Andes and the Far East.

Montessus de Ballore, together with Perrey, Mallet, Milne and Omori, were probably the founders of scientific Seismology (Todorovska 2009). But many other names have contributed to the advancement of Seismology through the development of more modern seismological instrumentation and the definition of the inner Earth discontinuities. From the mid-1900s onwards, digital seismology started to be installed, and a significant jump in findings on fault rupture, propagation and site/topographic effects took place (see Lee et al. 2002 for a detailed account of the names of the persons who contributed to the advancement of this science).

3.2.2 Construction and architecture

Construction did not suffer significant changes until new materials appeared. The first attempts to build seismo-resistant structures had their considerable development in the reconstruction of Lisbon with the deployment of the Pombaline Gaiola (cage). Before that, we may consider the Himis construction in Turkey as the first attempt, but with minimal expression. A few solutions with wood were tried after the 1726 and 1751 Palermo earthquakes (Fig. 24), but the use of diagonals inside the masonry walls was very mild (Campisi and Scibilia 2016). We may go even to older times, as Gülkan and Sözen (2018) describe that the first prescriptive measures to build for better seismic performance could be attributed to the regulation passed after the 1509 Istanbul earthquake.

Fig. 24 Wood as an anti-seismic function in Palermo after the 1726 & 1751 earthquakes (from Campisi and Scibilia 2016)

Going back to the Himis construction, Gülkan and Langenbach (2004) describe this earthquake-resistant system as part of Turkey's traditional timber and masonry dwellings.

In fact, “Hımış construction is a variation on a shared construction tradition that has existed through history in many parts of the world, from ancient Rome almost to the present. In Britain, where it became one of the identity markers of the Elizabethan Age, it would be referred to as “half-timbered.” In Germany, it was called “fachwerk,” in France, “colombage,” in Kashmir, India as “dhajji-dewari.” (Langenbach 1989). In Central and South America, a variant was called “bahareque”. Ancient Roman examples have been unearthed in Herculaneum, several involving interior partitions, and a good example involving the construction of an entire two-story row house illustrated in Langenbach (1989). The palaces at Knossos have been identified as having possessed timber lacing of both the horizontal and the infilled frame. This takes the date of what can be reasonably described as timber-laced masonry construction back to as early as 1500 to 2000 BC”.

After the Gaiola idea, we have the Baraccata construction (Vicenzio Ferraresi 1783; Stellacci et al. 2016), closely following the same ideas, which became more common in Italy by the mid-nineteenth century. Interestingly, in northern European countries, where seismicity is low, hanging front walls made with timber diagonals are widespread in old cities. Until the end of the nineteenth century, we observe the first constructions with iron, especially those with great spans, namely bridges in London, New York and Central Europe (Praha Charles V' Bridge), and great monuments or palaces. Reinforced concrete (RC) appeared towards the end of that century and was used on the first floor of houses to allow larger open spaces. Only much later, after the mid-twentieth century, was RC widely used. A good description of alternative techniques to RC, utilising combinations of bricks, timber or metal, is given in Scibilia (2017). Simultaneously, the first modern codes and Structural Dynamics took their first steps. The 1st World Conference on Earthquake Engineering (WCEE) took place in Berkeley in 1954, and the conference has been held every four years since.


Box—Names to remember.

•Vitruvius and the 3 great principles (“firmitas, utilitas, and venustas”).

•Roman technology (opus caementicium, lime mortar, pozzolanas).

•The mysteries of lost megacities (Alexandria, the Colossus of Rhodes, Caesarea and Heraklion).

•Jean Rondelet and August Choisy. Nègre (2010) gives a good account of the achievements of these two great architects.

•Encyclopaedia of Construction (Gwilt-1867).

•New materials (Steel and Reinforced Concrete).

•Modern Architecture (Le Corbusier, Alvar Aalto).

•Engineering & Architecture (Calatrava).

•(Ferrari et al., 2005).

Etc.

 

3.2.3 Architecture and engineering

The evolution of construction was an essential piece of human creation, and basic needs formed the basis of this development. Of course, the main constraint was sheltering, and construction evolved to other functions, even demonstrating social and political power.

Earthquakes have always struck constructions and destroyed cities and civilisations, partially or entirely. The cases of Alexandria (365 AD) or Caesarea are examples of a lack of knowledge of how saturated sand behaves under earthquake shaking. Man always looked into the problem of understanding how to build with the materials at hand and using experience from past events. Geometry was part of the game and played an essential role in developing structures resistant to vertical loads and other loads, namely seismic inputs, floods and wind. Treatises were a clear signal that architecture was part of the game. To mention a few of them, besides Vitruvius, we should refer to Gwilt (1867) with his famous encyclopaedia, where we can understand how different types of construction were built. No one was looking at seismic loads, even though many proposed solutions already contained ideas for better resistance to earthquakes. Something proportioned, elegant, and not against gravity is ideal for earthquake resistance. Examples are buttresses or flying buttresses as reasonable solutions to make high walls strong enough to withstand lateral loading. Jean Rondelet (1802), with his "Traité de l'Art de Bâtir" of the early 1800s, together with the engineer Auguste Choisy (1873), is another architect who advanced structures requiring considerable imagination and the backing of better materials. He explains the lines of construction since Babylonian and Egyptian times, and discusses how it was possible to build obelisks, pyramids, etc.

As already referred to, the use of timber as a seismo-resistant strategy is an ancient tradition in several regions and cultures around the World. But only after the 1726 and 1751 earthquakes, and especially after the 1755 Lisbon earthquake, was wood used in beams both in monumental buildings and in more common buildings to provide better cohesion for masonries, allowing construction with larger spans, lightweight partition walls, or even the replacement of stone vaults with vaulted wooden structures, or in ceilings and stairways. Later, timbers were used together with iron and chain tie-rods. But no diagonals were used before the Gaiola technique!

Gwilt (1867) wrote a vast compendium of Architecture and Engineering, specialised in geometry and the material properties of wood and masonry, which were essentially the only materials until the mid-nineteenth century. The initial phases of the "iron" age were also briefly handled. His encyclopaedia also dealt with costs and professional issues that embraced the "Industrial Revolution" (Fig. 25).

Fig. 25 The Encyclopaedia of Gwilt

But not a single word or concept of earthquake resistance.

The only effective seismo-resistant interventions were the “gaiola” (Cage of Lisbon) used in the reconstruction after the 1755 disaster (Fig. 26) and a few mentions of the “Baraccata Building-(Borbonica)” (Fig. 27), a new idea following the “gaiola” and proposed in Italy after the 1783 Calabria earthquake. That structure consisted of a timber frame, with uprights, beams and diagonal members acting as bracing, “buried” in the masonry structure with box-like behaviour.

Fig. 26 The pombaline "Gaiola" used in the reconstruction of Lisbon after the 1755 earthquake (Fernandes 2021)

Fig. 27 Casa Baraccata (Borbonica): interior timber skeleton with diagonals (after Vincenzo 1783)

Considering the possibility of a new earthquake, the Marquis of Pombal demanded that the buildings integrate an earthquake-resistant structure to avoid further uncontrolled destruction of the city. In response, the pombaline cage was created: a latticed structure in wood, resistant to the different directions of the seismic waves. For the first time in human history, a city had been (re)built using seismic-resistant techniques (Lopes 2012).

The pombaline cage follows the development of several types of earthquake-resistant structures over time (Fernandes 2021). The system most comparable to the pombaline cage is, to this day, the so-called Baraccata house. It is a building with one or two floors in height, regular and symmetrical. Its wooden frames with sleepers, filled with clay or stone masonry, ensure resistance to earthquakes.

There are multiple theoretical studies on the pombaline cage, but only a few known cases of physical models. Nowadays, more people are building new digital and physical models. That is the case in Fig. 26, as Fernandes (2021) explained, which used FFF (fused filament fabrication) 3D-printing technology to recreate physical models. The base scale of the models for the general description of the structure was 1:50, and details that allowed a deeper understanding, such as the "Saint Andrew" cross, were represented at 1:10.

Although of great regularity as seen from the street, the pombaline buildings have varied geometric characteristics, depending on their location on the block. The height remains regular, the length is variable, and the number of spans can vary between two and six, three or four spans being the most common.

The structure of a pombaline building, on alluvial soil in the embankment of an old shipyard, required a foundation of its own, and two types were used (Mascarenhas 2009). The first type of foundation consists of piles in direct contact with the ground; these support masonry arches strategically located under the gaps between the walls and the ground-floor pillars. The second type of foundation differs from the first in that it does not have arches but a masonry slab that rests directly on the timber grillage. There is a structure of piles and logs throughout the building's implantation area. Thus, master walls and pillars share the same foundation, contrary to the older construction system.

The ground floor develops essentially on pillars and master walls corresponding to the exterior walls, creating an accessible space used in a multipurpose manner. These pillars are built by laying large, carefully cut stones. The walls, on the other hand, are formed with prominent irregularly shaped stones garnished with smaller stones. This floor reveals a structure completely different from that of the upper floors, secured in two different ways: the first by masonry pillars, as described above, topped by brick arches; the second by pillars connected by arches and also brick vaults; in both cases, therefore, there is no reinforcement by the wooden structure.

The pombaline cage ("Gaiola") developed on the upper floors consists of a matrix in which the fit between different pieces is done strategically. The vertical elements (uprights), continuous along their length and equal in dimension to the storey height, are distributed evenly. Their ends fit into horizontal components, the wall plates, which are present only on the horizontal perimeter of each wall. The crosspieces, and other horizontal elements, fit into the uprights and prevent their lateral movements. In this way, a grid is created in which lateral movements are hindered. Still, it is necessary to introduce diagonal elements, the struts, which form the "Crosses of Saint Andrew" to resist the most critical horizontal forces during an earthquake.

The "gaiola" used in the interior of buildings consisted of timber frames with vertical and horizontal timbers of approximately 10 × 12 cm, with internal braces forming an "X" (Fig. 26). The timbers for the crosses are 9 × 11 cm in section. The frame was then "nogged" (i.e., filled with brick) in the triangular spaces formed by the crosses, with a mixture of stone rubble, broken brick, and square pieces of Roman brick in different patterns in each panel. The interior walls were then covered with plaster, hiding the infill and the timber frame. This plaster protects the building against the possibility of fire, such as occurred during the quake. The external façades of the "downtown Lisbon" buildings were reconstructed with loadbearing masonry walls of about 60 cm in thickness, some of which had a timber frame on the inside face.

Different types of walls can be highlighted (Fig. 26): (1) main walls, located on the façades, presenting a simplified structure without diagonal elements; (2) structural or frontal walls, interior walls formed by the most recognised system of the pombaline cage, with vertical, horizontal and diagonal elements, including the crosses of Saint Andrew; and (3) non-structural walls, the partition walls. In each case, the cage works differently, and its parts are distributed differently.

Floors and ceilings vary from floor to floor. The type of floor on the ground floor varies depending on the use of this space. The floor consisted of a large irregular stone slab if intended for commerce. If designed for stables, in the case of most secondary street buildings, the floor was covered with graded pavement or even dirt (Santos 1989).

The ceilings on the ground floor are also very different from those on the upper floors. They are presented in stone or brick masonry, composed of arches or vaults, covered with a plaster of sand and lime.

In perfect harmony with the rest of the pombaline cage structure from the first floor up, wooden floors connect all the other system elements. The beams rest on the brackets through a half-lap joint, with a nail driven from top to bottom. To guarantee the horizontality of the floors, beams and frames should be perfectly level with each other.

In addition to resting on the façades and carrying the loads to the masonry walls, the beams also rested on the frontal walls, which ensured the connection of the floor to all the walls and kept the beams completely straight.

The cladding on the upper part of the beams was usually made of wood, with timber planks. These boards were nailed perpendicular to the framework, from top to bottom.

Regarding the ceiling coverings of the upper floors, different techniques could be applied. One of the most used was similar to the paving of the floor: wooden planks were nailed to the underside of the framework, perpendicular to it.

For lighting, the stairs of the pombaline buildings were built close to the main façade or the back façade. The staircase develops differently on the ground floor: on this floor, the first flights of stairs are made of stone masonry, bounded by resistant masonry walls of the main, frontal, and partition types, respectively (Nunes 2017).

From the first floor upwards, the stairs are made of wood. The stairwell consists of three frontal walls dividing the stairs between floors. The stringers, the diagonal elements on which the steps rested, were locked to a stone block, which served as a starting point and a guarantee of fixity.

The roof has a simple shape, with trusses, purlins ("madres"), poles, battens and counter wall-plates ("counterfrechal") supporting the tiles. This gabled roof is supported by the structure of the main façade and the back walls. Although the roofs are very similar from building to building, in the pombaline system there are two types of roofs: mansard and triangular. A case study of a pombaline narrow-façade building is shown in Fig. 28.

Fig. 28 Pombalino case study: partial sections and plans (Stellacci et al. 2016)

Another technique for mitigating the effects of intense shaking, the concept of the "base-isolated foundation", is also an old tradition from ancient times. However, it was only in the seventies of the twentieth century that the application of base isolation systems to protect structures began to be discussed and implemented. The first examples of the use of this concept date back to the fifteenth century BC in Crete and Egypt, where a layer of sand was interspersed between the walls and the foundations. This layer would allow a uniform distribution of the loads transmitted to the foundations and act as a protecting layer when an earthquake occurs. In the fifteenth century AD, new examples of base-isolated buildings appeared. In fact, around the year 1400, many of the palaces built in the Forbidden City in Beijing were laid on a layer of lime and rice mortar, a layer that would act as a sliding surface in the event of an earthquake (Buchanan et al. 2021). These palaces have withstood three major earthquakes during their history.
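To make the mechanics of these sliding layers concrete, the following is a minimal sketch assuming a simple rigid-block friction model; the friction coefficient is a hypothetical illustration, not a measured property of sand or lime-and-rice-mortar layers:

```python
# Minimal sketch: a sliding layer cannot transmit a horizontal force larger
# than friction allows, so the superstructure never feels more than mu * g,
# whatever the ground does. The value of mu below is hypothetical.
MU = 0.2   # assumed friction coefficient of the sliding layer
G = 9.81   # gravitational acceleration, m/s^2

def transmitted_acceleration(ground_acc_ms2: float, mu: float = MU) -> float:
    """Acceleration (m/s^2) actually passed to the superstructure."""
    return min(abs(ground_acc_ms2), mu * G)  # sliding caps the inertia force

# Example: a 0.6 g ground pulse reaches the building as at most ~0.2 g.
print(f"{transmitted_acceleration(0.6 * G) / G:.2f} g")
```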

The first reference in modern times to the base isolation principle belongs to a Scottish engineer, David Stevenson, in 1868 (Carpani 2017). Base isolation was also considered through "balls and rollers", already recommended for lighthouses; Milne also pursued this solution to protect the lamps.

In 1870, and just two weeks apart, two patents were filed in San Francisco (Carpani 2017). The first, by Jules Touaillon, consists of a system of spheres to be placed under the walls of buildings: the foundation rests on several layers of logs arranged in such a way as to allow horizontal movement; a trench is excavated around the entire building to avoid restricting its horizontal movement; and the design of a low triangular structure ensures greater rigidity.

The second patent, authored by A. F. Cooper, is the first solution to use natural rubber supports. According to the author, the aim is to cushion the shock caused by the earthquake and provide the system with elastic springs (Kelly 1981).

In the twentieth century, two more independently conceived isolation systems appeared. In 1906, Jacob Bechtold, from Munich, registered a patent for an isolated system consisting of a rigid plate that served as a support base for the building, carried on a set of rollers of rigid material to allow free horizontal movement (Buckle 1986). The other concerned the protection of lighthouse lamps against seismic actions, for which a sliding system supported by three spheres was designed.

Tsunami prevention was recommended by Colonel Cortés, an expert on construction in earthquake-prone countries (Cerero y Sáenz 1890), who looked into the tidal waves that strike Japan at the times of great earthquakes and proposed trapping the water's energy with bamboo groves. Still, "at the time of a sea wave, people should seek refuge at a high place". The interlacing bamboo roots prevented the opening of the ground and the sea waves from reaching high zones.

It is not a repetition to re-state the impressive knowledge gained in Seismology during the second half of the nineteenth century with the development of the first seismographs and the understanding of wave propagation. On the contrary, many observations were made by the experts who tried to theorise about the several causes of earthquake origins, and the data gathered from networks of instrumentation were beneficial for strong-motion seismology. Topics like amplitude and frequency were already present in this analysis, and impressive observations are still questioned by modern seismology. Even enquiries with citizen participation, of the DYFI type, were attempted (Atkinson and Wald 2007).

Architecture and engineering have been side by side all these centuries. The concept of military engineers came across many representations up to the eighteenth century, especially dealing with structures for defence such as towers, castles, defence walls, etc. Even nowadays, we have examples of the best representations of the art of construction made by people who absorbed both aesthetics and engineering principles (Calatrava is a good example: https://calatrava.com). The competition between architects and engineers has always been present, as we witness in publications on the "History of Architecture" written by engineers. Architectural engineering was established as a discipline in the formal realm of engineering in the late nineteenth century; the University of Illinois became the first of many universities to offer an architectural engineering program. If written by an architect, such a history would emphasise the aesthetics of the construction topics. Countries like Spain still have a joint course for Architects and Engineers nowadays.

The construction engineers of the eighteenth century were essentially the so-called military engineers, and only later did the profession of Civil Engineering take its first steps towards becoming autonomous.

A similar issue was happening with the physicist-seismologists of the same century who, besides trying to explain the Earth, its mysteries and ground movement, also tried to explain the structural behaviour of housing during shaking, proposing explanations for the observed damage and recommending forms of construction to minimise the effects of wave passage. This mixing of topics has arrived at our times in small and large details. Nobody questions why the assignment of intensities is attributed to seismological institutions when it is evident that considerable damage can only be explained by structural engineers, much more acquainted with that topic than seismologists.

3.2.4 Robert Mallet Experience after the Neapolitan Earthquake of 1857: from a Civil Engineer to a Pioneer Seismologist (1810–1881)

The 19th Century started with:

  • Field missions to significant events (Mallet-Milne).

  • The first seismic instruments.

  • More Historical catalogues. Maps of World Seismicity.

  • The Intensity scale.

    And finished with:

  • The steel construction.

  • The massive monuments and bridges.

  • First Codes of practice based on experimental evidence.

Robert Mallet FRS (1810–1881) was an Irish geophysicist, civil engineer and inventor. He was known for his earthquake research and is considered by some to be the father of seismology (Fig. 29).

Fig. 29 Mallet: 3 volumes on the basic principles of the theory of the seismic wavefield. Rotation in the belfry (Mallet 1862)

Quoting Robert Mallet: "The method of investigation which I propose to adopt is based upon the undeniable truth, that the disturbances and dislocations of various solid objects by the shock of earthquake, carefully observed with reference to their directions and extent of disturbance, and to the mechanical conditions in play, must afford the means of tracing back from these effects, the directions, velocities, and other circumstances of the movements or forces that caused them."

Rev. Samuel Haughton, a professor in Dublin, arranged a series of workable equations to define the directions and velocities of fractured or overthrown bodies. Mallet took them to the Naples region to study the Great Neapolitan earthquake of 1857 (Mallet 1862). He found various equations to determine the overturning of different types of simple objects, such as parallelepipeds and cylinders, and of more complex situations. With the help of hand drawings and photographs of various objects at the same site, he was able to back-trace the wave shaking and draw conclusions on the direction of the incoming waves and, by triangulation, the location of the event's origin. He analysed more and more complex structures, like the rotation of statues and damage to buildings, walls, monuments, etc.; in all cases, he claimed, it was important to have multiple sources of information to complement the interpretation.
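To make the back-analysis concrete, here is a minimal sketch of the classical rigid-block overturning criterion that underlies this kind of reasoning; it is an illustration under simple assumptions, not a reproduction of Haughton's original equations, and the dimensions used are hypothetical:

```python
# Minimal sketch: a free-standing rigid block of width b and height h,
# rocking about a base edge, overturns when the horizontal acceleration
# exceeds g * (b / h). A tipped tombstone therefore bounds the local
# peak acceleration from below, and its fall direction points along the shaking.
G = 9.81  # gravitational acceleration, m/s^2

def min_overturning_acceleration(width_m: float, height_m: float) -> float:
    """Horizontal acceleration (m/s^2) needed to tip a rigid block."""
    return G * width_m / height_m

# Example: a slender 0.4 m x 1.8 m tombstone tips at about 0.22 g.
a_min = min_overturning_acceleration(0.4, 1.8)
print(f"{a_min:.2f} m/s^2 = {a_min / G:.2f} g")
```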

He found explanations for every physical phenomenon with great imagination (Figs. 30, 31). The terms "Seismology" and "Epicentre" are attributed to him. He also compiled earthquake catalogues.

Fig. 30 Mechanical theory of the direction of the wavefield based on construction cracks and object trajectories (ballistics) (Mallet 1862)

Fig. 31 Obtaining the line of shock from the pattern of collapse of a wall or floor: 1857 Neapolitan earthquake. Robert Mallet's detailed study on the structural behaviour of masonry constructions (Mallet 1862)

When I went on field missions and looked for rotated tombstones or chimneys, or for fallen decorative objects, I realise now that, 150 years before, someone was doing the same!

According to Ferrari et al. (2005), Mallet has a full curriculum: besides this great treatise on the physics of objects and construction under the wave field, he prepared a catalogue of earthquakes, produced the first world map of epicentres, determined wave velocities from explosions, etc. His background in mathematics and physics let him determine the velocity of objects thrown from the tops of walls, or of ornaments on columns, from the objects' final positions.

Nowadays, we can enjoy software (Rocscience-RocFall 2019) that can estimate the initial velocity of a fallen rock block or boulder that jumps and rolls down inclined surfaces until it stops. It uses the same equations Mallet used 150 years ago (Fig. 32). In this particular example, the simulations suggest a PGV of 4–6 m/s at the site.

Fig. 32 Probable peak ground velocities at the rockfall sites were speculated to vary in the range of 4–6 m/s (Rocscience-RocFall 2019)
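In the same spirit, here is a minimal sketch of the projectile back-calculation that this family of methods relies on, assuming a horizontal launch and flat ground; the wall height and throw distance below are hypothetical:

```python
# Minimal sketch: an object projected horizontally from height h that lands a
# horizontal distance d away must have left its support with speed
# v = d * sqrt(g / (2 * h)).
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def launch_velocity(drop_height_m: float, throw_distance_m: float) -> float:
    """Back-calculate the horizontal launch speed (m/s) of a thrown object."""
    fall_time = sqrt(2.0 * drop_height_m / G)  # time to fall the height h
    return throw_distance_m / fall_time        # horizontal speed over that time

# Example: an ornament from a 6 m high wall found 4 m from its base.
print(f"Estimated launch velocity: {launch_velocity(6.0, 4.0):.1f} m/s")
```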

Another example of using the damage orientation of earthquake effects in pre-instrumental periods is developed by Martín-González (2021). He introduces uncertainty in the orientation of observed values, back-calculating the ground-motion pulse, and proposes an abacus to obtain the input directions (Fig. 33), following ideas similar to Mallet's.

Fig. 33 Classification of earthquake damage showing the angle of uncertainty (after Martín-González 2021)

The seismicity map of Mallet and Mallet (1858) was based on felt reports: dark brown areas have a high number of intensity reports, and the yellowish regions a low number (Fig. 34). Amazingly, 60 years before the plate tectonic model was accepted, Mallet had already traced the great lines defining the major plates. Plates like the Caribbean or Nazca, among many others, were already very well delineated.

Fig. 34 World zones of higher seismicity (Mallet and Mallet 1858)

3.2.5 John Milne (1849–1913)

Milne, Ewing and Gray founded the Seismological Society of Japan (SSJ), the first Society acting as a Forum for discussion. The society funded the invention of seismographs to detect and measure the strength of earthquakes. John Milne is generally credited with creating the horizontal pendulum seismograph in 1880 (Milne 1881). He made instruments that permitted him to detect different earthquake waves and estimate their velocities (Fig. 35).

Fig. 35 The torsion pendulum of J. Milne

He contributed to creating seismological observatories worldwide, 40 in total. He was also responsible for the first strong-motion recording, of the Japan earthquake of 1882. The first SM record is generally attributed worldwide to the M6.3 Long Beach earthquake in 1933, later seconded by the record of the M7.1 El Centro earthquake in 1940. Milne was also very interested in the vibration phenomenon and dedicated some time to analysing the vibrations of bridges and other structures.

He was a pioneer in structural dynamics as we see this field of knowledge nowadays. To check the importance of frequency in the structural behaviour of buildings, he proposed an experiment with three weights connected to a moving base by three vertical sticks. They formed three independent one-degree-of-freedom systems (Fig. 36), and he performed nine experiments with this set-up:

  • three experiments with one stick each;

  • one experiment by connecting in the top the three weights;

  • three experiments connecting two at a time;

  • two experiments by adding diagonals.

Fig. 36 Concept of frequency and resonance (each stick has its own weight)

Citing Milne (1898): “This would seem to show us that if the natural period of vibration of a house, or parts of it, at any time agree with the period of the shock, it may be readily thrown into a state of oscillation which will be dangerous for its safety”.

With that experiment, he introduced the concepts of natural frequency and resonance, using a shake table to prove the importance of his findings.
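A minimal sketch of the concept behind this experiment, treating each stick-and-weight as a damped single-degree-of-freedom oscillator under harmonic base motion; the masses, stiffnesses and damping ratio are hypothetical, not Milne's values:

```python
# Minimal sketch: each stick is an SDOF oscillator with natural period
# T = 2*pi*sqrt(m/k); its steady-state response peaks when the forcing
# period matches T (resonance), as Milne's quote describes.
from math import pi, sqrt

def natural_period(mass_kg: float, stiffness_N_per_m: float) -> float:
    """Natural period T = 2*pi*sqrt(m/k) of an undamped SDOF oscillator."""
    return 2.0 * pi * sqrt(mass_kg / stiffness_N_per_m)

def amplification(forcing_period_s: float, natural_period_s: float,
                  damping: float = 0.05) -> float:
    """Steady-state dynamic amplification factor for harmonic excitation."""
    r = natural_period_s / forcing_period_s  # frequency ratio omega / omega_n
    return 1.0 / sqrt((1.0 - r**2) ** 2 + (2.0 * damping * r) ** 2)

# Three "sticks" (1 kg weights, hypothetical stiffnesses) shaken at 0.5 s:
for k in (40.0, 158.0, 640.0):
    T = natural_period(1.0, k)
    print(f"T = {T:.2f} s -> amplification = {amplification(0.5, T):.1f}")
# The middle stick, tuned to the shaking period, responds roughly ten times
# more strongly than the other two.
```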

Another famous reference in his field observations is the rotation of simple columns or parts of them (Fig. 37). Rotations caused by soil-structure interaction have been observed for centuries (e.g., chimneys, monuments, and tombstones rotated relative to their supports). Figure 37 shows the rotation of the memorial to George Inglis (erected in 1850 at Chatak, India) as observed by Oldham (1899) after the 1897 Great Shillong earthquake. This monument had the form of an obelisk rising over 60 feet high from a base of 12 feet on each side. During the earthquake, the topmost six-foot section was broken off and fell to the south, and the next nine-foot section was thrown to the east. The remnant is about 20 feet high and is rotated ~ 15° relative to the base. The study of the wavefield now includes rotations, already known on theoretical grounds but measurable only by "arrays" or by rotational sensors for "point rotations" (in Todorovska 2009).

Fig. 37 Rotated obelisks (Milne 1898; Oldham 1899)

But as Lemos et al. (2015) demonstrated, there is no need to impose rotational components of ground motion at the foundations to produce rotational effects (Fig. 38). The problem is much more complex, but it needs consideration for the completeness of the wavefield.

Fig. 38 Obelisk in Lorca, 2011 (Lemos et al. 2015)

As mentioned before, Milne gave great attention to both seismology and engineering, and he created the first such society, in Japan, in 1880, much ahead of time compared to other societies. The last two books he published (Milne 1911a; Milne 1911b), two years before he died, contributed significantly to the history of important earthquakes and seismology. Many more experts whose contributions were significant at the turn of the nineteenth century could be named: Lord Rayleigh, Omori, Oldham, Montessus de Ballore, Wiechert, Mohorovičić, and Gutenberg. Many emigrated from Europe or Japan to the USA. Their names are related to some discovery, instrument, law of occurrence, etc. In common, they all had profound knowledge of mathematics and physics. For a detailed description of the most important names related to Seismology and Earthquake Engineering (SEE), see Howell Jr. (2003).

3.2.6 Measuring shaking: maps of intensities and instrumental networks

Measuring the vibration of ground motion is an old desire of all earth scientists. The first intensity maps were drawn during the last quarter of the eighteenth century, but instrumental records only appeared more than half a century later.

Field missions were always critical to understanding the wave field and the performance of the existing stock of buildings, monuments and nature. Even nowadays, when we can see a lot from our office through new technological tools (photography, movies, video cameras, satellite imagery, drones, etc.), there is much information that only a visit to the damaged site can reveal. That is so true that Robin Spence (2014), who visited many places as a member of these missions, dedicated a chapter of his "Third Ambraseys Lecture 2014" to the lessons obtained during those visits.

So, since the first advancements in seismology, it became clear that it was necessary to have a way to measure the power of an earthquake. As the instruments were very rudimentary at the time, the only way forward was to look at the effects produced by the wave passage and derive an overall metric.

The first attempt to identify intensity was made in the first half of the nineteenth century, after the 1810 Mór earthquake in Hungary, which gave rise to the dissertation "de terrae motu Mórensi" by Pál Kitaibel and Ádám Tomcsányi, professors at Budapest University, completed in 1814; preceded only by Schiantarelli's map of the 1783 Calabrian earthquake (see Varga et al. 2015), it was probably the first map of isoseismals (Fig. 39). The first isoseismal map drawn in the way we are used to seeing nowadays was made by Mallet for the Neapolitan earthquake of 1857 (Fig. 40). After that, many more experts produced their own maps, as was the case for the earthquake that struck the island of Luzon in 1880 (Cerero y Sáenz 1890).

Fig. 39 Detail of Schiantarelli's map of the 1783 Calabrian earthquake

Fig. 40 Great Neapolitan Earthquake of 1857: isoseismals

The works of John Milne took the first steps in instrumental seismology. Figure 41 shows probably the first event ever recorded in history. On the left side, we see 10 s of the two horizontal components of the shaking (March 11, 1882), and on the right, the entire 138 s record of one component (March 1, 1882). Without knowing of the existence of P and S waves, he called them "a shock" after a preliminary tremor, then the "chief shock", the record terminating with "concluding vibrations". Twenty-four seconds separated the tP and tS arrivals. Milne observed the two peaks, the direction of motion, the duration, and the evolution of the record in time towards longer periods as the amplitude diminishes. In addition, he noted that underground vibrations were smaller and much smoother than those recorded at the surface. Milne and other Japanese colleagues were able to obtain records from seismoscopes and determine amplitude values of motion.

Fig. 41 Records of March 11, 1882 in Tokyo, and of March 1, 1882
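A minimal sketch of how such an S-P interval translates into an epicentral distance, using the modern textbook rule of thumb with assumed crustal wave speeds; this is an illustration, not Milne's own procedure:

```python
# Minimal sketch: if P and S waves travel at vP and vS, the S wave lags the
# P wave by dt = D/vS - D/vP, so D = dt * vP * vS / (vP - vS).
V_P = 6.0  # km/s, assumed crustal P-wave speed
V_S = 3.5  # km/s, assumed crustal S-wave speed

def epicentral_distance_km(s_minus_p_s: float) -> float:
    """Distance (km) at which S lags P by the given time (s)."""
    return s_minus_p_s * (V_P * V_S) / (V_P - V_S)

# The 24 s interval noted above would correspond to roughly 200 km.
print(f"{epicentral_distance_km(24.0):.0f} km")
```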

But the most significant advances were made by Montessus de Ballore in the late 1800s and at the turn of 1900. As chief engineer in Chile, he revolutionised instrumental seismology and opposed some of the theories of Mallet, which were essentially based on the orientation of the cracks provoked by the wave passage. He placed more confidence in the records obtained at multiple locations and, from those observations, tried to get to the source of the shaking. A few examples merging damage descriptions with the need for instrumental information can be seen in the Box below; the formulae and the numbers were taken from Mallet (1868) and Milne (1898) and support the statements made.


Box—A few formulae to evaluate the characteristics of ground motion (John Milne).

Gravestone—In case of overturning, it gives maximum acceleration and direction of motion.

Fluid Seismometers—Tubes full of mercury;

Period of Seiches—\(\tau = l/\sqrt{gh}\), where

\(l\)—length of the recipient in the direction of oscillation;

\(h\)—mean depth of water;

\(v = l/\tau\)—velocity of wave propagation.

Formula to measure the depth of a lake (Russel formula): \(h = \frac{l^{2}}{g\,\tau^{2}} = v^{2}/g\).

Horizontal pendulums indicate displacement amplitudes, which are associated with felt effects:

1 to 2 mm—strongly felt;

10 mm—dangerous;

20 mm—shattering of chimneys, dislocated stones;

up to 50 mm—fall of face walls, cracked brickwork and plaster, and other damage.

The concepts of attenuation of motion and the influence of soil conditions (the period changes with distance to the origin and with the soil) were already observed.

 
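A minimal sketch implementing the two relations quoted in the Box; the tank dimensions in the example are hypothetical:

```python
# Minimal sketch of the Box formulae: seiche period tau = l / sqrt(g*h),
# and the Russel formula for depth, h = l**2 / (g * tau**2) = v**2 / g.
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def seiche_period(length_m: float, depth_m: float) -> float:
    """Seiche period (s) for a basin of given length and mean water depth."""
    return length_m / sqrt(G * depth_m)

def lake_depth(length_m: float, period_s: float) -> float:
    """Russel formula: mean depth (m) from basin length and seiche period."""
    return length_m**2 / (G * period_s**2)

# Example: a 2 m long recipient with 0.1 m of water sloshes with a ~2 s
# period; inverting the formula recovers the assumed depth.
tau = seiche_period(2.0, 0.1)
print(f"tau = {tau:.2f} s, recovered depth = {lake_depth(2.0, tau):.2f} m")
```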

3.2.7 The first codes of practice

Codes, or simple recommendations including techniques to apply, depending on the type of society you are in and how often you have to deal with earthquakes or tsunamis, intend to fill the gap between the scientific/technical community, which is conscious of the earthquake problem, and the construction industry, which, in most cases, does not care about earthquakes.

As referred before, the main highlights as far as seismic-resistant construction are concerned until 1850 are as follows:

  • Himis construction in Turkey (Gülkan and Langenbach 2004).

  • Italian wood elements (Palermo, 1726, 1751).

  • The Pombaline Gaiola (1755).

  • Baraccata Building (1783, after 1755 “gaiola”).

  • Jean Rondelet and August Choisy (1838).

  • Encyclopedia of Construction (J. Gwilt 1867).

As referred to before, prescriptive measures for better seismic performance could be attributed to the regulation passed after the 1509 Istanbul earthquake (Gülkan and Sözen 2018). But we know that, after an important event, many advancements in science always take place, driving updates in codes, as happened after the earthquakes of Torrevieja (1829) and Andalucía (1884), not to mention the technical proposal in Italy, "Proposte tecniche" (Stamperia Reale di Napoli 1788).


Box—Good design.

In soft soil, the problem is worse due to the input of waves, like sea waves at the "Arribas". If the building does not function as a compact assembly of connected elements, it will fall to pieces. Or, as an old saying goes, all parts should move in synchronism. Ultimately, gravity makes structures fall when the pieces come apart, but it also aggravates the effect of shaking in slender and high structures like lighthouses or chimneys.

In a building, the façade may "go out of plane", separating from the rest of the structure and exposing the interior rooms; otherwise, the façade stays, but the interior may collapse.

In soft basins, the shaking resembles gravity waves, similar to sea waves.

There is a very similar effect with boats on a rough sea. The cage structure of a ship is identical to the proposal of a cage structure for a building.

The principles of good design should attend to three characteristics: (1) homogeneity, (2) elasticity and (3) indeformability. Homogeneity, to avoid disorganisation; elasticity, to oppose inertia; indeformability, to prevent too much load on the cage elements. Points (1) and (2) can easily be accounted for with a good choice of materials, but (3) is more difficult to achieve and inspired the use of a cage with diagonals that divide the structure into triangles that cannot deform. Here comes reinforced concrete (RC), which merges all three characteristics. A building constructed with RC elements behaves much better, as observed in Messina (1908) and San Francisco (1906), provided the elements remain connected during the motion. RC construction has the extra advantages of resistance to fire and hygienic fitness. At the time (the turn of the nineteenth century), RC construction was expensive and difficult to erect.

Site effects (soil & topography): construction materials should be solid, compact, coherent and elastic (stone, brick, mortar), with the same properties throughout.

Heterogeneity under vibration: if a "wall" falls in, the connecting element (mortar) does not have the same resisting properties as the units it binds.

The more homogeneous and elastic, the better! Openings such as windows or doors work against homogeneity and are vulnerable regions of the structure.

The first waves (P-waves) do not cause problems. The S-waves (not yet named at the time) have lower velocity, longer period and larger amplitude; if displacements surpass 20 cm, they become destructive.

The phase of arrival is another problem that favours the disaggregation of blocks. Inertia also plays an important role, causing more problems at the upper levels of the structure. A wall responds as an inverted pendulum; the critical point is then at half-height because of inertia.

 

Scibilia (2017) summarises the attempts made in Italy between 1880 and 1910 to adopt codes of practice for earthquake-resistant constructions as alternatives to RC. In fact, "The violent earthquakes that hit Italy in the second half of the nineteenth century spurred multiple reactions in earthquake-resistant construction and gave rise to specific regulations aimed at setting technical requirements for post-earthquake reconstruction. In Italy, the first earthquake-resistant standard code was issued after the catastrophic 1783 earthquake that struck the cities of Messina, Reggio Calabria and other smaller towns. The Royal Instructions for the Reconstruction of Reggio issued by the Bourbon government on 20 March 1784 validated the Baraccata construction system." A few regulations were implemented in Italy and other countries, especially after significant events, considering the Baraccata system. Towards the end of the nineteenth century, a few timber elements were replaced by iron skeletons.


Box—The first Official Seismic-Resisting Code of Practice.

The Italians claim that the first seismic-resistant regulation in Europe was elaborated and implemented by the Bourbons in 1785, following the two very strong seismic events that in 1783 hit Calabria (the current southern Calabria) and Sicily with particular violence, causing an estimated fifty thousand victims and incalculable damage. The effectiveness of this construction regulation was tested during the earthquakes of 1905 and 1908 (Messina): the constructions built following the Bourbon regulation, while suffering significant damage, in no case suffered total collapse.

 

Around the turn of the nineteenth to the twentieth century, several actions taken in Italy, described by Scibilia (2017), give a good insight into a report, "Building Standards for the towns struck by earthquakes", published in 1909, which turned into the "Royal Decree of 6 September 1912", containing the significant preoccupations of earthquake-resistant design at the time. This triggered great interest within the international community, creating competitions for better designs. A few points are described below:

“For the traditional wood Baraccata system, it was determined that its effectiveness depended on building a well-connected timber skeleton, with light infill materials covered with metal cables or metal mesh to secure the plaster. Such a system was suitable for buildings of no more than two floors high, with a square and compact floor plan.”

For Foundations, three main categories were recommended:

  (1) Rigid foundations;

  (2) Foundations independent from the ground, to minimise the transfer of vibrations from the ground to the building through the application of rollers, balls, springs or other devices;

  (3) Foundations falling within an intermediate category between the two types above, characterised by placing an artificial layer of sand or detrital material between foundation and ground to cushion the effect of seismic shock waves.

For a short but accurate summary of the evolution of seismic-resistant construction worldwide until 1900, see Cantelmi (2017).

Base isolation was also considered, through "balls and rollers" already recommended for lighthouses; Milne had also pursued this solution years before.

At the time, RC was limited to ring beams and floors, as a transition to full RC structures.

3.2.8 The 20th Century—1900–1950

  • The Elastic Rebound Theory, Fig. 42.

  • The seismological instrumentation (WWSSN, after the Second World War; 120 stations by 1960).

  • The first Strong Motion Records (Long Beach, 1933 and El Centro, 1940).

  • Change in materials (from masonry to mixed masonry & steel; Reinforced Concrete).

  • First steps in Structural Dynamics.

  • Field missions, always bringing something new.

  • First codes (application of standard static forces).

Fig. 42 1906 San Francisco Earthquake: Elastic Rebound Theory (by H.F. Reid); fence with > 3 m offset

The relative movement along the fault plane after the 1906 San Francisco earthquake was the immediate circumstance triggering the famous Elastic Rebound Theory, attributed to H.F. Reid (1911). But this relative movement (fault slippage) had already been observed in Japan during the 1891 Nobi earthquake. The importance of this observation led the Japanese authorities to build a shelter to preserve what was designated a natural monument.

To illustrate another compelling case among the various enumerated above, we shall recall the developments that led to the first known case of a set of forces representing the seismic action. The first steps were made in 1909, after the 1906 San Francisco earthquake, but only in 1927 did we have the method of elastic static equivalent forces (Fig. 43, provided by J. Estêvão 2005). The 1940 El Centro (Imperial Valley) earthquake, following Long Beach 1933, produced the record on which Housner (1952) consolidated the concept of "Response Spectra" (RS). RS was initially introduced by Suyehiro in 1920 and developed later by M. A. Biot in 1933 (in Trifunac 2006) and, finally, by G. Housner. Suyehiro's work (1929) on his multi-pendulum recorder is sometimes cited as the first appearance of the idea of representing earthquake excitation by a response spectrum. Freeman (1932) was also one of the first researchers to implement the concept of RS. Trifunac (2009) gives a good summary of the first steps toward the initial developments of RS.

Fig. 43 "Proposed US Pacific Coast Uniform Building Code", 1927 California. The method of elastic static equivalent forces was already in use in Italy and Japan
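To make the response-spectrum idea concrete, the sketch below (a standard textbook scheme, not the method of any of the authors cited) computes a 5%-damped pseudo-acceleration spectrum for a linear single-degree-of-freedom oscillator using Newmark's average-acceleration method:

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground acceleration record
    (m/s^2) via Newmark's average-acceleration method (gamma=1/2, beta=1/4)."""
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        k, c, m = wn ** 2, 2.0 * zeta * wn, 1.0       # per unit mass
        u = v = 0.0
        a = -acc[0]                                   # initial relative accel.
        umax = 0.0
        kheff = k + 2.0 * c / dt + 4.0 * m / dt ** 2  # effective stiffness
        for p0, p1 in zip(-acc[:-1], -acc[1:]):       # p = -m * ground accel.
            dp = (p1 - p0) + (4.0 * m / dt + 2.0 * c) * v + 2.0 * m * a
            du = dp / kheff
            dv = 2.0 * du / dt - 2.0 * v
            a = (p1 - c * (v + dv) - k * (u + du)) / m  # equilibrium update
            u, v = u + du, v + dv
            umax = max(umax, abs(u))
        sa.append(wn ** 2 * umax)                     # pseudo-acceleration
    return np.array(sa)

# toy record standing in for an accelerogram
dt, t = 0.01, np.arange(0, 10, 0.01)
acc = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)
print(response_spectrum(acc, dt, periods=[0.2, 0.5, 1.0, 2.0]))
```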

In 1943, a variant of the Static Equivalent Method was included in the Los Angeles Code, proposing a seismic coefficient C = 0.1.
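In modern terms, the seismic coefficient turns the building weight into a lateral load set. A minimal sketch (Python; the floor weights, heights and the w·h distribution rule are illustrative assumptions of later practice, not the 1943 provisions):

```python
# Equivalent static method in the spirit of the early codes:
# base shear V = C * W, distributed here over the floors in
# proportion to w_i * h_i (a later convention, for illustration only).

def equivalent_static_forces(weights, heights, c=0.10):
    v = c * sum(weights)                           # base shear
    wh = [w * h for w, h in zip(weights, heights)]
    return v, [v * x / sum(wh) for x in wh]        # floor forces

weights = [2000.0, 2000.0, 1500.0]                 # kN per floor (hypothetical)
heights = [3.0, 6.0, 9.0]                          # m above the base
v, f = equivalent_static_forces(weights, heights)
print(f"V = {v:.0f} kN; floor forces = {[round(x) for x in f]} kN")
```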

  • Great change in buildings construction materials.

  • The arrival of Codes of Practice in construction.

  • Great disasters (San Francisco, 1906; Messina, 1908; Kanto, 1923).

    o After the great earthquakes of Messina (1908) and Provence (1909), Montessus de Ballore realised that the wave field's movement had a harmonic character: longitudinal waves similar to sound waves, and transversal waves as happens with light.

    o The use of poor-quality construction materials, often rubble stones, and the widely adopted construction technique known as "a sacco", which used bare rocks, poor-quality mortar and delicate stone façades, was blamed for the widespread collapse of many buildings. Buildings constructed with better-quality materials or practices were less prone to collapse during the earthquake.

    o The Nikorai-do, an Orthodox church in Tokyo originally designed and built in 1891 by a famous English architect, was severely damaged in the 1923 earthquake. After the earthquake, it was redesigned by a Japanese architect and reconstructed considering seismic resistance. The seismic coefficient c = 0.1 came from the fact that during the Kanto earthquake in Tokyo the PGA was estimated at 0.3 g, while the "safety factor of material strength to allowable stress was assumed to be 3" (Dr. Toshikata Sano, in R.K. Reitherman 2012a).

  • The original steps of Structural Dynamics

    Structural Dynamics is part of Structural Engineering, whose roots go back to the seventeenth century with Galileo Galilei, Robert Hooke and Isaac Newton; others like Leibniz, Euler and Bernoulli completed the group in the eighteenth century. In the nineteenth and twentieth centuries, the knowledge became similar to what is taught in Structural Analysis classes. Structural Dynamics adds inertia to the structural system. The first treatment of modal analysis as an eigenvalue problem is attributed to Arturo Danusso in 1907 (Sorrentino 2007). Still, the first modern accounts go to Norris et al. (1959), followed by others like John Biggs, who analysed wind loads and then wrote one of the first treatises on Structural Dynamics (Biggs 1964); his modern approach to modal analysis gave a strong push to solving Earthquake Engineering problems. Newmark and Rosenblueth (1971), followed by the classic Clough and Penzien (1975) "Dynamics of Structures", made the first contributions to a well-developed treatise; many others followed in the decade after. We can also mention Timoshenko and Young (1990) or Timoshenko (1948), but these works were oriented more to other types of structures. The first edition of the famous SAP programme was released in 1970 (Fig. 44).

Fig. 44 First edition of SAP by Wilson (1970) and Wilson et al. (1973) (figure courtesy of J. Estêvão)

3.3 Points to retain

The period from 1700 until 1950 was full of developments in the knowledge of the wavefield and its consequences on the built environment. It started with almost no understanding of the origin of earthquakes or of the way waves radiate, and with no scientific knowledge of the physics of construction under seismic waves, though a good empirical understanding based on experimental evidence existed. The first interpretations of those facts were settled by the end of the period; what followed was an increase and consolidation of that knowledge.

We perceive the difficulties experts faced and their tremendous efforts to mitigate earthquake impacts. But the tools were not yet available; by trial and error, science advanced, and technology made its first contributions to a more resilient society.

Many ideas that are now cemented come from concepts perceived long ago. It is essential to be aware of where present theories come from and what challenges we face in solving problems. Today's tools permit us to do much more than our ancestors could, rendering society more resilient and sustainable towards earthquake impacts.

As a note, we can cite: “The descriptions provided by Vivenzio (1783) of “Pombalino and Baraccata” earthquake-proof houses are technologically pioneering from several points of view:

  • first of all, the intuitive principle of linking the entire building together as in a single structural unit represents an accurate understanding of how structures react to earthquakes;

  • secondly, the adoption of wooden bracing diagonals along the entire construction provides "in the plan" resistance to lateral actions resulting in a highly effective and cheap solution in construction;

  • third, the symmetry in plan and elevation. This principle is in great contradiction with the design creativity that many architects claim."

4 Main achievements since mid 20th Century

Summary

  • Widespread seismic instrumentation (both seismological and SM networks), data treatment, archiving and open access.

  • Field mission and the power of earthquake occurrences to make science and technology advance.

  • Rotational components of ground motion (Spudich and Chiou 2008; Lee et al. 2009; Todorovska 2009a, b).

  • Asynchronism of input motion (its effects are still a problem to understand).

  • 2D and 3D behaviour of soils and topographic effects.

  • Simulation of rupture of faults and wave propagation from source to the site.

  • Change from forces applied to structures to Performance-Based Design (PBD).

  • Non-structural effects (initiating the modelling of infill walls).

  • Passive and active devices for reducing forces in the structure.

  • Soft engineering to estimate hazard and scenario impacts.

4.1 1950–2020

This period is so full of developments that we can consider that Seismology and Earthquake Engineering (SEE) acquired the standing of new sciences, being taught in a large number of universities throughout the World: the great American schools, namely UC Berkeley, MIT, Illinois, Caltech, Stanford, Michigan, Purdue, etc.; the European schools with dedicated courses at Imperial College, the University of Pavia (ROSE School), Milan, Roma La Sapienza, Athens, Thessaloniki; and the Japanese courses at the University of Tokyo (IISEE-UNESCO), Kyoto, etc. Nowadays, there is an excellent selection of courses in Structural Engineering, Geotechnics, Hydraulics, Architecture, Mechanical and Electrical Engineering, Geology, Geophysics, Geography, Urban Planning, etc., and soft topics for students, alumni, practitioners, etc. A significant push also came from the excellent laboratories connected to state-owned institutions, such as those of the China Earthquake Administration, the Bristol shaking table, the EU/JRC ELSA pseudo-dynamic reaction-wall testing facility, the physics laboratory in Trieste, the large installations of the army in various countries, the E-Defense shake table and the Chiba Experimental Station in Japan, facilities and organisations in the USA, and the nuclear installations in Saclay (CEA, France), as well as from the UNESCO summer courses and the NATO projects in Eastern Europe.

Whenever a significant event occurs, the inflicted impact (human and economic) is so high that the politicians in the affected countries do not want to "suffer" anymore; not wanting to take responsibility for the next event, they divert resources to mitigate the impacts. If no seismic activity takes place for a certain period of time, they forget that earthquakes are recurrent events.

In terms of university curricula, at the turn to the twenty-first century it is interesting to verify that many institutions kept the consolidated, more theoretical courses. Still, new courses dealing with more holistic matters are offered to students, such as management, political policies and jurisdictional aspects, together with a wider selection of courses traditionally belonging to departments outside Civil Engineering or Applied Physics. Many after-work-hours courses are also offered, many of them online, as the pandemic times of the 2020s required.


Box—Topics in Earthquake Engineering.

Topics like: shaking tables, force methods, performance-based analyses, strong-motion seismology, probabilistic models for occurrence, attenuation, hazard analyses, return periods, frequency and modal identification, H/V geotechnical observation, linear and non-linear modelling, small and large displacements, new materials, computer-code development, codes and code paradigms, manuals, guides, base isolation, structural and non-structural damage, health monitoring, simulations (of ground motion, of scenario impacts), field missions to moderate and large events, shaking, tsunamis and related hazards, multi-hazard analysis and domino effects.

or

•Geotechnical influence (soil and topography).

•Field missions, always bringing something new.

•Correlation of magnitude with fault properties (L, D, W- Bonilla et al. 1984).

•Probability science (hazard, vulnerability, exposure).

•Extreme events (long return periods).

•Sophisticated codes (with increased requirements).

•Variability (aleatory and epistemic; ergodic vs. non-ergodic).

•Rehabilitation (the engineering is specialised and the whole process costly).

•The Societies for Seismology and Earthquake Engineering.

These ideas became commonly accepted and appear in every conference or workshop.

 

Considering the number of topics developed over the last 70 years, we will concentrate on just a few that merit a little more attention.

4.2 Strong motion seismology

4.2.1 Evolution of strong motion stations

Strong motion instruments became available in 1950 (the old SMA-1; Erdik et al. 1987), with a rapid evolution until the 1980s (Fig. 45), when they turned from optical to digital. By 2000 their number, according to Trifunac (2009), was a few thousand. The optical instruments posed significant problems in data treatment when determining velocities and displacements from the acceleration traces, and digitising the records took much time; several dissertations were devoted to determining the best correction to implement. The digital instruments had fewer problems of this kind, but at the beginning their dynamic range was not very large. With the advancement of technology the situation became much better, including universal timing and transmission to a central station.

Fig. 45 Evolution of strong motion stations—several sources; dynamic range (after Trifunac and Todorovska 2001; Trifunac 2009)
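The integration problem mentioned above can be sketched in a few lines. This is a deliberately crude processing chain (Python with NumPy/SciPy; the corner frequency and filter order are illustrative assumptions, and real strong-motion processing is far more careful):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def correct_and_integrate(acc, dt, fc=0.05):
    """Recover velocity and displacement from an accelerogram (m/s^2):
    remove the mean, high-pass above fc (Hz) to suppress baseline drift,
    then integrate twice with a simple rectangle rule."""
    acc = acc - acc.mean()
    b, a = butter(4, 2.0 * fc * dt, btype="highpass")  # fc / Nyquist
    acc = filtfilt(b, a, acc)
    vel = np.cumsum(acc) * dt
    dis = np.cumsum(vel) * dt
    return vel, dis

# stand-in record: 60 s of noise at 50 samples/s
dt = 0.02
acc = np.random.randn(3000) * 0.5
vel, dis = correct_and_integrate(acc, dt)
```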

When the price came down, a better understanding of the wavefield became of great interest, and countries bought significant numbers of units, used side-by-side with the first generation of instruments.

Strong Motion Virtual Data Center (VDC)—5399 stations in database https://www.strongmotioncenter.org/vdc/scripts/stnpage.plx?

We now have two leading networks in Japan, KiK-net and K-NET, with more than 1700 stations. Elsewhere, 500 three-component accelerometers continuously telemeter acceleration data; one hundred of those stations are located on Los Angeles Unified School District campuses, and about 200 stations are deployed around Ridgecrest (LA). In Sichuan, 2000 stations were installed recently (Peng et al. 2021). Etc.

Ambraseys et al. (2002) initiated the archiving of strong motion records in a database at Imperial College. An "exponential" increase of GMPEs (Fig. 46) took place until 2010.

Fig. 46 a "Exponential" increase of GMPEs (Douglas 2021); b SMART-1 array for wave passage

The first attenuation curves for PGA were published by Milne and Davenport (1969), Esteva (1968) and Donovan (1972), with data available in California, the last one after the 1971 San Fernando earthquake. Facing the sudden increase of attenuation curves, also designated Ground Motion Prediction Equations (GMPEs), Douglas (2021) created an encyclopaedia in 2016 and has updated it yearly since. It contains all the parameters of each GMPE, i.e., frequency dependence, tectonic environments, the geometry of faults, etc. If we included as many known characteristics as possible (fault type, seismic environment, etc.), would the number of different GMPEs come down to a very short count, or even to a single universal law?
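The functional skeleton shared by most GMPEs can be sketched as follows; the coefficients here are invented for illustration only and fit no real dataset (real equations add site, style-of-faulting and many other terms):

```python
import math

def gmpe_ln_pga(m, r, coeffs=(-4.5, 1.0, -1.2, 6.0)):
    """Toy GMPE of the classical functional form
    ln(PGA) = a + b*M + c*ln(sqrt(R^2 + h^2)), PGA in g.
    Coefficients are hypothetical, for illustration only."""
    a, b, c, h = coeffs
    return a + b * m + c * math.log(math.hypot(r, h))

pga = math.exp(gmpe_ln_pga(6.5, 20.0))  # M 6.5 at 20 km
print(f"PGA ~ {pga:.2f} g")
```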

On the other hand, the SMART-1 array (1980) pioneered many aspects of Engineering Seismology (Bolt 1999): coherency; intra- and inter-variability of ground motion; asynchronism; rotational components. SMART-2 and the Chiba Experimental Station (Katayama 1991) were other fundamental facilities for studying the characteristics of strong motions recorded in dense arrays.

Looking at the advancements triggered by earthquakes, we can organise the epochs since 1950 around three turning points. Gülkan and Reitherman (2012) developed a similar analysis based on the material presented at the various World Conferences until 2012. We will attempt a similar synthetic evolution based on events and on how they provoked essential changes in Earthquake Engineering.

For an account of the events of more significance to the History of Seismology from the Chinese and Greek Civilizations to the end of the twentieth century, see Agnew (2002).

4.3 Earthquakes as turning points in earthquake engineering

Earthquakes have always marked the advancements of Seismology and Earthquake Engineering (SEE). Field missions were essential for inventorying the most critical issues observed (see Spence 2014a for an account of the most interesting novelties detected in different field missions). In the years following an important event, improvements started emerging in the published literature and, later on, in the codes that incorporate the newly acquired knowledge. The lists that follow intend to illustrate the facts most relevant in each earthquake that led to upgrades in Earthquake Engineering. The lists are organised by epochs, or three "Turning Points", as Reitherman (2012) calls them. These turning points of the second half of the twentieth century and of the twenty-first century play a role similar to that of the "gaiola pombalina" and the "casa baraccata", which "seem to be the 'turning point' of a gradual improvement process that, in Italy, became recognisable after the 1703 L'Aquila earthquake (Carocci et al. 2021), with the introduction of a constructional system, based on wooden elements embedded in masonry works, quite distinct from the rigorous organisation of the late eighteenth century systems but having seemingly identical intents."

First Turning Point—1964–1971

Country | Earthquake | Year | Observations
Japan, USA | Niigata, Alaska | 1964 | Liquefaction
Venezuela | Caracas | 1967 | Soil influence: pancake phenomenon
 | | 1968 | Initial shaking tables
 | | 1971 | Design of tall buildings with seismic design
USA | San Fernando | 1971 | Pacoima PGA > 1 g
 | | 1974 | New structural software: SAP, STRUDL, STRESS
 | | 1974 | Probability Damage Matrices
China | Tangshan | 1976 | Massive destruction after the unique short-term prediction

Second Turning Point—1971–2001

Country | Earthquake | Year | Observations
Mexico | Mexico City | 1985 | Soil amplification at a considerable distance; resonance
Armenia | Spitak | 1988 | Disaster in prefabricated housing
USA | Loma Prieta | 1989 | Impact on "lifelines"
USA | Northridge | 1994 | Retrofitted structures with non-structural elements
Japan | Kobe | 1995 | Largest economic impact
Turkey | Izmit | 1999 | A multitude of effects; impact on an industrial facility
India | Bhuj | 2001 | Significant impact for an M7.7
Iran | Bam | 2003 | Patrimonial devastation

Whitman (1998) was the first to introduce the concept of a "scale of damage", initially with 7 grades, which then evolved into 5 grades, D1–D5. These grades would be related to the percentage cost of restoring buildings to their situation just before the event. These were the initial stages of the vulnerability and fragility concepts that became very popular for impact studies at the urban macro level.

Third Turning Point—2004–2020

Country | Earthquake | Year | Observations
Indonesia | Sumatra | 2004 | Great tsunami and great destruction
China | Sichuan | 2008 | Great effect on landslides
Italy | L'Aquila… | 2008–2016 | A series of significant events in the same active area
Haiti | Port-au-Prince | 2010 | A developing country left alone for decades
Japan | Tohoku | 2011 | The most extensive collection of data for shaking and tsunami
Nepal | Kathmandu | 2015 | Many different constructors from different parts of the World were the builders before the event
Mexico | Mexico City | 2017 | Test of retrofitting after 1985

What is happening: rare mega-events that cause massive damage and disruption over a massive area of perception (Sumatra 2004, Tohoku 2011), and small-to-moderate, frequent events that cause local damage, which might be disastrous but affects a relatively restricted area (Albania 2019, Croatia 2020, Samos-Izmir 2020, Les Cayes-Haiti 2021).

Code modifications in the USA: from ATC-3 (1978) to UBC (1988), with the introduction of soil influence, an importance factor, and limits on inter-storey drift. Northridge 1994 introduced new changes to UBC (1997). The first steps of ISO/FDIS 3010 and of the Euro-Codes date from the early 1990s; a decade later they had produced an enormous amount of material, including EC-8 (2004). The Euro-Codes were based on the CEB-FIP Model Code 90, formerly Model Code 1978 (https://www.fib-international.org/publications/ceb-bulletins/ceb-fip-model-code-90-pdf-detail.html).

4.3.1 Caveats perpetuated for more than half a century

  • Soft-story concept

  • Reinforced concrete—not as durable as initially thought, and lack of confinement

  • Hazard analysis based on 1900s data only

  • PGA as the sole representative of Strong Ground Motion (SGM)

  • Deterministic formulation of SGM

  • Erroneous comprehension of soil behaviour for large amplitudes.

We selected two cases, the first and the last of the above list. The soft-storey concept is generally considered a point of significant vulnerability in a building, due to the concentration of drift in the first floor instead of its spreading along the entire height of the building; many books and reports analysing the behaviour of structures during earthquakes attest to this. Nowadays, special provisions are made in codes to avoid this fragility. The idea of soft storeys came with Le Corbusier (Fig. 47), who launched the concept in the mid-1900s to solve a few urban problems. In Portugal we have a tremendous problem with this architecture, because it became the fashion from the 1950s to the 1990s and the number of such buildings is quite large, with the extra aggravation that the columns are very slender, causing additional P-δ effects. The only ways to decrease the vulnerability are to intervene with dampers, to increase rigidity, or to base-isolate these buildings. A few of these buildings were declared landmarks for their architectural merit, which means that mitigation of the problem needs close coordination with the architectural "bodies".

Fig. 47 Le Corbusier and his influence around the World: the "soft-storey concept" (https://upload.wikimedia.org/wikipedia/commons/a/a1/Corbusierhaus_B-Westend_06-2017.jpg)

The other example of an error perpetuated during the second half of the twentieth century concerns the behaviour of soil sites versus rock sites.

Figure 48 shows the range of amplification of a soil layer relative to firm rock, illustrating the problem of certain soils, which amplify the motion relative to rock sites for small-amplitude motion up to an upper bound in PGA. The curve presented resulted from observations of the 1995 Kobe earthquake; a previous curve (in blue), made after Mexico 1985, would be much lower. Many codes prepared at that time were wrong in what concerned soil effects.

Fig. 48 Range of amplification of a soil layer in relation to firm rock

A final short note on prescriptive measures included in codes that showed poor performance: the lack of confinement, as seen in Fig. 49a, with the stirrup spacing then recommended. In 1971 the change became compulsory in several countries, including Japan (Ishiyama 2011) (b). Now the stirrups cross the entire beam-column connection, as in the example shown in (c), from KTP-N.2-89 (1989).

Fig. 49 a Large stirrup spacing (25–30 cm) until 1970, b 10 cm after 1971 and c after 1989

4.3.2 Performance-based design approaches (PBD)

All designs until the 1980s were essentially linear, based on spectral forces and on the consideration of a unique ductility factor that would reduce the resisting forces to account for actual non-linear behaviour. All elements in a structure would see their stresses reduced by the same amount, even though the non-linear demand concentrates on only a few points of the structure. The essential objective of design was already stated in Hammurabi's Code (c. 1795 to 1750 BC), where it is said that "a house should not collapse and kill anybody"; the concept is also described in Vitruvius's books. Performance-Based Design (PBD) extends this requirement to reducing damage to a controlled minimum and keeping the building functional for a given level of seismic input. How is it done? By defining the zones where the possible inflicted damage is controlled and non-linear behaviour is permitted; in doing so, one knows where and how to concentrate attention in the design of a structure.

Due to the significant uncertainties in the definition of hazard and of construction materials, towards the end of the 1990s a new current of thought started changing the way of looking at the goals of seismic design, making it more independent of those uncertainties. On the other hand, it makes no sense to use a single ductility factor for the entire structure, knowing that damage to a system is concentrated in a few points of "weakness". The idea was to define Limit States (LS) and to fix Engineering Demand Parameters (EDPs) for each of them; the forces applied to structures were replaced by EDPs.

But that came after the presentation of the Acceleration-Displacement Response Spectrum (ADRS), in which the response spectrum is plotted as spectral acceleration vs. spectral displacement (Fig. 50), and the non-linear capacity of the structure is drawn in the same plot (the red line) for various conditions and situations. The intersection of those curves reflects the performance point. The Performance-Based Design approach (PBD) would come out of that.

Fig. 50 ADRS (Acceleration-Displacement Response Spectrum) applied to various situations
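The change of axes behind the ADRS is the spectral relation Sd = (T/2π)² Sa. A minimal sketch (Python; the spectrum values are hypothetical):

```python
import numpy as np

def to_adrs(periods, sa_g):
    """Convert a Sa(T) spectrum given in g to ADRS coordinates:
    Sd = Sa * (T / (2*pi))^2, returned in metres."""
    g = 9.81
    return np.asarray(sa_g) * g * (np.asarray(periods) / (2 * np.pi)) ** 2

T = np.array([0.2, 0.5, 1.0, 2.0])
Sa = np.array([0.75, 0.60, 0.30, 0.15])  # hypothetical design spectrum (g)
print(to_adrs(T, Sa))                    # Sd in m for each spectral point
```

The performance point is then read off where the capacity (pushover) curve, converted to the same coordinates, crosses this demand spectrum.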

Priestley (2003), with his paper "Myths and Fallacies in Earthquake Engineering Design", was probably the first to pinpoint PBD. He identified 8 points that revolutionised design; perhaps the most important is: "if we accept that displacements are more critical than forces, it is time we started basing our designs on displacement, rather than acceleration spectra". This was the start of Displacement-Based Design, or PBD, as opposed to design based on strength and ductility. We will come back to this point in Sect. 6.7.

4.4 The 21st century

The first 20 years of the new millennium are so rich in scientific and technological achievements, compared to previous periods, that significant advancements can be pinpointed as landmarks of Seismology and Earthquake Engineering (SEE). Even the mitigation of impacts might become a reality, despite the mega-disasters.

  • Complex design codes that can only be applied through complex computer programs

  • Rehabilitation with multi-hazard requirements (Energy & acoustics)

  • Simulation (fault rupture, impact on communities)

  • Inter-dependence and intra-dependence systemic analyses

  • EEWS and TEWS Systems

  • Machine learning (millions of terabytes of untreated information)

  • Mitigation of damage (pre, during and post-event)

  • Science for Disaster Risk Management (DRM)

  • Citizen Science

  • Education and Dissemination

  • The interest in Tsunami science.

Interest in tsunami science gained new impetus with the Sumatra 2004 and Tohoku 2011 events. The tsunami became a great "instrument" to reduce uncertainties in the location and mechanisms of ancient events, using back-analyses of the arrival times and heights of the arriving waves. But tsunamis were not considered as important as earthquake shaking, even though one or two large tsunamis (Anchorage 1964 and Chile 1960) affected large, important areas.
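Those back-analyses rest on the shallow-water wave speed, v = √(gh). A minimal sketch (Python; the path segments are hypothetical) of how depth along a path converts into an arrival time:

```python
import math

g = 9.81

def tsunami_speed(depth):
    """Shallow-water phase speed v = sqrt(g*h), valid when the wavelength
    greatly exceeds the water depth (always true for tsunamis)."""
    return math.sqrt(g * depth)

def travel_time(segments):
    """Crude back-analysis helper: total travel time over a list of
    (distance_km, depth_m) segments along the propagation path."""
    return sum(1000.0 * d / tsunami_speed(h) for d, h in segments)

# hypothetical path: 500 km over 4000 m depth, then 50 km over a 100 m shelf
t = travel_time([(500.0, 4000.0), (50.0, 100.0)])
print(f"{t / 3600:.1f} h")  # ~1.1 h for this toy path
```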

4.5 Introduction to the excel file: through ages

An Excel file in flow-chart form, which constitutes "Appendix 2", intends to show the development of science and technology through the names of the scientists who collaborated in the evolution of Seismology and Earthquake Engineering (SEE) since ancient times. Mathematicians, physicists, architects, and engineers are the core of this file. Some books are also part of this historical review.

With this "aid", it is straightforward to understand the interactions among the different fields of knowledge and how they influenced each author's route.

This file is completed by the list of references in "Appendix 1", where fundamental texts were selected for anyone's formation. Two other artefacts should be added for completeness of information:

  1. A Multi-Language Glossary of Natural Disasters (1997), a compilation of terms in several languages made in the context of the IDNDR.

  2. A selection of the many names that made significant contributions to the state-of-the-art topics covered in this review (see Lee 2002 or Howell Jr. 2003).

4.6 Landmarks in the seismo-resistant construction advancements through the last 250 years

  • Reconstruction of Lisbon (1755)

  • Creation of the first academic society of earthquake research, Seismological Society of Japan (SSJ) (1880)

  • 1st World Conference on Earthquake Engineering (1956)

  • 1st Generation of Codes with the application of horizontal loads (1950)

  • Great advances in research and development to arrive at Modern Codes (20th Century)

4.6.1 Construction facing earthquakes. Modern codes for earthquake construction

Significant changes in seismic codes were made after the damage caused by severe earthquakes, starting essentially in 1900.

Fajfar (2018), in his fifth "Prof. Nicholas Ambraseys Lecture", analyses the past, present and future of seismic provisions for buildings.

Table 6 summarises the highlights of the development of codes from the mid-nineteenth century until our days, emphasising the main issues under discussion. We added to Table 6 a few complementary codes before 1900 and the recent EC-8 (2004).

Table 6 The evolution of analysis provisions in seismic codes

The first generation of modern codes was initiated with ISO 3010 (1988), which produced the first documents. The first edition includes principles for determining seismic actions on structures and for seismic design, without any specific values for factors.

We should emphasise that, beyond these codes, many countries followed similar examples and created their own. The most interesting evolution is seen in the USA and Europe. In short, the US code is not mandatory; in recent years it has converged on the IBC (International Building Code), with a first edition in 1997 and upgrades every three years. Many agencies participated in the redaction of the codes, namely SEAOC, ATC, NBC, SBC, UBC, NEHRP, IBC, and ASCE-7 (Ishiyama 2011).

The EC-8 (2004), part of a series of 6 documents (under revision towards 2023–2024), was made for the whole European space and has a final part, to be produced by each country, containing that country's customised parameters. In some countries EC-8 is mandatory; in others it is only a recommendation.

Codes dedicated to retrofitting appeared only very recently, and those parts are very "severe" for certain types of structures, such as schools or critical installations, requiring in some instances higher seismic actions than for new structures of the same importance class. This raises the question of whether such a policy is the most appropriate measure, because we may be spending resources that are not necessary.

Most science and technology developments came after significant events that revealed new details not yet perceived. Most earthquake events, especially the larger ones, brought new knowledge to seismology and earthquake engineering. Table 7 shows the events since the eighteenth century that have most affected Earthquake Engineering, according to Reitherman (2012). For each one, he assigns an increment in knowledge and points out essential items in the development, such as:

  • ductility concept; probabilistic approach; Structural Dynamics; and the socio-economic conditions of the people in the affected regions.

Table 7 Selected significant earthquakes that have affected earthquake engineering and lessons learned (Reitherman 2012)

According to Table 6, the interval between new versions of US codes is around ten years, but if we look at countries around the World, the most updated versions are, in many cases, 20 years old (Regulations for Seismic Design: A World List, IAEE 2020).

Table 8 details more profoundly the new lessons gathered from recent earthquakes (1971–2011) that directly impacted the US codes.

Table 8 Main advancements triggered by earthquake events, mainly for the US (UBC 1994… and IBC 2018)

4.7 Points to retain

As referred to in the summary, this epoch was full of new advancements, with a significant increase in human resources, programmes and projects in the countries with higher odds of experiencing events. As seen in Sect. 2, earthquakes and tsunamis are still the natural events causing the most victims and economic losses. Many students wrote Master's and PhD dissertations across a wide selection of topics. We enumerated just a few issues more prone to new ideas; many more could be listed, but that was not the objective of Sect. 4. A significant part is dedicated to the achievements obtained, namely a summary of the main provisions gathered after significant earthquakes, listed in a few blocks indicating the turning points in research. Many papers and topical conferences offer an incredible spectrum of subjects of interest.

With the turn of the Millennium, as will be seen in Sects. 6 and 7, research in SEE enlarged its initial base of development into many other areas, from other engineering fields to management, sociology, disaster mitigation, education, serious games, etc.

5 Intensity Scales: how we can upgrade them

Summary

  • The first developments and the role of the field missions

  • The need to quantify the shaking and the various Scales

  • The new scientific advancements

  • The role of video-cameras

  • Proposals for upgrading

5.1 A short history of intensity and intensity scales

Written and printed historical records and reports have always been the primary sources of macroseismic information needed for re-evaluating historical earthquakes and organising earthquake catalogues. One criticism of using historical seismicity in hazard or risk studies is that data on ancient events are so inaccurate that they introduce more uncertainty than can be accepted. The most recent attempts to re-assess the seismicity of regions have also looked for new, previously unused complementary data, such as engravings and precise dating, enabling new interpretations in the light of seismological and engineering knowledge. Cross-references are also used to solve difficult situations, especially when there are few original data. The present Section attempts to test the utility of all this historical material to reduce uncertainties.

As referred to in Sect. 3.1.6, intensity measurements were a great discovery in seismology just before the appearance of the first instrumentation. After realising that earthquake shaking was a wave phenomenon provoked by some unknown origin, experts in scientific matters realised that the damage to the existing stock of buildings and monumental structures would attenuate with distance from some source and that the landscape would influence it. Varga (2008) traces in profound analyses the first steps taken in this regard until the first scale was formally proposed in the mid-nineteenth century. According to Varga, everything started with the Calabrian earthquake of 1783, when Dolomieu (1784) and other investigators found that wave propagation has different effects depending on the types of rocks at the surface. Schiantarelli (cited by Musson 1994) even tried to quantify the impacts based on the damage suffered in different settlements (Fig. 39).

Different authors settled the intensity scale with various numbers of levels, among whom we should cite Egen (1828) and Seebach (1873). Still, it only appeared in an organised format with Rossi and Forel (1881). Intensity scales evolved with the contributions of many different authors from the late nineteenth century until now. The Modified Mercalli Intensity (MMI) scale (Wood and Neumann 1931) was the most practised in the US and in South American countries, and the Medvedev-Sponheuer-Karnik (MSK) scale (1965) in Europe. Japan had its own scale with only 7 degrees. Things calmed down with the European Macroseismic Scale 1998 (EMS-98) (Grünthal et al. 1998), which is the epilogue of all the others, with a more comprehensive description in the format of what had previously been presented by the MSK in 1964.

5.2 Description of intensity scales

Several researchers have been connected to historical investigation over the last 20 years, with various publications on this theme. To mention a few, Musson (2009) and Musson and Cecic (2011) present a systematic analysis of the EMS-98 scale, explaining the history of intensity scales, the modern intensity scales, how to collect macroseismic data, how to process them and how to determine earthquake parameters from them. Others like Varga, Musson, Grünthal, and Ferrari made similar efforts to revitalise the idea of macroseismic intensities. The explosive increase of instrumental data temporarily reduced the importance of macroseismic studies, making them more challenging to accept in Engineering Seismology. Lately, interest in historical seismicity, including archaeo-seismology, has increased again, because hazard studies cannot reduce epistemic uncertainties unless the period of analysis includes historical events.

We find an extensive collection of different scales to measure the size of a given event: a scale for measuring shaking, another for tsunamis, liquefaction, landslides, acoustic noises, impact on ships, electromagnetic effects, etc. Why so many? Maybe if scale descriptors were compacted, the uncertainties would be much reduced!

Figure 51 summarises the various scales since the first proposal in 1881, and Fig. 52 accounts for scales corresponding to other natural disasters.

Fig. 51 Evolution of earthquake intensity scales since 1783

Fig. 52 Evolution of intensity scales used to measure other natural disasters

If we look at the time intervals for the appearance of a new scale, we get one every 21 years. The last two were the MSK (1964) and the EMS (1998); 23 years have now gone by since the previous version (EMS-98). There have been recent attempts to update EMS-98, the IMS (International Macroseismic Scale) being the latest trial.

Among the large offer, we will concentrate on the scales most used nowadays, which describe the effects of earthquakes in more detail: the IMM (1956), the EMS-98 and the ESI-2007. We will also look briefly at the MSK-64. The IMM, the EMS-98 and the MSK-64 are addressed in the following, leaving the ESI-2007 for Sect. 5.3.

The Modified Mercalli Intensity scale (IMM 1956) qualitatively assesses the intensity of an earthquake, defining degrees of intensity in Roman numerals from I to XII. The degrees range from a minimum corresponding to non-perceptible shaking (I), through intermediate degrees (with structural engineers recommended to validate the information above VIII), to a maximum degree (XII) representing practically total damage or collapse. Each degree of intensity is associated with a descriptor of the effects recorded in many respects, from constructions to suspended and static bodies, through soils, hydrological and hydrogeological characteristics, and human reactions. Damage descriptors observed in buildings are complemented with a supporting characterisation that assigns classifications A, B, C and D to masonry, considering its material characteristics and earthquake-resistant capacity.

The structure of the MSK-64 scale is built on the following characteristics/categories:

  I. Type of structure (masonry, RC, code level).

  II. Definition of quantity (few, many, most).

  III. Classification of damage (D0, …, D5).

  IV. Arrangement of the scale (persons, structures, nature).

Each of these categories is arranged with the help of detailed descriptors. A short version is also available.

The EMS-98 is used in European countries. It was based initially on the MSK-64, with a proposal in 1992 (EMS-92) revised in 1998. The term "macroseismic intensity" is used in EMS-98 solely to classify the severity of the motions based on the effects observed in a given area.

“Some specific problems to be addressed were:

  • to include new types of buildings, especially those considering earthquake-resistant design features, which are not covered by previous versions of the scale;

  • to address a perceived problem of non-linearity in the scale arrangement at the junction of the degrees VI and VII (which, after thorough discussion for preparing the EMS-92, as well as for the EMS-98, proved to be illusory);

  • to improve the clarity of the wording in the scale;

  • to decide what allowance should be made for including high-rise buildings for intensity evaluations;

  • whether guidelines for equating intensities to physical parameters of strong ground motions, including their spectral representations, should be included;

  • to design a scale that not only meets the needs of seismologists alone but which also meets the needs of civil engineers and other possible users;

  • to design a scale which should be suitable also for the evaluation of historical earthquakes;

  • to perform a critical revision of the usage of macroseismic effects visible in the ground (rock falls, fissures etc.) and the exposure of underground structures to shakings.”

EMS-98, like other scales before it, is based on a series of characteristics/categories, among which the most relevant are:

  • The differentiation of buildings into various "types of structures" (masonry with seven sub-types; RC with seven sub-types; steel; and wood), each assigned vulnerability classes A, B, … F covering a range of possible hypotheses.

  • A classification of damage into five grades, with descriptors and illustrative pictures for masonry and RC structures.

  • A definition of quantity in 3 classes: few (0% to 15–18%); many (12–18% to 55–58%); most (53–58% to 100%).

  • An arrangement of the scale considering:

    (a) effects on humans;

    (b) effects on objects and on nature (effects on ground and ground failure are dealt with especially in Annex 7);

    (c) damage to buildings.

Each of the 12 intensity degrees is assigned by combining effects that consider the above-described characteristics.
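Because the quantity classes deliberately overlap, an observed percentage can map to more than one class. A toy illustration (Python), using only the ranges quoted above:

```python
def ems98_quantity(percentage):
    """Map the share of buildings showing a given damage grade to EMS-98's
    fuzzy quantity classes; the overlapping boundaries below are taken from
    the ranges quoted above (outer bounds used)."""
    classes = []
    if 0 <= percentage <= 18:
        classes.append("few")    # 0% to 15-18%
    if 12 <= percentage <= 58:
        classes.append("many")   # 12-18% to 55-58%
    if percentage >= 53:
        classes.append("most")   # 53-58% to 100%
    return classes

print(ems98_quantity(15))  # ['few', 'many'] - a deliberately ambiguous band
print(ems98_quantity(70))  # ['most']
```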

Table 9 is a short form of EMS-98. Annex 7 of EMS-98 deals with "Effects on Natural Surroundings" (Fig. 53), which are essential for assigning intensities to various effects but are not directly contemplated in EMS-98.

Table 9 Short form of EMS-98

Fig. 53 Intensities assigned to several types of effects not contemplated in EMS-98, only in Annex 7

The primary descriptors are presented so that the various degrees of the scale can be understood. This scale corresponds to a great effort by the seismological community, which tried to bring in the most recent knowledge on how the wave field affects our urban tissue. For the first time, an intensity value would be arrived at from the degree of damage inflicted. This was very important because, since then, the notions of vulnerability and fragility could be used to estimate the physical impact of future earthquakes, and the scenario philosophy for simulating earthquake impacts spread widely in many countries.

After many years of testing the EMS-98 scale in assigning intensities to earthquakes, as described in Sects. 5.5 to 5.7, there have been several proposals to overcome certain drawbacks observed in the present scale. We will address a few that seem most obvious.

Tavares et al. (2021) make a new proposal considering the effects of shaking on small rivers, after reviewing data from the Lisbon 1755 earthquake. The water sloshes violently against the banks, opening up "spaces" in the middle of the river. They attributed intensities VI to VIII to the worst cases.

Comparing the IMM (1956), EMS-98 Annex 7 and the ESI 2007 (see Sect. 5.3), there are still gaps regarding descriptors and intensity values. The authors showed agreements and divergences among them, as well as the need to include greater detail for intensity degrees below X for rivers: flow increase with possible alterations in the river course limit (between VI and VIII); flow suppression and eventual reset (VIII or higher); unnatural current agitation, possibly with vertical wave movements (between VI and VIII); streams with cloudy water (between VI and VIII). The effects of seiches and sloshing are also reported for future analysis in this framework.

In Japan, the situation was quite different from all other countries. In the late nineteenth century, they considered only 7 degrees, and only towards the end of the twentieth century did they accept 10 degrees, to ease comparisons with all other scales. Their doubts about the usefulness of intensity scales came from the facts that (a) intensity was measured at the JMA office; (b) differences inevitably arise depending on the type of building in which the observer is located; (c) there are individual differences among observers; and (d) sensory seismic intensity cannot be observed in an unmanned place. So they are now looking for a new way of assigning intensity values, through a simple instrumental transducer similar to the Wood-Anderson seismograph (T0 = 2.0 s) (Koketsu 2021). Table 10 presents the evolution of Japan's scales up to 1996 for comparison with the other scales (Kozák and Musson 1997).

Table 10 Comparison among scales in Japan (JMA)

5.3 Recent Initiatives for harmonisation of Scales

There are two new initiatives to harmonise the different scales: the ESI-2007 and the IMS.

The Environmental Seismic Intensity scale (ESI-2007; Michetti 2007) comes from the area of effects caused on the natural environment (environmental or geological impact), such as landslides, liquefaction, fault rupture, area of perception, height of the tsunami, etc. It is suitable for intensities higher than VII–VIII. The geological community launched it under the Archaeo-Seismological Section. It complements all other modern scales, which only look at the effects on man and artificial structures (buildings and infrastructure).

International Macroseismic Scale (IMS): under study. It extends the EMS-98 for global application, especially detailing building typologies. Suitable for large intensities.

The ESI-2007 measures earthquake severity, taking into account effects across the range of frequencies of vibratory motion as well as static deformations.

Figure 54 summarises primary descriptors and their values for the ESI 2007 Scale.

Fig. 54 Graphic representation of the ESI 2007 intensity degrees (after Silva et al. 2008 and Reicherter et al. 2009)

The ESI 2007 intensity scale (Michetti 2007), built only on environmental effects, brings back the spirit of the older scales that considered all impacts. In this way, the ESI 2007 can extend the intensity analysis to older periods (from recent and historic to palaeo-seismic events), much longer than the period of instrumental records, and can cover sparsely populated areas.

In summary, as referred to in the presentation of the ESI 2007, “for intensity levels lower than IX, the main goal of this new scale is to bring the environmental effects in line with the damage indicators. In this range, the ESI 2007 scale should be used along with the other scales. The distribution and size of environmental effects, especially primary tectonic features, become the most diagnostic tool to assess the intensity level between X and XII. Documentary report and field observations on fault rupture length and surface displacement should be consistently implemented in the macroseismic study of past and future earthquakes”.

The ESI 2007 characteristics, suitable geo-markers for a palaeo-seismic assessment, are grouped based on their physical relationship with the tectonic fault associated with the earthquake's origin (Fig. 54). If they are directly related to the fault surface rupture, they are called direct (primary) effects; indicators not directly related to it are considered indirect (secondary) effects. This second class of evidence can be further subdivided into three subclasses: Type A, which encompasses seismically induced effects, namely sediment deformation (liquefaction, mud diapirism), mass movements (including falls), and falling rocks in precarious situations, among others; Type B, consisting of the remobilisation or redeposition of sediments (turbidites, homogenites, etc.) and the transport of rock fragments (displaced blocks); and Type C, involving regional markers of elevation or subsidence (micro-atolls, elevation of terraces, river channels and, in some cases, progressive unconformities). These three subclasses have different underlying actions: in the Type A subclass, the effects are generated by seismic motions; in the Type B subclass, they relate either to water masses set in motion by seismic action (sediments and loose rock blocks/erratic blocks) or to seismic motion itself, generally through the propagation of waves in different materials; the Type C subclass is more directly related to the tectonic deformation itself and may vary depending on the proximity to the fault in question, from local to regional scale. There are still other secondary effects, and a last topic concerning the affected area.

The IMS was another attempt, within the GEM project (Silva et al. 2018), to merge many of the items referred to above and build a single internationally valid macroseismic scale permitting consistent intensity assignments in any country (Spence et al. 2014). This scale would have a core scale, with several modifications to EMS-98, and a building guide; its differences lie essentially in the classes and typologies of construction. Li et al. (2021) compared the application of EMS-98, MSK-81 and the Chinese CSIS-08 scale (Sun et al. 2008) for the Sichuan earthquake, and significant differences were observed among these scales. "However, the differences obtained by seismologists in assessing the intensity using the same scale are larger than those obtained by the same seismologist in assessing the intensity using different scales" (Hu 2006). It is like an intra- and an inter-uncertainty!

5.4 Conversion intensities—ground motion parameters

Most scales present maximum and minimum values of PGA, PGV and PGD for the different intensity values (Fig. 55), from the older nineteenth-century studies to modern ones. We can observe from the figure that, for the same intensity, the maximum ground motion values differ by approximately ten times.

Fig. 55 Conversion of intensities to PGA, PGV and PGD (units: cm, s)

According to the many studies that correlate intensity with ground motion parameters, this level of variation remains valid. To support this, we present data from the late twentieth century in Fig. 56 and data from 2020 in Fig. 57. In all cases, the scatter of 10 times is always present. The testimony of record variability should be recalled here: at two stations 300 m apart, the average at one station was five times the average at the other.

Fig. 56 Data from California (Wald et al. 1999; Ahmadzadeh et al. 2020)

Fig. 57 Variation of local intensity and ground motion: data disaggregated by magnitude, Gómez-Capera et al. (2020)
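One widely used conversion shows what this scatter means in intensity units. A small sketch (Python) using the PGA relation attributed to Wald et al. (1999), with coefficients as commonly quoted for the range MMI V–VIII; treat them as indicative only:

```python
import math

def mmi_from_pga(pga_cms2):
    """Intensity from PGA via the Wald et al. (1999) form
    MMI = 3.66 * log10(PGA) - 1.66, PGA in cm/s^2 (coefficients as
    commonly quoted for MMI V-VIII; indicative only)."""
    return 3.66 * math.log10(pga_cms2) - 1.66

# a factor-of-10 spread in PGA maps to almost four intensity degrees:
for pga in (30.0, 300.0):
    print(f"PGA {pga:5.0f} cm/s^2 -> MMI {mmi_from_pga(pga):.1f}")
```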

Facing this difficulty of large scatter in the conversion of intensity to ground motion parameters, and the need to use these relations to merge macroseismic observations with structural analyses of buildings, we checked whether the ground motion itself contains the scatter observed back in the 1980s with the SMART-1 experiment. Therefore, we looked at instrumental data alone from the Tohoku earthquake, which produced a massive amount of instrumental data. Figure 58 shows the attenuation of PGA and a few spectral values, differentiating the various types of soil profiles. We observe a similar pattern, concluding that the scatter is intrinsic to wave propagation. Consequently, the conversion into intensities only aggravates the situation, and the only possibility to reduce the final scatter is to reduce the uncertainty in the conversion.

Fig. 58
figure 58

Variation of ground motion in Japan 2011 (Stewart et al. 2013)

Intensity has several limitations in representing the wavefield well, as the figures above confirm. Hough et al. (2021) attribute this uncertainty to endogenous as well as exogenous factors: the former because intensity does not consider the frequency of vibration, the actual structural vulnerability of objects, etc.; the latter because of differences in the geographical density of information, biased reports, etc. On the other hand, there is a major advantage in having such a wealth of information: the observed response of the built and natural environment to the wavefield.

Going back to the MSK (1964) scale, we see that the need to consider the “frequency of vibration” as one of the characteristics/categories of the scale was already felt at the time; this, together with the various signs mentioned in EMS-98 but not incorporated then due to lack of confidence in the state of the art, should now be included in the body of the scale. In addition, it would be necessary to consider at least PGA and PGV for converting intensity to strong motion (SM).

Without entering into details, we can see that effects on humans, objects and nature (liquefaction, landslides, underground pipes, water waves, ground cracks, etc.) were always present among the descriptors. EMS-98, on the contrary, discusses these concepts at length but does not include them in the scale.

Our proposal (next Section) follows the same approach as MSK-64. It aligns with the ESI, complementing the intensity information in sparsely populated zones, and is very much linked to a hydro-geologic perspective.

5.5 Pre-requisites for a new proposal

5.5.1 Preliminary point

Codes of practice were essentially developed by engineers, with a little help from seismology. The engineering community pushed the first studies of hazard and strong motion, while seismologists effectively conducted the studies on intensity. By 2020 the situation had somewhat reversed: strong motion is now part of seismological topics, while intensity measurements are taking time to be incorporated into the engineering community. These aspects are “strange” and can only be explained by looking at the entire historical development of both earth sciences and engineering. Both topics require expertise in the two fields and yet, ultimately, earthquake engineering uses strong motion almost exclusively. As for intensities, even though they now consider construction typologies and damage states for the higher intensity degrees, topics essentially treated by the engineering seismology community, they remain in general committed to the seismological entities in charge of data treatment and evaluation.

We need to place these two topics in the hands of both fields of knowledge, so that the passage from intensities to PGA/PGV, and vice versa, may reduce the significant uncertainties involved in these two conversions.

This problem is more critical for the lower intensities, as observed in the Italian data with both intensity data and PGA values (see Sect. 5.7).

For the intensities, we describe effects on humans, nature, housing and other structures. Descriptions include various outputs that are then combined to obtain an intensity value; we try to get a single value from a multi-impact output. Let's analyse the effect on people inside a building. First, we must understand how people react to oscillations inside a building that responds to the input. Depending on the characteristics of the structural response, the amplitude and, to a certain extent, the frequency of oscillation depend on the floor level. Very little is asked about this topic, even in the most sophisticated scales. Yet people and other objects are subjected to different inputs depending on whether they are in a tall or a low building, and on a low or a high floor.

The earthquake problem is like any other environmental issue, which takes decades to enter the political agenda of the countries exposed to extreme events. It is an immediate problem. However, to get there, we first have to solve the inverse problem.

The inverse problem starts with Housner's rocking block, solving archaeo-seismology problems such as tombs in cemeteries, or more complex cases of falling slender objects (chimneys, towers, columns, minarets), more stable ancient bridges and, finally, our houses.
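As a concrete illustration of this Housner-type inverse problem, the sketch below applies the classical static overturning criterion for a rigid block: rocking starts when the horizontal acceleration exceeds g·b/h, with b the half-width and h the half-height. This is only a lower-bound screening value, since Housner (1963) showed that actual overturning also depends on block size and pulse duration; the dimensions used here are hypothetical, not taken from any case in this paper.

```python
G = 9.81  # m/s^2

def rocking_threshold(half_width_b, half_height_h):
    """Static (West's rule) criterion: a free-standing rigid block starts
    rocking when the horizontal acceleration exceeds g*b/h. Overturning
    proper also depends on size and pulse duration (Housner 1963), so
    treat this as a lower-bound screening value only."""
    return G * half_width_b / half_height_h

# hypothetical gravestone: 0.25 m wide and 1.2 m tall
b, h = 0.25 / 2.0, 1.2 / 2.0
print(f"rocking threshold ~ {rocking_threshold(b, h):.1f} m/s^2")  # ~2 m/s^2
```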

Instrumentation helps solve some of our problems, but instrumentation alone does not give us all the solutions. It provides good, accurate elements describing the wavefield and the performance of structures, letting us know how things behave and helping us with the inverse problem. It is essential to relate the intensity scale to strong-motion parameters; we still battle with that. Let's look at the case of the M7.8 offshore earthquake of 28 February 1969, 150 km from continental Portugal (Fig. 11). We did not solve the problem of the source, even though seismological networks recorded the wavefield all over the world. There were a few stations in Portugal and Spain, but all of them were saturated after the P arrivals. For 50 years, this earthquake was almost forgotten by seismology, geotechnics and earthquake engineering alike. Many sources of information are available—photos of the damage and now the answers to a DYFI enquiry made 50 years after the event, with many descriptions of human behaviour. Nevertheless, we cannot yet merge the DYFI data with structural behaviour or damage assessments.

Also, with the new high-rise structures present in large numbers in some urban areas, intensity scales cannot provide accurate information.

5.5.2 Hypothesis

Several categories or items were used in the MSK scale to describe intensity: the effects on each type of structure and the number of buildings suffering damage, with five degrees of damage, D1 to D5. The impact on persons and surroundings was considered for structures of all kinds and for Nature. EMS-98 is very similar, with a few differences. Even though MSK speaks about the concept of “frequency of vibration”, it was never directly introduced into the scale. Yet most of the observations that define an intensity level depend on the coincidence between the input frequencies of the ground motion and the frequency of the object under analysis. For example, lamps suspended from the ceiling only move with large amplitudes if there is resonance between the lamp oscillation and the ceiling motion; otherwise, even for large ceiling amplitudes, the lamp does not move much. The same applies to water waves in recipients, from small aquariums to large swimming pools.

We will show that this frequency effect needs to be contemplated when assigning intensities based on moving objects! That is a reality of rigid-body dynamics.

Considering that:

  • In Seismology, “back-analysis” has been the great paradigm for solving problems with a passive attitude.

  • Even having collected more and more cases to transform empirical data into laws, uncertainties generally do not reduce much, as happens in strong-motion seismology when deriving GMPEs or when trying to correlate intensity measures (IM) with intensity scales.

  • The reality of studies on earthquake impacts is that they are done by separating the whole process into parts, as different compartments of the same phenomenon. Unfortunately, the synthesis is not made later on.

Think about the following scheme:

  • Beyond this, many other variables are related to the event, namely the electromagnetic field, gravity field, acoustic emission field, light emission field, etc.

  • Related fields look into the past to understand the time evolution of the Earth processes that originated earthquakes or energy releases, such as palaeoseismology, archaeo-seismology and historical seismology.

  • Assume that the rate of earthquakes worldwide has been kept almost constant over the centuries: a magnitude 8.5+ event occurs once per millennium at the Euro-Asiatic–Nubia plate contact. If we take the example of the opening of the Atlantic on its eastern side (Azores–Gibraltar) and look at the slip rates proposed by Müller et al. (2008), then in the last 10,000 years:

  • more than five magnitude 8.5+ events (similar to the 1755 event) have taken place and surely left essential marks in the landscape, especially in coastal deposits;

  • some 50 M8 events may also have caused significant changes;

  • and 500 M7 events would have destroyed most of our civilisation.

  • Why can't we relate ground motion parameters to intensity values, or tsunami heights to PGA? Why do GMPEs need to be based on empirical observations with significant embedded uncertainties and inter- and intra-station/event scatter?

  • Why can't we take the paradigm stated in the sketch of Fig. 59 and reverse it, using all the various consequences of the event as inputs for the characterisation of a single entity: the earthquake?

  • But we cannot dissociate the earthquake event from its consequences on the human way of treating the territory, our housing stock, critical structures, and all the other assets important to humankind. Science and technology are also compartmentalised here, and some information is often contradictory.

  • Examples can be observed in areas where patrimonial values are difficult to reconcile with safety measures, in regions of audacious architecture, or where comfort and energy-sustainability measures may reduce seismic vulnerability. In all these cases, a great deal of imagination should be exercised among the various actors intervening in the process (architects, historians, geographers, engineers, material experts, sociologists, economists and politicians) to get the right balance between the parts. In creative architecture, similarly competing interests must be integrated to achieve a balanced outcome.

This proposal follows terms similar to those of the ESI 2007 intensity scale (Audemard et al. 2015).

Fig. 59
figure 59

New Look at Intensity Measures

The way we look at the definition of earthquake effects can thus be reversed. The proposal is slightly different because it combines what will be presented next for the lower intensities with the EIS-2002 for the large intensities (Fig. 59).

5.6 New observations that can be converted into IM

Several new observations can be used in assigning intensities. In a recent paper, Oliveira et al. (2012) used video surveillance and personal video cameras for real-time information on the mechanical performance of structures and their contents during seismic events, as well as for information on the properties of propagating media such as tsunamis, landslides and sloshing water. The recorded images also provide essential clues about human behaviour during shaking. An extensive set of situations obtained from published YouTube videos involving structures, natural settings, human behaviour, etc., is presented in that paper.

Video cameras cannot replace static laboratory tests, shaking-table tests, pseudo-dynamic sub-structure testing, wind tunnels or wave channels. However, information collected over time and well used is of great value, as it shows the real world without the shortcomings introduced by “similarity laws”, “boundary conditions” or “friction and nonlinearity hypotheses”. This information should be collected even if it only serves as “inspiration” to researchers, supporting new ideas that only visualisation can provide. This type of information can be considered a random visual health-monitoring system. We will use several situations described in that paper to illustrate how this data can be used in assigning intensity values through detailed analytical modelling, which produces quantitative information on ground motion parameters. The collection reported in Oliveira et al. (2012) is as follows:

  • Animals react faster than Humans

  • Human eyes can observe the movement of Ground in special situations

  • Geotechnical aspects easily observed

  • Better understand motion inside houses

  • Better understand the behaviour of supermarket shelves and their contents

  • Swinging lamps as a measure of ceiling response

  • Car balance as a measure of pavement motion

  • Non-Structural elements as obstacles to fast evacuation

  • Sloshing waves in swimming pools to determine input motion

  • Relate Frequency-amplitude to index Intensity Measures

  • Back-Analysis of simple structures

  • Tsunami flow velocity can be measured.

We concentrate on a few topics directly related to the intensity scales, namely: the frequency of vibration; resonance as a requirement for significant amplification of motion; the effects of wave passage on humans; the swinging of suspended objects; the oscillation of water in recipients (swimming pools); the shaking of cars and of the people inside them; the displacement of objects from their original position; and the collapse of several types of structures.

The new “video cameras”, together with shaking tables, small- and large-scale laboratory tests of components subjected to complete cycles, and the health monitoring of multiple structures, complement the data needed to understand structural behaviour better.

Post-event studies such as field missions, with photos and films, satellite images and drone coverage, and the participation of citizen observatories, will increase the data available to understand better the wavefield and the performance of the built environment.

Tsunamis can also be observed with similar tools. In this respect, it is essential to mention that until the Sumatra 2004 earthquake the majority of these new “gadgets” were not available; video cameras were used in large numbers for the first time during the Tohoku 2011 earthquake.

In the last five years, people have posted their camera footage on the internet without any censorship, and anyone can see that information on YouTube.

Personal video cameras are essential tools for understanding new aspects of the behaviour of structures and a good incentive for new ideas for future developments, namely the upgrading of the intensity scales.

In the future, the archiving of films will constitute a rich data resource that can be used at any time.

The first example refers to the oscillation of suspended lamps. The frequency of lamps suspended from ceilings has been observed everywhere and captured by video cameras. These lamps oscillate in various forms and can attain large amplitudes if resonance occurs; the resonance frequency gives us an approximate value of the frequency of the structure. In the first case, we consider a simple pendulum (Kapitza pendulum); in another case, we consider compound pendulums (conical pendulums), such as the chandeliers used in older epochs. Figure 60a shows the case of a simple 1-D pendulum (a single degree of freedom, DOF θ) oscillating with a large amplitude [Video 1], and Fig. 60b shows the rotation in 2-D of a compound (“conical”) pendulum with 2 DOF (θ, φ) under an X–Y input at the suspension. In this last case, the motion is more like an “umbrella-type” movement, keeping one of the DOFs (θ) constant during the larger oscillation.

Fig. 60
figure 60

Diagrams of the ceiling PGA that provokes a given angle of oscillation

If there are many similar hanging objects a short distance apart, they tend to oscillate in phase, attaining large angles in a clearly nonlinear fashion and sometimes even touching the ceiling.

For an angle of 50°, we need approximately 1 m/s².
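That figure can be reproduced with a linearised single-degree-of-freedom pendulum whose support is driven harmonically at the pendulum's own frequency, for which the steady-state angle is θ ≈ A/(2ζg). A minimal sketch, assuming an effective damping ratio ζ ≈ 0.06 (our assumption; and 50° is well beyond the linear range, so this is an order-of-magnitude check only):

```python
import numpy as np

G = 9.81  # m/s^2

def support_pga_for_angle(theta_deg, zeta=0.06):
    """Linearised pendulum driven at resonance by support motion:
    steady-state angle theta ~ A/(2*zeta*g), hence the support
    acceleration amplitude A ~ 2*zeta*g*theta. zeta is assumed."""
    return 2.0 * zeta * G * np.radians(theta_deg)

print(f"A ~ {support_pga_for_angle(50):.2f} m/s^2")  # ~1 m/s^2, as in the text
# the resonance frequency itself identifies the ceiling/structure frequency:
# f0 = sqrt(g/L)/(2*pi), e.g. ~0.5 Hz for a 1 m pendulum
```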

Hanging paintings or frames oscillate leaning against the wall, not out of plane. The problem is more complex than the oscillation of lamps because the friction against the wall may be significant and needs to be introduced into the analysis.

A second example is a truck set rocking by the input motion. We can also obtain the PGA needed to topple the vehicle (Fig. 61).

Fig. 61
figure 61

PGA—Frequency relationship of lateral and vertical acceleration to topple the vehicle

5.7 Perception of shaking inside a car

Observers' perception of shaking inside parked and moving cars has been studied through web questionnaires (Sbarra et al. 2021). These situations were compared with outdoor reactions. One of the main findings was the greater sensitivity to shaking of people inside parked cars, due to resonance phenomena of the automobile–observer system [Video 2]. Knowing the typical frequencies of the vehicle–observer system, it is possible, using back analysis, to determine the shaking intensity (PGA) capable of causing that feeling, just as with the oscillating chandeliers or the water spilling from swimming pools.

The slippage of an object lying on the floor arises when the input motion provokes an inertial force that surpasses the friction force keeping the two surfaces in contact (Fig. 62). To be more accurate, further studies should be pursued.
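The threshold itself follows from Coulomb friction: sliding starts when the horizontal acceleration exceeds μg, neglecting rocking and the vertical component of motion. A minimal sketch with assumed friction coefficients, showing how an object seen sliding in a video bounds the PGA from below:

```python
G = 9.81  # m/s^2

def sliding_threshold(mu):
    """Coulomb friction: a rigid object on a level floor starts sliding
    when the horizontal acceleration exceeds mu*g (no rocking and no
    vertical acceleration assumed)."""
    return mu * G

for mu in (0.2, 0.3, 0.5):  # assumed, typical-order friction coefficients
    print(f"mu = {mu}: slides above ~ {sliding_threshold(mu):.1f} m/s^2")
```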

Fig. 62
figure 62

Displacement of simple objects with friction contact subjected to a ground motion defined by a RespSpectra

The fall of shelves, well illustrated in [Video 3], occurs for accelerations above 0.15 g (Candeias et al. 2019; Rupakhety et al. 2019). For these PGAs, people stop and pull over against a wall until the shaking vanishes; it is common to see a crowd leaning against a wall, seeking a handrail for safety. For larger PGAs, they may fall or be thrown to the floor.

A similar result was obtained from a detailed analysis of objects displaced during the 1969 Banja Luka earthquakes in former Yugoslavia (Manić et al. 2016) (Fig. 63).

Fig. 63
figure 63

Assignment of intensity from motion observed during the 1969 Banja Luka earthquakes

Moving now to the oscillation of water in recipients, we show how these movements can inform us about the input ground motion. Swimming pools are excellent recipients to function as single-DOF oscillators. They can be located on the ground or on rooftops. In Fig. 64 we see water being thrown out of a swimming pool at the top of a high-rise building in Manila. From [Video 4], we can check the frequency of the structure oscillating in resonance with the water in the swimming pool.

Fig. 64
figure 64

Water sloshing in swimming pools—a video-camera [Video 4—“ABC Channel 7”]

A simple analytical model calibrated with experimental data allowed us to produce a graph (Fig. 65) relating the recipient's size with the oscillation frequency and the water height for a given seismic input motion.

Fig. 65
figure 65

Frequency and water oscillation for different recipients’ sizes and height wave for a given input motion
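The frequency axis of such a graph can be approximated in closed form: linear wave theory gives the first sloshing mode of a rectangular basin of length L and water depth h as ω² = (πg/L)·tanh(πh/L). A minimal sketch, not the calibrated model behind Fig. 65, and with hypothetical pool dimensions:

```python
import numpy as np

G = 9.81  # m/s^2

def sloshing_frequency(length_m, depth_m):
    """First antisymmetric sloshing mode of a rectangular basin
    (linear wave theory): omega^2 = (pi*g/L) * tanh(pi*h/L)."""
    omega = np.sqrt(np.pi * G / length_m * np.tanh(np.pi * depth_m / length_m))
    return omega / (2.0 * np.pi)

# hypothetical rooftop pool, 10 m long with 2 m of water
print(f"f1 ~ {sloshing_frequency(10.0, 2.0):.2f} Hz")  # ~0.2 Hz, period ~5 s
```

A pool whose sloshing period matches the fundamental period of a high-rise building is precisely the resonant situation seen in the Manila footage.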

All these examples clearly illustrate that it is possible to recover several input ground motion parameters from the observed response of objects to that motion. However, such precise information only appears when resonance takes place; otherwise, the motion amplitude is relatively small and does not allow us to quantify the input motion.

The examples shown above recur in many earthquake effects described by ordinary citizens.


Box—Observations made in the nineteenth century that are replicated nowadays.

- An interesting example comes already from Milne (1898): “An eyewitness reports that, in a water tank of 24 × 8 × 7.5 m, the water rose 1.2 m on one side and 0.3 m on the other”.

- Omori refers to gravestones that fell toward the west, while others on the other side of a plain fell in the opposite direction.

- Movement reached 9″ to 12″ (inferred from the width of fissures in walls and of cracks formed in the soil of open plains).

- Wave periods (T) may vary from 1.5 s down to \(\frac{1}{5}\) to \(\frac{1}{25}\) s.

For T > 4 s, we do not feel anything, even while observing the swinging of objects like a chandelier.

- Many persons have seen the surfaces of alluvial plains thrown into a series of undulations (Yokohama earthquake, 1891). Waves come rolling down the street, with amplitudes of about 30 cm and wavelengths of 3 to 10 m.

- Observers in Manila describe ripples a few inches in height and a few inches in wavelength (λ). They are challenging to observe, but we feel them under our feet.

Calculations

- Wave velocity: V = 8000 ft/s × 0.30 m ≈ 2400 m/s for distant stations, or 250 m/s in explosions.

- Wavelengths vary from λ = 400 ft × 0.30 m ≈ 120 m up to 16,000 ft × 0.30 m = 4800 m ≈ 4.8 km.

- If T = 2 s and PGA = 3 m/s², then PGD = PGA·(T/2π)² ≈ 30 cm.

 

These numbers confirm theoretically that, for certain situations, it is possible to observe those effects (see Fig. 69, ahead).

In the recent M7.2 earthquake that struck Haiti on August 14, 2021, eyewitnesses described the following situations (Fig. 66).

Fig. 66
figure 66

DYFI enquiry done by EMSC-CSEM (2021)

We can see the similarity of these descriptions to the ones reported above. The frequency of vibration is again behind the answers, as magnitude and distance influence the effects.

Human perception of ground motion is challenging to characterise due to the several factors intervening in the process, such as the location of the person, the position when receiving the motion (standing, sitting, in bed, holding on to some solid object) and, mainly, how prepared people are to take action. The human body, in dynamic terms, is like a 3-DOF system with plenty of controlling assets that define equilibrium. From many video-camera films of seismic shaking, some limits can be assigned to human behaviour. [Video 5] presents the movement of a crowd during the M7.1 Nepal 2015 earthquake, where we see people moving in the direction of shaking recorded by a nearby GPS station (Dusi 2019). Displacements (peak to peak) of around 1.6 m at a frequency of 0.15–0.2 Hz (corresponding to PGA ≈ 1.15 m/s²) moved the crowd altogether from side to side, following the oscillation and almost throwing people to the floor.
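The PGA quoted above follows from treating the observed sway as harmonic motion, a = (2πf)²·d, with d the displacement amplitude (half the peak-to-peak value). A minimal check using the numbers in the text:

```python
import numpy as np

def pga_from_sway(peak_to_peak_m, freq_hz):
    """Harmonic motion: displacement amplitude d = peak-to-peak/2,
    peak acceleration a = (2*pi*f)^2 * d."""
    return (2.0 * np.pi * freq_hz) ** 2 * (peak_to_peak_m / 2.0)

# 1.6 m peak to peak at ~0.19 Hz, as read from the Nepal 2015 video
print(f"PGA ~ {pga_from_sway(1.6, 0.19):.2f} m/s^2")  # ~1.1 m/s^2
```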

Figure 67 shows a tentative sketch of the level of human discomfort as a function of the frequency and amplitude of motion. The dashed dark blue line represents the limit above which people cannot walk on footbridges (ISO 10137:2007). The numbers were collected from several sources (Oliveira et al. 2012), but a solid shaking-table experiment would give a more robust estimate of what is now presented. A short note to say that video cameras confirm the old saying that “dogs feel it before humans” [Video 6]. Amplitude thresholds for the sensitivity of different effects (descriptors: hanging objects, oscillating liquids, overturning of shelves, etc.) described in DYFI enquiries tend to decrease as the period increases, as suggested in the work of Tosi et al. (2021) after analysing hundreds of thousands of questionnaire answers collected in Italy (see Fig. 69 ahead). Davidovic (2016) also presents a similar graph of human perception of earthquake shaking. However, as shown in Fig. 67, the tendency for human perception is slightly different, because the decrease with period reverses around 2 s.

Fig. 67
figure 67

Level of human discomfort for shaking intensities. The larger the circles, the more significant the discomfort

Understanding wave passage on the ground, so as to inform the intensity of shaking, is another complex problem, even though it is a common observation from ancient treatises to present times. From historical earthquakes, it is clear that people attest that the ground surface moves like a travelling wave, in the same fashion as if they were on the ocean. Video cameras filming the Nepal 2015 earthquake confirm that statement in visual terms for the first time. To understand the conditions for possible visibility of the earth's surface ground motion, Fig. 68 gives the premises for that account. Suppose a harmonic travelling wave with PGA = 2 m/s² in an alluvial valley with Vs = 80 m/s and a wavelength λ = 80 m. Figure 68 then points to a wave amplitude (peak to peak) equal to 10 cm, which is well within our vision capabilities.
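A minimal sketch of that visibility estimate, using only the wave equation λ = Vs·T and the harmonic relation d = PGA/ω², with the values quoted above:

```python
import numpy as np

def visible_wave(pga, vs, wavelength):
    """Harmonic travelling wave: period T = wavelength/vs and
    displacement amplitude d = PGA/omega^2; returns (T, peak-to-peak)."""
    T = wavelength / vs
    omega = 2.0 * np.pi / T
    return T, 2.0 * pga / omega**2

T, pp = visible_wave(pga=2.0, vs=80.0, wavelength=80.0)  # values from the text
print(f"T = {T:.1f} s, peak-to-peak ~ {pp * 100:.0f} cm")  # ~10 cm: visible
```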

Fig. 68
figure 68

Ground motion visual account following wave equation (λ = cT)

Fig. 69
figure 69

Percentage of chimneys damaged and collapsed in near-surface rupturing earthquakes (Dowrick et al. 2008)

5.8 New additions to the EMS-98 Scale

Dowrick et al. (2008) assigned intensities from damage to brittle domestic chimneys (Fig. 69), the stopping of clocks, the disturbance of liquids and damage to sanitary fittings, including landslides, incipient landsliding (cracks and rents in slopes and ridges), lateral spreading of natural and made ground, subsidence of fills and embankments, and various liquefaction effects. After looking at the damaged and fallen chimneys in several New Zealand earthquakes, the statistics follow a regular pattern, and the correlation between MW and I0 is quite good.

Sbarra et al. (2020), based on an extensive compilation of DYFI data from Italian earthquakes over the past 50 years, suggest a few corrections to the EMS-98 scale, precisely where the scale's promoters vacillated on certain items. Figure 70 shows a clear tendency for the descriptors mentioned in the hierarchy to play a role in assigning a degree with reduced uncertainty, and it is impressive to see that humans can adapt very well to shaking, especially up to intensity V or, in terms of acceleration, up to 5 m/s² for a frequency band above 3 Hz.

Fig. 70
figure 70

Perception of shaking by different “descriptors” (Sbarra et al. 2020)

Marreiros et al. (2021) suggest a slight correction due to the location of witnesses in the building, as observed in many cases in a DYFI enquiry about the 1969 M7.8 SW Portugal earthquake. Nowadays, it is well known that the higher you are in a building, the higher the amplification of shaking you feel.

  • Tavares et al. (2021) reviewed effects on river water flow and concluded that it is possible to correlate intensity with the effects on rivers [Video 7]. The phenomenon of “sloshing” is very evident in video-camera footage taken at various locations in Mexico City during the earthquake of September 19, 2017 (M8.1). As is clear from the YouTube images (AON 2020), in a canal where several tourist boats sailed recreationally, the vibration imposed at the base rippled the water significantly, rocking the boats, with frequencies of the order of 2–3 Hz and amplitudes of at least half a metre; the wavelength of the ripple in the canal is certainly about 2 m. In the light of such video-camera images, the wave patterns shown in engravings of Lisbon during the 1755 earthquake, traditionally attributed to the tsunami, must be nothing more than local vibrations of the type represented by the “sloshing” phenomenon.

  • Signalert.net (2021) is an application for citizens to report the impact of various natural events, including earthquakes and tsunamis. It can be installed on an Android device and, for earthquakes, consists of filling in a questionnaire with several aspects (noise, duration, oscillations, etc., ten in total), each with 3–4 grades.

In summary, a few new additions can be made to the application of intensity scales (EMS-98) for intensities up to VI–VII:


Box—New additions to introduce into the intensity scales (I).

•Introduction of the fundamental variable: frequency of vibration (at the level of typology, code provision, damage state, frequency of occurrence: many, a few, etc.).

•Height of the building & location of the witness in the building.

•Angle of rotation of suspended lamps.

•Rocking angle of parked cars.

•Perception of people inside cars.

•Water spilt from recipients, including swimming pools.

•Water spilt from rivers.

•Human feeling.

•Acoustic noise before shaking; etc. (electromagnetic fields, gravitational waves).

 

Regarding intensities larger than VII, the strong effects on nature, the damage to structures and, possibly, the merging of scales should be considered: tsunami, landslides, secondary effects due to surface faulting, etc. Why throw away pieces of information that can fill in the many puzzles of the seismic wavefield?

Starting with the last point, we can say that there are already studies correlating tsunami amplitude with the magnitude of the “parent” event (Fig. 71b), and the scatter is not too high.

Fig. 71
figure 71

a Correlation between the logarithm of the average of the maximum tsunami amplitudes in Japan (cm) and the Mw of the parent earthquake of Chilean origin. b Correlation between Magnitude and Tsunami Intensity

Fig. 72
figure 72

Risk Matrix for actions to be taken


Box—New additions to introduce into the intensity scales (II).

For intensities larger than VI–VII, we have to include the damage to the built environment in urban areas and the geological hazards in urban and non-urban scenarios. As far as damage to structures is concerned, the following aspects should be considered:

•Briefing the personnel responsible for assigning intensities above VI on the main principles of structural dynamics (a structural engineer should be part of the team).

•Consider the Type of Structures adapted to the region under study and the Degree of Damage (D1 to D5).

•Consider the Type of structure and Non-structural Elements.

•Do not rely on DYFI enquiries; they can only be made much later.

•Use video cameras whenever possible.

 

All the topics above are good ingredients for bringing citizen science into the estimation of “input” ground motion parameters and, consequently, into suggesting an intensity degree. Other, more difficult markers are emotional characteristics, like “fear”, “panic”, etc. However, animal behaviour can also be used as a good marker and a good detector of P-waves [Video 6].

From the analysis above, there is no excuse to disregard the influence of “frequency”. This is the most critical parameter under discussion, since all cases of resonance cause significant responses. Fortunately, health monitoring and the numerous campaigns for determining the frequencies of the most varied types of structures allow us to estimate the response frequencies of man-made and natural objects, even though the amplitude of motion may change frequency values due to nonlinear behaviour. In the last 30 years, we have published empirical laws estimating, through in-situ testing and analytical developments, the frequencies of housing, water tanks, bridges and footbridges, and monuments, including Ottoman minarets. Many other authors are doing the same, confirming the trends published in the literature!
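As an example of such frequency laws, a common rule of thumb for mid-rise frame buildings (a generic rule, not one of the relations published by the authors) takes the fundamental period as T ≈ 0.1·N seconds for N storeys, i.e. f ≈ 10/N Hz. A minimal sketch under that assumption:

```python
def building_frequency_hz(storeys):
    """Rule-of-thumb fundamental frequency for mid-rise frame buildings:
    T ~ 0.1*N s, i.e. f ~ 10/N Hz. In-situ campaigns such as those cited
    in the text refine laws of this kind per typology."""
    return 10.0 / storeys

for n in (2, 5, 10, 20):
    print(f"{n:2d} storeys: f ~ {building_frequency_hz(n):.1f} Hz")
```

Matching such structure frequencies against the dominant frequency of the input motion is what decides whether the resonant effects described above are triggered.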

These points are anticipated by Grünthal (1998) in the guidelines and background material of the latest version of the EMS scale, where “future applications or future needs might be the basis for further improvements” of a macroseismic scale.

5.9 Points to retain

Seismology advanced significantly at the turn of the nineteenth to the twentieth century with new instrumentation capable of measuring ground motion, especially from faraway sources.

Intensity scales were the next significant advancement in quantifying the effect of an earthquake. The concept has not only survived for more than two centuries but has lately been revitalised, and it is still an essential indicator of the severity of earthquake motion at a given locality. Various upgrades have been made over the last centuries. We still face great uncertainty nowadays, due to the chaotic nature of ground motion and to the predominant construction present at a given location.

The introduction of robust strong-motion instrumentation made the problem even more challenging. Several scales are still in use: the Japanese scale, the Modified Mercalli, the MSK and the EMS-98, the latter being a more consistent and modern scale. However, EMS-98 was written based on the MSK scale, developed mainly for eastern European housing. It was a great revolution at its launch, but it reveals a few drawbacks that need consideration nowadays. Intensity scales were also developed for tsunami effects on coastal areas, but not for ships in the ocean.

The correlation between intensity scales and strong-motion parameters is still controversial due to the need to reduce complex systems, such as buildings and seismic ruptures, to simple parameters. Considering that a single spectral shape can characterise earthquake motion and that the stock of buildings can be typified in a few classes, some new parameters have been proposed to reduce the uncertainties in the conversion of intensity to PGA, PGV or other ground motion intensity measures. Several recent papers stress that it is possible to reduce the uncertainties in the definition of scale degrees, and we have called attention to them.

The last issue in this topic concerns the possibility of archiving pictures of the damage and of starting to archive videos obtained from surveillance or personal mobile cameras, which will increasingly be part of our description of the wavefield and of other related natural disasters.

Videos of all kinds have started being uploaded to YouTube, some of which are “five-star” information on how buildings and people were affected during the shaking. Many different topics can be derived from this information, and sometimes quantitative data on the vibration can be obtained. Such videos are complementary to shaking-table tests, although they do not offer an accurate way of measuring the motion. On the other hand, what we see in video cameras adds one more point of interest to the upgrading of EMS-98, which relates the degrees of damage (D0–D5) to intensities, as is done nowadays in all simulations for impact studies. The advancement of all these studies relies on fragility curves, which can now be reversed and placed at the service of an EMS-2021.

6 Changes in paradigm in Seismology and Earthquake Engineering (SEE) and new lines for future developments

Summary

  • EEWS, TEWS

  • Low-cost MEMS

  • Health Monitoring. Rapid analysis of effects

  • Citizen Science and back analyses

  • Field Missions

  • Consolidation in Performance-Based Design (PBD)

  • The new AI and ML techniques

  • New Advancements in:

    • Prediction of occurrences, GPS, Satellite, and other physical earth characteristics

    • Engineering Seismology

    • Soil influence

    • Rotational Seismology

    • P-waves on ships. Praise to Nick Ambraseys

    • Occurrence models with time and space memory; Hazard modelling, etc.

    • Effect of near-source in the shape of response-spectrum (fling)

    • Earthquake Engineering

    • Material science. Non-linear Modelling

    • Base isolation and other shake absorbers

    • Non-structural elements

    • Systemic analyses (Inter and intra-dependences)

    • New and old construction

    • Monumental construction

    • Preparedness- Logistics of Disaster intervention: Operational Research contribution

6.1 Preliminary aspects

Seismology has advanced tremendously in the last 50 years with the deployment of networked instrument stations and numerical techniques, allowing the determination with great accuracy of the main parameters of the earthquake source and propagation, and of the characteristics of the Earth's layers. Strong-motion seismology complemented this information with an understanding of wave propagation in the near field, where earthquakes can cause more damage.

Unfortunately, the prediction of occurrence in the short term has never taken the essential steps towards practical value for society.

On the other hand, earthquake engineering benefitted enormously from many scientific and technological advancements, many of them triggered by the various disasters that killed many people and destroyed the social fabric, even in the most developed countries, where direct losses were sometimes well above 50% of the GNP of the affected region.

Ground records helped define ground motion levels, and many examples of experimental-analytical tests were essential to understanding the behaviour of several hundred “case studies”.

The results of multiple types of studies, including shaking-table tests, pseudo-dynamic analyses, health monitoring, etc., and linear and nonlinear analyses including the soil influence, were transferred to the most sophisticated codes, where good practice needs to be merged with elaborate provisions. New materials, lighter and stronger than traditional ones, were developed. Seismic protection using base isolation and dampers was implemented.

As mentioned before, in the last few years a new tool has emerged within the “citizen observations” category: the video-camera footage posted on the internet by individuals or entities. We will show how this information can revolutionise the perception of earthquake and tsunami phenomena.

All the data displayed contribute to nothing more than solving the inverse problem; in other words, from the effects produced and from the main properties that led to the actual instrumental and behavioural observations, we have to find what the input was.

The same can be said about tsunami science and the mitigation of tsunami effects, which only developed after the Sumatra 2004 and Tohoku 2011 earthquakes. Before 2000, very few experts analysed tsunamis, the subject was seldom taught in school, and no programmes were considered to mitigate tsunami effects.

6.2 EEWS and TEWS

For a few years now, it has been possible to solve one of the direct problems of seismology and contribute to the mitigation of the damage inflicted on people and property, helping many stakeholders of critical structures.

We are talking about what is known as “Earthquake Early Warning” (EEW/EEWS), which uses science, state-of-the-art monitoring and innovative delivery methods to alert people before the arrival of the strongest shaking. Seconds to tens of seconds of warning can provide the opportunity to take life-saving actions such as “Drop, Cover, and Hold” and to trigger “control devices” into a safe mode.

The closer to the epicentre, the shorter the alert (lead) time for individuals or stakeholders; the further away they are, the longer the alert time they receive. A front instrument close to the epicentre will enlarge the alert time.

As important as any of the greatest conquests of humankind, this remarkable development has suddenly changed an inverse problem into a direct one, giving seismology, for the first time, a direct implication for earthquake mitigation. It results from the fact that the first few seconds of the P-wave arrival contain enough information on the epicentral location and magnitude of the event; it is then possible to tell how long to wait until the intense shaking arrives.

Many research teams and commercial enterprises are rapidly advancing to solve the remaining problems, especially the avoidance of false alarms. Within a few years, this new technology will be “big business”, with moderate costs and excellent reliability. Individual citizens will be able to buy their own equipment, and customised networks will provide these services, much as most seismological networks nowadays provide accurate information on magnitude and epicentral location.

The first country to launch EEWS programmes was Japan. Many EEWS are now in operation in several countries around the world, namely: the Seismic Alert System (SAS, SASMEX) in Mexico City (Espinosa-Aranda et al. 2009), UrEDAS and Compact UrEDAS in Japan (Nakamura et al. 2011), the EEW system in Taiwan (Hsiao et al. 2009) and the Istanbul Earthquake Rapid Response and Early Warning System (IERREEWS) (Alcik et al. 2009). There are similar systems applied to specific installations, such as the Basarab Bridge in Bucharest, the George Massey Tunnel in Vancouver, British Columbia, BART (Bay Area Rapid Transit, under San Francisco Bay, California) and several applications in Italy, Switzerland, Turkey and Greece (schools in Italy, long-span bridges in Istanbul, nuclear power stations in Switzerland, etc.). California decided to set up an EEWS as a public service, as had happened in Japan a few years earlier.

The Concept is supported by the evidence that:

  • S-waves and surface waves are generally the strongest part of the wavefield that can cause larger damage.

  • Considering the usual values of about 8 km/s for the P-wave velocity and about 4 km/s for S-waves (the P-wave propagates at a velocity 1.77–1.87 times that of the S-wave), this leads to (tS − tP) ≈ D/8 (D in km; tS − tP in s).

If the epicentre is only 75 km away from the target to be protected, an instrument located near the target gives about 10 s of warning. Table 11 illustrates the relevant distances and lead times for judging the opportunity of using an EEWS.

Table 11 Approximate estimates of relevant distances and times for earthquake early warning applications (shaking) (Allen and Melgar 2019)
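The arithmetic behind such tables is simple: the lead time is roughly the S-minus-P travel-time difference minus the processing delay. A minimal sketch, with the wave speeds from above and an illustrative 3 s processing delay (our assumption):

```python
def lead_time_s(epicentral_km, vp=8.0, vs=4.0, processing_s=3.0):
    """Warning time ~ S arrival minus P arrival minus processing delay.
    With vp ~ 8 km/s and vs ~ 4 km/s the travel-time difference is D/8;
    the 3 s processing delay is an illustrative assumption."""
    ts_tp = epicentral_km * (1.0 / vs - 1.0 / vp)  # = D/8 for these speeds
    return max(0.0, ts_tp - processing_s)

for d in (25, 75, 150, 300):
    print(f"D = {d:3d} km: lead time ~ {lead_time_s(d):.0f} s")
```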

Figure 72a shows a typical decision matrix with the actions the population may take when an EEW alert is received, and Fig. 72b shows alert levels as a function of distance and expected damage.

EEW systems give larger lead times in regions away from the epicentre, where the ground motion is more attenuated. But when a fault ruptures towards the region to be protected, the larger lead times occur together with large amplitudes (Minson et al. 2018).

Significant advances have been made in EEWS in recent years, owing not least to the realisation that the first few seconds of the incoming waves at a station can be used to identify the type of shaking that follows: these first seconds carry the “signature of the earthquake”. Based on this information, it is possible to determine the epicentre and estimate the magnitude of the event almost instantaneously. Using previously known attenuation curves, it is then possible to evaluate the ground motion characteristics at any point around the source and the time for the S-waves to arrive. Much research has been done recently in these fields because of its implications for the impact of earthquakes. EEWS initially required high-dynamic-range accelerometers and, if possible, continuous GPS (to obtain displacements without the filtering corrections of acceleration traces); they rapidly evolved towards techniques using AI to treat vast data sets, and continue developing towards low-cost instrumentation deployed in great numbers around the region to be protected.

Early warning for the arrival of tsunami waves (TEWS) is another outstanding achievement of contemporary science, and many lives can be saved among the populations living in coastal areas. Warnings of ten minutes to hours before the wave arrival can be obtained, allowing a safe evacuation of the population to higher locations free of inundation and intense flooding. But all early warning technology (seismic, tsunami, floods, hurricanes, etc.) can only succeed if the authorities are aware of the existing problems, uncertainties and solutions, and if the population knows how to deal with them.

Looking back at the scientific advancements in this field, EEWS and TEWS already stand among the great discoveries of the twenty-first century.

The sudden change from inverse to direct problems in seismology and the mitigation of earthquake losses can be summarised as follows:

  1. In 2000, Earthquake Early Warning and health monitoring changed the problem from inverse to direct. It was a major invention of the twenty-first century, as magnificent as the discovery of electricity, particle physics and the double-helix structure of DNA (the genome).

  2. With Operational Earthquake Forecasting (OEF), real-time assessment for mitigation actions is becoming part of everyday life.

  3. Science has advanced enormously in this direction, and in many seismic zones these systems are already working reliably.

  4. There are still many problems on the scientific side, as well as political and jurisdictional ones, and in finding the most adequate ways to communicate with the populations.

  5. Submarine cables acting as “smart instruments” are a hope for the near future, increasing the accuracy and lead time for near-ocean seismically active zones. This technology, based on submarine optical fibre telecom cables, can be of great success: the number of sensor positions can increase significantly, covering zones where only OBS could be used, and transmitting information in real time. Today there are a few different technologies for effective earthquake and tsunami monitoring capabilities:

  1. SMART, Science Monitoring and Reliable Telecommunications;

  2. DAS, Distributed Acoustic Sensing; and

  3. LI, Laser Interferometry, or PEM, Photonics for Earthquake Monitoring.

The EEWS lead time obtainable with submarine optical fibre cables for Sines, an industrial complex in Portugal, is significant (Fig. 73).

Fig. 73
figure 73

EEWS lead times for Sines (after Carrilho, Personal Communication, 2021), considering only the existing stations inland. The yellow lines (submarine cables) and the red triangles (OBS) are hypothetical stations to be installed near the most active areas of seismicity (blue circle: most active zone)

However, caution should be exercised in all these advancements.

Preliminary results of a cost–benefit analysis for the SW of Portugal (Silva 2022) indicate a clear benefit over the next 25 years of using the submarine cables coupled with OBSs deployed in the most active zone of the contact between the Euro-Asiatic and African (Nubia) plates.

An example of the application of DAS technology to seismic measurements can be found in the Canary Islands, monitoring the 2021 La Palma eruption. Sensors installed every 10 m of cable provide information that is difficult to obtain with conventional seismographs (Smart cables, the future of submarine fibre optic cables—Barcelona Cable Landing Station, consulted March 2022: https://barcelonacls.com/news).

But optical cables do not exhaust their importance in submarine use. They can also detect earthquakes inland, below cities, making records like traditional arrays. Savage (2018) analyses the loop beneath Stanford University used to monitor ground movements. At the time, the quality of the recordings was not very good, but boxes managing laser light (acting as sensors) could be spaced as little as 5 m apart. This would cause a revolution in the whole of engineering seismology once the hardware can reduce the noise.

As referred to above, optical fibre has already been used in fibre Bragg grating sensor networks for the health monitoring of engineering structures (Ansari 2009).

6.3 Use of low-cost instrumentation (MEMS)

Micro-electromechanical systems (MEMS) are a process technology used to create tiny integrated devices or systems that combine mechanical and electrical components. They are fabricated using integrated-circuit batch-processing techniques and can range in size from a few micrometres to millimetres (Muller et al. 1991). They can measure acceleration components at a very low cost. A high-density MEMS seismic network generates data in real time, empowering the following applications:

  • Seismic detection (strong motion) for “near” and “far” earthquakes (far being of the order of hundreds of km).

  • Study local events and characterise the structure of the seismogenic zone by performing waveform analysis of nearby small events and ambient noise.

  • Analyse the impact produced by human activity and cultural noise on structures.

  • Predict cumulative and progressive degradation of fragile buildings and monuments exposed to continuous urban tremors, which could cause irreparable damage to human heritage. Urban seismic noise is usually dominated by traffic and industrial activity with peak frequencies below 25 Hz.

  • Generate Shakemaps that civil protection authorities can use for a post-earthquake response, including assessing structural integrity risks in buildings and slopes.

  • Provide the scientific community with new open-access, high-resolution seismic data.

  • Facilitate access to education in seismology, resulting from open access to low-cost technology installed in high schools and integrated into projects and activities.

Their low cost facilitates widespread adoption, enabling the deployment of high-density networks that provide high-resolution observations and a massive amount of data to feed intensive processing techniques such as big data and artificial intelligence. Machine-learning techniques and pattern-matching-based processing, which are much more sensitive than the power detectors used in current seismic systems (D'Alessandro et al. 2017), make these networks especially relevant in the presence of noise and weak signals. The deployment of high-density seismic networks represents an important step in our roadmap towards understanding the functioning of the Earth, including its internal structure and the physical processes that cause earthquakes, while at the same time contributing towards a safer and more sustainable society (Scudero et al. 2018).
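For reference, the “power detector” baseline that such machine-learning pickers are compared against is typically a short-term-average/long-term-average (STA/LTA) trigger. A minimal sketch of that classic detector (window lengths and the trigger threshold are illustrative assumptions):

```python
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
    """Classic STA/LTA power detector: ratio of short-term to long-term
    moving averages of signal power. Operational systems use causal
    recursive averages; 'same'-mode convolution keeps this sketch short."""
    power = np.asarray(trace, dtype=float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(power, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(power, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)

# a trigger is usually declared when the ratio exceeds ~3-5 (site-dependent)
```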

MEMS-based networks are already being developed, such as the prototype network managed by the Idaho National Engineering and Environmental Laboratory (Holland 2003), the Quake-Catcher Network operated by Stanford University (Cochran et al. 2009), the Community Seismic Network managed by the California Institute of Technology (Kohler et al. 2013; Clayton et al. 2011), and the Home Seismometer Network operated by the Japan Meteorological Agency (Horiuchi et al. 2009).

In conclusion, the pros and cons of MEMS applications in seismology can be summarised as follows. Among the pros is the drastically reduced cost: the price of a single sensor is about 5% (or even less) of that of a traditional sensor. The main consequence is the potential for high-density applications such as urban-scale networks, mini-arrays, structural health monitoring of buildings, and on-site Earthquake Early Warning Systems (EEWS). Besides, their light weight makes the sensors easy to carry, install and manoeuvre. Conversely, the main disadvantage is the reduced sensing capability: the noise floor of the best MEMS accelerometers is about −100 dB (Fig. 74). The noise of MEMS accelerometers will likely be reduced in the next generations of sensors, so that part of the seismic background noise can also be resolved.

Another disadvantage is the relatively poor response at low frequencies, which is why MEMS sensors are more suitable for strong-motion seismology. However, some recently developed broadband sensors also offer good performance for weak motion. Future applications could include rotational seismology: the simultaneous measurement of the translational and rotational components of motion would allow a complete characterisation of strong ground motion in the near field of an earthquake. As of now, the limit to developing studies in rotational seismology is the prohibitive cost of implementing a dedicated network; such an issue could be overcome by exploiting specifically designed MEMS accelerometers and gyroscopes.

MEMS technology is developing very fast, and new products and applications are issued continuously to better meet the technical requirements for optimal earthquake monitoring (red target region in Fig. 74).

Fig. 74
figure 74

New generation of MEMS will bring the noise to values similar to broadband stations–see arrow (Scudero et al. 2018)

The international seismology community is focusing on this technology, which is revolutionising how earthquakes are monitored in almost real time. Network efficiency is also improving, making applications such as EEWS consistently more effective and reliable. MEMS need to enlarge their dynamic range by lowering the least significant bit (LSB), providing much lower noise; we need to reduce the noise from −60 to −160 dB. Also, to lower the significant transmission latency (lead time), we can upgrade the data loggers by integrating a low-latency data-packetising function and increase the telemetry bandwidth by substituting fibre lines for the cellular modem links (Peng et al. 2021).

The complex problem of transmitting the data to a central station will be solved at almost no cost if instruments are directly linked to the internet (Oliveira 2021c), by fixing them to the router or to an electricity meter. This new formulation overcomes the disadvantage of mobile phones of not being at a fixed location; additionally, no batteries, power outlets or transmission gadgets are needed.

6.4 Health monitoring: rapid analysis of effects

The new generation of MEMS will contribute to EEWS and health monitoring due to the low cost of good-quality instrumentation. It will allow large numbers of strong-motion instruments to be deployed with great geographic density, enabling much better coverage of the wavefield. In terms of health monitoring, many structures can be monitored at a low cost of instrumentation and data transmission; we can quickly multiply the number of monitored buildings in a city by thousands. The strong-motion and Structural Health Monitoring (SHM) system developed in 2014 in Japan is installed in about 400 buildings, permitting the analysis of building performance as well as of many non-structural components (Ogasawara et al. 2021). It would be exciting to add video cameras to enhance the information on large-displacement movements such as doors, lamps, etc. (Oliveira and Ferreira 2021a).

The paradigm change has already occurred in the US with the deployment of high-density seismic networks capable of recording the propagation of seismic activity at high or low resolution (Heaton 2018). The California Institute of Technology established the Community Seismic Network, an earthquake monitoring system based on a dense array of low-cost acceleration sensors (more than 1000), aiming to produce block-by-block measurements of strong shaking during an earthquake. The University of Southern California's Quake-Catcher Network began rolling out 6000 tiny sensors in the San Francisco Bay Area, part of the densest set of seismic sensors ever devoted to studying earthquakes in real time (see http://csn.caltech.edu/ and https://quakecatcher.net/, both accessed 2021/01/15).

The present perspectives of IT developments are enormous, and we can already talk of instrumentation with reasonably good dynamic performance, whose use will prompt even better quality a few years from now. This is very important given the need for speed in transmitting quality data. As reported in Peng et al. (2020), significant advancements show improvements in lead time of almost 100% and a substantial reduction in uncertainties. MEMS are already being used to compute magnitudes.

On top of that, we shortly need to consider providing customised services from the perspective of the end user, such as triggering interpretable alerts according to probabilistic risk-based estimation optimised for the preferences of a given stakeholder (Cremen and Galasso 2020), or providing an open public (event- or ground-motion-based) service for users to build their own customised applications. This will further improve the efficiency of an EEWS and transform it into a more helpful tool.

Merging data from “free-field” strong-motion stations with data from the “health monitoring” of essential structures is a very fruitful achievement. As Thomas Heaton describes in his academic “blog” (http://heaton.caltech.edu/): “…obtaining recordings of ground motion has been facilitated by the development of crowd-sourced seismic networks. The fact that all smartphones have MEMS accelerometers means that we may one day receive seismic records from millions of cell phones. These records can give us a more detailed picture of the seismic wavefield as it propagates. The most revolutionary aspect of crowdsourced networks is likely to come from building monitoring. Someday in the not-too-distant future, there will be a time when the vibrational history of virtually every building will be recorded for significant earthquakes. An additional benefit of the Community Seismic Network is that it will send many real-time estimates of shaking intensity, enabling a fast understanding of the health of structures right after an event and helping in the post-earthquake measures”.

The evaluation of seismic damage is today almost exclusively based on visual inspection. With the advent of MEMS, which allow the installation of many instruments with permanent data transmission, and the other technologies already discussed, it will become feasible to monitor buildings seismically and analyse their health, comparing building behaviour with the damage reported by citizens.

The introduction of MEMS into the everyday life of seismology requires a great effort in dealing with an enormous amount of data, and this is only possible using AI and ML technologies to extract the essential results from the data.

6.5 Citizen science and back analyses

Citizen science is a new source of information provided by educated people on the behaviour of structures and humans during shaking. The seismic response of simple objects can act as a proxy for seismological instruments: people's behaviour, objects in movement, suspended lamps, etc., behave like ideal single-degree-of-freedom systems, from which we can recover the input motion. DYFI (“Did You Feel It?”) is already implemented in a few international seismological agencies. With a few new corrections, it can be used quite accurately to monitor the earthquake wavefield.

The conveyed information is fascinating and relatively fast (displayed within a few minutes). Good coverage around the epicentre can give a good distribution of shaking, and authorities can use that information for immediate intervention. However, if the magnitude is sufficiently large to cause damage, the internet will probably go down, and a “mushroom pattern” without information will prevail around the critical area. Once this vital information is merged with all other sources, such as responses to enquiries on human behaviour, we are in a much better position to understand human reactions inside and outside buildings.

In this regard, drawing on our experience from conducting two enquiries related to the 40th and 50th anniversaries of earthquakes with significant impact on the population and the housing stock, we would like to make the following commentary after briefly describing each one.

The first was the 28 February 1969 M7.8 earthquake, with epicentre SW of the Portuguese mainland (SW San Vicente) (Fig. 11a), which took place at 3 am. It caused moderate damage in the southern part of Portugal (Algarve), where several poor-quality houses were destroyed in villages located in the transition between two different morphological regions; in larger towns, D3/D4 damage was observed in 1- to 3-storey-high old masonry buildings. It also caused damage in large cities such as Lisbon and Setubal, 300 km away from the epicentre, with collapses of ageing chimneys and ornaments and a few structural failures. The stock of churches was partially affected by the opening of cracks in front walls and bell towers, with a few collapses. The shaking awoke the population, who remember the panic that set in, especially in large towns, where many decided to spend the night in their cars. The vibration was measured in Lisbon by one of the first strong-motion instruments in Europe (PGA around 30 cm/s2). At the time, an intensity of VII MM was attributed to Lisbon, while in Algarve the intensity was VII to VIII in places with many collapses of poor-quality masonry construction. Inverse studies on masonry schools attribute 50 to 80 cm/s2 to these places, depending on the soil conditions.

The back analysis (Estêvão et al. 2021), using the most updated pushover algorithms on an inverted pendulum that suffered a small crack in its single central reinforced-concrete pillar, reverted to an input action of 55 < PGA < 104 cm/s2 (Fig. 75). Again, variations by a factor of up to two are present.

Fig. 75 Back analysis of a RC column cracked at the base

Thus, it is of utmost importance to study the effects of this type of earthquake on existing constructions, namely to estimate the attenuation of large-magnitude earthquakes that could occur at sea, at great epicentral distances. The range of peak accelerations now obtained is essential to understand the propagation and attenuation of the waves of this earthquake; it constitutes another point of the analysis network under study and is in line with the survey carried out for schools. This is another example of how back analysis can resolve unknowns.

The other situation is the 1 January 1980 earthquake in the Azores, near Terceira Island. This M7.2 earthquake occurred about 20 km from nearby rural populated areas. The largest city (Angra do Heroísmo) has an enormous monumental stock of buildings that was partially damaged (D4), leaving many people homeless. The town was rebuilt in five years, by which time most of the stock had been restored, even though the social context changed significantly due to the need to house a population lacking shelter. The earthquake was recorded at an old SMA station 80 km from the epicentre and, even though the record is of inferior quality due to lack of maintenance and the quality of the 70 mm film, it was possible to recover part of it, showing a PGA of 50 cm/s2. From inverse-problem solutions, it was possible to estimate the hypothetical shaking in the zones most affected by the ground motion and fix the PGA around 150 cm/s2 for Angra do Heroísmo, a little below the present code of practice, which assigns that city a value of 175 cm/s2 on type A soil. The assigned IMM intensity at the time was VII.

The citizen enquiries made 50 and 40 years after those events were very successful, with many answers for both. In both cases, we adapted the format currently used for present-day events to include the possibility that respondents were direct or indirect witnesses.

The gathered information is fascinating, having many points in common while showing differences that should be accounted for, not only because the damage caused was different but also because of the level of shaking felt. Using the enquiry, it is possible to get information up to intensity VI, at most VII. When damage is more significant (> D3), citizens cannot provide good information because their houses are no longer habitable: they have to leave their home, workplace or whatever place they were in at the time of the earthquake. Telephone lines may also be down for a couple of hours or more, making information difficult to access (there were no computers at the time). Shaking greater than VI may, by definition, disrupt the office tools that should be ready to work. Moreover, citizens are not concerned with reporting what they felt at such a difficult time. So there is a lack of information for intensities above that value, which was very clear in the answers received for those two events. There was a part of the enquiry where citizens could describe their experience in their own words instead of choosing from pre-established multiple-choice questions. The richness of these accounts is impressive, and they should be analysed in semantic terms, as done recently by Contreras et al. (2022).

As explained in Sect. 5, a re-unification of scales would be welcomed both by citizens and by the specialists in charge of assigning intensities. The proposals identified then and described above regarding citizens fully support the suggestions, going back to the eighties of the twentieth century in California, to classify intensities based on the way artefacts on supermarket shelves would fall.

6.6 Field missions

Field missions have been reported many times throughout this text because they have held our interest for as long as we have had information about earthquakes. Even though various technologies can inform us in the short term of the state of damage in an impacted zone, as reported in several sections of this work (health monitoring, satellite imagery, citizen observatories, video cameras, drones, etc.), there is still a place for field missions today, for several reasons:

  • Many zones around the World are not prepared to use those technologies due to a lack of resources (knowledge, human and financial).

  • A field trip will capture several aspects that virtual information will not be able to track.

  • It will always be a place to meet and discuss, giving a human dimension to the problem.

  • Past centuries have shown that we learn by being in the field, feeling the needs and urgencies of the local people.

  • Check whether locals have fully absorbed the knowledge gained from other events.

  • Teams should be organised with experts with backgrounds in geological and geotechnical aspects, specialisations in Earthquake Engineering, Sociology and Psychology, urban planners, and possibly politicians. And many more!

Dissemination workshops should now complement field missions, as we have seen recently in EERI initiatives. They should be held several times after the event to understand how the recovery is progressing, the needs of the population and the needs for upgrading the repair works, thus participating in risk management.

6.7 Consolidation of performance-based design (PBD)

In the last 100 years, significant developments have been made in the philosophy of safer building construction. Preventing collapse (“Life-saving”) was the primary concern of the first code legislation in the seventies of the last century. Structures were allowed to undergo ductile plastic deformations, which would create significant damage, so buildings could not be used after the earthquake. Recent earthquakes have shown that many facilities complying with code requirements (the prevent-collapse attitude) were no longer functional and were later demolished. As Takagi and Wada (2017) put it, drawing on the mechanical and automotive industries or even the human skeleton, “the earthquake-resistant design philosophy in the previous century should now be revised to meet modern social and economic requirements and sustainable goals”, changing “Life-saving” to “Business continuity”: “Structures should be designed to be quickly restored to full operation with minimal disruption and cost following an earthquake.”

In the same line of thinking, Dusenberry (2019), on advancements in performance-based design (PBD) approaches, states that “While processes are routine in many engineering disciplines, they are unfamiliar to most of the stakeholders in the construction industry. The process demands more of structural engineers, including a better understanding of risk assessment and management. Peer reviews likely will be vital to the validation process. However, performance-based engineering approaches encourage research, development, and innovative engineering processes. The result is the freedom to solve harder problems with better structures.” PBD appears to respond to the significant uncertainties in defining design ground motion, leading to a design less dependent on ground motion levels (Table 12).

Table 12 Fundamental Goals for Seismic Design

However, performance-based design approaches are not needed for most structures. We could easily have dual-code methods for structural design in the future: the design of routine structures could default to prescriptive requirements, with a performance-based option for those interested in exploring its benefits. Ferry Borges (LNEC, 1955) proposed such a concept, with simple rigid measures, in the fifties. Performance-based design processes should, however, become an accepted protocol for complex, high-value, and mission-critical structures (e.g., hospitals, emergency facilities and shelters, high-rise and iconic buildings, etc.).

“The notion of fully elastic behaviour was also ruled out by construction economics and architectural convention. Had codes been changed to require an elastic response in large earthquakes, the necessary structural configurations and member sizes would have severely impacted the building industry. Fairly standard building types would have suddenly become unwieldy and architecturally inefficient, if not prohibitively expensive. In all likelihood, political forces would have resisted such substantial changes, just as the adoption of early earthquake codes had been resisted (Olson 2003), as many code changes still are nowadays” (in SEAOC Blue Book 2019).

Various analytical approaches to performance-based earthquake engineering are in development, even though most advanced codes have already taken care of these approaches.

Porter (1995), after SEAOC (1995), summarises the approach pursued by the Pacific Earthquake Engineering Research (PEER) Center. It works in four stages: hazard analysis, structural analysis, damage analysis, and loss analysis. The hazard analysis (McGuire 2008) evaluates the seismic hazard at the facility site, producing sample ground-motion time histories whose “intensity measure” (IM) is appropriate to the various hazard levels. In the structural-analysis phase, a nonlinear time-history structural analysis is performed to calculate the facility’s response to a ground motion of a given IM in terms of drifts, accelerations, ground failure, or other “engineering demand parameters” (EDP). In the third, damage-analysis phase, these EDPs are used with component fragility functions to determine damage measures (DM) for the facility components. Finally, given the damage measures, one evaluates repair efforts to determine repair costs, operability, repair duration, and the potential for casualties. These performance measures are called decision variables (DV) since they can inform stakeholder decisions about future performance. Each relationship involves uncertainty and is treated probabilistically: from location and design to IM, IM to EDP, EDP to DM, and DM to DV. Based again on a Risk Matrix, called herein Vision 2000, the various limit states (Performance Levels: Fully Operational, Operational, Life Safe and Near Collapse, or D1 to D4) were confronted with the probability of occurrence, giving rise to recommendations to achieve three strategic objectives (Fig. 76): a Basic Objective, an Essential/Hazardous Objective and a Safety-Critical Objective. In terms of EC-8 (2004), these objectives correspond to levels of importance, of which there are four. They are related to the Performance Level (severity of impact) and Design Level (likelihood of occurrence) (see Fig. 16).

Fig. 76 Vision 2000 recommendations for design (Poland et al. 1995)

In other words, the new way of looking at the meaning of limit states is:

  • Conform to local building codes providing "Life Safety," meaning that the building may collapse eventually but not during the earthquake.

  • Design for repairable structural damage, requiring the evacuation of the building and an acceptable loss of business for a stipulated number of days.

  • Design for repairable non-structural damage, partial or complete evacuation and acceptable loss of business for a stipulated number of days due to repair.

  • Design for repairable structural damage, no evacuation required, and acceptable loss of business for days due to repair.

  • No structural damage, repairable non-structural damage, no evacuation, and acceptable loss of business for a stipulated number of days due to repair.

  • No structural or non-structural damage and no loss of business (excluding damage to tenants' equipment such as file cabinets, bookshelves, furniture, office equipment, etc., if not correctly anchored).

  • Definition of Return Periods

The Return Period (RP) associated with the probability of exceedance of some ground-motion intensity measure (IM) should be considered only once during the design process, commonly associated with a design level or importance criterion. The same concept should apply to all actions linked to the occurrence of the earthquake, say, shaking, tsunami, liquefaction, etc. If we are interested in the scenario approach, the same should again prevail.
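Under the usual Poissonian assumption, the link between the return period and the probability of exceedance over an exposure time is immediate, as the small computation below recalls for the familiar design values.

```python
import math

# Poissonian link between return period (RP) and the probability p of
# exceedance in an exposure time t: p = 1 - exp(-t / RP), hence
# RP = -t / ln(1 - p).

def return_period(p: float, t_years: float) -> float:
    return -t_years / math.log(1.0 - p)

print(return_period(0.10, 50))  # ~475 years  (10% in 50 years)
print(return_period(0.02, 50))  # ~2475 years (2% in 50 years)
```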

6.8 The new AI and ML technologies

Now that we are collecting massive amounts of data, there is no way to interpret them without the new big-data technologies already used in many fields of knowledge. Not much has yet been done in Seismology, Earthquake Engineering, and the Construction Industry; the first courses on this topic are being given in departments outside classical Civil Engineering. Still, these technologies are essential to merge all the concepts referred to in this work. Various sections mention the acquisition of terabytes of information from recording stations, video cameras, pictures, metadata, citizen science, etc. In a state-of-the-art review paper, Xie et al. (2020) present (Fig. 77) a set of AI and ML techniques for solving engineering seismology and structural analysis problems in four topics of Earthquake Engineering. Many more problems can be approached with these big-data tools, namely optimisation, pattern recognition, virtual reality, disaster risk mitigation, etc. A straightforward case of great importance is using ML to determine the construction period of a building based on façade details such as the type, size, number and position of windows, type of finishing, ornaments, etc. Another topic of great interest is identifying EEWS indicators in the shortest time possible.

Fig. 77 A set of AI and ML techniques applied in Earthquake Engineering (after Xie et al. 2020)

In seismology, AI (Jiao and Alavi 2019) can offer a potential solution for fields dealing with massive seismological data, such as identifying seismic responses in noisy data, revealing unseen patterns and features in detected seismic data, and proceeding to seismic analysis.


Box—A curiosity.

In the talks we had on Friday afternoons during my sabbatical at the UC Berkeley Seismographic Station in early 1980, Prof. Bruce Bolt would challenge the faculty on duty and the grad students with the idea of looking at the first seconds of a record and immediately saying, from years of experience, that we were in the presence of an earthquake coming from Pacifica or some other specific seismic zone. He anticipated the introduction of “AI” and “Machine Learning” into seismology. This idea is shared by many people who have spent their lives looking at wave shapes and trying to make some sense of them beyond locating phase arrivals. Now we are in the turmoil of EEWS, which will contribute actively to preserving lives and reducing losses.

Moreover, research challenges and the associated future research needs are discussed, including embracing the next generation of data-sharing and sensor technologies, implementing more advanced ML techniques, and developing physics-guided ML models. Only AI algorithms can manage the terabytes of information recorded daily: the detection of malfunction or conspicuous behaviour in some of the multiple components that constitute an entire system is possible only with powerful algorithms. Applications of AI to sectorial parts have already been made with good results, but the physics of the process needs to be present all the time, since extrapolations outside the learning set are difficult or impossible to make.

6.9 A few other realisations in Seismology and Engineering Seismology

6.9.1 Advancements in Prediction of occurrences, GPS, Satellite, other physical earth characteristics

One of the main goals of Seismology would be the prediction of earthquakes. Before progressing, we should clarify the meaning of prediction, because it can be seen at three different levels: long, intermediate, and short temporal horizons. We already make long-term predictions (decades, centuries), essentially through hazard analysis. In the intermediate term (weeks, months), we may see an increase in probability. But what society expects is prediction in the short term (hours, days), and we cannot do it yet; there is great doubt whether we will ever get there. The following Box considers that prediction may be approached as a multi-task job.


Box—Prediction of earthquakes is a multi-task job that needs:

•High precision GPS.

•Merge multi-task information (physical variables measuring any meaningful changes).

•Check with probabilistic modelling in the near-time interval. Time and space interdependences.

•Might be possible in the near future if good instrumentation can detect anomalies in the general behaviour of a rupturing fault.

•A MegaQuake needs to accumulate considerable energy before rupturing again. Of course, zones with high slip rates take less time to accumulate stress and will generate more frequent earthquakes with lower magnitude values—Local Earthquakes.

•Lower slip rates take much longer to accumulate energy and initiate the phenomenon. And generally, magnitudes are larger.

•Wright (2020, 2021) has developed a multivariable seismic instrument based on frequency modulation (FM) and a two-dimensional (2D) Fourier transform that combines a broadband seismometer with four other variables (electric field, electromagnetic spectrum, short-period resonances and acoustic/seismic strobes). All five sensors detected possible precursors of the seismic events that followed them. It is not claimed to be a means of reliably detecting oncoming earthquakes (certainly not); instead, it is viewed as a research tool for studying the relationships between earthquakes and possible pre-quake phenomena, including correlations among the various pre-quake variables. Much discussion followed this work, some of it considering it impossible ever to predict earthquakes because prediction is against the laws of physics (Bychkov 2020).

Acoustic emission (AE) in fracture processes (Toader et al. 2019) is part of a multidisciplinary network that analyses precursor phenomena (atmospheric aerosols, ions, CO2, radon and clouds in relationship with temperature, humidity, atmospheric pressure, wind speed and direction, variations of the telluric currents, local magnetic field, infrasound, atmospheric electrostatic field, electromagnetic and seismic activity, radio waves propagation, and animal behaviour).

Researchers have developed a global earthquake monitoring system that uses the Global Navigation Satellite System (GNSS) to measure crustal deformation. Techniques such as SAR and InSAR, already part of the Copernicus Land Monitoring Programme in Europe, can obtain the deformation of the Earth’s surface at large scale from remote-sensing images, informing on the proximity of a new event. This information has been used in past events to assess the rate of deformation; in the case of volcanic eruptions it can serve, together with other indicators, as an important sign of the proximity of an eruption. Within seconds, the monitoring system can rapidly assess earthquake magnitude and fault-slip distribution for earthquakes of magnitude 7.0 and above, making it a potentially valuable tool in earthquake and tsunami early warning for these damaging events.

An example of the use of high-rate GPS to estimate magnitude is presented in Fig. 78, where the magnitude obtained traditionally is compared to the displacement values from various GPS stations for multiple types of earthquakes (Mw 6 to 9) (Melgar et al. 2015). Exceptional values of 7 m were recorded for Mw 9 at 100 km from the hypocentre. High-resolution continuous GPS instrumentation is the only tool for obtaining accurate displacements of the wave field: displacements derived from velocity or acceleration transducers always have integration problems, introducing uncertainty through the numerical algorithm. Increasingly, the correct determination of displacements is crucial for numerical analyses of engineering structures that need an excellent estimate of displacements, such as EEWS, base-isolated systems or structures that may exhibit large displacements.

Fig. 78 Scaling of PGD measurements (Melgar et al. 2015)

In the seventies, everybody thought that a cure for cancer and the prediction of earthquakes would be found within 50 years. Up to 2020, there were only a few hopes. But as messenger mRNA is working miracles, and ctDNA (Berkeley Engineer 2021) may be essential to identify cancer cells early, the same expectations can extend to earthquake prediction! These hopes were referred to above through the multi-task analysis of various physical components; the regions showing higher slip rates (Fig. 79) are more prone to earthquake events. However, never forget that the rupture occurrence process is very different from region to region.

Fig. 79 Strain rate exhibiting the regions of higher stress concentration (Kreemer et al. 2014)

6.10 Other vital topics in Engineering Seismology

We now list a set of issues that may be important to address in the future, some already mentioned.

  • Strong Motion Seismology

    o Improvements in instrumentation, data treatment, arrays, and characterisation of records, to improve the simulation of fault rupture, wave travel and site influence.

    o Effect of the near source on the shape of the response spectrum (fling).

    o Wave propagation in buildings, which may affect the spatial distribution of damage. With synchronised sensors along the height, it is possible to check the velocity of the wave. Still, this term is residual when setting the equations of motion in structural dynamics, which consider this velocity as infinite. For tall buildings it may have a small influence, but compared with the periods of the first modes the effect might be negligible.

    o Wave-propagation scattering in urban centres caused by the interaction of many different buildings.

    o Rotational seismology, the object of important theoretical developments since the 1970s based on arrays of stations; unfortunately, instrumentation to measure these components directly only recently came to life (Lee et al. 2009). It is still too early to say how important rotational components are to seismology and earthquake engineering, especially in the near field. For tall buildings, the rotational components around the horizontal axes may increase the overall rotation of the building, essentially in irregular-plan facilities with additional eccentricity.

    o Propagation effects of ground motion (asynchronism), another topic of great importance at the end of the twentieth century, when the first dense strong-motion arrays were launched (Bolt 1979).

    o Spatial-temporal interdependences of earthquake occurrences, which may introduce some further understanding of the complex rupture mechanisms (Rodrigues et al. 2020).

    o The development of maps characterising the various parameters influencing the occurrence and ground motion at a world scale, to visualise the regions where most resources should be concentrated to minimise future impacts of earthquakes. The work done by international agencies in this regard is of the utmost importance. Maps of geological fault slip rates are good examples, as well as the work developed by GEM on the World map of hazards (Pagani et al. 2015) and a first phase completed with the World map of risks (GEM 2021) (Silva et al. 2018). For world hazard maps, prior realisations were made in GSHAP (Giardini et al. 1999), etc.

P-waves on ships—in praise of Nick Ambraseys

Returning to Ambraseys: he looked at a topic that very few researchers have focused on, the effect of P-waves on ships near an epicentre located in the open ocean (Ambraseys 1985). The tsunami effect on vessels (the Rudolph Scale is not appropriate for this situation) is similar to a collision with a mass. The effect inside the boat is similar to the shaking of suspended objects, the displacement of books, or the fall of dishes: a direct incident wave damped by the water layer. This phenomenon is linked to hydrodynamics and has nothing in common with traditional tsunami propagation. It causes enormous pressure, killing fish in the neighbourhood.

This description coincides with the effect of the tsunami wave that followed the M7.8 1969 earthquake SW of Continental Portugal, for which eyewitnesses reported comparable effects on a few ships navigating the epicentral area. Ambraseys tried to compute the energy necessary to create what was reported. No one else has looked into this problem, which may affect a large number of ships travelling in highly seismically active zones, or offshore towers, wind turbines, etc. During the Tohoku earthquake, not a single ship was near the fault rupture, so vessels were essentially affected by the propagating tsunami wave, which made them rotate in a very short time, as measured by GPS instrumentation on board.

A few comments on the implementation of Hazard Analysis:

  • Occurrence models with time and space memory;

  • Hazard modelling, having in mind the rare events that occur with very large Return Periods;

  • Solve the conflicting problem of using one scenario for a particular action of an event and another scenario for the same event (e.g. tsunami and shaking).

    Two points deserve special attention regarding the definition of the ground motion to be used at a site:

  • In case there is no available data for selecting one or more GMPEs compatible with the required magnitude value, it does not make sense to use a logic-tree option composed of GMPEs that have nothing to do with the tectonic environment. Adding several and placing a priori weights on each one is worse than using one GMPE that can reproduce a given instrumental record, even one obtained with a small magnitude value. Bodda et al. (2022) try to solve this problem of estimating weights using a Bayesian approach (see the sketch after this list). In any case, simulation of the fault rupture might be a solution.

  • Intra- and inter-event uncertainties are not yet well perceived, due to the influence of the urban environment and the intrinsic uncertainty in general wave propagation, as became clear in Sect. 5.5.
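The sketch announced above illustrates such data-driven weighting in the spirit of Bodda et al. (2022), though not their implementation: prior branch weights are updated with the likelihood of an observed record under each GMPE. The three “GMPEs”, their sigmas and the record are toy inventions.

```python
import numpy as np
from scipy.stats import norm

# Bayesian updating of logic-tree weights: prior branch weights are
# multiplied by the likelihood of an observed record under each GMPE and
# renormalised. Toy median models (ln PGA in g, M and R in km); sigmas
# and the record are invented.

def gmpe_a(m, r): return -2.0 + 0.60 * m - 1.0 * np.log(r)
def gmpe_b(m, r): return -2.5 + 0.65 * m - 1.0 * np.log(r)
def gmpe_c(m, r): return -1.5 + 0.50 * m - 1.1 * np.log(r)

branches = [(gmpe_a, 0.6), (gmpe_b, 0.7), (gmpe_c, 0.6)]  # (median fn, sigma)
prior = np.array([1 / 3, 1 / 3, 1 / 3])

# One observed record: M 5.5 at 30 km with PGA = 0.08 g
m_obs, r_obs, ln_pga_obs = 5.5, 30.0, np.log(0.08)

likelihood = np.array([norm.pdf(ln_pga_obs, loc=f(m_obs, r_obs), scale=s)
                       for f, s in branches])
posterior = prior * likelihood / np.sum(prior * likelihood)
print(posterior)  # weights shift towards the branch matching the record
```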

Never forget historical events, even if they cannot be assigned with great accuracy. Much work should be done in the chapter on historical seismicity to support this need.

6.11 Modelling in Earthquake Engineering

Materials and non-linear mathematical modelling of structures. Uncertainties

The basic principles of the Theory of Elasticity, stress–strain relations, constitutive laws, equilibrium and partial differential equations, etc. (see "Appendix 2") accompanied the evolution of Mathematics and its theoreticians, from Pythagoras and Euclid onwards, with significant developments in the eighteenth century. FEM; linear and non-linear modelling (material and large displacements); concentrated plasticity; discrete models (3-DEC); THM-carbon; Cosserat materials; the Material Point Method (MPM); Boundary Element Methods; Lagrangean and Eulerian formulations, not to speak of Fluids, Hydrodynamics, Thermodynamics, etc., and their numerical counterparts, only appeared after 1950 and were polished after 2000 with the help of the immense computer power that became available, permitting fine discretisation in time and space.

Lagioia et al. (2016), working with constitutive laws dealing with non-linear behaviour and failure criteria, demonstrated mathematically that classical yield and failure criteria such as Tresca, von Mises, Drucker–Prager, Mohr–Coulomb, Matsuoka–Nakai and Lade–Duncan are all defined by the same equation, enabling the use of an efficient implicit integration algorithm which results in very short machine runtimes even when demanding boundary-value problems are analysed.

  • Great advances were made in this area of knowledge, with sophisticated models, huge computer software, measurements during events, support from experimental testing, etc., but uncertainties are still too large.

Even though enormous advances have been made in modelling elasticity, viscosity and plasticity, as well as non-linearity due to large displacements, in the most varied geometric objects, many problems remain in reaching consensus on the response of a simple building, due to issues of calibrating property values such as modulus of elasticity, friction, or boundary conditions. In a blind test to compare different software packages and their use, a masonry building with three stories (Fig. 80) was presented to 13 research teams (Fig. 81). The building was subjected to a given ground motion. No physical test was made to compare results with an experimental model; the idea was to compare the models among themselves. Results were compared in terms of capacity curves, predicted failure mechanisms compatible with the fulfilment of the limit states of near collapse and damage limitation, and the corresponding minimum values of peak ground acceleration (PGA).

Fig. 80 Masonry model under analysis

Fig. 81 The 13 concurrent models

Parisse et al. (2021) compiled all the results of the analysis (Fig. 82). They concluded that, despite an overall good agreement for damage patterns and collapse mechanisms, the scatter of the predicted capacity curves and critical PGAs is very high.

Fig. 82 Classification of predicted failure modes

Some good news comes from the back analysis of the collapse of a tetrastyle canopy (Oliveira and Lemos 2021b) in the centre of Tripureswor Plaza in Kathmandu, Nepal. The destruction of this monumental structure was captured by films made with two video cameras [Videos 8a, 8b] shooting from almost opposite sides. The structure is a four-column colonnade topped by a cap with a statue in the middle (Fig. 83). The columns are made of concrete with a minimal amount of steel; they are fixed at the bottom and connected at the top to four beams forming a square ring. On top, a thin slab supports a kind of hollow pyramid made of concrete blocks. A nearby GPS station recorded the ground motion, which we input at the foundation of the structure. The results of the analytical model reproduce the response observed by the two video cameras in excellent terms.

Fig. 83 Back-analysis of a collapsed tetrastyle canopy

Among all this good news, there are a few caveats that deserve attention from all of us:

  • Ambraseys, Bilham, and other entities and experts have been calling the attention of society in general, and of decision-makers in particular, to the fact that countries with rare black swans and developing countries in active zones will sooner or later be the object of mega-disasters.

  • PSHA models are essentially constructed using strong shaking recorded in the past four decades from events with M < 7.5. More significant earthquakes are so infrequent that there are too few records to characterise the range of shaking in great earthquakes (e.g., the 1906 San Francisco earthquake). Therefore, PSHA must extrapolate the relationships between ground-motion parameters and earthquake magnitude to large magnitudes.

  • Near-source motions are also not well known and might be a problem.

  • In some cases, projections may underestimate the severity of shaking that buildings in several US West Coast cities are likely to undergo during earthquakes. In some areas of Los Angeles County, like Century City, Culver City, Long Beach or Santa Monica, the new projections nearly double the previous estimates for the type of ground shaking most threatening to a tall building.

  • Codes and quality control. Dignify the Professions linked to construction.

  • Policies for massive retrofitting for non-conformity construction.

  • Education. Public perception.

Only by working on these various fronts will it be possible to reverse the paradigm of the earthquake as a disaster into an earthquake as a sustainable challenge.

Other points of great interest to be further developed can be found in the many publications presented in the various conferences, journals and books referred to in "Appendix 1".

  • Design details and ways to deal with practical cases

  • Special Structures

  • Infrastructure and network interdependencies

  • Structural Dynamics and Earthquake Engineering

  • Structural Vibration Control Using Passive, Semi-Active and Active Control Systems

  • Structural Health Monitoring (SHM)

  • Large scale simulations using advanced methods

  • Reliability and Vulnerability of Structures

  • Non-structural elements

  • Systemic analyses (Inter and intra-dependences—Multi-hazard as discussed by Wang et al. (2020))

  • New and old construction

  • Monumental construction

  • Preparedness: logistics of disaster intervention and the Operational Research contribution (Çoban et al. 2021).

6.12 Base Isolation Systems for reducing the dynamic response of the structure

Although it was only in the late twentieth century that base isolation systems were implemented to protect structures, we saw in Sect. 3.2.3 that the idea is ancient.

In many countries, it is taking time for base-isolated structures to enter the daily life of engineering offices, even for critical facilities that should not shake much during a significant input. A few ideas are referred to below.

  • In ancient times: construction in which the foundation rested on several layers of logs arranged in such a way as to allow horizontal movement;

  • The execution of a trench around the entire building to avoid restrictions on the building's horizontal movement;

  • The design of low, triangular buildings to ensure greater rigidity.

    Nowadays, base-isolated buildings, which behave much better than rigid-base structures (Fig. 84), are recommended in several countries as an excellent fundamental technique to reduce the seismic action transmitted to the upper structure. There are problems associated with the larger displacements that occur between the ground and the structure, making special devices necessary to avoid collisions or other large displacements. Dampers are used in complement to reduce these displacements.

    Tuned mass dampers, based on the anti-resonance principle, are another device applied.

    Magnets at the foundations, supporting structures through a floating electromagnetic field, can in case of shaking be set to operate just before the S-wave onset. Coupled with an EEWS, this system can be used as a “base-isolation” device that drastically reduces the transmission of shaking from the wave field. In an extreme case, the power needed to create the magnetic field can be “extracted” directly from the initial strong motion hitting the foundations.

Fig. 84 Repair costs for various types of structures (Takagi et al. 2021)

Intensities and soil influence

Looking at the evolution of science from the early nineteenth century to now, one earthquake “property” is accepted as essential but has never been well resolved. I am talking about intensity as the best measure for assigning wave-passage effects. As referred to in Sect. 5, the concept of intensity goes back to the eighteenth century and has suffered many changes throughout the decades, but it still contains several drawbacks. Intensity is a fundamental concept that everyone can immediately evaluate and transmit to a central entity for processing, which a few minutes later can inform on the state of damage caused by the event. Citizen Science can give a great push (DYFI) to intensity assignments (Bossu et al. 2017; Wald et al. 2012). Unfortunately, as ground motion is intrinsically very scattered due to a series of characteristics not yet well understood, the intensity measure likewise suffers from this contingency.

But some of the problems facing the assignment of macroseismic intensity points (MIP) are easy to overcome, so the uncertainty in evaluating the level of shaking can be significantly reduced if a few new variables are introduced into the assignment process. Why did the Intensity concept not fade away with the advent of mass instrumentation? On the contrary, it rejuvenated as strongly as ever, and one branch of the scientific community dedicates full attention to the problem. Intensity is a sound measure, more or less subjective, at the hands of every one of us. A second reason for the importance of the Intensity concept is that it deals with the environment where we live our everyday lives, i.e. the stock of buildings where we live, work and spend our leisure. It is therefore close to the artefacts most important to us as human beings.

What matters now is to reduce the uncertainties in the process of assigning intensity and so reduce the uncertainties in the characterisation of the wavefield.

We further need to develop a kind of DNA of our living space, which will be uniquely linked to our environment’s X, Y, and Z.

This line agrees precisely with what is proposed by Di Giulio et al. (2021) for site characterisation, where not only one variable, such as Vs30, but seven parameters are used: “(1) fundamental resonance frequency; (2) shear wave velocity profile; (3) time-averaged shear wave velocity over the first 30 m; (4), (5) depth of both seismological and engineering bed-rock; (6) surface geology; and (7) soil class”. Two more parameters could be added to account for topographic effects: (8) near a ridge and (9) middle of a basin. The new revision of EC-8 will introduce some of these issues.

Di Giulio et al. (2021) calibrate the method using three indices that consider the reliability of the available site indicators, their number and importance, and their consistency, defined through scatter plots for every single pair of indicators.

To illustrate another two of the above points, we selected the analysis of inter- and intra-dependences and the applications of Operations Research.

Interdependences

We mention the phenomenon of inter- and intra-dependences because complex urban areas are more and more interconnected through their various functions. Mota de Sá (2014) presented a Disruption Index that provides a first quantification of this interconnectivity of intra-urban functions. [Video 9] shows a clear example of an intra-dependence situation in an urban area leading to a cascading collapse, starting with the failure of a tie-rod of a retaining wall and ending with an urban landslide in the next street.

The problem of interdependences is very much linked to the need for scientific support for decision-making, which, in final terms, rests with political power. For example, in decisions on reinforcing versus reconstructing a construction of patrimonial value, there are multiple dimensions to take care of, not only the strict cost–benefit analysis but also non-measurable entities. It is becoming common to assign a “score” (or “multiple scores”) to these activities and then use a Risk Matrix of the type shown in Sect. 2 to check for a solution. Perception is also part of the game and, without “bottom-up” and “top-down” social policies, it will be difficult to convince politicians on how to proceed well before the next event strikes.

There are recent texts on applications of Operations Research in Earthquake Engineering and the logistics of managing disaster crises, such as Wang (2020), and on hazard and multi-hazard risk analysis (Wang et al. 2021).

The state-of-the-art paper by Çoban et al. (2021), presenting the applications of Operations Research in Earthquake Engineering to reduce human losses and minimise the social and economic disruption caused by large-scale earthquakes, gives a good summary of the topic. Effective planning and operational decisions need to be made by responsible agencies and institutions across all pre- and post-disaster stages. Operations Research (OR), which encompasses a broad array of quantitative and analytical methods for systematic decision-making, has gathered considerable attention in the disaster operations management literature over the past few decades. Their review highlights and discusses the main lines of research involving OR techniques explicitly applied to earthquake disasters. As part of the review, existing research gaps are identified, and a roadmap is proposed to guide future work and enhance the real-world applicability of OR to earthquake operations management. Emphasis is placed on the need for (i) developing models specifically tailored to earthquake operations management, including the need to contend with cascading effects and secondary disasters caused by aftershocks; (ii) greater stakeholder involvement in problem identification and methodological approach, to enhance realism and the adoption of OR models by practitioners; (iii) more holistic planning frameworks that combine decision-making across multiple disaster stages; (iv) integration of OR methods with objective- and near-real-time information systems, while confronting the problem of missing and incomplete data; (v) more effective use of multi-methodology and interdisciplinary approaches, including behavioural OR and Soft OR techniques as well as seismology and earthquake engineering expertise; and (vi) improved data generation defined at appropriate scales and better probability estimation of earthquake scenarios.

6.13 Points to retain

The manuscript has presented the evolution of Seismology and Earthquake Engineering (SEE) from the 1700s up to the present, giving some insights into the future of the related fields. To complement it, we develop below a few topics to include in future research projects, PhD theses or simply new lines of implementation.

  • The effect of earthquake action (vibratory and tsunamigenic) on ships passing in the near-field. P-waves in water may cause a substantial impact on a floating object.

  • Investigate the effect of strong vibration on OBSs and SMART cables lying on the ocean floor. These “apparatus” are not fixed to the floor and, since they are immersed in water, under strong shaking there might be some decoupling between the apparatus and the moving ocean-bottom floor, introducing the need for corrections to the readings.

  • EEWS and TEWS should be given maximum attention as, for the time being, they are the only proactive mechanisms for anticipating the arrival of the wave field. The uncertainties that still exist should be reduced to acceptable levels. Civil protection authorities should immediately start preparing the regions and communicating with the population, to shorten learning times and overcome the “last mile”.

  • Occurrence models including time–space dependences may introduce additional uncertainties into the final hazard assessments. Very little attention has been given to this topic, which may contribute to OEF (Operational Earthquake Forecasting).

  • Low-cost seismic instrumentation is a by-product of materials and hardware science, triggered by the mobile-phone industry. The lower the noise of this instrumentation, the better it can support massive coverage of engineering-seismology objects and sites, significantly enlarging the applications in health monitoring and rapid damage assessment.

  • Video cameras are new gadgets with great potential to inform research with real-time observations. A project to create a database of such information would be of tremendous value to science. As more and more video cameras are installed, each new earthquake or natural event will yield a large amount of data that should not be lost in the public media.

  • Communication and education are probably the best tools for perception and DRR (Disaster Risk Reduction). The seismology community pays significant attention to the various forms of communication through games (board games, serious games), media, films, and the press, as seen in the topics selected at conferences. Engineering, however, does not pay much attention, except for student competitions with small-scale model constructions or shake-tables for instructing the public. More attention should be allocated to the development of this area of knowledge.

  • Different requirements should be imposed in building codes according to the importance of a structure and the human load at stake. The new generation of Eurocodes may follow these ideas, but more scholarly research on the end-user experience should be developed to support the threshold levels. Structures amenable to simplified models should be exempted from complex structural safety verification.

  • Intensity scales should be analysed to incorporate the maximum engineering and structural mechanics insight, not only the descriptive levels. The same applies to historical seismicity.

  • More attention should be given to coordinating research in small countries to avoid conflicts of interest.

  • Data banks on damage and impact from earthquakes. CATDAT is an excellent example, but other more focused projects could be set up.

  • New technologies for mitigating the risks should be given priority, both in the construction area (base-isolated, dampers, etc.) and in the post-earthquake actions (Drones, Satellite images, etc.).

  • Attention to areas of conflicting interests, such as increased energy efficiency (driven by climate change) that might decrease structural safety. The same applies to preserving patrimonial values while creating structural problems, or to incentivising architectural creativity while causing deficiencies in structural solutions.

  • Machine learning can be used more widely in many different fields of the SEE area. Attention should be given to the treatment of data gathered from the myriad of sensors installed and to be installed. The first steps are ongoing.

7 Final considerations

Summary

Many questions can be asked at the end of this set of topics, discussions and recommendations. In this section, we refer to initiatives by entities, organisations and individuals that keep up significant interest in the lines of future development towards the mitigation of earthquake impacts. We look into think-tankers, implementation actions to mitigate the impact of earthquakes, schools of thought and dissemination.

Good examples are EU projects like TURNkey (2020) (‘Towards more Earthquake-resilient Urban Societies through a Multi-sensor-based Information System enabling Earthquake Forecasting, Early Warning and Rapid Response actions’), which merges all aspects of ‘Decision Support Systems (DSSs) for real-time (OEF and EEW) and near-real-time (NRE) disaster prevention and risk communication’. The final result will be draft maps for the earthquake-warning phases of Operational Earthquake Forecasting (OEF), also called time-dependent hazard assessment, Earthquake Early Warning (EEW), and Rapid Response to Earthquakes (RRE). In addition, graphs were made of estimates at points of interest, such as historical buildings, bridges, hospitals, and First Responders: civil protection and municipal organisations.

7.1 Think-tankers

Looking at modern times, various people and schools of thought are working as “think-tankers” to define a new order for the future. Fortunately, Seismology and the mitigation of earthquake impacts are no exception. Great ideas have been published since 2020, emphasising certain aspects that could be seen as lines of future research. Repeating experiments with other geometric and material properties will surely be needed to understand the behaviour of constructions better. Still, new ideas keep appearing to solve unresolved problems or intrinsic uncertainties that should already have been solved.

The example of the COVID-19 pandemic shows that a new world is in front of our eyes. Still, we have not yet been able to create a post-pandemic era where many sciences converge and contemporary theories of humankind emerge. One hundred years ago, the last significant pandemic caused a higher death toll than the First World War, yet only gave rise to a Second World War, much more sophisticated than the first. As already said in Sect. 6.9.1, the mRNA messenger or the circulating tumour DNA (ctDNA) (Berkeley Engineer 2021) may be a significant breakthrough in attacking problems not only of the SARS-CoV family but also a great hope for the cure of oncologic plagues. The same could apply to other issues, such as earthquake prediction and preparedness.

Some people, actually the majority, think codes will be the solution to the various existing problems (the hard way). Looking to softer skills, other experts believe these will be the basis of success, such as education and the promotion of culture; still others think of a new order to understand humankind and how it works in society (politics). All consider themselves part of the solution to the main existing problems. In the ’80s and ’90s, Frank Press launched the idea of dedicating 1990–2000 as the Decade to think large and to bring science, until then in the hands of a few elites, to everyone. This would recommend a massive campaign for instrumenting houses, sites, and so on. We have billions of terabytes of information on ground motion, tests, and material properties, yet we are still unable to make full sense of them. The same can be said about DYFI. We need to transform those data into pieces of knowledge that help us not commit the same errors as in the past. That is where mature AI, new genetic algorithms, etc., will probably help us solve many of our problems.

EEWS are by far the most exciting achievement of seismology in recent years. The seismological aspects are solved, even though false alarms and missed alerts may still occur, and both lead times and shaking intensity are probably the most difficult characteristics to estimate with a good confidence level. Uncertainties are part of the whole game and are a function of a list of variables spanning from the geophysical environment we intend to protect to the number of resources we are willing to provide. An in-depth cost–benefit analysis has not yet been made but, just looking at the quantitative gains we may obtain, it appears that the benefits widely surpass the losses that may occur. Of course, this statement also depends on the frequency of the events we are considering. Uncertainties in the process may be reduced significantly as more data is assembled and AI techniques come to grips with the process.

However, there are many other problems beyond the strictly seismological advances, leading to the so-called “last mile”. In this case, as Velazquez et al. (2020) pointed out in a review paper on the matter, several aspects need to be addressed in a top-down and bottom-up appraisal.

The system to be implemented needs a thorough discussion of the responsibility and liability for wrong decisions and of the participating actors, from the simple citizen to entities and stakeholders.

Experience has emerged from 20 years of development, upset by many caveats related to unsuccessful results. But the population, in general, lives well with these uncertainties and considers false alarms to be new experiments, useful for training. Nowadays, people may want to know more than what present programs give, such as arrival time, shaking amplitude, and what can be done; this much information should be conveyed. A second problem derives from the fact that no single EEWS can succeed without the explanation, training and active participation of all actors involved.

Another issue is that EEWS could be associated with how buildings perform during shaking. A health monitoring system should be coupled with the EEWS to inform the user of the level of damage that may have been inflicted on the building, critical facility, or network.

On tsunami risk communication and management, see the state of the art by a large team of experts (Rafliana et al. 2022), where gaps and challenges are discussed.

7.2 Among all this good news, a few caveats deserve all our attention


Box—A few caveats to be analysed.

•Rare black swans.

•Extrapolation of ground motion to large return periods from short observation periods.

•Near-source motions are not well known and might be a problem.

•Codes and quality control. Dignify the Professions linked to the construction industry.

•Policies for massive retrofitting for non-conformity construction.

•Education. Public perception.


7.3 Earthquakes as sustainable events

I have left to the final part of this work the problem of future seismology and earthquake engineering developments to change the present paradigm of earthquake risk mitigation into that of a sustainable energy producer. Tremendous kinetic energy characterises the vibration of a building during moderate to strong shaking. Why don't we turn the negativism related to earthquakes into a positive, environmentally friendly attitude? The idea of profiting from the shaking energy by converting it into potential energy is already feasible at bridge supports in the longitudinal direction, by using electric generators instead of dampers.

The movement of a tall building during the Tohoku earthquake is an excellent example of how much a tall building can oscillate (Fig. 85).

Fig. 85 Shinjuku Center Building, Tokyo. The arrow has the same length [Video 10]

A 56-storey building moved 90 cm (peak to peak) at the top level with a period of 5.5 s, as seen in [Video 10] and derived after some trigonometric elaboration. A few km away, a strong-motion station recorded a PGA of 184 cm/s2, which agrees with the video-camera estimates.

Suppose buildings are designed to transform that kinetic energy into potential energy. We could produce an enormous amount of energy, to be stored within a few seconds and released later. How can that be done?

There are several issues involved. First, we have to think about the rate of earthquakes: only places of high seismicity like Japan, Taiwan, Mexico or Chile can be candidates for this initiative; in places of moderate seismicity it might not be attractive. Only a cost–benefit analysis can give a definite answer.

Two additional problems are anticipated: (i) how to transform the kinetic energy into power production, and (ii) how to store the energy produced in such a short amount of time. For the first, we must look at the motion involved and transform it into relative displacement; one idea is a device connecting the tops of buildings oscillating out of phase. But the tendency is for all of them to fall into synchronism, and consequently there is no relative motion to harvest, so the design of buildings has to consider this new paradigm and accommodate this effect. The second problem is how to store a peak of power in a battery or as potential energy, for instance by elevating water or something similar. Two simple steps give an idea of the magnitudes at stake (see the sketch after this list):

To appreciate the orders of magnitude involved:

  1. Compute the translational kinetic energy of a specific building typology with a given number of storeys.

  2. Multiply by the number of buildings in a mega-city. It is a lot of energy, even if the efficiency is low! (A back-of-envelope sketch follows this list.)
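A back-of-envelope sketch of these two steps follows; the effective building mass, building count and conversion efficiency are all illustrative assumptions (only the roof amplitude and period come from the video estimate above).

import math

# Step 1: peak kinetic energy of one oscillating tall building (SHM, Python sketch).
A = 0.45        # roof amplitude, m (from the video estimate above)
T = 5.5         # fundamental period, s
mass = 5.0e7    # effective building mass, kg (~50,000 t, an assumed figure)

v_peak = (2 * math.pi / T) * A
E_one = 0.5 * mass * v_peak ** 2            # ~6.6e6 J per oscillation peak

# Step 2: scale to a mega-city (both numbers are assumptions).
n_buildings = 1000    # tall buildings responding similarly
efficiency = 0.10     # fraction of the kinetic energy actually harvested

E_city = E_one * n_buildings * efficiency
print(f"one building: {E_one / 1e6:.1f} MJ; mega-city harvest: {E_city / 3.6e6:.0f} kWh")

Under these assumptions, the harvest per strong cycle is modest (a few MJ per building, of the order of a few hundred kWh city-wide), but released over seconds it represents megawatts of power, which is precisely the storage challenge identified above.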

EEWS will play a vital role in this scheme because an early announcement of seismic arrivals will be critical for initiating the preparation of the devices and giving the population time to evacuate and get ready.

Seismology and earthquake engineering are thus changing from an inverse problem to a direct one, of much more interest to the general population: saving lives and creating the possibility of sustainability.

A considerable energy flux could be transferred to the grid in a few seconds. Figure 86 illustrates how connecting buildings at the top may be a way to tackle this difficult but stimulating problem.

Fig. 86

Solution for sustainability

A similar mechanism, already proposed for bridges, can be extended to base-isolated structures. Passive "absorbers" for wind loads can also use similar principles to recover elastic energy.

7.4 Implementing actions to mitigate the impact of earthquakes (Table 13)

Table 13 Approaches to risk (from Krimsky and Plough 1988)
  • Risk perception (communication).

  • Code compliance (quality control: new and old construction).

  • Post-earthquake risk management.

  • Sound funding strategy requirements.

The practical actions to mitigate the impacts of future earthquakes constitute an essential part of the success of all these efforts. Shah (2006) called it the “last mile”. Transcribing his thoughts on perception:

“Perception of risk is an important issue. Without society’s understanding of the type and level of risk, it is tough, if not impossible, to develop and implement strategies for earthquake risk reduction. Many developing societies live their daily lives with many different risks. Unless it is clear how earthquake risk fits into their hierarchy of risks, it is tough for them to either ‘get excited’ or do something about that risk. So the first and foremost requirement for a developing society to implement needed risk reduction strategies is to understand the earthquake risk and how it relates to other human-made or natural risks”. At this point, we think that education at all levels might be an excellent policy for future success. Still, it will take time to change the current status quo.

Table 13 shows the differences in how scientists and the general public approach risk, and how failures of communication arise.

One of the most critical prevention measures relates to engineering design features that protect buildings against earthquake forces, usually regulated by national building codes. The aim of most earthquake-related regulations is, first and foremost, to save human life: they ensure structural stability whilst accepting that some property damage may occur. However, the new generation of building codes, such as those in use in the United States, Japan, New Zealand and Europe, also endeavours to reduce property damage. Planning measures, such as selecting appropriate sites to avoid highly exposed areas, are often ignored. And codes without quality control, in developing and developed countries alike, render all efforts towards better seismic performance a fruitless exercise. Communication here is also fundamental.

7.4.1 Communication under uncertainty

Day-to-day life requires making decisions. The topics discussed in this work are always tricky due to uncertainties: announcing a prediction, launching an early warning, certifying the state of performance of a building or a critical structure. The consequences of a false-positive or false-negative communication are tough to deal with, involving two issues: responsibility and liability in the act of communicating. The seismological scientific community was considerably shaken up by the aftermath of the L'Aquila earthquake (2009) (Jordan et al. 2011). It took a decade to resolve the legal proceedings, and many scientific advancements were meanwhile blocked.
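One classical way to reason about false positives and negatives is a cost-loss comparison: alert when the expected loss avoided outweighs the expected cost of a false alarm. The sketch below is purely illustrative; the probability and costs are assumptions, and real liability is far harder to quantify.

# Cost-loss sketch of the alert decision under uncertainty (Python).
# All numbers are illustrative assumptions.
def should_alert(p_event: float, loss_if_missed: float, cost_false_alarm: float) -> bool:
    """Alert when the expected avoided loss exceeds the expected false-alarm cost."""
    return p_event * loss_if_missed > (1 - p_event) * cost_false_alarm

# Even a 2% probability can justify an alert if the potential loss dominates:
print(should_alert(0.02, 1_000_000, 10_000))   # True (20,000 > 9,800)

This logic explains why rational systems tolerate frequent false alarms for rare, high-consequence events; it says nothing, however, about who bears the liability, which is the crux of the communication problem.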

7.4.2 Post-earthquake risk management (Emergency Management)

The time for handling post-earthquake response (rescue) and recovery without prior risk-management knowledge has passed, even for developing countries. Nowadays, a great deal of expertise constitutes an emerging applied science that informs decision-makers of the best policies; improvisation should now be reduced to a minimum.

7.4.3 Funding Requirements

In many societies, volunteer labour has been the solution for helping in devastated areas. But preparedness should be developed beforehand, with adequate funding for all the preparations required.

7.5 International initiatives and Schools of Thought

Several initiatives try to discuss the future of seismology and earthquake engineering.

Some are linked to the organisation of conferences (Fig. 87), where not only are papers presented but round tables discuss various topics; others are initiatives of the United Nations and other large institutions. Increasingly, however, we see small groups discussing the future of earthquake engineering in workshops, blogs, "provocative" papers, state-of-the-art summaries, etc. A summary of some of these initiatives follows.

Fig. 87

Photograph from the 1st WCEE, 1956

7.5.1 International initiatives

  • International and national associations of seismology and earthquake engineering (from the mid-1950s): IAS (1902)–IASPEI and ESC since 1951; WCEE since 1956; EAEE, EERI, AGU

  • Data-banks (Seismology, SM Seismology; Damage pictures; Camera-Archives)

  • IDNDR (1990–2000), the International Decade for Natural Disaster Reduction (Frank Press 1990; Multi-Language Glossary on Natural Disasters 1997).

  • HYOGO (2005–2015): Building the Resilience of Nations and Communities to Disasters. International Strategy for Disaster Reduction, United Nations (Basabe 2013).

  • SENDAI (2015–2030). It aims to substantially reduce disaster risk and losses in lives, livelihoods and health, and in the economic, physical, social, cultural and environmental assets of persons, businesses, communities and countries over the coming years.

SENDAI outlines seven clear targets and four priorities for action to prevent new, and reduce existing, disaster risks: (i) Understanding disaster risk; (ii) Strengthening disaster risk governance to manage disaster risk; (iii) Investing in disaster risk reduction for resilience; and (iv) Enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation and reconstruction (Table 14).

Table 14 Example of the revised Science and Technology Roadmap, with four expected outcomes for Sendai Framework priorities 1 to 4

Several political and individual initiatives have already been launched with this primary objective. Two of them originated at the EU Disaster Risk Management Knowledge Centre (DRMKC), which developed in detail a series of discussions of great interest for mitigating disaster risk:

  • Science for Disaster Risk Management 2017 (EU, Knowing Better and Losing Less) (Poljanšek et al. 2017).

  • Science for Disaster Risk Management 2020 (EU, Acting Today, Protecting Tomorrow) (Valles et al. 2020).

In parallel, the EU developed, within Civil Protection, the Emergency Response Coordination Centre (ERCC) to inform citizens of initiatives, similar to other centres around the World.

7.5.2 Schools of thought

  • groups of individuals

  • University Blogs

  • Individuals of excellent reputation

  • International Associations for Seismology and Earthquake Engineering

  • Insurance, Open Archives for World Data (Seismological, SM, SM in Buildings, GPS, Citizen Information, data on faults, geological downhole, etc.)

  • Need to create a data archive for damage inflicted, structural tests, and video-camera footage (papers already require this type of information)

Within these groups, it is worth mentioning the contributions of several authors, among which we could name the following:

  • Project Cities of the Future (Galasso et al. 2021), led by Carmine Galasso

  • Future human behaviour during earthquakes, led by G. Cimellaro

  • Use of AI and machine learning to help interpret big data (Xie et al. 2020)

  • Soft computing, namely fuzzy and neural networks (Falcone et al. 2020)

  • Contribution of operational research to the management of disaster crises (Çoban et al. 2021), as referred to above

  • IAEE Brainstorming Sessions for Future Directions of Earthquake Engineering (17WCEE)

It is exciting to recognise that the topics discussed in Sect. 6 are aligned with the guidelines presented in a review paper (Freddi et al. 2021) summarising the conclusions of the 2019 London workshop 'Innovations in Earthquake Risk Reduction and Resilience'. The group of experts from both academia and industry that met there identified both cutting-edge 'soft' risk-reduction strategies (e.g., novel modelling frameworks, early warning systems, disaster financing and parametric insurance) and 'hard' ones (e.g., the use of innovative structural devices, sensors, and novel structural systems for new structures and the retrofitting of existing ones), for the enhancement of structural and infrastructural safety and resilience.

G. Cimellaro also dedicates his research to many of the topics mentioned in this work, emphasising the quantification of the resilience of systems. Moreover, he gives great attention to solutions for a better understanding of the sustainable use and resilience of systems, work that often challenges collaborating teams of scientists, social scientists, engineers, lawyers and extension specialists across a broad spectrum of disciplines. His significant contribution has been the quantification of the concept of disaster resilience for analysing critical facilities (e.g. hospitals, military buildings) and utility lifelines (e.g. electric power systems, transportation networks, water systems), and the definition of more complex recovery models that can describe the process over time and the spatial distribution of recovery.

Xie et al. (2020) and Falcone et al. (2020) present literature reviews on the new AI, machine learning and other soft-computing techniques applied in structural analysis and earthquake engineering, as already referred to in Sect. 6.8. We should add that Falcone et al. (2020) cover many applications, namely "mechanical properties of materials, optimal design of structural systems including the optimal retrofit project".

The “IAEE Brainstorming Sessions for Future Directions of Earthquake Engineering” (Calvi 2022), held during the 17WCEE, approved several resolutions endorsing the following ideas: supporting performance-based design for new and old construction; sustaining strong earthquakes with minor damage at little additional cost; paying attention to non-structural elements; using seismic protection to reduce the long-term unit costs of hardware; evaluating mitigation costs against long-term performance; and treating monitoring and inspection as ordinary maintenance. It also stated that changing and implementing public policies takes a long time, in contrast to people's short memories, and that simulated scenarios have effectively influenced human perception and public policies.

Seismology and Earthquake Engineering (SEE) should look ahead a few decades to understand the challenges associated with climate change and its influence on setting long-term mitigation priorities, on horizons even longer than Sendai's. One of the problems under scrutiny is the interaction between construction provisions for energy efficiency (with infill walls) and the increased vulnerability resulting from those non-structural elements.

Moreover, the topics proposed for the forthcoming 3rd European Conference on Earthquake Engineering and Seismology (3ECEES 2022), a joint event of the 17th European Conference on Earthquake Engineering and the 38th General Assembly of the European Seismological Commission, to be held in September 2022, are very much aligned with the issues discussed in this paper.

7.6 Dissemination

There were times when the leading schools of thought initiated collections of books dealing with different aspects of Seismology and Earthquake Engineering. To mention a few: EERI produced, between 1979 and 2004, ten monographs on relevant topics, ranging from dynamics of structures to hazard and risk analysis; the University of Pavia, Italy, followed a similar trend by publishing the results of European Union projects in a collection of books, of which the LESS-LOSS Project in 2007 is an excellent example of the dissemination of results. Many more could be added to this list, not counting the publication of PhD dissertations as reports of research centres. UC Berkeley, Stanford University, Caltech, the University of Tokyo, IUSS Press (Pavia), etc., followed the same principle, rather than producing books as happened in other areas of scientific advancement. The only books on the topic published in the second quarter of the 20th Century were written by the leaders of the field.

In contrast to this earlier scarcity of treatises on fundamental knowledge, during the 21st Century a large number of books with similar titles have sprung from the most varied publishers. A list of the titles found during a long survey is presented in "Appendix 1".

As for journal publications, the increase in paper production is tremendous, not only because the large community related to these topics is publishing more, but also because new journals can publish papers in large numbers.

Proceedings of World, European and National Conferences, held every 3–4 years since the mid-twentieth century, are still alive. Their papers, accessible as digital copies, are still considered good science and reflect in many ways what seismology and earthquake engineering practice looks like around the World. AGU, EGU and IASPEI are international entities that serve as forums for the largest concentrations of experts and young scientists in earth sciences worldwide.

We should give credit to the institutions that organised field missions to various strong events, such as EEFIT (UK), EERI (USA), and AFPS (France), not counting the individual initiatives.

In countries prone to large but rare events, the difficulty of earthquake mitigation has much to do with the "perception of the problem".

Bilham (2009), after presenting the evolution of earthquake impacts over the past 4000 years and the enormous number of houses that must soon be built to accommodate population growth, enumerates quite a few recommendations for reducing the impact of future events. The quality and skill of the workforce, and a good culture among all engaged in the construction process, should stay in the first line towards achieving this goal.

But he also mentions a series of uncertainties and difficulties in controlling studies or implementations, such as relying totally on hazard maps, particularly if they are based on short observation periods that completely miss the "black swans". The attitude of politicians who ignore expert knowledge is another crucial matter, because they never like to take responsibility after a disaster. Ignorance is a major problem in fighting natural disasters; the Covid-19 pandemic is the most recent case reflecting this reality!

There is a tremendous difference between developed and developing countries, between mega and local earthquakes, and world policies should be adapted to this reality.

Epilogue


Box—The evolution of Seismology and Earthquake Engineering (SEE).

Seismology and Earthquake Engineering started as a single science in the 18th Century. Both sought explanations for the strange behaviour of Earth shaking and tried to build according to Newton's laws of gravity. Around 2000, earthquake early warning and health monitoring changed the problem from an inverse one to a direct one. It was the major invention of the 21st Century, as magnificent as the discovery of electricity, particle physics and the double-helix structure of DNA (genome).


Throughout these pages of historical narrative, we have tried to convey the main developments of Seismology and Earthquake Engineering from the mid-1700s until now, emphasising the advancements and the possible ways of understanding the whole wavefield process from source to site and its impact on society.

We started by looking at the initial developments of seismology, when the size and possible origin of earthquakes were interpreted through the type of damage observed in buildings. Recommendations on how to proceed to avoid further damage in future earthquakes were also briefly reviewed.

The latest advancements are significant scientific landmarks, showing that much can be done to reduce the suffering of humanity when major events strike.

In the last four years, a set of state-of-the-art papers has been published framing many of the ideas presented in Sect. 6: on EEWS, on MEMS technology, on the need for a new philosophy of earthquake-resistant design, on the role of operational research, more than one on the applications of AI and other ML technologies, on soft computing, on the future of codes, and on techniques for rapid damage evaluation. State-of-the-art papers are still missing on the last mile of implementing these scientific advancements, on the phasing of resource allocation, etc.

But society and the agents responsible for urban design, construction and the inspection of code compliance need to absorb all those advancements and implement them; the sooner, the better.

The views presented in these pages were biased towards Engineering Seismology, and no detail on code applications was attempted; that is a topic for a future presentation. Essentially, concepts were brought into discussion, along with several proposals here and there, to better understand and improve knowledge of the complexity of the situation.

Despite all the aspects discussed, the twenty-first century is still not free from earthquake damage. Hopefully, though, the fundamental goal of seismic design, to protect human lives and minimise the impact on communities, will advance significantly.