1 Introduction

Landslides are natural hazards of significant impact worldwide (Vervaeck 2013). Depending on slope lithology and morphology, several kinds of landslides exist; see Cruden (1991), Cruden and Varnes (1996), and Hungr et al. (2014).

A variety of physical processes can trigger landslides, such as rainfall, changes in slope hydro-geology, changes in slope geometry, earthquakes, rock/soil weathering, and anthropogenic activities, see www.safeland-fp7.eu. Among other geophysical and geotechnical techniques, geodetic methods can make a fundamental contribution to the knowledge of both the surface topography and the kinematics of landslides, providing data that help geologists, geomorphologists, and geotechnical engineers interpret possible triggering mechanisms of slope and rock failures. In the analysis of landslides (prediction, monitoring and alerting for early warning, monitoring for change detection, etc.), a variety of geodetic techniques can be used, ranging from the best known and most consolidated to innovative ones. Geodetic techniques can provide three-dimensional information on the ground surface with various levels of accuracy and spatial resolution, enabling new ways of investigating landslide phenomena. The choice of method depends on the morphology of the slope, on its extent, and on the typology of the landslide to be characterized or monitored. Each technique presents its own strengths and critical issues for landslide monitoring, depending on the specific characteristics of the site considered.

Traditional surveying instruments like theodolites, global navigation satellite system (GNSS) receivers, and optical levels provide the position (in 3D for theodolites and GNSS, in 1D for optical leveling) of a few single control points with very high precision. Consequently, these instruments are suitable for precise monitoring of a landslide through a limited number of well-defined control points, but they yield neither a complete description of specific areas nor an evaluation of mass movements and mobilized volumes of soil or rock.

Although GNSS receivers and theodolites can also be used for surveying a large number of non-signalized points on the ground, the availability of terrestrial laser scanners offers an unprecedented opportunity to quickly measure the terrain topography at high resolution and precision. Monitoring the surface deformation of wide landslides can also be accomplished with interferometric synthetic aperture radar (InSAR) processing of satellite or ground-based data, but only the displacement component along the line of sight (LOS) of the sensor is measured, see Metternicht et al. (2005) and ESA (2009). To recover the other spatial components of the displacement vector, it is necessary to know their spatial orientation or to consider SAR image series acquired from both ascending and descending orbits.

Apart from classical differential InSAR techniques, slow-moving landslides can also be monitored with multi-temporal InSAR techniques (MIT—Wasowski and Bovenga 2014), which allow great precision in movement detection when several scenes are processed (Colesanti and Wasowski 2006; Fastellini et al. 2011). Among MIT, persistent scatterers InSAR (PSI) follows the evolution over time of a number of scatterers that retain ‘coherence’ (in short, the degree of similarity between corresponding pixels in multitemporal scenes) along a stack of repeated SAR images, while the small baseline subset (SBAS) method can be considered an evolution of classical differential interferometry that allows unwrapping the interferometric phase in time. In general, InSAR techniques are not suitable for vegetated landslides, since vegetation does not retain sufficient coherence.
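As a minimal illustration of the coherence measure mentioned above, the following Python sketch estimates the interferometric coherence between two co-registered complex SAR images over a sliding window. The function name, window size, and plain-loop implementation are illustrative choices, not taken from any specific InSAR package:

```python
import numpy as np

def coherence(s1, s2, win=5):
    """Estimate interferometric coherence between two co-registered complex
    SAR images over a sliding win x win window (values in [0, 1])."""
    pad = win // 2
    num = s1 * np.conj(s2)          # interferogram (cross product)
    p1 = np.abs(s1) ** 2            # power of first image
    p2 = np.abs(s2) ** 2            # power of second image
    H, W = s1.shape
    gamma = np.zeros((H, W))
    # plain loops for clarity; a box filter via cumulative sums would be faster
    for i in range(H):
        for j in range(W):
            sl = (slice(max(i - pad, 0), i + pad + 1),
                  slice(max(j - pad, 0), j + pad + 1))
            denom = np.sqrt(p1[sl].sum() * p2[sl].sum())
            gamma[i, j] = np.abs(num[sl].sum()) / denom if denom > 0 else 0.0
    return gamma
```

Identical images give coherence 1 everywhere, while statistically independent speckle (as over growing vegetation) pushes the estimate toward low values.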

A promising technique is ground-based InSAR (GBSAR) (Monserrat et al. 2014), which can achieve high precision in the detection of movements along the LOS (Teza et al. 2007, 2008; Luzi et al. 2006; Lingua et al. 2008; Bozzano et al. 2011). GBSAR, however, presents some limiting factors:

  • Like satellite InSAR, it is not suitable for vegetated slopes;

  • For distances greater than a couple of kilometers, atmospheric effects become far from negligible;

  • The equipment is still very expensive; and

  • Despite the millimeter accuracy in displacement detection, spatial resolution is low (about 50 cm, depending on the distance from the sensor).

Airborne laser scanning (ALS) provides precise numerical models of wide areas, with the capability of filtering vegetation thanks to multi-echo returns (in particular when the instrument is mounted on a helicopter, which allows flying at low altitudes, e.g., a few hundred meters), see McKean and Roering (2004), Barbarella and Gordini (2006).

High-resolution stereoscopic satellite images (Kliparchuk and Collins 2011) allow a good level of accuracy in the production of digital terrain models (DTM’s) of the whole slope and therefore permit monitoring wide areas with a revisit time of a few days. This technique may see further developments following the declassification of imagery at a spatial resolution of 25 cm, decided by the US Government in June 2014.

In this chapter, we describe the approach we used for TLS surveying and data processing in a ‘real-world’ case study, i.e., a large landslide that presents many critical issues in the design of a monitoring project over time. Our aim is to present a ‘sustainable’ methodology that provides a number of numerical and graphical products useful both for the geomorphological interpretation of active landslide phenomena and for monitoring the surface shape changes that occurred over time.

1.1 Terrestrial Laser Scanning

Among geodetic surveying techniques, terrestrial laser scanning (TLS) has become quite popular for landslide monitoring owing to its ease of use and its ability to collect a huge amount of data at high spatial resolution in a short time.

Data coming from TLS can be processed to produce accurate and precise DTM’s of the surface topography and thus to provide a precise and detailed description of the geomorphology, see Slob and Hack (2004), Carter et al. (2001), Slatton et al. (2007), and Heritage and Large (2009). A TLS application to monitoring a slow-moving landslide, removing returns from vegetation using a decision tree filter, is reported in Pirotti et al. (2013).

TLS is indeed very suitable for landslide deformation measurement, especially on terrain with difficult access and steep slopes, since repeated surveys enable rapid 3D data acquisition of the topographic surface.

A detailed overview of LiDAR techniques applied to landslides, including TLS, is reported in Jaboyedoff et al. (2012). Main applications to landslides range from mapping and characterization (Derron and Jaboyedoff 2010; Guzzetti et al. 2012; Abellán et al. 2014) to monitoring (Abellán et al. 2009, 2010; Prokop and Panholzer 2009).

To achieve reliable and accurate results in landslide characterization and monitoring using TLS, it is necessary to follow a careful survey methodology and processing pipeline. In the case of rockslides, an interesting and exhaustive review is given by Abellán et al. (2014), which provides some principles for the quality assessment of TLS outputs, suggesting which operational steps should preferably be accomplished.

The design of the survey is determined by the nature of the landslide to be monitored: its size, typology, speed, and geomorphology. The presence of obstacles may result in occlusions which often require the acquisition of multiple scans in order to complete the survey of the whole area, see Giussani and Scaioni (2004) and Soudarissanane et al. (2008).

Moreover, it is necessary to link the landslide DTM to some benchmark areas that are stable over time; otherwise, the results of monitoring cannot be correctly interpreted. Indeed, if the reference points are affected by displacements, the measured deformations will include apparent movements due to geo-referencing errors.

Surveying multiple, partially overlapping point clouds requires coregistration as the first step in data processing. For this task, a number of procedures have been implemented in scientific and commercial software packages, such as a rigid-body transformation of one point cloud into the reference system of another by means of natural or artificial targets, or the well-known and effective iterative closest point (ICP) algorithm developed by Besl and McKay (1992) and improved, with a different scheme for the search of corresponding points, by Chen and Medioni (1992). This algorithm minimizes the 3D distances between partially overlapping surfaces in a set of 3D point clouds in order to find the best alignment of each point cloud with respect to the others. Many variants of ICP have been proposed to improve computational efficiency and to automate the pre-alignment stage. A detailed examination of the various approaches is given in Grün and Akca (2005), Monserrat and Crosetto (2008), and Pomerleau et al. (2013).
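As an illustration of the ICP principle described above, the following Python sketch alternates nearest-neighbor matching with a least squares rigid-body fit. The function names and the use of SciPy's k-d tree are our own choices, not taken from the cited implementations:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least squares rotation R and translation t mapping points P onto Q
    (Kabsch method via SVD of the cross-covariance matrix)."""
    cp, cq = P.mean(0), Q.mean(0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # guard against an improper rotation (reflection)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(data, ref, iters=30):
    """Align the 'data' cloud to the 'ref' cloud by iterating
    nearest-neighbor matching and rigid-body fitting."""
    tree = cKDTree(ref)
    moved = data.copy()
    for _ in range(iters):
        _, idx = tree.query(moved)                  # closest-point matching
        R, t = best_rigid_transform(moved, ref[idx])
        moved = moved @ R.T + t                     # apply the update
    return moved
```

As in the software packages mentioned above, a rough pre-alignment (e.g., from a few manually picked points) is needed before this point-to-point iteration converges reliably.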

A limiting factor that greatly reduces the precision of point clouds derived from TLS surveying and specific processing algorithms is the presence of vegetation. While trees represent only an obstacle to the visibility of the terrain from the laser scanner standpoint, bushes and low vegetation can hide the actual shape of the soil.

The vegetation filtering step consists in classifying the data into terrain and off-terrain points; owing to the huge number of 3D points acquired by the scanner, the process needs to be automated. A detailed overview of the several filtering algorithms developed for this task, with a representative selection of methods, is reported in Briese (2010).
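To make the terrain/off-terrain classification concrete, a naive grid-minimum filter can serve as a minimal example; it is far simpler than the algorithms reviewed in Briese (2010), and the cell size and height threshold below are illustrative assumptions:

```python
import numpy as np

def ground_filter(points, cell=1.0, dz=0.2):
    """Naive ground filter: keep points within dz of the lowest point in
    each planimetric grid cell (vegetation returns sit above the minimum).
    Assumes a planimetric extent below 100 km for the cell hashing."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    keys = ij[:, 0] * 100000 + ij[:, 1]   # hashable cell identifier
    zmin = {}
    for k, z in zip(keys, points[:, 2]):
        zmin[k] = min(zmin.get(k, z), z)  # lowest elevation per cell
    mask = np.array([z - zmin[k] <= dz for k, z in zip(keys, points[:, 2])])
    return points[mask]
```

Real filters refine this idea with surface interpolation and iterative reclassification, which is why low, dense shrubs (as on the upper slope of the case study) remain difficult to remove.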

Up-to-date TLS instruments allow the acquisition of multiple echoes. For example, the Riegl VZ series instruments (www.riegl.com) are full-waveform systems. If properly equipped with software tools, they can provide a full-waveform analysis that is very effective for filtering out the overgrown vegetation layer (Mallet and Bretar 2009; Elseberg et al. 2011; Guarnieri et al. 2012).

An interesting approach to effectively filtering out vegetation is the analysis of a vegetation index computed by combining spectral bands gathered by the integrated RGB camera and a modified low-cost near-infrared camera, see Alba et al. (2011).

Data editing requires a large amount of work, but it is essential to extract the bare soil data from the whole laser scanning data set. Without this operation, it is not possible to use such data for a quantitative precise analysis of terrain displacements.

The next step of the data processing chain involves the conversion of the scattered point clouds into a regularized digital elevation model (DEM), which can be arranged in a grid data structure. Elevation values at the grid nodes have to be generated by appropriate interpolation methods, see Kraus and Pfeifer (2001), Vosselman and Sithole (2004), Kraus et al. (2006), and Pfeifer and Mandlburger (2009).
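The gridding step can be sketched as follows; here we assume SciPy's Delaunay-based linear interpolation as one possible choice among the interpolation methods cited above, with the grid step as a free parameter:

```python
import numpy as np
from scipy.interpolate import griddata

def make_dem(points, step=0.5):
    """Resample a scattered (x, y, z) cloud onto a regular grid DEM by
    linear interpolation inside the convex hull of the data; nodes
    outside the hull are returned as NaN."""
    x, y, z = points.T
    xi = np.arange(x.min(), x.max() + step, step)
    yi = np.arange(y.min(), y.max() + step, step)
    X, Y = np.meshgrid(xi, yi)
    Z = griddata((x, y), z, (X, Y), method="linear")
    return X, Y, Z
```

Other interpolators (inverse distance, kriging, splines) plug into the same grid layout; their relative merits on rough terrain are exactly what the test described later in the chapter evaluates.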

The accuracy of the DEM and its ability to faithfully represent the surface depend on the terrain morphology, the sampling density, the adopted interpolation algorithm, and the grid spacing, see Aguilar et al. (2005) and Barbarella et al. (2013).

A DEM is appropriate for describing the surface of the landslide and for studying its characteristics. Such a model can also be exploited for deriving standard contour lines, as well as for generating other important by-products like slope, aspect, and curvature maps.

If the goal of the survey is to measure ground deformations over time, two or more DEM’s have to be compared (Fiani and Siani 2005; Abellán et al. 2009). A number of different approaches can be used here, see also Scaioni et al. (2014). For example, one can directly compare the DEM’s obtained over time by fixing a number of points belonging to particular objects visible in the two different point clouds, see Ujike and Takagi (2004). The estimated transformation parameters can then be used to transform all the points of one cloud into the reference system of the other, making a comparison between them possible, see Hesse and Stramm (2004).
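Once two models share the same grid and reference system, a direct DEM comparison reduces to a cell-by-cell difference. The following sketch (with an assumed uniform cell area) also splits the result into accumulation and depletion volumes, which is how mobilized material can be quantified:

```python
import numpy as np

def dem_difference(z1, z2, cell_area=1.0):
    """Cell-by-cell elevation change between two co-registered grid DEMs
    sharing the same node layout. Returns the difference map plus the
    accumulated (fill) and removed (cut) volumes; NaN cells are ignored."""
    dz = z2 - z1
    fill = np.nansum(np.where(dz > 0, dz, 0.0)) * cell_area
    cut = -np.nansum(np.where(dz < 0, dz, 0.0)) * cell_area
    return dz, fill, cut
```

Note that any residual geo-referencing error between the two epochs enters the difference map directly, which is why the stable reference frame discussed earlier is essential.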

A more general approach allowing a full three-dimensional analysis useful for point cloud comparison is based on the algorithm called 3D least squares surface matching (LS3D) proposed in Grün and Akca (2005).

2 Case Study and Surveying Operations

The landslide we are considering is taking place in Pisciotta (Campania Region, Italy), on the left side of the final portion of a stream. This slope instability already resulted in extensive damage to both an important state road and two sections of the railway mainline connecting North and South of Italy.

Not long ago, the railway was interrupted by a landslide dam-break just upstream: mud and soil brought into the river by the dam-break blocked both tunnels and interrupted rail traffic on the mainline. In addition, a long stretch of the road was completely disrupted.

From a geological point of view, in accordance with the widely accepted classification of landslides already mentioned in Sect. 1, the active Pisciotta landslide is defined as a compound landslide system with a deep-seated main body affected by slow, long-term roto-translational displacements. At the surface, secondary earth slide flows with an intermediate sliding rupture surface and shallow, rapid earth flows have occurred.

It is highly likely that the geomorphological evolution of the landslide will cause the lowering of the steep slope above the road and the swelling of the lower part up to the bed of the stream. Such deformations would drastically disrupt the morphology of the basal part of the landslide, where the railway bridges and the tunnel gates are located.

From the point of view of geodetic surveying, the setup of the instruments is made easier by the stream separating the two flanks of the valley: the right one is considered stable, while the left one is unstable. The upper parts of the two flanks are several hundred meters apart, while the lower parts are less than 100 m apart (Fig. 1).

Fig. 1
figure 1

Above: planimetric positions of the ‘near-reference’ pillars (red triangles), the TLS stations (orange stars), and the targets (green circles) taken in a typical survey campaign. Below: altimetric scheme

Unfortunately, the toe of the landslide cannot be seen from the top of the stable part. On the other hand, the slope toe is of great interest because the slid debris accumulates in this area. The TLS scans should therefore be acquired from multiple stations, placed both on the upper part of the stable slope, from which the visibility toward the opposite side is very wide, and at the bottom, in order to survey the lower part in great detail. Vegetation limits the choice of instrument standpoints because of the presence of a few isolated trees, and the bare soil of the upper part of the unstable slope is hidden by shrubs.

For some years, the part of the landslide surrounding the road was monitored (by a specialized company on behalf of the authority for national road management) using fixed sensors, like weather stations, piezometers, inclinometers, and strain gauges, and by geodetic surveying. The latter was carried out by materializing a couple of pillars (denoted in the following as I1 and I2) on the top of the stable slope and measuring with theodolites the position of several control points materialized on the sliding slope. In the period 2005–2009, two survey campaigns per month were carried out. The results of the periodic surveys showed planimetric displacements of about 7 m over a period of about 4 years, while the altimetric displacements ranged between 2.5 and 3.5 m. The average horizontal speed of the landslide was approx. 0.5 cm/day, with peaks up to 2 cm/day; in elevation, it was approx. 0.2 cm/day, with peaks up to 1.5 cm/day. The displacements were referred to the couple of pillars (I1, I2) placed on the opposite flank of the valley. With this kind of survey, it was possible to determine quite accurately the displacements of single points and to obtain information on the kinematics of the process, but it was impossible to compute the correct amount of moving material or to precisely delimit the area affected by the landslide.

The use of TLS enables the acquisition of large amounts of data and therefore, potentially, the knowledge of small details of the landslide. The goal of the survey campaigns run from 2010 to 2012 was to numerically describe the terrain morphology of the landsliding slope and its variation over time. For this purpose, we used the TLS surveying methodology to obtain a reliable 3D model of the landslide topographic surface (DTM) at each epoch and to derive the mass movements of the slope, see Barbarella and Fiani (2013).

The rugged morphology of the surveyed area strongly constrained the choice of the sites where the stations could be established. In all survey campaigns, we acquired laser scans from several TLS stations located on the stable slope at different altitudes, in order to measure both the upper and the lower parts of the landslide body. For the lower part, the shorter distances involved permitted the use of a medium-range laser scanner, characterized by a higher measurement precision and spatial resolution, which resulted in a larger point density.

At the toe of the landslide there is a railway bridge connecting two tunnels, and the terrain above the tunnel gate can be measured with high accuracy; not far from this, toward the valley, there are a longer bridge and a tunnel.

When the purpose of a survey is to monitor ground deformations, it is essential to identify a set of points that are stable in time and set up a stable reference system to which the repeated measurements over time could be referred.

In the case of a TLS survey, if the point cloud contains a sufficient number of recognizable and stable details, these may set up the reference frame and may be used to carry out the coregistration of the point clouds collected in subsequent measurement campaigns. Otherwise, one needs to set up a reference system that remains stable over time, to which the points measured in repeated surveys may be referred. Such fixed points should be placed outside the monitored area, but close enough to make the connection between them and the points surveyed on the landslide body easy and accurate.

Since in the case study the only recognizable and stable object is the tunnel gate, unsuitable for providing a stable frame, we used the couple of pillars previously materialized on the stable slope to set up the so-called near-reference frame. We carried out a GNSS survey to connect these pillars with some targets placed on both sides of the valley and used them to geo-reference the TLS scans. The distance between targets and pillars ranged from a few tens up to a few hundred meters.

As we need to be sure that the points belonging to the reference frame are stable over time, it was also advisable that such points were themselves monitored with respect to some additional reference points located in areas not affected by the landslide. An efficient and inexpensive solution was to use as ‘remote reference’ frame a number of GNSS permanent stations (PS’s) belonging to the IGS global network or to its local (national) densification, whose coordinates are known in an international geodetic frame.

The two pillars of the ‘near-reference’ frame were themselves connected to a couple of PS’s belonging to the network real-time kinematic (NRTK) surveying service managed by the administration of the Campania region. These stations were located 26 km (Castellabate) and 38 km (Sapri) from the landslide site, respectively. Owing to the great distance between the landslide area and the PS’s, we made continuous daily GNSS observations, which enabled us both to link the ‘remote reference’ stations to the ‘near-reference’ pillars and to calculate the GNSS coordinates of the targets placed on the landslide. Both the TLS stations and the targets were surveyed with geodetic-grade GNSS receivers in static mode and referred to the ‘near-reference’ frame, materialized by the two steel pillars firmly placed on a concrete wall in the stable area.

Figure 1 shows both the survey setup and the measurement scheme for a ‘typical’ surveying campaign. We used different types of targets, mostly spherical with known diameter, but also cylindrical or planar ones, as suggested by the manufacturers of the adopted laser instruments.

We carried out the first TLS survey campaign in February 2010. About 4 months later (June 2010), the second measurement campaign was accomplished. In both cases, a long-range Optech ILRIS 36D instrument was employed. A third surveying campaign was carried out one year later (June 2011) using two different instruments: a long-range Riegl VZ400 and a medium-range Leica Scanstation C10 laser scanner. The second instrument was stationed on the lower part of the landslide (near the tunnel). The last campaign took place in June 2012; both the Optech and Leica instruments were adopted, following the same organization as the previous surveys.

3 Data Processing

Data coming from TLS required several processing steps. Many of them were quite crucial in order to output reliable results. Once these steps were accomplished, a number of point clouds that described the landslide surface at different epochs were available.

GNSS data processing aims at determining the position of both targets and TLS stations in the selected reference frame. Since all the surveys have been registered in the same reference system, it is possible to compare the DTM’s derived from the point clouds obtained at different epochs.

3.1 GNSS Data Processing

The GNSS data acquired during the surveying operations concerned the Global Positioning System (GPS) only. The GPS data processing can be separated into three steps.

The first step concerned the connection between the two PS’s we used (26 and 38 km from the landslide, respectively), taken as fixed points, and the two pillars of the ‘near-reference’ frame. Downloading the PS observations acquired during the survey campaign allowed us to calculate the long GPS baselines between the pillars and the PS’s.

The next step was the baseline adjustment, which yielded the adjusted coordinates of the pillars in the chosen geodetic datum, the European terrestrial reference frame (ETRF00). This reference frame was selected because of its high stability over time (intraplate velocities of the order of a few millimeters per year), see Altamimi et al. (2012a, b).

The third step of the GPS data processing concerned the computation of the short baselines between the targets and TLS stations on one side and the pillars, assumed fixed at this stage, on the other. The lengths of these baselines ranged approximately from 200 to 800 m.

It is very important to acquire redundant observations in order to perform a rigorous least squares adjustment. In our case, two blunders in the last survey were detected by using Baarda’s ‘data snooping’ procedure, see Teunissen (2000). Table 1 shows the mean values (in centimeters) of the standard deviations of the adjusted coordinates of the two pillars I1 and I2, and a summary of the standard deviations of the targets.
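Baarda's data snooping on a linear Gauss-Markov model can be sketched as follows; the equal-weight assumption and the 3.29 critical value (two-sided test with a significance level of about 0.1 %) are simplifications of the rigorous procedure described in Teunissen (2000):

```python
import numpy as np

def data_snooping(A, l, sigma=1.0, k=3.29):
    """One round of Baarda's w-test on the linear model l = A x + v with
    equal-weight observations; returns indices of suspected blunders."""
    x, *_ = np.linalg.lstsq(A, l, rcond=None)
    v = l - A @ x                                   # post-fit residuals
    # cofactor matrix of the residuals: Qvv = I - A (A^T A)^-1 A^T
    Qvv = np.eye(len(l)) - A @ np.linalg.solve(A.T @ A, A.T)
    w = v / (sigma * np.sqrt(np.diag(Qvv)))         # standardized residuals
    return np.where(np.abs(w) > k)[0]
```

In practice the test is applied iteratively, removing (or re-measuring) the single worst observation and re-adjusting until no standardized residual exceeds the threshold.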

Table 1 Standard errors on pillar and target coordinates as computed from the GPS data processing

The level of accuracy achieved for the 2011 and 2012 surveys is lower than that of the two 2010 surveys. This is probably due to the discontinuous signal acquisition by the NRTK network in 2011 (which suggested not using pillar I2) and to the noise induced by vegetation growth. The standard deviations of the target positions, however, are on average smaller than a couple of centimeters in planimetry and about twice that in altimetry.

3.2 TLS Data Processing

The four surveying campaigns carried out on the unstable slope yielded a huge amount of data. In each survey campaign, we acquired three to four scans from different stations. The scans partially overlap, both to help the alignment and to guarantee the coverage of the whole landslide (more than 10 m2). In the last two campaigns, a scan from the lowest station was also acquired using a short-range instrument characterized by a higher scanning rate.

By processing all these data sets, we were able to obtain a number of DEM’s of the landslide area at different epochs. This allowed us to make a comparison between them in order to evaluate the landslide evolution.

Processing of TLS data followed multiple steps. Alignment of overlapping scans is a crucial step for achieving good metric quality of the output. Special care should be taken in the scan pre-alignment phase, especially in the choice of the overlapping area between adjacent point clouds and in the selection of the homologous points for coregistration. Mathematically, this task is performed with a six-parameter rigid-body transformation. In a first phase, a pre-alignment was obtained from the manual selection of a few corresponding points; then, a best-fit alignment was carried out with the ICP algorithm. We applied standard registration procedures based on point matching: the points selected by the operator in the scans are used by the algorithm as starting points for the alignment. We used both natural points and targets as tie points.
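The six-parameter rigid-body transformation used for scan alignment can be written out explicitly. The sketch below applies rotations about the three axes followed by a translation; the omega-phi-kappa angle convention and rotation order are assumptions for illustration, as conventions vary between software packages:

```python
import numpy as np

def rigid_body(points, omega, phi, kappa, t):
    """Apply a six-parameter rigid-body transform to an (n, 3) cloud:
    rotations (radians) about the x, y, z axes, then a translation t."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx              # combined rotation matrix
    return points @ R.T + t
```

The pre-alignment from manually selected correspondences estimates exactly these six parameters (three angles, three translation components); ICP then refines them.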

Both the PolyWorks (Innovmetric) and Cyclone (Leica Geosystems) software packages use the iterative least squares ICP algorithm to optimally align a set of 3D images by minimizing the 3D distances between the points of the overlapping portions of the ‘reference’ point cloud and the ‘data’ point cloud. The strategy used by the RiscanPro software package to minimize these errors is very similar: the software has a plug-in function called ‘multistation adjustment’ (MSA) that iteratively improves the registration of the scan positions. To compare the scan positions, tie points, tie objects, and poly-data objects (reduced point clouds) are used.

A crucial step in TLS data processing consists in removing the points that have been reflected not by the ground but by vegetation or obstacles, as well as the outliers caused by ‘noise.’ Isolating the points that belong to the bare soil is especially important in deformation measurement applications, where the real ground movement must be detected.

In order to filter out the vegetation and to edit the data, we used different software tools according to the adopted laser instrument. Almost all the software packages implement semi-automatic filtering procedures, often based on the same algorithms designed for airborne laser scanning (ALS) applications, see Axelsson (2000) and Vosselman and Sithole (2004).

The editing tools that we used followed different approaches: Some of them are based on manual inspection, while others implement semi-automatic procedures. We used PolyWorks version 10.0 (Innovmetric) to edit both the February and June 2010 scans and Cyclone to edit the 2011 scans from Leica C10.

While PolyWorks uses a completely manual procedure, in Cyclone a semi-automatic approach can be applied that slices the point cloud into a number of stripes whose thickness is specified by the operator. On each stripe, thanks to the 2D visualization, the points that do not belong to the bare soil can be manually deleted. Such a procedure is very time-consuming, yet the result is quite accurate: the thinner the stripe, the better the result.
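The stripe-slicing idea can be sketched as follows (a generic re-implementation, not Cyclone's actual procedure): the cloud is partitioned along one planimetric axis into stripes of operator-specified thickness, each of which can then be inspected in 2D:

```python
import numpy as np

def slice_stripes(points, axis=0, thickness=5.0):
    """Partition an (n, 3) cloud into parallel stripes of the given
    thickness along one planimetric axis, for stripe-by-stripe editing."""
    c = points[:, axis]
    idx = np.floor((c - c.min()) / thickness).astype(int)
    return [points[idx == k] for k in range(idx.max() + 1)]
```

Each returned stripe can be projected onto the plane orthogonal to the slicing axis, which is the 2D view in which non-ground points are deleted by hand.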

Figure 2 shows the 2D views of a number of stripes of different thickness, before editing. We edited the whole scan using stripes 5 m thick.

Fig. 2
figure 2

View in 2D of some examples of vertical stripes of different thickness: a 1 m; b 2 m; c 3 m; d 5 m; e 8 m; f 10 m

For editing the 2012 scans acquired with the Riegl VZ400, we used the editing tool of RiscanPro version 1.2 (Riegl). We applied two different spatial filters implemented in the software package: ‘octree’ and ‘iterative filtering.’

Our experience teaches that great care must be taken in the use of semi-automatic procedures, because there is a risk of accidentally deleting some portions of soil. Only by comparing the surfaces derived at different epochs can some interpretation errors be detected.

Once an overall cloud representing the bare soil is available, it should be registered in the chosen absolute reference system by measuring the target coordinates. All software packages allow the geo-referencing of point clouds; the procedure consists in assigning a set of 3D coordinates to a number of points that can be recognized in the point cloud.

In particular, when no terrain details can be considered stable and suitably placed within the point cloud, precise geo-referencing is quite a critical task. For monitoring purposes, a correct geo-referencing in a reference system that is stable over time allows an effective comparison between scans in an absolute reference system.

In our case study, no natural features located on the stable slope or in the surrounding area were recognizable in the point cloud that could be used to set up a reference frame stable over time. Consequently, some artificial targets were adopted; their positions with respect to the chosen reference system are known, since they have been measured on the ground.

Software tools generally allow the automatic recognition of the target shape (cylindrical, spherical, or flat), at least for the types of targets suggested by the manufacturer of the associated TLS instrument. In order to accurately estimate the position of the center of each target, the portion of target points to be used for the fitting computation needs to be delimited.

We applied a seven-parameter (similarity) transformation to geo-reference all the scans; a six-parameter rigid-body transformation gives almost the same results. We note that laser data processing software packages usually do not provide a rigorous analysis of the standardized residuals of the transformation. The geo-referencing process led to 3D residuals ranging from 5 to 6 cm.
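A least squares estimate of the seven-parameter similarity transformation from target correspondences can be sketched with Umeyama's closed-form method; this is our choice of algorithm for illustration, not necessarily the one used by the packages mentioned in the chapter:

```python
import numpy as np

def similarity_fit(src, dst):
    """Estimate the seven-parameter similarity transform (scale s,
    rotation R, translation t) minimizing ||dst - (s R src + t)||^2
    from (n, 3) arrays of corresponding points (Umeyama's method)."""
    cs, cd = src.mean(0), dst.mean(0)
    P, Q = src - cs, dst - cd
    H = P.T @ Q / len(src)                 # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                     # optimal rotation
    s = np.trace(np.diag(S) @ D) / (P ** 2).sum() * len(src)
    t = cd - s * R @ cs
    return s, R, t
```

Fixing s = 1 in the same framework reduces the model to the six-parameter rigid-body case, which is why the two transformations give almost identical results when the scans are at the correct scale.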

We computed the geo-referencing of all the surveys in the ETRF00 geodetic system. In order to separate the planimetric from the altimetric components more easily, we transformed the planimetric coordinates into the UTM_ETRF00 cartographic system; for the altimetry, we continued to use ellipsoidal heights. In the following, we assume y = North, x = East, z = height.

There are many algorithms to generate a DEM of a surface starting from a point cloud, most of them designed to produce a grid DEM. In landslide analysis, the choice of algorithm is not trivial, depending on various factors such as the terrain morphology and the presence of discontinuities. The assessment of the surface deformation that occurred over time is strongly influenced by the DEM generation mode.

Once all the raw DTM’s, i.e., the geo-referenced and filtered point clouds, have been built, we obtain a number of numerical models of the surface of the landslide at different epochs (four in our test case) that can be compared with one another. Since all the surveys have been framed in the same reference system, such a comparison allows us to estimate the landslide displacements.

3.3 DEM Processing

Starting from the geo-referenced point clouds, we derived a number of DEM’s which described the whole surface of the landslide body and allowed us to analyze the deformation that occurred over time.

In order to calculate the height to be assigned to each node of the grid DEM, one of the many existing interpolation algorithms had to be selected. The choice of the interpolator may greatly influence the accuracy of the DEM. In landslide characterization and monitoring, the choice of both the interpolation algorithm and the grid step is not trivial, since it depends on various factors such as the point density and distribution pattern, the terrain morphology, and the presence of breaklines.

To study the ability of different algorithms to reconstruct a surface adherent to the data, a preliminary test was carried out in a test area. This region featured a rather complex morphology but did not host any overgrown vegetation. The test point cloud consisted of about 4 million points. Both an artifact (the portal of a tunnel) and sharp discontinuities were present in the test data set, see Fig. 14 in Sect. 5.

For evaluating the adherence of the DEM interpolated surface to the input point cloud, we considered the height difference between the laser points and the interpolated surface, i.e., the difference between the height $z_k$ of each point and the interpolated height $z_{\text{int}}(x_k, y_k)$ at the same planimetric position. We applied a cross-validation method, extracting from the cloud a test sample of 1 % of the points and then calculating the height differences between the points belonging to this sample (N = 40,000 points) and the interpolated surface generated by means of the remaining 99 % of the original data. In this way, the points of the test sample did not contribute to the construction of the DEM.
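A minimal sketch of this cross-validation scheme, with a hypothetical brute-force IDW interpolator standing in for the actual gridding software, could read:

```python
import numpy as np

rng = np.random.default_rng(0)

def idw_height(xy_pts, z_pts, xy_query, power=2, k=8):
    """Inverse-distance-weighted height at one query location (brute force)."""
    d = np.hypot(*(xy_pts - xy_query).T)
    if d.min() < 1e-12:                      # query coincides with a data point
        return z_pts[d.argmin()]
    idx = np.argsort(d)[:k]                  # k nearest neighbours
    w = 1.0 / d[idx] ** power
    return np.sum(w * z_pts[idx]) / w.sum()

def cross_validate(xy, z, holdout=0.01):
    """Withhold `holdout` of the points, interpolate them from the rest,
    and return the height residuals z_true - z_interp."""
    n = len(z)
    test = rng.choice(n, size=max(1, int(holdout * n)), replace=False)
    mask = np.ones(n, bool)
    mask[test] = False                       # held-out points do not build the DEM
    return np.array([z[i] - idw_height(xy[mask], z[mask], xy[i]) for i in test])
```

The brute-force neighbour search is only for illustration; on millions of points a spatial index (e.g. a k-d tree) would be used instead.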

For the modelling of the surface, we tested several well-known interpolation algorithms, which are widely used in surface modelling (Mongillo 2011) and implemented in commercial software packages such as Surfer (Golden Software) and ArcMap (ESRI):

  • Inverse distance to a power, second degree (Idw2);

  • Kriging, linear variogram (Kri);

  • Local Polynomial, first degree (LoPo1°) and second degree (LoPo2°);

  • Natural Neighbor (NatNe); and

  • Radial Basis Function, multiquadric, smoothing factor c² = 0.0001 (RBF0001), c² = 0.0013 (RBF–sug), c² = 0.01 (RBF01).
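To make the role of the multiquadric parameter c² concrete, a bare-bones multiquadric RBF interpolator can be written in a few lines. This is a sketch of the general method, not of Surfer’s internal implementation, and the identification of c² with Surfer’s “smoothing factor” is our assumption:

```python
import numpy as np

def rbf_multiquadric_fit(xy, z, c2=0.0001):
    """Fit a multiquadric RBF surface with basis phi(r) = sqrt(r^2 + c^2);
    c2 plays the role of the 'smoothing factor' mentioned in the text."""
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    A = np.sqrt(d2 + c2)                                     # interpolation matrix
    return np.linalg.solve(A, z)                             # basis coefficients

def rbf_multiquadric_eval(xy, coeff, xy_query, c2=0.0001):
    """Evaluate the fitted RBF surface at query locations."""
    d2 = ((xy_query[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2 + c2) @ coeff
```

Larger values of c² flatten the basis functions and yield a smoother surface between the data points, which is consistent with the strong sensitivity to this parameter reported below.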

A few checkpoints gave very large residuals for all the algorithms tested. These points are mainly located in areas characterized by abrupt breaklines, such as the tunnel wall and the steep slopes. Residuals exceeding 1 m in absolute value were considered outliers and discarded from the testing sample. Their percentage varies among the tested algorithms, ranging from the smallest values of 3.8 % for Idw2 and 3.9 % for Kri and RBF0 to a maximum of 5.5 % for RBF-sug (the value of the smoothing factor suggested by the adopted software package).

Once the outliers were removed, the frequency of the residuals at intervals of 5 cm was computed to assess the level of closeness between the points and the DEM. Figure 3 shows, for all nine algorithms we tested, the curve of the frequency of the absolute residuals exceeding a given threshold. For both the Idw2 and LoPo2° algorithms, 5 % of the absolute residuals exceeded 14 cm, while for the RBF01 algorithm, 5 % of the absolute residuals exceeded 24 cm.

Fig. 3
figure 3

Frequency of the absolute residuals that do not exceed a given threshold (after removing outliers, i.e., points with absolute residuals larger than 100 cm)
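The screening and exceedance curves of Fig. 3 amount to a few lines of code; the following sketch uses the 1-m outlier cut-off and 5-cm intervals stated above:

```python
import numpy as np

def exceedance_curve(residuals, outlier_cut=1.0,
                     thresholds=np.arange(0.0, 1.0, 0.05)):
    """Remove |residual| > outlier_cut, then return, for each threshold,
    the fraction of the remaining absolute residuals exceeding it."""
    r = np.abs(np.asarray(residuals, float))
    kept = r[r <= outlier_cut]                       # discard outliers
    frac_outliers = 1 - len(kept) / len(r)           # percentage removed
    exceed = np.array([(kept > t).mean() for t in thresholds])
    return frac_outliers, thresholds, exceed
```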

As a global measure of a DEM’s accuracy, Yang and Hodler (2000) proposed the use of the root mean square error (RMSE) of the residuals, a parameter that can be easily interpreted statistically if the residuals follow a normal distribution. However, the samples of residuals exhibit a different behavior, regardless of the type of interpolator used. Considering intervals of 5 cm and computing the frequency of the absolute residuals x for each interval, we obtained the histogram shown in Fig. 4, built from the results of the Idw2 algorithm. The trends obtained with the other algorithms were very similar to the one in Fig. 4.

Fig. 4
figure 4

Step histogram of the relative frequency of the residuals on 5-cm intervals

The residuals of the interpolations could hardly be considered normally distributed, regardless of the algorithm tested.

To interpret the statistical behavior of the data sample, we normalized the residuals, considering some further parameters: mean, RMSE, median, skewness (g1), kurtosis (g2), and median absolute deviation (mad).

The mathematical expressions of the statistical parameters adopted for normalization are the following ones:

$$ m_{x} = \frac{1}{N}\,\sum\limits_{i=1}^{N} {x_{i} } \quad {\text{rmse}}=\sqrt {\sum\limits_{i=1}^{N} {(x_{i} - m_{x} )^{2} } /(N -1)} $$
(1)
$$ {\text{mad}}= {\text{median}}(|x_{i} - {\text{median}}(x)|) $$
(2)
$$ g_{1\,} = \frac{1}{N}\,\sum\limits_{i= 1}^{N} {(x_{i} - m_{x} )^{3} } /\left( {\frac{1}{N}\sum\limits_{i= 1}^{N} {(x_{i} - m_{x} )^{2} } } \right)^{3/2} $$
(3)
$$ g_{2\,} = \frac{1}{N}\sum\limits_{i= 1}^{N} {(x_{i} - m_{x} )^{4} } /\left( {\frac{1}{N}\,\sum\limits_{i= 1}^{N} {(x_{i} - m_{x} )^{2} } } \right)^{2} - 3 $$
(4)
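Equations (1)–(4) can be checked directly; a small NumPy helper computing all the parameters adopted for normalization from a residual sample might read:

```python
import numpy as np

def residual_stats(x):
    """Mean, RMSE (Eq. 1), MAD (Eq. 2), skewness g1 (Eq. 3) and
    excess kurtosis g2 (Eq. 4) of a residual sample."""
    x = np.asarray(x, float)
    n = len(x)
    m = x.mean()
    rmse = np.sqrt(((x - m) ** 2).sum() / (n - 1))           # Eq. (1)
    mad = np.median(np.abs(x - np.median(x)))                # Eq. (2)
    m2 = ((x - m) ** 2).mean()                               # biased variance
    g1 = ((x - m) ** 3).mean() / m2 ** 1.5                   # Eq. (3)
    g2 = ((x - m) ** 4).mean() / m2 ** 2 - 3                 # Eq. (4)
    return {"mean": m, "rmse": rmse, "median": np.median(x),
            "mad": mad, "g1": g1, "g2": g2}
```

For a normal distribution, g1 and g2 are both zero, so these parameters quantify the departure from normality discussed above.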

The numerical values of the parameters computed on the test data when considering different interpolation algorithms are shown in Table 2.

Table 2 Some statistical parameters for the tested interpolation algorithms

The statistical parameters of the various algorithms look similar: the RMSE ranges from 13 to 15 cm, the mad ranges from 2.9 to 3.7 cm, and the mean value is always positive and less than 1 cm. If we compare the algorithms in terms of their ability to describe the terrain, the best results are given by Idw2, LoPo2°, and RBF0 (c² = 0); the relative differences between them are nevertheless quite small. As a result of our tests, we chose the Idw2 algorithm, which gave the best overall performance.

This interpolation method allowed us to introduce the breaklines that are needed to better characterize the trend of the ground, especially in the presence of artifacts, see Lichtenstein and Doytsher (2004). In summary, we built the grid DEM’s used for the subsequent analysis of movements by using the Idw2 algorithm, drawing the breaklines only at the mouth of the tunnel, which is the unique artifact in the scene.

4 Results

In order to follow the evolution of the landslide morphology under investigation, we decided to take as reference the surface derived from the survey of February 2010. For this reason, by convention, given two surveys acquired at different epochs, differences between the corresponding DEM’s were always calculated by subtracting the older surface from the more recent one.

The landslide process was studied through three methods: (1) points versus mesh comparison, (2) DEM subtraction, and (3) cross-sectional analysis. Results are discussed in the following subsections.

4.1 Points Versus Mesh Comparison

In laser scanner software, surface generation is obtained as a mesh through a Delaunay triangulation; this means that it is not possible to choose which interpolator to use, as it is in a GIS environment.

The ‘Compare’ procedure, as implemented in PolyWorks, consists of measuring the distance of each point of a point cloud from the mesh obtained from the other point cloud, chosen as reference. Here, the distances along the shortest path from the point to the mesh were considered. The sign of the distance depends on the relative position of the point cloud and the mesh: if a point is above the mesh, the distance is positive (meaning that some debris accumulated); if a point is below the mesh, the sign is negative (meaning erosion). At the end of the process, the input data points are represented in a chromatic scale whose values correspond to the distance values.
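PolyWorks’ exact shortest-path computation is proprietary; as a rough conceptual sketch, the signed point-to-mesh distance can be approximated by the signed distance to the plane of the nearest triangle (nearest by centroid), with normals oriented upward so that positive means accumulation:

```python
import numpy as np

def signed_point_mesh_distances(points, tri_vertices):
    """Approximate signed distance from each point to a triangulated reference
    surface: distance to the plane of the nearest triangle (nearest by centroid).
    Positive = point above the surface (accumulation), negative = below (erosion).
    tri_vertices has shape (m, 3, 3): m triangles, 3 vertices, 3 coordinates."""
    tris = np.asarray(tri_vertices, float)
    centroids = tris.mean(axis=1)
    n = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    n[n[:, 2] < 0] *= -1                              # orient normals upward
    out = []
    for p in np.asarray(points, float):
        i = np.argmin(((centroids - p) ** 2).sum(1))  # nearest triangle
        out.append(np.dot(p - tris[i, 0], n[i]))      # signed plane distance
    return np.array(out)
```

A production implementation would also handle the point-to-edge and point-to-vertex cases; for a dense, DEM-like mesh the plane distance is usually a reasonable first approximation.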

Comparisons between the four point clouds have been carried out by taking as reference surface the mesh interpolated from the February 2010 point cloud. The computation of the distances between single points of a generic epoch and the mesh surface is suitable for a first-look identification of stable/unstable areas, both on the whole slope (Fig. 5) and across small areas. The February 2010 scan covers an area slightly wider than the June 2012 one; this is the reason for the gray parts in the figure.

Fig. 5
figure 5

Comparison based on the distance between points and mesh surface (see Sect. 4.1) of the point cloud gathered in June 2012 with respect to the reference mesh obtained from the point cloud in February 2010, which is represented in gray. Points with colors from dark to light green correspond to erosion areas, points from yellow to brown to accumulation areas

It has, however, to be clearly pointed out that the result of a ‘Compare’ procedure is not a volume variation calculation.

From an operational point of view, in PolyWorks, it is possible to compute a volume variation only in several steps:

  • First, it is necessary to create a horizontal plane to be taken as a reference;

  • Then, for each of the two mesh surfaces, the volume with respect to this reference plane is computed using the ‘Surface-to-Plane Volume’ tool; and

  • Finally, the volume difference between the two meshes is obtained indirectly by manually differencing the volume results previously obtained.
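The same multi-step logic is easy to reproduce in the raster domain; a sketch of the surface-to-plane volume and its indirect differencing (the function names and grid inputs are ours, for illustration) follows:

```python
import numpy as np

def surface_to_plane_volume(dem, cell, z_ref):
    """Volume comprised between a grid DEM and a horizontal reference
    plane z = z_ref (NaN cells are ignored)."""
    return float(np.nansum(dem - z_ref)) * cell ** 2

def indirect_volume_change(dem_old, dem_new, cell, z_ref=0.0):
    """Volume change obtained indirectly, as in the multi-step workflow above:
    difference of the two surface-to-plane volumes."""
    return (surface_to_plane_volume(dem_new, cell, z_ref)
            - surface_to_plane_volume(dem_old, cell, z_ref))
```

When both grids are defined on the same cells, the result does not depend on the choice of the reference plane, which is why the indirect procedure works.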

Moreover, working in the vector domain, PolyWorks does not allow direct map algebra operations. For these reasons, we chose to calculate volumes directly in the raster domain, using GIS software.

4.2 DEM Subtraction

In the study of a landslide, the evaluation of the terrain mobilized during a certain period is very important. In commercial software packages oriented to TLS data processing, this result is often achieved by differencing the volumes comprised between the interpolated surfaces and a horizontal reference plane. In order to get a quantitative estimation of the mobilized volumes in terms of loss and gain, we chose here to perform this computation in a GIS environment, i.e., ArcMap 10.1 (ESRI).

The grid DEM’s derived from the scans were used for differencing any pair of epochs. A cell spacing of 50 cm was adopted, chosen according to the mean spacing of the points in the four surveys; consequently, all the derived products were obtained from the DEM’s interpolated with this cell spacing. Differencing the heights between cells of corresponding DEM’s provided the information about the height variations over time.

Results obtained by differencing the last survey (June 2012) and the first survey (February 2010) are shown in Fig. 6; in Fig. 7, we show the DEM differences over the interim periods. This output should be equivalent to the one shown in Fig. 5, even though different palettes (due to the different software packages) and measures (shortest distance between any point and the mesh, versus difference along the vertical of the elevation values at the grid nodes) were used.

Fig. 6
figure 6

Difference map obtained with the DEM subtraction method (see Sect. 4.2) between February 2010 and June 2012; results are displayed in meters

Fig. 7
figure 7

Subtraction of DEM’s in the periods February 2010–June 2010, June 2010–June 2011, and June 2011–June 2012 (from left to right, respectively)

Since the most interesting information concerns volume variations, it is advisable to perform such calculations by considering areas that are homogeneous with respect to the relative position (below/above) of the surfaces. To distinguish areas in erosion from those in accumulation, the resulting grid has been classified into three classes, see Fig. 8:

Fig. 8
figure 8

Automatic segmentation into three classes of the results from grid DEM subtraction obtained in the case of February 2010 and June 2012

  • From −0.1 to 0.1 m: areas which can be considered as characterized by a balance of loss and gain (for the sake of simplicity, in the following these points will be referred to simply as ‘stable’);

  • Equal to or greater than 0.1 m: areas in accumulation from February 2010 to June 2012; and

  • Equal to or lower than −0.1 m: areas in erosion.

The volumes mobilized over the years have been calculated on the entire part of the landslide common to the four surveys, resulting from the intersection of all the boundaries. In Fig. 8, cells corresponding to accumulation are represented in green, those corresponding to erosion in red, and those that can be considered stable in yellow.

This method allowed us to determine that from February 2010 to July 2012 the mobilized volume amounted to 58,820 m³ of total accumulation and 48,851 m³ of total erosion, the latter mainly due to excavation. The total volume corresponding to points considered as stable (from −0.1 to 0.1 m of difference) was about 235 m³.
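The three-class segmentation and the per-class volume computation can be sketched as follows (thresholds and cell size as in the text; the function names are ours, and NaN marks cells outside the common boundary):

```python
import numpy as np

CELL = 0.5      # 50-cm grid step used for the DEM's
THRESH = 0.1    # +/- 0.1 m 'stable' band

def classify_and_volumes(dz, cell=CELL, thresh=THRESH):
    """Split a DEM-difference grid into erosion / stable / accumulation
    classes and return the volume mobilized in each class."""
    valid = ~np.isnan(dz)
    acc = valid & (dz >= thresh)          # accumulation cells
    ero = valid & (dz <= -thresh)         # erosion cells
    stable = valid & ~acc & ~ero          # |dz| < thresh
    vol = lambda m: float(np.abs(dz[m]).sum()) * cell ** 2
    return {"accumulation": vol(acc), "erosion": vol(ero), "stable": vol(stable)}
```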

The term ‘erosion’ here should not be interpreted only as landslide displacement or weathering, but also as an effect of the artificial removal of landslide debris, which could dam the stream and the nearby gravel road.

Even if this classification into three classes represents a sort of rough segmentation, the approach remains reductive, because information related to the global volume does not describe the relative displacements between single portions of the landslide.

Considering the landslide as a whole body does not allow an analysis of the differential kinematics between its several bodies.

4.2.1 Landslide’s Bodies’ Identification

In classifying the different homogeneous parts of the landslide, the advice of geomorphologists played a fundamental role. Experts of the landslide area were indeed able to identify and draw the different polygons to be analyzed, on the basis of a visual inspection of the map output from the laser scanning surveys.

In more detail, the geomorphologists were provided with contour maps with intervals as small as 0.25 m (Fig. 9 left, shown with a contour interval of 2 m in order to improve the readability of the figure), as well as slope and aspect maps (Fig. 9 center and right).

Fig. 9
figure 9

From left to right: contour line map with a vertical interval of 2 m; slope map; aspect map. All maps were derived from the TLS survey of February 2010

The geomorphological interpretation allowed us to recognize the landslide bodies shown in Fig. 10 (Guida 2013, personal communication).

Fig. 10
figure 10

Landslide bodies identified considering the survey of February 2010, overlaid on the corresponding contour lines (drawn with an interval of 10 m). The shapes of the landslide bodies have been clipped to the boundary common to the four surveys (represented in dark gray)

The morphological analysis showed that the slope under study (orographic left bank) is affected by landslides that differ in type, age, and stage of activity.

On the main body of the landslide, indeed, it is possible to recognize landslides of second and third generation. The continuous progress of the landslide gradually shrinks the bed of the creek, in both the horizontal and vertical directions.

It is possible to identify six polygons that represent the following:

  • A landslide of second generation indicated by the abbreviation ‘1_3’ in Fig. 10;

  • Two landslides of third generation indicated by the abbreviations ‘1_1_2’ and ‘1_1_3’; and

  • Three landslides of fourth generation classified as ‘1_1_0_1’, ‘1_1_2_1’, and ‘1_1_3_1.’

The main body of the landslide, which would be indicated by the abbreviation ‘1’, is not represented in Fig. 10, nor is the third-generation landslide body classified as ‘1_1_0’, because their extension exceeds the area covered by some of the acquired TLS surveys.

With regard to the different landslide bodies identified throughout the landslide, volume calculations have been performed considering only two classes, i.e., erosion and accumulation. In more detail, for each landslide body we calculated both its aggregate volume and the volumes corresponding to the erosion and accumulation zones. The results of this analysis are reported in Table 3.

Table 3 Mobilized volumes from February 2010 to July 2012, expressed in cubic meters; positive values mean accumulation, negative ones mean erosion

In general, there is a good correspondence between volume losses and gains among the several landslide bodies. For instance, the niche ‘1_1_2’ shows a volume loss/gain balance which could be related to the volume gained by the areas below.

Moreover, the human interventions that took place in 2011 in order to stabilize the toe of the landslide do not allow a clear interpretation of its behavior: area ‘1_1_2_1’ shows a total loss of about four hundred cubic meters in spite of a noticeable loss of the landslide bodies above it.

4.3 Cross-Sectional Analysis

Along with contour level and DEM-derived maps shown in Fig. 9, cross sections are tools of great interest for the study of the morphology of a landslide and for monitoring topographic changes over time.

A quantitative analysis of the variations in the terrain surface is also possible by comparing cross sections of the surfaces resulting from the interpolation of the point clouds at different epochs.

A critical aspect of this method is the choice of the orientation of the cross-sectional profile along the landslide, which basically defines which components of the displacement vector can be observed. Indeed, a wrong orientation of the profile could lead to unnecessary or misleading interpretations. For this reason, when comparing cross sections taken at different positions along the slope, it should be considered that the landslide orientation may change from one position to another.

Here, we chose to take profiles along the axes of the landslide bodies, measuring the differences along the vertical direction between the four profiles. From February 2010 to July 2012, the variation in height (thickness) measured at some points nearly reached 3.5 m, as can be seen in Fig. 11.

Fig. 11
figure 11

Profiles of the slope affected by the landslide, from the top to the hillside, considering the epochs of June 2012 and February 2010. The section corresponds to the axis of landslide body ‘1_1_2_1’, which lies near the right side of the railway tunnel
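Extracting such a profile from a grid DEM is straightforward once the section line is chosen; a sketch with bilinear sampling along a user-defined segment (names and grid conventions are ours: rows along North/y, columns along East/x) could be:

```python
import numpy as np

def profile(dem, origin, cell, p0, p1, n=100):
    """Sample a grid DEM along the segment p0-p1 (map coordinates) with
    bilinear interpolation; returns chainage and heights for a cross section."""
    t = np.linspace(0, 1, n)
    xy = np.outer(1 - t, p0) + np.outer(t, p1)        # sample positions
    # fractional grid indices (column along x/East, row along y/North)
    c = (xy[:, 0] - origin[0]) / cell
    r = (xy[:, 1] - origin[1]) / cell
    r0, c0 = np.floor(r).astype(int), np.floor(c).astype(int)
    fr, fc = r - r0, c - c0
    z = (dem[r0, c0] * (1 - fr) * (1 - fc) + dem[r0, c0 + 1] * (1 - fr) * fc
         + dem[r0 + 1, c0] * fr * (1 - fc) + dem[r0 + 1, c0 + 1] * fr * fc)
    chain = t * np.hypot(*(np.asarray(p1, float) - np.asarray(p0, float)))
    return chain, z
```

Differencing the `z` arrays returned for two epochs along the same section line gives the vertical thickness variations discussed above.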

Of course, an exhaustive description of the whole landslide would require many vertical cross sections, therefore representing a time-consuming operation. Indeed, while the computation of each profile is an automatic procedure given the DEM, the selection of the location and orientation of the profiles requires careful analysis. For these reasons, the extraction of cross sections is limited to a few narrow areas where a deeper investigation is needed, rather than being carried out over the entire slope.

5 Discussion

Among TLS applications, landslide monitoring is one of the most complex, because it is not limited to the description of a surface but also requires quantitative comparisons, essential to determine the kinematics of the phenomenon and to plan subsequent works on the ground. As a matter of fact, the morphology of a slope obviously does not correspond to a well-defined geometric pattern, recognizable in its positions at different epochs.

The final data to be processed—the shape of a surface obtained from a point cloud—are derived from a series of field operations and a processing pipeline that necessarily involves approximations and interpolations of the original data and that may affect the comparison of the surfaces.

As pointed out by several authors (e.g., Abellán et al. 2014), it is therefore necessary to perform TLS surveys by carefully following a set of procedures for data processing and calculation.

Some critical issues concern, for instance, the problem of geo-referencing the survey in a frame that is stable over time, and how to perform the comparison between surveys acquired at different epochs.

The significance of the identified displacements depends indeed on the reliability of the framing of the different surveys in the same reference frame, stable over time. When the survey of the stations and targets is performed with a GNSS satellite method, it is advisable to consider a reference frame attached to the tectonic plate of the investigated area, even if the reference stations are usually far away from the ‘near’ reference; the positions of these points therefore already carry a non-negligible uncertainty, which propagates and raises the standard deviation of the surveyed points. As a matter of fact, what is apparently lost in precision is largely gained in reliability of the absolute positioning.

It has, however, to be taken into account that the precision of the point cloud position obtained from a survey also depends on the coregistration and geo-referencing steps. An assessment of the overall geo-referencing precision of the several point clouds can be performed by exploiting the presence of artifacts that are stable over time and thus allow the evaluation of the survey repeatability. In the application reported here, these are the tunnel and the railway bridge, which are continuously monitored on site with an interferometer because of their relevant function. The interferometric measurements did not show any movement in the observation period.

Thanks to the very high point density, allowed by the proximity of the slope to one of the stations, it was possible to identify in the four surveys several corresponding details on the vertical wall (oriented north–south) of the tunnel.

The average differences of the locations of these ‘homologous’ checkpoints with respect to the first survey of February 2010 are reported in Table 4.

Table 4 Shifts of checkpoints with respect to the first survey in February 2010

Since the total shift observed is always less than 10 cm, it does not seem appropriate to apply any translation on the basis of this local result. To assess the repeatability of the height component, it is more convenient to consider the height differences at the points belonging to the horizontal plane of the bridge.

Figure 12 shows the difference between the June 2012 and February 2010 DEM’s. The contour levels corresponding to the DEM subtraction are shown in Fig. 13.

Fig. 12
figure 12

Contours extracted from the 2012-to-February 2010 difference grid, considering the tunnel main geometries as breaklines (in black), with a contouring interval of 5 cm

Fig. 13
figure 13

Contours extracted from the 2012-to-June 2010 difference grid; here, only the points inside the railway tracks have been interpolated. For this reason, the palette is different from the one in Fig. 12

Here, the interpolation was carried out by considering the main geometries of the railway tunnel as breaklines (in black). In the figure, a contour interval of 5 cm was chosen, and difference values around zero are represented in yellow.

Contour lines corresponding to height differences of 5 and 10 cm lie almost exclusively along the tracks and are therefore probably linked to the different acquisition geometry of the crossties (i.e., the support elements) of the tracks in the two surveys. The area around the tunnel has not been considered in the calculations, in order not to affect the results with spurious differences deriving from the different acquisition geometry of the scans.

As already stated, there are various ways of comparing surveys acquired at different epochs; the choice of which method to use depends on the particular context and on the object being monitored. Moreover, details are often better interpreted by considering cross sections rather than point-to-mesh comparisons or grid differences. Consider, for instance, the retaining wall in the area near the tunnel, see Fig. 14.

Fig. 14
figure 14

Test area: mesh of the February 2010 scan; in red, the profile line taken perpendicular to the retaining wall

The comparison between the scans acquired in February 2010 and in June 2012 does not yield clear results, as shown in Fig. 15, which shows a detail of the retaining wall. This is probably because the surface of the reference mesh is continuous and the blocks are positioned more or less along the gradient of the slope. Only a few anomalies are pointed out by some patches clearly different from the surrounding areas, with irregular shapes not corresponding to the geometry of an ashlar. The map obtained from the DEM subtraction, shown in Fig. 16, does not allow a better interpretation either.

Fig. 15
figure 15

Comparison of the June 2012 point cloud with respect to the February 2010 reference mesh, close-up over the ashlars near the railway tunnel

Fig. 16
figure 16

Map obtained from DEM subtraction in the area of the railway tunnel portal

Clear evidence of human intervention in the area of body ‘1_1_2_1’ can be found by comparing profiles taken over the four scans: the blocks of the retaining wall on the right side of the railway tunnel were removed and repositioned, passing from three to five rows. In Fig. 17, the cross sections taken along the blocks of the retaining wall close to the railway tunnel are shown (the position of the section line is indicated in red in Fig. 14). In more detail, an analysis of the cross sections points out that the intervention on the retaining wall occurred in the time elapsed between the June 2010 scan and the June 2011 one.

Fig. 17
figure 17

Cross sections of the four scans; Roman numerals indicate the rows of ashlars of the retaining wall in February 2010 and in June 2012

6 Conclusions

In this chapter, an application of multitemporal surveys based on TLS techniques to the analysis of changes and deformations of a landslide has been presented. The case study is located in Pisciotta (Campania, Italy), where an active compound landslide system features a deep-seated main body affected by long-term, slow roto-translational displacements, as well as secondary earth slide flows at the surface.

The use of GNSS permanent stations as a frame stable over time, despite their distance from the landslide area, has nevertheless allowed a reliable geo-referencing; controls on a stable artifact allowed us to assess that the repeatability level of the positioning is between 5 and 10 cm, corresponding to the threshold of significance of the coordinate differences.

It is also advisable to make a preliminary analysis of how to perform the interpolation of the digital elevation models (DEM’s), because, given a certain density and distribution of points, the several available algorithms may return models with a different degree of adherence to the input data. Symptomatic, for example, is the great influence of the value of the smoothing factor when using the radial basis function algorithm.

In principle, the choice of the method for surface comparison strongly depends on the nature of the phenomenon to be observed.

Among the three methods adopted for the monitoring of this landslide (distances between points and a mesh, DEM subtraction, vertical cross-sectional analysis), it would not be correct to state that one is more appropriate than the others, because each one is suitable for appreciating a different aspect of the observed phenomenon. DEM subtraction allows volume calculations, also over wide areas, while cross sections allow the detection of displacements occurring in a restricted area along a certain profile. The evaluation of mobilized volumes is not a trivial task and requires the expertise of geomorphologists in order to define areas that are homogeneous from a geomorphological point of view.