1 Introduction

Safe and soft landing on a celestial body (planet, moon, asteroid, comet) has been and will remain a central objective for space exploration. For current and future missions, pin-point landings are planned, which require high accuracy in absolute navigation. This is achieved by combining inertial measurements with measurements from optical sensors such as star trackers and laser altimeters, and with processed navigation camera images. This combination of sensors is common to many missions and is the subject of research and development within the project ATON (Autonomous Terrain-based Optical Navigation).

1.1 Motivation and goals

The German Aerospace Center (DLR) has been active in planetary science for decades, and it has been involved in many interplanetary missions, providing instruments and technologies. Technologies for landing were developed for the lander Philae of the Rosetta mission, which landed on 12 November 2014 on the comet 67P/Churyumov–Gerasimenko, ten years and eight months after departing Earth [27, 31]. Similarly, the asteroid landing package MASCOT (Mobile Asteroid Surface Scout) was developed by DLR in cooperation with the Japanese Aerospace Exploration Agency (JAXA) and the Centre National d’Études Spatiales (CNES) [13]. It is traveling onboard JAXA’s Hayabusa 2 mission to its landing target, the asteroid 162173 Ryugu. The landing is foreseen for October 2018.

As an evolution of the landing technologies applied to Philae and MASCOT, new methods for precise and safe landing are in the focus of DLR’s research and development activities. One element is the project ATON. The project was initiated in 2010 and started from several already available technologies in the domains of image processing, optical navigation and state estimation.

The overarching goal of ATON is the development and demonstration of an optical navigation system for exploration missions and of its technologies, which allow a precise and safe landing on a celestial body. The goals of the project are:

  • Development of a flexible system concept allowing tailored solutions for different missions,

  • Development of image processing and optical navigation techniques for absolute and relative navigation,

  • Development of navigation filtering techniques fusing all available sensor data and image processing outputs,

  • Verification of all algorithms implemented as software in MiL, SiL, PiL and HiL setups including the development of software and hardware tools for realistic simulation,

  • Verification of the navigation system performance in open-loop and closed-loop control environments,

  • In-flight demonstration of the navigation system in terrestrial test environments.

ATON was set up as a technology research project without a concrete mission to be served. This provided more degrees of freedom than mission-driven developments, and it allowed different approaches to the same problem to be explored in parallel: to start with new ideas and to investigate different solutions more thoroughly. One of the main differences to many agency-driven technology developments in the same area is that all elements of the optical navigation system were continuously researched and developed by the same organization and the same team. This allowed a broader view and a deeper understanding of the optical navigation system and its underlying principles.

1.2 Assumptions and decisions

ATON targets the development of a navigation system for landing on solar system bodies in general. The navigation system shall use the surface (respectively the terrain) of the target body to obtain the navigation solution (position, velocity, and attitude in a target body-fixed coordinate frame). Although solar system bodies are highly diverse in size and structure, only one criterion has a high impact on the navigation system architecture—the atmosphere. Based on this, the class of potential targets was narrowed down for ATON to celestial bodies with no or a very thin atmosphere. This allows features and landmarks on the ground to be observed already from high altitudes, limits the influence of optical effects of the atmosphere, and makes topographic features more stable.

The selected class of targets includes the Moon, asteroids, comets, and small planetary moons like Phobos and Deimos. Out of this class, the Moon was selected as the reference target. The Moon is one of the largest bodies without an atmosphere, and since the dynamics of descent and landing are driven by gravity, the most challenging requirements can be expected for a lunar landing. A second reason for choosing the Moon is that it is well known and well mapped, with a lot of data publicly available.

When neglecting cooperative targets such as landing sites equipped with beacons, current optical navigation techniques based on image processing can provide absolute and relative navigation information within the local reference frame of the target celestial body. Thus, the work in ATON assumes that the following sensor suite is available for implementation in a future exploration mission using ATON’s technology:

  • Inertial measurement unit (IMU) providing measurements of the angular rate and the non-gravitational acceleration,

  • Star tracker (STR) providing inertial attitude information,

  • Laser altimeter (ALT) delivering the distance to the ground along its line of sight,

  • Monocular monochrome navigation camera taking images of the target body and terrain which are subject to further image processing, and

  • Flash LIDAR (light detection and ranging) providing 3D-images.

This assumption is based on the review and analysis of past and currently developed missions and technologies as well as on preliminary analysis at the beginning of the project.

2 Requirements and reference mission

To define a goal for the technical development within the project ATON a review of historic and planned missions (at the time of the start of the project in 2010) was done. From that the targeted navigation accuracy was derived. Furthermore, a reference mission for the project was defined including the sensor suite and assumed characteristic performances.

2.1 Navigation requirements

To achieve the project goals defined above, the navigation system must support the guidance, navigation and control (GNC) system with sufficient state-vector data. In case of ATON, the state-vector shall be autonomously determined from the beginning of the landing maneuver at the descent orbit injection (DOI).

Table 1 Required navigation accuracy for a 200 m (\(3{\text {-}}\sigma\)) lunar landing for ATON

To help in the definition of requirements, data from a covariance analysis of a lunar landing navigation system [8] have been used as a reference. The goal of that analysis was to find the requirements for a terrain-relative navigation (TRN) sensor to achieve a 100 m (3\(\text {-}\sigma\)) navigation accuracy at landing. For that purpose, the navigation data was fed into a proportional-derivative controller which controls position and attitude.

A major outcome of the study is the identification of the main sources of propagation error. The first is the accuracy of the initial state vector; the second is the quality of the gravity model.

For a better understanding of the discussion of navigation requirements, a short overview of dispersion control is given in this paragraph. Any navigation error leads to a dispersion of the lander’s position from the reference path. With high probability, the dispersion is at least of the same size as the navigation error. Nominally, the spacecraft is designed for an optimized reference path which represents the most fuel-efficient way to land on the Moon. The earlier the dispersion can be measured, the more efficiently it can be controlled by slight changes to the reference path; the later it is measured, the larger the modifications of the remaining reference path have to become, which leads to increased fuel consumption. The landing dispersion lies mainly in the downrange direction. An excellent opportunity for downrange dispersion control is at the powered descent initiation (PDI). By changing the time of thruster ignition by several seconds and by slight modifications to the powered descent (PD) reference path, the downrange dispersion can be reduced down to the navigation error at very little fuel cost.
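
As a rough, illustrative check (the numbers are assumptions for illustration, not values from [8]): near the pericenter of a 100 km \(\times\) 10 km lunar descent orbit the orbital velocity is approximately 1.7 km/s, so shifting the ignition time by \(\Delta t = 3\) s moves the PDI point downrange by \(\Delta x \approx v\,\Delta t \approx 1.7\,\text {km/s} \times 3\,\text {s} \approx 5\,\text {km}\). This is why an ignition-time shift of only a few seconds is sufficient to absorb kilometer-level downrange dispersion.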

Based on this analysis for ATON, the following assumptions are made:

  • The IMU and STR used in ATON are of the same quality as in the study in [8],

  • An initial state-vector precision comparable to [8],

  • Utilization of absolute position measurements in parts of the descent orbit (DO),

  • Altimeter utilization not before PDI,

  • During PD, altimeter and velocimeter function is performed by the optical navigation system, and

  • The 3D imaging system is working after the landing site becomes visible.

Based on these assumptions, the required navigation performance for ATON is shown in Table 1. At DOI, the navigation accuracy corresponds to the capability of the ground station network. During coasting in the DO, the landmark navigation system shall provide several measurements with an accuracy of 1% of the current height (100–1000 m) for down-range and cross-range and 0.5% of the current height (50–500 m) for altitude. This enables the propagator to determine the spacecraft (SC) position at PDI to within 100 m.

During PD, the optical navigation system will perform altimeter and velocimeter functions. Due to the lack of position measurements the navigation error will grow during this period. The task of the navigation system is to keep the propagation stable and the error growth small.

After the pitch-over and as soon as the landing site becomes visible, the 3D imaging system will start to take measurements. The resulting data will possess an initial resolution in the order of 50 m, which continuously improves during the descent. The 3D data will be compared with an onboard 3D map of the landing site, yielding a navigation knowledge in the order of 50 m. The purpose of the 3D imaging system is also to deliver the necessary data for the evaluation of the landing area. When a safe landing site is found, the GNC system must be able to place the lander inside the safe area. The size of the safe area is assumed to be in the order of three times the diameter of the lander; thus, the allowed landing error is in the order of half a lander diameter. The navigation requirement for the landing is, therefore, set to 2 m. This should be achievable when considering the 3D data requirement at the later stage of the landing. The needed 3D resolution is in the order of 15 cm per pixel. This data will become available at an altitude of \(\approx\) 400 m.

2.2 Reference mission

For generating simulation data, a reference mission has to be defined. The general sequence of approach and landing is defined as:

  1. Start in an initial 100 km \(\times\) 100 km quasi-circular orbit around the Moon,

  2. Execution of the descent orbit injection (DOI) maneuver to reach a 100 km \(\times\) 10 km orbit,

  3. Flight along the elliptic descent orbit to pericenter,

  4. Start of powered descent (PDI) close to the pericenter of the descent orbit,

  5. Achieve an almost vertical descent for the last 100 s,

  6. Final conditions: altitude \(\approx\) 1 m above the landing site at < 1 m/s velocity.

Fig. 1 Sketch of the lunar landing scenario with a polar orbit

Figure 1 illustrates the sequence of events for the reference mission.

For the simulation runs, several prominent and four arbitrary landing sites have been selected. For some of them, landing at different times has been simulated to see the effect of different illumination conditions.

A powered descent trajectory with constraints on actuators and flight states, and with the objective of minimal fuel consumption, can only be generated as the solution of an optimal control problem. For the specific case of a landing vehicle with non-throttleable engines, a solution is provided in [24]. The paper defines the optimal control problem and provides a solution. Furthermore, a tracking controller is designed which enables the vehicle to follow the designed reference trajectory even in the presence of uncertainties. A more robust implementation of an onboard algorithm is presented in [19], where a suboptimal trajectory is interpolated onboard depending on the initial state. This tolerates very large uncertainties in the initial conditions at PDI.

Figure 2 shows the altitude vs. downrange profile of the powered descent. Figure 3 displays the velocity profiles. It can be seen that the main thrust is changed in only three steps. This meets the specific requirements of a landing with non-throttleable engines, where the thrust is reduced by switching off pairs of engines.

Fig. 2 Altitude vs. downrange of the powered descent trajectory as computed in [24]

Fig. 3 Horizontal and vertical velocity vs. time for the powered descent trajectory as computed in [24]

In Fig. 4, the pitch angle is shown for the powered descent. It can be seen that the pitch angle is kept low (below 20\(^{\circ }\)) for a long period during which mainly the horizontal velocity is decreased. Afterward, the landing vehicle pitches down so that the x-axis (the down direction once the lander is on the ground) points more and more downward. To provide good visibility of the landing site for the onboard sensors, the last part of the descent is almost vertical with a pitch angle close to − 90\(^{\circ }\). The trajectory design includes constraints for the visibility of the landing site by the optical sensors (LIDAR and camera). For the nominal trajectory, both sensors have the landing site in the field of view below altitudes of 1000 m. This allows the same sensors to be used for hazard detection. In the early phase of the project ATON, hazard detection was also analyzed and developed. The processing started at an altitude of 1 km, and a decision for a landing site was obtained at an altitude of 250 m. The time between these two gates is approximately 44 s.

Fig. 4 Pitch angle vs. time for the powered descent trajectory as computed in [24]

Before PDI, the spacecraft follows an elliptical descent orbit from an altitude of 100 km. For the project ATON, a period of 2600 s before PDI is included in the scenario to provide sufficient time for the acquisition phase of the navigation system so that the navigation accuracy can be achieved.

3 System overview

In addition to the definition of requirements and the definition of the mission, a reference system design is established for the project ATON. As pointed out earlier, the goal of the project is to develop a generic system and technologies for optical navigation which should be applicable to various space exploration missions. This section defines a reference set of sensors to be included in the analysis. Parameters are defined for the selected reference mission. Finally, a system architecture is designed for the reference mission based on selected navigation and image processing technologies to be developed during the project ATON.

3.1 Reference sensor configuration

Based on the analyses above, the following sensors are included in the navigation system:

  • Inertial measurement unit (IMU) providing measurements of the angular rate and the non-gravitational acceleration,

  • Star tracker (STR) providing inertial attitude information,

  • Laser altimeter (ALT) delivering the distance to the ground along its line of sight,

  • Monocular monochrome navigation camera (visible light spectrum), taking images of the target body and terrain which are subject to further image processing, and

  • Flash LIDAR providing 3D-images.

For the simulation of input data for the navigation and image processing algorithms and methods, the parameters of the sensors must be fixed. These parameters are a baseline for further development steps; for some later analyses, individual parameters can still be varied.

For the IMU and the STR, three different classes of sensors (low, medium and high accuracy) are defined. This allows the impact of the sensor accuracy on the navigation accuracy to be analyzed.

For the navigation camera, the following parameters have been selected based on the review of currently developed and planned missions and a detailed geometric analysis. Table 2 shows the baseline parameters.

Table 2 Camera specifications as used in the project ATON

For the LIDAR, the parameters have been selected based on the review of currently developed and planned hardware [26]. Table 3 shows the baseline parameters.

Table 3 LIDAR specifications as used in the project ATON

For the alignment of the star tracker, the following conditions are considered: it shall point away from the Sun and the lunar surface so that the baffle exclusion angles of the STR are respected at all times. Furthermore, the plume of the main engine shall not enter the FOV of the sensor. During the landing, the vehicle performs a pitch maneuver of about 90\(^{\circ }\), during which the baffle exclusion angles also have to be considered.

Since most landings on the Moon occur in the lunar morning (in order to have about 14 days of illumination), it can be assumed that the Sun elevation at the landing site is not very high. Furthermore, it can be assumed that the low Sun is neither in the flight direction nor in the anti-flight direction, since in these cases the navigation camera might either be blinded by the Sun or would not see shadows and would therefore have only very few characteristic features for optical navigation. Based on these assumptions, the Sun elevation will be below 60\(^{\circ }\) and the Sun azimuth with respect to the flight direction between + 30\(^{\circ }\) and + 150\(^{\circ }\) or between − 30\(^{\circ }\) and − 150\(^{\circ }\).

With these conditions, the STR should be mounted with its boresight close to the pitch axis. Depending on which side of the flight path the Sun is expected on, it has to be mounted on the left or the right side with respect to the flight direction.

Fig. 5 Sensor reference configuration for ATON: red—camera, green—laser altimeter, blue—LIDAR. The figure illustrates the viewing directions of the sensors; the placement of the sensors is not representative since the thruster plume has not been considered

The camera, LIDAR and ALT are mounted outside or on the surface of the landing vehicle, for which a diameter of about 4 m is assumed. This leads to a lever arm with respect to the IMU or the body-fixed frame. The viewing directions are chosen such that one edge of the FOV of camera and LIDAR lies along the x-direction of the body-fixed frame (the down direction when the lander stands upright on the ground, or the forward direction when the main engine is horizontally aligned during PD). All optical sensors are mounted on the side which faces the ground during the almost horizontal flight in the first phase of the PD. The setup of the sensors is illustrated in Fig. 5.

3.2 System architecture

In the section above, the set of sensors as well as their alignment on the landing vehicle have been defined. Since the output of the system shall be the navigation state vector, a mandatory element is a navigation filter which combines and fuses all sensor measurements and preprocessed data into a navigation solution. This is complemented by further modules for the processing of image data. Figure 6 shows the conceptual data flow within the ATON navigation system with seven processing modules including the navigation filter.

Fig. 6 Block diagram of the ATON system: blue—sensors, green—on-board processing modules

Figure 7 provides an overview of which modules are running in the different phases of the landing.

Fig. 7 Timeline for the different processing modules of ATON

The processing modules are encapsulated in tasks which are executed in parallel. The inter-module communication and the scheduling of the tasks are managed by DLR’s data flow-oriented tasking framework [23]. It ensures that a module is only executed if all necessary inputs are available. The integration of the ATON software was conducted in a model-driven manner: an extended SysML/UML model was created to describe the processing modules with their interfaces and parameters, data types, priorities and the data flow between the modules [7]. Custom code generators create the source code for data types, communication, module interfaces, and serialization code for the telemetry.
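
The data-flow principle described above can be illustrated with a short sketch: a module runs only once all of its declared inputs have arrived. The Python snippet below is an illustration of that scheduling idea only; it is not the API of DLR’s Tasking Framework [23], and all names in it are hypothetical.

```python
# Illustrative sketch of data-flow scheduling: a task executes once all inputs are present.
# This is NOT the API of DLR's Tasking Framework [23]; names and structure are assumptions.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class Task:
    """A processing module that runs only when all of its declared inputs are available."""
    name: str
    inputs: tuple                       # names of the required input channels
    run: Callable[..., Any]             # the module's processing function
    _pending: Dict[str, Any] = field(default_factory=dict)

    def push(self, channel: str, data: Any):
        """Deliver one input; execute the task as soon as every input has arrived."""
        self._pending[channel] = data
        if all(ch in self._pending for ch in self.inputs):
            args = [self._pending.pop(ch) for ch in self.inputs]
            return self.run(*args)
        return None


# Example wiring: a (hypothetical) epipolar-geometry task needs two images plus a coarse
# pose from the navigation filter before it can run.
epipolar = Task("EpipolarGeometry",
                ("image_prev", "image_curr", "coarse_pose"),
                lambda i0, i1, pose: {"rel_orientation": "..."})

epipolar.push("image_prev", "img_k_minus_1")
epipolar.push("coarse_pose", "pose_k")
result = epipolar.push("image_curr", "img_k")   # all inputs present -> task executes
```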

The following paragraphs provide a short overview of each processing module.

3.2.1 Feature Tracker

This module extracts and tracks image features over the camera sequence. To perform this task, the Kanade–Lucas–Tomasi (KLT) tracker is applied to two successive images at each step. The tracker works in two steps: the first step is image feature extraction based on high gradients in two axes [29], performed at the very beginning of the sequence and later at image regions where no features are present (anymore). The second step is feature tracking based on image region similarity [20]; this step allows sub-pixel accuracy for sharp textures. The 2D pixel coordinates of these image features in the two successive input images are output to the Navigation Filter module.
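
As an illustration of this two-step scheme, the following sketch uses OpenCV’s Shi–Tomasi corner extraction and pyramidal Lucas–Kanade tracking. It is only an assumption for illustration; the ATON module is an independent C/C++ implementation, and the parameter values shown are hypothetical.

```python
# Minimal sketch of the two-step KLT scheme described above, using OpenCV (an assumption;
# the flight code is a separate C/C++ implementation; parameter values are illustrative).
import cv2
import numpy as np

def track_features(img_prev, img_curr, prev_pts=None, max_feats=200):
    """Extract corners where needed and track them into the next image."""
    if prev_pts is None or len(prev_pts) < max_feats // 2:
        # Step 1: extract features at regions with high gradients in two axes [29]
        prev_pts = cv2.goodFeaturesToTrack(img_prev, maxCorners=max_feats,
                                           qualityLevel=0.01, minDistance=10)
    # Step 2: track features by image-region similarity with sub-pixel accuracy [20]
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(img_prev, img_curr, prev_pts, None,
                                                   winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    # 2D pixel coordinates in both images are what is passed on to the Navigation Filter
    return prev_pts[ok], curr_pts[ok]
```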

3.2.2 Crater Navigation

The Crater Navigation module detects lunar surface impact craters in the images of the navigation camera, and it assigns each crater detection to an element of a static crater catalog referenced in Moon-fixed coordinates. From these correspondences, a Moon-fixed camera (and thus vehicle) position can be computed. This position is supplied as a measurement to the Navigation Filter, which may use it to cancel accumulated position and velocity errors from the feature-based relative navigation (see above). In addition to this regular drift removal over crater fields, larger corrections after phases in which no craters were visible are of great value. The crater detection is based on the extraction and matching of adjacent areas of above- and below-average brightness that model the reflection and shadow of typical crater interiors under illumination [21, 22].
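
Once detected craters have been assigned to catalog entries, the position computation itself amounts to a standard perspective-n-point (PnP) problem. The sketch below shows one possible way to solve it with OpenCV; it is not the actual ATON implementation, and the crater detection and matching of [21, 22] are not reproduced.

```python
# Sketch of the final step of crater navigation: given detected crater centers (2D, pixels)
# assigned to catalog craters (3D, Moon-fixed), the camera pose follows from a PnP solve.
# Illustrative only; detection/matching [21, 22] and the ATON implementation are not shown.
import cv2
import numpy as np

def position_from_craters(img_pts, moon_pts, K):
    """img_pts: Nx2 detected crater centers [px]; moon_pts: Nx3 catalog centers [m];
    K: 3x3 camera matrix. Returns the camera position in the Moon-fixed frame."""
    ok, rvec, tvec = cv2.solvePnP(moon_pts.astype(np.float64),
                                  img_pts.astype(np.float64), K, None,
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)           # rotation: Moon-fixed -> camera frame
    cam_pos_mf = (-R.T @ tvec).ravel()   # camera position in Moon-fixed coordinates
    return cam_pos_mf                    # supplied as a position measurement to the filter
```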

3.2.3 Shadow Matching

The Shadow Matching module provides an absolute localization in the planet’s reference frame with the help of the binary shadow matching algorithm (BSM) described in [15]. The algorithm is based on the idea of using shadows on the lunar surface as landmarks. Given a camera image and the current pose estimate, the BSM extracts shadows from the image and creates descriptors for each extracted shadow. The descriptors are represented as one-dimensional binary vectors for memory and matching efficiency. These shadow descriptors are matched against reference descriptors which have been computed previously, e.g., on ground. In a final step, the matching result is used to compute an estimate of the absolute pose with a covariance. As an accurate orientation of the lander is provided by the STR, only the absolute position along with its covariance values is provided to the Navigation Filter.
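
The matching of such binary descriptors can be sketched as a nearest-neighbor search under the Hamming distance, which is one reason why the one-dimensional binary representation is memory- and matching-efficient. The snippet below is a simplified illustration only; the actual descriptor construction and pose estimation of the BSM [15] are not shown, and the acceptance threshold is hypothetical.

```python
# Sketch of the descriptor-matching idea behind BSM [15]: shadows are encoded as binary
# vectors and compared by Hamming distance. Descriptor construction and the pose solve
# of [15] are not reproduced; the threshold value is an assumption.
import numpy as np

def match_shadow_descriptors(query, reference, max_dist=40):
    """query: QxB, reference: RxB arrays of {0,1}; returns (query_idx, ref_idx) pairs."""
    matches = []
    for qi, q in enumerate(query):
        dists = np.count_nonzero(reference != q, axis=1)   # Hamming distances to all refs
        ri = int(np.argmin(dists))
        if dists[ri] <= max_dist:                          # accept only close matches
            matches.append((qi, ri))
    return matches
```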

3.2.4 Epipolar Geometry

The Stereo Matching, as used in the 3D processing chain, requires accurate knowledge of the relative orientation of every two images being matched. Their epipolar geometry has to be calculated with less than 0.5 pixels of error to ensure the quality of the 3D model. The Epipolar Geometry module performs this task, taking two subsequent images as input together with the rough relative orientation provided by the Navigation Filter. It extracts and matches common features between the two images and uses them to calculate the precise relative orientation between the images using a small bundle adjustment with RANSAC. Finally, it passes the calculated relative orientation to the Stereo Matching module for each pair of images.
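
A minimal sketch of such a relative-orientation estimation is shown below using OpenCV’s essential-matrix RANSAC. The small bundle adjustment used in the ATON module is omitted, and the parameter values are assumptions.

```python
# Sketch of a relative-orientation step via essential-matrix RANSAC (OpenCV). The bundle
# adjustment refinement of the ATON module is omitted; parameter values are illustrative.
import cv2
import numpy as np

def relative_orientation(pts0, pts1, K):
    """pts0, pts1: Nx2 matched feature coordinates in two subsequent images;
    K: 3x3 camera matrix. Returns rotation R and unit-norm translation direction t."""
    E, inliers = cv2.findEssentialMat(pts0, pts1, K,
                                      method=cv2.RANSAC, prob=0.999, threshold=0.5)
    # threshold of 0.5 px reflects the sub-pixel accuracy needed by the stereo matching
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inliers)
    return R, t   # passed on to the Stereo Matching module (metric scale comes elsewhere)
```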

3.2.5 Stereo Matching

The Stereo Matching module computes dense depth maps from two consecutive and partly overlapping images, an approach also known as structure from motion [30]. It uses the semi-global matching algorithm (SGM), which is known from robotics and aerial image processing to provide accurate and dense depth maps [10,11,12]. Given two camera images with approximately 75–80% overlap and the accurate relative orientation provided by the Epipolar Geometry module, SGM can triangulate the so-called disparity for corresponding image points. As the ATON system uses calibrated cameras, it is possible to convert the disparities into metric depth values. This allows metric depth maps to be provided to the 3D Matching module.
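
The following sketch illustrates the principle with OpenCV’s semi-global block matcher and the standard disparity-to-depth relation \(z = f\,B / d\) for a calibrated, rectified image pair. It is not the ATON implementation, and the matcher parameters are illustrative.

```python
# Sketch of dense depth from a rectified image pair using OpenCV's semi-global matcher
# (related to SGM [10-12]). Not the ATON implementation; parameters are illustrative.
import cv2
import numpy as np

def depth_from_pair(rect0, rect1, focal_px, baseline_m, num_disp=128):
    """rect0/rect1: rectified 8-bit images; returns a metric depth map [m]."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=num_disp,
                                 blockSize=5, P1=8 * 5 * 5, P2=32 * 5 * 5,
                                 uniquenessRatio=10)
    disparity = sgbm.compute(rect0, rect1).astype(np.float32) / 16.0  # fixed-point -> px
    disparity[disparity <= 0] = np.nan                                # invalid matches
    # calibrated camera: depth z = f * B / d converts disparities to metric depth
    return focal_px * baseline_m / disparity
```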

3.2.6 3D Matching

The 3D Matching module provides an absolute pose estimation in the planet’s reference frame. It is based on the iterative closest point (ICP) algorithm [5, 6, 28], which can determine the offset, i.e., the relative transformation, between two 3D point clouds. The module can either use a range map from the flash LIDAR or a metric depth map from the Stereo Matching module as input. The pose estimate at the time of creation of the input data is required as an initial guess of the offset between the point clouds. The in-flight-generated point cloud is matched to a reference point cloud, which was created previously on-ground from a DEM of the fly-over area or of the landing site. First, the ICP searches corresponding points between the point clouds, and second, it estimates an optimal transformation that minimizes the distance between the correspondences. This is repeated until the optimization converges and a best guess of the pose estimate is achieved. Hence, the Epipolar Geometry module, the Stereo Matching module, and the 3D Matching module represent a sequence of consecutive modules that provide an improved absolute position estimate. Since the navigation system contains an STR providing attitude, only the position estimate is provided as output to the Navigation Filter.
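
The two alternating ICP steps described above can be sketched in a few lines. The snippet below is a minimal point-to-point variant, assuming the initial guess from the current pose estimate; a flight implementation would add outlier rejection and explicit convergence criteria.

```python
# Minimal point-to-point ICP sketch illustrating the two alternating steps described above
# (closest-point correspondences, then an optimal rigid transform). Illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def icp(source, reference, R0=np.eye(3), t0=np.zeros(3), iters=20):
    """source: Nx3 in-flight point cloud; reference: Mx3 on-ground DEM point cloud;
    (R0, t0): initial guess from the current pose estimate. Returns a refined (R, t)."""
    R, t = R0, t0
    tree = cKDTree(reference)
    for _ in range(iters):
        src = source @ R.T + t
        _, idx = tree.query(src)                  # step 1: closest-point correspondences
        ref = reference[idx]
        # step 2: optimal rigid transform (Kabsch/SVD) minimizing correspondence distances
        mu_s, mu_r = src.mean(axis=0), ref.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (ref - mu_r))
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:                 # guard against reflections
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        R, t = dR @ R, dR @ (t - mu_s) + mu_r     # accumulate the incremental transform
    return R, t
```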

3.2.7 Navigation Filter

This module uses the output of the Feature Tracker, the Crater Navigation, the Shadow Matching and the 3D Matching along with the raw IMU, altimeter and star tracker measurements to estimate the true navigation solution. The Navigation Filter is based on high-rate strap-down computation and a low-rate error-state Unscented Kalman Filter (UKF) [1]. The strap-down algorithm uses the IMU measurements to propagate the total navigation solution forward in time for each measurement. The low rate UKF estimates the error of the strap-down algorithm and corrects the propagated navigation solution based on the absolute position measurements from the other modules, the absolute attitude from the star tracker, and the altitude above the lunar surface measured by the altimeter. Additionally, the tracked image features are used in a visual SLAM algorithm [2, 3] to provide further position updates to the Navigation Filter.
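
A simplified sketch of the high-rate strap-down propagation is given below. It uses a first-order attitude update and a constant lunar gravity vector, and it omits effects such as the rotation of the Moon-fixed frame and coning/sculling corrections; the error-state UKF [1] that estimates and removes the resulting errors is not shown.

```python
# Simplified strap-down propagation driven by the IMU, as described above. Constant lunar
# gravity and a first-order attitude update are simplifying assumptions; the error-state
# UKF [1] correcting this propagation is not shown.
import numpy as np

G_MOON = np.array([0.0, 0.0, -1.62])   # lunar surface gravity [m/s^2], assumed constant

def skew(w):
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def strapdown_step(R_bn, v_n, p_n, gyro, accel, dt):
    """Propagate attitude (body-to-navigation DCM), velocity and position over one IMU sample.
    gyro: angular rate [rad/s]; accel: non-gravitational (specific) acceleration [m/s^2]."""
    R_bn = R_bn @ (np.eye(3) + skew(gyro * dt))   # first-order attitude update
    a_n = R_bn @ accel + G_MOON                   # specific force in nav frame plus gravity
    p_n = p_n + v_n * dt + 0.5 * a_n * dt**2      # position update
    v_n = v_n + a_n * dt                          # velocity update
    return R_bn, v_n, p_n
```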

4 Results

4.1 Past development milestones

During the project ATON, several milestones have been achieved since its start in 2010. These can be grouped into four phases which are:

  1. Setup of the simulation environment including the simulation of images of the navigation camera and LIDAR,

  2. Integration and verification of software modules in a Model-in-the-Loop (MiL) environment and later a Software-in-the-Loop (SiL) environment,

  3. Verification of the navigation system and elements of the system in Hardware-in-the-Loop (HiL) and Processor-in-the-Loop (PiL) test environments,

  4. Verification of the navigation system with outdoor flight tests using an unmanned helicopter testbed.

4.1.1 Setup of simulation environment

In the first step, a simulation environment was set up which included the dynamical model of a lunar landing vehicle as well as models for all sensors. To create the proper inputs for image processing methods which are part of the ATON system, an extensive simulation was set up to generate artificial images from the given state of the vehicle, the chosen camera parameters and the digital elevation models (DEM) of the lunar surface. For that purpose, the DEM maps of the Japanese Selene (Kaguya) mission were acquired and preprocessed [9, 14]. Although the Selene mission provided a global mapping, the DEM resolution is limited. For the final phases of the landing (below 2 km altitude), the noise in the DEM is dominating. For that reason, the DEM was enhanced with artificial structures which can be expected at the landing site [18].

The preprocessed and enhanced DEMs were used in the camera and LIDAR simulation [25]. It provided a 1024 by 1024 pixel monochrome image as well as a depth image of the same size. Figure 8 shows an example of a simulated image. From the depth image, a subset of 400 by 400 pixels was cut out to simulate the LIDAR measurements, and a single point of the depth image was used to simulate the laser altimeter. In addition to these time-tagged images, simulated time-tagged sensor outputs for IMU, STR and laser altimeter as well as the true state were created. Based on these sets of simulated navigation sensor outputs, the development of the different processing modules could be supported.
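
The reuse of the simulated depth image for the LIDAR and altimeter measurements can be sketched as follows; the placement of the 400 by 400 pixel window and of the altimeter sample at the image center are assumptions for illustration.

```python
# Sketch of how a simulated 1024x1024 depth image could be reused for the other range
# sensors: a 400x400 sub-window emulates the flash LIDAR, a single ray the laser altimeter.
# Window placement and the altimeter pixel are illustrative assumptions.
import numpy as np

def derive_range_sensors(depth_img, lidar_size=400):
    """depth_img: 1024x1024 array of ranges [m] from the camera/LIDAR simulation [25]."""
    h, w = depth_img.shape
    r0, c0 = (h - lidar_size) // 2, (w - lidar_size) // 2
    lidar_frame = depth_img[r0:r0 + lidar_size, c0:c0 + lidar_size]  # simulated 3D image
    altimeter_range = float(depth_img[h // 2, w // 2])               # single line-of-sight range
    return lidar_frame, altimeter_range
```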

Fig. 8 Example of a generated navigation camera image during a simulated lunar landing

4.1.2 MiL and SiL tests

For initial development and also for verification in later development stages, the image processing and navigation modules have been embedded in a Matlab/Simulink-based simulation environment. Since most of the modules are based on C/C++-coded processing libraries, the same coding language has been used; to test the modules, their code was embedded in Matlab/Simulink s-functions. The sensor models for STR and IMU have also been created in Matlab/Simulink. As described above, the simulation of images is a very extensive task. For that reason, the camera and LIDAR models in Matlab/Simulink just loaded precomputed image files into the simulation. Thus, the initial simulation environment was limited to open-loop tests in which a limited number of pre-computed trajectories, including their pre-computed images, could be used. Nevertheless, the integration into Matlab/Simulink proved to be the right way since this environment allowed easy debugging of inter-module communication and the analysis of effects that only occur in the interaction of modules. It also enabled the variation of the architecture and configuration of the navigation system.

As a further evolution, the processing modules (see also Fig. 6) were embedded in DLR’s Tasking Framework [23], which would be needed for the integration on an embedded system. The initial tests of the framework were also done in the Matlab/Simulink environment, where the complete set of processing modules including the Tasking Framework was embedded as a single s-function.

To prove function and performance in closed-loop operation, the simulation was extended by models for vehicle dynamics and actuators as well as by a guidance and control function. Furthermore, the simulation was connected to the image simulation engine to compute the camera and LIDAR images based on the current true state vector, which is influenced by the control actions. Since the computation of a single camera image took about 20 s, the closed-loop simulations became a lengthy exercise lasting several days for a single simulation of the powered descent with a duration of about 600 s of simulated time. Nevertheless, the effort to create the closed-loop environment and to run the simulations was repaid with results indicating how the control actions may influence the navigation function and performance.

4.1.3 HiL and PiL tests

Since the main part of the development focuses on image processing and optical navigation, a HiL test with a camera in the loop was chosen as an important development and verification step.

Fig. 9 Simulation of the descent orbit phase of a Moon landing trajectory in TRON. The robot positions the optical sensor (in this case a camera) with respect to the illuminated terrain model, with the sensor recording data. Simultaneously, the laser tracker precisely measures the true pose of the sensor with respect to the simulated Moon

To generate a realistic scene for a camera as experienced during a lunar landing, DLR’s Testbed for Robotic Optical Navigation (TRON) was set up (see Fig. 9). It allows test and verification of optical navigation technologies up to TRL 7 [16,17,18]. TRON offers the possibility to perform HiL tests within scenes representative of those encountered by optical sensors during exploration missions. Typical sensor hardware which can be tested in TRON includes active and passive optical sensors like LIDAR systems and cameras. The major components of the lab are a robot on a rail for dynamic positioning of the sensor under test, terrain models and other environmental structures, a dynamic lighting system for illumination of the targets, laser metrology equipment for high-precision ground truth, and a dSPACE real-time system for test observation and control and for synchronization of ground truth and sensor data.

The lab allows the acquisition of highly realistic camera images including errors and effects of the sensor which cannot be modeled easily, e.g., lens distortion or lens flares. An example of images generated in TRON is shown in Fig. 10.

Fig. 10 Lunar landing images created in TRON

For several reasons (see [16]), it is not feasible to replicate all of the lunar surface visible during the mission. Therefore, the goal was to demonstrate successful navigation during three sections of the trajectory which are most significant in terms of geometric shape and the use of optical navigation methods. These were found to be the descent orbit, the powered descent, and the landing phase. Furthermore, the project did not have access to a LIDAR sensor that could be used in the lab; therefore, the tests in the lab were limited to the camera.

TRON was used first for open-loop testing and then for closed-loop testing of the complete ATON navigation system. For both, the software simulation was modified. In Fig. 11, this is shown for one step of the closed-loop system. The software-in-the-loop test environment was changed by replacing the camera model with the real camera in the TRON environment. The simulation provided TRON with the attitude and position of the camera in the Moon-centered Moon-fixed frame (MCMF). This information was used to position the robot with respect to the lunar landscape model in TRON. Other auxiliary information such as the Sun vector was also provided to TRON in order to position the lamp properly and to simulate realistic time- and position-dependent illumination of the surface.

Fig. 11 Block diagram of the HiL closed-loop setup in TRON

The images acquired during the tests were then used in the processing chain of the ATON system together with simulated sensor data for altimeter, STR and IMU. The same images were also used to improve the individual image processing and optical navigation modules. For example, the crater navigation module was tested as a single element during the descent orbit. Figure 12 shows an example of how the position determination by crater navigation was verified.

Fig. 12 Crater navigation during lunar landing simulation in TRON. Turquoise ellipses: detected craters; pink crosses: craters in database. Overlapping symbols indicate a match between detected craters and the database used for navigation

The three sections simulated in TRON are down-scaled by ratios of 1:125,000, 1:10,000, and 1:100. Especially for the first two cases, the positioning errors of the robot can translate into large errors in the ground truth. For example, a positioning error of 1 mm with respect to the descent orbit model, which is scaled 1:125,000, translates to 125 m. To make sure that the ground truth error of position and attitude stays within reasonable bounds, the camera’s position and attitude on the robot is measured by a high-precision laser tracking system. Furthermore, the laser tracker measurement is used in an internal closed loop to reduce the positioning error of the robot down to a level of 0.3 mm and 0.2\(^{\circ }\) [17]. This allowed the operation of the ATON navigation system to be proven in closed-loop for parts of the powered descent.

As a preliminary step toward the following flight tests, the ATON software was implemented on an embedded system. In a first step, the simulated data from the MiL simulations were fed into the navigation software to prove its function and performance on the embedded system in a real-time environment. Later, the same setup was used to replay recorded flight data to analyze different software settings and processing parameters. For both steps, a real-time capable log player was developed.

4.2 Unmanned helicopter flight testing

Before conducting the flight tests, several other development steps had to be completed. First, specific flight hardware had to be integrated, i.e., interfaces and software drivers had to be developed, implemented, integrated and tested. Furthermore, the development included the design and production of targets resembling craters as well as the design, implementation and verification of the ground truth generation, together with accurate mapping of the crater targets.

As pointed out earlier in this paper, on-ground testing of GNC systems does not allow the item under test to be verified completely in a single test with the environment or on the trajectory expected in operation. The same applies to the flight tests. It is obvious that the illumination conditions on the Moon cannot easily be recreated on Earth, that the lunar landscape cannot be recreated over large areas for flight tests, and that the flight dynamics of a helicopter differ from those of a lunar landing vehicle.

The very first tests were done by mounting the whole experimental setup on a small carriage and driving it around on the ground. Later, flight tests were conducted using an unmanned helicopter. The first flight test campaign focused on recording flight data from all available sensors. This was followed by a second flight campaign where the ATON navigation system was tested in open-loop. The last test campaign was concluded in March 2017. In these tests, the ATON system was used as the primary navigation system for the autonomous flight of the unmanned helicopter. The results of this most recent test campaign are presented in the following sections.

4.2.1 Test set-up

The objective of the flight test was to demonstrate the real-time closed-loop operation of the ATON navigation system in an exploration mission scenario. The overall test concept was to fly a navigation sensor suite along a predefined reference trajectory over a field. The ground has been equipped with artificial crater-like targets that were mapped into an Earth-fixed frame. During flight, the ATON navigation system provided a navigation solution in the Earth-centered Earth-fixed (ECEF) frame which was fed back into the guidance and control system of the helicopter. The navigation solution was used to track the predefined flight trajectory.

4.2.2 Trajectory and flight apparatus

The test campaign took place near Braunschweig, Germany, at a test site offering a strip of land and a volume of restricted airspace suitable for flying unmanned vehicles over an area of about 300 m \(\times\) 300 m. The navigation payload was carried by an unmanned SwissDrones SDO 50 V2 helicopter operated by DLR (Fig. 13). This platform is capable of autonomous, assisted and remote-controlled flight, and it offers a payload capacity of approximately 50 kg (fuel plus experiment equipment).

All sensors were integrated on a single platform. The devices relevant for this paper are marked in the image of the experimental payload in Fig. 13. A tactical-grade IMU (iMAR iTraceRT-F400-Q-E, specifications in Table 4) was used for acquiring velocity and angle increments. It operated at 400 Hz.

Images were captured by two monocular, monochrome cameras (AVT Prosilica GT1380) installed in a forward-looking and a down-looking configuration; their resolution was set to 1024 pixel \(\times\) 1024 pixel at a 20 Hz sampling rate. For measuring the altitude of the platform, similar to the lander configuration in Fig. 5, a laser scanner (SICK LD-MRS) was used. The laser scanner was configured to have only a small field of view in order to emulate a laser altimeter. The STR measurements could—of course—not be acquired during daylight; therefore, they were emulated by the reference navigation system, which was also used to provide the ground truth navigation. Before the flight tests, all sensors were calibrated and aligned using sensor measurements and images of checkerboard patterns. Major changes caused by vibration and shock during transport and test flight have not been observed.

Fig. 13 Camera, IMU, laser scanner, onboard computing and ground truth hardware installed on the helicopter during flight

Fig. 14 Trajectory of one test flight (red) and crater center positions (yellow). Image background: Google Earth

Table 4 IMU (1\(\sigma\)) specifications.

Considering the experience from earlier activities with the ATON system, a position accuracy in the order of a low one-digit percentage of the (camera) line-of-sight range was assumed as a likely upper bound. Given the flight trajectory followed (Fig. 14), this translates into a ground truth accuracy requirement at centimeter level. Therefore, the helicopter payload was equipped with a high-grade GNSS receiver, a NovAtel Propak6. It uses both the L1 and L2 frequencies and the German precise satellite positioning service SAPOS. This service relies on a network of reference stations with precisely known positions to determine corrective data for all visible GPS satellites. Furthermore, two GNSS antennas were used, allowing the receiver to also determine heading and pitch angles in the North-East-Down reference system. The Propak6 output has the following \(1\sigma\) accuracies: about 0.03 m in position, about 0.4\(^{\circ }\) in heading and pitch, and about 0.03 m/s in velocity.

Fig. 15 Craters after preparation and ready for testing

About half of the available terrain in Fig. 14 was used for the flight operation. The remainder was reserved as safety perimeter, ground station and test crew area. The reference flight trajectory was defined as a linear path, stretching from north-east to south-west for about 200 m, and from an initial altitude of 50 m down to 10 m. After the slight descent on this path, the helicopter performed an almost vertical descent down to 1 m above ground. Figure 14 illustrates this profile.

Obviously, craters are necessary for the crater navigation module to work. A pattern of planar crater targets (Fig. 15) was thus scattered in a random manner over four sub-fields along the flight area. Altogether, 80 craters with diameters between 0.5 and 5 m were used. The bigger craters were situated near the beginning of the path (higher altitudes) and the smaller craters nearer to the end (lower altitudes), ensuring near-constant coverage of the camera images during the linearly decreasing altitude. After placement, the crater planes were fixed to the ground (Fig. 15). A picture of the crater scattering is shown in Fig. 16.

Fig. 16 Helicopter over the test field during flight

4.2.3 Crater catalog

Subsequent to the field preparation, a catalog of crater positions was created. The pose estimated by the Crater Navigation and processed by the Navigation Filter is relative to this reference database. Tasks such as autonomous navigation for lunar landing or near-asteroid operation require the Crater Navigation to provide a pose in the reference frame of the target body; therefore, the crater catalog was in this case expressed in the ECEF reference frame. A two-stage process was performed: first, a tachymeter (Leica TDRA-6000) was used to measure all crater centers and three auxiliary points in a local (tachymeter) frame. Then, using the Propak6, the same three auxiliary points were measured directly in ECEF. This allowed the determination of a transformation from the local tachymeter reference frame into ECEF. Applying this transformation to all measured craters yielded the ECEF crater catalog, with an accuracy at the level of 0.01–0.02 m.
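
The second stage, determining the transformation from the tachymeter frame into ECEF from the auxiliary points and applying it to all crater centers, can be sketched as a standard rigid-transform estimation (Kabsch/SVD). The snippet below is one possible way to do this, not necessarily the procedure used in the campaign.

```python
# Sketch of the two-stage catalog construction: estimate the rigid transform between the
# local tachymeter frame and ECEF from the auxiliary points, then apply it to all crater
# centers. Kabsch/SVD is one standard solution; illustrative only.
import numpy as np

def rigid_transform(p_local, p_ecef):
    """p_local, p_ecef: Nx3 coordinates of the same auxiliary points in both frames."""
    mu_l, mu_e = p_local.mean(axis=0), p_ecef.mean(axis=0)
    U, _, Vt = np.linalg.svd((p_local - mu_l).T @ (p_ecef - mu_e))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # enforce a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_e - R @ mu_l
    return R, t

def craters_to_ecef(craters_local, p_local, p_ecef):
    R, t = rigid_transform(p_local, p_ecef)
    return craters_local @ R.T + t    # ECEF crater catalog
```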

4.2.4 Configuration of ATON navigation software

As in the previous tests, the setting of the flight tests did not allow all sensors to be included. A flash LIDAR was not available to the project, and a star tracker could not be used since the flights were executed during daylight. Furthermore, the flight area comprised only flat terrain. Therefore, the 3D processing chain as well as the 3D matching with LIDAR data were deactivated for the helicopter test flights. Due to development delays, the shadow matching was also deactivated and verified in post-processing. All other modules were active. The crater navigation processed images of both cameras and ran asynchronously, starting the processing of the most recent image after the previous processing step was finished. The model-driven software development allowed flawless activation, deactivation and re-configuration.

4.2.5 Ground truth

As mentioned above, a high-end GNSS receiver was used to obtain a ground truth for the tested trajectories. To increase the accuracy of this information, the output of the Propak6 receiver was fused with IMU data in post-processing. This not only smoothed the position and velocity solutions, it also complemented the two degrees of freedom of attitude information given by the receiver (pitch and heading) to a full attitude solution. The slight observability of attitude provided by the accelerometer measurements in combination with the measured position and velocity further increased the overall attitude accuracy. The covariance levels of the kinematic states of the fused ground truth can be seen in Figs. 17, 18, and 19.

Fig. 17 Fused ground truth position quality (\(1\sigma\) covariance): x—red, y—green, z—blue

Fig. 18 Fused ground truth velocity quality (\(1\sigma\) covariance): x—red, y—green, z—blue

Fig. 19 Fused ground truth attitude quality (\(1\sigma\) covariance): roll—red, pitch—green, yaw—blue

4.2.6 Flight results

The latest flight campaign in March 2017 comprised six individual flight runs in a closed-loop setup. For each flight, the final altitude above ground was set individually; a final altitude of 0.75 m was achieved. Figures 20 and 21 show the track of the helicopter (ground truth and navigation solution) in the North-East and East-Up planes. The trajectories begin at the point (0, 0, 0), where the helicopter hovers for a short time before the beginning of the descent. It then follows an almost straight path down to an altitude of about 10 m above the landing site. From that point, the helicopter executes a vertical descent down to the final altitude of about 0.75 m. In both plots it can be seen that the true trajectory (blue) and the navigation solution of the ATON system (green) differ only by a small amount. At the end of the trajectory, the divergence gets larger when the camera loses visibility of the ground targets. The hook-shaped divergence is a combination of this effect with the climbing motion of the helicopter after the end of the experiment.

Fig. 20 Plot of the flight trajectory in the North-East plane: blue—ground truth, green—ATON navigation solution; the experiment starts at the point (0,0)

Fig. 21 Plot of the flight trajectory in the East-Up plane: blue—ground truth, green—ATON navigation solution; the experiment starts at the point (0,0)

Figures 22, 23, and 24 show the estimation errors for one of the closed-loop test flights. In Fig. 22, the estimated position errors and the corresponding estimated covariances are displayed. Furthermore, the rows of green, blue and yellow dots at − 0.5, 0 and 0.5 m indicate the state of the navigation system and the state of the closed-loop guidance and control system. The blue and green dots at 0.5 and − 0.5 m denote an update of the navigation filter by the sensor inputs: blue dots at 0.5 show updates by the forward-looking camera, and green dots at − 0.5 show updates by the down-looking camera. The yellow and green dots at 0 show which navigation solution is used for the closed loop: yellow dots indicate that the built-in GPS-based navigation system of the helicopter was used, and green dots denote that the navigation solution of the ATON navigation system was used in closed-loop. The switching between these states was done manually by telecommand; therefore, no chattering between the states was possible.

The experiment stops at time 340 s. At that point, the helicopter has reached the minimal height above ground. After reaching this experimental goal, the helicopter climbs up, and guidance and control switches back to the GPS-based navigation solution.

Fig. 22 Position error in ECEF coordinates (x—blue, y—green, z—red); dashed lines denote the estimated error covariance; dots at − 0.5, 0 and 0.5 denote the state of the system. Yellow dots denote that the GPS-based navigation system is used for controlling the helicopter

When observing the position estimation error and the covariances, it can be seen that at higher altitudes the position estimation is slightly worse than at lower altitudes. This is expected since the same angular variation in the camera image corresponds, at a higher altitude, to a larger variation in position. Towards the end of the flight, when hovering low above the crater targets on the ground, their visibility in both camera images is lost. For that reason, the updates of the navigation filter stop and the error starts to grow slightly.

For the velocity errors in Fig. 23, a similar behavior can be seen: the errors are larger at high altitude and become smaller at low altitudes. The error also starts to grow slightly when the observations from the image processing can no longer be used for the filter update. The attitude errors in Fig. 24 are independent of altitude, as can be expected.

Fig. 23 Velocity error in ECEF coordinates (x—blue, y—green, z—red); dashed lines denote the estimated error covariance

Fig. 24 Attitude error in the body-fixed frame (x—blue, y—green, z—red); dashed lines denote the estimated error covariance

5 Conclusions

This paper provided an overview of the ATON project and its most recent results from flight testing. With the last flight test campaign, it was demonstrated that the ATON navigation system can provide a navigation solution based on optical and inertial measurements in real time. It could be proven that the provided navigation solution is accurate and robust enough to close the loop for the autonomous flight of an unmanned helicopter. Throughout the project duration and while achieving several development milestones, many valuable images, data, information and lessons learned have been created, processed and collected. Furthermore, the ATON project paved the way for the verification of optical navigation sensors and components in representative environments. Hence, the creation of realistic scenes for cameras in TRON and the flight tests on the unmanned helicopter have been major milestones, and they are now available for further development steps, for the verification of mission-specific systems, and also for tests of equipment and software of the space community.

5.1 Lessons learned

Within the project many lessons have been learned. The most important are summarized here:

  • High-fidelity sensor simulation: for the proper development of image processing and navigation algorithms, a thorough knowledge and a complete representation of sensor signals is needed. This includes the simulation of realistic images.

  • Use and analyze real sensor data: for advancing the methods and algorithms as well as for making them more robust, it is essential to switch to real sensor and image data at an early point in the development. This triggers failure modes which are not apparent in simulations; if it is done late, the tests with real data may contain a few surprises.

  • Early real-time implementation: from the beginning, implement with future portability to embedded platforms in mind (e.g., independence from non-portable libraries). If this is not considered, the re-implementation for an embedded system comes at a high cost.

  • Use model-driven software development: since the core of a navigation system fusing optical and inertial sensor data is a complex piece of software, a model-driven software development is recommended [7]. It allows the interfaces of the individual modules to be controlled and adapted in a consistent way. This way, the tedious debugging of inter-module communication could be limited.

  • Test in a real-world and real-time environment: the transition to real sensors and real-time processing can hold a lot of pitfalls. If this is done for parts of the system at an early stage, it reduces the effort for bug-fixing when integrating and testing the complete, complex system.

  • Accurate ground truth data: to assess the performance in HiL or flight tests, care should be taken to create a ground truth measurement with sufficient accuracy. It should be at least one order of magnitude better than the expected accuracy.

  • Stereo matching can work for the reference mission but requires the lander’s trajectory to be as parallel to the surface as possible. Analysis showed that the stereo matching module can work in the altitude range of 10–2 km with an average depth error between 0.1 and 0.5% and an inter-image interval of approximately 10 s.

  • The 3D matching could be used from time to time as a final refinement step on pose estimates with a high confidence. It provides high accuracy when a good initialization is given, but due to the local optimization step in the ICP algorithm it is not well-suited to correct larger deviations of the actual pose.

5.2 Outlook

Although the project ATON has achieved a major milestone by demonstrating the capability of the navigation system to provide a robust and accurate navigation solution to guide and control an unmanned helicopter, the development of the system and its core software is continuing. Currently, the focus is on optimizing the software to make it more efficient and robust so that it can run on space-qualified hardware with limited computational resources. As a reference architecture, the results of the parallel project OBC-NG [4] are considered. One element of the further development will be the integration of the ATON software on the hybrid avionics architecture of OBC-NG and the transfer of parts of the image processing to FPGAs. In parallel, work is ongoing to adapt the system and its elements to different mission scenarios, including asteroid orbiters and landers as well as landings on larger solar system bodies.