Abstract
For the safe operation of active spacecraft, the space debris population needs to be scanned continuously to avoid collisions between active satellites and space debris. The low Earth orbit (LEO) region carries a particularly high collision risk, as it has the highest density of orbital debris. Laser ranging stations can deliver highly accurate distance measurements of debris objects, allowing precise orbit determination and more effective collision avoidance. However, a laser ranging station needs accurate a priori orbit information to track an orbital object. To detect and track unknown orbital objects in LEO, a passive optical staring system for autonomous 24/7 operation is developed here. The system is weather-sealed and does not require any on-site service to perform observations. To detect objects, a wide-angle imaging system with a 10° field of view, equipped with an astronomical CCD camera, was designed and set up to continuously observe the sky for LEO objects. The system can monitor and process several passing objects simultaneously without limitations. It automatically starts an observation, processes the images, and saves the 2D angular measurements of each object as equatorial coordinates in the TDM standard. This allows subsequent initial orbit determination and handover to a laser tracking system. During campaigns at twilight, the system detected up to 36 objects per hour, with high detection efficiencies for LEO objects larger than 1 m³. It is shown that objects as small as 0.1 m³ can be detected and that the estimated precision of the measurements is about 0.05°, or 7 × the pixel scale.
1 Introduction
The number of space debris objects is increasing constantly, exposing active satellites to a growing risk of collision. Even small debris particles with a size of 1 cm can cause major damage to a satellite; each collision in turn creates new fragments, and this cascading effect can drive exponential growth of the debris population. To avoid collisions between active satellites and debris, the orbital debris population needs to be scanned and cataloged constantly. To keep orbital objects cataloged, their positions need to be measured frequently and with high precision, as required for precise orbit determination. Using the resulting orbit predictions, the risk of a collision between a satellite and a debris object can be calculated and collision avoidance maneuvers can be performed. Of special interest is LEO, as it shows the highest density of debris fragments. For resident space objects (RSO) in LEO, only tracking radar or laser ranging sensors are capable of delivering sufficiently good (radar) or highly accurate (laser ranging) data for predictions. Current prediction uncertainties of debris RSOs in LEO are based on radar measurements and require a large safety margin, resulting in a 10,000:1 false alert rate [1]. Laser ranging measurements, on the other hand, offer an order of magnitude better precision in the distance measurement. This allows much better orbit prediction [2] and hence more effective collision avoidance maneuvers between an active satellite and orbital debris [1].
Due to the small field of view (FOV) of a laser tracking system and the small laser beam divergence, the RSO needs to be tracked continuously with high accuracy. This requires a priori orbit information of the RSO, which needs to be obtained by a separate sensor network. Currently, only radar systems in staring mode can fulfill this task of initially detecting unknown RSOs in LEO; their downside is high hardware and operating cost. For this reason, we developed a passive optical staring system to detect and measure unknown orbital objects in LEO for subsequent laser tracking.
We already reported in our first results [3] that such a system detects 25% of objects which could not be correlated using the publicly available TLE catalog [4]. The detection efficiency for objects with a radar cross section (RCS) between 1 and 2 m² was already 50%, and almost 100% for objects with an RCS larger than 2.25 m² [5]. We also demonstrated an instant handover to our tracking telescope (UFO) and redetected objects, such as a rocket body, within the 0.27° FOV of the tracking camera, without any a priori information [5].
While former activities required manual observation and processing, here we present a fully operational passive optical surveillance system called APPARILLO (Autonomous Passive Optical Staring of LEO Flying Objects), which is ready to contribute to a space surveillance network. It is built for 24/7 autonomous operation to detect orbital objects in LEO and export their measured tracklets in the tracking data message (TDM) format [6].
The system is the foundation of future Stare and Chase handover, where an initial orbit determination (IOD) is calculated instantly from the measurements taken by the staring sensor. This orbit prediction will be sent to a tracking telescope which can perform subsequent tracking and laser ranging. The ranging data allow precise orbit determination and cataloging of a newly detected LEO RSO. This concept was previously published [7] and protected under the utility patent DE 20 2017 101 831 U1 in Germany [8].
The current system is a subsequent improvement to ensure autonomous operation, including weather sealing, automatic data recording, and processing. The photograph in Fig. 1 below shows the current system during the observation campaign in December 2020 and illustrates the detection principle of the camera system. The camera records stars as point sources and LEO objects as streaks due to their larger angular velocity. The streak recorded in the picture is the International Space Station (ISS), and the bright object in the center is the full moon.
2 Performance estimation
A passive optical sensor benefits from the fact that the Sun illuminates the RSO and the reflected light from its surface can be detected by a camera. To detect this signal, the background illumination at the ground-based sensor needs to be low; a clear sky at night is required for successful operation. Furthermore, observations are not possible while the RSO is in the Earth's shadow, which is why, under certain observation conditions, RSOs are not detectable around midnight. This is the case, for example, for a zenith line-of-sight (LOS) in winter on the northern hemisphere.
To model the system performance, a spherical RSO illuminated by the Sun is assumed. The modeled signal-to-noise ratio (SNR) allows calculating the RSO diameter for a set of system parameters. This gives the minimum detectable RSO diameter dRSO, which is calculated as follows [9],
where ypx is the pixel size, f′ the focal length, Dopt the diameter of the aperture, and τopt the transmission of the optical system. Furthermore, τatm is the transmission of the atmosphere, QE the quantum efficiency of the detector, SNR the signal-to-noise ratio required by the algorithm, eread the read noise per pixel, Lb the background illumination, texp the exposure time, ωR the angular velocity of the RSO, RRSO the slant range to the RSO, and P(ρ, ψ) the phase function of the RSO. Please see reference [9] for more details.
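The radiometric scaling behind this formula can be illustrated with a short sketch. The following is not the exact expression from reference [9], but a simplified model under the same assumptions (diffusely reflecting sphere, streak signal limited by the per-pixel dwell time); all function and parameter names, as well as the default values, are hypothetical.

```python
import math

def streak_snr(d_rso, r_rso, omega, t_exp, d_opt, f, y_px,
               tau_opt=0.9, tau_atm=0.7, qe=0.6, e_read=10.0, e_bg=50.0,
               albedo=0.2, solar_irr=1361.0, e_photon=3.6e-19):
    """Simplified per-pixel SNR of a streak from a sunlit diffuse sphere.

    d_rso : RSO diameter [m]          r_rso : slant range [m]
    omega : angular velocity [rad/s]  t_exp : exposure time [s]
    d_opt : aperture diameter [m]     f     : focal length [m]
    y_px  : pixel size [m]            e_bg  : background electrons/pixel
    """
    # irradiance at the aperture from a diffusely reflecting sphere
    e_ap = solar_irr * albedo * (d_rso / 2) ** 2 / r_rso ** 2
    # the streak signal in one pixel is limited by the dwell time over that pixel
    t_dwell = min(t_exp, (y_px / f) / omega)
    # signal electrons collected in one pixel
    area = math.pi * (d_opt / 2) ** 2
    signal = e_ap * area * tau_opt * tau_atm * qe * t_dwell / e_photon
    # shot noise of signal and background plus read noise
    noise = math.sqrt(signal + e_bg + e_read ** 2)
    return signal / noise
```

As in the formula above, the SNR grows with aperture diameter and RSO size and drops with slant range and angular velocity.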
A more detailed analysis of the theoretical performance parameters of small telescopes for detection and tracking is given in reference [10]. These include the positional accuracy of object tracking, the limiting magnitude, and the information rate of the system. Design parameters are defined which describe optimal system performance. The metric defined for the system FOV is the instantaneous field of regard (IFOR), defined as,
where a larger value corresponds to a larger FOV and therefore covers a larger orbital volume. The limiting magnitude mv of such a system is defined as,
where Φ0 is the irradiance of a magnitude-zero object, mRSO the number of pixels occupied by the RSO, and eb the electrons per pixel caused by the background sky irradiance.
The information rate shows that the detection performance depends not only on the sensitivity but also on the FOV, the pixel scale, and the number of crossing RSOs. This metric will not be analyzed in more detail, but it shows the dependencies of the system performance in an analytical way. The information-rate metric depends on the density of RSOs per deg² nRSO, the FOV, the information objective JI, and the probability of successful RSO detection P(ΓRSO+n > SNR σn):
where the information objective JI is defined as,
The metric JI gives the relative amount of information in a single observation and is inversely proportional to the uncertainties.
Further details about theoretical system performance metrics can be found in ref. [10]. For the system under consideration, Table 1 lists the simulation parameters and results. Due to the short focal length compared to a tracking system [10], the information metric is small; in turn, the IFOR is considerably larger and allows observing a larger orbital volume. The minimum detectable RSO diameter is rather small given the limiting magnitude of this system.
3 System details
The system consists of an imaging camera with a lens, a GPS receiver for time synchronization, a computer for image recording and processing, a weather station, and a weather-proof housing. The latest observations were performed on the roof of our office building in Stuttgart Vaihingen. The core system is already operational with a camera, lens, and a notebook, which makes it easy to set up and operate almost anywhere. To avoid maintenance and manual operation, the system was extended with a camera and lens mounting for reliable pointing, and a housing for environmental protection. Environmental data measured by the weather station are used to toggle camera acquisition. Figure 2 below shows the location of and connections between the components.
3.1 Camera
The system is based on passive optical measurements, and its basic components are a camera and a lens to perform angular measurements. As camera, we use a large-area CCD (charge-coupled device) imaging sensor, the FLI PL09000, built around the ON Semiconductor KAF-09000 CCD. The large pixel size results in a large full-well capacity of 110,000 e− and thus a very good dynamic range, whereas the readout and dark noise are very low at a total of 15 e− per frame. There is also no visible pattern noise of the kind typical for CMOS (complementary metal-oxide-semiconductor) sensors. The image sensor diagonal is 51.7 mm, and the resulting FOV is listed in Sect. 3.2. Table 2 lists further camera specifications together with the settings used for observations.
The downside of these cameras is their relatively slow shutter, with a total opening time of 54 ms and closing time of 52 ms [12]. Another downside is the slow image readout and transfer, which results from the high resolution in combination with the CCD readout architecture and the USB 2.0 interface. This limits their use to long exposures; the maximum image recording frequency with our soft- and hardware was 0.2 Hz using 2 × 2 binning and 0.1 Hz at full resolution.
3.2 Lens
The lens choice is a tradeoff between a large aperture and a short focal length. A short focal length results in a larger FOV, which covers a larger orbital volume. A larger aperture diameter results in a higher sensitivity of the system and therefore better detectability of smaller (fainter) objects.
To keep system size and cost small, we decided to use a common single-lens-reflex (SLR) medium telephoto lens. These are commercially available, have very good image quality across a large image circle (even beyond their designed 43 mm), and are affordable compared to their alternatives. Table 3 below lists the specifications of the lens used during the latest observations, including the resulting angular properties with the camera described in Sect. 3.1 (see Table 2).
A picture of the camera and lens mounted inside the weather-sealed housing can be seen in Fig. 3 (more details follow in Sect. 3.5).
For this lens, the image quality is very good across the entire image frame, resulting in a point spread function of a star covering only a few pixels, see Fig. 4. The lens shows an illumination falloff toward the image corners; sample images are shown below (e.g., Fig. 6). Given the small FOV, the vignetting results mainly from the optical construction, where the entrance pupil is obscured by the borders of lens elements, which is typical for lenses with a small focal ratio (f#) [14]. Even without this, vignetting would be present due to the cosine-fourth-power law. A degradation of the image quality by the window could not be measured or observed. Figure 4 contains two cropped images, one from the image center and the other from the outer corner of the image; it can be seen that stars are recorded as symmetric points across the entire image and measure about 1.5 px in FWHM.
3.3 GPS synchronization
For GPS synchronization, an Arduino-based GPS timer was developed; it consists of an Adafruit Ultimate GPS receiver and an Arduino Uno microcontroller [15]. It is used to record the UTC timestamp of an incoming TTL pulse from the camera: the microcontroller compares the incoming TTL pulse with the PPS signal provided by the GPS receiver. This way, it measures the time of a TTL signal with 100 µs precision [16].
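The timing logic can be sketched as follows: the microcontroller latches a free-running microsecond counter at both the PPS edge and the camera trigger edge, and the UTC time of the trigger follows from the difference. This is a minimal illustration in Python rather than the actual Arduino firmware; names are hypothetical, and the wrap-around of the counter is only handled modulo one second.

```python
def tag_trigger(pps_utc_second, pps_micros, trigger_micros):
    """UTC time [s] of a camera TTL trigger edge.

    pps_utc_second : UTC timestamp (whole second) of the last GPS PPS edge
    pps_micros     : microsecond counter latched at that PPS edge
    trigger_micros : microsecond counter latched at the trigger edge
    """
    # offset of the trigger after the last PPS, modulo one second
    dt_us = (trigger_micros - pps_micros) % 1_000_000
    return pps_utc_second + dt_us / 1e6
```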
3.4 Weather station
As weather station, the Boltwood Cloud Sensor II by Diffraction Limited is used. The weather station records the temperature difference between the sky (using an infrared sensor) and the ambient temperature, the light level, and whether it is raining. This information is used to verify clear sky, darkness, and absence of precipitation, respectively. The following conditions are calculated from the weather data:
- Clear sky: True, if the difference between sky and ambient temperature is below −28 K.
- Darkness: True, if the light level is below 10 a.u.
- Dry: True, if the rain value equals 0.
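These three flags, combined with the time constraint described in Sect. 3.6, gate the image acquisition. A minimal sketch of the gating logic (function and parameter names are hypothetical; the thresholds are those listed above):

```python
def observation_allowed(sky_minus_ambient_k, light_level, rain,
                        weather_available=True, in_dark_window=True):
    """Gate image acquisition on the weather flags and the time constraint."""
    if not in_dark_window:      # sun higher than about -6 deg elevation
        return False
    if not weather_available:   # fall back to time-only gating
        return True
    clear_sky = sky_minus_ambient_k < -28.0   # sky much colder than ground
    darkness = light_level < 10               # light level in arbitrary units
    dry = rain == 0
    return clear_sky and darkness and dry
```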
3.5 Weather-proofed housing
The weather-proofed housing was developed by Raymetrics and is an adaptation of a wind LIDAR system. Its IP68 rating protects the equipment from the environment. The head is weather-sealed and has a viewing window. Blowers in front of the window as well as a thermoelectric cooler (TEC) prevent condensation on the window and control the temperature and humidity; this ensures that each component runs within its specified operating conditions. The sensor head contains the camera (Sect. 3.1) and lens (Sect. 3.2) for recording the images, see Fig. 3. The head can be moved between 90° and −90° with respect to the horizon. It points upwards to zenith (or any other suitable elevation angle) during observation and is moved downwards when the image recording is stopped. This protects the viewing window from precipitation and prevents the Sun from accidentally being focused onto the image sensor or shutter blades of the camera.
The main compartment, or cabinet, is the control unit, which contains the main controller of the enclosure (SMU200), a 1U UPS, and a power distribution panel. These are mounted on a sliding 19″ rack with 4U of free space to accommodate our workstation computer. On top of the enclosure, the Boltwood weather station, the GPS antenna, and the power supply unit box (which provides DC current to the electronics) are mounted. Similarly, the enclosure on the head is environmentally sealed and temperature-controlled. Figure 5 shows the housing and its components with the sensor head pointing upwards.
3.6 Image recording
Image recording is performed with OOOS (Orbital Objects Observation Software), which has been developed by our department for satellite laser ranging (SLR) activities and is highly modular [17]. The software records the images taken by the camera and includes information about the observation location, the observation line-of-sight (LOS), the UTC timestamp from our GPS image timer, and metadata of the camera settings and optics used.
A module named staring daemon handles the automatic image acquisition depending on the data provided by the weather station and the current time. The setup without any optical filters in front of the lens allows observations at sun elevations of about −6° and lower; this time window is calculated automatically. The sensor head is moved upwards to point towards the sky and image acquisition is started if all of the weather conditions (Sect. 3.4) and the time constraint are satisfied. If one of the conditions is not satisfied, the image acquisition is stopped and the head moves downwards. If the weather station data are not available, all weather conditions are considered satisfied and image recording is started and stopped based on the computer's system time only; the image processing then has to handle bad images. More on image processing, astrometric calibration, and TDM export of tracklets is described in Sect. 4. While image recording is running, astrometric calibration is performed regularly to determine the exact pointing direction of the camera system. Typical sample images are shown in Fig. 6.
4 Software structure
The image recording is managed by the so-called Staring Daemon, a Python 3 program based on the OOOS software package [17]. The Staring Daemon handles the connected hardware:

- Weather station,
- GPS timer (Arduino microcontroller),
- Camera,
- Enclosure (Raymetrics),

starts the image acquisition, and handles the data export.
4.1 Staring Daemon
All parts of the software are separated; the hardware interfaces in particular are worth mentioning, as this means the software is not hardware-bound, allowing the user or system designer to select hardware independently. The weather station is controlled by the Environment Daemon, which records the weather information. The camera is controlled by the Acquisition Process, which is also connected to the GPS timer. The Staring Daemon itself is connected to these processes and daemons using high-level commands. It reads the recorded weather data and toggles the image recording and sensor head position depending on the weather conditions and time. The Staring Daemon also handles the communication with the image processing program and hands over information and measurements to be uploaded to our website by the Internet Daemon. Figure 7 gives an overview of the external programs (Space Debris [18], Astrometry.net [20]), the OOOS daemons, the OOOS processes, and the data transfer between them.
4.2 Image processing
Image processing is performed by a separate program written in C++. The software was developed in cooperation with Kormoran Technologie GmbH and is simply called “Space Debris” [18]. It reads the recorded FITS image files and uses OpenCV [19] algorithms to process the images. A major challenge for the processing is the presence of clouds in the images, see Fig. 8 below. Combined with stray light from artificial light sources, this previously caused too many false-positive detections [3]. For autonomous operation, the new software was included, and false-positive detections were rejected completely during our campaigns in December 2020.
To detect objects in the images, the background intensity profile is determined by first filtering high frequencies from the image. The result is subtracted from the original image to remove the background. This process is performed iteratively, which removes clouds and other intensity variations. The image can then be binarized to separate the objects (e.g., stars or streaks) from the background. Stars and streaks are distinguished by their size and shape, and the pixel coordinates of the stars and streaks are measured subsequently. The star positions are used to perform astrometric calibration, which allows converting the streak coordinates into equatorial coordinates. As astrometric software, the engine of the Astrometry.net project is used [20], installed as a separate program. Up to this stage, all calculations are done separately for each image.
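The background removal and binarization steps can be sketched as follows. This is an illustrative NumPy reimplementation, not the actual OpenCV-based code of the “Space Debris” software; the box size, iteration count, and threshold factor are hypothetical.

```python
import numpy as np

def remove_background(img, box=16, n_iter=3):
    """Iterative background removal sketch.  The smooth background estimate
    is pushed below point sources by repeatedly taking the per-pixel minimum
    of the current estimate and its tile-averaged (low-pass) version."""
    res = np.asarray(img, dtype=float)
    bg = res.copy()
    h, w = bg.shape
    hh, ww = (h // box) * box, (w // box) * box
    for _ in range(n_iter):
        # coarse low-pass: mean over box x box tiles, upsampled back
        tiles = bg[:hh, :ww].reshape(hh // box, box, ww // box, box).mean(axis=(1, 3))
        smooth = np.kron(tiles, np.ones((box, box)))
        bg[:hh, :ww] = np.minimum(bg[:hh, :ww], smooth)
    return np.clip(res - bg, 0.0, None)

def binarize(img, k=5.0):
    """Threshold at k robust sigma above the residual background level."""
    med = np.median(img)
    sigma = 1.4826 * np.median(np.abs(img - med)) + 1e-9
    return img > med + k * sigma
```

Applied to a synthetic frame with a smooth sky gradient and one bright point source, the gradient is suppressed and only the source survives the threshold.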
4.3 Data export
In the next step, the equatorial coordinates of the streaks observed in several different images are combined into traces based on their angular velocity and direction in the sky. Finally, a straight line with constant angular velocity is fitted to the observed data; this fit of the measured coordinates is used to remove any outliers. The results are written into a “Tracking Data Message” (TDM) file [6], which allows data sharing with other stations or databases. These unidentified TDM files are uploaded directly to our web database using the Internet Daemon, which provides the connection for interchanging the data with a separate subsystem. Section 5 shows the resulting data of the first unsupervised campaign of the fully autonomous staring system.
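The constant-velocity fit with outlier rejection can be sketched like this (an illustrative version; the k-sigma rejection criterion and all names are hypothetical):

```python
import numpy as np

def fit_trace(t, ra, dec, k=3.0):
    """Fit RA(t) and Dec(t) as straight lines (constant angular velocity)
    and flag measurements whose residual exceeds k sigma as outliers."""
    t = np.asarray(t, float)
    ra = np.asarray(ra, float)
    dec = np.asarray(dec, float)
    pra = np.polyfit(t, ra, 1)    # [slope, intercept] for RA
    pdec = np.polyfit(t, dec, 1)  # [slope, intercept] for Dec
    # total angular residual of each point against the fitted line
    r = np.hypot(ra - np.polyval(pra, t), dec - np.polyval(pdec, t))
    keep = r <= k * (r.std() + 1e-12)
    return pra, pdec, keep
```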
5 Results
Compared to manual operation [3], the system can now use every minute with good conditions to observe. We operated the system continuously between November 20th and December 23rd. The weather covered classical German winter conditions, including storm, rain, fog, frost, and snow. The conditions were far from optimal, making this a worst-case scenario for the system. During the campaign, APPARILLO was placed on the roof of our office building as shown in Fig. 1. The observation direction was fixed in the horizontal reference frame over the entire campaign. Zenith was chosen as LOS because it should give the best performance [9]. The system settings, properties, and geodetic coordinates are listed in Table 4.
Figure 9 shows a series of 89 images combined to show how streaks are recorded by the system. It contains low altitude LEO RSO as long streaks, high-altitude LEO RSO as short streaks and rotating RSO with visible intensity variations.
Two more samples of a combined series of images containing a single object in the FOV are shown in Fig. 10 below.
The cloud sensor does not prevent the occurrence of clouds in the images. Especially transparent high clouds were often recorded, see Fig. 11.
The image processing provides many adjustable parameters which affect the sensitivity and the false-negative and false-positive detection rates. The settings were chosen to obtain zero false-positive detections. During the latest campaign, this requirement was fulfilled, although due to the very low air traffic at that time (caused by the COVID-19 pandemic), no aircraft crossed the LOS. To our current knowledge, aircraft are the only case in which the system might falsely detect an RSO.
However, false-negative detections were caused by faint objects or rotating RSOs with large intensity variations (as shown in Fig. 10, right). These intensity variations result in higher moments of the grayscale streak, which causes a false rejection. This kind of exclusion is implemented because background stars that coincide with a streak induce misplaced detections. Currently, we cannot provide an exact number for the actual false-negative rate, as human observers show large variations in detecting streaks from a stack of images that can reach 8500 images per night. In a manual review of 15% of the data, a false-negative rate of 4% was observed; these were streaks which a human could detect but the image processing could not. It should be noted that the human, on the other hand, missed about the same number of streaks which the software detected properly.
For each detected object, the corresponding images were reviewed to validate the measurements. Next, the recorded TDM data are analyzed to review the performance of the system in more detail.
5.1 Detection statistics
During the campaign, the system ran continuously without any interruptions. Table 5 lists how many objects (TDM files) were recorded each night. Furthermore, the number of images recorded, the number of objects identified using the CelesTrak TLE catalog (more details follow at the end of this section), and the resulting number of unidentified objects are listed. An object was correlated with the available TLE predictions [4] and considered identified when all measured coordinates deviate by less than 1° from the prediction. Prediction uncertainties of TLEs have been reported to be as large as 517 m in-track, with average residuals of 137 m across the catalog for LEO [21]. Considering a 3σ limit, this results in a maximum angular radius of 0.15°.
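The correlation criterion can be sketched as an angular-separation test between time-aligned measured and predicted coordinates (an illustrative version; names are hypothetical, and the 1° tolerance is the one quoted above):

```python
import math

def identified(meas, pred, tol_deg=1.0):
    """True if every measured (RA, Dec) point lies within tol_deg of its
    TLE-predicted counterpart.  Coordinates in degrees; the two lists must
    be aligned in time."""
    for (ra1, d1), (ra2, d2) in zip(meas, pred):
        a1, b1, a2, b2 = map(math.radians, (ra1, d1, ra2, d2))
        # angular separation on the sphere (spherical law of cosines)
        cos_s = (math.sin(b1) * math.sin(b2)
                 + math.cos(b1) * math.cos(b2) * math.cos(a1 - a2))
        sep = math.degrees(math.acos(max(-1.0, min(1.0, cos_s))))
        if sep > tol_deg:
            return False
    return True
```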
Compared to the former observation campaign from 2015 [3], the fraction of unidentified objects has been reduced by about one third (15% vs. 23%). The former observations were performed manually and only during good conditions, i.e., a clear night sky with the LEO RSOs illuminated by the Sun while the observer is in the Earth's shadow. This is the case from about 1 h after sunset for 4 h, and from 5 h before sunrise until 1 h before sunrise. The current observations are performed across the entire night depending on the weather conditions, which results in no detections around midnight but many detections after sunset and before sunrise. Figure 12 shows the hourly detection rate of all nights between November 20th and December 23rd. Days without any detections are excluded. The detection rate is shown over the time of day in 1-h steps.
During good conditions, the detection rate is about 30 objects per hour, which is 20% higher than previously measured (24.2 obj/h) [3]. This is due to the larger population of LEO RSOs compared to 2015. The largest measured detection rate was 36 objects per hour. It can be seen that the system can only detect objects during the terminator phase (twilight hours), i.e., between 2 am and 6 am UTC in the morning and between 4 pm and 7 pm UTC in the evening; around midnight (11 pm UTC), the system cannot detect any RSO. It should be noted that the observation conditions varied during the campaign, but the system was capable of recording objects as soon as there was a gap in the clouds, as was the case on the mornings of December 16th and 18th and the afternoons of December 13th, 19th, and 22nd. Figures 13 and 14 show the measured weather conditions and the corresponding detection rates for each day in separate graphs. In each graph, there are four separate rows showing the conditions: dry (violet)/rain (black), darkness (magenta)/light (black), clear sky (orange)/cloudy (black), and whether the observation was started (yellow). The yellow bar thus shows at which times the observation was started, and the resulting detection rate is shown in blue on the right y-axis over the time of that day. Due to space constraints, the presented data cover only 20 days.
The system performed very reliably in judging the weather conditions, and observations were started automatically. When weather conditions were good during the terminator phase, the detection rate went up to 36 objects per hour on December 14th but mostly ranged between 25 and 30 objects per hour. The only unexpected behavior we observed was an incorrect reading of the sky-to-ground temperature difference when the weather sensor became wet. This can be seen as fine (orange) lines in the clear sky condition in Figs. 13 and 14, and it caused the system to falsely start observations for a few minutes. Remarkably, the image processing handled those situations without a single false-positive detection. To get more details on the system performance, detected objects need to be identified; these results are presented in Sect. 5.2.
5.2 Size of detected objects
All detected objects were compared with CelesTrak's TLE [4] and SATCAT (satellite catalog) [22] catalogs. The TLE catalog is operated and maintained by NORAD (North American Aerospace Defense Command) and contains publicly available predictions of RSOs. The SATCAT catalog contains supplementary information, like the RCS (radar cross section) or launch date of those RSOs. All 823 detected RSOs were compared with every object in the TLE catalog for identification, and the SATCAT was used to obtain the RCS where available. The SATCAT does not provide information for all 704 identified TLE objects, which reduces the amount of data to be analyzed to 680. The detection statistics grouped by RCS are shown as a histogram in Fig. 15 and in relation to the (predicted) object distance during observation in Fig. 16.
Figure 15 shows that objects with an RCS in the range between 0.1 and 20 m² were detected, and that the number of detected objects peaks around 1 and 2.2 m². As the total number of RSOs increases with smaller object size, this indicates that the number of undetected objects rises for objects with an RCS smaller than 2 m². In Fig. 16, it can be seen that there is no direct correlation between an RSO's RCS or range and the radial angular displacement of the measurements from the predicted positions. The residual deviation of the measured angles of a single object also does not correlate with RCS or range (angular velocity). In Sect. 5.3, the angular measurements are analyzed in more detail.
Furthermore, the detected objects can be compared to the cataloged NORAD objects which crossed the FOV during observation. This is also performed using the TLE orbital data and allows calculating the detection efficiency relative to all cataloged objects. The detection efficiency gives a figure of merit for how well the system can detect a certain object size. Figure 17 shows a histogram of detected objects in red and crossing objects in blue according to TLE predictions [4] for different RCS values [22]; the bottom half shows the resulting detection efficiency.
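The detection efficiency of Fig. 17 is simply the per-RCS-bin ratio of detected to crossing objects. A minimal sketch of this bookkeeping (bin edges are illustrative; names are hypothetical):

```python
import numpy as np

def detection_efficiency(rcs_detected, rcs_crossed, bins):
    """Per-RCS-bin detection efficiency: detected / crossed.

    Returns (efficiency, n_detected, n_crossed); bins without any
    crossing object give NaN."""
    n_det, _ = np.histogram(rcs_detected, bins=bins)
    n_cross, _ = np.histogram(rcs_crossed, bins=bins)
    eff = np.where(n_cross > 0, n_det / np.maximum(n_cross, 1), np.nan)
    return eff, n_det, n_cross
```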
Figure 17 shows that the detection efficiency for objects with an RCS larger than 1 m² is about 50%. Even though objects with an RCS as small as 0.1 m² were detected, the detection efficiency there is effectively 0% due to the large population of smaller RSOs.
ESA's DISCOS catalog [23] allows obtaining physical properties and the optical cross section (OCS) of RSOs. The catalog does not provide data for every identified object, but data were available for 410 RSOs. This allowed evaluating the system's detection capability with respect to RSO dimensions, OCS, and mass. In the following, all objects are sorted in ascending order of volume. Figure 18 shows the volume and mass, Fig. 19 the dimensions (length, height, depth), Fig. 20 the range, apogee, and perigee, and Fig. 21 compares the RCS to the OCS for each object. Additionally, a histogram shows the distribution of each dataset on the y-axis of the graph.
The detected RSOs range between 40 kg and 4 t in mass and between 0.045 and 550 m³ in volume. The dimensions of the objects are shown in Fig. 19. Except for the smallest object (Transit 12 (NNS O-8), NORAD 2119, COSPAR 1966-024A), the dimensions are of the same order of magnitude. As a general rule, the system mostly detects objects the size of a dishwasher and larger.
The majority of detected objects measure a few meters in size. The satellite with the smallest volume measures 30 m in depth; hence, its OCS is larger than that of the next larger objects, which measure about 1 m in each dimension.
Figure 20 shows the apogee, perigee, and the range calculated from the TLE predictions for each identified object, in order to show whether the detectability correlates with the orbit's apogee and perigee.
It can be seen that the detected size does not depend on the range to the system. Most objects are detected at ranges around 1000 km, with a few objects as low as 550 km and up to 4500 km. With a few exceptions, the orbits are mostly circular.
The DISCOS catalog also provides a range of OCS values, namely the minimum, average, and maximum OCS. This represents how the OCS varies, as many factors have an impact on the RSO brightness, like phase angle, material, shape, and size. Figure 21 compares the RCS with these three OCS values for each identified object.
The data show that the OCS, RCS, and dimensions are generally correlated, with the OCS more often larger than the RCS. Both RCS and OCS track the actual dimensions (volume) of each RSO; compare Fig. 21 with Figs. 18 and 19. The RCS statistics shown in Fig. 17 are therefore a good way of illustrating the system performance and allow the dimensions of detected objects to be estimated.
Section 5.3 evaluates the angular measurements of the detected and identified objects in more detail.
5.3 Angular measurements
The available TLE prediction data are also used to compare the recorded angular positions with predictions. Two sample visualizations of a random subset of detected RSO are shown in Figs. 22 and 23. Each shows the measured equatorial coordinates of about two dozen RSO together with the corresponding predicted positions calculated from TLE data [4] of that day. The selected range covers the entire RA-Dec range observed that day.
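The displacement between a measured and a predicted position is a great-circle separation on the sky. A minimal helper to compute it from equatorial coordinates in degrees (a standard Vincenty-style formula, not the paper’s own code):

```python
import math

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between two equatorial
    positions (RA/Dec in degrees), numerically stable via atan2."""
    ra1, dec1, ra2, dec2 = (math.radians(x) for x in (ra1, dec1, ra2, dec2))
    dra = ra2 - ra1
    num = math.hypot(
        math.cos(dec2) * math.sin(dra),
        math.cos(dec1) * math.sin(dec2)
        - math.sin(dec1) * math.cos(dec2) * math.cos(dra),
    )
    den = (math.sin(dec1) * math.sin(dec2)
           + math.cos(dec1) * math.cos(dec2) * math.cos(dra))
    return math.degrees(math.atan2(num, den))

# A 0.05 deg offset in declination yields a 0.05 deg separation:
print(round(angular_separation_deg(10.0, 45.0, 10.0, 45.05), 4))  # → 0.05
```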
The displacement between measurement and prediction shows a constant offset for every object measured. The magnitude and direction of the angular offset vary between objects. For the analysis, the angular displacement is separated into in-track and cross-track components. Figures 24 and 25 show, for single objects, a subset of the in- and cross-track angular displacement distributions between measurement and TLE prediction.
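One way to carry out this separation (a sketch under a small-angle tangent-plane approximation, not the paper’s implementation): project the measured-minus-predicted offset onto the local direction of motion of the predicted track, with the RA difference scaled by cos(Dec):

```python
import numpy as np

def in_cross_track(ra_meas, dec_meas, ra_pred, dec_pred):
    """Split the angular displacement (measured - predicted) of one
    pass into in-track and cross-track components, angles in degrees.

    Small-angle tangent-plane approximation; the direction of motion
    is estimated from consecutive points of the predicted track.
    The cross-track sign convention is arbitrary.
    """
    ra_m, dec_m, ra_p, dec_p = map(
        np.radians, (ra_meas, dec_meas, ra_pred, dec_pred))
    # Tangent-plane displacement; RA component scaled by cos(Dec).
    dx = (ra_m - ra_p) * np.cos(dec_p)
    dy = dec_m - dec_p
    # Unit vector along the predicted direction of motion.
    tx = np.gradient(ra_p) * np.cos(dec_p)
    ty = np.gradient(dec_p)
    norm = np.hypot(tx, ty)
    tx, ty = tx / norm, ty / norm
    in_track = np.degrees(dx * tx + dy * ty)
    cross_track = np.degrees(dy * tx - dx * ty)
    return in_track, cross_track
```

For a pass moving purely along RA at Dec = 0, an RA offset maps entirely to the in-track component and a Dec offset entirely to the cross-track component, as expected.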
Each single distribution is offset, and the offset is mostly larger than the deviation of the angular measurements of a single object. In general, the displacement offsets of all objects scatter around zero. For most measurements the displacements are evenly distributed, although other shapes occur. In- and cross-track errors (offset and deviation) are in general of equal size for each object. The median angular deviation over all measured objects is just 0.0041°, while the per-object offsets varied from −0.07° up to 0.05°. The constant offset observed between measurements and TLE predictions reached up to about 0.1°.
Figure 26 shows histograms of the in- and cross-track angular displacements for all measurements taken by APPARILLO during the December campaign. Both displacements scatter around 0, indicating that no systematic error causes the uncertainties. The expected residuals of TLE predictions are larger in-track than cross-track (0.049° vs. 0.013°) [21]. The observed displacements, however, show very similar distributions, which suggests that the observed cross-track displacement is dominated by the measurement error of the system.
Due to the travel time of the light and the velocity of the RSO, an angular aberration is present: during the light travel time, the object has already moved. Figure 27 shows the time delay and the corresponding angular error of our system as a function of orbit height (assuming a circular orbit). For the observed objects, the angular aberration due to the RSO’s velocity ranges from 0.0012° to 0.0015° and therefore does not explain the observed displacement offsets of the distributions. In addition, no systematic error radial from the image center could be observed, which shows that lens distortion is not causing the angular error.
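The magnitude of this effect can be reproduced with a short calculation (a sketch using standard constants, not the paper’s code): during the light travel time over range r the object moves v·r/c, so the angular offset v·t/r reduces to v/c, independent of range for a given orbital speed.

```python
import math

MU_EARTH = 398600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6371.0         # km, mean Earth radius
C_LIGHT = 299792.458     # km/s, speed of light

def travel_time_aberration_deg(height_km):
    """Angular offset in degrees accumulated during the light travel
    time, for a circular orbit at the given height: v*t/r = v/c."""
    v = math.sqrt(MU_EARTH / (R_EARTH + height_km))  # circular speed
    return math.degrees(v / C_LIGHT)

for h in (550, 1000, 4500):
    print(f"{h:5d} km: {travel_time_aberration_deg(h):.5f} deg")
# roughly 0.0015 deg at 550 km down to 0.0012 deg at 4500 km,
# matching the range quoted above
```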
Due to the large uncertainties of the TLE predictions, which range from 0.01° for small inclinations in LEO up to 0.05° for large inclinations in LEO [21], these results do not directly yield the angular in- and cross-track precision. The mean angular standard deviation of the angular measurements of an object was 0.0041°, showing that streak positions are measured with sub-pixel accuracy. The mean angular displacement to the prediction, however, is rather large at about 0.05° and can be considered a first approximation of the angular precision. This is about 7 × larger than the pixel scale of the system (0.007°).
More precise orbit predictions are required to determine the absolute precision of the system. Unfortunately, more precise CPF [24] predictions were available for only two of the detected objects. For these two, the in- and cross-track angular displacement distributions are shown in Fig. 28.
Both objects show a mean angular displacement similar to that observed previously (Figs. 24, 25), which, at 0.02° (Jason-3, NORAD 41240, COSPAR 2016-002A) and 0.05° (SARAL, NORAD 39086, COSPAR 2013-009A) between measured and predicted positions, is larger than the deviation of the distribution. The standard deviation of the angular measurements is 0.003° for Jason-3 and 0.006° for SARAL. The same was observed for the majority of objects in the preceding analysis using TLE predictions (Figs. 24, 25). These displacements are up to 7 × the system’s pixel scale of 0.007°/px (see Table 4), so the estimated system precision is about 0.05° or 7 × the pixel scale. These are, however, far too few measurements to conclude the final precision of the staring system.
6 Conclusion
A fully functional staring system was presented, which operated reliably between November 20 and December 23, 2020, even under harsh winter weather conditions. The system operated 24/7 during this observation campaign, and no false-positive detections occurred, which was the major design target of the newly developed image processing. Under optimal observation conditions, the system detected 823 LEO objects within this period, with a detection rate of up to 36 objects per hour. Multiple detections within one frame can be handled without any constraints. About 15% of observed objects could not be identified using the TLE database [4]. This demonstrates that the presented system is capable of detecting unknown objects and, via the TDM format, can effectively support subsequent handover to existing databases or processing pipelines. The detection performance was evaluated using the available SATCAT RCS and DISCOS data of identified objects. Figures 15 and 16 show that objects smaller than 1 m2 RCS can already be detected, which is a very good performance for such a small system. However, the fraction of detections smaller than 0.8 m2 RCS remains below 10%, see Fig. 17. The smallest objects detected measured 0.2 m2 in OCS, 0.25 m2 in RCS (see Fig. 21), and 0.045 m3 in volume (see Fig. 18). This is about 2 × larger than theoretical predictions (see Table 1).
The mean angular deviation of a measured object was 0.0041°, with a mean angular distance of about 0.05° to the predicted positions. Comparison with precise CPF predictions confirmed these numbers and that the precision of the system can be estimated as 0.05° or 7 × the pixel scale.
Angular data quality will improve further with better time synchronization of the image exposures and a velocity aberration correction. Improvements of the image evaluation algorithms should suppress the majority of false-negative detections and approach near-unity detection efficiency, e.g., by implementing more sophisticated image processing such as that presented by Vananti et al. [25]. A different weather station is recommended to eliminate false clear-sky identifications and short image recordings during cloudy conditions.
The modular and COTS approach allows simple reproduction of this sensor, which is easy to operate within any space surveillance network. It also makes it easy to adapt the system to future or customer needs and to extend it with the latest developments, such as new processing techniques. Synthetic tracking is worth mentioning, as it has already shown promising results [26] and approaches real-time processing with the latest consumer GPUs. It would allow an order of magnitude higher sensitivity using the same optical components. This system is the basis for future developments to extend space surveillance networks with a small low-cost sensor. Stare and chase with subsequent laser ranging enables immediate cataloging capabilities and has already been demonstrated [27].
Notes
Raymetrics S.A., 32 Spartis Str, Metamorfosis, GR-14452, Athens, Greece, www.raymetrics.com.
Alte Poststraße 24, 88690 Uhldingen-Mühlhofen, Germany, www.kormoran-tech.com.
References
Krag, H., Setty, S.J., Di Mira, A., Zayer, I., Flohrer, T.: Ground-based laser for tracking and remediation—an architectural view. IAC-18-A6.7.1. Proceedings of the 69th international astronautical congress (IAC) (2018)
Bamann, C., Hugentobler, U.: Accurate orbit determination of space debris with laser tracking. In: Flohrer, T., Schmitz, F. (eds.) Proceedings of the 7th European conference on space debris (2017)
Wagner, P., Hampf, D., Riede, W.: Passive optical space surveillance system for initial LEO object detection. In: Proceedings of 66th international astronautical conference, IAC-15.A6.3x28491 (2015)
CelesTrak: NORAD Two-Line Element Sets. www.celestrak.com/NORAD/elements/ (2020). Accessed 5 Dec 2020
Hasenohr, T., Hampf, D., Wagner, P., et al.: Initial detection of low earth orbit objects through passive optical wide angle imaging systems. Proceedings of Deutscher Luft- und Raumfahrtkongress. https://www.dglr.de/publikationen/2016/420008.pdf (2016). Accessed 25 June 2021
CCSDS: Tracking Data Message—RECOMMENDED STANDARD CCSDS 503.0-P-1.1. https://public.ccsds.org/Lists/CCSDS%205030P11/503x0p11.pdf (2018). Accessed 21 December 2020
Wagner, P., Hasenohr, T., Hampf, D., Sproll, F., Humbert, L., Rodmann, J., Riede, W.: Detection and laser ranging of orbital objects using optical methods. Proceedings Volume 9977, Remote Sensing System Engineering VI; 99770D (2016)
Wagner, P., Hampf, D., Sproll, F., Humbert, L., Hasenohr, T.: System zur Bestimmung und/oder Vorhersage einer Position und/oder einer Flugbahn von orbitalen Objekten im Weltraum. Gebrauchsmusterschrift, DE 20 2017 101 831 U1, https://patents.google.com/patent/DE202017101831U1/en (https://register.dpma.de/DPMAregister/pat/PatSchrifteneinsicht?docId=DE202017101831U1) (2017). Accessed 22 Jan 2021
Wagner, P., Hampf, D., Sproll, F., Hasenohr, T., Humbert, L., Rodmann, J., Riede, W.: Passive optical link budget for LEO space surveillance. Technical proceedings of 18th AMOS conference, https://amostech.com/TechnicalPapers/2017/Poster/Wagner.pdf (2017). Accessed 26 Jan 2021
Coder, R.D., Holzinger, M.J.: Multi-objective design of optical systems for space situational awareness. Acta Astronaut. 128, 669–684 (2016)
Finger Lakes Instruments: ProLine 09000 data sheet. http://www.flicamera.com/spec_sheets/PL09000.pdf (2018). Accessed 2 Aug 2018
Vincent Associates: Uniblitz CS65 datasheet. https://www.uniblitz.com/wp-content/uploads/2020/05/uniblitz-cs65-datasheet.pdf (2020). Accessed 5 Nov 2020
Preisvergleich Internet Services AG: Canon EF 200mm 2.0 L IS USM weiß (2297B005). https://geizhals.de/canon-ef-200mm-2-0-l-is-usm-weiss-2297b005-a292286.html (2020). Accessed 05 Dec 2020
Gross, H.: Handbook of optical systems: vol 1—fundamentals of technical optics. Wiley, Weinheim (2005)
Hampf, D.: GPS camera timer. (15 Dec 2016)
Hampf, D., Wagner, P., Riede, W.: Optical technologies for observation of low earth orbit objects. Proceedings of 65th international astronautical conference. pp. IAC 14,A6,P,70,x24899 (2014)
Hampf, D., Sproll, F., Hasenohr, T.: OOOS: a hardware-independent SLR control system. In ILRS technical workshop, Riga (2017)
Haussmann, R., Wagner, P., Clausen, T.: Streak detection of space debris by a passive optical sensor. Proc. of 8th European conference on space debris (2021)
OpenCV team: OpenCV Website. https://opencv.org/ (2020). Accessed 24 Dec 2020
Astrometry.net project: Astrometry.net astrometric engine. http://astrometry.net/use.html (2020). Accessed 08 Feb 2020
Flohrer, T., Krag, H., Klinkrad, H.: Assessment and categorization of TLE orbit errors for the US SSN catalogue. Technical proceedings of 9th AMOS conference (2008)
CelesTrak: SATCAT online satellite database. https://celestrak.com/satcat/boxscore.php (2021). Accessed 23 Jan 2021
ESA: DISCOSweb. https://discosweb.esoc.esa.int/ (2021). Accessed 29 Jan 2021
TU München: satellites tracked by ILRS. https://edc.dgfi.tum.de/en/satellites/ (2020). Accessed 28 Jan 2021
Vananti, A., Schild, K., Schildknecht, T.: Improved detection of faint streaks based on a streak-like spatial filter. Adv. Space Res. 65(1), 364–378 (2020)
Zscheile, J., Wagner, P., Lorbeer, R.-A., Guthier, B.: Synthetic tracking for orbital object detection in LEO. Proc. of 1st NEO and Debris Detection Conference https://conference.sdo.esoc.esa.int/proceedings/neosst1/paper/399/NEOSST1-paper399.pdf (2019) Accessed 9 Feb 2021
Steindorfer, M., Kirchner, G., Koidl, F., Wang, P., Anto, A., Sánchez, J.F., Merz, K.: Stare and chase of space debris targets using real-time derived pointing data. Adv. Space Res. 60, 1201–1209 (2017)
Hasenohr, T.: Initial detection and tracking of objects in low earth orbit. University of Stuttgart, Master thesis (2016)
Kießling, C.: Development of digital imaging methods for an autonomous system for detection and mapping of orbital objects. Johannes Gutenberg-University Mainz, Master Thesis (2018)
Acknowledgements
The concept of passive optical staring in combination with subsequent laser ranging was protected under the utility patent DE 20 2017 101 831 U1 (System zur Bestimmung und/oder Vorhersage einer Position und/oder einer Flugbahn von orbitalen Objekten im Weltraum. Gebrauchsmuster) on 29 March 2017 [8]. Many thanks to Thomas Hasenohr [28] and Carina Kießling [29], who helped to develop the system during their theses. All analyses were done using Python scripts, and the graphs shown were plotted using the matplotlib library.
Funding
Open Access funding enabled and organized by Projekt DEAL.
Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
Wagner, P., Clausen, T. APPARILLO: a fully operational and autonomous staring system for LEO debris detection. CEAS Space J 14, 303–326 (2022). https://doi.org/10.1007/s12567-021-00380-6