1 Introduction

The 3rd Generation Partnership Project (3GPP) has in its Release 15 developed the fifth-generation (5G) wireless access technology, known as the 5G system (5GS). 5GS defines the 5G core (5GC) and the new radio (NR) air interface, which features spectrum flexibility, ultra-lean design, forward compatibility, low latency support, and advanced antenna technologies [1]. Built on the first release of NR, the evolutions of NR in 3GPP Release 16 and the ongoing Release 17 are bringing additional capabilities to improve performance and address new use cases. Meanwhile, we are witnessing a resurgent interest in providing connectivity from space. In the past few years, there has been a surge of proposals to use large constellations of low earth orbit (LEO) satellites, such as OneWeb and SpaceX, to provide broadband access. There are also several European H2020 research projects dedicated to the integration of satellite and 5G networks, such as Shared Access Terrestrial-Satellite Backhaul Network Enabled by Smart Antennas (SANSA) and Virtualized Hybrid Satellite-Terrestrial Systems for Resilient and Flexible Future Networks (VITAL) [2].

The ambition of providing connectivity from space is not new. A series of satellite communications projects (e.g., Iridium and Globalstar) were launched in the 1990s, but the services were limited to voice and low-data-rate communications. A resurgence of interest in providing connectivity from space started around 2014, stimulated by technology advancement and demand for ubiquitous connectivity services. The advancement of microelectronics following Moore’s law has paved the way for using advanced technologies in satellite communications such as multi-spot beam technologies, onboard digital processing, and advanced modulation and coding schemes. Meanwhile, the development cycle and the costs of satellite manufacturing and launching processes have been dramatically reduced.

A major driver of the success of terrestrial mobile networks over the past few decades has been the international standardization effort, which yields the benefits of significant economies of scale. 3GPP has been the dominant standards development organization for several generations of mobile technology. The international standardization effort helps ensure compatibility among vendors and reduce network operation and device costs. In contrast, the interoperability between different satellite solution vendors has been difficult to achieve, and the availability of devices is limited [3].

Already in 2014 it was pointed out that the satellite community must work closely with the mobile 5G community to realize the integration of satellite and 5G networks [4]. Indeed, the satellite industry has realized the need to embrace standardization and furthermore to join forces with the mobile industry in 3GPP. The ongoing evolution of 5G standards provides a unique opportunity to revisit satellite communications. The satellite work in 3GPP is commonly known as non-terrestrial networks (NTN). The objective is to achieve full integration of NTN in 5G by evolving the 5GC and the next-generation radio access network (NG-RAN) protocols and functions including the NR radio interface to support NTN [3].

3GPP Technical Specification Group (TSG) Radio Access Network (RAN) has completed a first NTN study in its Release 15, focusing on channel models, deployment scenarios, and identifying potential key impacts on NR [5]. 3GPP TSG RAN has also conducted a Release 16 NTN study to define and evaluate solutions for the identified key impacts [6]. 3GPP TSG Service and System Aspects (SA) working groups have also completed a study that identifies use cases and requirements when using satellite access in 5G [7, 8]. 3GPP TSG SA is currently conducting further studies on integrating satellite access in 5G including architecture aspects [9] and management and orchestration aspects [10].

In this chapter, we focus on the NR radio access network and study how to adapt the NR air interface for satellite links. The overview provided in this chapter covers the 3GPP state-of-the-art findings (including the recently completed 3GPP Release 16) on NTN in 3GPP TSG RAN. Integrating satellite with 5G networks can also occur in the core network based on recent developments in network softwarization using tools such as software-defined networking (SDN), network functions virtualization (NFV), and network slicing. We refer interested readers to the recent IEEE Network special issue [11] for a more comprehensive treatment of the integration of satellite and 5G networks.

This chapter is an accessible reference for researchers interested in learning the latest 3GPP findings on satellite access in 5G. There are several other works addressing the same topic [12,13,14]. The work [12] focused on using a LEO satellite system to provide backhaul connectivity to terrestrial 5G relay nodes. The work [13] discussed and assessed the impact of the satellite channel characteristics on the physical and medium access control (MAC) layers. The work [14] provided details on higher-layer standardization aspects for both connected mode and idle mode mobility as well as network architecture aspects in both GEO and non-GEO NTN systems. The discussions in this chapter are along similar lines and provide further insights by presenting link budget analysis and new system simulation results on path gain distribution, geometry signal-to-interference ratio (SIR) distribution, and packet delay distribution. In addition, this chapter provides solutions to the identified challenges in adapting the NR air interface for satellite links.

2 Use Cases of Satellite Communications

2.1 Introduction to Satellite Communications Use Cases

Satellite access networks have been playing a complementary role in the mobile communications ecosystem. Satellite links can provide direct connectivity to user equipment (UE) or indirectly serve a UE by providing backhaul connectivity to terrestrial base stations or via relay nodes.

Despite the wide deployment of terrestrial mobile networks, there are unserved or underserved areas around the globe. Satellite access networks can augment the terrestrial networks to provide connectivity in rural and remote areas. Satellite connectivity can also be used for backhauling, fostering the rollout of 5G services.

Satellite access networks can benefit communication scenarios with airborne and maritime platforms (onboard aircraft or vessels) while being attractive in certain machine-to-machine and telemetry applications. In the case of natural disasters that disrupt terrestrial communications systems and services, satellites can leverage their wide coverage to quickly restore communications in the affected areas and enable rapid emergency response.

Satellite communication is well positioned for broadcasting/multicasting data and media to a broad audience spread over a large geographical area. While television broadcasting has undoubtedly been the main satellite service in this area, there are other use cases as well. For instance, mobile operators and Internet service providers can utilize satellite communications to multicast content to the network edge to facilitate content caching for local distribution.

2.2 Understanding the Use Cases by Link Budget Analysis

To get a more concrete understanding of the use cases of satellite communications, we carry out a link budget analysis based on the assumptions in 3GPP Release 16 NTN study [6]:

  • A LEO satellite operating in the S-band, with nominal downlink and uplink carrier frequencies both at 2 GHz. The system bandwidth is 30 MHz. For the satellite, the effective isotropic radiated power (EIRP) is 34 dBW/MHz, and the antenna gain-to-noise-temperature ratio (G/T) is 1.1 dB/K. The UE is assumed to be a handheld terminal with 23 dBm EIRP. The UE has two cross-polarized antenna elements and a G/T of -31.6 dB/K.

  • A LEO satellite operating in the Ka-band, with the nominal downlink and uplink carrier frequencies of 20 GHz and 30 GHz, respectively. The system bandwidth is 400 MHz. For the satellite, the EIRP is 4 dBW/MHz and the antenna G/T is 13 dB/K. The UE is assumed to be a very small aperture terminal (VSAT) with 76.2 dBm EIRP and G/T of 15.9 dB/K.

Table 18.1 presents the link budget calculation results assuming an orbit altitude of 600 km and an elevation angle of 30°. In S-band downlink with 30 MHz bandwidth, the signal-to-noise ratio (SNR) is 8.9 dB, which according to the Shannon formula can yield a spectral efficiency of 3.1 bps/Hz and a total throughput of 93.9 Mbps. In Ka-band downlink with 400 MHz bandwidth, the SNR is 9.4 dB, which can yield a spectral efficiency of 3.3 bps/Hz and a total throughput of 1.32 Gbps.

Table 18.1 Link budget calculation for LEO with 600 km orbital height based on Set-1 satellite parameters in 3GPP TR 38.821. Note that the bandwidth values in this table are the reference bandwidths used for normalizing EIRP, not the system bandwidths

In S-band uplink, the handheld UE uses 180 kHz bandwidth to obtain an SNR of 8.1 dB. The corresponding spectral efficiency is 2.9 bps/Hz and the achieved data rate is 0.52 Mbps. In Ka-band uplink, the VSAT with high transmit power and a high-gain antenna can use the whole 400 MHz bandwidth and achieves 19.3 dB SNR. The corresponding spectral efficiency is 6.4 bps/Hz and the achieved data rate is 2.57 Gbps.
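These throughput figures can be reproduced directly from the Shannon formula. Below is a minimal Python sketch; the SNR and bandwidth values are taken from the analysis above, and the Shannon bound is treated as the spectral efficiency (an idealized assumption, since practical NR modulation and coding achieves somewhat less):

```python
import math

def shannon_rate(snr_db: float, bandwidth_hz: float):
    """Return (spectral efficiency in bps/Hz, throughput in bps)
    from the Shannon formula C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    se = math.log2(1 + snr_linear)
    return se, se * bandwidth_hz

# SNR and bandwidth assumptions from the link budget analysis above
links = {
    "S-band DL":  (8.9, 30e6),
    "Ka-band DL": (9.4, 400e6),
    "S-band UL":  (8.1, 180e3),
    "Ka-band UL": (19.3, 400e6),
}
for name, (snr_db, bw) in links.items():
    se, rate = shannon_rate(snr_db, bw)
    print(f"{name}: {se:.1f} bps/Hz, {rate / 1e6:.1f} Mbps")
```

Running this reproduces the figures above (93.9 Mbps, ~1.32 Gbps, 0.52 Mbps, and ~2.57 Gbps, within rounding).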

The above results show that a LEO satellite system can support use cases with medium-to-high data rate requirements.

3 A Primer on Satellite Communications

In this section, we provide a primer on satellite communications. We refer interested readers to [15] for a more in-depth introduction to satellite communications.

3.1 Satellite Communications System Architecture

Besides LEO satellites, there are other types of NTN platforms, including geostationary earth orbit (GEO) satellites, medium earth orbit (MEO) satellites, highly elliptical orbit (HEO) satellites, and high-altitude platform stations (HAPS). A satellite communications system may consist of the following components [5]: satellite, terminal, gateway, feeder link, service link, and inter-satellite link. An illustration is given in Fig. 18.1. Depending on the functionality implemented in the satellite's communication payload, there are two payload options: bent-pipe transponder and regenerative transponder. With a bent-pipe transponder, the satellite receives signals from the earth, amplifies the received signals, and retransmits the signals to the earth after frequency conversion. With a regenerative transponder, the satellite performs onboard processing to demodulate and decode the received signals and regenerates the signals for further transmission.

Fig. 18.1

An illustration of satellite communications system architecture

A modern satellite typically uses multi-spot beam technology to generate multiple high-power beams to cover a geographical area. For a non-geostationary satellite, the footprint may sweep over the earth’s surface with the satellite movement or may be earth fixed with some beam pointing mechanism used by the satellite to compensate for its motion. The radii of the spot beams depend on the satellite communications system design and may range from tens of kilometers to a few thousand kilometers.

Low latency is a key 5G requirement. While satellite communications systems by nature cannot provide ultra-low latency (e.g., 1 ms), a mega LEO constellation communication system with a sufficient number of gateways properly distributed over large geographic areas can offer low latency (on the order of tens of milliseconds) across long distances. Leveraging inter-satellite links can further reduce the latency.

3.2 Example System-Level Simulation Results

To get some intuition on path gain distribution and SIR distribution in satellite access networks, we present example system-level simulation results for a LEO communications system with 600 km orbital height and 2 GHz carrier frequency at an elevation angle of 90°. The satellite antenna pattern is generated based on a typical parabolic reflector antenna with a circular aperture, as described in the 3GPP TR 38.811 [5]. The diameter of the satellite antenna aperture is 2 m. The UE has two cross-polarized antenna elements, and each antenna element has 0 dBi antenna gain. The satellite creates a hexagonal pattern of spot beams on the ground. The maps in Fig. 18.2 show the center area of this pattern.

Fig. 18.2

Example system-level simulation results for a LEO communications system with 600 km orbital height and 2 GHz carrier frequency at 90° elevation angle: subfigure (a) shows path gain and subfigure (b) shows geometry SIR

Figure 18.2a shows the path gain distribution over the simulation area. Path gain here is the sum of free-space path loss and normalized antenna gains. It can be seen that the path gain range is less than 5 dB, much smaller than in a terrestrial macro network, where the path gain range may span over 100 dB.

Figure 18.2b shows the geometry SIR distribution over the same simulation area. Geometry SIR is a measure of the satellite to UE signal quality in a fully loaded network. It can be seen that the geometry SIR range is comparable to that of a macro terrestrial network.
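The narrow path gain range follows from the free-space path loss (FSPL) geometry: at 600 km and 2 GHz the FSPL is about 154 dB, and the slant range varies little across the central spot beams. A sketch considering FSPL only (no atmospheric loss; the antenna gain variation included in the simulation is ignored here):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Nadir vs. a point 100 km off-nadir, 600 km orbital height, 2 GHz carrier
d_nadir = 600e3
d_edge = math.hypot(600e3, 100e3)
print(fspl_db(d_nadir, 2e9))                         # about 154 dB
print(fspl_db(d_edge, 2e9) - fspl_db(d_nadir, 2e9))  # only ~0.12 dB more
```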

3.3 Varying Coverage in Time and Space

The coverage of a GEO satellite is quite static, with infrequent updates of spot beam pointing directions to compensate for the GEO satellite movement. In contrast, the movements of non-GEO satellites, especially LEO satellites, lead to a varying coverage in time and space [5]. A typical LEO satellite is visible to a ground UE for a few minutes only. This implies that even in a LEO satellite communications system with earth-fixed beams, the serving spot beam associated with the serving satellite for a fixed position on the ground changes every few minutes. In a LEO satellite communications system with moving beams, from the perspective of a fixed position on the ground, a typical spot beam with a radius of tens of kilometers may serve the position for less than 1 minute before another spot beam starts to cover the position. The serving satellite stays the same if the consecutive spot beams covering the position are generated by the same satellite.

Figure 18.3 illustrates the varying coverage in LEO satellite communications with polar orbiting satellites for three different heights. Figure 18.3a shows satellite elevation angle trajectories observed by a static reference UE #0 as a function of time. Assuming a typical 10° minimum satellite elevation angle for service link connection, the UE can stay connected to a satellite passing over at 600 km height for only about 450 s. Figure 18.3b shows the trajectories of the distance between the reference UE #0 and the center of a downward-pointing spot beam as a function of time. If the spot beam radius is 50 km, the spot beam from the satellite at the height of 600 km covers the UE for only about 15 s.

Fig. 18.3

Varying coverage in satellite communications with polar orbiting satellites at three different orbital heights: subfigure (a) shows satellite elevation angle trajectories of a static reference UE #0 as a function of time; subfigure (b) shows the trajectories of the distance between the reference UE #0 and the center of the downward pointing spot beam as a function of time
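The ~450 s connection time in Fig. 18.3a can be approximated with a simple geometric model. The sketch below assumes a circular orbit and an ideal overhead pass and ignores Earth rotation, so it slightly overestimates the simulated value:

```python
import math

RE = 6371e3          # Earth radius, m
MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def max_pass_duration_s(h_m: float, min_elev_deg: float) -> float:
    """Idealized duration of an overhead pass above a minimum elevation
    angle, ignoring Earth rotation (an upper-bound estimate)."""
    r = RE + h_m
    eps = math.radians(min_elev_deg)
    # Earth-central half-angle of the visibility arc
    lam = math.acos(RE / r * math.cos(eps)) - eps
    omega = math.sqrt(MU / r) / r  # orbital angular rate, rad/s
    return 2 * lam / omega

print(max_pass_duration_s(600e3, 10))  # roughly 500 s
```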

3.4 Propagation Delays

Rapid interactions between a UE and its serving base station in a terrestrial mobile communications system are possible since the propagation delay is usually within 1 ms. In contrast, the propagation delay in a satellite link is much longer [5]. The one-way propagation time between a GEO satellite and a ground UE is 119.3 ms, assuming that the radio signal propagates at the speed of light in a vacuum and that the UE is immediately underneath the GEO satellite. With 600 km LEO satellite height, the minimum service link propagation delay is 2 ms attained at 90° satellite elevation angle. The propagation delay may increase to ~6.5 ms at 10° satellite elevation angle.
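These delay figures follow from the slant-range geometry for a spherical Earth. A short sketch (with the exact speed of light the GEO figure comes out near 119.4 ms; the 119.3 ms above corresponds to assuming c ≈ 3×10^8 m/s):

```python
import math

RE = 6371e3        # Earth radius, m
C = 299_792_458.0  # speed of light, m/s

def slant_range_m(h_m: float, elev_deg: float) -> float:
    """UE-to-satellite distance for altitude h_m and elevation angle."""
    e = math.radians(elev_deg)
    r = RE + h_m
    return math.sqrt(r**2 - (RE * math.cos(e))**2) - RE * math.sin(e)

for h, elev in [(600e3, 90), (600e3, 10), (35786e3, 90)]:
    delay_ms = slant_range_m(h, elev) / C * 1e3
    print(f"h = {h / 1e3:.0f} km, elevation {elev} deg: one-way delay {delay_ms:.1f} ms")
```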

Differential delay, which refers to the propagation delay difference of two selected points in the same spot beam, is of interest as it impacts the multi-access scheme. Since the feeder link is shared by the devices in the same spot beam, the differential delay mainly depends on the size of the spot beam, which results in different path lengths of the service links.
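As a rough illustration of how beam size drives the differential delay, the slant-range difference between the beam center and the beam edge can be approximated by projecting the ground offset onto the line of sight. This flat-earth approximation is only indicative:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def diff_delay_ms(beam_radius_m: float, elev_deg: float) -> float:
    """Approximate worst-case one-way differential delay across a spot
    beam: ground offset projected onto the line of sight, divided by c."""
    return beam_radius_m * math.cos(math.radians(elev_deg)) / C * 1e3

print(diff_delay_ms(100e3, 30))  # ~0.29 ms for a 100 km radius at 30 deg elevation
```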

3.5 Doppler Effects

In terrestrial mobile communications systems, Doppler effects are typically caused by the movements of the UE and surrounding objects, while in satellite systems the satellite movement induces additional Doppler effects [5]. The Doppler effect is pronounced in LEO systems. At the height of 600 km, a LEO satellite moves at a speed of 7.56 km/s, which can result in a Doppler shift as large as about 48 kHz at the carrier frequency of 2 GHz. In addition, the Doppler shift varies rapidly over time, and the rate of this variation is referred to as the Doppler variation rate.

The Doppler effects due to satellite movement in GEO systems are in most cases negligible. Note that when a satellite is in a near-GEO orbit with an inclination of up to 6° relative to the equatorial plane, the Doppler shift can reach around 300 Hz at the carrier frequency of 2 GHz [5]. Terrestrial mobile technologies such as NR have been designed to handle this order of magnitude of Doppler shift. For a satellite communications system operating at a higher frequency, the Doppler shift increases proportionally. NR supports a flexible numerology, which can handle the increased Doppler shift at increased carrier frequencies. For example, 15 kHz subcarrier spacing in NR can be used to handle the ~300 Hz Doppler shift in a GEO satellite communications system operating at 2 GHz. The Doppler shift would increase to ~3000 Hz if the GEO satellite communications system operated at 20 GHz. In this case, 120 kHz subcarrier spacing in NR can be used to handle the increased Doppler shift.
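The Doppler magnitudes quoted above can be checked with a simple geometric bound. The sketch below assumes a circular orbit and a static UE and scales the worst-case radial velocity by RE/(RE+h); this simplification yields about 46 kHz, the same order as the ~48 kHz figure from the 3GPP study:

```python
import math

RE = 6371e3          # Earth radius, m
MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0    # speed of light, m/s

def max_doppler_hz(h_m: float, fc_hz: float) -> float:
    """Rough upper bound on service-link Doppler for a circular orbit."""
    r = RE + h_m
    v = math.sqrt(MU / r)   # orbital speed, about 7.56 km/s at 600 km
    v_radial = v * RE / r   # worst-case line-of-sight component (simplified)
    return v_radial / C * fc_hz

print(max_doppler_hz(600e3, 2e9))   # about 46 kHz at 2 GHz
print(max_doppler_hz(600e3, 20e9))  # ten times larger at 20 GHz
```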

4 Design Aspects of NR over Satellite Links

In this section, we describe several key areas that require adaptation to evolve NR for satellite communications.

4.1 Uplink Timing Control

4.1.1 Problems

NR utilizes orthogonal frequency-division multiple access (OFDMA) as its multi-access scheme in the uplink. The transmissions from different UEs in a cell are time-aligned at the 5G NodeB (gNB) to maintain uplink orthogonality. Time alignment is achieved by using different timing advance values at different UEs to compensate for their different propagation delays. The gNB can estimate the timing advance value based on the UE’s uplink signals such as the physical random-access channel (PRACH) preamble. The existing NR uplink timing control scheme, however, has not been designed to handle the large propagation delays incurred in satellite communications.

4.1.2 Potential Solutions

One promising approach is to rely on global navigation satellite system (GNSS)-based techniques. Each UE equipped with a GNSS chipset determines its position, calculates its propagation delay with respect to the serving satellite using ephemeris data of the satellite constellation, and derives the initial timing advance value. The UE then uses its initial timing advance value to initiate the random-access procedure, which can help to further refine the timing advance to cope with a residual timing error associated with the initial timing advance estimate.
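In essence, the GNSS-based derivation reduces to a distance computation between the UE's GNSS fix and the satellite position obtained from the ephemeris. A minimal sketch with hypothetical coordinates (a real implementation would propagate the ephemeris to the transmission time and refine the value via random access):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def initial_timing_advance_s(ue_ecef, sat_ecef) -> float:
    """Initial timing advance = twice the one-way service-link delay.
    Positions are (x, y, z) ECEF coordinates in meters."""
    return 2 * math.dist(ue_ecef, sat_ecef) / C

# Hypothetical geometry: satellite 600 km directly above the UE
ue = (6371e3, 0.0, 0.0)
sat = (6971e3, 0.0, 0.0)
print(initial_timing_advance_s(ue, sat) * 1e3)  # about 4 ms
```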

Some low-cost, reduced complexity UEs may not be equipped with GNSS chipsets. Thus, non-GNSS-based techniques are also needed. One possible technique may work as follows. For each spot beam, the gNB may choose a reference point such as the center of the spot beam and adjust its uplink receiver timing with respect to the reference point. With this approach, the uplink timing control only needs to handle the delay difference between each UE and the reference point instead of the much larger absolute propagation delays. The existing uplink timing control can be directly used for spot beams with radii up to about 200–300 km. For spot beams with radii larger than 300 km, further adaptation of uplink timing control design may be needed.

4.2 Frequency Synchronization

4.2.1 Problems

NR uses orthogonal frequency-division multiplexing (OFDM) for both downlink and uplink transmissions and additionally supports the use of discrete Fourier transform (DFT) spread OFDM in the uplink. Maintaining the orthogonality of OFDM requires tight frequency synchronization between transmitter and receiver. The downlink synchronization can be treated as a point-to-point OFDM synchronization problem since each receiver in a cell tunes its downlink reference frequency based on the received synchronization signals. The uplink synchronization is more challenging since it is a multipoint-to-point synchronization problem in OFDMA-based NR. The transmissions from different UEs in a cell need to be frequency-aligned at the gNB to maintain uplink orthogonality. Therefore, different frequency adjustment values at different UEs are needed in the uplink to compensate for their different Doppler shifts.

4.2.2 Potential Solutions

GNSS-based techniques can be used for uplink frequency adjustment: Each UE equipped with a GNSS chipset determines its position and calculates its frequency adjustment value based on the estimated Doppler shift using its position information, satellite ephemeris data, and carrier frequencies. To mitigate the effects of large Doppler shifts due to satellite movements in non-GEO satellite communications systems, pre-compensation can be applied to forward link signals [5]: A time-varying frequency offset tracking the Doppler shift is applied to the forward link reference frequency such that the forward link signals for a spot beam received at a reference point in the spot beam appear to have zero Doppler shift. With pre-compensation, the Doppler shift of the forward link signals received at a given location in the spot beam becomes equal to the difference between the original Doppler shifts of the given location and the reference point.

The residual Doppler shift after pre-compensation, however, varies across locations in the spot beam and over time. Consider a spot beam with 100 km radius in a LEO satellite system with 600 km orbital height. Even if frequency pre-compensation is applied to the downlink of the service link with respect to the center of the spot beam, the Doppler shift difference between a point at the edge of the spot beam and the reference point at the center can still be as large as 8 kHz at 2 GHz carrier frequency.
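The 8 kHz residual figure can be verified with the standard range-rate formula under a simplified flat geometry (satellite directly above the beam center, edge point 100 km along-track; the coordinates are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_hz(sat_pos, sat_vel, ue_pos, fc_hz):
    """Doppler shift f_d = -(fc/c) * d|r|/dt for a static UE.
    Positions (m) and velocity (m/s) are in a local Cartesian frame."""
    rel = [s - u for s, u in zip(sat_pos, ue_pos)]
    r = math.hypot(*rel)
    range_rate = sum(a * b for a, b in zip(rel, sat_vel)) / r
    return -range_rate / C * fc_hz

# Satellite 600 km above the beam center, moving along-track at 7.56 km/s
sat, vel = (0.0, 0.0, 600e3), (7560.0, 0.0, 0.0)
fd_center = doppler_hz(sat, vel, (0.0, 0.0, 0.0), 2e9)  # 0 Hz at nadir
fd_edge = doppler_hz(sat, vel, (100e3, 0.0, 0.0), 2e9)  # about 8.3 kHz
print(fd_edge - fd_center)  # residual after beam-center pre-compensation
```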

For non-GNSS-based frequency adjustment techniques, the gNB may estimate the return link frequency shift of each UE and transmit a corresponding frequency adjustment command to the UE. To establish uplink orthogonality as early as possible, it is desirable that the gNB estimates the uplink frequency shift from the random-access preamble transmitted by the UE and includes the frequency adjustment command in the random-access response message. The challenge is that the gNB has to estimate both timing advance and frequency adjustment values based on the PRACH preamble alone.

The existing NR PRACH preambles are based on Zadoff-Chu sequences. In case of large timing and frequency uncertainties, there are several peaks in the ambiguity function of Zadoff-Chu sequences in the delay-Doppler plane, leading to timing and frequency ambiguities. In other words, due to the nature of Zadoff-Chu sequences, both timing delay and frequency shift cause a cyclic shift in the observation window of the received Zadoff-Chu sequence at the gNB. One potential solution is to transmit not only the Zadoff-Chu sequence but also its complex conjugate [R1-1912725, "On NTN synchronization, random access, and timing advance," Ericsson, 3GPP TSG-RAN WG1 Meeting #99, Reno, November 2019]. The gNB can then detect a composite cyclic shift from the first transmission and another composite cyclic shift from the second transmission. Based on the two composite cyclic shifts, the effects of delay and frequency shifts can be separated.
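The delay/Doppler ambiguity is easy to demonstrate numerically: for a root-u Zadoff-Chu sequence, a pure frequency offset is indistinguishable from a cyclic time shift. The sketch below uses illustrative parameters (N = 139, u = 25), not actual NR PRACH dimensions:

```python
import numpy as np

N, u = 139, 25  # illustrative prime length and Zadoff-Chu root index
n = np.arange(N)
zc = np.exp(-1j * np.pi * u * n * (n + 1) / N)

def detected_shift(rx: np.ndarray) -> int:
    """Circularly correlate rx with the reference ZC sequence and
    return the lag of the strongest peak."""
    corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(zc)))
    return int(np.argmax(np.abs(corr)))

# A pure delay of 5 samples is detected as a shift of 5 ...
print(detected_shift(np.roll(zc, 5)))
# ... but a pure frequency offset of 3 subcarriers also produces a
# nonzero detected shift -- the timing/frequency ambiguity in question.
print(detected_shift(zc * np.exp(2j * np.pi * 3 * n / N)))
```

Transmitting the complex conjugate as well flips the sign of one contribution, which is what allows the two effects to be separated.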

4.3 Hybrid Automatic Repeat Request

4.3.1 Problems

To combat transmission errors, NR uses a combination of forward error correction and automatic repeat request (ARQ), which is known as hybrid ARQ (HARQ). NR supports 16 HARQ processes with stop-and-wait protocols per component carrier in both uplink and downlink. In a stop-and-wait protocol, the transmitter stops and waits for acknowledgement after each (re)transmission. Using 16 HARQ processes with stop-and-wait protocols would lead to significant throughput reduction especially in GEO communications systems [5].
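The throughput penalty can be quantified with a simple stop-and-wait model: each HARQ process delivers at most one transport block per HARQ round trip. The numbers below are illustrative (1 ms slots; a GEO round-trip time of roughly 541 ms per TR 38.821):

```python
def harq_utilization(n_processes: int, slot_s: float, rtt_s: float) -> float:
    """Upper bound on link utilization with n stop-and-wait HARQ
    processes: each process sends one block per round trip."""
    return min(1.0, n_processes * slot_s / rtt_s)

print(harq_utilization(16, 1e-3, 8e-3))    # terrestrial-like RTT: fully utilized
print(harq_utilization(16, 1e-3, 541e-3))  # GEO-like RTT: only about 3%
```

This is why GEO NTN calls for either many more HARQ processes or the option to disable HARQ feedback.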

4.3.2 Potential Solutions

One straightforward approach is to increase the number of HARQ processes to cope with the increased round-trip delays in satellite communications systems. This, however, comes at the cost of increased UE implementation complexity due to the larger UE HARQ soft buffer size. Another approach is to introduce a mechanism in NR to support the possibility of turning off retransmissions in HARQ processes. Instead, retransmissions are handled by the layers above MAC if error-free data units are required at the receiver. For example, the radio link control (RLC) layer supports an acknowledged mode that may be used for the retransmission of erroneous data. Retransmissions at the layers above MAC may lead to increased latency due to the slower feedback. Additionally, the communications system may need to operate with a more conservative coding rate in the physical layer to avoid excessive retransmissions in the layers above MAC.

Figure 18.4 shows an example of simulated delay distributions for a traffic model with periodic packet arrivals. The delay of an RLC protocol data unit (PDU) includes the delays in the physical and MAC layers. The delay of a packet data convergence protocol (PDCP) service data unit (SDU) includes the delays in the physical, MAC, RLC, and PDCP layers. Figures 18.4a and 18.4b show the delay distributions of RLC PDUs and PDCP SDUs when HARQ is used and the transport block size is 1000 bits. Due to the good link quality, most of the RLC PDUs are successfully received without HARQ retransmission and thus have a ~256 ms delay, while a small fraction of the RLC PDUs are successfully received after one HARQ retransmission and thus have a ~800 ms delay. Accordingly, most of the PDCP SDUs have delays in the range of 0.8–2 s. Figures 18.4c and 18.4d show the corresponding delay distributions when HARQ is not used. While most of the RLC PDUs have a ~256 ms delay due to the good link quality, a small fraction of the RLC PDUs have a delay greater than ~1.25 s. Accordingly, compared to the case with HARQ, more PDCP SDUs have delays greater than ~2 s when HARQ is not used. This is because HARQ feedback is faster than the feedback in higher-layer protocols.

Fig. 18.4

Simulated delay distributions for a traffic model characterized by periodic packet arrival (1 kilobyte per second), 256 ms one-way propagation delay, RLC acknowledged mode, and good link quality: subfigures (a) and (b), respectively, show the delay distributions of RLC PDU and PDCP SDU when HARQ is used; subfigures (c) and (d), respectively, show the delay distributions of RLC PDU and PDCP SDU when HARQ is not used

4.4 Idle Mode UE Tracking and Paging

4.4.1 Problems

When a UE in idle mode selects or reselects a cell, it reads the broadcast system information to learn which tracking area the cell belongs to. If the cell does not belong to any of the tracking areas to which the UE is registered, the UE performs a tracking area update to notify the network of the tracking area of the cell it is currently camping on. For a GEO satellite communications system where the cell’s coverage area is usually fixed on the ground, the existing UE tracking and paging procedures in NR can be largely reused. However, for a non-GEO satellite communications system with moving beams, the cell’s coverage area moves on the ground. Under the existing UE tracking and paging procedures in NR, the tracking area sweeps over the ground as well. As a result, a stationary UE would have to keep performing location registration in idle mode [6].

4.4.2 Potential Solutions

One potential solution to tracking idle mode UEs in non-GEO satellite communications system with moving beams would be to decouple the tracking area management from the moving cell pattern. While the beams and cells are moving, the registration and tracking areas are fixed on the ground. This implies that while the cells sweep over the ground, the broadcasted tracking area is changed when the cell enters the area of the next earth-fixed tracking area location [14].

5 Conclusions

It is an exciting time for satellite communications, with interest in the field growing rapidly. The ongoing evolution of 5G standards provides a unique opportunity to revisit satellite communications. Though NR has been designed mainly for terrestrial mobile communications, its inherent flexibility allows it to be evolved to support NTN. As this chapter has highlighted, adapting NR to support satellite communications involves challenges including long propagation delays, large Doppler shifts, and moving cells. Addressing such challenges requires a rethinking of many of the working assumptions and models used to date for designing NR. Throughout this chapter, we have highlighted ideas from the 3GPP Release 16 NTN study on how to overcome the key technical challenges in evolving NR for satellite communications.