1 Introduction

IEEE 802.15.4 is one of the key IoT standards, with a variety of objectives to benefit a wide range of IoT services [1]. The standard specifies the operation of the media access control (MAC) and physical layers of the low-rate wireless personal area network (LR-WPAN), which is referred to as the standard IoT system for convenience in this work [2]. This gives IoT developers a high level of flexibility in designing the upper layers. Standard IoT systems benefit from several features. Their low power consumption makes them well suited to energy-restricted, battery-operated IoT devices by extending battery life, a key IoT requirement [3]. Moreover, their low cost and ease of installation keep cost and complexity at a minimum. In this regard, implementing the IPv6 over low-power WPAN (6LoWPAN) protocol at the network layer provides additional benefits to standard IoT systems [4]. It enables the transmission of IPv6 data between IoT devices by introducing an adaptation layer. Because standard IoT systems limit packets to 127 bytes, the adaptation layer applies IP header compression techniques, such as IPHC and HC1, to efficiently reduce the 40-byte IPv6 header down to a few bytes [4], with IPHC outperforming HC1 [5]. The adaptation layer also ensures that the packet size complies with the standard's limit through fragmentation and reassembly [6]. Additional features offered by the 6LoWPAN protocol are the flexibility to connect and communicate with different networking standards, effective routing, and support for large-scale IoT systems.
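
The interplay between the 127-byte frame limit, header compression, and fragmentation can be illustrated with a short sketch. The header sizes below are simplified assumptions for illustration only, not the exact values mandated by the standard:

```python
# Illustrative sketch of the 6LoWPAN adaptation-layer decision: does an IPv6
# packet fit a single 127-byte IEEE 802.15.4 frame after IPHC compression,
# and if not, how many fragments are needed? All sizes are assumptions.

FRAME_MTU = 127     # IEEE 802.15.4 maximum frame size (bytes)
MAC_OVERHEAD = 25   # assumed MAC header + FCS overhead
IPV6_HEADER = 40    # uncompressed IPv6 header
IPHC_HEADER = 7     # assumed IPHC-compressed header (best cases reach 2-3 bytes)
FRAG1_HEADER = 4    # 6LoWPAN first-fragment header
FRAGN_HEADER = 5    # 6LoWPAN subsequent-fragment header

def fragments_needed(payload: int, compressed: bool = True) -> int:
    """Return the number of link-layer frames needed for one IPv6 datagram."""
    header = IPHC_HEADER if compressed else IPV6_HEADER
    room = FRAME_MTU - MAC_OVERHEAD      # bytes available per frame
    total = header + payload
    if total <= room:
        return 1                         # fits in one frame, no fragmentation
    # The first fragment carries a FRAG1 header; the rest carry FRAGN headers.
    first = room - FRAG1_HEADER
    rest = room - FRAGN_HEADER
    remaining = total - first
    return 1 + -(-remaining // rest)     # ceiling division for tail fragments
```

Under these assumed sizes, a 100-byte payload fits in one frame with IPHC compression but requires two frames with the uncompressed 40-byte IPv6 header, which is the efficiency gain the adaptation layer provides.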

These features allow standard IoT systems to provide a wide and diverse set of IoT services and applications, including industrial control and monitoring, public security and surveillance, home and office automation, smart cities, and smart health care [7]. However, despite these benefits, standard IoT systems continue to face major challenges. The limited coverage distance over which IoT services are provided (up to 100 m), as well as the low data rate (up to 250 Kbps), can constrain existing IoT services and restrict their expansion to new IoT applications. Therefore, addressing these challenges at various levels of planning can enhance the performance of standard IoT systems and contribute to the further development of a new level of IoT systems, applications, and services. To this end, this work proposes a model that connects and expands standard IoT systems from personal to wide areas, thus improving performance by providing fast data processing and remote links for IoT data access. To achieve this, the model develops two IoT systems.

The first system is based on 5G cellular technology and is called 5GIoT. The 5GIoT features a new dual connectivity (DC) interface that, besides using the benefits of the standard IoT system, addresses its limitations by leveraging the benefits of the 5G standard. The 5G technology is expected to improve current technologies, including IoT, advanced healthcare, smart cities, automation, and video surveillance, with regard to speed, capacity, and reliability [8, 9]. 5G networks can operate in a wide range of frequency bands, including the low Sub-1 band (below 1 GHz), the mid Sub-6 band (between 1 and 6 GHz), and the high-band millimeter waves (mmWave; between 6 and 100 GHz) [10]. The unique characteristics of each frequency band give developers great flexibility in choosing the appropriate band for explicit network requirements. Moreover, 5G offers both narrow and broad carrier channels spanning from 5 to 400 MHz, with broader channels offering higher data transfer rates [11].

However, despite these benefits, the feature-rich structure of the 5G standard brings certain challenges, making its capabilities directly dependent on a broad range of contributing key factors. Moreover, the fact that each factor includes a different set of values creates more challenges and further extends deployment complexity. Therefore, the model develops a second new DC interface based on long-term evolution (LTE) cellular technology to combine the design benefits of both standard IoT and 4G LTE. In comparison to 5G, LTE runs only in the Sub-1 and Sub-6 bands with no mmWave support, and its carrier channels range from 1.4 to 20 MHz [12, 13], which is considerably smaller. While these LTE features have the disadvantage of limiting data transmission speed compared to 5G, they offer the advantage of using less power for transmission. These key differences between 5G and LTE can lead to profound changes in the performance and functionality of IoT systems. On the other hand, IoT is a global technology with a wide range of applications, each with its own unique set of requirements and criteria. This signifies the importance of precisely determining the actual level of improvement offered by the 5GIoT and LTEIoT systems to the IoT domain, avoiding any overestimation or underestimation. Accordingly, the main contributions of this work are as follows.

  • The standard IoT system is included in the model with two objectives: to establish its benefits and limits, and to serve as a reference system for comparative purposes.

  • The 5GIoT and LTEIoT systems are developed with two new edge gateways (EGWs) into which the new DC interfaces are integrated. The EGWs are responsible for implementing multiple functions related to standard IoT and cellular networks.

  • The 6LoWPAN protocol with IPHC support is implemented in all three IoT systems to transfer IPv6 data.

  • The systems are implemented in diverse use cases with five distinct IoT system-dependent (ISD) factors: operational frequency, data transfer rate, number of active IoT devices, coverage distance, and data packet structure. The efficiency and functionality of the 5GIoT and LTEIoT systems are measured and compared to each other, as well as to the standard IoT system, to validate the model, precisely determine the primary differences between 5GIoT and LTEIoT, and establish their real advantages and constraints in the IoT domain.

The remainder of the work is structured as follows. The second section addresses the most recent relevant works. Section 3 contains the model and implementation details. Section 4 presents the simulation results and associated analysis, and Sect. 5 concludes the work.

2 Related works

Different techniques have been proposed to leverage 5G and LTE for IoT communication architectures. The authors in [14] compare non-cellular long range (LoRa) IoT with cellular narrowband IoT (NB-IoT), which, as a subset of the LTE standard, uses the same Sub-6 frequency spectrum but with limited bandwidth. The findings suggest that when the same frequency spectrum is used, cellular NB-IoT systems outperform non-cellular LoRa in terms of improved coverage area and reduced link loss. However, the work does not evaluate the 5G architecture in the IoT domain, and link loss is the only evaluation parameter measured for comparative purposes. The use of LoRa IoT with LTE and 5G as the primary transmission architecture is also proposed in [15, 16], respectively, although no implementation is provided to verify the idea. In contrast, an experimental setup is presented in [17] to evaluate the concurrent performance of 5G and LTE in the 700 MHz frequency band, but with no reference to IoT requirements.

The authors in [18] present IoT development and the challenges associated with using cellular LTE and non-cellular Wi-Fi, as well as a technique for their optimal usage in the 5 GHz frequency band. Two distinct arrival rates are selected to measure the throughput and latency parameters, which demonstrate better performance of cellular connections over Wi-Fi. However, 5G implementation is not part of the IoT development. The use of the Wi-Fi standard as the communication link of standard IoT systems is proposed in [19] to determine system performance, but no cellular deployment is included. A technique for deploying LTE as the communication link of Wi-Fi networks is also described in [20], but without consideration of IoT requirements. The intelligent transportation system, as one of the IoT applications, is considered in [21] by adopting two cellular IoT technologies, LTE-M and NB-IoT. While the findings indicate that LTE-M achieves lower latency than NB-IoT, the 5G standard for IoT development is not part of the system.

The LTE-M and NB-IoT cellular architectures are also implemented in [22] to determine their power consumption levels and estimate the battery lifetime of IoT devices. However, the implementation is solely based on LTE, and the measurements are limited to power consumption; no additional parameters are provided. The two IoT cellular technologies studied in [23] are LTE-M and LTE category zero (LTE-Cat0). The throughput, delay, jitter, and packet loss ratio are measured as the distance between the devices and the eNB, as well as the number of users, changes. According to the findings, LTE-M provides better support for IoT systems. The NB-IoT performance in urban areas is evaluated in [24] utilizing the reference signal received power (RSRP) parameter for further analysis as the distance varies. However, no IoT services are available to compare the findings and further determine the level of improvement achieved by NB-IoT. A device-to-device communication mechanism is proposed in [25] for enabling IoT services over LTE cellular networks. To measure the throughput and access rate, the distance between IoT devices and their gateway, as well as the distance between the IoT gateway and the eNB, are varied together with an increasing number of IoT devices, up to 50. However, no standard IoT implementation is offered for comparison with the LTE results. Furthermore, the model is based on LTE, and IoT services delivered via 5G networks are not included.

The authors in [26] propose a technique for deploying Zigbee IoT systems in 5G environments that combines the advantages of the Zigbee standard, such as low cost, low power, and simple installation, with the advantages of 5G for the transmission connection. The impact of increasing the number of joining nodes (from 0 to 50) is examined when 5G terminals function as coordinators, as well as when they work as routers, compared to standard Zigbee with no 5G presence. The packet delivery ratio, hop count, and end-to-end delay are measured, and the results demonstrate that Zigbee IoT systems function better in 5G environments. However, the work is limited in that relevant aspects and factors other than the number of nodes are not examined. The authors in [27] propose a design architecture technique for developing standard IoT systems with cellular NB-IoT as the connection link for IoT applications, including industrial control, smart cities, environmental monitoring, smart metering, transportation, and structural health. Energy efficiency is assessed in terms of battery life and error rate, which shows a 20% gain in system performance. However, the design architecture technique excludes 5G networks and does not evaluate further IoT-related parameters. The deployment of cellular NB-IoT in standard IoT systems is also suggested in [28], although no implementation is offered. Utilizing 5G as the main IoT transmission link is proposed in [29] along with the associated challenges and issues, but no implementation is provided. The dual connectivity of 5G and LTE is proposed in [30] to improve the handover process; however, it excludes IoT systems and requirements. With regard to hardware improvements for IoT devices supporting 5G, a 12-transistor SRAM is presented in [31] to improve the read and write process and thus reduce power consumption.

From the related works, certain limitations are identified. For cellular IoT development, they give insight only into 4G systems, including NB-IoT and LTE-M, while the non-cellular approaches are restricted to LoRa. Furthermore, the current works on developing 5G-based IoT networks are merely descriptive. They present the idea and discuss the associated challenges and concerns, but they lack implementation and measurements to validate the possible advantages and limitations. While the low cost, power consumption, and complexity of standard IoT systems are well suited for a wide range of multi-purpose IoT services, the additional advantages of adopting 5G can improve existing IoT services and enable the development of new options in the IoT domain. Despite its importance, no implementation has been performed to measure and establish the benefits and limits of 5G-based IoT systems, and their practical functionality and performance are mostly unknown. Thus, to address these limitations, this work proposes an IoT model based on 5G and 4G LTE cellular technologies. Simulations are conducted to implement and validate the model in relation to its core benefits and limitations.

3 Materials and methods

The network simulator (ns-3) is used to design and implement the model [32]. The model consists of three main parts, including the development of the IoT systems, model validation using IoT system-dependent (ISD) factors, and performance modeling. Figure 1 presents the model followed by details of its development and implementation.

Fig. 1

The proposed IoT model architecture

3.1 Development of IoT systems

The model develops a 5G-based IoT system called 5GIoT. In this system, the IoT devices in the data acquisition (DA) layer include the standard IoT interface with 6LoWPAN and the IPHC compression technique at the network layer. This brings the advantages of the standard IoT system (low power consumption, low complexity, and low cost) and the 6LoWPAN protocol into the 5GIoT system. Moreover, an edge gateway (EGW) called 5GIoT-EGW is developed with a dual connectivity (DC) interface to take advantage of 5G technology in the system. The devices in the DA layer capture local environmental data and send it to the 5GIoT-EGW. The DC equips the 5GIoT-EGW with one standard IoT interface on one side, allowing it to connect to standard IoT networks and accept connection requests and IoT data, and one 5G interface on the other side, allowing it to communicate with 5G networks and forward the data to the data processing and management (DPM) unit via the 5G core network. Accordingly, the 5GIoT-EGW is responsible for the following tasks:

  • to provide the connection and exchange data between the standard IoT and 5G technologies

  • to perform protocol conversion and encapsulation, thereby ensuring compatibility and interoperability between the two technologies

  • to provide long-distance communication and remote access to the local IoT data

  • to enforce the IPv6 header compression procedure

  • to perform fragmentation and reassembly of IoT data packets

  • to implement scheduling and IoT data decoding

Upon reception of the data collected by the IoT devices, the 5GIoT-EGW forwards it to the 5G base station node B (gNB) through the 5G new radio (NR) access link. Because 5G technology offers multiple frequency bands, the access link between the 5GIoT-EGW and the gNB supports the most common worldwide frequencies in the Sub-1, Sub-6, and mmWave bands. The gNB sends the data to the 5G core components, which in turn forward it to the DPM unit, where all information processing and analysis are performed to conduct the required actions in the 5GIoT system.

The model further develops a 4G LTE-based IoT system called LTEIoT. The system comprises an edge gateway (LTEIoT-EGW) with a DC interface to exchange IoT data between LTE and standard IoT networks. The LTE access link between the LTEIoT-EGW and LTE base station node B (eNB) supports the most popular Sub-1 and Sub-6 frequency bands. As with 5GIoT, the frequency bands supported by LTEIoT are selected with two objectives: first, to implement the system using the most globally available bands, and second, to provide fair and reliable measurement and comparison conditions between the 5GIoT and LTEIoT systems when operating in a similar frequency band. The third IoT system in the model is a standard 2.4 GHz IoT system that serves two purposes. It is used to evaluate the performance and thus determine the capabilities and limitations of standard IoT systems. It is also used as a reference model for comparative evaluation of the 5GIoT and LTEIoT systems to determine the level of improvements these systems offer to the IoT domain compared to when they are not utilized.

3.2 Model validation

IoT, as an evolving technology, is extending into a wide and diverse range of applications, each with certain requirements. Fulfilling these requirements is among the major challenges of IoT systems, which must therefore be validated against various conditions to ensure compliance with real-world deployments. Validation of IoT systems is performed by defining their capabilities for diverse services, from collecting and processing data to providing final decisions in a flexible and reliable way. For this purpose, the model includes a set of five configurable IoT system-dependent (ISD) factors to validate the performance of the 5GIoT and LTEIoT systems and determine the maximum as well as the optimal measurements. The ISD factors are described as follows.

3.2.1 Spectrum management

As mentioned, 5G and LTE technologies support different frequency bands. Lower frequencies offer lower path loss, which makes them more suitable for long-distance use cases, but at the cost of data transfer speed. When it comes to speed demands, higher frequencies are the alternative solution, but at the cost of distance. Because different IoT systems have different requirements, spectrum management is a vital task for selecting the frequency band best suited to fulfill the explicit demands. Therefore, to adapt to different frequency ranges, the model supports the most common bands in terms of global usage. The 5GIoT system supports the n28 band (700 MHz) as the most common Sub-1 band; the three most used Sub-6 bands, n3 (1800 MHz), n7 (2600 MHz), and n78 (3500 MHz); as well as n258 (26 GHz) and n260 (39 GHz) as the most common mmWave bands [33]. Similarly, the LTEIoT system supports the most popular Sub-1 and Sub-6 frequency bands: b20 (800 MHz), b3 (1800 MHz), and b7 (2600 MHz). In contrast, the standard IoT system in the model operates only in the 2.4 GHz frequency band, as defined by the standard [34].
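
The frequency/distance trade-off described above can be illustrated with the free-space path loss (FSPL) formula. This is an idealized sketch (free space, no fading or antenna gains), not the propagation model used in the simulations:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3.0e8  # speed of light (m/s)
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# At the same 100 m distance, the n28 band (700 MHz) loses roughly
# 20*log10(26e9 / 700e6) ~ 31 dB less than the n258 mmWave band (26 GHz),
# which is why low bands favor coverage and high bands favor capacity.
low_band = fspl_db(100, 700e6)   # ~ 69 dB
mmwave = fspl_db(100, 26e9)      # ~ 101 dB
```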

3.2.2 Coverage distance (CD)

The second key IoT factor is the planning and optimization of the coverage distance (CD) over which reliable data transmission is provided. The standard IoT system can only cover distances of up to 100 m. This short coverage distance restricts IoT data accessibility over remote links. In contrast, the 5GIoT and LTEIoT systems, by supporting different frequency bands, can overcome this restriction and extend the distance. This underscores the importance of measuring the maximum as well as the optimal distances covered by the 5GIoT and LTEIoT systems. For this purpose, the model includes up to 11 alternative distances, D1–D11, depending on the frequency band. The testing CDs provided by the model are presented in Table 1. They are selected for each frequency band independently and arranged from smallest to largest such that they verify covered and uncovered distances in the 5GIoT and LTEIoT systems for both remote and local deployments.

Table 1 Testing coverage distances in the IoT systems

3.2.3 Number of active IoT devices (NIoT)

Network capacity is an indicator of the available resources in terms of the maximum number of active users allowed to connect to the network. Due to resource constraints, this number is limited to ensure the quality of the provided services. To determine this limit and the capacity of the 5GIoT and LTEIoT systems, the model randomly distributes a low to high number of IoT devices, varied in the range NIoT ∊ {5, 10, 15, 20, 25, 30, 35, 40, 45, 50}, in the given sensing area, with 20 active devices as the default value. This facilitates determining the maximum and optimal number of IoT devices that the 5GIoT and LTEIoT systems support with regard to the required quality of service.

3.2.4 Rate of data transfer (RDT)

The rate of data transfer (RDT) by IoT devices is the fourth factor differentiating IoT systems. Fast and real-time communication is a primary concern for various IoT applications. They demand a sufficiently high data rate, the lack of which can significantly affect overall performance. However, transmitting at high rates consumes more power from IoT devices and results in shorter battery life. Therefore, to achieve the required performance while consuming less power and extending the battery life of IoT devices, it is essential to determine the transfer rate requirements of the given IoT systems. For this purpose, the model offers low to high RDT, varied in the range RDT ∊ {5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 100, 150, 200, 250} Kbps, with 20 Kbps as the default value. The range selection is based on the transfer rate of standard IoT systems, which is up to 250 Kbps.

3.2.5 Packet characteristics

The last IoT factor is packet characteristics in terms of payload size (PS) and packet compression. Larger packets carry more data. However, since the standard IoT system limits packets to 127 bytes, any packet that exceeds this limit results in fragmentation. Fragmentation reduces efficiency due to resource consumption and the additional processing overhead of reassembly at the final recipient. Shorter packets, on the other hand, are transmitted faster, which makes them more suitable for real-time communications, but at the cost of higher transmission overhead caused by extra headers. As different applications set different limits on packet size, flexible and efficient adaptation to traffic variation is essential in the 5GIoT and LTEIoT systems to conform to the specified requirements of diverse applications. Therefore, the model allows various payload sizes for IoT data in the presence of the IPHC compression method. The payload sizes are in the range PS ∊ {10, 20, 30, 40, 50, 60, 70, 80, 90, 100} bytes, with 40 bytes as the default value. In this way, the performance of the 5GIoT and LTEIoT systems is measured with and without fragmentation and reassembly overhead, and the optimal lengths of IoT packets are determined.
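
The per-packet header cost behind this trade-off can be quantified with a small sketch; the 12-byte combined header is an assumed value for illustration only:

```python
def header_overhead(payload_bytes: int, header_bytes: int = 12) -> float:
    """Fraction of each packet consumed by headers (assumed 12-byte header)."""
    return header_bytes / (payload_bytes + header_bytes)

# Sweeping the model's PS set shows why short packets pay more overhead:
# a 10-byte payload spends ~55% of each packet on headers, a 100-byte ~11%.
overheads = {ps: round(header_overhead(ps), 3) for ps in range(10, 101, 10)}
```

The same calculation run per frequency band and payload size is what lets the model weigh real-time responsiveness (short packets) against bandwidth efficiency (long packets).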

3.3 Performance modeling

When developing an IoT system, it is important to evaluate its performance and capabilities to ensure that the system meets the specified requirements and objectives. This is achieved by utilizing parameters that take into account different criteria, such as time sensitivity, speed, and service quality. The parameters to evaluate the performance of 5GIoT and LTEIoT systems include throughput, reliability in terms of data loss rate, end-to-end (E2E) latency, jitter, signal to interference plus noise ratio (SINR), fairness index (FI), and channel efficiency (CE).

Because network resources are limited, they must be fairly allocated and managed to fulfill the needs of all requests. Fairness is characterized as the proper allocation of limited system resources among users upon request. The model takes the fairness issue in the 5GIoT and LTEIoT systems into account by using Jain's fairness index (FI), a well-known metric for evaluating the fairness of resource allocation [35]. It is a number between zero and one, with a value closer to one signifying better resource distribution. The model measures fairness based on the throughput of every connected IoT device (Ti), defined as the total received bits over the total receiving time interval. The findings are used to evaluate the level of fairness the 5GIoT and LTEIoT systems provide for IoT resource allocation compared to that of the standard IoT system. The model measures the fairness index using Eq. (1).

$$ \begin{aligned} & {\text{FI}} = \frac{{\left( {T_{1} + T_{2} + \cdots + T_{{N_{{{\text{IoT}}}} }} } \right)^{2} }}{{N_{{{\text{IoT}}}} \times \left( {T_{1}^{2} + T_{2}^{2} + \cdots + T_{{N_{{{\text{IoT}}}} }}^{2} } \right)}} \\ & {\text{where:}}\left\{ \begin{gathered} N_{{{\text{IoT}}}} \;{\text{is the number of active IoT devices}}\; \in \;\{ 5,10,15,20,25,30,35,40,45,50\} \hfill \\ T_{i} \;{\text{is the throughput of the }}i{\text{th IoT device}},\;\forall i:1 \le i \le N_{{{\text{IoT}}}} \hfill \\ \end{gathered} \right. \\ \end{aligned} $$
(1)
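
As a sketch, Eq. (1) translates directly into a few lines of Python; the sample throughputs below are illustrative, not measured values:

```python
def jain_fairness(throughputs):
    """Jain's fairness index: (sum T_i)^2 / (N_IoT * sum T_i^2), in (0, 1]."""
    n = len(throughputs)
    total = sum(throughputs)
    squares = sum(t * t for t in throughputs)
    return total * total / (n * squares)

# Equal throughput across devices yields perfect fairness (1.0), while one
# device monopolizing the channel among four yields the minimum 1/N = 0.25.
jain_fairness([20, 20, 20, 20])   # -> 1.0
jain_fairness([20, 0, 0, 0])      # -> 0.25
```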

Channel efficiency (CE) is a helpful indicator to measure the quality of services and determine the efficiency of bandwidth utilization. It is expressed as a percentage of successfully transmitted data relative to the theoretical value, with a higher number indicating better performance [36]. The model measures CE using Eq. (2).

$$ \begin{aligned} & {\text{CE}} = \frac{{\sum\nolimits_{i = 1}^{{N_{{{\text{IoT}}}} }} {{\text{Avg}}T_{i} } }}{{R_{{{\text{DT}}}} \times N_{{{\text{IoT}}}} }} \times 100 \\ & {\text{where:}}\left\{ \begin{gathered} R_{{{\text{DT}}}} \;{\text{is the data transfer rate}} \hfill \\ {\text{Avg}}T_{i} \;{\text{is the average throughput of the }}i{\text{th IoT device over time}},\quad \forall i:1 \le i \le N_{{{\text{IoT}}}} \hfill \\ \end{gathered} \right. \\ \end{aligned} $$
(2)
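
Equation (2) can likewise be sketched in Python; the values below are illustrative, matching the model's 20 Kbps default rate:

```python
def channel_efficiency(avg_throughputs_kbps, rdt_kbps):
    """CE (%) per Eq. (2): achieved throughput over the theoretical maximum."""
    n_iot = len(avg_throughputs_kbps)
    return sum(avg_throughputs_kbps) / (rdt_kbps * n_iot) * 100

# Five devices each averaging the full 20 Kbps default rate -> 100%;
# fifty devices averaging only 2 Kbps each -> 10%.
channel_efficiency([20] * 5, 20)   # -> 100.0
channel_efficiency([2] * 50, 20)   # -> 10.0
```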

The performance modeling is further extended to determine the relationship between the ISD factors and the achievable performance of the 5GIoT, LTEIoT, and standard IoT systems. For this purpose, the Pearson correlation (PC) and p-value (PV), with the alpha significance level set at 0.05 (5%), are used [37].
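
A pure-Python sketch of the Pearson coefficient underlying this correlation analysis is shown below; in practice, the p-values tested against alpha = 0.05 would typically come from a statistics package such as `scipy.stats.pearsonr`, which is omitted here:

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    std_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    std_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (std_x * std_y)

# A perfectly linear relationship between an ISD factor (e.g. N_IoT) and a
# performance metric gives r close to 1.0, or -1.0 for an inverse relation.
pearson_r([5, 10, 15, 20], [20, 40, 60, 80])   # ~ 1.0
```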

The hardware specifications of the system used to implement the model include an Intel(R) Core-i7 CPU with 8 GB of memory running the Kali Linux 2022 operating system. The simulation time is 300 s, during which the performance modeling parameters are measured to evaluate 5GIoT, LTEIoT, and standard IoT systems in diverse use cases. Table 2 summarizes the notations and descriptions used in the model.

Table 2 Notations and descriptions used in the model

4 Results and discussion

The model is implemented and the results are presented in this section. The performance of 5GIoT and LTEIoT systems is measured and compared to each other to first verify the functionality of the two systems as well as the benefits and challenges associated with 5G and LTE technologies when used in the IoT domain, and then to determine whether one system outperforms another to meet diverse IoT requirements. In addition, the performance of 5GIoT and LTEIoT systems is compared to the standard IoT to determine their level of efficiency and the extent to which they can provide improvements to the IoT domain.

4.1 Capacity analysis

To gain precise insight into the capacity of 5GIoT and LTEIoT systems, their performance is evaluated as the number of active IoT devices and demand for resources increase. The findings establish the resource management capabilities of the systems as well as the optimal number of IoT devices to deliver services consistently and reliably. The throughput and loss rate findings are provided in Fig. 2.

Fig. 2

Capacity analysis relating to throughput (left) and data loss rate (right)

The results imply multiple findings. Different frequency bands in the 5GIoT system attain the same performance for different numbers of IoT devices: the low n28 frequency band performs similarly to the mid n3, n7, and n78 bands and the mmWave n258 and n260 bands. Likewise, the same performance is observed in the LTEIoT system for the low b20 band as well as the mid b3 and b7 bands. This gives IoT developers and service providers great flexibility in adopting alternative frequency bands based on available resources. The results also demonstrate that 5GIoT achieves the same performance as LTEIoT, with no noticeable differences. The throughput results imply that 5GIoT and LTEIoT improve on the standard IoT system when the number of active IoT devices is less than 20, after which there are no significant differences between the three systems. The improvement level, on the other hand, decreases for all three systems due to more intensive competition for the transmission channel as more IoT devices are connected and involved in data communication. The highest throughput is obtained when 5 active IoT devices are connected to the systems, which is as high as the default transfer rate (20 Kbps). Then, as more devices are connected, throughput drops, so that for 50 devices it is approximately 2 Kbps, one-tenth of the default rate.

Concerning the packet loss rate, no significant differences are observed between the 5GIoT and LTEIoT systems and the standard IoT system, while increasing the number of IoT devices directly increases the loss rate. The findings suggest restricting the number of active IoT devices to below 20 to obtain better performance in the 5GIoT and LTEIoT systems. This high packet loss rate can be attributed to the fact that communication between IoT devices is based on a centralized star topology, as shown in Fig. 1. Standard IoT systems offer both mesh and star topologies. However, due to the cost and complexity associated with mesh arrangements of IoT devices, the star topology is used in the model. While conserving battery life, the star topology has a simpler structure and is less complex than mesh. However, because IoT devices cannot interact directly and all communication is routed through a single EGW, there may be more collisions and retransmissions, resulting in a higher packet loss rate.

Although 5G and LTE can enhance network performance, their implementation in real-time IoT systems must be efficient and conform to their explicit timing constraints to achieve that enhancement. As a result, in addition to speed and reliability requirements, the performance of 5GIoT and LTEIoT systems for latency-critical IoT applications must be evaluated. Figure 3 depicts the latency and jitter results under a varying number of IoT devices.

Fig. 3

Capacity analysis relating to latency (left) and jitter (right)

The results confirm that increasing network density causes higher delay values in the 5GIoT and LTEIoT systems compared to standard IoT. Delay is the total time between the generation of a packet and its successful reception at the final destination. It is well known that as more devices attempt to access the transmission channel to send data, increased congestion causes packet buffering and extends transmission time, because packets cannot be processed at the rate they arrive. This buffering occurs in every core component as packets travel through the network toward their destination. As a result of having more core components, the 5GIoT and LTEIoT systems experience more buffering and hence more latency. Consequently, the results confirm that IoT data transmission through the multi-hop core components of 5GIoT and LTEIoT increases delay and jitter compared to single-hop communication in standard IoT systems.

While both 5GIoT and LTEIoT result in higher latency than standard IoT, the latency is higher in the 5GIoT system than in LTEIoT. The latency in the 5GIoT system begins at 0.4 s and then grows as the system becomes more congested with additional devices, reaching 0.9 s at its peak. At this point, the system has reached its maximum bandwidth capacity and is unable to accommodate further traffic, so latency begins to decrease as a consequence of data loss and fewer packets in the network. Similarly, the initial delay in the LTEIoT system is around 0.2 s, which, as a result of growing congestion, increases to 0.8 s at its highest level and then decreases due to lost packets. In comparison, the standard IoT system begins with a 0.05 s delay and reaches 0.3 s at its highest level, which is significantly lower than those of the 5GIoT and LTEIoT systems. Therefore, it is concluded that, to conform to latency requirements, while the findings show better performance for standard IoT, LTEIoT outperforms 5GIoT. To further investigate the performance of the systems, the channel efficiency and the fairness of resource distribution among IoT devices are measured as more devices connect to the systems. The findings are shown in Fig. 4.

Fig. 4

Capacity analysis relating to fairness (left) and channel efficiency (right)

Fair resource distribution occurs when resources are allocated according to demands and requirements, with 50% fairness taken as the threshold below which resource distribution is considered inefficient. According to the findings, the 5GIoT system distributes and manages resources more fairly than the LTEIoT and standard IoT systems. It initially delivers 100% fairness, meaning resources are distributed fairly among devices, and maintains this level even as the number of IoT devices grows to 15. Beyond that point, fairness begins to decline but remains high as more devices connect to the system and request resources. At the highest point of congestion, with 50 active IoT devices, the average fairness of 50% is still maintained.
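One common way to quantify such fairness is Jain's index, which ranges from 1/n (one device monopolizes everything) to 1.0 (perfectly even allocation). The paper does not name its exact metric, so this is an illustrative sketch only:

```python
def jains_fairness(allocations):
    """Jain's fairness index: (sum x)^2 / (n * sum x^2)."""
    n = len(allocations)
    return sum(allocations) ** 2 / (n * sum(x * x for x in allocations))

# Even shares are perfectly fair; a skewed allocation drives the
# index toward its 1/n floor.
even = jains_fairness([2.0, 2.0, 2.0, 2.0])    # perfectly fair
skewed = jains_fairness([8.0, 0.0, 0.0, 0.0])  # one device gets everything
```

With four devices, the even allocation scores 1.0 and the fully skewed one scores 0.25, matching the intuition that a 50% index already signals a substantial imbalance.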

With regard to the LTEIoT system, an initial fairness level of 80% is provided and maintained for up to 20 active IoT devices, which is higher than the standard IoT. As more devices connect, resource allocation fairness falls below that of the standard IoT, and when the number of IoT devices exceeds 30, it drops below the acceptable level, bottoming out at about 20% when the system is highly congested with the resource demands of 50 IoT devices. The fairness comparison therefore shows that 5GIoT is the most efficient of the three systems, maintaining a higher degree of fairness among IoT devices even under the most congested conditions. The LTEIoT system provides better resource allocation and fairness than the standard IoT system when the number of IoT devices is below 25, beyond which its fairness declines to a lower level. In terms of channel efficiency, the 5GIoT and LTEIoT systems deliver comparable results while outperforming the standard IoT. Both systems initially reach the maximum channel efficiency of 100% for data transmission, whereas the standard IoT peaks at 80%. As more IoT devices join and attempt to transmit data, channel efficiency decreases in all three systems, with the steepest reduction in the standard IoT. However, at the highest density with 50 active devices, there are no significant differences between the three systems, all of which reach a channel efficiency of about 10%.

4.2 Data transfer rate assessment

Increasing the transfer rate is an alternative approach to speed up data access and decision-making while ensuring that data-intensive IoT applications function effectively. However, transmitting at higher rates consumes more power, which is undesirable for energy-constrained IoT devices. Determining the optimal IoT data transfer rate for the given systems is therefore essential to support diverse applications while meeting energy constraints. Accordingly, the 5GIoT and LTEIoT systems are deployed with varying transmission rates for the data collected from IoT devices. The throughput and loss rate evaluations are shown in Fig. 5.

Fig. 5

Influence of IoT data transfer rate variations on throughput (left) and data loss rate (right)

According to the findings, 5GIoT and LTEIoT achieve higher throughput than the standard IoT system, with LTEIoT outperforming 5GIoT. In both systems, increasing the data rate from 5 to 20 Kbps improves throughput, but beyond that point there are no significant changes. When high-speed mode (RDT = 250 Kbps) is enabled on IoT devices, the highest throughput values achieved by the LTEIoT and 5GIoT systems are about 6 and 5.2 Kbps, respectively. In this case, the actual throughput is substantially lower than the rate offered to the systems. This implies that, while raising the transfer rate to the maximum feasible level is expected to improve performance, the observed improvement is minor: the highest throughput with RDT = 30 Kbps is around 5.8 Kbps, versus 6 Kbps with RDT = 250 Kbps, an insignificant difference. As a result, RDT = 20 Kbps is determined to be the optimal data transfer rate in the 5GIoT and LTEIoT systems; transmitting above it is inefficient because it consumes more power to obtain essentially the same throughput as the lower, less power-hungry rate of 20 Kbps.
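The rate-selection logic above (take the lowest rate whose throughput is already close to the saturation value) can be sketched as follows. The 5% tolerance and the sample throughput curve are illustrative assumptions shaped like the reported trend, not the paper's measured values:

```python
def optimal_rate(curve, tol=0.05):
    """Given {rate_kbps: throughput_kbps}, return the lowest rate whose
    throughput is within `tol` of the best achievable throughput."""
    best = max(curve.values())
    for rate in sorted(curve):
        if curve[rate] >= (1 - tol) * best:
            return rate

# Hypothetical saturation curve: gains flatten out above 20 Kbps,
# so paying the energy cost of higher rates buys almost nothing.
curve = {5: 2.0, 10: 4.0, 20: 5.7, 30: 5.8, 250: 6.0}
```

For this curve the function selects 20 Kbps, mirroring the knee-point reasoning used to pick the optimal RDT.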

Despite considerable differences in throughput, 5GIoT and LTEIoT provide a comparable level of link reliability as measured by the loss rate of IoT data. The findings show that increasing the IoT data transfer rate increases packet loss, because communication occurs at a faster rate than the systems’ capacity allows. In full-speed mode (RDT = 250 Kbps), the packet loss rate of the 5GIoT and LTEIoT systems is 0.95, comparable to that of the standard IoT system; at this point the systems have practically failed in terms of successful data delivery. Consequently, transfer rates of up to 20 Kbps are sufficient to ensure a reliable connection. On the other hand, because real-time and delay-sensitive IoT applications demand certain data rates for smooth and seamless transmission, it is necessary to assess end-to-end performance as the data transfer rate varies in the 5GIoT and LTEIoT systems and to determine the data rates that fulfill real-time demands. The end-to-end delay and jitter findings are presented in Fig. 6.

Fig. 6

Influence of IoT data transfer rate variations on latency (left) and jitter (right)

The results reveal that the increased rate of IoT data transmission directly affects the end-to-end delay in the 5GIoT and LTEIoT systems. As the data transfer rate increases, more packets are sent across the network in a shorter time, exhausting the link's transmission capacity and requiring packets to be buffered before forwarding. The LTEIoT system manages the congestion induced by large traffic volumes better than 5GIoT, achieving lower latency. In both systems, the delay is initially minimal (0.1 s) and grows with the data rate, reaching 1.3 s for LTEIoT and 1.6 s for 5GIoT at full speed (RDT = 250 Kbps). Comparing these results with the standard IoT system shows that while LTEIoT mitigates the effects of congestion caused by high data rates and improves performance, the 5GIoT system provides delay values comparable to those of the standard IoT system, with no significant improvement. These findings are consistent with the jitter results, which indicate that the LTEIoT system outperforms 5GIoT by providing lower jitter and a more stable link for IoT data communication. The performance of the 5GIoT and LTEIoT systems is further examined to verify the optimal IoT transfer rate with respect to the required levels of fairness and channel efficiency. The results are provided in Fig. 7.

Fig. 7

Influence of IoT data transfer rate variations on fairness (left) and channel efficiency (right)

Congestion is one of the primary causes of network performance degradation. It is commonly caused by either an increase in the number of devices or an increase in the data transmission rate, both of which generate large traffic volumes in the network. The above results confirm this: as the transfer rate of IoT data increases, the fairness of handling requests and distributing resources decreases. The 5GIoT system initially delivers a high level of fairness (90%). Although fairness falls as the transfer rate increases, the system maintains it at approximately 70% even in full-speed mode, when the network is heavily congested at the highest possible data rate (RDT = 250 Kbps). Given that any fairness above 50% is considered acceptable, 5GIoT can therefore distribute resources fairly even under congestion. In contrast, the LTEIoT system provides less fairness than both 5GIoT and the standard IoT system. Although the fairness level of LTEIoT stays above the threshold at transfer rates of up to 10 Kbps, it thereafter drops and remains between 40 and 44%, below the acceptable level. When it comes to channel efficiency, both 5GIoT and LTEIoT deliver results similar to the standard IoT system, without substantial differences. According to the results, efficient communication is achieved at lower data rates.

4.3 Coverage evaluation

In order to validate the capability of the 5GIoT and LTEIoT systems to provide remote access to IoT data, the maximum distance they can cover must be determined. Thus, the performance of the systems is measured and evaluated over various distance ranges, as shown in Table 1, to determine the optimal and maximum distances achieved by the systems. The throughput and loss rate results are provided in Fig. 8.

Fig. 8

Comparison between coverage distance and throughput and loss rate in: a standard IoT, b 5GIoT-n28, c 5GIoT-n3, d 5GIoT-n7, e 5GIoT-n78, f 5GIoT-n258, g 5GIoT-n260, h LTEIoT-b20, i LTEIoT-b3, j LTEIoT-b7

Contrary to the capacity and data rate measurements, which revealed no substantial differences between the frequency bands, the above results reveal significant differences in their maximum transmission distance. The very limited coverage of the standard IoT system confirms its limitation in long-distance transmission: while it performs optimally at distances of up to 50 m, its maximum coverage is 90 m, beyond which the system fails to function. The 5GIoT and LTEIoT systems, on the other hand, extend the coverage distance significantly, reaching up to 19,000 m at maximum. The results confirm that lower frequencies in the 5GIoT and LTEIoT systems cover larger distances than higher frequencies, owing to power attenuation and signal loss with distance. In this respect, the n28 band sustains the longest 5GIoT links at around 1300 m, as opposed to 300 m for the n260 mmWave band.

Using the n28 Sub-1 band (700 MHz), optimal performance for the 5GIoT system is achieved over links of up to 1300 m. Increasing the frequency from Sub-1 to Sub-6, including the n3 (1800 MHz), n7 (2600 MHz), and n78 (3500 MHz) bands, yields the same performance but shorter reach, with the longest reliable links at around 800 m, 600 m, and 450 m, respectively. For the mmWave bands, whose frequencies are far higher than Sub-1 and Sub-6, 5GIoT cell coverage shrinks further, with 400 m and 300 m being the optimal transmission distances of the n258 (26 GHz) and n260 (39 GHz) bands, respectively. In comparison to 5GIoT, the LTEIoT system provides substantially longer distances. The b20 Sub-1 band (800 MHz), operating close to the n28 frequency, can provide coverage of up to 19,000 m, compared to 1300 m for n28. While retaining the same performance as Sub-1, increasing the frequency to Sub-6 using the b3 (1800 MHz) and b7 (2600 MHz) bands reduces the transmission distances to about 15,000 m and 11,000 m, respectively, compared to 800 m and 600 m for n3 and n7. As a result, whereas the standard IoT system has the shortest transmission distance (90 m), the 5GIoT and LTEIoT systems meet the long-distance coverage criteria. The longest link, up to 19,000 m, is covered by the LTEIoT system using the b20 band, which outperforms 5GIoT's maximum of 1300 m using the n28 band. To further determine the performance of real-time IoT services delivered by the 5GIoT and LTEIoT systems over long distances, the delay and jitter findings are provided in Fig. 9.
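The frequency-versus-reach trade-off follows from basic propagation physics: free-space path loss grows with both distance and carrier frequency, so for a fixed link budget a higher-frequency band exhausts its margin at a shorter distance. A minimal sketch, using the free-space model only (real deployments add shadowing and fading):

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB (Friis formula, d-in-km / f-in-MHz form)."""
    return 20 * math.log10(distance_m / 1000.0) + 20 * math.log10(freq_mhz) + 32.44

# Over the same 1300 m link, the n258 mmWave carrier (26 GHz) suffers far
# more loss than the n28 Sub-1 carrier (700 MHz), which is why its usable
# cell radius is much shorter for the same link budget.
loss_n28 = fspl_db(1300, 700)
loss_n258 = fspl_db(1300, 26000)
```

The gap between the two carriers is about 31 dB at any common distance, since the frequency term contributes 20 log10(26000/700) dB regardless of range.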

Fig. 9

Comparison between coverage distance and latency and jitter in a standard IoT, b 5GIoT-n28, c 5GIoT-n3, d 5GIoT-n7, e 5GIoT-n78, f 5GIoT-n258, g 5GIoT-n260, h LTEIoT-b20, i LTEIoT-b3, j LTEIoT-b7

IoT real-time services are highly time-sensitive, and their functionality depends on low network latency and jitter. The above results imply that while the standard IoT system has the shortest coverage distance, it also provides the lowest latency, with a maximum of 0.4 s, compared to the LTEIoT and 5GIoT systems. The 5GIoT results corroborate the findings on the optimal distances reached by the different frequency bands: the maximum delay measured for Sub-1, Sub-6, and mmWave communication is up to 0.8 s over their respective optimal distances. This implies that the n28 band, with an optimal distance of up to 1300 m, can deliver IoT data with the same latency as the n3, n7, n78, n258, and n260 bands at their optimal coverage distances of up to 800 m, 600 m, 450 m, 400 m, and 230 m, respectively.

For the LTEIoT system, however, different findings are observed. While the optimal and maximum distances reached by the b20 band for data-intensive applications are around 19,000 m, the optimal distance for real-time services is 11,000 m with a delay of 0.8 s, and the maximum distance is 19,000 m with a delay of 0.7 s. When using the b3 band, the optimal and maximum distances are 11,000 m and 15,000 m, with delays of 0.7 s and 0.1 s, respectively. The b7 band, like b3, provides the same delay values over optimal and maximum distances of 7000 m and 11,000 m, respectively. In terms of jitter, the 5GIoT and LTEIoT systems obtain results comparable to the delay findings. It is therefore concluded that the LTEIoT system, by providing greater coverage distance and lower latency than 5GIoT, can better fulfill the delay requirements of real-time IoT services.

In addition to evaluating the efficiency of the systems for communication over long distances, it is also important to determine the quality of that communication. Therefore, the SINR of communication in the 5GIoT and LTEIoT systems is measured. The SINR results are provided in Fig. 10.

Fig. 10

Comparison between coverage distance and SINR distribution in a standard IoT, b 5GIoT-n28, c 5GIoT-n3, d 5GIoT-n7, e 5GIoT-n78, f 5GIoT-n258, g 5GIoT-n260, h LTEIoT-b20, i LTEIoT-b3, j LTEIoT-b7

It is well known that SINR degrades with distance as the signal travels through the transmission medium. Compared to standard IoT systems, 5GIoT and LTEIoT significantly extend the range and provide higher SINR for distances up to D5, beyond which signal quality deteriorates (D6–D11). The results show that different frequency bands behave differently; regardless of the type of IoT system, the quality of the received signal degrades over longer distances until no signal is detected, causing the IoT data transmission process to fail. Moreover, lower frequencies offer higher signal quality during data transmission. As defined by the standard, the SINR range for 5G is − 23 to 40 dB, whereas the top margin for LTE is set at 20 dB. By this criterion, the SINR level is considered excellent if it is greater than 20 dB, good if it is between 13 and 20 dB, and fair to poor if it is less than 13 dB [38]. Accordingly, signal quality over the provided distances is better in the 5GIoT than in the LTEIoT system: the 5GIoT n28 band offers the highest signal quality, while the b7 band provides the lowest in the LTEIoT system. Figure 11 presents a clearer visualization of the systems' SINR by frequency band over the covered distances.
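The quality thresholds cited from [38] can be captured directly in code. The function name and the handling of the exact 20 dB and 13 dB boundaries are assumptions, since the text gives ranges rather than code:

```python
def sinr_grade(sinr_db):
    """Classify link quality using the thresholds in [38]:
    > 20 dB excellent, 13-20 dB good, < 13 dB fair to poor."""
    if sinr_db > 20:
        return "excellent"
    if sinr_db >= 13:
        return "good"
    return "fair to poor"
```

Applied to the reported results, links near the cell edge (D6–D11) would fall into the "fair to poor" bucket as SINR decays with distance.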

Fig. 11

SINR distribution over distance

4.4 Packet characteristics analysis

Due to the diversity of IoT applications, different packet structures are required for different services. While faster transmission of shorter packets can satisfy the data rate requirements of certain applications, such as real-time services, the resulting increase in traffic volume may limit other services. Larger packets, on the other hand, reduce traffic volume at the cost of higher data transmission latency and the overhead imposed by the fragmentation and reassembly process. It is therefore essential to evaluate how flexibly and efficiently the IoT systems adapt to traffic variations and to determine the optimal packet attributes as a key to their successful deployment. The model is implemented with respect to packet characteristics, varying the packet size (small, medium, and large) in the presence of the 6LoWPAN protocol and the IPHC compression algorithm. Figure 12 depicts the throughput and loss rate of IoT data.

Fig. 12

Packet structure effects on throughput (left) and data loss rate (right)

The above results indicate that when the same packet structure is used, the 5GIoT and LTEIoT systems achieve comparable performance, and both outperform the standard IoT system for packets smaller than 70 B. The findings also show that, regardless of the type of IoT system, increasing the packet size improves performance by providing higher throughput and a lower data loss rate. The improvement, however, persists only until the IoT data size reaches 70 B, after which performance begins to decline. Because the results identify a payload size (PS) of 70 B as the threshold at which a state transition occurs and the systems' behavior changes significantly, it is specified as the maximum transmission unit (MTU) in the 5GIoT and LTEIoT systems, comparable to the standard IoT system. To validate this, the operation of the systems is monitored during data transmission, and the captured results, with a resolution of one size step before and one after the threshold (70 B), are presented in Fig. 13.

Fig. 13

No fragmentation for PS ≤ 70 B (left) compared to fragmentation for PS > 70 B (right)

The above results demonstrate that 70 B payloads produce 123 B packets, implying that the headers total 53 bytes. Because the maximum packet size of the standard IoT is 127 bytes, this leaves 74 bytes for the payload: any payload larger than 74 B produces a packet larger than 127 B and triggers fragmentation. This explains the sudden state transition in the throughput and loss rate results at a payload size of 70 B. The figure on the left shows that payloads of up to 70 B are not fragmented, since a 70 B payload yields a 123 B packet, below the maximum allowable size (127 B) [39]. The figure on the right, in contrast, depicts the fragmentation process: 80 B payloads generate 133 B packets, which exceed the maximum allowed size. Fragmentation consumes system resources and degrades performance. As a result, payloads of up to 74 B avoid fragmentation and improve the performance of the 5GIoT and LTEIoT systems, making 74 B the optimal payload size. To further determine the effects of the fragmentation and reassembly process on time-constrained IoT services in the 5GIoT and LTEIoT systems, the delay and jitter values are measured and the results are shown in Fig. 14.
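The fragmentation arithmetic in this paragraph reduces to a single comparison against the 127 B IEEE 802.15.4 frame limit, using the 53 B of header overhead inferred from the measurements (a 123 B frame for a 70 B payload):

```python
MAX_FRAME = 127    # IEEE 802.15.4 maximum frame size in bytes
HEADER_BYTES = 53  # inferred from the results: 123 B frame - 70 B payload

def needs_fragmentation(payload_bytes):
    """A payload fragments once payload + headers exceeds the frame limit."""
    return payload_bytes + HEADER_BYTES > MAX_FRAME

max_payload = MAX_FRAME - HEADER_BYTES  # 74 B, the fragmentation threshold
```

This reproduces both observations from Fig. 13: a 70 B payload (123 B frame) stays whole, while an 80 B payload (133 B frame) fragments.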

Fig. 14

Packet structure effects on latency (left) and jitter (right)

While the 5GIoT and LTEIoT systems incur higher latency than standard IoT due to their additional core components, the two systems achieve comparable performance: IoT applications with the same packet structure obtain equivalent latency values in 5GIoT and LTEIoT. The optimal performance in both systems is achieved when IoT devices generate data with 70-byte payloads. As the payload size approaches 70 B, delay improves, enhancing performance and fulfilling the requirements of real-time IoT services. In contrast, when IoT devices transmit payloads larger than 70 B, the resulting packet fragmentation and reassembly take longer and increase the overall end-to-end latency of the data delivery process.

4.5 Correlation analysis

The PC and PV are commonly used to quantify and establish relationships between variables. Accordingly, the model uses them to determine the relationship between the ISD factors and the performance achieved by the 5GIoT, LTEIoT, and standard IoT systems. Table 3 presents the PC and PV statistical analysis between the density of IoT devices and the achievable performance.

Table 3 Correlation between the density of IoT devices and performance

Because the PV is smaller than the alpha value (0.05) and the correlation is negative for the throughput, channel efficiency, and fairness parameters, these metrics are inversely correlated with NIoT. Therefore, as the number of active IoT devices increases, the throughput, channel efficiency, and fairness of the 5GIoT and LTEIoT systems decrease. On the other hand, the positive correlation values for the loss rate and latency of IoT data delivery confirm a positive linear relationship: increasing density in the 5GIoT and LTEIoT systems directly increases the loss rate and latency and degrades performance. Table 4 shows the relationship between the IoT data transmission rate and the performance achieved by the 5GIoT, LTEIoT, and standard IoT systems.
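A pure-Python sketch of the Pearson correlation used in this analysis (for the p-value one would typically rely on a statistics package such as `scipy.stats.pearsonr`); the density, latency, and throughput samples are hypothetical, chosen only to show the sign conventions discussed above:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical samples: latency grows with device density (positive r),
# while throughput shrinks with it (negative r).
density = [10, 20, 30, 40, 50]
r_latency = pearson_r(density, [0.1, 0.2, 0.4, 0.6, 0.9])
r_throughput = pearson_r(density, [9.0, 8.5, 7.0, 5.0, 3.0])
```

A positive r paired with a p-value below 0.05 supports a significant direct relationship, and a negative r an inverse one, exactly as read off Tables 3–5.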

Table 4 Correlation between IoT data transmission rate and performance

The positive correlation values, along with highly significant PV results, confirm that the throughput, IoT packet loss rate, and latency of data delivery all increase as IoT devices transmit at higher rates; for loss rate and latency, this reflects degraded performance caused by exhaustion of the transmission link capacity and packet buffering under growing traffic volumes. The relationship is stronger for the standard IoT system's throughput than for 5GIoT and LTEIoT. The negative correlation values, together with PV values below the threshold, imply that transmission at higher rates degrades network performance in terms of reduced fairness in resource distribution and less efficient utilization of the available bandwidth. Table 5 shows the PC and PV statistics used to establish the relationship between the payload size of IoT data and performance.

Table 5 Correlation between IoT data payload size with IPHC and performance

The negative correlation results show that payload size is inversely correlated with the performance of the IoT systems. However, because performance hinges on the 70 B MTU threshold, the relationship is not very strong: for payloads smaller than the MTU, performance improves, and as the payload grows beyond the MTU, performance degrades, yielding a low average correlation value. Contrary to the capacity, data rate, and payload size measurements, which revealed no substantial differences between the frequency bands, the results showed significant differences in transmission distance. Therefore, to establish the correlation between the coverage distance and the performance of the IoT systems, the PC and PV statistics are measured independently for each frequency band. The results are shown in Fig. 15.

Fig. 15

The p-value (left) and correlation (right) between coverage distance and performance

The negative correlation values show that, regardless of the type of IoT system, throughput and SINR are inversely related to coverage distance. Therefore, increasing the coverage distance reduces system throughput and SINR, with a stronger correlation for the LTEIoT and standard IoT systems than for 5GIoT. The same relationships are obtained for the loss rate of IoT data, except that it is positively correlated. In terms of latency, the correlation is positive for all frequency bands in the 5GIoT system, but in LTEIoT it is positive only for the b20 band, with a highly significant PV (0.00548); this band also provides the largest coverage distance compared to the b3 and b7 bands.

5 Conclusion

To overcome the constraints associated with the standard IoT system, this work develops a model with two IoT systems, called 5GIoT and LTEIoT. The model also includes the standard IoT, which serves as a reference system for comparison. The model is implemented to validate the systems and determine their performance for diverse IoT use cases. The network capacity results show that increasing the number of active IoT devices reduces the performance of all three IoT systems. The 5GIoT and LTEIoT outperform the standard IoT by improving fairness and channel efficiency while retaining comparable throughput and loss rates. Although the standard IoT system achieves the lowest latency, LTEIoT meets the criteria for delay-sensitive services better than 5GIoT. Regarding the speed requirements of IoT services, the throughput of 5GIoT and LTEIoT exceeds that of the standard IoT system, although all three achieve similar performance in other aspects except fairness: 5GIoT distributes resources more fairly than LTEIoT, whose fairness is even lower than that of the standard IoT system. When it comes to remote links and long-distance coverage, both 5GIoT and LTEIoT significantly outperform the standard IoT system, which covers only up to 90 m, compared to 19,000 m for LTEIoT using the b20 band and 1300 m for 5GIoT using the n28 band. To adapt to traffic variation across diverse services, keeping payload sizes below the fragmentation limit (74 B) improves performance, beyond which the performance of all three systems degrades; in this regard, both 5GIoT and LTEIoT achieve higher throughput than the standard system, which in turn outperforms them with lower IoT data latency and loss rate. As part of future work, the 5G and LTE features will be utilized to improve long-range wide area network (LoRaWAN) performance.