1 Introduction

The rapid progress of science and technology over the past decade has elevated the complexity of real-world optimization challenges, driving the creation of efficient and effective optimization algorithms. In recent times, addressing optimization challenges has emerged as a compelling and dynamic subject across various research domains. A rapidly expanding range of decision-making problems can be characterized as instances of optimization problems. The first step of optimization involves creating an objective function that can be either maximized or minimized. Once the optimization problem is formulated, an optimization algorithm must be employed to search for the variables that lead to the most favourable solution. Real-world optimization challenges are expressed in mathematical terms as outlined below:

$$\text{minimize } f\left(X\right)$$
$$\text{s.t. } g\left(X\right)\le 0$$
$$X\in {R}^{n}$$

where \(X=\{{x}_{1}, {x}_{2}, \dots , {x}_{n} \}\) is an \(n-\) dimensional solution, \({R}^{n}\) is the domain of definition, \(f\left(X\right)\) is the objective function, and \(g\left(X\right)\) is the constraint function.
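For illustration, the following is a minimal Python sketch of this formulation; the sphere objective and the linear constraint are hypothetical placeholders rather than problems used later in the paper.

```python
import numpy as np

def f(X):
    """Objective function to be minimized (illustrative sphere function)."""
    return np.sum(X ** 2)

def g(X):
    """Inequality constraint g(X) <= 0 (illustrative linear constraint)."""
    return np.sum(X) - 10.0

def is_feasible(X):
    """A candidate X in R^n is feasible when every constraint value is non-positive."""
    return g(X) <= 0.0

X = np.array([1.0, 2.0, 3.0])  # an n-dimensional candidate solution
print(f(X), is_feasible(X))
```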

Generally, optimization challenges can be addressed through deterministic or stochastic approaches. Deterministic methods such as linear and non-linear programming, which rely on gradient information, are commonly used to discover optimal solutions. However, these traditional deterministic approaches may converge to local optima. In order to address the constraints associated with conventional methods, stochastic approaches like meta-heuristic algorithms can be applied to tackle complex real-world optimization problems. Therefore, metaheuristic algorithms have gained significant popularity and are extensively employed to address practical optimization challenges across various domains, including cloud computing [1], scheduling [2], neural networks [3], feature selection [4], image segmentation [5], fuzzy control [6], photovoltaic models [7], civil engineering [8], reliability-based design [9], and more.

The primary advantages of meta-heuristic algorithms include their simplicity, flexibility, capacity to evade local optima, and mechanisms that operate without relying on derivatives. The search process of meta-heuristic algorithms involves two distinct phases: exploration and exploitation. During the exploration stage, the algorithm extensively covers the search space to identify promising regions that may lead to the optimal solution. Inadequate exploration may result in being trapped in local optima. The exploitation phase concentrates on searching around the promising regions identified during exploration. Failure to execute effective exploitation can substantially diminish solution accuracy. Striking a balance between exploration and exploitation stands as a significant challenge for meta-heuristic approaches. An effective meta-heuristic algorithm is characterized by: (i) an optimal balance between exploration and exploitation; (ii) a high level of accuracy; (iii) convergence towards the optimal solution while avoiding local optima; and (iv) consistent and stable performance, ensuring minimal variation in results across independent runs.

In recent years, numerous metaheuristic algorithms have been developed and implemented in a wide range of fields. For an in-depth and comprehensive review of these algorithms, please refer to Table 1. Despite the many recently proposed algorithms in the field of optimization, a fundamental question remains about the necessity of introducing more optimization techniques. The No Free Lunch (NFL) theorem logically establishes that no single algorithm can solve all optimization problems. Therefore, the success of an algorithm in addressing a specific set of problems does not ensure effectiveness across all optimization problems with varied types and characteristics. In simpler terms, most optimization methods perform similarly on average, even though they may excel at solving specific types of problems. The NFL theorem encourages researchers to develop new optimization algorithms tailored to specific subsets of problems in diverse fields. This motivation underlies the presented work, which introduces a simple yet effective optimization technique designed for real problems with unknown search spaces. However, despite the development of numerous metaheuristic algorithms aimed at discovering optimal solutions, significant gaps persist between the theoretical concepts and the practical implementation of these algorithms. Some of these gaps are as follows:

  • Metaheuristic algorithms may struggle to fully explore solution spaces.

  • Slow convergence can lead to premature settling for suboptimal solutions.

  • High-dimensional problems pose challenges due to increased computational demands.

  • Some methods require complex parameter tuning, making them resource-intensive.

  • Performance varies based on specific optimization problem characteristics.

  • Adaptation to real-world problems, especially in dynamic or uncertain environments, may be challenging for metaheuristics.

Table 1 Survey of related works of recently published metaheuristic algorithms

To address the problems in existing algorithms, researchers are encouraged to develop new and creative algorithms with practical solutions for current challenges. Such algorithms should reliably reach a solution, have a strong system for controlling and adjusting parameters, and perform well within reasonable time limits. This study also demonstrates that a meta-heuristic does not necessarily require direct inspiration from nature; basic mathematical functions, such as the hyperbolic \(sinh\) function, can instead be employed to create optimization algorithms. This research introduces the Hyperbolic Sine Optimizer (HSO), a novel meta-heuristic optimization algorithm designed to address a diverse range of optimization challenges in scientific applications. The algorithm places a primary emphasis on mathematical concepts, exploring mechanisms related to algebraic and hyperbolic functions applied to population members, and it uses the hyperbolic \(sinh\) function to navigate and exploit the search space, leading to improved solutions. A distinctive aspect of this study lies in the design of the exploration and exploitation phases, which depart from conventional mechanisms and rely entirely on the behavioural patterns associated with hyperbolic function convergence. Unlike many existing population-based meta-heuristic optimization algorithms, in which the population commonly stays comparatively inactive during the exploration phase, the HSO ensures that individual population members actively contribute to the exploration of the search region; their collective impact is harnessed to investigate the search region more efficiently. The effectiveness of any meta-heuristic algorithm is contingent upon achieving a well-balanced interplay between exploration and exploitation, and both stages of HSO are built on the hyperbolic \(sinh\) function, which helps expedite the search towards global minima or maxima. The exploitation phase relies on a set of algebraic equations based on population elements, further underlining the algorithm's versatility and efficacy in handling complex optimization challenges. Overall, the Hyperbolic Sine Optimizer presents a promising avenue for meta-heuristic optimization, offering a fresh perspective on exploration and exploitation dynamics in the pursuit of global extrema for diverse problem domains; further details of its operation are provided in the following sections.

The main contributions of this study are as follows.

  • HSO actively involves individual population members, deviating from conventional practices, ensuring a more comprehensive exploration of solution spaces.

  • Utilizes distinctive exploration and exploitation phases based on hyperbolic \(sinh\) function convergence to enhance convergence speed, mitigate premature convergence, and actively involve population members to address slow convergence.

  • Demonstrates efficiency in handling high-dimensional optimization problems, with comprehensive evaluations across dimensions (30, 100, 500, and 1000) showcasing applicability.

  • Utilizes mathematical principles to simplify parameter adjustment, employing behavioural dynamics with algebraic and hyperbolic \(sinh\) functions. This potentially reduces resource demands compared to complex-tuned algorithms.

  • Emphasizes mathematical concepts and unique exploration/exploitation phases, providing a robust, versatile approach to address variability in performance across different optimization problem characteristics.

  • Active involvement of population members in exploration enhances adaptability, and unique dynamics introduced by hyperbolic functions contribute to improved adaptability in real-world and uncertain environments.

The main objective of this paper is to develop a robust optimization algorithm suitable for addressing various challenges encountered in real-world optimization scenarios. This research effectively addresses substantial challenges in the research field related to the theoretical foundation by presenting a clear and straightforward layout. In contrast to other metaheuristic algorithms, this study stands out for its precise mathematical and theoretical underpinnings, exemplified by the incorporation of the hyperbolic \(sinh\) function.

2 Literature review

The focus of current research is on using metaheuristic algorithms for complex engineering problems, driven by their proven effectiveness. This has led to a substantial growth in literature on metaheuristic techniques. The theoretical research within literature can be categorized into three primary avenues: enhancing existing techniques, combining diverse algorithms, and suggesting novel algorithms. In the initial approach, scholars endeavour to enhance algorithmic performance by incorporating various mathematical or stochastic operators. The second prevalent research direction involves merging different algorithms to enhance overall performance or address particular issues. Finally, proposing new algorithms stands out as a widely pursued research path. The inception of innovative algorithms often draws inspiration from evolutionary phenomena, the collective behaviour of creatures (utilizing swarm intelligence techniques), fundamental physical principles, and concepts related to human experiences. The literature is progressively adopting classification methods based on these sources of inspiration, leading to the identification of four main categories of algorithms: swarm-based, evolutionary-based, physics/chemistry-based, and social/human-based algorithms. Table 1 summarizes notable metaheuristic algorithms.

  • Swarm intelligence based algorithms: Swarm Intelligence simulates labour division and collaboration among organisms, evolving the population through interactions. Examples of swarm intelligence based algorithms include: PSO [10], a widely recognized swarm-based metaheuristic algorithm introduced by Kennedy and Eberhart in 1995. Another classical algorithm is Ant Colony Optimization (ACO), proposed by Dorigo et al. [11], inspired by ants' foraging behaviours and the communication of chemical pheromone trails to find optimal paths. The Artificial Bee Colony (ABC) algorithm [12] is inspired by bee foraging, involving employed bees, onlooker bees, and scouts. Various swarm-based metaheuristic algorithms have gained attention, such as the Grey Wolf Optimizer (GWO) [13], simulating grey wolves' cooperative hunting with alpha, beta, delta, and omega groups. The Whale Optimization Algorithm (WOA) [14] mimics whale foraging, demonstrating strong optimization convergence. The Salp Swarm Algorithm (SSA) [15] is inspired by the salp chain, attracting attention across diverse fields. Harris Hawks Optimization (HHO) [16] excels in engineering optimization, simulating hawks' preying with distinct chasing patterns. The Marine Predator Algorithm [17] draws from predator–prey behaviours, utilizing velocity ratio, Levy, and Brownian movement. Seagull Optimization Algorithm (SOA) [18] mimics seagull foraging for optimization. Other notable algorithms include Monarch Butterfly [19], Lion [20], Pity Beetle [21], Squirrel Search [22], Butterfly [23], Slime Mould [24], Golden Eagle [25], Red Fox [26], Starling Murmuration Optimizer (SMO) [27], Aquila Optimizer (AO) [28], Gannet optimization algorithm (GOA) [29], Bacteria phototaxis optimizer (BPO) [30], Sea horse optimizer (SHO) [31], Coati Optimization Algorithm (COA) [32], Dwarf Mongoose Optimization Algorithm (DMOA) [33], Snake Optimizer (SO) [34], Reptile Search Algorithm (RSA) [35], Prairie Dog Optimization Algorithm (PDO) [36], Ebola Optimization Search Algorithm [37] and more.

  • Evolutionary algorithms: The second class of metaheuristic algorithms falls under the evolutionary-based category. These algorithms imitate biological evolution through processes like crossover, mutation, and selection, achieving evolution by preserving highly adaptable individuals (solutions). For instance, GA, introduced by Holland in 1975 [38], stands as a pioneer in metaheuristics, drawing inspiration from Darwin's theory of natural competition. This approach proves suitable for a diverse range of optimization problems. Storn and Price developed Differential Evolution (DE) [39], a widely used algorithm for optimization. Biogeography-based Optimization [40] is derived from the migration and mutation of biological organisms, with the best solution obtained by updating the habitat suitability index through migration and mutation. Additionally, various variants of evolutionary-based metaheuristics have been introduced, including Evolution Strategy [41], Gene Expression Programming [42], and Memetic Algorithm [43, 44].

  • Physics and chemistry based algorithms: Physics and chemistry-based algorithms, inspired by physical forces like electromagnetism and gravity, as well as chemical concepts, emphasize theoretical and mathematical aspects. This makes their convergence easier to demonstrate with field-specific theorems. Examples include simulated annealing [45], gravitational search algorithm [46], big-bang big-crunch algorithm [47], charged system search [48], ray optimization [49], stochastic fractal search [50], equilibrium optimizer [51], sine cosine algorithm [52], water cycle algorithm [53], thermal exchange optimization [54], Archimedes optimization algorithm (AOA) [55], Kepler Optimization Algorithm (KOA) [56], Fick's Law Algorithm (FLA) [57], Propagation Search Algorithm (PSA) [58], weighted mean of vector (INFO) [59] and others. Simulated annealing [45], inspired by the physical law of metal cooling and annealing, is effective in solving complex optimization problems. The gravitational search algorithm [46], drawing inspiration from the law of gravity, attracts particles based on mass weight to achieve optimal solutions.

  • Human based algorithms: The final category of metaheuristic algorithms is based on social or human behaviours. Brainstorm Optimization [60], developed by Shi, simulates intense ideological collisions among people. Each idea represents a candidate solution, and solution updates involve clustering and fusion. Teaching–Learning-Based Optimization [61] draws inspiration from the teaching and learning process in classrooms, where students learn not only from teachers but also from peers. TLBO is recognized as a high-quality algorithm in the metaheuristic field. Other notable social or human-inspired metaheuristics include Tabu Search [62], Harmony Search [63], Political Optimizer [64], Imperialist Competitive Algorithm [65], League Championship Algorithm [66], Interactive Autodidactic School [67], Arithmetic Optimization Algorithm [68], Language Education Optimization (LEO) [69], Skill Optimization Algorithm (SOA) [70], Group learning algorithm (GLO) [71], Growth Optimizer (GO) [72], Success History Intelligent Optimizer (SHIO) [73], Child Drawing Development Optimization (CDDO) [74] and more.

The constant progress and improvement of initial metaheuristic optimization algorithms represent an ongoing process of getting better and more innovative. Researchers continuously work on refining these fundamental algorithms by incorporating inventive methods and strategies (Table 2). Hybrid approaches take advantage of how different algorithms complement each other, making the exploration of the search space more effective and efficiently using potential solutions. This blending of ideas not only pushes the limits of optimization performance but also creates adaptable and strong algorithms that can handle various complex optimization challenges more effectively. In recent years, various enhanced and hybridized versions of existing metaheuristic algorithms have emerged. These variants are essential for optimizing algorithm performance, overcoming limitations, and adapting to diverse problem characteristics, ensuring efficient solutions. For instance, the Conscious Neighbourhood-based Crow Search Algorithm (CCSA) [75] addresses crow search algorithm concerns with three novel strategies, achieving outstanding results in benchmarks and engineering problems, surpassing state-of-the-art swarm intelligence. The Quantum-based Avian Navigation Optimizer Algorithm (QANA) [76] refines differential evolution, introducing self-adaptive quantum elements, success-based distribution, and V-echelon communication. QANA proves superior to DE and other swarm algorithms. The Enhanced Whale Optimization Algorithm (E-WOA) [77] improves feature selection, outperforming existing WOA variants. Binary E-WOA (BE-WOA) excels in medical datasets like COVID-19 detection. The Improved Binary QANA (IBQANA) [78] enhances binary metaheuristics for medical data pre-processing, surpassing seven alternatives. Binary QANA (BQANA) [79] optimizes medical feature selection, outperforming alternatives. Binary Starling Murmuration Optimizer (BSMO) [80] excels in feature selection, surpassing rivals. The Levy Arithmetic Algorithm [81] enhances the Arithmetic Optimization Algorithm, proving superior in diverse benchmarks and real-world scenarios. E-mPSOBSA [82] integrates the modified Backtracking Search Algorithm (BSA) and Particle Swarm Optimization (PSO) to improve global exploration and local exploitation. The hybrid HSMA [83] combines quadratic approximation with SMA, while the novel m-SCBOA [84] merges a modified Butterfly Optimization Algorithm with the Sine Cosine Algorithm, enhancing both exploratory and exploitative searches. E-SOSBSA [85], a hybrid of SOS and BSA, addresses SOS limitations through adaptive mutation and crossover operators. NwSOS [86] modifies the symbiotic organisms search algorithm, enhancing exploration/exploitation balance. ImBSA [87] improves BSA with a multi-population approach, diverse mutation strategies, and adaptive control parameters. MlSOS [88], an improved SOS variant, tackles local optima and weak convergence issues, demonstrating enhanced search performance on benchmarks. The modified DE algorithm in paper [89] addresses control parameter challenges in optimizing real-world problems. QRSMA [90] integrates SMA with quasi-reflection-based learning for improved diversity, convergence, and search efficiency. Similarly, gQR-BSA [91] modifies the backtracking search algorithm with quantum Gaussian mutations, adaptive parameters, and quasi-reflection-based jumping. mLBOA [92] is a BOA variant utilizing self-adaptive parameters and Lagrange interpolation, while m-DMFO [93] addresses premature convergence issues in the moth flame optimization algorithm.

Table 2 Tuning hyper parameters of various comparable metaheuristics, including the proposed ones

The preceding paragraphs showcase a range of metaheuristics devised by researchers and the broad domains where these algorithms find application. In addition, Table 25 presents a comprehensive overview of the advantages and disadvantages associated with various metaheuristic techniques, offering insights into their strengths and limitations for optimization tasks. The subsequent section presents the Hyperbolic Sine Optimizer (HSO), a novel metaheuristic optimization algorithm crafted to tackle a diverse array of optimization challenges in scientific applications.

3 The proposed algorithm: Hyperbolic Sine Optimizer

The Hyperbolic Sine Optimizer (HSO) is a novel meta-heuristic optimization method for scientific problems. HSO stands out for its utilization of algebraic and hyperbolic functions, particularly focusing on population members. The algorithm comprises two phases, exploration and exploitation, which rely on behaviour mechanisms associated with hyperbolic function convergence and thereby depart from conventional methods. Highlighting the significant role of population members in efficient search region exploration, HSO diverges from traditional population-based meta-heuristic optimization algorithms. While most algorithms minimize the influence of population members in exploring the search region, HSO emphasizes their importance for effective exploration. To address this research gap, the study proposes an optimizer that actively involves population members in the exploration process. Balancing exploration and exploitation is critical in any meta-heuristic algorithm, and the hyperbolic \(sinh\) function is introduced here to accelerate convergence towards global minima or maxima. In the exploitation phase, algebraic equations based on population constituents are employed, and a specific equation is proposed for position updates during the exploratory phase; these mathematical tools play a crucial role in achieving efficient optimization results. Overall, HSO introduces a new approach that considers population dynamics, behaviour mechanisms, and mathematical functions to enhance the exploration and exploitation phases in meta-heuristic optimization.

3.1 Exploration phase

The exploration stage within the framework of the Hyperbolic Sine Optimizer (HSO) signifies a revolutionary transformation in the field of meta-heuristic optimization. Unlike traditional methodologies where populations remain passive during exploration, the HSO takes an active approach by involving individual members and acknowledging their distinct contributions. This approach enhances overall efficiency by tapping into the combined impact of population members, introducing dynamism to the exploration process. Emphasizing the vital interplay between exploration and exploitation, the HSO utilizes the hyperbolic \(sinh\) function during the exploration phase, providing a new outlook on shaping exploration dynamics. The incorporation of the \(sinh\) hyperbolic function in behaviour analysis introduces an inventive aspect, prompting problem-solving and facilitating the effective traversal of the search region. Closely connected to mathematical principles, the exploration phase capitalizes on algebraic and hyperbolic functions, marking a significant advancement. In conclusion, the exploration phase of the HSO, characterized by active involvement and guided by mathematical principles, sets it apart with an innovative approach, ensuring adept handling of intricate optimization challenges and pursuit of global extrema across various problem domains.

$${x}_{i,j}^{\alpha }={random}_{uniform}\left({\text{min}}\left({x}_{i,j}^{p},{r}_{1}\right), {\text{max}}\left({x}_{i,j}^{p},{r}_{1}\right)\right),$$
(1)
$$\text{where } {r}_{1}={x}_{j}^{best}-\left({x}_{j}^{best}\times {\text{sinh}}\left({r}_{2}\times \frac{\pi }{2}\right)\right),$$
(2)

and \({r}_{2}\) is a random number between \(lb\) and \(ub\).

In the exploration phase, a random number \({x}_{i,j}^{\alpha }\) is generated from a uniform distribution between the population member \({x}_{i,j}^{p}\) and \({r}_{1}\), as given in Eq. (1). The quantity \({r}_{1}\) is defined in Eq. (2), and the original population members are thereafter replaced with these newly generated members. Here \({x}_{j}^{best}\) represents the local best of the population.

After all population members have been replaced, the best local solution \({[x]}_{1\times n}^{{\alpha }_{best}}\) is identified. The exploration phase then ends, and the exploitation phase begins.
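As an aid to reading Eqs. (1)–(2), the following Python sketch shows one possible vectorized interpretation of the exploration update; the function and variable names are illustrative, the bounds \(lb\) and \(ub\) are assumed to be supplied by the user, and this is a sketch rather than the authors' reference implementation.

```python
import numpy as np

def hso_exploration(pop, best, lb, ub):
    """Exploratory update following Eqs. (1)-(2); an illustrative sketch.

    pop  : (m, n) array of current population members x^p
    best : (n,)   array holding the local best solution x^best
    lb,ub: scalars or (n,) arrays with the lower/upper bounds of the search space
    """
    m, n = pop.shape
    r2 = np.random.uniform(lb, ub, size=(m, n))       # r2 drawn between lb and ub
    r1 = best - best * np.sinh(r2 * np.pi / 2.0)      # Eq. (2)
    lo = np.minimum(pop, r1)                          # lower end of the uniform interval
    hi = np.maximum(pop, r1)                          # upper end of the uniform interval
    return np.random.uniform(lo, hi)                  # Eq. (1): new exploratory members x^alpha
```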

3.2 Exploitation phase

The Hyperbolic Sine Optimizer (HSO) relies on a crucial exploitation phase for its meta-heuristic optimization. Unlike traditional methods, HSO prioritizes mathematical concepts, especially algebraic and hyperbolic functions within the population, recognizing the importance of exploration and exploitation in meta-heuristic algorithms. The HSO stands out by actively involving population members during search region exploration. In the exploitation phase, departing from common inactivity, HSO prioritizes individual contributions to enhance search region investigation efficiency, emphasizing the algorithm's commitment to harnessing the collective impact of population members. Researchers stress the exploitation phase's importance, relying on algebraic equations based on population elements. This underscores HSO's adaptability and effectiveness in handling complex optimization challenges, presenting a distinctive approach to meta-heuristic optimization.

During the exploitation phase, the following equations are used to update the population:

$${\left[X\right]}_{m\times n}^{\beta }=\begin{cases}{x}_{i,j}^{\beta }={r}_{4}/\left(1+{\text{sinh}}\left(\rho /{\delta }^{2}\right)\right), & {r}_{3}\ge \rho \\ {x}_{i,j}^{\beta }={x}_{i,j}^{\alpha }, & {r}_{3}<\rho \end{cases}$$
(3)
$$\text{where } \delta ={x}_{i,j}^{\alpha }\times \mu ,$$
(4)
$$\text{and } \mu =\left({x}_{i,j}^{\alpha }\times {x}_{j}^{{p}_{best}}\right)/\left(\rho \times \pi \right),$$
(5)

with \(\rho =0.5\), \({r}_{3}\) a random number between 0 and 1, and \({r}_{4}\) a random number between \(lb\) and \(ub\).

The target objective function value \({[f]}_{m\times 1}^{{\beta }_{obj}}\) is then evaluated for each search agent of the population \({[X]}_{m\times n}^{\beta }\). Subsequently, the best-fit search agent \({[x]}_{1\times n}^{{\beta }_{best}}\) is identified with respect to the objective function value \({f}_{i}^{{\beta }_{obj}}\). These procedures are repeated for a given number of iterations (Fig. 1).
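A minimal Python sketch of one reading of Eqs. (3)–(5) is given below; the names are illustrative, the small epsilon added to \(\delta^{2}\) is only a numerical guard against division by zero and is not part of the equations, and this should be taken as a sketch rather than the authors' implementation.

```python
import numpy as np

def hso_exploitation(pop_alpha, p_best, lb, ub, rho=0.5):
    """Exploitative update following Eqs. (3)-(5); an illustrative sketch.

    pop_alpha : (m, n) array of exploratory members x^alpha
    p_best    : (n,)   array holding the best member found so far x^{p_best}
    """
    m, n = pop_alpha.shape
    mu = (pop_alpha * p_best) / (rho * np.pi)                    # Eq. (5)
    delta = pop_alpha * mu                                       # Eq. (4)
    r3 = np.random.rand(m, n)                                    # r3 uniform in [0, 1]
    r4 = np.random.uniform(lb, ub, size=(m, n))                  # r4 drawn between lb and ub
    exploit = r4 / (1.0 + np.sinh(rho / (delta ** 2 + 1e-12)))   # Eq. (3), branch r3 >= rho
    return np.where(r3 >= rho, exploit, pop_alpha)               # keep x^alpha when r3 < rho
```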

Fig. 1
figure 1

Infographic of the proposed algorithm HSO

Algorithm 1
figure a

Hyperbolic Sine Optimizer (HSO)

3.3 Hypotheses about the HSO algorithm

For the following reasons, the HSO algorithm theoretically achieves the global optimum of optimization problems (Fig. 2):

  • Effective search space exploration is guaranteed by the population dispersion behaviour of each generation around the population members.

  • The population members' adaptively diverse boundaries guarantee efficient utilization of search space.

  • Utilizing random adaptive variables and population update techniques greatly increases the likelihood of escaping the local optimum stagnation.

  • A new exploration and exploitation phase based on hyperbolic functions gradually decreases the rates at which population members are modified over the course of iterations to assure convergence of the HSO algorithm.

  • Each iteration's population diversity is increased by producing random population members within the search boundary for each search agent.

  • Individual search agents are directed by the population to investigate more fruitful areas of the search space.

  • Each iteration's best fitness value is noted and compared with the other previous values.

  • HSO has a great ability to avoid local optima due to its population-based characteristics.

  • There aren't many parameters to adjust when using the HSO algorithm.

  • The HSO algorithm approaches the problem as a "black box," not using gradients.

Fig. 2
figure 2

Flow chart of the proposed algorithm HSO

4 Experimental results and discussions

The performance of the proposed HSO algorithm is assessed using 110 well-known standard benchmark functions, and the outcomes are compared with those of 15 other metaheuristic algorithms, including GA [38], PSO [10], BBO [40], FPA [103], GWO [13], BAT [104], FA [105], CS [106], MFO [107], GSA [46], DE [39], TSA [108], SCA [52], WOA [14], and AOA [68]. In addition, Table 8 showcases the outcomes produced by the proposed algorithm (HSO) and contrasts them with the results derived from eight recently introduced algorithms. After the experimental setup and the comparison algorithms, details of the benchmark problems are described first. The qualitative, quantitative, and scalability examinations of HSO are then completed. Seven real constrained and unconstrained optimization tasks are used to evaluate the performance of HSO.

4.1 Benchmark problems and experimental setup

A set of 23 standard, well-known optimization benchmark functions, including unimodal, multimodal, and fixed-dimensional functions, as well as 42 additional functions based on various modalities, including many local minima, bowl-shaped, plate-shaped, valley-based, steep ridges and drops, and others, are used to evaluate the effectiveness of the proposed HSO. Next, all 15 benchmark functions from the IEEE CEC-2015 suite and all 30 benchmark functions from the IEEE CEC-2017 suite are applied. A summary of the standard 23 and the further 42 benchmarking problems is provided in Appendices A and B, respectively. The IEEE CEC-2015 and IEEE CEC-2020 test suite collections, respectively, are included in Appendices C and D. To make fair comparisons between the proposed algorithm HSO and all other fifteen algorithms, each algorithm is subjected to 30 independent runs with 500 iterations per run and a population size of 30. The best, worst, mean, and standard deviation values for each function are reported for each algorithm. Results are computed on an Intel(R) Core(TM) i5 7200U processor running at 2.5–2.71 GHz, with 8 GB of primary memory and 1 TB of secondary memory.
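As an illustration of this evaluation protocol, the sketch below runs a generic optimizer repeatedly and summarises the best objective values; `optimizer` is a hypothetical callable standing in for HSO or any competitor, and the default run counts mirror the setup stated above.

```python
import numpy as np

def benchmark(optimizer, objective, runs=30, iterations=500, pop_size=30):
    """Repeat independent runs and report best/worst/mean/std of the best values found.

    `optimizer(objective, iterations, pop_size)` is assumed to return the best
    objective value of a single independent run (a placeholder interface).
    """
    best_values = np.array([optimizer(objective, iterations, pop_size)
                            for _ in range(runs)])
    return {
        "best": best_values.min(),
        "worst": best_values.max(),
        "mean": best_values.mean(),
        "std": best_values.std(),
    }
```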

4.2 Exploitation efficiency of proposed HSO

The findings of the first experiment, which was carried out using 23 common benchmark functions suggested by Yao et al., are shown in Appendix Table 40. A set of seven unimodal and six multimodal problems is considered, with a maximum dimension of 30. The number of function evaluations (30 × 500) is 15,000, which is the maximum number of viable evaluations. Table 3 presents the results for a total of 13 functions, of which 6 are multimodal and 7 are unimodal. Because the unimodal functions have a single (global) optimum, they serve as evidence of the proposed algorithm's effectiveness in exploitation. Moreover, as evidenced by the mean values in Table 3, the proposed HSO outperforms the other metaheuristics for unimodal functions F1, F2, F3, F4, and F7 in 30 dimensions. The proposed HSO successfully achieves the exact global optimum for the functions F1, F2, F3, and F4. Tables 4, 5, and 6 show the results for 100, 500, and 1000 dimensions, respectively. Even though the problems were scaled up to 100 and 500 dimensions, significant progress was still apparent. It is evident that the proposed HSO can locate the exact global optimum for each of the four functions under consideration (F1, F2, F3, and F4). The algorithm's stability and predictability are further illustrated by the standard deviation. The Rosenbrock function, a non-convex function used to evaluate optimization strategies, was created by Howard H. Rosenbrock in 1960. Other names include the Banana Function or Rosenbrock's Valley. The global minimum is located in a long, thin, flat valley with a parabolic shape. It is simple to locate the valley but difficult to converge to the global optimum. F5 is the Rosenbrock function in Appendix Table 40. For F5, the proposed approach performed better than alternative metaheuristics in all dimensions (30, 100, 500, and 1000). The proposed algorithm also produces adequate results for functions F6 and F7.

Table 3 Comparison results of HSO with other metaheuristics on 30 dimensions
Table 4 Comparison results of HSO with other metaheuristics on 100 dimensions
Table 5 Comparison results of HSO with other metaheuristics on 500 dimensions
Table 6 Comparison results of HSO with other metaheuristics on 1000 dimensions

4.3 Exploration efficiency of proposed HSO

Multimodal functions have more local optima than unimodal functions, which makes them harder to solve. In order to find a global optimum, it is, therefore, essential to avoid local optimal stagnation. Tables 3, 4, 5, and 6 present the results for the 30, 100, 500, and 1000 dimensions, respectively. When the proposed HSO is compared to various metaheuristics, it can be seen that it outperforms them significantly for functions F8, F9, F10, F11, F12, and F13. The results demonstrate that applied mechanisms for population distribution behaviour improve the HSO's exploration capability while exploitation mechanisms with hyperbolic function \(sinh\) strike a good balance between exploration and exploitation to improve convergence in the later stages of a generation.

4.4 Performance of proposed HSO over fixed dimension multimodal functions

These reference problems also exhibit multimodality in fixed dimensions. Each challenge is run through 500 iterations to arrive at the solution. Table 7 findings make it apparent that the suggested HSO can locate global optimum solutions for the specified functions F14, F15, F16, F17, F18, F21, F22, and F23. Additionally, the results for functions F19 and F20 are notably better than those of other metaheuristics. The proposed algorithm HSO's best, worst, mean, and standard deviation values are shown in Table 8 for 23 common benchmark functions with dimensions of 30, 100, 500, and 1000.

Table 7 Comparison results of HSO with other metaheuristics on fixed dimensions
Table 8 Comparison of the proposed algorithm (HSO) with newly published algorithms across 30 dimensions

4.5 Performance of proposed HSO over some 42 extra functions with different modalities

To evaluate the effectiveness of the proposed HSO algorithm, 42 supplementary functions (listed in Appendix Table 42) are utilized, covering various local minima, bowl- or plate-shaped features, valley-based features, steep ridges and drops, and others (Table 9). The best, worst, mean, and standard deviation results for the 42 supplementary functions are shown in Table 10. The functions G1 through G15 belong to the many-local-minima category. For the functions G3, G4, G6, G7, G11, G12, G13, and G15, HSO finds the exact global optimum, and the remaining functions in this category also yield satisfactory results. The bowl-shaped category comprises seven functions spanning G16–G22. In this class as well, HSO demonstrates its usefulness by reaching the exact global optimum for G16, G18, G19, G20, and G21, and it performs satisfactorily on the others. For the plate-shaped category (G23–G27), five functions are taken into account; HSO reaches the exact global optimum for G24 and G27. Four functions (G28–G31) represent the valley-shaped category; HSO attains the exact global optimum for G28 and G29 and performs well on the rest. Four functions (G32–G35) are considered for the steep-ridges-and-drops category; HSO reaches the exact global optimum for G33 and yields acceptable results for the others. Finally, seven further functions with diverse forms (G36–G42) constitute the remaining categories, and HSO attains the exact global minima for all functions in this range.

Table 9 HSO results on F1-F23 based on various parameters on 30, 100, 500 and 1000 dimensions
Table 10 HSO results on Forty-two extra functions based on various modalities

4.6 Performance of the proposed algorithm over IEEE CEC benchmark functions

4.6.1 Experimental set up for IEEE CEC benchmark problems

A set of 15 and 30 standard IEEE CEC-2015 and CEC-2017 benchmark functions, respectively, renowned for their optimization characteristics, is utilized to evaluate the effectiveness of the proposed HSO. These functions include variations with shifts, rotations, hybrids, and composites, and a brief summary is provided in Appendix Table 43. In the experimental setup, each algorithm undergoes 30 independent runs, involving 1000 iterations per run and a population size of 30. This thorough configuration ensures fair comparisons between the HSO algorithm and the remaining fifteen algorithms. Mean and standard deviation values for each algorithm are recorded for every function. The results are obtained from computations performed on an Intel(R) Core (TM) i5 7200U processor operating at speeds ranging from 2.5 to 2.71 GHz, equipped with 8 GB of primary memory and 1 TB of secondary memory, respectively.

4.6.2 Analysis of outcomes for IEEE CEC-2015 benchmark functions

Table 11 presents the performance evaluation of the Hyperbolic Sine Optimizer (HSO) algorithm on various CEC benchmark functions, and the results reveal noteworthy achievements, particularly when compared to competing algorithms such as MPA, TSA, WOA, GWO, TLBO, GSA, GA, and PSO. To assess the efficacy of HSO, the mean values obtained for each function were scrutinized, with smaller mean values indicating better performance. Among the benchmark functions, HSO demonstrated its superiority in terms of mean values in several instances, outperforming its competitors.

Table 11 Comparison table of HSO with other metaheuristics on IEEE CEC2015 benchmark test functions

For CEC-3, CEC-6, CEC-7, CEC-9, CEC-12, and CEC-15, HSO secured the best mean values compared to MPA, TSA, WOA, GWO, TLBO, GSA, GA, and PSO. Notably, in CEC-3, HSO exhibited an average of 3.04E+02 with a standard deviation of 1.30E+00, showcasing its efficiency in minimizing the objective function. Similarly, for CEC-6, HSO obtained an average of 6.00E+02 with a minimal standard deviation of 7.40E−02, signifying its robustness in handling the complexities of this function. The consistent outperformance across various functions underscores the effectiveness of HSO in achieving superior mean values. It is worth highlighting that these results signify HSO's capability to converge towards optimal solutions, as reflected by the smaller mean values. The standard deviations, indicating the variability of results, were also competitive, emphasizing the stability of HSO across different functions. This comparative analysis establishes HSO as a promising optimization algorithm for addressing complex optimization problems, especially in scenarios where achieving lower objective function values is critical. The robust performance of HSO across these CEC benchmark functions positions it as a noteworthy candidate for diverse optimization applications.

4.6.3 Analysis of outcomes for IEEE CEC-2017 benchmark functions

In this section, we evaluate the effectiveness of the proposed method in optimizing the CEC 2017 test suite. This suite consists of thirty benchmark functions, categorized into three unimodal functions (C17-F1 to C17-F3), seven multimodal functions (C17-F4 to C17-F10), ten hybrid functions (C17-F11 to C17-F20), and ten composition functions (C17-F21 to C17-F30). The exclusion of C17-F2 from the simulation studies is due to its unstable behaviour, and a comprehensive description of the CEC 2017 test suite can be found in Appendix Table 43. To conduct a scalability analysis, we employed HSO and competitor algorithms to solve the test suite for problem dimensions (number of decision variables) set at 10, 30, 50, and 100. The results of implementing the proposed HSO approach and competitor algorithms on the CEC 2017 test suite are presented in Tables 12, 13, 14 and 15. The simulation outcomes indicate that, for dimensions equal to 10, HSO emerges as the most effective optimizer for handling C17-F1, C17-F3, C17-F5, C17-F6, C17-F8, C17-F9, C17-F10, C17-F19, C17-F20, C17-F21, C17-F22, C17-F24, C17-F27, and C17-F30. For dimensions equal to 30, HSO stands out as the best optimizer for solving C17-F1 to C17-F9, C17-F13 to C17-F15, C17-F18, C17-F23, C17-F25, and C17-F28. Likewise, for dimensions equal to 50, HSO proves to be the most efficient optimizer for handling C17-F1 to C17-F5, C17-F6, C17-F7, C17-F9, C17-F11, C17-F12, C17-F13, C17-F15, C17-F16, C17-F18, C17-F21, C17-F26, C17-F27, and C17-F30. Finally, for dimensions equal to 100, HSO excels as the optimal optimizer for solving C17-F1 to C17-F7 and C17-F9 to C17-F23, C17-F26, C17-F29, and C17-F30.

Table 12 HSO results on the IEEE CEC-2017 benchmark suite. (dimensions = 10)
Table 13 HSO results on the IEEE CEC-2017 benchmark suite (Dimensions = 30)
Table 14 HSO results on the IEEE CEC-2017 benchmark suite (Dimensions = 50)
Table 15 HSO results on the IEEE CEC-2017 benchmark suite (Dimensions = 100)

4.7 Analysis of exploration and exploitation using diversity graph

This section uses the 23 standard benchmark functions with 30 dimensions to analyse the behaviour of exploration and exploitation in HSO. The preceding experiments examine the effectiveness of HSO using measures such as the standard deviation and mean value; as explained in this section, the HSO algorithm can also outperform its competitors on a variety of the 23 benchmark functions in terms of search behaviour. Hussain et al. presented a method to evaluate the exploitation and exploration abilities of metaheuristic algorithms in [109]. This method expands on the mathematical model of dimension-wise diversity introduced in [109]. The corresponding formulas are presented as follows:

$${Divs}_{j}=\frac{1}{N}\sum_{i=1}^{N}\left|median\left({z}^{j}\right)-{z}_{i}^{j}\right|,\qquad Divs=\frac{1}{Dim}\sum_{j=1}^{Dim}{Divs}_{j},$$
(6)

where the \(jth\) dimension's median is represented by the notation \(median({z}^{j})\). The formulas used to determine the exploration percentage and the exploitation percentage are then given as follows:

$$Epl\%=\frac{Divs}{{Divs}_{max}}\times 100,Ept\%=\frac{|Divs-{Divs}_{max}|}{{Divs}_{max}}\times 100,$$
(7)

where \({Divs}_{max}\) stands for the highest level of diversity, and \(Epl\%\) and \(Ept\%\) denote the exploration percentage and exploitation percentage, respectively. The HSO algorithm helps preserve population diversity and successfully prevents premature convergence when applied to difficult benchmarks such as unimodal, multimodal, and fixed-dimensional functions, because it can also achieve an appropriate level of exploration percentage. In conclusion, the HSO algorithm effectively realizes a trade-off between exploration and exploitation, as illustrated in Table 24. Figure 3 depicts the diversity graphs of the 23 standard benchmark functions.
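For reference, a short Python sketch of how the diversity and the exploration/exploitation percentages of Eqs. (6)–(7) could be computed from a recorded population history is given below; it assumes the mean absolute deviation from the per-dimension median, and the function names are illustrative.

```python
import numpy as np

def diversity(pop):
    """Dimension-wise diversity of one population snapshot, Eq. (6); pop is an (N, Dim) array."""
    # Mean absolute deviation from the per-dimension median, averaged over all dimensions.
    return np.mean(np.abs(np.median(pop, axis=0) - pop))

def exploration_exploitation(div_history):
    """Per-iteration exploration and exploitation percentages, Eq. (7)."""
    div_history = np.asarray(div_history, dtype=float)
    div_max = div_history.max()                               # highest diversity observed
    epl = div_history / div_max * 100.0                       # exploration percentage
    ept = np.abs(div_history - div_max) / div_max * 100.0     # exploitation percentage
    return epl, ept
```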

Fig. 3
figure 3

Diversity plots of 23 standard benchmark functions

4.8 Sensitivity analysis

The proposed HSO algorithm exhibits the capability to address optimization problems through a process centred on repetitions, utilizing examinations of the search space by members within the population. Consequently, the performance of HSO is subject to the influence of both the population size \((N)\) and the number of iterations \((T)\) performed by the algorithm.

Within this specific section, a sensitivity analysis of HSO with regard to parameters \((N)\) and \((T)\) is undertaken. To investigate the sensitivity to parameter \((N)\), the proposed HSO is employed with various population sizes, including 20, 30, 50, and 100, to address problems F1 to F23. The simulation results of the sensitivity analysis regarding parameter \((N)\) are presented in Table 16. The analysis of these simulation results indicates that an increase in the population enhances the search capabilities of HSO, resulting in a reduction of the values associated with the objective functions. For an examination of sensitivity to parameter \((T)\), the proposed HSO algorithm is executed with diverse values of \((T)\), specifically 100, 500, 1000, and 2000, applied to benchmark functions F1 to F23. The outcomes of the HSO sensitivity analysis under changes in parameter \((T)\) are documented in Table 17. Clearly evident from the simulation results of the sensitivity analysis is that an escalation in the values of \((T)\) prompts the algorithm to converge towards improved solutions, thereby decreasing the values associated with the objective functions.
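The sketch below shows how such a sensitivity sweep could be scripted; `optimizer` is the same hypothetical single-run interface used in the earlier benchmarking sketch, and the default values of \(N\) and \(T\) mirror those examined in this section.

```python
import numpy as np

def sensitivity_sweep(optimizer, objective,
                      pop_sizes=(20, 30, 50, 100),
                      iter_counts=(100, 500, 1000, 2000),
                      runs=30):
    """Sweep population size N and iteration budget T, recording the mean best value.

    `optimizer(objective, iterations, pop_size)` is a placeholder returning the
    best objective value of one independent run.
    """
    results = {}
    for N in pop_sizes:                       # sensitivity to the population size N
        for T in iter_counts:                 # sensitivity to the iteration budget T
            best = [optimizer(objective, T, N) for _ in range(runs)]
            results[(N, T)] = float(np.mean(best))
    return results
```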

Table 16 Sensitivity analysis of the HSO for the number of population members (N = 20, 30, 50, 100)
Table 17 Sensitivity analysis of the HSO for the maximum number of iterations (T = 100, 500, 1000, 2000)

5 Statistical measures

5.1 Student t tests

The Student t-test [110] is a statistical method used to compare the means of two groups and determine whether their differences are statistically significant. Named after William Sealy Gosset, who published under the pseudonym "Student," this parametric test is widely employed in research. It assumes that the data follow a normal distribution and calculates a t-statistic based on sample means, standard deviations, and sample sizes. A smaller p-value indicates stronger evidence against the null hypothesis, suggesting a significant difference between the group means. The t-test is valuable in experimental design, allowing researchers to assess the significance of observed differences and make informed conclusions about the populations from which the samples are drawn. In this experimental configuration, the proposed algorithm HSO underwent a t-test in comparison to 15 competitor metaheuristics, including GA, FA, BAT, DE, TSA, GSA, PSO, WOA, GWO, CS, BBO, SCA, MFO, FPA, and AOA. Evaluations were conducted on 23 standard benchmark functions across 30 dimensions, with a focus on noting p-values. Here, p-values less than 0.05 indicate statistically significant differences.
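A minimal sketch of such a pairwise comparison using SciPy is shown below; the result arrays are hypothetical stand-ins for the best objective values recorded over 30 independent runs of HSO and one competitor.

```python
import numpy as np
from scipy import stats

# Hypothetical best objective values from 30 independent runs of two algorithms.
hso_results = np.random.normal(loc=0.01, scale=0.005, size=30)
rival_results = np.random.normal(loc=0.05, scale=0.010, size=30)

# Two-sample Student t-test; p < 0.05 is read as a statistically significant difference.
t_stat, p_value = stats.ttest_ind(hso_results, rival_results)
print(f"t = {t_stat:.3f}, p = {p_value:.3e}, significant: {p_value < 0.05}")
```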

5.1.1 Discussion of the results of the proposed algorithm (HSO) using t-tests across 30 dimensions

The provided Table 18 contains p-values resulting from Student t-tests, which were conducted to compare the performance of different metaheuristics across a range of functions (F1 to F23). The table is organized with each row corresponding to a specific function, and each column representing a distinct metaheuristic such as GA, FA, BAT, DE, TSA, GSA, PSO, WOA, GWO, CS, BBO, SCA, MFO, FPA, and AOA. The reported p-values are in scientific notation, and the symbols (+), (−), and (=) accompanying them signify whether the p-value is less than 0.05 (indicating statistical significance), greater than 0.05 (indicating lack of statistical significance), or equal to 0.05, respectively. To interpret the results, p-values in parentheses approaching zero (e.g., 3.49E−15) signify highly statistically significant outcomes. The presence of a symbol (+) next to a p-value indicates that the corresponding metaheuristic demonstrates statistically significant performance compared to others on the specific function. Conversely, a symbol (−) suggests that the metaheuristic in question does not exhibit statistically significant performance on that particular function. A symbol (=) indicates that the result is not statistically significant, with the p-value being greater than or equal to 0.05. For instance, in function F1, all metaheuristics show p-values less than 0.05, signifying that they perform significantly well on F1. In contrast, in function F2, all metaheuristics, except GWO, have p-values less than 0.05. Furthermore, in function F5, GA, FA, BAT, DE, TSA, GSA, PSO, WOA, and CS demonstrate p-values less than 0.05, suggesting statistically significant performance on F5. It is crucial to note that a lower p-value generally indicates stronger evidence against the null hypothesis, emphasizing the statistical significance of the observed differences. Additionally, considering the practical significance and the context of the study is essential when interpreting these results.

Table 18 Student T tests for the proposed algorithm (HSO) across 30 dimensions

5.2 Friedman ranking tests

The Friedman test [111] is a non-parametric statistical test used to determine if there are significant differences among the means of three or more related groups. Named after economist Milton Friedman, this test is applicable when the data are measured on an ordinal scale and the assumptions for parametric tests cannot be met. The procedure involves ranking the data within each group, calculating average ranks, and then computing a test statistic based on the differences between the observed and mean ranks. The null hypothesis assumes no significant differences among the groups. If the resulting test statistic is statistically significant, it indicates that there are variations among the groups. Post-hoc tests can be applied to pinpoint specific group differences. The Friedman test is particularly valuable for analyzing repeated measures data where normality assumptions may not hold, providing a robust method for assessing differences in central tendencies across multiple related groups. In this experimental configuration, the proposed algorithm HSO underwent Friedman ranking tests in comparison to 15 competitor metaheuristics, encompassing GA, FA, BAT, DE, TSA, GSA, PSO, WOA, GWO, CS, BBO, SCA, MFO, FPA, and AOA. Evaluations were carried out on 23 standard benchmark functions across varying dimensions, specifically 30, 100, 500, and 1000.
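The sketch below illustrates how such a ranking test could be run with SciPy on per-function results; the values are hypothetical and serve only to show the call, with each list holding one algorithm's mean objective values over the same set of benchmark functions.

```python
from scipy import stats

# Hypothetical mean objective values of three algorithms on the same six benchmark functions.
alg_a = [0.01, 1.2, 3.4, 0.002, 5.1, 0.7]
alg_b = [0.03, 1.5, 3.9, 0.004, 5.8, 0.9]
alg_c = [0.02, 1.1, 4.2, 0.003, 6.0, 0.8]

# Friedman test over related samples; a small p-value indicates the algorithms rank differently.
chi2, p_value = stats.friedmanchisquare(alg_a, alg_b, alg_c)
print(f"chi2 = {chi2:.3f}, p = {p_value:.3e}")
```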

5.2.1 Discussion of the results of the proposed algorithm (HSO) using Friedman ranking tests across 30 dimension

Table 19 displays the results of Friedman ranking tests conducted on the proposed algorithm (HSO) in comparison to 15 competitor metaheuristics across 30 dimensions. The HSO algorithm, assessed against fifteen optimization methods across benchmark functions (F1 to F13) on 30 dimensions, secured the top overall position. Outperforming algorithms like GA, FA, BAT, and more, HSO consistently demonstrated effectiveness on individual functions, asserting the top rank in F1, F2, F3, F4, F6, F7, F9, F10, F11, F12, and F13. With an average rank of 2.69, HSO showcased superiority, addressing a diverse range of functions and emerging as a promising choice for optimization tasks. In conclusion, HSO stands out as a robust and effective optimization method, consistently exceeding counterparts across various benchmark functions.

Table 19 Friedman ranking tests of the proposed algorithm with other competitor metaheuristics for 30 dimensions

5.2.2 Discussion of the results of the proposed algorithm (HSO) using Friedman ranking tests across 100 dimension

The proposed HSO ranking algorithm is evaluated across benchmark functions (F1 to F13) in 100 dimensions. Table 20 results show that HSO consistently outperforms, securing top ranks. In F1, it stands out, while the Genetic Algorithm (GA) lags at 14. In F2, HSO leads at 1, surpassing GA and Firefly Algorithm (FA). This trend persists, emphasizing HSO's superiority. GA averages 11th, while FA and BAT Algorithm rank 8 and 16. Overall, HSO maintains dominance with an average rank of 1, affirming its efficacy. In conclusion, HSO demonstrates promising results, positioning it as a robust choice for optimization tasks compared to other algorithms.

Table 20 Friedman ranking tests of the proposed algorithm with other competitor metaheuristics for 100 dimensions

5.2.3 Discussion of the results of the proposed algorithm (HSO) using Friedman ranking tests across 500 dimension

The HSO algorithm excels across benchmark functions (F1 to F13) in 500 dimensions, outperforming fifteen well-known optimization algorithms, as represented in Table 21. In F1, HSO secures the top rank, surpassing GA, FA, BAT, and others. This trend persists across functions; for example, in F5, HSO ranks 4, showcasing its effectiveness. With an average rank of 1.69, HSO demonstrates superior overall performance. Conversely, GA averages the 11th rank, while FA and BAT average ranks of 8 and 15. Overall, HSO emerges as a robust and competitive optimization approach, consistently surpassing or matching other algorithms. The average ranks affirm HSO's efficacy, making it a promising choice for diverse optimization tasks.

Table 21 Friedman ranking tests of the proposed algorithm with other competitor metaheuristics for 500 dimensions

5.2.4 Discussion of the results of the proposed algorithm (HSO) using Friedman ranking tests across 1000 dimension

The HSO algorithm excels when compared against fifteen different optimization algorithms across functions F1 to F13 in 1000 dimensions, as illustrated in Table 22. For instance, in F1, HSO claims the top position with a rank of 1, surpassing PSO (rank 2) and leaving GA and WOA behind with ranks 15 and 16. This trend repeats across functions, emphasizing HSO's consistent superiority. In F5, HSO secures a competitive rank of 5, showcasing its adaptability, even though PSO claims the top spot. Overall, HSO maintains robust performance, exemplified by its rank of 3 in F6, outshining GA and WOA. HSO's lead in F7 and F8, with ranks 1, emphasizes its adaptability and consistent excellence. Function F10 sees HSO with a rank of 2, highlighting its strong performance, while in F11, HSO secures the top position, showcasing versatility. Even in F13, HSO maintains its robustness, securing the third position. The average ranks affirm HSO's dominance with an average of 1.85, surpassing GA and FA with average ranks of 9.77 and 11.69. This concise analysis highlights HSO's exceptional optimization capabilities across a diverse set of functions, making it a promising choice for various optimization tasks.

Table 22 Friedman ranking tests of the proposed algorithm with other competitor metaheuristics for 1000 dimensions

5.2.5 Discussion of the results of the proposed algorithm (HSO) using Friedman ranking tests across fixed dimension

The proposed algorithm HSO is ranked against fifteen optimization algorithms across the fixed-dimension benchmark functions (F14 to F23). The performance is assessed based on the average ranks of each algorithm, as shown in Table 23. For function F14, HSO secures the top position with an average rank of 4.10, outperforming the rest of the algorithms. Similarly, in F15, HSO maintains its dominance with an average rank of 3, indicating its superior performance compared to the others. In F16, HSO again emerges as the top-ranked algorithm with an average rank of 2. However, in F17, HSO experiences a slight decline in performance, securing the 7th position with an average rank of 7.40. This trend continues in F18, where HSO is outranked by GA and secures the 2nd position. In F19, HSO bounces back, regaining its top position with an average rank of 1. HSO consistently performs well in F20, F21, and F22, securing the 1st, 1st, and 4th positions, respectively. In F23, HSO maintains a competitive edge, securing the 3rd position with an average rank of 5.90. The overall average ranks highlight HSO's strong performance, securing the 1st position. In summary, the proposed algorithm HSO demonstrates robust performance across various benchmark functions, consistently outperforming other optimization algorithms in terms of average ranks.

Table 23 Friedman ranking tests of the proposed algorithm with other competitor metaheuristics for fixed dimensions

6 Computational time complexities

The algorithm initiates a search optimization procedure utilizing a population of search agents within an \(m\times n\)-dimensional domain. In the initialization phase, which runs in \(O(m\times n)\) time, random values are assigned to the population in the \(m\times n\)-dimensional space. Simultaneously, the evaluation of the target objective function during this phase incurs a time complexity of \(O(m)\), performed once for each of the \(m\) search agents. The primary iteration loop has a time complexity of \(O(iter\times m\times n)\), reflecting its dependence on \(iter\) iterations; in each iteration, the loop traverses each of the \(m\) search agents and each of the \(n\) dimensions. Within the exploration phase, the nested loops responsible for updating the population contribute a time complexity of \(O(m\times n)\), and the evaluation of the target objective function incurs a time complexity of \(O(m)\). Similarly, in the exploitation phase, the nested loops for updating the population and the evaluation of the target objective function carry time complexities of \(O(m\times n)\) and \(O(m)\), respectively. The conditional checks within both the exploration and exploitation phases have minimal impact on the overall time complexity, involving only fundamental arithmetic operations and comparisons. In conclusion, the aggregate time complexity of the algorithm is \(O(iter\times m\times n)\), where \(iter\) represents the number of iterations, \(m\) denotes the number of search agents, and \(n\) signifies the number of dimensions of the search space.
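The schematic Python skeleton below shows the loop structure behind this analysis, with the complexity of each part noted in comments; the body of the inner update is elided, and all names are illustrative rather than taken from the paper.

```python
import numpy as np

def hso_skeleton(objective, lb, ub, m, n, iters):
    """Loop structure underlying the O(iters * m * n) estimate (schematic only)."""
    X = np.random.uniform(lb, ub, size=(m, n))          # initialization: O(m * n)
    fitness = np.apply_along_axis(objective, 1, X)      # objective evaluation: O(m)
    for _ in range(iters):                               # main loop: iters iterations
        for i in range(m):                               #   m search agents
            for j in range(n):                           #     n dimensions -> O(m * n) per iteration
                pass                                     # exploration / exploitation updates (Eqs. 1-5)
        fitness = np.apply_along_axis(objective, 1, X)   # re-evaluation: O(m)
    return X[np.argmin(fitness)]                         # best agent found
```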

7 Classical real-world engineering design problems

7.1 Constrained optimization based on the IEEE CEC-2020 benchmark suite

7.1.1 Pressure vessel design

The primary purpose of the pressure vessel design problem [120] is to minimize the total cost, which includes the cost of material, forming, and welding of a cylindrical vessel. The vessel is capped on both ends, and the head is hemispherical in shape. The four design variables are the thickness of the shell (Ts), the thickness of the head (Th), the inner radius (R), and the length of the cylindrical section without considering the head (L).

This problem is subject to four constraints. Its mathematical model is as follows:

$$\mathrm{Consider }\overrightarrow{X}=\left[{X}_{1}, {X}_{2}, {X}_{3}, {X}_{4}\right]=\left[{T}_{s,} {T}_{h, } R, L\right],$$
$$f\left(\overrightarrow{X}\right)=0.6224{X}_{1}{X}_{3}{X}_{4}+1.7781{X}_{2}{X}_{3}^{2}+3.1661{X}_{1}^{2}{X}_{4}+19.84{X}_{1}^{2}{X}_{3},$$
$${g}_{1}\left(\overrightarrow{X}\right)= -{X}_{1}+0.0193{X}_{3}\le 0,$$
$${g}_{2}\left(\overrightarrow{X}\right)= -{X}_{2}+0.00954{X}_{3}\le 0,$$
$${g}_{3}\left(\overrightarrow{X}\right)= -\pi {X}_{3}^{2}{X}_{4}-\frac{4}{3}\pi {X}_{3}^{3}+1296000 \le 0,$$
$${g}_{4}\left(\overrightarrow{X}\right)= {X}_{4}-240 \le 0,$$

Variable range

$$0\le {X}_{1}\le 99,$$
$$0\le {X}_{2}\le 99,$$
$$10\le {X}_{3}\le 200,$$
$$10\le {X}_{4}\le 200,$$
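As an aid to reproduction, the model above can be transcribed directly into code. The sketch below is a plain Python transcription of the objective and the four constraints; the function names and the simple static-penalty handler are introduced here for illustration only and are not part of the original formulation.

```python
import numpy as np

def pressure_vessel_cost(x):
    """Objective f(X) for X = [Ts, Th, R, L]."""
    Ts, Th, R, L = x
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def pressure_vessel_constraints(x):
    """Constraint values g_i(X), all required to be <= 0."""
    Ts, Th, R, L = x
    return np.array([
        -Ts + 0.0193 * R,
        -Th + 0.00954 * R,
        -np.pi * R**2 * L - (4.0 / 3.0) * np.pi * R**3 + 1296000.0,
        L - 240.0,
    ])

def penalized_cost(x, rho=1e6):
    # Static penalty: add rho * violation^2 for each violated constraint
    g = pressure_vessel_constraints(x)
    return pressure_vessel_cost(x) + rho * np.sum(np.maximum(g, 0.0) ** 2)
```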

This problem is widely used for benchmarking optimizers, and it has been solved with both conventional optimization methods and nature-inspired meta-heuristic algorithms. The comparison of HSO with several nature-inspired algorithms is as follows:

Pressure vessel design is one of the most popular constrained optimization problems owing to the complexity of its constrained search space (Tables 24 and 25), and it is included in the CEC 2020 real-world constrained optimization benchmarks. Eight competitive metaheuristic algorithms are used to solve this problem to ensure a fair comparison. HSO attains the best cost value and ranks first among all algorithms, while TDO and TSA rank second and third, respectively. According to Tables 26 and 27, the HSO results are superior. Figure 4 depicts the convergence curve of HSO over 500 iterations; each run consists of 500 iterations, and 30 independent runs are performed. The parameter and statistical graphs of all algorithms are presented in Figs. 5 and 6, respectively.

Table 24 Balance analysis between exploration and exploitation of the proposed algorithm (HSO)
Table 25 Advantages and disadvantages of various metaheuristic techniques
Table 26 Comparison table of HSO for pressure vessel design
Table 27 Statistical analysis of HSO for pressure vessel design
Fig. 4
figure 4

Convergence curve of HSO for Pressure Vessel Design

Fig. 5
figure 5

Parameters graph of HSO for Pressure Vessel Design

Fig. 6
figure 6

Statistical graph of HSO for pressure vessel design

7.1.2 Welded beam design

The main goal of this problem is to minimize the fabrication cost of a welded beam [121]. Five restrictions must be considered: the shear stress in the beam (\(\tau\)), the bending stress in the beam (\(\sigma\)), the buckling load on the bar (\({P}_{c}\)), the end deflection of the beam (\(\delta\)), and the side constraints.

The four design variables are the thickness of the weld (h), the length of the attached part of the bar (l), the height of the bar (t), and the thickness of the bar (b). The mathematical model of the problem is as follows:

Consider \(\overrightarrow{X}=\left[{X}_{1}, {X}_{2}, {X}_{3}, {X}_{4}\right]=\left[h, l, t, b\right]\)

Minimize \(f\left(\overrightarrow{X}\right)=1.10471{X}_{1}^{2}{X}_{2}+0.04811{X}_{3}{X}_{4}\left(14.0+{X}_{2}\right),\)

$${g}_{1}\left(\overrightarrow{X}\right)=\uptau \left(\overrightarrow{X}\right)-{\uptau }_{max}\le 0,$$
$${g}_{2}\left(\overrightarrow{X}\right)=\upsigma \left(\overrightarrow{X}\right)-{\upsigma }_{max}\le 0,$$
$${g}_{3}\left(\overrightarrow{X}\right)=\updelta \left(\overrightarrow{X}\right)-{\updelta }_{max}\le 0,$$
$${g}_{4}\left(\overrightarrow{X}\right)={X}_{1}-{X}_{4}\le 0,$$
$${g}_{5}\left(\overrightarrow{X}\right)=P-{P}_{c}(\overrightarrow{X})\le 0,$$
$${g}_{6}\left(\overrightarrow{X}\right)=0.125-{X}_{1}\le 0,$$
$${g}_{7}\left(\overrightarrow{X}\right)=1.10471{X}_{1}^{2}+0.04811{X}_{3}{X}_{4}\left(14.0+ {X}_{2}\right)-5.0\le 0,$$

Variable range

$$0.1\le {X}_{1}\le 2,$$
$$0.1\le {X}_{2}\le 10,$$
$$0.1\le {X}_{3}\le 10,$$
$$0.1\le {X}_{4}\le 2$$

where \(\tau \left(\overrightarrow{X}\right)=\sqrt{{{(\tau }{\prime})}^{2}+2{\tau }{\prime}{\tau }^{{\prime}{\prime}}\frac{{X}_{2}}{2R}+ {{(\tau }^{{\prime}{\prime}})}^{2}},\)

$${\tau }{\prime}=\frac{P}{\sqrt{2}{X}_{1}{X}_{2}}, {\tau }^{{\prime}{\prime}}=\frac{MR}{J}, M=P\left(L+\frac{{X}_{2}}{2}\right),$$
$$R=\sqrt{\frac{{X}_{2}^{2}}{4}+{\left(\frac{{X}_{1}+{X}_{3}}{2}\right)}^{2}},$$
$$J=2\left\{\sqrt{2}{X}_{1}{X}_{2}\left[\frac{{X}_{2}^{2}}{4}+{\left(\frac{{X}_{1}+{X}_{3}}{2}\right)}^{2}\right]\right\},$$
$$\sigma \left(\overrightarrow{X}\right)=\frac{6PL}{{X}_{4}{X}_{3}^{2}}, \delta \left(\overrightarrow{X}\right)=\frac{6P{L}^{3}}{E{X}_{3}^{2}{X}_{4}}$$
$${P}_{c}\left(\overrightarrow{X}\right)=\frac{4.013E\sqrt{\frac{{X}_{3}^{2}{X}_{4}^{6}}{36}}}{{L}^{2}}\left(1-\frac{{X}_{3}}{2L}\sqrt{\frac{E}{4G}} \right),$$
$$P=6000\ lb,\ L=14\ in.,\ {\delta }_{max}=0.25\ in.,\ E=30\times {10}^{6}\ psi,\ G=12\times {10}^{6}\ psi,\ {\tau }_{max}=13600\ psi,$$

\({\sigma }_{max}= 30000 psi\).
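The welded beam model can likewise be written as a small evaluation routine. The sketch below follows the equations above (including the \(J\) expression with the \({X}_{2}^{2}/4\) term as printed here; some sources use \({X}_{2}^{2}/12\)); the function names are illustrative and no particular constraint-handling scheme is implied.

```python
import numpy as np

P, L, E, G = 6000.0, 14.0, 30e6, 12e6            # lb, in, psi, psi
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam_cost(x):
    """Objective f(X) for X = [h, l, t, b]."""
    h, l, t, b = x
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

def welded_beam_constraints(x):
    """Constraint values g_1..g_7, all required to be <= 0."""
    h, l, t, b = x
    tau_p = P / (np.sqrt(2.0) * h * l)
    M = P * (L + l / 2.0)
    R = np.sqrt(l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    J = 2.0 * np.sqrt(2.0) * h * l * (l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    tau_pp = M * R / J
    tau = np.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * l / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)
    delta = 6.0 * P * L**3 / (E * t**2 * b)
    Pc = (4.013 * E * np.sqrt(t**2 * b**6 / 36.0) / L**2
          * (1.0 - t / (2.0 * L) * np.sqrt(E / (4.0 * G))))
    return np.array([tau - TAU_MAX, sigma - SIGMA_MAX, delta - DELTA_MAX,
                     h - b, P - Pc, 0.125 - h,
                     1.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0])
```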

The comparison of the welded beam design results of HSO with several nature-inspired optimization algorithms is as follows:

The experimental results for the welded beam design problem are displayed in Table 28. To ensure fairness, HSO is compared with seven competitor algorithms. HSO outperforms the other algorithms in determining the optimal beam design cost and is ranked first for this problem, while TDO and TSA are ranked second and third, respectively (Figs. 7 and 8). Welded beam design is included in the CEC 2020 benchmark suite for real-world constrained optimization. The statistical comparison between HSO and the other algorithms is displayed in Table 29. Figures 7 and 9 depict, respectively, the convergence curve of HSO over 500 iterations and its statistical graph, while Fig. 8 compares the parameter graphs of HSO and the other algorithms.

Table 28 Comparison table of HSO for welded beam design
Fig. 7
figure 7

Convergence curve of HSO for Welded beam design

Fig. 8
figure 8

Parameters graph of HSO for Welded beam design

Table 29 Statistical analysis of HSO for welded beam design
Fig. 9
figure 9

Statistical graph of HSO for Welded beam design

7.1.3 Tension compression spring design

The purpose of the tension/compression spring design problem [122] is to design a tension/compression spring with minimum weight. Shear stress, surge frequency, and minimum deflection constraints must be satisfied during the minimization. The design variables are the wire diameter (d), the mean coil diameter (D), and the number of active coils (N) (Figs. 9 and 10).

Fig. 10
figure 10

Convergence curve of HSO for tension compression spring design

The following is the mathematical model for this problem:

$$\overrightarrow{X}=\left[{X}_{1}, {X}_{2,}{X}_{3}\right]=\left[d, D, N\right],$$
$$\mathrm{Minimize }f(\overrightarrow{X)}=\left({X}_{3}+2\right){X}_{2}{X}_{1}^{2},$$
$$\mathrm{Subject\ to\ }{g}_{1}\left(\overrightarrow{X}\right)=1-\frac{{X}_{2}^{3}{X}_{3}}{71785{X}_{1}^{4}}\le 0,$$
$${g}_{2}\left(\overrightarrow{X}\right)=\frac{4{X}_{2}^{2}-{X}_{1}{X}_{2}}{12566({X}_{2}{X}_{1}^{3}-{X}_{1}^{4})}+\frac{1}{5108{X}_{1}^{2}}-1\le 0,$$
$${g}_{3}\left(\overrightarrow{X}\right)=1-\frac{140.45{X}_{1}}{{X}_{2}^{2}{X}_{3}}\le 0,$$
$${g}_{4}\left(\overrightarrow{X}\right)=\frac{{X}_{1}+{X}_{2}}{1.5}-1\le 0,$$

Variable range

$$0.05\le {X}_{1}\le 2.00,$$
$$0.25\le {X}_{2}\le 1.30,$$
$$2.00\le {X}_{3}\le 15.0$$
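The spring model above translates into the following short routine; the function names are illustrative, and the constraint constants (71785, 12566, 5108, 140.45) are those of the standard formulation cited above.

```python
import numpy as np

def spring_weight(x):
    """Objective f(X) for X = [d, D, N]."""
    d, D, N = x
    return (N + 2.0) * D * d**2

def spring_constraints(x):
    """Constraint values g_1..g_4, all required to be <= 0."""
    d, D, N = x
    return np.array([
        1.0 - D**3 * N / (71785.0 * d**4),
        (4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4))
            + 1.0 / (5108.0 * d**2) - 1.0,
        1.0 - 140.45 * d / (D**2 * N),
        (d + D) / 1.5 - 1.0,
    ])
```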

This optimization problem has been solved with both mathematical and heuristic approaches. The results of HSO and several nature-inspired algorithms on this problem are reported in the comparison table:

Tension/compression spring design is another problem from the CEC 2020 real-world engineering benchmark suite, recognised for its complex mathematical model and constrained search space. The experimental results of HSO for this problem are shown in Tables 30 and 31, where it is compared with seven other metaheuristic algorithms. The results indicate that HSO performs exceptionally well and places first in terms of optimal cost, while TDO and TSA place second and third, respectively. The parameter and statistical graphs are depicted in Figs. 11 and 12, respectively, and the convergence curve of HSO over 500 iterations is depicted in Fig. 10.

Table 30 Comparison table of HSO for tension compression spring design
Table 31 Statistical analysis of HSO for tension compression spring design
Fig. 11
figure 11

Parameters graph of HSO for tension compression spring design

Fig. 12
figure 12

Statistical graph of HSO for tension compression spring design

7.1.4 Speed reducer design

The speed reducer design problem [123], a discrete problem, has four types of design constraints: the bending stress of the gear teeth, the surface stress, the transverse deflections of the shafts, and the stresses in the shafts. The main goal is to determine the minimum weight of the speed reducer while satisfying these constraints. The problem involves six continuous variables and one discrete variable: \({x}_{1}\) represents the face width, \({x}_{2}\) the module of teeth, and \({x}_{3}\) is a discrete design variable giving the number of teeth in the pinion. Likewise, \({x}_{4}\) represents the length of the first shaft between bearings and \({x}_{5}\) the length of the second shaft between bearings. The diameters of the first and second shafts, \({x}_{6}\) and \({x}_{7}\), are the sixth and seventh design variables.

The mathematical model of this problem is as follows:

Consider: \(X=[{x}_{1}, {x}_{2}, {x}_{3},{x}_{4}, {x}_{5},{x}_{6},{x}_{7}]\)

$$X=[b,m,p,{l}_{1},{l}_{2},{d}_{1},{d}_{2}]$$

Minimize: \(f\left(x\right)=0.7854{x}_{1}{x}_{2}^{2}\left(3.3333{x}_{3}^{2}+14.9334{x}_{3}-43.0934\right)-1.508{x}_{1}\left({x}_{6}^{2}+{x}_{7}^{2}\right)+7.4777\left({x}_{6}^{3}+{x}_{7}^{3}\right)+0.7854({x}_{4}{x}_{6}^{2}+{x}_{5}{x}_{7}^{2})\)

Subject to: \({g}_{1}\left(x\right)=\frac{27}{{x}_{1}{x}_{2}^{2}{x}_{3}}-1\le 0,\)

$${g}_{2}\left(x\right)=\frac{397.5}{{x}_{1}{x}_{2}^{2}{x}_{3}}-1\le 0,$$
$${g}_{3}\left(x\right)=\frac{1.93{x}_{4}^{3}}{{x}_{2}{x}_{3}{x}_{6}^{4}}-1\le 0,$$
$${g}_{4}\left(x\right)=\frac{1.93{x}_{5}^{3}}{{x}_{2}{x}_{3}{x}_{7}^{4}}-1\le 0,$$
$${g}_{5}\left(x\right)=\frac{1}{110{x}_{6}^{3}}\sqrt{{\left(\frac{745{x}_{4}}{{x}_{2}{x}_{3}}\right)}^{2}+16.9\times {10}^{6}}-1\le 0,$$
$${g}_{6}\left(x\right)=\frac{1}{85{x}_{7}^{3}} \sqrt{{\left(\frac{745{x}_{5}}{{x}_{2}{x}_{3}}\right)}^{2}+157.5\times {10}^{6}}-1\le 0,$$
$${g}_{7}\left(x\right)=\frac{{x}_{2}{x}_{3}}{40}-1\le 0,$$
$${g}_{8}\left(x\right)=\frac{5{x}_{2}}{{x}_{1}}-1\le 0,$$
$${g}_{9}\left(x\right)=\frac{{x}_{1}}{12{x}_{2}}-1\le 0,$$
$${g}_{10}\left(x\right)=\frac{1.5{x}_{6}+1.9}{{x}_{4}}-1\le 0,$$
$${g}_{11}\left(x\right)= \frac{1.1{x}_{7}+1.9}{{x}_{5}}-1\le 0$$
$$2.6\le {x}_{1}\le 3.6, 0.7\le {x}_{2}\le 0.8, 17\le {x}_{3}\le 28, 7.3\le {x}_{4}\le 8.3, 7.8\le {x}_{5}\le 8.3, 2.9\le {x}_{6}\le 3.9, 5\le {x}_{7}\le 5.5$$
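A direct transcription of the speed reducer model is sketched below; the variable order follows \(X=[b,m,p,{l}_{1},{l}_{2},{d}_{1},{d}_{2}]\) as defined above, the function names are illustrative, and the discrete variable \({x}_{3}\) would additionally need to be rounded to an integer by the optimizer.

```python
import numpy as np

def speed_reducer_weight(x):
    """Objective f(x) for x = [b, m, p, l1, l2, d1, d2]."""
    b, m, p, l1, l2, d1, d2 = x
    return (0.7854 * b * m**2 * (3.3333 * p**2 + 14.9334 * p - 43.0934)
            - 1.508 * b * (d1**2 + d2**2)
            + 7.4777 * (d1**3 + d2**3)
            + 0.7854 * (l1 * d1**2 + l2 * d2**2))

def speed_reducer_constraints(x):
    """Constraint values g_1..g_11, all required to be <= 0."""
    b, m, p, l1, l2, d1, d2 = x
    return np.array([
        27.0 / (b * m**2 * p) - 1.0,
        397.5 / (b * m**2 * p) - 1.0,
        1.93 * l1**3 / (m * p * d1**4) - 1.0,
        1.93 * l2**3 / (m * p * d2**4) - 1.0,
        np.sqrt((745.0 * l1 / (m * p))**2 + 16.9e6) / (110.0 * d1**3) - 1.0,
        np.sqrt((745.0 * l2 / (m * p))**2 + 157.5e6) / (85.0 * d2**3) - 1.0,
        m * p / 40.0 - 1.0,
        5.0 * m / b - 1.0,
        b / (12.0 * m) - 1.0,
        (1.5 * d1 + 1.9) / l1 - 1.0,
        (1.1 * d2 + 1.9) / l2 - 1.0,
    ])
```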

Speed reducer design is another constrained real-world engineering design problem (Fig. 13) belonging to the CEC 2020 benchmark suite; it is recognised for its complex nature, complicated search space, and high dimensionality with constraints. This problem is solved using HSO, and the experimental results are shown in Tables 32 and 33. The results show that HSO performs only marginally well on this problem; TDO and TSA outperform it and are ranked first and second, respectively. The parameter and statistical graphs of HSO and the other algorithms are shown in Figs. 14 and 15, respectively, while the convergence curve of HSO is shown in Fig. 13.

Fig. 13
figure 13

Convergence curve of HSO for speed reducer design

Table 32 Comparison table of HSO for Speed reducer design
Table 33 Statistical analysis of speed reducer design
Fig. 14
figure 14

Parameters graph of HSO for speed reducer design

Fig. 15
figure 15

Statistical graph of HSO for speed reducer design

7.1.5 Three-bar truss design

The goal of this engineering design problem is to design a three-bar truss [121] with minimum weight. The search space for this problem is highly constrained.

The mathematical model of this problem is as follows:

Consider: \(\left[{x}_{1}, {x}_{2}\right],\)

Minimize: \(f\left(x\right)=\left(2\sqrt{2}{x}_{1}+{x}_{2}\right)\times L,\)

Subject to:

$${g}_{1}= \frac{\sqrt{2}{x}_{1}+{x}_{2}}{\sqrt{2}{x}_{1}^{2}+{2{x}_{1}x}_{2}}P-\sigma \le 0,$$
$${g}_{2}= \frac{{x}_{2}}{\sqrt{2}{x}_{1}^{2}+{2{x}_{1}x}_{2}}P-\sigma \le 0,$$
$${g}_{3}= \frac{1}{{x}_{1}+\sqrt{2}{x}_{2}}P-\sigma \le 0,$$
where \(0\le {x}_{1},{x}_{2}\le 1\). The constants are \(L=100\ \mathrm{cm}\), \(P=2\ \mathrm{kN/cm^{2}}\), and \(\sigma =2\ \mathrm{kN/cm^{2}}\).
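The three-bar truss model is compact enough to be written in a few lines; the sketch below uses the constants given above, with illustrative function names.

```python
import numpy as np

L_TRUSS, P_LOAD, SIGMA = 100.0, 2.0, 2.0      # cm, kN/cm^2, kN/cm^2

def truss_weight(x):
    """Objective f(x) for x = [x1, x2]."""
    x1, x2 = x
    return (2.0 * np.sqrt(2.0) * x1 + x2) * L_TRUSS

def truss_constraints(x):
    """Constraint values g_1..g_3, all required to be <= 0."""
    x1, x2 = x
    denom = np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2
    return np.array([
        (np.sqrt(2.0) * x1 + x2) / denom * P_LOAD - SIGMA,
        x2 / denom * P_LOAD - SIGMA,
        P_LOAD / (x1 + np.sqrt(2.0) * x2) - SIGMA,
    ])
```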

The comparison of HSO with various algorithms is as follows:

The three-bar truss is another problem from the CEC 2020 real-world constrained optimization benchmark suite. Table 34 shows the comparison results of HSO for this problem. DEDS and SSA perform similarly and are ranked first, with MBA and PSO-DE ranked second and third, while HSO is ranked sixth, trailing the leading algorithms only slightly in the optimal weight. Figures 16 and 17 depict the convergence curve and the parameter graph of the three-bar truss design problem, respectively.

Table 34 Comparison table of HSO for 3 Bar truss design
Fig. 16
figure 16

Convergence curve of HSO for 3 bar truss design

Fig. 17
figure 17

Parameters graph of HSO for 3 bar truss design

7.1.6 Optimal gas production facilities

Since the consumption of gas products is very high in the real world today, it is imperative to upgrade gas production facilities. Supplying the required production facilities [124] everywhere is challenging, because there are many locations where they cannot be installed. Determining the optimal capacities of the production facilities that together form an oxygen producing and storing system is therefore an optimization problem.

This optimization problem can be modelled mathematically as:

Consider: \(X=[{X}_{1},{X}_{2}]\)

Minimize: \(F\left(x\right)=61.8+5.72{x}_{1}+0.2623{\left[\left(40-{x}_{1}\right)\ln\left(\frac{{x}_{2}}{200}\right)\right]}^{-0.85}+0.087\left(40-{x}_{1}\right)\ln\left(\frac{{x}_{2}}{200}\right)+700.23{x}_{2}^{-0.75}\)

Subject to: \({x}_{1}\ge 17.5, {x}_{2}\ge 300\)

Bounds: \(17.5\le {x}_{1}\le 40, 300\le {x}_{2}\le 600\)
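The gas production model reduces to a single cost function over a box-constrained domain; a transcription is sketched below (the natural logarithm is assumed for the \(\ln\) terms, and the function name is illustrative).

```python
import numpy as np

def gas_production_cost(x):
    """F(x) for x = [x1, x2] with 17.5 <= x1 <= 40 and 300 <= x2 <= 600."""
    x1, x2 = x
    term = (40.0 - x1) * np.log(x2 / 200.0)
    return (61.8 + 5.72 * x1
            + 0.2623 * term ** -0.85
            + 0.087 * term
            + 700.23 * x2 ** -0.75)
```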

Determining the optimal capacity of gas production facilities is another IEEE CEC 2020 constrained optimization problem. HSO has been evaluated on this problem and outperformed four other algorithms (Fig. 18). With only 120 evaluations, HSO achieved the best value and ranked first, followed by ALO and DE, in that order. The comparison between HSO and the other algorithms and the parameter graph are displayed in Table 35 and Fig. 19, respectively, and the convergence curve is illustrated in Fig. 18.

Fig. 18
figure 18

Convergence curve of HSO for optimal gas production design

Table 35 Comparison table of HSO for Gas production design
Fig. 19
figure 19

Parameters graph of HSO for optimal gas production design

7.2 Unconstrained real-world engineering design problems

7.2.1 Gear train design problem

The goal of this unconstrained, discrete gear train design problem [125] is to minimize the gear ratio error of a compound gear train. The gear ratio is defined as:

$$\mathrm{Gear\ ratio}=\frac{\mathrm{angular\ velocity\ of\ the\ output\ shaft}}{\mathrm{angular\ velocity\ of\ the\ input\ shaft}}$$

The gear train consists of four gears, and the discrepancy between the required gear ratio (1/6.931) and the obtained gear ratio should be as small as possible. The design variables are the numbers of teeth of the four gearwheels, so the goal is to determine the tooth configuration of the four-gear train that minimizes the gear ratio error.

The mathematical model of the problem is:

Consider:\(X=\left[{X}_{1}, {X}_{2}, {X}_{3}, {X}_{4}\right]=\left[{n}_{1}, {n}_{2}, {n}_{3},{n}_{4}\right],\)

Minimize: \(f\left(X\right)={\left(\frac{1}{6.931}-\frac{{X}_{3}{X}_{2}}{{X}_{1}{X}_{4}}\right)}^{2},\)

Subject to: \(12\le {X}_{1}, {X}_{2}, {X}_{3}, {X}_{4}\le 60,\)
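Because the decision variables are integer teeth counts, a metaheuristic operating on continuous values typically rounds its candidates before evaluation; the sketch below (illustrative name, rounding added here for that reason) evaluates the gear-ratio error defined above.

```python
def gear_train_error(x):
    """Squared gear-ratio error for x = [n1, n2, n3, n4], teeth counts in [12, 60]."""
    n1, n2, n3, n4 = (round(v) for v in x)   # enforce integer teeth counts
    return (1.0 / 6.931 - (n3 * n2) / (n1 * n4)) ** 2
```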

The gear train design problem is an unconstrained real-world engineering design problem (Fig. 20). Table 36 shows the results of applying HSO to this problem, where it is compared with eight other competing algorithms. HSO excels in this category, finishing first within 500 total evaluations; ALO and CS are ranked second and third, respectively. Figure 21 shows the parameter graph of HSO and the other algorithms, while Fig. 20 shows the convergence curve of HSO for this problem.

Fig. 20
figure 20

Convergence curve of HSO for Gear train design problem

Table 36 Comparison table of HSO for Gear Train design problem
Fig. 21
figure 21

Parameters graph of HSO for Gear train design problem

7.2.2 Parameter estimation for frequency–modulated (FM) sound waves.

Parameter estimation for frequency-modulated (FM) sound waves [126] is one of the most challenging problems. FM sound wave synthesis plays a significant role in modern music and sound systems. The main difficulty in optimizing the FM synthesizer parameters is that the problem has six dimensions. The six-parameter vector \(X\) is as follows:

\(X=\left[{a}_{1}, {\omega }_{1}, {a}_{2}, {\omega }_{2}, {a}_{3}, {\omega }_{3}\right].\) The following equations generate the estimated and target sound waves from this vector. This is one of the most difficult problems since it is highly multimodal, and it has been optimized by various nature-inspired optimization algorithms. The minimum value of the problem is \(f\left(\overrightarrow{{X}_{sol}}\right)=0\). The mathematical model of this problem is as follows:

$$y\left(t\right)={a}_{1}.{\text{sin}}\left({\omega }_{1}.t.\theta + {a}_{2}.{\text{sin}}\left({\omega }_{2}.t.\theta + {a}_{3}.{\text{sin}}\left({\omega }_{3}.t.\theta \right)\right)\right)$$
$${y}_{0}\left(t\right)=\left(1.0\right).{\text{sin}}\left(\left(5.0\right).t.\theta -\left(1.5\right).{\text{sin}}\left(\left(4.8\right).t.\theta +\left(2.0\right).{\text{sin}}\left(\left(4.9\right).t.\theta \right)\right)\right)$$

Here \(\theta =2\pi /100\), the parameters are defined in the range \(\left[-6.4, 6.35\right]\), and the cost function is calculated as the sum of squared errors between the estimated wave and the target wave as follows:

$$f\left(\overrightarrow{X}\right)={\sum }_{t=0}^{100}{(y\left(t\right)-{y}_{0}\left(t\right))}^{2}$$
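A transcription of the FM estimation objective is sketched below; the function names are illustrative, and the target wave is generated from the fixed parameters of \({y}_{0}(t)\) given above.

```python
import numpy as np

THETA = 2.0 * np.pi / 100.0
T = np.arange(101)                       # t = 0, 1, ..., 100

def fm_wave(a1, w1, a2, w2, a3, w3, t=T):
    return a1 * np.sin(w1 * t * THETA
                       + a2 * np.sin(w2 * t * THETA
                                     + a3 * np.sin(w3 * t * THETA)))

# Target wave y0(t): the minus sign in y0 corresponds to a2 = -1.5
Y0 = fm_wave(1.0, 5.0, -1.5, 4.8, 2.0, 4.9)

def fm_error(x):
    """Sum of squared errors between the estimated wave and the target wave."""
    return float(np.sum((fm_wave(*x) - Y0) ** 2))
```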

Frequency-modulated sound wave parameter estimation is an unconstrained real-world engineering design problem. HSO yields only moderate results here: it achieves satisfactory performance and ranks fifth on this minimization problem, whereas GWO and MFO achieve outstanding results and rank first and second, respectively. The results of all algorithms for this problem are summarised in Table 37. Figures 22 and 23 depict the convergence curve and the parameter graph for this problem, respectively.

Table 37 Comparison table of HSO for parameter estimation of frequency modulation sound waves
Fig. 22
figure 22

Convergence curve of HSO for frequency modulation sound waves

Fig. 23
figure 23

Parameters graph of HSO for frequency modulation sound waves

8 Effect of the proposed algorithm (HSO) on training multilayer perceptron

The proposed HSO is applied in this section to the training of a multilayer perceptron (MLP). An MLP is a neural network with a single hidden layer and feed-forward connections between its neurons; the first layer is the input layer and the last layer is the output layer. Given the weights, inputs, and biases, the outputs of the MLP are calculated through the following steps:

  1.

    The weighted sum of inputs is first calculated as follows:

    $${s}_{j}=\sum_{i=1}^{n}{W}_{i,j}{X}_{i}-{\theta }_{j}, j=\mathrm{1,2},\dots .,h$$

    \({W}_{i,j}\) represents the connection weights from the \({i}^{th}\) input node to the \({j}^{th}\) node in the hidden layer; \({x}_{i}\) is the \({i}^{th}\) input; and \({\theta }_{j}\) is the threshold or bias of the \({j}^{th}\) node in the hidden layer.

  2.

    Each node's output in the hidden layer is calculated as follows:

    $${S}_{j}=\frac{1}{1+{e}^{{-s}_{j}}}=sigmoid\left({s}_{j}\right), j=\mathrm{1,2},..,h$$
  3.

    Ultimately, the final output can be calculated using the outputs of the hidden layer nodes.

    $${o}_{k}=\sum_{j=1}^{h}{W}_{j,k}{S}_{j}-{\theta }_{k}{\prime}, k=\mathrm{1,2},\dots ,m$$
    $${o}_{k}=\frac{1}{1+{e}^{{-o}_{k}}}=sigmoid\left({o}_{k}\right), k=\mathrm{1,2},\dots ,m$$

    where \({W}_{j,k}\) denotes the connection weight from the \({j}^{th}\) hidden node to the \({k}^{th}\) output node, and \({\theta }_{k}{\prime}\) is the threshold or bias of the \({k}^{th}\) output node. It is evident that the weights and biases of an MLP ultimately determine the output it produces for a given set of inputs. The goal of training an MLP is to achieve the highest classification accuracy on both training and testing samples. A common metric used to evaluate an MLP is the mean square error (MSE), defined as follows:

    $$MSE=\sum_{i=1}^{m}{({o}_{i}^{k}-{d}_{i}^{k})}^{2}$$

    The desired output for the \({i}^{th}\) input is denoted by \({d}_{i}^{k}\), and \({o}_{i}^{k}\) is the actual output when the \({i}^{th}\) training sample is presented at the input (where \(m\) is the total number of outputs). The efficacy of the MLP is therefore judged by how small the MSE is across all training samples. The mean squared error over all samples, \(F\), is defined as follows:

    $$F=\frac{\sum_{k=1}^{N}\sum_{i=1}^{m}{({o}_{i}^{k}-{d}_{i}^{k})}^{2}}{N}$$

    Here \(N\) denotes the number of training samples. Thus, training the MLP amounts to minimizing the objective function \(F\) above, with the weights and biases as decision variables (Fig. 24); a minimal sketch of this objective is given after the figure below.

    Fig. 24
    figure 24

    Training of multilayer perceptron
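The forward pass and the objective \(F\) described above can be sketched as follows; packing the weights and biases into a single vector (so that a metaheuristic such as HSO can search over it) is an illustrative choice, and the function names are not from the original text.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(X, W1, theta1, W2, theta2):
    """Single-hidden-layer MLP following the equations above.
    X: (N, n) inputs, W1: (n, h), theta1: (h,), W2: (h, m), theta2: (m,)."""
    S = sigmoid(X @ W1 - theta1)        # hidden-layer outputs S_j
    return sigmoid(S @ W2 - theta2)     # network outputs o_k

def mse_objective(params, X, D, n, h, m):
    """Objective F: average squared error over the N training samples.
    'params' packs [W1, theta1, W2, theta2] as one flat vector."""
    i = 0
    W1 = params[i:i + n * h].reshape(n, h); i += n * h
    theta1 = params[i:i + h]; i += h
    W2 = params[i:i + h * m].reshape(h, m); i += h * m
    theta2 = params[i:i + m]
    O = mlp_forward(X, W1, theta1, W2, theta2)
    return float(np.sum((O - D) ** 2) / X.shape[0])
```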

In this section, the effectiveness of the proposed HSO for training MLPs is evaluated on four classification data sets, namely XOR, balloon, iris, and breast cancer, obtained from the Machine Learning Repository at the University of California, Irvine (UCI). The experimental results are compared with eight popular competitor metaheuristic algorithms: Sine Cosine Algorithm (SCA) [52], Equilibrium Optimizer (EO) [51], Whale Optimization Algorithm (WOA) [14], Grey Wolf Optimizer (GWO) [13], Moth Flame Optimization (MFO) [107], Arithmetic Optimization Algorithm (AOA) [68], Particle Swarm Optimization (PSO) [127], and Multi-Verse Optimizer (MVO) [128]. The decision variables are constrained to the range [−10, 10]. For the XOR and balloon data sets the population size of all algorithms is set to 50, while for the other data sets it is set to 200, and each algorithm is limited to 250 iterations. Table 38 provides information about the data sets. The number of hidden nodes is set to 2N + 1, where N is the number of inputs of the data set. Table 39 reports the mean and standard deviation of the objective function F (average MSE) over 10 independent runs of each algorithm on each data set, together with the classification success rates of all algorithms. The proposed HSO outperforms the other metaheuristic algorithms in terms of MSE, standard deviation, classification rate, and the statistical outcomes, suggesting that it is an effective MLP trainer. The reduced MSE is attributable to HSO's enhanced ability to avoid local optima during the search. The comparative study thus shows that HSO is preferable to other classical metaheuristics for practical optimization tasks such as MLP training.

Table 38 Details of the datasets
Table 39 Experimental results for different datasets

9 Conclusion and future scope

In summary, this research presents the Hyperbolic Sine Optimizer (HSO), a novel meta-heuristic algorithm developed to address scientific optimization problems. What distinguishes HSO from traditional approaches is its emphasis on mathematical principles, particularly the use of the algebraic and hyperbolic sine (\(sinh\)) function to drive the population dynamics. The study focuses on the convergence behaviour of the hyperbolic function in both the exploration and exploitation phases, thereby promoting the active involvement of population members and improving overall efficiency.

The conducted experiments, spanning 23 standard and 42 varied benchmark functions, including prominent IEEE CEC-2015 and CEC-2017 benchmarks, demonstrate the superior performance of HSO. Notably, HSO outperforms other metaheuristics for unimodal functions in 30 dimensions and achieves exact global optima for specific functions across various dimensions. The stability and predictability of HSO are further affirmed in the results for 100, 500, and 1000 dimensions. In addressing multimodal functions, where avoiding local stagnation is crucial, HSO exhibits remarkable performance, surpassing various metaheuristics for specific functions and providing exceptional outcomes for others. The evaluation extends to 42 supplementary functions, showcasing HSO's ability to attain precise global optima across diverse categories. Extensive analysis and qualitative assessment, including trajectory plots, average fitness values, contour plots, and diversity graphs, reinforce the optimization prowess of HSO. Real-world engineering design problems, both constrained and unconstrained, further validate HSO's superiority in solution quality and computational efficiency over well-known optimization algorithms. Moreover, our exploration extends to the application of HSO in training multilayer perceptrons, revealing its superiority over alternative metaheuristics.

As we look towards the future, this study identifies key research directions to enhance optimization strategies and problem-solving capabilities. These directions include combining metaheuristics, developing real-time adaptive metaheuristics for dynamic environments, scaling to large problems using parallel and distributed computing, and improving explainability and interpretability for critical decision-making. As a future extension of this research, incorporating arithmetic and evolutionary operators, along with integrating complementary methods such as Levy flight, disruption, mutation, and opposition-based learning, is planned. This aims to propose new and enhanced versions of HSO tailored for optimization problems involving binary, discrete, or multiple objectives. Boosting the efficiency of HSO will involve further integration with stochastic elements such as local search or global search strategies. Exploring the diverse applications of HSO in various fields signifies a significant advancement and opens avenues for continued innovation in meta-heuristic optimization.