1 Introduction

Optimization is the process of searching for the optimal values of a particular problem. Optimization problems are encountered in a variety of scientific fields such as economics, engineering, and medicine. Therefore, the development of optimization algorithms is necessary, and many researchers all over the world are working in this field.

One of the main weaknesses of classic optimization algorithms is local optima stagnation, i.e. they lack sufficient ability to find the global optimum. Some of them also require derivative information of the search space. Therefore, these kinds of algorithms are not highly efficient in solving real-world problems.

In comparison to classic optimization algorithms, metaheuristic algorithms are problem-independent and use stochastic operators to solve optimization problems; randomness is one of their main characteristics. Metaheuristic algorithms are becoming increasingly popular because 1) they are more robust in avoiding local optima than classical optimization algorithms, and 2) they do not require the gradient of the cost function.

Metaheuristic algorithms are divided into two classes: single-based and population-based. Single-based metaheuristic algorithms start with a single solution and try to improve it over a number of iterations. Tabu Search [1], Simulated Annealing [2, 3], Variable Neighbourhood Search [4], Hill Climbing [5], and Iterated Local Search [6] are some of the most famous algorithms in this class. Unlike single-based metaheuristics, population-based metaheuristics start with a set of solutions (a population) and iteratively create a new population of solutions, so that information can be exchanged among the solutions. The main advantage of population-based metaheuristics is that they avoid getting stuck in local optima. Such algorithms have been widely applied in various applications such as medical systems [7, 8], car engine design [9], fault diagnosis [10], and food quality [11].

From another perspective, metaheuristics are divided into three categories: evolutionary, swarm-based, and physics-based algorithms. Evolutionary algorithms are inspired by evolutionary behaviours in nature. The Genetic Algorithm (GA), proposed by Holland [12], is the most well-known evolutionary algorithm. The general idea of this algorithm is based on Darwin's theory of evolution: GA starts with random candidate solutions, recombination and mutation operators are then applied to generate new solutions, and finally a selection approach is used to choose the solutions for the next generation. Some other evolutionary metaheuristics are Evolution Strategy (ES) [13], Genetic Programming (GP) [14], Differential Evolution (DE) [15], Probability-based Incremental Learning (PBIL) [16], Evolutionary Programming (EP) [17], and Biogeography-based Optimization (BBO) [18].

The next category of metaheuristic algorithms, i.e. swarm-based algorithms, is inspired by the social behaviour of animals in nature. Some popular swarm-based algorithms are Particle Swarm Optimization (PSO) [19, 20], inspired by the social and individual behaviour of birds; Artificial Bee Colony (ABC) [21], inspired by the food searching behaviour of bee swarms; Cuckoo Search (CS) [22], which mimics the unusual egg-laying behaviour of cuckoos; the Firefly Algorithm (FA) [23], inspired by the flashing characteristics of fireflies; the Shuffled Frog Leaping Algorithm (SFLA) [24], based on the social behaviour of frogs; the Grey Wolf Optimizer (GWO) [25], which simulates the hunting behaviour of grey wolves; and the Whale Optimization Algorithm (WOA) [26], inspired by the social behaviour of humpback whales.

The third category of metaheuristic algorithms comprises physics-based algorithms, which mimic physical laws in nature. Some of the most popular algorithms in this class are Simulated Annealing (SA) [2, 27], the Gravitational Search Algorithm (GSA) [28], the Water Cycle Algorithm (WCA) [29], and the Mine Blast Algorithm (MBA) [30].

Human-based metaheuristic algorithms are introduced as a new category in some papers [26, 31]. These algorithms imitate human behaviours and characteristics. Harmony Search (HS) [32] and the Imperialist Competitive Algorithm (ICA) [33] are two examples of human-based metaheuristics.

The two common characteristics of population-based metaheuristic algorithms are intensification (exploitation) and diversification (exploration). Intensification tries to find better solutions by searching around the best solutions found so far, while diversification refers to the algorithm's ability to explore different regions of the search space in order to discover promising areas. These two criteria usually conflict with each other, and finding a proper trade-off between intensification and diversification is one of the most important challenges in the development of metaheuristic algorithms.

According to the No Free Lunch (NFL) theorem [34], no metaheuristic algorithm can solve all optimization problems optimally. In other words, a metaheuristic algorithm can be highly efficient on some problems while performing poorly on others. Hence, this research area remains open, and many researchers try to propose new metaheuristic algorithms or improve existing ones.

The present study proposes a novel population-based metaheuristic algorithm called Human Mental Search (HMS). The HMS algorithm is inspired by the exploration strategies of the bid space in online auctions. The HMS algorithm has three leading operators: mental search, grouping, and moving. Mental search creates new solutions around a given solution based on Levy flight, which enhances the diversification and intensification properties simultaneously. The grouping operator clusters the solutions into regions using a clustering algorithm. Finally, the moving operator moves the other solutions toward the promising region. Preliminary studies indicate that HMS can outperform existing algorithms such as PSO, HS, SFLA, ABC, ICA, BBO, FA, GWO, and WOA. The remainder of this paper is organized as follows:

Section 2 explains the proposed Human Mental Search (HMS) algorithm. The statistical results on standard benchmarks are discussed in Section 3. Finally, Section 4 presents the conclusions of the present study and some recommendations for future research.

2 Human Mental Search (HMS)

The current study proposes a new population-based metaheuristic algorithm, called Human Mental Search (HMS), based on the exploration strategies of the bid space in online auctions. This section first explains the source of inspiration and then presents the HMS algorithm.

2.1 The source of inspiration

Recently, Radicchi et al. [35, 36] demonstrated that humans apply a Levy flight strategy to explore the bid space in online auctions. The exploration of the bid space is a search process, but of a mental kind because it operates in an abstract space. To this end, they studied participants in a new generation of online auctions called Lowest Unique Bid (LUB) auctions. The auction winner may be able to buy an expensive product at a very low price; cars, electronic devices, and even houses can be purchased for just a few hundred dollars.

The period of an auction is announced in advance. A bid can take any value from a minimum value L to a maximum value H. Each time a participant makes a bid, he/she has to pay a fee, and every participant is allowed to place multiple bids. The winner of the auction is the person who has placed the lowest unique bid and can buy the product for the value of the winning bid. For example, in Fig. 1, the winner is the participant who bid $3, because it is the lowest unique bid. All other bids are non-unique except for $5, which is not the lowest one.

The Highest Unique Bid (HUB) auction has the same mechanism except that the winner is the participant who has the highest unique bid. In Fig. 1, the participant with $5 is the winner in the HUB auction. Participants in LUB and HUB auctions attempt to find a single target whose position is determined by the bids.
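To make the auction mechanism concrete, the following small sketch determines the LUB and HUB winners from a list of bids. The helper functions and the example bid list are illustrative only; the bids are simply chosen to be consistent with the description of Fig. 1.

```python
from collections import Counter

def lub_winner(bids):
    """Return the lowest unique bid, or None if no bid is unique."""
    counts = Counter(bids)
    unique = [b for b in bids if counts[b] == 1]
    return min(unique) if unique else None

def hub_winner(bids):
    """Return the highest unique bid, or None if no bid is unique."""
    counts = Counter(bids)
    unique = [b for b in bids if counts[b] == 1]
    return max(unique) if unique else None

# Bids consistent with Fig. 1: $3 and $5 are the only unique bids,
# so $3 wins the LUB auction and $5 wins the HUB auction.
bids = [1, 1, 2, 2, 3, 4, 4, 5]
print(lub_winner(bids), hub_winner(bids))  # 3 5
```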

Fig. 1 Lowest unique bid auction

Radicchi et al. [35, 36] showed that the participants explore the bid space in a manner consistent with Levy flight: consecutive bid values are usually close together, but occasionally the participants make much longer jumps. Figure 2 shows a typical example of consecutive bid values placed by one person. At the end of each auction, the losing participants tend to adopt the winner's strategy, moving closer to it for the next auction.

Fig. 2 A typical example of consecutive bid values by a person

2.2 HMS algorithm

This subsection explains the HMS algorithm. The following concepts are used to develop this algorithm:

  1. Each participant has a strategy α,

  2. Each person can provide a bid,

  3. The next bid of every person is consistent with the Levy flight distribution,

  4. Multiple bids are allowed,

  5. The losing participants try to pick the winner's strategy for the subsequent auctions.

The HMS algorithm is explained in detail below.

2.2.1 Generating initial bids

The HMS algorithm is a population-based metaheuristic algorithm. Like other population-based metaheuristic algorithms, the searching process starts with the generation of a random population of candidate solutions. In this algorithm, each single solution is called a bid. In an $N_{Var}$-dimensional optimization problem, a bid is represented as follows:

$$ bid=[x_{1},x_{2},\ldots,x_{N_{Var}}] $$
(1)

The cost value of a bid is obtained by evaluating the cost function as follows:

$$ Cost \ Value \ of \ \text{a}\ bid=f(bid)=f(x_{1},x_{2},\ldots,x_{N_{Var}}) $$
(2)

First, a bids matrix of size $N_{pop} \times N_{Var}$ is generated as follows:

$$ X=\left[\begin{array}{c} X_{1}\\X_{2}\\ \vdots \\X_{N_{pop}} \end{array}\right]= \left[\begin{array}{cccc} x^{1}_{1}&x^{1}_{2}&\ldots&x^{1}_{N_{Var}}\\ x^{2}_{1}&x^{2}_{2}&\ldots&x^{2}_{N_{Var}}\\ \vdots&\vdots&\ddots&\vdots\\ x^{N_{pop}}_{1}&x^{N_{pop}}_{2}&\ldots&x^{N_{pop}}_{N_{Var}} \end{array}\right] $$
(3)

where $N_{pop}$ is the number of bids, $N_{Var}$ is the number of variables, and X is the bids matrix.
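As an illustration of the initialisation step in (1)-(3), a minimal sketch is given below. It assumes box constraints given by lower and upper bound vectors; the function names are ours and not taken from the paper.

```python
import numpy as np

def init_bids(n_pop, n_var, lower, upper, rng=np.random.default_rng()):
    """Generate the N_pop x N_Var bids matrix X uniformly within the bounds (Eq. 3)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    return lower + rng.random((n_pop, n_var)) * (upper - lower)

def evaluate(X, cost_fn):
    """Cost value of each bid, Eq. (2)."""
    return np.apply_along_axis(cost_fn, 1, X)

# Example: 30 bids for a 10-dimensional sphere function
sphere = lambda x: np.sum(x ** 2)
X = init_bids(30, 10, [-100] * 10, [100] * 10)
costs = evaluate(X, sphere)
```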

2.2.2 Mental search

The mental search represents the number of consecutive bid values produced for each bid. In this stage, some new bids are created around a bid based on Levy flight. The number of new bids generated for each bid is a random integer between a lower and an upper limit. Levy flight is a particular type of random walk in which the step size is determined by a Levy distribution; a random walk is a Markov chain in which the next position depends only on the current position. Figure 3 shows an example of Levy flight against Brownian motion. As shown in the figure, Levy flight consists of many small steps and occasional long jumps. In other words, Levy flight improves diversification and intensification simultaneously. It is worth noting that Levy flight is more efficient than Brownian motion for exploring unknown spaces.

Fig. 3 An example of Levy flight against Brownian motion

The following equation shows the Levy distribution:

$$ L(x)=\frac{1}{\pi}{\int}^{\infty}_{0}\exp(-\alpha q^{\beta})\cos(qx)\,dq $$
(4)

where β is called the distribution index, which is limited to 0 < β ≤ 2, and α is the distribution scale factor.

To generate each new position in the mental search, Levy flight is applied based on (5):

$$ NS=X^{i}+S $$
(5)

where S is calculated as follows:

$$ S=(2-iter\times(2/\max iter))\times\alpha \oplus Levy $$
(6)

where max iter is the maximum number of iterations, iter is the current iteration, and α is a random number. The operator ⊕ denotes entry-wise multiplication. The term $(2-iter\times(2/\max iter))$ is a reduction factor that decreases from 2 to 0 over the course of the run. This factor balances diversification and intensification: a larger reduction factor produces longer jumps and increases diversification at the beginning of the algorithm, while a smaller reduction factor produces shorter jumps and enhances intensification in the later stages.

The generation of the step size S is not trivial when using Levy flight. A simple method, discussed in detail by Yang [37, 38], can be summarized as follows:

$$\begin{array}{@{}rcl@{}} S&=&(2-iter\times(2/ \max iter))\times\alpha \oplus Levy\\ &=&(2-iter\times(2/ \max iter))\times 0.01\times\frac{u}{v^{1/\beta}}\times(x^{i}-x^{\ast}) \end{array} $$
(7)

where $x^{\ast}$ is the best position obtained so far, and u and v are random numbers drawn from normal distributions as below:

$$ u\sim N(0,\sigma^{2}_{u}),\qquad v \sim N(0,\sigma^{2}_{v}) $$
(8)

with

$$ \sigma_{u}=\left\{\frac{\Gamma(1+\beta)\sin\left(\frac{\pi\beta}{2}\right)}{\Gamma\left(\frac{1+\beta}{2}\right)\beta\,2^{(\beta-1)/2}}\right\}^{1/\beta},\qquad\sigma_{v}=1 $$
(9)

where Γ is the standard Gamma function.

One of the main parameters in Levy flight is β. This parameter is different for each bid because each person has a different strategy. For each bid, a random number between 0 and 2 is assigned to the parameter β. A lower β yields longer jumps and increases the ability to explore unknown areas (diversification), whereas a higher β yields smaller jumps and strengthens the intensification process.
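The step generation of (6)-(9) can be sketched as follows. This is our reading of the equations, generating the Levy-distributed numbers with Mantegna's method; the helper names and the use of |v| in the denominator are implementation details rather than prescriptions of the paper.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(x_i, x_best, beta, iter_, max_iter, rng=np.random.default_rng()):
    """Step size S of Eqs. (6)-(9): Mantegna's method scaled by the reduction factor."""
    sigma_u = ((gamma(1 + beta) * sin(pi * beta / 2)) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size=x_i.shape)   # u ~ N(0, sigma_u^2)
    v = rng.normal(0.0, 1.0, size=x_i.shape)       # v ~ N(0, 1)
    levy = u / np.abs(v) ** (1 / beta)             # Levy-distributed step
    reduction = 2 - iter_ * (2 / max_iter)         # decreases from 2 to 0
    return reduction * 0.01 * levy * (x_i - x_best)

def mental_search(x_i, x_best, q, beta, iter_, max_iter, cost_fn):
    """Generate q new bids around x_i (Eq. 5) and return the best of them."""
    candidates = [x_i + levy_step(x_i, x_best, beta, iter_, max_iter) for _ in range(q)]
    return min(candidates, key=cost_fn)
```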

Figure 4 illustrates the mental search process for three bids A, B, and C in a one-dimensional problem. The numbers of new bids generated for bids A, B, and C are 4, 3, and 5, respectively. As can be observed, each bid produces new bids (red and blue points in Fig. 4) at random positions around it, which increases the intensification property; moreover, the occasional long jumps help the diversification property. Finally, each bid is replaced with the best bid generated by the mental search operator (red points in Fig. 4). This process is conducted for all the bids.

Fig. 4 Mental search

Fig. 5 Grouping operator for a problem with one dimension

2.2.3 Grouping the bids

Every person may make multiple bids. To simulate multiple bids, a grouping procedure is proposed in which each group contains the bids belonging to one person. The grouping is performed by a clustering algorithm. Clustering is a pattern recognition technique for grouping a set of instances such that the instances in the same group are more similar to each other than to those in other groups. In this paper, the well-known K-means algorithm [39] is chosen for this purpose. After grouping, the mean cost value of each group is calculated. It can be said that as the number of local optima increases, a greater number of clusters is required; however, the number of local optima is unknown in advance.

In other words, the search space is divided into several regions, with the promising region chosen according to the mean cost value. Figure 5 illustrates the grouping operator for a problem with one dimension ($N_{Var}=1$). In this figure, there are 12 candidate solutions, which are divided into three groups.
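A possible sketch of the grouping step is shown below. The use of scikit-learn's KMeans is our implementation choice; the paper only specifies the K-means algorithm [39].

```python
import numpy as np
from sklearn.cluster import KMeans

def group_bids(X, costs, n_clusters=5, seed=0):
    """Cluster the bids, compute the mean cost of each group, and find the winner group."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    mean_costs = np.array([costs[labels == k].mean() for k in range(n_clusters)])
    winner_group = int(np.argmin(mean_costs))                # lowest mean cost
    members = np.where(labels == winner_group)[0]
    winner_bid = X[members[np.argmin(costs[members])]]       # best bid in the winner cluster
    return labels, winner_group, winner_bid
```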

2.2.4 Moving bids toward the best strategy

As mentioned earlier, the losers try to get close to the winner's strategy. After the bid groups are created, the group with the best mean cost value is selected as the winner group, which determines a promising region for the other bids. Then, the best bid in the winner cluster is selected in order to move the rest of the bids toward it. It is worth mentioning that the best cost value among all the bids might not belong to the winner group.

The following formula is proposed in this regard:

$$ {}^{t+1}x^{i}_{n}={}^{t}x^{i}_{n}+C\times\left(r\times{}^{t}winner_{n}-{}^{t}x^{i}_{n}\right) $$
(10)

where ${}^{t+1}x^{i}_{n}$ indicates the nth element of $X^{i}$ at iteration t + 1, ${}^{t}winner_{n}$ is the nth element of the best bid in the winner cluster, t is the current iteration, C is a constant (C = 2 in this paper), and r is a random number drawn from the uniform distribution between 0 and 1. Figure 6 shows how a bid updates its position using the moving operator.
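A minimal sketch of the moving operator in (10) is given below. The paper does not state whether r is a single scalar or drawn per element, so a per-element draw is assumed here.

```python
import numpy as np

def move_toward_winner(X, winner_bid, C=2.0, rng=np.random.default_rng()):
    """Eq. (10): pull every bid toward the best bid of the winner cluster."""
    r = rng.random(X.shape)   # assumption: one uniform random number per element
    return X + C * (r * winner_bid - X)
```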

Fig. 6 Position updating in the HMS algorithm

2.2.5 General structure of HMS algorithm

In this section, the HMS algorithm for solving optimization problems is summarized. The pseudo code for the HMS algorithm is presented in Fig. 7. Similar to other population-based metaheuristic algorithms, the HMS algorithm starts with a set of random bids (X). Then, the cost of each bid is calculated. For each bid, an integer number (q) is generated that determines the number of mental searches for that bid. The mental search operator is then applied, generating some new bids (NS) around each bid $X^{i}$ using Levy flight. The best solution generated in this step replaces $X^{i}$ if its cost value is better than that of $X^{i}$. Next, the search space is divided into some groups by the clustering operator, and the group with the best mean cost value is selected as the winner group, determining a promising region. The other bids then move toward the best bid in the winner cluster. At each stage, the best bid is saved. This process continues until a stop condition is satisfied.

Fig. 7 The pseudo code for the HMS algorithm
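Putting the operators together, the following is a compact, approximate reading of the pseudo code in Fig. 7. It reuses the helper functions sketched earlier (init_bids, evaluate, mental_search, group_bids, move_toward_winner) and should be taken as an illustration rather than the authors' reference implementation.

```python
import numpy as np

def hms(cost_fn, lower, upper, n_pop=30, max_iter=500, n_clusters=5,
        m_low=2, m_high=10, C=2.0, rng=np.random.default_rng()):
    """Rough HMS sketch: mental search, grouping, and moving, with an elite bid kept."""
    n_var = len(lower)
    X = init_bids(n_pop, n_var, lower, upper, rng)
    costs = evaluate(X, cost_fn)
    best = X[np.argmin(costs)].copy()
    for it in range(max_iter):
        for i in range(n_pop):
            q = rng.integers(m_low, m_high + 1)      # number of mental searches for this bid
            beta = rng.uniform(0.1, 2.0)             # per-bid Levy index, 0 < beta <= 2
            cand = mental_search(X[i], best, q, beta, it, max_iter, cost_fn)
            if cost_fn(cand) < costs[i]:             # keep the better of the old and new bid
                X[i], costs[i] = cand, cost_fn(cand)
        _, _, winner_bid = group_bids(X, costs, n_clusters)
        X = np.clip(move_toward_winner(X, winner_bid, C, rng), lower, upper)
        costs = evaluate(X, cost_fn)
        if costs.min() < cost_fn(best):              # save the elite bid
            best = X[np.argmin(costs)].copy()
    return best, cost_fn(best)
```

For example, `hms(lambda x: np.sum(x ** 2), [-100] * 30, [100] * 30)` runs this sketch on a 30-dimensional sphere function with the default parameter values reported in Section 3.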

To see how the HMS algorithm can be effective in solving optimization problems, some points are noted below:

  • Mental search generates neighbouring solutions around a given solution and therefore enhances the intensification property. In addition, the occasional long jumps of Levy flight increase the diversification property.

  • Reduction factor allows the HMS algorithm to move smoothly from diversification to intensification.

  • The grouping operator quickly finds the promising regions of the search space. This clearly differs from other population-based metaheuristic algorithms, which try to find the promising region by using the best solution; the best solution may not be a good representative of the promising region.

  • The best solution in the winner cluster guides other solutions toward the promising regions of the search space.

  • Levy flight reduces the probability of local optima stagnation.

  • The HMS algorithm is a population-based metaheuristic algorithm and therefore intrinsically benefits from higher diversification and better local optima avoidance than single-based metaheuristic algorithms.

  • The best solution of each iteration is saved (elite).

  • HMS algorithm has very few parameters to be adjusted.

  • The HMS algorithm is gradient-free and treats problems as black boxes. Hence, various problems in different fields can be solved by using the HMS algorithm.

2.2.6 Differences between HMS and other population-based metaheuristic algorithms

This section points out some distinctions of the HMS algorithm vis-à-vis other population-based metaheuristic algorithms. Some of the most important distinctions are listed below.

  • One of the major differences is that the HMS algorithm uses a clustering algorithm to determine the promising region. Most metaheuristic algorithms, such as PSO, ICA, and DE, find the promising regions by using the best solutions. However, the best solution may not be a good representative of the promising region; Fig. 8 shows that the best solution (red circle) sometimes does not lie in the best region. In the HMS algorithm, a clustering procedure is used to find the promising region, so this region is identified from several similar (close) solutions, which increases the probability of finding a truly promising region.

  • Since the HMS algorithm shares similarities with algorithms such as PSO and DE, a comparison between the HMS algorithm and these algorithms is given below:

    1) HMS, PSO, and DE all move their solutions, but the movement strategies differ. In the PSO algorithm, the movement direction of each agent is calculated using only the two best positions, pbest_i and gbest. In the DE algorithm, each agent moves on the basis of the differences among other solutions. In the HMS algorithm, the movement direction is calculated on the basis of the best solution in the winner cluster, which, as noted earlier, is not necessarily the best solution found so far.

    2) Unlike PSO and DE, HMS searches around a solution using Levy flight.

    3) PSO uses a type of memory to update the velocity (through pbest_i and gbest), whereas HMS is memory-less and only the current positions of the solutions affect the updating procedure.

  • In the mental search, we use an operator based on Levy flight. Levy flight is also used in the Cuckoo Search (CS) algorithm, which generates a new solution by the following equation:

$$ x^{t+1}_{i}=x^{t}_{i}+\alpha\cdot Levy $$
(11)

where α is a constant (step size).

In the following, the main differences between the HMS and the CS algorithms are explained:

  1) The proposed algorithm uses a different strategy to generate a new solution. In (7), the reduction factor $(2-iter\times(2/\max iter))$ is used to increase the efficiency of the algorithm; it decreases from 2 to 0 and balances diversification and intensification. At the beginning of the algorithm, the reduction factor is large, which enhances diversification, while in the later stages it becomes small, which increases the intensification property.

  2) The parameter β in (7) is different for each solution, which increases efficiency: a lower β emphasizes diversification, while a higher β emphasizes intensification.

  3) The HMS algorithm moves solutions toward the best strategy, an operator that is not present in the CS algorithm.

  4) As previously mentioned, the HMS algorithm uses a clustering algorithm to find the promising region.

  5) Finally, the underlying search ideas of the two algorithms are different: CS mimics the behaviour of cuckoos, while HMS simulates human mental search.

Fig. 8 The best solution may not be a good representative for the promising region

3 Experimental results

In this section, experimental results are presented from different aspects to study the proposed algorithm's efficiency. The test functions can be divided into seven groups: unimodal, multimodal, fixed-dimension, high-dimensional, composite, and shifted and rotated test functions, and classic engineering problems [17, 40-42]. The proposed algorithm is compared with nine state-of-the-art population-based metaheuristic algorithms, which are briefly described below.

  • Particle Swarm Optimization (PSO) [19, 20]: It is one of the most well-known population-based metaheuristic algorithms inspired by the social behaviour of birds. Each search agent updates its position using its velocity, its own best position, and the best position of the overall search agents.

  • Harmony Search (HS) [32]: HS is one of the most popular population-based metaheuristic algorithms. It is inspired by the improvisation of music players. The HS algorithm has three main operators: harmony memory consideration, pitch adjustment, and randomization.

  • Shuffled Frog-leaping Algorithm (SFLA) [24]: The SFLA algorithm mimics the social behaviour of frogs. It is a memetic algorithm that combines local and global search.

  • Artificial Bee Colony (ABC) [43]: The ABC algorithm imitates the foraging behaviour of honey bees. There are two groups of bees in the ABC algorithm: scouts, who search the area surrounding the nest for new food sources, and onlookers, who find a food source through the information shared by the employed artificial bees.

  • Imperialist Competitive Algorithm (ICA) [33]: This algorithm is inspired by the imperialistic competition among countries. The solutions are divided into two groups based on their power: imperialists and colonies. The two leading operators in this algorithm are assimilation and revolution. Assimilation makes the colonies get closer to the imperialist and revolution is a random sudden change in the position of some solutions.

  • Biogeography-based Optimization (BBO) [18]: BBO is based on the mathematical model of biogeography. There are two main operators in the BBO algorithm: migration and mutation. Information is shared among solutions that depend on the emigration rate and the immigration rate of each solution. In addition, mutation is used to enhance the diversity of the population.

  • Firefly Algorithm (FA) [23]: FA mimics the flashing behaviour of fireflies. In this algorithm, for any two fireflies, the one with the higher cost value is attracted to the one with the lower cost value.

  • Grey Wolf Optimizer (GWO) [25]: The GWO algorithm is one of the more recent algorithms in this field. It simulates the hunting method of grey wolves and has four primary operators: hunting, searching for prey, encircling prey, and attacking prey. The algorithm has shown competitive performance in comparison to other algorithms.

  • Whale Optimization Algorithm (WOA) [26]: This algorithm is one of the latest population-based metaheuristic algorithms and mimics the hunting behaviour of humpback whales. WOA includes three operators: searching for prey, encircling prey, and the bubble-net foraging behaviour of humpback whales. The algorithm has presented competitive results compared to other state-of-the-art population-based metaheuristic algorithms.

In all the experiments, the population size and the number of iterations are set to 30 and 500, respectively, for all the algorithms. For the HMS algorithm, the number of clusters is set to 5. In addition, the default values of the parameters C, $M_L$, and $M_H$ are set to 2, 2, and 10, respectively. The default parameter settings for the other compared algorithms are listed in Table 1. It is worth mentioning that parameter tuning is a trial-and-error process; finding proper parameter values is time-consuming, and so we have not attempted to find the best parameters. Hence, the analysis of the parameters needs more research in the future.

Table 1 Default parameter settings

Metaheuristic algorithms are stochastic algorithms. Therefore, the results are not the same in each run of the algorithm. Consequently, each algorithm is run 50 times and the statistical results including average, standard deviation, minimum, and maximum are reported.

3.1 Results of the algorithm on unimodal test functions

The unimodal functions have a single global optimum and no local optima, and thus they can be used to evaluate the intensification of algorithms. These test functions are listed in Table 2, in which D is the dimension of the problem, Range is the boundary of the search space, and $f_{min}$ is the optimum value. Figure 9 shows typical 2D plots of the test functions. The results on the unimodal functions are reported in Table 3. According to this table, the HMS algorithm provides very competitive results in most of the test cases. The HMS algorithm finds the best solution for F1, F2, F3, F4, and F7, but fails to find the best solution for F5 and F6; nevertheless, its performance is acceptable. These results can also be observed from the convergence curves in Fig. 10. According to Fig. 10, the convergence rate of HMS is high in most cases, which is due to the grouping operator introduced previously.

Fig. 9 Search space of unimodal test functions

Fig. 10 Convergence curves on unimodal test functions

Table 2 Unimodal test functions
Table 3 The statistical results of unimodal test functions

In Table 3, the algorithms are ranked for each test function from the smallest average to the largest. The average rank of each algorithm, and subsequently the overall rank, is then reported. The results show that the HMS algorithm achieves the best overall rank, which indicates its competence regarding intensification.
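The ranking procedure itself is simple arithmetic and can be reproduced, for instance, as in the following sketch (illustrative only; the per-function averages would come from Table 3):

```python
import numpy as np
from scipy.stats import rankdata

def overall_ranks(avg_costs):
    """Rank the algorithms per test function (1 = smallest average), then average the ranks.

    avg_costs: array of shape (n_functions, n_algorithms) holding averages over 50 runs.
    """
    per_function = np.apply_along_axis(rankdata, 1, avg_costs)
    return per_function.mean(axis=0)
```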

3.2 Results of the algorithm on multimodal test functions

Multimodal test functions have many local optima. These functions are useful for examining the diversification of an algorithm and its ability to escape from local optima. Table 4 presents the details of the multimodal test functions, and 2D plots of their search spaces are shown in Fig. 11. Table 5 presents the results of the HMS algorithm on the multimodal test functions. The HMS algorithm gives the best performance for functions F9, F10, and F11, and the second best performance on most of the remaining test functions. The rank of each algorithm is computed on the basis of the average value for each test function, and the average and overall ranks are then calculated. According to this table, the HMS algorithm earns the best rank compared to the others. Given the characteristics of multimodal functions, it can be said that the HMS algorithm has an excellent capability for diversification. The convergence curves of the test functions are shown in Fig. 12, which indicates that the HMS algorithm converges faster than the other algorithms in most cases. Although both HMS and WOA can find the optimum of the F9 function, Fig. 12 shows that the HMS algorithm finds it in fewer than 30 iterations.

Table 4 Multimodal test functions
Table 5 The statistical results of multimodal test functions
Fig. 11 Search space in multimodal test functions

Fig. 12 Convergence curves on multimodal test functions

3.3 Results of the algorithm on fixed-dimension multimodal functions

These functions are similar to the multimodal functions; however, their dimensions are low and fixed. Table 6 presents the details of the test functions. A 2D plot of the functions is shown in Fig. 13. The results of the HMS algorithm on fixed-dimension multimodal functions are given in Table 7 and Fig. 14. Table 7 shows that the HMS algorithm has the best performance on the majority of the test functions. The rank of each algorithm is computed in Table 7. It can be observed from the last row of this table, that the HMS algorithm is ranked first for the fixed-dimension functions.

Fig. 13 Search space in the fixed-dimension multimodal test functions

Fig. 14 The convergence curves on fixed-dimension multimodal test functions

Table 6 Fixed-dimension multimodal test functions
Table 7 The statistical results of fixed-dimension multimodal test functions

3.4 Results of the algorithm on high dimensional test functions

To demonstrate the competence of the HMS algorithm, the results of solving 200-dimensional versions of the unimodal and multimodal test functions from the previous subsections are reported in Table 8. As seen in Table 8, the HMS algorithm is the best optimization algorithm on nine out of 13 test functions, while WOA finds the best solution on five out of 13. The table also shows that the HMS algorithm is ranked first, with the WOA algorithm ranked second. The weak performance of most of the algorithms shows that high-dimensional test functions are very challenging, which underlines the HMS algorithm's efficiency in solving high-dimensional problems.

Table 8 The statistical results of high dimensional test functions

3.5 Results of the algorithm on complex test functions

This category of test functions comprises composite functions, which are very challenging because they have an enormous number of local optima and different shapes in different regions of the search space. The details of the composite functions are presented in Table 9, and the search spaces of the complex test functions are shown in Fig. 15. These functions test the balance between intensification and diversification, which is a cardinal characteristic of population-based metaheuristics. Table 10 presents the statistical results of 50 independent runs. According to this table, the HMS algorithm is very competitive compared to the others; in particular, it presents the best results on all the composite functions except F29, for which it gives the second best result. It can also be observed that HMS holds the overall first rank on the composite test functions. It is evident that the HMS algorithm balances intensification and diversification better than the other compared algorithms. Figure 16 shows that the HMS algorithm converges toward the optimum solution faster than the others.

Table 9 The complex test functions
Table 10 The statistical results of the complex test functions
Fig. 15 Search space in the complex test functions

Fig. 16 The convergence curves on the complex test functions

3.6 Results of the algorithm on shifted and rotated test functions

To further verify the effectiveness of the HMS algorithm, a set of 15 CEC 2005 shifted and rotated test functions is evaluated in this subsection. In these functions, the optimum is shifted or rotated to other locations to provide more challenging test functions. A summary of these functions is shown in Table 11, and further details can be found in [44]. 2D plots of these functions are shown in Fig. 17. Table 12 presents the experimental results of the algorithms. From the results, the HMS algorithm performs the best on seven test functions and the second best on five test functions, while the BBO algorithm is the best on two test functions. The rank of each algorithm is computed in Table 12, and it can be seen that the HMS algorithm ranks first on the shifted and rotated test functions.

Table 11 Shifted and rotated test functions
Table 12 The statistical results of shifted and rotated test functions
Fig. 17 Search space in shifted and rotated test functions

3.7 Nonparametric statistical analysis results

In this section, nonparametric statistical methods are applied to compare the HMS algorithm with the other algorithms. Such statistical results are necessary due to the stochastic nature of metaheuristics [45]. Nonparametric statistical tests are divided into two classes: pairwise comparisons and multiple comparisons. The pairwise comparison is a comparison between two algorithms, while the multiple comparisons compare more than two algorithms. In this paper, two well-known nonparametric tests, the Wilcoxon signed rank test (pairwise comparison) and the Friedman test (multiple comparisons), are conducted for this purpose. Further details about nonparametric statistical tests for metaheuristics can be found in [45].

The null hypothesis H0 states that there is no difference between two algorithms, whereas the alternative hypothesis H1 indicates a difference. A level of statistical significance (α) is used to control the probability of rejecting the null hypothesis while it is true. If the p-value is less than α, then H0 is rejected.

Table 13 shows the results of the Wilcoxon signed rank test. However, the unadjusted P-value alone is not suitable because the same data is evaluated several times [45, 46], and an accumulated error is obtained from the combination of pairwise comparisons. Thus, a post hoc test is necessary to control the Family-Wise Error Rate (FWER), which is the probability that at least one comparison rejects a correct null hypothesis. For this purpose, the well-known Holm method [47] is applied for the post hoc analysis (see [45] for more information about this method). Table 13 presents the unadjusted and adjusted P-values. It is evident that the HMS algorithm gives a significant improvement over all the compared algorithms with a significance level of 0.05.
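For reference, the pairwise test and the Holm adjustment can be sketched as follows; the use of SciPy and the manual Holm step-down are our implementation choices and are not prescribed by the paper.

```python
import numpy as np
from scipy.stats import wilcoxon

def holm_adjust(p_values):
    """Holm step-down adjustment of a list of unadjusted p-values."""
    p = np.asarray(p_values, float)
    order = np.argsort(p)
    m = len(p)
    adjusted = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        running_max = max(running_max, (m - rank) * p[idx])
        adjusted[idx] = min(1.0, running_max)
    return adjusted

# hms_results and rival_results would hold, e.g., per-function results of two algorithms:
# stat, p = wilcoxon(hms_results, rival_results)
# adjusted = holm_adjust([p_pso, p_hs, p_sfla, ...])   # one p-value per compared algorithm
```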

Table 13 Wilcoxon signed rank test

Another statistical test conducted to demonstrate the effectiveness of the HMS algorithm is the Friedman test. As mentioned earlier, it is a multiple comparison test; the details of this statistical approach can be found in [45]. Table 14 shows the average ranks computed with the Friedman test. According to this table, the HMS algorithm obtains the best rank. In addition, the P-value in Table 14 indicates the existence of significant differences among the examined algorithms.

Table 14 The Friedman ranks

3.8 Sensitivity analysis

This subsection discusses the sensitivity analysis of the HMS algorithm's input parameters. Sensitivity analysis is the study of how different values of the input parameters impact the output; it indicates the robustness of the HMS algorithm to changes in the input parameters. Clearly, fewer sensitive parameters are preferable, since this makes the tuning procedure easier.

For this purpose, F7 and F13 are studied in the sensitivity analysis (this approach is inspired by the method applied in [46, 48]). The reason for choosing the F7 function is that the HMS algorithm provides the best result on it, so this discussion aims to show that this superiority is kept after changing the values of the input parameters. The reason for choosing the F13 function is that the HMS algorithm does not provide the best result on it, so the following discussion shows that the HMS algorithm can provide better results by changing the input parameters.

As shown in Figs. 18 and 19, the results depend strongly on the number of clusters; hence, finding an appropriate number of clusters is necessary to achieve a better solution. F7 is a unimodal function with a single optimum, so a large number of clusters is not needed to find a suitable region. As seen in Fig. 18, the performance decreases as the number of clusters increases. F13, in contrast, has many local optima, and therefore more clusters are needed; as expected, Fig. 19 shows that increasing the number of clusters enhances the performance of the algorithm.

Fig. 18 Plot of input parameters corresponding to F7 test function solved by HMS

Fig. 19 Plot of input parameters corresponding to F13 test function solved by HMS

Other parameters of the HMS algorithm are the minimum and maximum numbers of mental processes. To analyse the sensitivity of these parameters, the algorithm was tested with different values. According to the experimental results, the minimum and maximum numbers of mental processes significantly affect the performance. As shown in Figs. 18 and 19, increasing the minimum and maximum numbers of mental processes enhances performance because more searches are carried out around a specific point.

3.9 Classical engineering problems

In this section, the performance of the HMS algorithm is evaluated on three engineering design problems: pressure vessel design, welded beam design, and three-bar truss design. These problems are constrained. Constrained optimization is the process of finding the optimal value of an objective function subject to constraints on the decision variables; the constraints can be inequalities or equalities. To solve these problems, a constraint handling approach must be added to the optimization algorithm. There are different approaches for constraint handling, such as penalty functions, repair algorithms, and special operators. Finding the most suitable constraint handling approach is out of the scope of this work, and therefore one of the simplest methods, the death penalty, is chosen. The death penalty assigns a very large objective function value to any solution that violates one of the constraints.
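A minimal sketch of the death-penalty approach is shown below, assuming every constraint is supplied as a function g with feasibility meaning g(x) ≤ 0; the penalty constant is an arbitrary large value.

```python
def death_penalty(cost_fn, constraints, big=1e20):
    """Wrap a cost function so that any constraint violation yields a huge cost value."""
    def penalised(x):
        if any(g(x) > 0 for g in constraints):   # feasibility requires g(x) <= 0 for all g
            return big
        return cost_fn(x)
    return penalised

# Example: minimise x0 + x1 subject to g(x) = 1 - x0 * x1 <= 0
f = death_penalty(lambda x: x[0] + x[1], [lambda x: 1 - x[0] * x[1]])
```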

3.9.1 Pressure vessel design

Pressure vessel design is a popular engineering problem in the literature (Fig. 20). The aim of this problem is to minimize the fabrication cost of a vessel. The problem has four variables and four inequality constraints. The four variables are the thickness of the head ($T_h$), the thickness of the shell ($T_s$), the inner radius (R), and the length of the cylindrical section without the head (L). These parameters are shown in Fig. 20. The problem is formulated as follows:

$$ \begin{array}{ll} \text{Consider}&\vec{x}=[x_{1},x_{2},x_{3},x_{4}]=[T_{s},T_{h},R,L],\\ \text{Minimize}&f(\vec{x})=0.6224x_{1}x_{3}x_{4}+1.7781x_{2}x^{2}_{3}+3.1661x^{2}_{1}x_{4}+19.84x^{2}_{1}x_{3},\\ \text{Subject \ to}&g_{1}(\vec{x})=-x_{1}+0.0193x_{3}\leq 0,\\ &g_{2}(\vec{x})=-x_{2}+0.0095x_{3}\leq 0,\\ &g_{3}(\vec{x})=-\pi x^{2}_{3}x_{4}-\frac{4}{3}\pi x^{3}_{3}+1296000\leq 0,\\ &g_{4}(\vec{x})=x_{4}-240\leq 0,\\ \text{Variable \ range:}&0\leq x_{1}\leq 99,\\ &0\leq x_{2}\leq 99,\\ &0\leq x_{3}\leq 200,\\ &0\leq x_{4}\leq 200, \end{array} $$
(12)
Fig. 20 Pressure vessel design problem

This problem is popular among researchers, and both metaheuristic and mathematical approaches have been applied to it. Some of the metaheuristic methods applied to this problem are GA [49-51], PSO [52], DE [53], ES [54], ACO [55], GWO [25], and WOA [26]. The mathematical methods include the augmented Lagrangian multiplier method [56] and branch-and-bound [57]. The results of the HMS algorithm compared to the other algorithms in the literature are presented in Table 15. As can be observed, the HMS algorithm outperforms all the compared algorithms.

Table 15 Comparison results for the pressure vessel design problem

3.9.2 Welded beam design

The aim of this problem is to design a welded beam with the lowest fabrication cost. The overall structure of this problem is shown in Fig. 21. There are four variables to be optimized, including the weld thickness (h), the length of the bar’s attached part (l), the bar’s height (t), and the bar’s thickness (b). These variables should satisfy seven constraints.

Fig. 21 Welded beam design problem

This problem can be written as follows:

$$\begin{array}{@{}rcl@{}} \text{Consider}&&\vec{x}=[x_{1},x_{2},x_{3},x_{4}]=[h,l,t,b],\\ \text{Minimize}&&f(\vec{x})=1.10471x^{2}_{1}x_{2}+0.04811x_{3}x_{4}(14.0+x_{2}),\\ \text{Subject \ to}&&g_{1}(\vec{x})=\tau (\vec{x})-\tau_{\max}\leq 0,\\ &&g_{2}(\vec{x})=\sigma (\vec{x})-\sigma_{\max}\leq 0,\\ &&g_{3}(\vec{x})= \delta (\vec{x})-\delta_{\max}\leq 0, \end{array} $$
(13)
$$\begin{array}{@{}rcl@{}} &&g_{4}(\vec{x})= x_{1}-x_{4}\leq 0,\\ &&g_{5}(\vec{x})= P-P_{c}(\vec{x})\leq 0,\\ &&g_{6}(\vec{x})= 0.125-x_{1}\leq 0,\\ &&g_{7}(\vec{x})= 0.10471x^{2}_{1}+0.04811x_{3}x_{4}(14.0+x_{2})-5.0\leq0,\\ &&\tau(\vec{x})=\sqrt{(\tau^{\prime})^{2}+2\tau^{\prime}\tau^{\prime\prime}\frac{x_{2}}{2R}+(\tau^{\prime\prime})^{2}},\\ &&\tau^{\prime}=\frac{P}{\sqrt{2}x_{1}x_{2}},\\ &&\tau^{\prime\prime}=\frac{MR}{J},\\ &&M=P\left(L+\frac{x_{2}}{2}\right), \end{array} $$
$$\begin{array}{@{}rcl@{}} &&R=\sqrt{\frac{x^{2}_{2}}{4}+\left( \frac{x_{1}+x_{3}}{2}\right)^{2}}, \\ \text{where:}&&J=2\left\{\sqrt{2}x_{1}x_{2}\left[\frac{x^{2}_{2}}{4}+\left( \frac{x_{1}+x_{3}}{2}\right)^{2}\right]\right\},\\ &&\sigma(\vec{x})=\frac{6PL}{x_{4}x^{2}_{3}},\\ &&\delta(\vec{x})=\frac{6PL^{3}}{Ex^{2}_{3}x_{4}},\\ &&P_{c}(\vec{x})=\frac{4.013E\sqrt{\frac{x^{2}_{3}x^{6}_{4}}{36}}}{L^{2}}\left( 1-\frac{x_{3}}{2L}\sqrt{\frac{E}{4G}}\right), \end{array} $$
$$\begin{array}{@{}rcl@{}} &&P=6000\,lb,\ L=14\,in.,\ \delta_{\max}=0.25\,in.,\\ &&E=30\times10^{6}\,psi,\ G=12\times10^{6}\,psi,\\ &&\tau_{\max}=13600\,psi,\ \sigma_{\max}=30000\,psi,\\ \text{Variable \ range:}&&0.1\leq x_{1}\leq 2,\\ &&0.1\leq x_{2}\leq 10,\\ &&0.1\leq x_{3}\leq 10,\\ &&0.1\leq x_{4}\leq 2. \end{array} $$

Given the popularity of this problem, numerous studies on it can be found in the literature. Some are GA [58, 59], HS [60], CPSO [61], GWO [25], WOA [26], Richardson's random method [62], the Simplex method [62], Davidon-Fletcher-Powell [62], and Griffith and Stewart's successive linear approximation [62]. The comparison results are presented in Table 16. As seen from the table, the HMS algorithm can find a design with the minimum cost.

Table 16 Comparison results for the welded beam design problem

3.9.3 Three-bar truss design problem

The objective of the three-bar truss design problem is to design a truss with the minimum weight. The overall structure of this problem is shown in Fig. 22. It is popular in the literature [63-66] due to its difficult constrained search space [64, 67]. In fact, the constraints are the most important issue in designing a truss. It can be formulated as below:

$$\begin{array}{@{}rcl@{}} &\text{Consider}&\vec{x}=[x_{1},\mathrm{x}_{2}]=[A_{1}, A_{2}],\\ &\text{Minimize}&f(\vec{x})=(2\sqrt{2}x_{1}+x_{2})^{\ast}l,\\ &\text{Subject \ to}&g_{1}(\vec{x})=\frac{\sqrt{2}x_{1}+x_{2}}{\sqrt{2}x_{1}^{2}+2x_{1}x_{2}}P-\sigma\leq0,\\ &&g_{2}({x})=\frac{x_{2}}{\sqrt{2}x_{1}^{2}+2x_{1}x_{2}}P-\sigma\leq0,\\ &&g_{3}({x})= \frac{1}{\sqrt{2}x_{2}+x_{1}}P-\sigma\leq0,\\ &\text{Variable range:}& 0\leq x_{1},x_{2}\leq 1,\\ &\text{where}&l=100cm\quad p=2kN/cm^{2} \quad \sigma=2kN/cm^{2} \end{array} $$
(14)
Fig. 22 Three-bar truss design problem [64]

Table 17 compares the results of the HMS algorithm with the other algorithms. The results of the algorithms demonstrate that the HMS algorithm outperforms four algorithms. In addition, HMS shows very close results to MFO, DEDS, and PSO-DE algorithms. It shows that the HMS algorithm can solve the three-bar truss design problem effectively. It should be noted that the TSA algorithm does not satisfy one of the constraints [67].

Table 17 Comparison results for three-bar truss design problem

4 Conclusion

In this study, we proposed a simple but powerful population-based metaheuristic called Human Mental Search (HMS). HMS is inspired by the exploration strategy of the bid space in online auctions. The HMS algorithm employs three important behaviours, namely mental search, grouping, and moving. First, each solution produces new solutions based on Levy flight, which simultaneously enhances the quality of diversification and intensification. The solutions are then placed into different groups; we used the well-known K-means clustering algorithm for this purpose. In the moving strategy, each solution moves toward the best bid in the winner group. To verify the efficiency of the HMS algorithm, experiments were conducted on several test functions commonly applied in the literature, including unimodal, multimodal, fixed-dimension, complex, high-dimensional, shifted, and rotated functions. The performance of the HMS algorithm was compared with nine state-of-the-art population-based metaheuristics. The results revealed that the HMS algorithm provides superior performance in most cases. Moreover, three classic engineering problems were evaluated to show the HMS algorithm's efficiency in unknown and challenging search spaces.

In addition, nonparametric statistical methods, including the Wilcoxon signed rank test and the Friedman test, were applied. The results of the Wilcoxon signed rank test followed by the post hoc analysis indicated an improvement of HMS over all the compared algorithms at a significance level of α = 0.05. The Friedman test showed that HMS was ranked first, and the P-value confirmed the existence of significant differences among the compared algorithms.

Another analysis conducted was the sensitivity analysis of the HMS input parameters. It showed that HMS kept its superiority even when the parameter values were changed, and, on another test function, that HMS can improve its performance by changing the settings.

The reasons for the HMS algorithm's high performance could be: (1) the mental search operator, which searches around a solution and thereby enhances the intensification property; (2) the grouping operator, which quickly finds the promising regions of the search space; (3) the reduction factor, which allows the HMS algorithm to move smoothly from diversification to intensification; (4) the moving operator, which moves the other solutions toward the promising region; and (5) the fact that HMS is a population-based algorithm, which intrinsically benefits from high diversification and local optima avoidance compared to single-based metaheuristic algorithms.

Notwithstanding the significant performance of the HMS algorithm, the following points should be considered in future investigations:

  • In this work, the performance of the HMS algorithm was evaluated on a set of test functions. In the future, the performance of HMS can be examined on more problems, especially real-world applications such as scheduling, the knapsack problem, controller design, microwave design, and water resource management.

  • In the current work, the k-means clustering algorithm is used for grouping the solutions. Other clustering algorithms such as hierarchical clustering and distribution-based clustering algorithms can be applied as well.

  • One of the most important parameters of the HMS algorithm is the number of clusters. Some clustering algorithms can find the optimum number of clusters automatically. For future research, it is recommended to use such algorithms for clustering.

  • In the current work, the performance of the HMS algorithm is evaluated only experimentally on standard test functions. The convergence of the HMS algorithm should also be analysed theoretically, for example by means of dynamical systems, in the future.