Abstract
Image segmentation is the process of separating the pixels of an image into multiple classes, enabling the analysis of objects in the image. Multilevel thresholding (MTH) is a method used to perform this task, and the problem is to obtain an optimal threshold that properly segments each image. Methods such as Kapur entropy or the Otsu method, which can be used as objective functions to determine the optimal threshold, are efficient in determining the best threshold for bi-level thresholding; however, they are not effective for MTH due to their high computational cost. This paper integrates the heap-based optimizer (HBO) with opposition-based learning to produce an efficient method for MTH image segmentation, termed the improved heap-based optimizer (IHBO), which addresses the high computational cost of MTH and overcomes the weaknesses of the original HBO. The IHBO is proposed to improve the convergence rate and local search efficiency of the search agents of the basic HBO, and it is applied to solve the MTH problem using the Otsu and Kapur methods as objective functions. The performance of the IHBO-based method was evaluated on the CEC’2020 test suite and compared against seven well-known metaheuristic algorithms: the basic HBO, the salp swarm algorithm, moth flame optimization, gray wolf optimization, the sine cosine algorithm, harmony search optimization, and electromagnetism optimization. The experimental results revealed that the proposed IHBO algorithm outperformed its counterparts in terms of fitness values as well as other performance indicators, such as the structural similarity index (SSIM), feature similarity index (FSIM), and peak signal-to-noise ratio (PSNR). Therefore, the IHBO algorithm was found to be superior to other segmentation methods for MTH image segmentation.
Introduction
Segmentation has an important role in the field of image processing1. Segmentation is the process of separating an image into two or more homogeneous segments based on the characteristics of the pixels in the image. It is utilized in various domains, such as industry and medicine2, agriculture3, and surveillance4. Thresholding is one of the most common image segmentation approaches. To define the thresholds, most methods use the histogram of the image5, which is vital for determining the probability distribution value of pixels in the image6. Thresholding obtains the information of the histogram from an image and determines the best threshold (th) for categorizing the pixels into various groups. Image thresholding approaches can be categorized into two types: multi-level and bi-level thresholding. Bi-level thresholding techniques use one threshold to separate an image into two groups, whereas multi-level thresholding (MTH) uses two or more thresholds to separate an image into many groups1.
To obtain the best threshold values in MTH segmentation, thresholding techniques can be classified into two approaches: parametric and non-parametric. In parametric techniques, each group of the grayscale range is assumed to follow a Gaussian distribution, and the approach relies on evaluating the histogram through mathematical operations. The Gaussian mixture model is widely used to define the set of functions that approximate the histogram, after which the best thresholds are selected. Non-parametric approaches employ distinct methods to separate the pixels into homogeneous areas; the best threshold is then defined using statistical information, such as entropy or variance. The Kapur method7 and Otsu method8 are used in this study. The Otsu method selects the best thresholds by maximizing the variance among groups, whereas the Kapur method defines the threshold value by maximizing the entropy of the segmented classes. These methods are efficient for one or two threshold values. However, they have several restrictions; for example, they become very costly in computation as the number of thresholds increases. Non-parametric techniques nevertheless have several advantages: in particular, they are computationally faster than parametric methods, especially when used in optimization problems, where metaheuristic algorithms (MAs) can drive the search process. Generally, these algorithms provide better results than techniques dependent on thresholding methods alone9,10.
Metaheuristic algorithms are used to solve challenging real-world problems. In the past several decades, researchers have extensively demonstrated the ability of MAs to solve several types of difficult optimization problems in various areas, such as optimization11, communications12, bioinformatics13, drug design14, image segmentation15,16 and feature selection17, mainly because these algorithms are general-purpose and easy to implement18. MAs are commonly inspired by nature and can be classified into four main categories: evolutionary-based, swarm-based, physics-based, and human-based algorithms. Evolutionary-based algorithms use mechanisms inspired by biological evolution, such as recombination, crossover, mutation, and the inheritance of features in offspring19. Candidate solutions to optimization problems are represented as individuals in a population, and the quality of the solutions is determined by the fitness function. Two main evolutionary-based algorithms are differential evolution (DE)20 and the genetic algorithm (GA)21, both inspired by biological evolution. Swarm-based algorithms, in contrast, mimic the collective behavior of living creatures, which interact with each other in nature to achieve optimal group behavior22. A representative example is particle swarm optimization (PSO)23, which mimics the hunting behavior of flocks of birds and schools of fish. Physics-based algorithms are generally inspired by physical laws that generate operators for searching the solution space24,25. Some of the most common algorithms in this branch are the gravitational search algorithm (GSA)26 and electromagnetism optimization (EMO)27. Human-based algorithms are inspired by human social behavior. Common and recent algorithms in this category are teaching–learning-based optimization (TLBO)28 and the heap-based optimizer (HBO)29.
With respect to MTH in image processing, it is possible to use thresholding approaches such as the Otsu or Kapur method30 as the objective function. The problem is not only concerned with the increased number of thresholds, but is also related to the image itself; for this reason, each image is an autonomous problem with respect to the levels of thresholding used for segmentation31. The optimal segmentation threshold values must be highly accurate in most processes. Therefore, the use of MAs has expanded in this field. The moth swarm algorithm discussed in32 was used to obtain the best threshold values with the Kapur method. In addition, a modified firefly algorithm was proposed in33 for image processing, using the Kapur and Otsu methods as objective functions. In34, ant colony optimization was used for multi-threshold image segmentation with Kapur entropy and a non-local two-dimensional histogram. In35, the researchers used a novel concept called a hyper-heuristic for MTH image segmentation, in which each iteration determined the optimal execution sequence of MAs to determine the best threshold values.
In10, the black widow optimization algorithm was proposed to determine the optimal threshold using the Kapur or Otsu method as an objective function with multi-level thresholds. In36, the crow search algorithm was utilized in conjunction with the Kapur approach and multiple th values to obtain the optimal threshold. In37, the authors proposed the efficient krill herd algorithm to determine the best thresholds at various levels for color images, where the Tsallis entropy, Otsu method, and Kapur entropy were utilized as fitness functions. Harris hawks optimization (HHO) is a recent algorithm, and it was hybridized with another powerful algorithm, differential evolution (DE)38. Specifically, the entire population was split into two equal subpopulations, which were assigned to the HHO and DE algorithms, respectively; this hybridization used the Otsu and Kapur approaches as fitness functions. In39, the authors combined the classical Otsu method with an energy curve to segment colored images by multilevel thresholding. The water cycle algorithm (WCA) was integrated with the Masi entropy (Masi-WCA) and Tsallis entropy in40 to segment color images; the experimental results proved the superiority of the WCA for multilevel thresholding with Masi entropy compared to other competitive algorithms. The authors in41 used a multi-verse optimizer (MVO) algorithm based on the Energy-MCE thresholding approach to search for accurate and near-optimal thresholds for segmentation.
In the same context, Elaziz et al.42 proposed DE as a technique to select the best MAs for determining the optimal threshold with the Otsu method. Opposition-based learning (OBL) is one of the most effective methods for improving the search efficiency of metaheuristic algorithms43. A hyper-heuristic method based on a genetic algorithm was presented in44, which evaluates various MAs to determine the optimal threshold for each image using a predetermined th value with the Otsu method. In45, a new efficient version of the recent chimp optimization algorithm (ChOA), called the opposition-based Lévy flight chimp optimizer (IChOA), was proposed to overcome the weaknesses of the original ChOA. The IChOA was applied to solve the MTH problem using the Otsu and Kapur methods as objective functions; in that work, several MAs, including the SCA, MFO, SSA, and EMO, were combined with the Otsu method. As mentioned, the utilization of MAs in MTH is growing rapidly, and a summary of various approaches can be found in46.
According to the No Free Lunch theorem, there is no single ideal algorithm for every problem47. For this reason, any algorithm must be evaluated on a real problem to demonstrate its performance. MTH methods based on OBL are frequently used to solve a diversity of other optimization problems. Therefore, this paper seeks to further the research in the image segmentation field by utilizing the recent heap-based optimizer (HBO). The HBO was introduced in29 for optimization. This algorithm mimics the job responsibilities and descriptions of employees: the staff are coordinated in a hierarchy, and a nonlinear tree-shaped data structure is used to represent the heap. The benefit of this type of algorithm is that candidates with unsuitable fitness are removed from consideration, leading to improved convergence speed. Based on the advantages of the HBO and the No Free Lunch theorem, this paper aims to present an alternative version of the HBO, called the IHBO algorithm, to discover optimal solutions to complex MTH problems and overcome the weaknesses of the original HBO.
The proposed method for MTH based on the HBO is called IHBO, and it applies the Kapur and Otsu methods individually to obtain the optimal threshold from benchmark images. IHBO explores the search area determined by a histogram technique to provide the best threshold values using a set of operators inspired by the human career hierarchy. The performance of IHBO is evaluated through various tests in which benchmark images with many levels of complexity are utilized. The segmentation results are evaluated using various metrics, such as the structural similarity (SSIM) index48, feature similarity (FSIM) index49 and peak signal-to-noise ratio (PSNR)50. Furthermore, the IHBO algorithm was evaluated on the CEC’2020 test suite and compared against seven well-known metaheuristic algorithms including the basic HBO29, SSA51, MFO52, GWO53, SCA54, HS55, and EMO27. The evaluations are executed through various non-parametric and statistical techniques to determine whether the optimal solutions provided by the IHBO are superior.
The main contributions of this paper can be summarized as follows:
-
An efficient HBO variant based on OBL, called IHBO, is presented to overcome the weaknesses of the original HBO.
-
Evaluating the effectiveness of IHBO on the CEC’2020 test suite.
-
IHBO is proposed to solve the problem of the high computational cost of MTH.
-
Proving the ability of the IHBO to solve image segmentation problems using Kapur’s entropy and Otsu’s method as fitness functions.
-
Verifying the image quality using a set of metrics (FSIM, PSNR, and SSIM) to confirm that optimal solutions are obtained.
-
Evaluating the performance of the proposed method at various segmentation levels to estimate the stability of the optimizer and the quality of the segmentation.
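As a concrete illustration of one of the quality metrics listed above, PSNR can be computed directly from the pixel-wise mean squared error. The sketch below is a minimal NumPy implementation under our own naming, not the authors' code:

```python
import numpy as np

def psnr(original, segmented, peak=255.0):
    """Peak signal-to-noise ratio (in dB) between two images.

    Higher PSNR means the segmented image is closer to the original;
    identical images give an infinite PSNR.
    """
    original = np.asarray(original, dtype=float)
    segmented = np.asarray(segmented, dtype=float)
    mse = np.mean((original - segmented) ** 2)  # mean squared error
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

SSIM and FSIM follow analogous image-to-image comparisons, but on local structural and feature statistics rather than raw pixel differences.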
The remainder of this paper is organized as follows. “Preliminaries” section describes the materials and methods used in this study, while “The proposed IHBO algorithm” section presents the proposed algorithm. “Environmental and experimental requirements” section illustrates the environmental and experimental requirements, while “Experimental results and discussion” section presents the performance evaluation and experimental results. Finally, conclusions and proposals for future work are provided in “Conclusions and future works” section.
Preliminaries
This section introduces the materials required to implement the proposed segmentation method, as well as the approaches on which it is built.
Objective functions formulation
The entropy criterion of the Kapur7 approach and the between-class variance of the Otsu8 approach are widely utilized to determine the optimal threshold value th in image segmentation. Both were originally developed for bi-level thresholding but can be readily extended to solve MTH problems.
Otsu method for segmentation
The Otsu method is an automatic and non-parametric technique used to determine the optimal thresholds of an image8. This method is based on the maximum variance of the various classes as a criterion to segment the image. The intensity levels L are taken from a grayscale image, and the equation below is used to calculate the probability distribution of the intensity value:
where i is a specific intensity level in the range \(0 \le i \le L-1\) and \(n_i\) is the number of pixels with gray level i in the image. \(Ph_i\) is the probability distribution of the intensity levels, obtained by dividing \(n_i\) by the total number of pixels in the image. For the simplest segmentation (bi-level), two classes are represented as
The probability distribution for \(C_1\) and \(C_2\) are \(\omega _0(th)\) and \(\omega _1(th)\), respectively, as illustrated in (3).
It is necessary to calculate the mean levels \(\mu _0\) and \(\mu _1\) that define the classes using (4). Once these values are calculated, the Otsu based between classes \(\sigma _B^{2}\) is calculated using (5) as follows:
Moreover, \(\sigma _1\) and \(\sigma _2\) in (5) indicate the variance of regions \(C_1\) and \(C_2\), and are calculated as
where \(\mu _T=\omega _0\mu _0+\omega _1\mu _1\) and \(\omega _0+\omega _1=1\). Based on the values \(\sigma _1\) and \(\sigma _2\), (7) provides the fitness function. Subsequently, the optimization problem is reduced to determine the intensity level that maximizes (7):
where \(\sigma _B^{2}(th)\) is the Otsu method variance for a given th value. Optimization methods are used to determine the intensity level th that maximizes the fitness function according to (7). The fitness or objective function \(F_{Otsu}(th)\) can be modulated for MTH as follows:
where \(TH= [th_1,th_2,\ldots th_{n-1}]\) represents a vector including MTH, and the variance calculations are as illustrated in (9).
Here i represents a class, \(\omega _i\) is the occurrence probability, and \(\mu _i\) is the mean of a class. For MTH, these values are obtained as
and
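To make the multilevel Otsu objective above concrete, the between-class variance for an arbitrary threshold vector can be sketched in NumPy as follows. This is an illustrative implementation under our own naming, not the authors' code; it assumes a normalized histogram \(Ph_i\) and sorted thresholds:

```python
import numpy as np

def otsu_objective(hist, thresholds):
    """Between-class variance (to be maximized) for a threshold vector,
    following the multilevel Otsu criterion described above.

    hist: normalized grayscale histogram Ph_i (sums to 1), length L.
    thresholds: sorted threshold intensities th_1 < ... < th_{n-1}.
    """
    L = len(hist)
    levels = np.arange(L)
    mu_T = np.sum(levels * hist)              # global mean intensity μ_T
    bounds = [0] + list(thresholds) + [L]     # class boundaries over [0, L)
    sigma_B = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = hist[lo:hi].sum()                 # class occurrence probability ω_i
        if w <= 0:
            continue                          # empty class contributes nothing
        mu = np.sum(levels[lo:hi] * hist[lo:hi]) / w   # class mean μ_i
        sigma_B += w * (mu - mu_T) ** 2       # weighted deviation from μ_T
    return sigma_B
```

A search agent's candidate threshold vector is scored by this function, and the optimizer seeks the vector that maximizes it.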
Kapur entropy
Another non-parametric method used to determine the best threshold value of an image was proposed by Kapur7. This approach determines the best th value such that the overall entropy is maximized. For a bi-level scenario, the Kapur objective function can be determined as
where the entropies \(H_1\) and \(H_2\) are computed as follows:
In (13), \(Ph_i\) is the probability distribution of the intensity levels, which is computed by (1), and \(\omega _0(th)\) and \(\omega _1(th)\) are the probability distributions of classes \(C_1\) and \(C_2\), respectively. ln(.) represents the natural logarithm. Like the Otsu method, the entropy-based method can be modulated for MTH values. In this case, it is necessary to separate an image into n groups using a similar number of thresholds. The equation below can define the new objective function:
where \(TH=[th_1,th_2,\ldots th_{n-1}]\) is the vector including MTH. Each entropy is computed separately with its respective th values; thus, (14) is expanded for n entropies as follows:
Therefore, the values of the occurrence probabilities \((\omega _0 , \omega _1,\ldots , \omega _{n-1})\) of the n classes can be determined using (10) and the probability distribution \(Ph_i\) in (1).
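The multilevel Kapur objective above can be sketched in the same style as the Otsu example: sum the entropy of each class delimited by the threshold vector. Again, an illustrative implementation with our own naming, not the authors' code:

```python
import numpy as np

def kapur_objective(hist, thresholds):
    """Sum of per-class entropies (to be maximized), following the
    multilevel Kapur criterion described above.

    hist: normalized grayscale histogram Ph_i (sums to 1), length L.
    thresholds: sorted threshold intensities th_1 < ... < th_{n-1}.
    """
    L = len(hist)
    bounds = [0] + list(thresholds) + [L]    # class boundaries over [0, L)
    total = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = hist[lo:hi].sum()                # class probability ω
        if w <= 0:
            continue                         # empty class contributes nothing
        p = hist[lo:hi] / w                  # in-class distribution Ph_i / ω
        p = p[p > 0]                         # skip zero bins to avoid log(0)
        total += -np.sum(p * np.log(p))      # class entropy H = -Σ p ln p
    return total
```

As with the Otsu objective, the optimizer searches for the threshold vector that maximizes this sum of entropies.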
Heap-based optimizer (HBO)
The HBO mimics the job responsibilities and descriptions of the employees within a company29. Although job titles differ from one company or business to another, they are organized in a hierarchy, and this arrangement is given many names, such as the corporate hierarchy structure, organizational chart tree, or corporate rank hierarchy (CRH). The collection of methods that outlines how particular activities are directed to realize the goals of an organization, and that also defines how information flows among levels within the company56, is called an organizational structure. In this section, we explain the mathematical model of the heap-based optimizer.
Mathematical modeling of the interaction with immediate boss
The upper levels set the rules and laws for employees within a centralized structure, and subordinates follow their immediate boss. Assuming that each immediate boss is the parent node of its subordinates, we can model this behaviour by updating the location of each search agent \({{\vec {x}}_{i}}\) with reference to its parent node B using the equation below:
where t is the current iteration, and | | calculates the absolute value. \(\lambda ^k\) is the \(k^{th}\) component of vector \({\vec {\lambda }}\), and it is generated randomly as follows:
where r is a random number in the range \(\left[ 0,1 \right]\). In Eq. (16), \(\gamma\) is a design parameter computed by the following rule:
Here, t is the current iteration, T is the maximum number of iterations, and C is a user-defined parameter. Over the iterations, \(\gamma\) decreases linearly from 2 to 0 and, upon reaching 0, increases linearly back to 2, repeating this cycle.
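The oscillating behavior of \(\gamma\) and the random vector \({\vec {\lambda }}\) can be sketched as below. The helper names are ours, and the \(\gamma\) rule follows the formulation given in the original HBO paper (period T/C, quarter-period scaling T/4C):

```python
import random

def gamma(t, T, C):
    """Design parameter γ: decreases linearly from 2 to 0 and then rises
    back to 2, repeating C such cycles over T iterations."""
    return abs(2 - (t % (T / C)) / (T / (4 * C)))

def lambda_k():
    """k-th component of the random vector λ, uniform in [-1, 1]."""
    return 2 * random.random() - 1
```

For example, with T = 100 and C = 4 each cycle lasts 25 iterations: γ starts at 2, reaches 0 at the middle of the cycle, and returns to 2 at the cycle's end.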
Modeling the interaction between colleagues mathematically
Employees holding the same position are considered colleagues, and each employee interacts with the others to achieve the goals of the organization. Assuming that nodes at the same level in the heap are colleagues, each search agent \({{\vec {x}}_{i}}\) updates its position based on the position of a randomly selected colleague \({{\vec S}_r}\) as follows:
where f is the objective function that calculates the fitness of each search agent. Equation (19) enables the search agents to explore the search space around \(S_r^k\) if \(f({{\vec S}_r}) < f({{\vec x}_i}(t))\) and around \(x_i^k\) otherwise.
Self contribution of an employee
This stage captures the concept of an employee's self contribution. The behavior is modeled by retaining the prior position of the employee in the next iteration, as illustrated in the equation below:
In Eq. (20), the search agent \({{\vec x}_i}\) does not change the value of its kth design parameter in the next iteration. This behavior is used to regulate the rate of change of each search agent in the population.
Putting it all together
This phase explains how the position-updating equations modelled in the previous subsections are combined into a single equation. Three selection probabilities determine which equation is used to update the position of a search agent; they switch the algorithm between the exploration and exploitation phases. These probabilities are divided into three proportions \(p_1\), \(p_2\), and \(p_3\). The search agent updates its location using Eq. (20) according to the proportion \(p_1\), which is computed by the equation below.
Here, t is the current iteration and T is the maximum number of iterations. The search agent updates its location using Eq. (16) according to the selection of proportion \(p_2\), computed by the equation below.
Finally, the search agent updates its location using Eq. (19) according to the selection of \(p_3\), computed by the equation below.
A general position updating mechanism of HBO is computed as follows:
where \(p_1\), \(p_2\) and \(p_3\) partition the range [0, 1], and a random number drawn from this range selects the update rule. This subsection clarifies that Eq. (20) improves the exploration phase, Eq. (16) improves the exploitation phase and convergence, and Eq. (19) allows a search agent to move from the exploration phase to the exploitation phase. According to these observations, \(p_1\) is higher initially and decreases linearly over the iterations, which reduces exploration and improves exploitation as the search progresses. After calculating \(p_1\), the remainder of the span is split into two equal portions, which makes attraction towards a colleague and towards the boss equally probable.
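The combined update can be sketched per dimension as below. This is an illustrative helper of our own (name, signature, and structure are assumptions), combining Eqs. (16)–(22) and assuming a minimization objective; a maximization objective such as Otsu's or Kapur's can simply be negated:

```python
import random

def update_position(x, boss, colleague, f, t, T, C):
    """One HBO position update in the spirit of Eq. (24).

    x, boss, colleague: position vectors; f: objective (minimized);
    t: current iteration; T: max iterations; C: cycle parameter.
    """
    p1 = 1 - t / T                              # Eq. (21): keep-own-position share
    p2 = p1 + (1 - p1) / 2                      # Eq. (22): follow-the-boss share
    g = abs(2 - (t % (T / C)) / (T / (4 * C)))  # oscillating parameter γ
    new_x = list(x)
    for k in range(len(x)):
        p = random.random()
        lam = 2 * random.random() - 1           # λ^k ∈ [-1, 1]
        if p < p1:
            continue                            # Eq. (20): retain k-th component
        elif p < p2:                            # Eq. (16): move toward the boss B
            new_x[k] = boss[k] + g * lam * abs(boss[k] - x[k])
        elif f(colleague) < f(x):               # Eq. (19): follow a fitter colleague
            new_x[k] = colleague[k] + g * lam * abs(colleague[k] - x[k])
        else:                                   # Eq. (19): search around itself
            new_x[k] = x[k] + g * lam * abs(colleague[k] - x[k])
    return new_x
```

At t = 0, \(p_1 = 1\), so every component is retained (pure exploration of the current population); as t grows, more components are pulled toward the boss or a colleague.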
Steps of HBO
This section summarizes the HBO steps and clarifies details about their implementation-related calculations.
-
Parameter initialization and definition: First, all the search agents are randomly initialized in the potential solution space. The minimum and maximum boundaries of the search space are defined by the variables \((L_i,\ U_i)\), respectively. The population size is N and the maximum number of iterations is T. The specific parameter C is calculated from \(C=\left\lfloor T/25 \right\rfloor\).
-
Population initialization: The random population P is generated from N search agents, each consisting of D dimensions. The population’s representation P is shown as follows:
$$\begin{aligned} p = \left[ {\begin{array}{*{20}{c}} {\vec x_1^T}\\ {\vec x_2^T}\\ \vdots \\ {\vec x_N^T} \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} {x_1^1}&{}\quad {x_1^2}&{}\quad {x_1^3}&{}\quad \cdots &{}\quad {x_1^D}\\ {x_2^1}&{}\quad {x_2^2}&{}\quad {x_2^3}&{}\quad \cdots &{}\quad {x_2^D}\\ \vdots &{}\quad {}&{}\quad {}&{}\quad \ddots &{}\quad \vdots \\ {x_N^1}&{}\quad {x_N^2}&{}\quad {x_N^3}&{}\quad \cdots &{}\quad {x_N^D} \end{array}} \right] \end{aligned}$$ -
Heap building: We utilize a \(3-ary\) heap to implement the CRH. Although a heap is a tree-shaped data structure, it can be implemented using an array. The operations below are the \(d-ary\) heap operations required for the HBO implementation.
-
1.
parent (i): Assuming that the heap is stored as an array, this method receives a node's index and returns its parent's index. The parent's index for a node i is calculated by the equation below:
$$\begin{aligned} parent(i) = \left\lfloor {\frac{{i + 1}}{D}} \right\rfloor \end{aligned}$$(25)where \(\lfloor \rfloor\) indicates the floor function, which retrieves the highest integer less than or equal to a given input.
-
2.
child (i; k): A node can have a maximum of 3 children in a \(3-ary\) heap; in other words, a manager may not manage more than 3 direct subordinates. This function returns the index of the kth child of a node i, as formulated in the equation below.
$$\begin{aligned} child(i,k) = D \times i - D + k + 1 \end{aligned}$$(26)For example, the index of the 3rd child of the 3rd node is calculated as:
$$\begin{aligned} child(3,3) = 3 \times 3 - 3 + 3 + 1 = 10 \end{aligned}$$ -
3.
depth (i): Assuming the root node's depth equals 0, we can calculate the depth of any node i in constant time through the formula below:
$$\begin{aligned} depth(i) = \left\lceil {\log _{D} (D \times i - i + 1)} \right\rceil - 1 \end{aligned}$$(27)The ceil function is \(\lceil \rceil\), which retrieves the smallest integer greater than or equal to the input. For example, the depth of the node indexed 27 in the heap is calculated as: \(depth(27) = \left\lceil {\log _{3} (81 - 27 + 1)} \right\rceil - 1 = \left\lceil {3.6476} \right\rceil - 1 = 3\)
-
4.
colleague (i): Assuming that the nodes at the same level as node i are its colleagues, this step returns the index of a randomly elected colleague of node i, generated as a random integer in the range \(\left[ \frac{D^{depth(i)}-1}{D-1}+1,\ \frac{D^{depth(i)+1}-1}{D-1}\right]\).
-
5.
Heapify_Up (i): This operation searches upward in the heap and inserts node i at its correct place to preserve the heap property. Algorithm 1 shows the pseudo-code of this operation.
Finally, the algorithm to build the heap is shown in Algorithm 2.
-
6.
Repeated application of the position-updating mechanism: the search agents' positions are repeatedly updated according to the previously explained equations, aiming to converge on the global optimum. The pseudo-code of HBO is shown in Algorithm 3.
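The heap-indexing operations above can be written out compactly; the sketch below is an illustrative implementation (not the authors' code), with D denoting the heap degree (3-ary here) and the depth computed by an integer loop equivalent to Eq. (27):

```python
D = 3  # heap degree: each boss manages at most three direct subordinates

def parent(i):
    """Parent index of node i in the 1-indexed array heap (Eq. (25))."""
    return (i + 1) // D

def child(i, k):
    """Index of the k-th child of node i (Eq. (26))."""
    return D * i - D + k + 1

def depth(i):
    """Depth of node i, with the root at depth 0 (equivalent to Eq. (27)).

    Advances level by level until the last index of the level reaches i,
    using exact integer arithmetic instead of floating-point logs."""
    d = 0
    while (D ** (d + 1) - 1) // (D - 1) < i:
        d += 1
    return d

def colleague_range(i):
    """Inclusive index range of the nodes on the same level as node i,
    from which a random colleague is drawn."""
    d = depth(i)
    first = (D ** d - 1) // (D - 1) + 1
    last = (D ** (d + 1) - 1) // (D - 1)
    return first, last
```

For instance, node 27 sits at depth 3, and its colleagues occupy indices 14 through 40 of the heap array.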
Opposition-based learning (OBL)
Opposition-based learning (OBL) is an effective search strategy for avoiding stagnation in candidate solutions. OBL is a concept inspired by the opposite relationship between entities57. The concept of opposition was first presented in 2005 and has attracted many research efforts in the last decade. Many metaheuristic algorithms use the concept of OBL to improve their performance, such as the harmony search algorithm58, grasshopper optimization59, ant colony optimization60, and the artificial bee colony61. OBL improves the exploration capability of a search mechanism. In most metaheuristic algorithms, convergence occurs quickly when the initial solutions are close to the optimal location; otherwise, late convergence is expected. The OBL method therefore produces new solutions by considering opposite regions of the search space, which may prove to be nearer to the best solution. OBL regards not only the candidate solutions obtained by a stochastic iteration scheme, but also their 'opposite solutions' located in opposite parts of the search space; such candidate–opposite pairs give shorter expected distances to the best solution than randomly sampled solution pairs62. The OBL method has been hybridized with many bio-inspired optimizers, such as the cuckoo optimization algorithm63, shuffled complex evolution algorithm64, particle swarm optimization65, harmony search66, chaotic differential evolution algorithm67, and shuffled frog algorithm68. In optimization problems, the strategy of simultaneously examining a candidate and its opposite solution serves to accelerate the convergence rate towards a globally best solution. Following previous related works, we utilize OBL only in the initialization phase to improve the convergence of HBO and prevent it from becoming stuck in local optima; the resulting IHBO is then utilized to solve the multi-thresholding problem for image segmentation using two objective functions, Kapur and Otsu.
The proposed IHBO algorithm
In this paper, the HBO algorithm is enhanced with OBL as a search strategy, yielding the IHBO, to avoid the drawbacks of a purely random population and to improve the convergence rate of the algorithm by increasing the diversity of its solutions. IHBO uses the OBL strategy in the initialization phase to improve the search process as follows:
where \(Q_i\) is the opposite solution vector resulting from the use of OBL, and \(UB_j\) and \(LB_j\) are the upper and lower bounds of the \(j^{th}\) component of a vector X. The phases of the proposed image thresholding model are described in depth below.
Initialization phase
In this phase, the algorithm starts by reading the image, converting it to grayscale, computing the histogram of the selected benchmark images, and then computing the probability distribution by (1). The algorithm initializes the IHBO parameters, namely the population size (N), maximum iteration number (T), boundaries of the search space (\(L_{I}\), \(U_{I}\)), and number of iterations per cycle (t). Thereafter, the OBL strategy is utilized to calculate the opposite solution vectors \(Q_i\) by (28).
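The opposition-based initialization can be sketched as follows: generate a random population, form its opposite via Eq. (28), and retain the fitter individuals. This is an illustrative sketch with our own names, assuming a maximization objective (as with the Otsu and Kapur fitness functions):

```python
import numpy as np

def obl_initialize(N, lb, ub, fitness, rng=None):
    """Opposition-based initialization in the spirit of Eq. (28).

    Generates a random population X in [lb, ub], its opposite
    Q = UB + LB - X, and keeps the N fittest individuals of the union.
    """
    rng = np.random.default_rng(rng)
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    X = lb + rng.random((N, lb.size)) * (ub - lb)   # random population
    Q = ub + lb - X                                  # opposite population, Eq. (28)
    union = np.vstack([X, Q])
    scores = np.array([fitness(s) for s in union])
    keep = np.argsort(scores)[::-1][:N]              # indices of the N best
    return union[keep]
```

Because each candidate is paired with its mirror image across the midpoint of the bounds, at least one member of every pair lies on the better side of the search space, which is what accelerates early convergence.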
Updating phase
This phase provides the best threshold values by evaluating the fitness of the \(X_i\) and \(Q_i\) populations. To update the search agents' positions (X), we use the fitness value of the optimal threshold from the Otsu method \(F_{Otsu}\) (8) or the Kapur method \(F_{kapur}\) (14) as the objective function, compare the fitness values of \(X_i\) and \(Q_i\), and save the global best solution with the highest fitness. The position of each agent is defined based on its fitness value. In addition, we determine the three selection probabilities \(P_{1}\), \(P_{2}\), and \(P_{3}\) using (21), (22), and (23) sequentially and, based on these probabilities, calculate the position of each agent within the heap using (24). The agent's position (X) is updated using the important \(D-ary\) heap operations, such as Heapify_Up(i), which searches for the superior node in the heap and inserts the node at its correct position to preserve the heap properties, as demonstrated in Algorithm 1. Then, each agent repeatedly updates its location according to the best fitness value, seeking the global optimum, as depicted in Algorithm 3. Optimization scenarios for implementing the proposed IHBO algorithm are illustrated in Figure 1.
Segmentation phase
In this phase, we generate the segmented image with the optimal threshold value after setting \(x_{heap}[1].value\) as the threshold value of the image. The pseudo-code of the proposed IHBO algorithm is illustrated in Algorithm 4.
Computational complexity of the IHBO
This section discusses the computational complexity of the IHBO algorithm. The complexity of the population initialization is \({\mathcal {O}}(N \times D)\), where D and N indicate the dimension of the problem and the size of the population, respectively. Additionally, evaluating the fitness of every search agent over all iterations costs \({\mathcal {O}}\left( N \times D \times T_{\max } \right)\), where \(T_{\max }\) is the maximum number of iterations. Besides, the IHBO requires \({\mathcal {O}}(t)\) time for executing t of its main operations. Therefore, the time complexity of the proposed IHBO is \({\mathcal {O}}\left( N \times D \times t \times T_{\max }\right)\). The total amount of space occupied by the algorithm is its space complexity, which for the proposed IHBO is \({\mathcal {O}}(N \times D)\).
Performance evaluation of the proposed IHBO algorithm
Parameter settings
This section provides the evaluation of the proposed IHBO algorithm. Adjusting parameters certainly affects the performance of an algorithm. However, following the suggestion of Arcuri et al.69, when comparing algorithm performance, all algorithm parameters should be kept at the default values from their original papers to ensure they are in a relatively optimal state. Moreover, keeping each algorithm at its default values reduces the risk of bias from better parametrization. Therefore, in this work, all algorithm parameters are kept at their default values.
Thus, the performance of the proposed IHBO algorithm is evaluated on the IEEE Congress on Evolutionary Computation (CEC’2020)70 test problems. The CEC’2020 benchmark contains 10 test functions referred to as \(f_1\)–\(f_{10}\): function 1 is unimodal, functions 2–4 are multimodal, functions 5–7 are hybrid functions, and functions 8–10 are composition functions. Table 1 illustrates the parameter settings and mathematical formulations of the CEC’2020 benchmark functions; ’Fi*’ refers to the global best value. Figure 2 illustrates a 2D visualization of the CEC’2020 benchmark functions to convey the differences and the nature of each problem.
Statistical results analysis of CEC’2020 benchmark
This section uses the CEC’2020 benchmark tests to evaluate the performance of the proposed IHBO with both qualitative and quantitative metrics. The standard deviation (STD) and mean of the optimal solutions obtained by the proposed algorithm and all competing algorithms are calculated. Furthermore, qualitative metrics consisting of the average fitness history, convergence curve, and search history are used to evaluate the performance of the IHBO on the CEC’2020 test suite against seven well-known metaheuristic algorithms: the original HBO algorithm, SSA, MFO, GWO, SCA, HS, and EMO. Table 2 reports the STD and mean of the optimal value obtained by the proposed algorithm and the other competing algorithms for each CEC’2020 benchmark function with 20 dimensions; the best results are the minimum STD and mean values. The mean and STD results show that the proposed algorithm is superior to the other competing algorithms on seven of the CEC’2020 benchmark functions.
Boxplot analysis
Boxplot analysis is a graphical technique used to display the characteristics of a data distribution. It is designed to report data that follow a normal distribution and have homogeneous variances; the boxplots for all algorithms and functions are illustrated in Fig. 3. A boxplot describes a data distribution in terms of its quartiles: the first quartile is the median of the first half of the data, the second quartile is the median, and the third quartile is the median of the second half of the data. The region between the first and third quartiles is called the interquartile range and indicates the spread of the data. The ends of the rectangles mark the lower and upper quartiles, and a narrow boxplot indicates strong agreement among the data. Figure 3 shows the boxplots of the proposed IHBO algorithm and the competing algorithms on the ten functions for Dim = 20. The results show that the proposed algorithm is superior to all other competing algorithms on most of the test functions, although its performance is limited on F2 and F7.
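The quartile decomposition described above can be reproduced directly with NumPy; the fitness values below are synthetic and purely illustrative, not the paper's data:

```python
import numpy as np

# Synthetic final fitness values from 20 hypothetical independent runs
fitness = np.array([3.1, 2.9, 3.4, 3.0, 2.8, 3.2, 3.3, 2.7, 3.5, 3.0,
                    2.9, 3.1, 3.2, 3.0, 2.8, 3.3, 3.1, 2.9, 3.4, 3.0])

# First quartile, median, and third quartile of the sample
q1, q2, q3 = np.percentile(fitness, [25, 50, 75])

# Interquartile range: spread of the middle 50% of the runs.
# A narrow IQR (a short box in the plot) means repeated runs agree closely.
iqr = q3 - q1

print(f"Q1={q1:.2f}, median={q2:.2f}, Q3={q3:.2f}, IQR={iqr:.2f}")
```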
Convergence curves analysis
This subsection examines the convergence behavior of the proposed algorithm against the competitor algorithms. Figure 4 illustrates the convergence plots of IHBO, HBO, SSA, GWO, MFO, HS, and SCA on the CEC’2020 benchmark functions. The proposed algorithm achieved optimal solutions and reached a stable point for most functions; thus, IHBO can address problems that require fast computation, such as online optimization problems. The proposed algorithm exhibited stable behavior, and its solutions converged readily on most of the problems it was tested on. Additional convergence plots are omitted due to space limitations.
Qualitative metrics analysis
Although the preceding analyses confirm the high performance of the proposed IHBO algorithm, further experiments and analyses help draw clearer conclusions about its behavior in real problem solving. Figure 5 presents the qualitative analysis of the proposed IHBO algorithm. The first column shows a set of the CEC’2020 benchmark functions plotted in two-dimensional space. The second column shows the search history of the search agents from the first to the last iteration, displaying their exploitation behavior. The third column shows the average fitness history over 350 iterations, describing the overall behavior of the agents and their role in the search for the best solution. All the average fitness curves are decreasing, which means that the population improves at each iteration. The fourth column shows the convergence curve, whose optimization history reveals the progress of the fitness value over the iterations; the decreasing history indicates that the solutions are refined during the iterations until the best solution is reached.
Environmental and experimental requirements
This section presents the test images used for the experiments, then describes the empirical setup, and analyzes the results.
Benchmark images
To evaluate the proposed method, ten common benchmark images were used. These images were selected for their varying levels of complexity: Baboon, Lena, Butterfly, Pirate, Cameraman, Peppers, Bridge, Living Room, Barbara, and Jetplane71,72. Most images had the same dimensions (512 \(\times\) 512 pixels); however, two images (Cameraman and Lena) were 256 \(\times\) 256 pixels. Table 3 displays the set of test images used.
Environmental setup
In this study, the proposed IHBO is compared with seven well-known metaheuristic algorithms: the original HBO, SSA, MFO, GWO, SCA, HS, and EMO. All competitor algorithms were implemented and executed in MATLAB 2015 on a PC with 6 GB of RAM, running a Windows 8.1 64-bit environment with an Intel Core i5 processor. Each algorithm was executed 30 times per test image, the number of iterations was set to 350, and the population size was 50. The parameter settings of each algorithm were determined according to standard criteria and information found in previous literature (default values). The numbers of thresholds used were \(th_2,th_3,th_4,\) and \(th_5\), following the related literature73. The parameter settings of the IHBO and their values are presented in Table 4.
Evaluation metrics
Two kinds of metrics were utilized to evaluate the performance of the IHBO algorithm: the first evaluates the quality of the segmented image, while the second compares the edges of the segmented image. These metrics are important for evaluating the performance of the IHBO approach based on the Otsu and Kapur methods as objective functions. Statistical measures, such as the standard deviation (STD), mean, and the Wilcoxon rank test, were used to analyze the fitness of the proposed algorithm. We used the SSIM48, FSIM74, and PSNR75 to measure the quality and stability of the segmentation.
Structural similarity index (SSIM)
The SSIM48 index is a metric used to analyze the internal structures in a segmented image. A higher SSIM value indicates better segmentation of the original image, as the structures of the two images match more closely. The SSIM is defined as follows:

\(SSIM(I,Seg) = \frac{(2\mu _I \mu _{Seg} + c_1)(2\sigma _{I,Seg} + c_2)}{(\mu _I^2 + \mu _{Seg}^2 + c_1)(\sigma _I^2 + \sigma _{Seg}^2 + c_2)}\)
The mean of the intensities of the original image I and segmented image Seg are \(\mu _I\) and \(\mu _{Seg}\), respectively, and \(\sigma _I\) and \(\sigma _{Seg}\) are the standard deviations of the original image I and segmented image Seg, respectively. \(\sigma _{I,Seg}\) is the covariance of the original image I and segmented image Seg, and \(c_1\) and \(c_2\) are two constants.
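A minimal NumPy sketch of the SSIM, computed globally over the whole image rather than with the sliding window of the usual implementation; the constants \(c_1=(0.01 \cdot 255)^2\) and \(c_2=(0.03 \cdot 255)^2\) are the standard defaults from Wang et al. and are assumed here for illustration:

```python
import numpy as np

def ssim_global(img, seg, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global (single-window) SSIM between an original and a segmented image."""
    img = img.astype(np.float64)
    seg = seg.astype(np.float64)
    mu_i, mu_s = img.mean(), seg.mean()          # mean intensities
    var_i, var_s = img.var(), seg.var()          # variances (sigma^2)
    cov = ((img - mu_i) * (seg - mu_s)).mean()   # covariance sigma_{I,Seg}
    return ((2 * mu_i * mu_s + c1) * (2 * cov + c2)) / (
        (mu_i ** 2 + mu_s ** 2 + c1) * (var_i + var_s + c2))

# An image compared with itself yields the maximum SSIM of 1
a = np.arange(64, dtype=np.float64).reshape(8, 8)
print(round(ssim_global(a, a), 6))  # 1.0
```

Production implementations compute this per local window and average; the single-window version above is only meant to make the formula concrete.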
Feature similarity index (FSIM)
The FSIM74 index is a metric used to compute the similarity between the segmented image and the original image based on their internal features. A higher FSIM value indicates better segmentation by the thresholding method. The FSIM over the entire image domain \(\omega\) is defined as:

\(FSIM = \frac{\sum _{v \in \omega } S_L(v) \, PC_m(v)}{\sum _{v \in \omega } PC_m(v)}\)

where \(S_L(v)\) is the combined similarity of the phase congruency and gradient features at pixel v, and G is the image’s gradient magnitude, computed as follows:

\(G = \sqrt{G_x^2 + G_y^2}\)

The phase congruency is computed as:

\(PC(v) = \frac{E(v)}{\epsilon + \sum _n A_n(v)}\)

where E(v) is the magnitude of the vector v at scale n, \(A_n(v)\) is the local amplitude at scale n, \(\epsilon\) is a small positive number, and \(PC_m(v) = \max (PC_1(v), PC_2(v))\).
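The gradient-magnitude step of the FSIM is easy to sketch. The original FSIM paper uses Scharr operators to obtain \(G_x\) and \(G_y\); central finite differences stand in for them here, purely as an illustrative assumption:

```python
import numpy as np

def gradient_magnitude(img):
    """Per-pixel gradient magnitude G = sqrt(Gx^2 + Gy^2) via central differences."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)  # derivatives along rows (y) and columns (x)
    return np.sqrt(gx ** 2 + gy ** 2)

# On a horizontal intensity ramp the gradient magnitude is 1 everywhere
ramp = np.tile(np.arange(8, dtype=np.float64), (8, 1))
G = gradient_magnitude(ramp)
print(G.shape)  # (8, 8)
```

The phase-congruency term \(PC(v)\) requires a bank of log-Gabor filters and is omitted from this sketch.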
Peak signal-to-noise ratio (PSNR)
The PSNR75 is another metric used to evaluate the quality of segmentation by measuring the difference between the original image and the segmented image. The PSNR compares the original and segmented images through the root mean square error (RMSE) of their pixels, as expressed in (37). The PSNR is defined as follows:

\(PSNR = 20 \log _{10}\left( \frac{255}{RMSE}\right)\)

where

\(RMSE = \sqrt{\frac{\sum _{i=1}^{M}\sum _{j=1}^{N}\left( I(i,j) - Seg(i,j)\right) ^2}{M \times N}}\)

In (37), I and Seg are the original and segmented images of size \(M \times N\), respectively. A higher PSNR value indicates higher similarity between the segmented and original images, which reflects a more effective segmentation process.
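A direct NumPy translation of the PSNR and RMSE formulas, assuming 8-bit images so that the peak intensity is 255:

```python
import numpy as np

def psnr(original, segmented, peak=255.0):
    """Peak signal-to-noise ratio between an original and a segmented image."""
    diff = original.astype(np.float64) - segmented.astype(np.float64)
    rmse = np.sqrt(np.mean(diff ** 2))  # root mean square error per pixel
    if rmse == 0:
        return float("inf")  # identical images: no noise at all
    return 20.0 * np.log10(peak / rmse)

a = np.full((4, 4), 100.0)
b = np.full((4, 4), 110.0)  # constant error of 10 gray levels -> RMSE = 10
print(round(psnr(a, b), 2))  # 28.13
```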
Experimental results and discussion
The experimental results are discussed in this section to evaluate the efficiency of the proposed algorithm.
Otsu results analysis
This subsection analyzes the results of the IHBO based on the between-class variance proposed by Otsu as the fitness function. Table 7 lists the best threshold values obtained by applying the IHBO with Otsu’s between-class variance as the objective function (8). Tables 5 and 6 present a graphical analysis of the thresholds, illustrating the resulting images of the IHBO with different numbers of thresholds. Table 8 shows the computational times of the compared algorithms obtained with Otsu’s method. The IHBO proved its superiority in computational time over the other competitive algorithms, coming in first place in 23 of 40 experiments. GWO came in second place with 10 experiments, while HBO came in third place with nine experiments, followed by EMO with two experiments. Finally, the MFO came in fifth place with only one experiment, and the remaining algorithms could not obtain the best computational time in any of the experiments. Table 9 lists the STD and average of the Otsu fitness results for the benchmark images. The IHBO demonstrated superiority in MTH by obtaining the optimal fitness values in 23 of 40 experiments. The HBO algorithm obtained the best fitness value in eight experiments, while the SCA obtained the optimal fitness value in five experiments, and the SSA came in fourth place with four experiments, followed by MFO with three experiments. Finally, HS obtained the optimal fitness value in only one experiment, and the remaining algorithms could not obtain the optimal fitness value in any of the experiments. Table 9 also lists the STD values calculated over the 40 independent outcomes for each tested image with various thresholds; a lower STD value indicates that the algorithm is more stable.
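As a hedged sketch of the objective maximized in this subsection, the multilevel between-class variance can be computed from a normalized gray-level histogram for a given set of thresholds; all names below are illustrative, not the authors' implementation:

```python
import numpy as np

def otsu_between_class_variance(hist, thresholds):
    """Between-class variance for multilevel thresholding.

    hist: normalized 256-bin gray-level histogram (probabilities summing to 1).
    thresholds: threshold values splitting [0, 255] into classes.
    """
    levels = np.arange(len(hist))
    mu_total = np.sum(levels * hist)  # global mean intensity
    bounds = [0] + sorted(thresholds) + [len(hist)]
    variance = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        p = hist[lo:hi]
        omega = p.sum()  # class probability
        if omega > 0:
            mu = np.sum(levels[lo:hi] * p) / omega  # class mean
            variance += omega * (mu - mu_total) ** 2
    return variance

# Toy bimodal histogram: a threshold between the two modes scores higher
# than one that leaves both modes in the same class.
hist = np.zeros(256)
hist[40] = 0.5
hist[200] = 0.5
print(otsu_between_class_variance(hist, [120]) >
      otsu_between_class_variance(hist, [30]))  # True
```

A metaheuristic such as the IHBO searches the threshold space for the set that maximizes this value.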
Table 10 presents the STD and mean PSNR for the benchmark images using the eight MAs. The IHBO was in first place in terms of the mean PSNR in 22 experiments. The SSA was in second place with seven experiments, while HBO was in third place, being superior in only six experiments. In fourth place was SCA with the best PSNR in only five experiments, followed by MFO and HS with three experiments each. Finally, the worst results were obtained by EMO, which did not obtain the optimal PSNR values in any of the experiments. With respect to the STD, the IHBO was not the best alternative for lower dimensions (2 or 3 thresholds), because its STD values were higher, representing greater instability of the algorithm. However, MFO was the most unstable algorithm in terms of the PSNR. For the remaining approaches, the STD values followed the same tendency: lower for small dimensions and higher for four thresholds. The SSA was the most stable algorithm, while the SCA was in second place. HBO was in third place, HS was in fourth place, and the IHBO was in fifth place. Furthermore, GWO was in sixth place, and EMO was in seventh place.
Table 11 lists the STD and mean of the FSIM obtained from the 40 experiments. The FSIM results indicate that the IHBO obtained the highest FSIM in 22 experiments and was in first place, while the HBO was in second place with ten experiments. The SCA was in third place with eight experiments, and the SSA was in fourth place with two experiments, followed by HS and EMO in fifth place with only one experiment each. Finally, GWO and MFO obtained no best FSIM values in the experiments. The SCA was the best approach in terms of the STD because its values were the lowest in most experiments. The SSA came in second place, followed by EMO in third place and the IHBO in fourth place. GWO was in fifth place, followed by HBO. Finally, the least stable approaches were MFO and HS due to their high STD values in most cases.
Table 12 presents the STD and mean SSIM obtained in the 40 experiments. The IHBO came in first place in terms of the mean SSIM with the best values in 22 experiments, while the SCA, HBO, and SSA came in second place with six experiments each. EMO came in third place with two experiments, followed by HS and MFO, which came in fourth place with only one experiment each. Finally, GWO came in last place with no best SSIM values in the experiments. SCA was the best method in terms of the STD because it provided the largest number of minimum STD values of all algorithms. In second place was the HBO, followed by EMO in third place. The IHBO was in fourth place, while GWO was in fifth. Finally, MFO, HS, and SSA obtained no minimum STD values in the experiments.
Table 7 lists the thresholds applied to the selected benchmark images. Tables 5 and 6 illustrate the histograms with their respective threshold values and the segmented images obtained using 2, 3, 4, and 5 thresholds. These results indicate that, for some images, the quality of the contrast improved as the number of thresholds increased, particularly for Butterfly, Living Room, Jetplane, Lena, Pirate, Cameraman, Lake, and Bridge: an image segmented with the largest number of thresholds presents more information than one segmented with only two thresholds. The most difficult histograms to segment were those of Tests 6, 9, and 10, corresponding to Bridge, Butterfly, and Barbara, respectively. The difficulty is due to the different numbers of pixels in these images, which can produce several classes or even make it impossible to select the optimal thresholds.
Table 13 presents the p-values resulting from the Wilcoxon test for fitness using the Otsu fitness function. This table presents the difference between the proposed algorithm and the compared algorithms (HBO, SSA, MFO, GWO, SCA, HS, and EMO).
A difference between the SCA and MFO in comparison to the IHBO can be observed, which indicates that the proposed algorithm offers a significant improvement. Moreover, for a number of thresholds (nTh) = 5, the differences between the IHBO and most of the competing algorithms become clear when performing the comparison over 30 runs in each experiment. In the results, NaN indicates that the datasets being compared are identical; that is, the algorithms obtained the same solution, so the Wilcoxon test reports them as similar, with no differences between the methods.
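A hedged sketch of such a pairwise comparison with SciPy, using synthetic fitness samples rather than the paper's data; note that `scipy.stats.wilcoxon` raises an error when the paired samples are identical, which corresponds to the NaN entries reported in the table:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
# Final fitness values of two algorithms over 30 paired runs (synthetic):
# the second algorithm is consistently worse by about 0.2
ihbo_fitness = rng.normal(loc=3.0, scale=0.1, size=30)
other_fitness = ihbo_fitness + rng.normal(loc=0.2, scale=0.05, size=30)

stat, p_value = wilcoxon(ihbo_fitness, other_fitness)
print(p_value < 0.05)  # a small p-value indicates a significant difference

try:
    wilcoxon(ihbo_fitness, ihbo_fitness)  # identical samples: all differences zero
except ValueError:
    print("identical samples: no test possible (reported as NaN)")
```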
Kapur results analysis
The best results, obtained by the IHBO using the Kapur entropy as the fitness function (14), are presented in Table 16. Tables 14 and 15 present the histogram distributions of the benchmark images and the segmented images with different numbers of thresholds produced by the IHBO. The results in Table 18 show that the proposed algorithm with the Kapur entropy method outperformed the other algorithms in terms of the SSIM (Table 18); in addition, it outperformed the other algorithms in terms of the mean FSIM (Table 20), PSNR (Table 19), and mean fitness.
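A hedged sketch of the Kapur entropy objective maximized in this subsection, again computed from a normalized gray-level histogram; the names are illustrative, not the authors' implementation:

```python
import numpy as np

def kapur_entropy(hist, thresholds):
    """Sum of the Shannon entropies of the classes induced by the thresholds.

    hist: normalized gray-level histogram (probabilities summing to 1).
    """
    bounds = [0] + sorted(thresholds) + [len(hist)]
    total = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        p = hist[lo:hi]
        omega = p.sum()  # probability of the class
        if omega > 0:
            q = p[p > 0] / omega             # within-class distribution
            total += -np.sum(q * np.log(q))  # entropy of the class
    return total

# Uniform histogram split in half: each class is uniform over 128 levels,
# so the total entropy is 2 * ln(128)
hist = np.full(256, 1.0 / 256)
print(round(kapur_entropy(hist, [128]), 4))  # 9.7041
```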
The computational times of the compared algorithms obtained with Kapur’s method are presented in Table 17. The IHBO came in first place in 24 of 40 experiments, proving its superiority in computational time over the other competitive algorithms. HBO came in second place with 13 experiments, while GWO came in third place with ten experiments, followed by SSA with four experiments. Finally, the SCA came in fifth place with only one experiment, and the remaining algorithms could not obtain the best computational time in any of the experiments.
Table 18 presents the STD and average fitness results of the Kapur method on the benchmark images. The IHBO was in first place, obtaining the optimal fitness values in 24 of 40 experiments. The SCA was in second place with seven experiments, while the HBO was in third place with five experiments. SSA was in fourth place with three experiments, and HS was in fifth place with two experiments, followed by GWO in sixth place with one experiment. Finally, EMO and MFO could not produce optimal fitness values. Table 18 also presents the STD values to demonstrate the stability of the algorithms according to the repetition of the values.
Table 19 lists the STD and mean PSNR. The IHBO came in first place in 23 experiments with optimal PSNR values, while the SCA came in second place with ten experiments. HBO came in third place with four experiments, while HS and GWO came in fourth place with two experiments each. Finally, SSA, MFO, and EMO did not obtain optimal PSNR values in any experiment. According to the STD values, EMO came in first place with the maximum number of minimum STD cases, followed by the IHBO in second place, HS in third place, MFO and the HBO in fourth place, SSA in fifth place, and SCA in sixth place. GWO had no optimal STD values.
Table 20 provides the STD and mean FSIM. The IHBO was in first place in 17 experiments, while SCA was in second place with nine experiments. HS came in third place with six experiments, while MFO, HBO, and GWO came in fourth place with three experiments each, followed by SSA in fifth place with one experiment. Finally, EMO did not obtain the best FSIM in any experiment. In terms of the STD, MFO came in first place with the maximum number of minimum STD cases, followed by SSA in second place, EMO and the IHBO in third place, HS in fourth place, HBO in fifth place, and SCA in sixth place. GWO had no optimal STD values.
The mean and STD of the SSIM are presented in Table 21. The results indicate that the IHBO was in first place in 19 experiments in terms of the SSIM, followed by SCA in second place with seven experiments. GWO and MFO were in third place with six experiments, while HBO and HS were in fourth place with three experiments, followed by SSA in fifth place with only one experiment. Lastly, EMO had no optimal experiments in terms of the SSIM. According to the STD values, MFO came in first place with the maximum number of minimum STD cases. GWO, EMO, and SCA were in second place, while the IHBO was in third place. SSA and HBO were in fourth place, while HS was in fifth place.
Finally, Table 22 presents the p-values resulting from the Wilcoxon test for fitness using the Kapur fitness function. This table presents the differences between the proposed algorithm and the compared algorithms (HBO, SSA, MFO, GWO, SCA, HS, and EMO). The results in Table 22 indicate that the IHBO was different from the SCA and EMO but similar to the remaining algorithms. The exceptions occurred for nTh = 5, where in some cases the values exhibited differences as well as similarities (NaN values).
Human participants or animals
This article does not contain any studies with human participants or animals performed by any of the authors.
Conclusions and future works
Image segmentation is a pivotal phase of image analysis and understanding. To handle this growing challenge, different MTH methods, including feature-based, threshold-based, and region-based segmentation, have been implemented. The most common technique used to perform and analyze image segmentation is threshold-based segmentation. This paper presented an improved variant of the heap-based optimizer (HBO) called IHBO. The effectiveness of the proposed IHBO was evaluated on the CEC’2020 benchmark functions, where the proposed algorithm demonstrated superiority over the competing algorithms with respect to various statistical metrics. In addition, the IHBO was applied to image segmentation using the Otsu and Kapur methods as objective functions. The main goal of the IHBO is to determine the best thresholds that maximize the Otsu and Kapur objectives. The IHBO was run on a set of test images with different characteristics, and the results were compared against seven well-known metaheuristic algorithms: the original HBO algorithm, SSA, MFO, GWO, SCA, HS, and EMO. The experimental results revealed that the IHBO algorithm outperformed all counterparts in terms of the FSIM, SSIM, and PSNR. Notably, the IHBO results using the Otsu method provided better between-class variance in most metrics, and when applying the Kapur method, the IHBO produced SSIM, FSIM, PSNR, and fitness values that were better than those of all counterparts. The IHBO produced promising results because it preserved an effective balance between exploration and exploitation and was able to avoid being trapped in local optima.
For future work, there are many research directions in this field, such as studying the performance of the IHBO algorithm on different datasets, and other real-world complex problems. In addition, future work can study the hybridization of the original HBO with other metaheuristic or machine learning algorithms to automate the search process for the optimal number of thresholds in a specific image.
References
Abd El Aziz, M., Ewees, A. A. & Hassanien, A. E. Whale optimization algorithm and moth-flame optimization for multilevel thresholding image segmentation. Expert Syst. Appl. 83, 242–256 (2017).
Rodríguez-Esparza, E., Zanella-Calzada, L. A., Oliva, D. & Pérez-Cisneros, M. Automatic detection and classification of abnormal tissues on digital mammograms based on a bag-of-visual-words approach. In Medical Imaging 2020: Computer-Aided Diagnosis, vol. 11314, 1131424 (International Society for Optics and Photonics, 2020).
Montalvo, M., Guijarro, M. & Ribeiro, Á. A novel threshold to identify plant textures in agricultural images by Otsu and principal component analysis. J. Intell. Fuzzy Syst. 34, 4103–4111 (2018).
Sengar, S. S. & Mukhopadhyay, S. Motion segmentation-based surveillance video compression using adaptive particle swarm optimization. Neural Comput. Appl. 32, 11443–11457 (2019).
Yin, P.-Y. & Chen, L.-H. A fast iterative scheme for multilevel thresholding methods. Signal Process. 60, 305–313 (1997).
Sarkar, S., Sen, N., Kundu, A., Das, S. & Chaudhuri, S. S. A differential evolutionary multilevel segmentation of near infra-red images using Renyi’s entropy. In Proceedings of the international conference on frontiers of intelligent computing: Theory and applications (FICTA), 699–706 (Springer, 2013).
Kapur, J. N., Sahoo, P. K. & Wong, A. K. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vis. Graphics Image Process. 29, 273–285 (1985).
Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 9, 62–66 (1979).
Bhargavi, K. & Jyothi, S. A survey on threshold based segmentation technique in image processing. Int. J. Innov. Res. Dev. 3, 234–239 (2014).
Houssein, E. H., Helmy, B.E.-D., Oliva, D., Elngar, A. A. & Shaban, H. A novel black widow optimization algorithm for multilevel thresholding image segmentation. Expert Syst. Appl. 167, 114159 (2020).
Houssein, E. H., Saad, M. R., Hashim, F. A., Shaban, H. & Hassaballah, M. Lévy flight distribution: A new metaheuristic algorithm for solving engineering optimization problems. Eng. Appl. Artif. Intell. 94, 103731 (2020).
Houssein, E. H. et al. Optimal sink node placement in large scale wireless sensor networks based on Harris’ hawk optimization algorithm. IEEE Access 8, 19381–19397 (2020).
Hashim, F. A., Houssein, E. H., Hussain, K., Mabrouk, M. S. & Al-Atabany, W. A modified henry gas solubility optimization for solving motif discovery problem. Neural Comput. Appl. 32, 10759–10771 (2020).
Houssein, E. H., Hosney, M. E., Oliva, D., Mohamed, W. M. & Hassaballah, M. A novel hybrid Harris hawks optimization and support vector machines for drug design and discovery. Comput. Chem. Eng. 133, 106656 (2020).
Houssein, E. H. et al. An improved opposition-based marine predators algorithm for global optimization and multilevel thresholding image segmentation. Knowl.-Based Syst. 229, 107348 (2021).
Houssein, E. H., Emam, M. M. & Ali, A. A. Improved manta ray foraging optimization for multi-level thresholding using covid-19 ct images. Neural Comput. Appl. 33, 16899–16919 (2021).
Neggaz, N., Houssein, E. H. & Hussain, K. An efficient henry gas solubility optimization for feature selection. Expert Syst. Appl. 152, 113364 (2020).
Houssein, E. H., Younan, M. & Hassanien, A. E. Nature-inspired algorithms: A comprehensive review. Hybrid Comput. Intell. Res. Appl. 1, 1–25 (2019).
Deb, K. Multi-objective Optimization Using Evolutionary Algorithms Vol. 16 (Wiley, 2001).
Storn, R. & Price, K. Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11, 341–359 (1997).
Holland, J. H. Genetic algorithms. Sci. Am. 267, 66–73 (1992).
Eberhart, R. C. & Shi, Y. Comparison between genetic algorithms and particle swarm optimization. In International Conference on Evolutionary Programming, 611–616 (Springer, 1998).
Eberhart, R. & Kennedy, J. A new optimizer using particle swarm theory. In MHS’95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, 39–43 (IEEE, 1995).
Hashim, F. A., Houssein, E. H., Mabrouk, M. S., Al-Atabany, W. & Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Futur. Gener. Comput. Syst. 101, 646–667 (2019).
Hashim, F. A., Hussain, K., Houssein, E. H., Mabrouk, M. S. & Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 51, 1531–1551 (2020).
Rashedi, E., Nezamabadi-Pour, H. & Saryazdi, S. Gsa: A gravitational search algorithm. Inf. Sci. 179, 2232–2248 (2009).
Birbil, Şİ & Fang, S.-C. An electromagnetism-like mechanism for global optimization. J. Global Optim. 25, 263–282 (2003).
Rao, R. V., Savsani, V. J. & Vakharia, D. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 43, 303–315 (2011).
Askari, Q., Saeed, M. & Younas, I. Heap-based optimizer inspired by corporate rank hierarchy for global optimization. Expert Syst. Appl. 161, 113702 (2020).
Oliva, D. & Cuevas, E. Advances and Applications of Optimised Algorithms in Image Processing (Springer, 2017).
Oliva, D., Elaziz, M. A. & Hinojosa, S. Metaheuristic Algorithms for Image Segmentation: Theory and Applications Vol. 825 (Springer, 2019).
Zhou, Y., Yang, X., Ling, Y. & Zhang, J. Meta-heuristic moth swarm algorithm for multilevel thresholding image segmentation. Multimed. Tools Appl. 77, 23699–23727 (2018).
He, L. & Huang, S. Modified firefly algorithm based multilevel thresholding for color image segmentation. Neurocomputing 240, 152–174 (2017).
Zhao, D. et al. Ant colony optimization with horizontal and vertical crossover search: Fundamental visions for multi-threshold image segmentation. Expert Syst. Appl. 167, 114122 (2020).
Abd Elaziz, M., Ewees, A. A. & Oliva, D. Hyper-heuristic method for multilevel thresholding image segmentation. Expert Syst. Appl. 146, 113201 (2020).
Upadhyay, P. & Chhabra, J. K. Kapur’s entropy based optimal multilevel image segmentation using crow search algorithm. Appl. Soft Comput. 97, 105522 (2019).
He, L. & Huang, S. An efficient krill herd algorithm for color image multilevel thresholding segmentation problem. Appl. Soft Comput. 89, 106063 (2020).
Bao, X., Jia, H. & Lang, C. A novel hybrid Harris Hawks optimization for color image multilevel thresholding segmentation. IEEE Access 7, 76529–76546 (2019).
Kandhway, P. & Bhandari, A. K. Spatial context-based optimal multilevel energy curve thresholding for image segmentation using soft computing techniques. Neural Comput. Appl. 32, 8901–8937 (2020).
Kandhway, P. & Bhandari, A. K. A water cycle algorithm-based multilevel thresholding system for color image segmentation using masi entropy. Circuits Syst. Signal Process. 38, 3058–3106 (2019).
Kandhway, P. & Bhandari, A. K. Spatial context cross entropy function based multilevel image segmentation using multi-verse optimizer. Multimed. Tools Appl. 78, 22613–22641 (2019).
Elaziz, M. A., Bhattacharyya, S. & Lu, S. Swarm selection method for multilevel thresholding image segmentation. Expert Syst. Appl. 138, 112818 (2019).
Rojas-Morales, N., Rojas, M.-C.R. & Ureta, E. M. A survey and classification of opposition-based metaheuristics. Comput. Ind. Eng. 110, 424–435 (2017).
Elaziz, M. A., Ewees, A. A. & Oliva, D. Hyper-heuristic method for multilevel thresholding image segmentation. Expert Syst. Appl. 146, 113201 (2020).
Houssein, E. H., Emam, M. M. & Ali, A. A. An efficient multilevel thresholding segmentation method for thermography breast cancer imaging based on improved chimp optimization algorithm. Expert Syst. Appl. 185, 115651 (2021).
Dhal, K. G., Das, A., Ray, S., Gálvez, J. & Das, S. Nature-inspired optimization algorithms and their application in multi-thresholding image segmentation. Arch. Comput. Methods Eng. 27, 855–888 (2019).
Wolpert, D. H. & Macready, W. G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1, 67–82 (1997).
Wang, Z., Bovik, A. C., Sheikh, H. R. & Simoncelli, E. P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 13, 600–612 (2004).
Zhang, L., Zhang, L., Mou, X. & Zhang, D. Fsim: A feature similarity index for image quality assessment. IEEE Trans. Image Process. 20, 2378–2386 (2011).
Hore, A. & Ziou, D. Image quality metrics: Psnr vs. ssim. In 2010 20th International Conference on Pattern Recognition, 2366–2369 (IEEE, 2010).
Mirjalili, S. et al. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 114, 163–191 (2017).
Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 89, 228–249 (2015).
Mirjalili, S., Mirjalili, S. M. & Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 69, 46–61 (2014).
Mirjalili, S. Sca: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 96, 120–133 (2016).
Geem, Z. W., Kim, J. H. & Loganathan, G. V. A new heuristic optimization algorithm: Harmony search. Simulation 76, 60–68 (2001).
Ahmady, G. A., Mehrpour, M. & Nikooravesh, A. Organizational structure. Procedia. Soc. Behav. Sci. 230, 455–462 (2016).
Mahdavi, S., Rahnamayan, S. & Deb, K. Opposition based learning: A literature review. Swarm Evol. Comput. 39, 1–23 (2018).
Sarkhel, R., Das, N., Saha, A. K. & Nasipuri, M. An improved harmony search algorithm embedded with a novel piecewise opposition based learning algorithm. Eng. Appl. Artif. Intell. 67, 317–330 (2018).
Ewees, A. A., Abd Elaziz, M. & Houssein, E. H. Improved grasshopper optimization algorithm using opposition-based learning. Expert Syst. Appl. 112, 156–172 (2018).
Malisia, A. R. & Tizhoosh, H. R. Applying opposition-based ideas to the ant colony system. In 2007 IEEE Swarm Intelligence Symposium, 182–189 (IEEE, 2007).
Rajasekhar, A., Jatoth, R. K. & Abraham, A. Design of intelligent pid/pi\(\lambda\)d\(\mu\) speed controller for chopper fed dc motor drive using opposition based artificial bee colony algorithm. Eng. Appl. Artif. Intell. 29, 13–32 (2014).
Xu, H., Erdbrink, C. D. & Krzhizhanovskaya, V. V. How to speed up optimization? Opposite-center learning and its application to differential evolution. Procedia Comput. Sci. 51, 805–814 (2015).
Li, J., Chen, T., Zhang, T. & Li, Y. X. A cuckoo optimization algorithm using elite opposition-based learning and chaotic disturbance. J. Softw. Eng. 10, 16–28 (2016).
Zhao, F., Zhang, J., Wang, J. & Zhang, C. A shuffled complex evolution algorithm with opposition-based learning for a permutation flow shop scheduling problem. Int. J. Comput. Integr. Manuf. 28, 1220–1235 (2015).
Shang, J. et al. An improved opposition-based learning particle swarm optimization for the detection of snp-snp interactions. BioMed Res. Int. 2015, 524821 (2015).
Gao, X., Wang, X., Ovaska, S. & Zenger, K. A hybrid optimization method of harmony search and opposition-based learning. Eng. Optim. 44, 895–914 (2012).
Thangaraj, R., Pant, M., Chelliah, T. R. & Abraham, A. Opposition based chaotic differential evolution algorithm for solving global optimization problems. In 2012 Fourth World Congress on Nature and Biologically Inspired Computing (NaBIC), 1–7 (IEEE, 2012).
Ahandani, M. A. & Alavi-Rad, H. Opposition-based learning in shuffled frog leaping: An application for parameter identification. Inf. Sci. 291, 19–42 (2015).
Arcuri, A. & Fraser, G. Parameter tuning or default values? An empirical investigation in search-based software engineering. Empir. Softw. Eng. 18, 594–623 (2013).
Mohamed, A. W., Hadi, A. A., Mohamed, A. K. & Awad, N. H. Evaluating the performance of adaptive gaining-sharing knowledge based algorithm on CEC 2020 benchmark problems. In 2020 IEEE Congress on Evolutionary Computation (CEC), 1–8 (IEEE, 2020).
Oliva, D. et al. Image segmentation by minimum cross entropy using evolutionary methods. Soft Comput. 23, 431–450 (2017).
Elaziz, M. A., Oliva, D., Ewees, A. A. & Xiong, S. Multi-level thresholding-based grey scale image segmentation using multi-objective multi-verse optimizer. Expert Syst. Appl. 125, 112–129 (2019).
Oliva, D., Cuevas, E., Pajares, G., Zaldivar, D. & Osuna, V. A multilevel thresholding algorithm using electromagnetism optimization. Neurocomputing 139, 357–381 (2014).
Sara, U., Akter, M. & Uddin, M. S. Image quality assessment through FSIM, SSIM, MSE and PSNR: A comparative study. J. Comput. Commun. 7, 8–18 (2019).
Huynh-Thu, Q. & Ghanbari, M. Scope of validity of PSNR in image/video quality assessment. Electron. Lett. 44, 800–801 (2008).
Funding
Open access funding provided by The Science, Technology & Innovation Funding Authority (STDF) in cooperation with The Egyptian Knowledge Bank (EKB).
Author information
Contributions
E.H.H. performed the supervision, methodology, investigation, and visualization; E.H.H. and G.M.M. developed the software; E.H.H., I.A.I., and Y.M.W. participated in conceptualization and formal analysis, performed the experiments, analyzed the results, and wrote the paper; I.A.I. and Y.M.W. performed the validation; G.M.M. wrote the original draft and provided resources. All authors discussed the results and approved the final paper.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Houssein, E.H., Mohamed, G.M., Ibrahim, I.A. et al. An efficient multilevel image thresholding method based on improved heap-based optimizer. Sci Rep 13, 9094 (2023). https://doi.org/10.1038/s41598-023-36066-8
This article is cited by
- Multilevel thresholding with divergence measure and improved particle swarm optimization algorithm for crack image segmentation. Scientific Reports (2024)
- RhizoNet segments plant roots to assess biomass and growth for enabling self-driving labs. Scientific Reports (2024)
- Particle Swarm Optimizer Variants for Multi-level Thresholding: Theory, Performance Enhancement and Evaluation. Archives of Computational Methods in Engineering (2024)
- A cross entropy and whale optimization algorithm based image segmentation for aerial images. International Journal of Information Technology (2024)
- Multi-level thresholding segmentation based on levy horse optimized machine learning approach. Multimedia Tools and Applications (2024)