
1 Introduction

The multi-objective optimization research area studies and solves problems with multiple conflicting objectives. Problems of this kind are widespread nowadays in many different areas: for example, the design of machines with low energy consumption and high power, or high-quality software built in a short time.

Multi-objective algorithms have been proposed to solve this type of problem. These algorithms are used when there is no evidence of an optimal solution, and obtaining that solution (or a close one) with deterministic algorithms can be time-consuming [1]. The result of these methods is not a single solution (as in the case of mono-objective optimization), but a set of the best possible solutions found [2].

Some existing approaches to the design of multi-objective algorithms have been gathered in frameworks, such as ParadisEO-MOEO and MOEA. These frameworks have emerged to make these algorithms accessible to the community so they can be used to solve real problems. Among them, MOEA is one of the most outstanding: it is a free, open-source, Java-based library that allows implementing new algorithms and experimenting both with the problems from the IEEE CEC multi-objective optimization competition and with other real problems documented in the framework. For the present research work, the Multi-Objective Global-Best Harmony Search (MOGBHS) algorithm [3] was implemented in the MOEA Framework.

MOGBHS is a multi-objective algorithm based on Global-Best Harmony Search, non-dominated sorting, and crowding distance. MOGBHS has been used to solve a routes-and-schedules problem in a mass transit system (bus rapid transit), minimizing the costs of the system while maximizing the satisfaction of the users [3]. The implementation of MOGBHS in the MOEA Framework made it possible to carry out a comparative analysis of this algorithm against other widely used algorithms already implemented in the framework: SPEA2, MOEA/D, NSGA-II, and MSOPS [4].

The present analysis used twelve (12) unconstrained and nine (9) constrained multi-objective continuous problems. These problems were taken from the multi-objective optimization competition repository of the IEEE Congress on Evolutionary Computation (CEC) [5, 6]. The study sought to determine the impact of the number of evaluations of the objective function (EOFs: 2000, 5000, 10000, and 20000) on the algorithms with different types of problems (with and without constraints).

The comparative analysis used the following metrics, commonly used in the multi-objective optimization field: Hypervolume, Epsilon, Generational Distance, Inverted Generational Distance, and Spacing [2]. Nonparametric statistical tests (Wilcoxon and Friedman) were also performed. The MOGBHS algorithm ranked fourth out of the five compared algorithms when all the metrics were considered. However, MOGBHS ranked first when only the Inverted Generational Distance metric was considered (i.e., it generates very good solutions in terms of accuracy and diversity). Finally, MOGBHS always ranked first on the CF7 problem for all the EOF counts.

The rest of this paper is structured as follows. In Sect. 2, the related works are presented. Section 3 characterizes the multi-objective algorithms to be compared, and Sect. 4 explains the metrics used in the comparison. The results and their analysis are shown in Sect. 5. Finally, conclusions and future work are presented in Sect. 6.

2 Related Works

Algorithms that implement techniques based on stochastic optimization (metaheuristics) are used to solve problems with n objectives. These algorithms use randomness to obtain near-optimal solutions to NP-hard (highly complex) problems. A detailed and updated survey of multi-objective evolutionary algorithms (MOEAs) is presented by Zhang and Xing in [7]; another review of related approaches in this field is presented by Vachhani et al. in [8]. A brief description of some approaches for multi-objective optimization is given below, as well as a discussion of the main differences between these approaches and the present work.

In 2017, ε-MOABC was presented [1], an algorithm based on performance indicators to solve multi-objective and many-objective optimization problems. This algorithm maintains an external archive of the non-dominated solutions produced in each generation, based on Pareto preference and dominance indicators. It has proved competitive against other state-of-the-art algorithms such as NSGA-II, SPEA2, and MOEA/D in multi-objective and many-objective optimization. The algorithm was analyzed on constrained problems with more than one objective function, such as CEC09, LZ09, and DTLZ.

Trivedi et al. presented an exhaustive study of the MOEAs proposed over the last ten years [9]. This study focuses on decomposition-based and hybrid (e.g., combining decomposition and dominance) MOEAs. It covers the efforts made so far to extend the decomposition-based framework to constrained multi-objective optimization and many-objective optimization. The authors conclude that there have been many attempts to create and apply decomposition-based MOEAs to solve complex real-world optimization problems.

In [10], multi-objective evolutionary algorithms are classified into set-approximation methods and decomposition methods. In that work, a set-approximation MOEA is combined with a sequential decomposition mechanism. Using this combination, the running time on synthetic problems improves over the corresponding set-approximation MOEAs by a factor of n (the problem size). Also, recently (2017) published works propose distributed parallel approaches [11, 12] for solving large-scale multi-objective optimization problems.

In 2016, a new selection scheme for multi-objective evolutionary algorithms based on the Δp indicator was proposed [13]. To solve the problem of defining the reference set in Δp-based approaches, a reference set is created at each generation using ε-dominance; a set of non-dominated solutions is also maintained. The proposal outperforms MOEA/D using Penalty Boundary Intersection (a decomposition approach) and SMS-EMOA-HYPE (an SMS-EMOA version that uses the hypervolume indicator) on standard test functions with 3–6 objective functions.

In 2014, a new approach that combines dominance and decomposition for multi-objective and many-objective optimization was presented in [14]. It exploits the advantages of both approaches (dominance and decomposition) to balance the convergence and diversity of the evolutionary process. The performance of the proposed algorithm was validated against four state-of-the-art algorithms on unconstrained problems with up to fifteen objectives, where the empirical results demonstrated its superiority. The proposed algorithm also showed highly competitive performance on all constrained optimization problems.

An improved version of the TAA algorithm (ITAA) was proposed in [15]. This algorithm incorporates a classification mechanism for updating the convergence archive. The efficiency of ITAA was demonstrated with experimental studies on 16 DTLZ test cases with 5–20 objectives. The experimental results showed that ITAA outperformed TAA on the IGD convergence metric and the GSpread diversity metric.

A knee-point-driven evolutionary algorithm for many-objective optimization (KnEA) is presented in [2]. KnEA significantly reduces computational complexity compared to other multi-objective algorithms. The experimental results show that KnEA is significantly superior to MOEA/D and HypE, and comparable with GrEA and NSGA-III, on optimization problems with more than three objectives. KnEA is also computationally much more efficient than other Pareto-based MOEAs such as GrEA. Therefore, the overall performance of KnEA is highly competitive with state-of-the-art MOEAs for problems with more than three objectives.

A new algorithm (MD-MOEA) was proposed in [16]. It is based on the crossover and mutation operators of the NSGA-II algorithm and includes a new selection mechanism based on the maximin fitness function, plus a technique based on Euclidean distances between solutions to improve the diversity of the population in objective function space. This approach obtains good results in both low- and high-dimensional objective function spaces when compared with MOEA/D using Penalty Boundary Intersection and SMS-EMOA-HYPE.

Also in that year (2014), the algorithm evaluated in the present analysis, Multi-Objective Global-Best Harmony Search (MOGBHS), was developed [3]. This algorithm generates a set of harmonies and stores them in the harmony memory (HM). It then evaluates all objectives for each element of the HM and orders them by Pareto front. MOGBHS was used to improve the definition of routes and schedules in a mass transit system (the MEGABUS system in the city of Pereira, Colombia) using discrete-event simulation, where it was compared to an NSGA-II implementation and showed better performance.

Finally, in 2009 [17], a comparative analysis of several multi-objective evolutionary algorithms was presented, observing their behavior on problems of different complexities (NP-hard). The compared algorithms were based on the Pareto front approach: NSGA-II, SPEA2, MOEA/D, GDE3, and POSDE. The results showed that GDE3 produced the best performance on these problems with the lowest time complexity.

Unlike the works mentioned in the state of the art, the present work conducted four runs per algorithm with different numbers of evaluations of the objective function (2000, 5000, 10000, and 20000 EOFs). Also, the non-parametric Friedman test was performed to observe the position or ranking of the evaluated algorithms, and the Wilcoxon test was then applied to observe the dominance relations between algorithms. Finally, a general performance analysis was done for each of the metrics mentioned above.

3 Compared Algorithms

NSGA-II: this genetic algorithm was proposed by Deb et al. in 2002 [18]. It generates an additional population from an original population by using the genetic operators of selection (binary tournament), crossover (SBX), and mutation (polynomial). The most promising individuals from both populations are then selected for the next generation according to their rank (Pareto front number) and crowding distance. NSGA-II is used to solve continuous problems.
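As an illustration, the crowding-distance computation can be sketched as follows. This is a generic reconstruction from the NSGA-II description (a front given as a list of objective vectors), not the MOEA Framework implementation:

```python
def crowding_distance(front):
    """Crowding distance for one Pareto front of objective vectors.

    Boundary solutions per objective get infinite distance; interior
    solutions accumulate the normalized gap between their neighbors.
    """
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        # Indices sorted by the current objective value.
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue  # degenerate objective: all values identical
        for rank in range(1, n - 1):
            i = order[rank]
            dist[i] += (front[order[rank + 1]][obj]
                        - front[order[rank - 1]][obj]) / (hi - lo)
    return dist
```

Individuals with larger crowding distance lie in less dense regions of the front and are preferred when truncating a population.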

SPEA2: a genetic algorithm proposed by Zitzler et al. in 2001 [19]. In this algorithm, a fitness value is assigned to each individual as the sum of its raw strength fitness and a density estimate. SPEA2 applies selection, crossover, and mutation operators (environmental selection, SBX, and polynomial mutation) to fill a solution archive. Non-dominated solutions from the original population and the archive are copied into a new population; if the number of solutions exceeds the maximum population size, a truncation operator based on the distance to the k-th nearest neighbor is used.

MOEA/D: a multi-objective optimization algorithm based on decomposing the problem. MOEA/D uses evolutionary operators to combine optimal solutions, thus allowing high convergence. It uses the differential evolution operator followed by polynomial mutation to create offspring, and the weighted Tchebycheff or boundary intersection approach as the decomposition method. It also includes a diversity preservation mechanism, as proposed in the work of Zhang and Li [17].

MSOPS: a multi-objective optimization algorithm proposed by He and Yen in 2014 [20]. MSOPS works in parallel to generate convergent sets of solutions. The algorithm is based on aggregate optimization driven by a weight or target vector: it uses an array of target vectors to find the best solutions in parallel. MSOPS does not rely on Pareto ranking and provides better selection pressure in high-dimensional objective spaces.

MOGBHS: the Multi-Objective Global-Best Harmony Search algorithm, proposed in 2016 [3]. This algorithm randomly generates a set of harmonies and stores them in the Harmony Memory (HM), then evaluates all objectives for each element in the HM. The HM is ordered by Pareto front using non-dominated sorting and crowding distance. Afterward, a certain number of improvisations (evolutionary iterations) are performed. In each iteration: (A) a new harmony is generated applying the logic of the GBHS algorithm; (B) the new harmony is evaluated on all the objectives to optimize; (C) the new harmony is added to the existing HM; (D) the HM is ordered by Pareto front and crowding distance; and (E) all elements that cause the HM to exceed its maximum size (defined by the Harmony Memory Size parameter) are removed (these are the worst elements in the HM).
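The improvisation loop (A)–(E) can be sketched as follows. This is an illustrative reconstruction under simplifying assumptions, not the authors' implementation: the Pareto ordering is approximated by a dominance count, crowding distance is omitted, and the parameter names (`hmcr`, `par`) follow common Harmony Search conventions:

```python
import random

def dominates(a, b):
    # a dominates b: no worse in every objective, strictly better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def truncate_by_dominance(hm, evaluate, hms):
    # Steps (D)-(E): rank harmonies by how many others dominate them
    # (a simplification of non-dominated sorting) and keep the best `hms`.
    scored = [(h, evaluate(h)) for h in hm]
    ranked = sorted(scored,
                    key=lambda hf: sum(dominates(g, hf[1]) for _, g in scored))
    return [h for h, _ in ranked[:hms]]

def mogbhs_sketch(evaluate, dim, bounds, hms=10, iters=200, hmcr=0.9, par=0.3):
    low, high = bounds
    # Initial Harmony Memory (HM) of random harmonies.
    hm = [[random.uniform(low, high) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            # (A) improvise: memory consideration / pitch adjustment / random.
            if random.random() < hmcr:
                value = random.choice(hm)[d]
                if random.random() < par:
                    value = hm[0][d]  # GBHS: borrow from the "best" harmony
            else:
                value = random.uniform(low, high)
            new.append(value)
        # (B)-(C) the new harmony is evaluated and added to the HM;
        # (D)-(E) re-rank and drop the worst element.
        hm = truncate_by_dominance(hm + [new], evaluate, hms)
    return hm
```

With all objectives evaluated through `evaluate`, the returned HM approximates a non-dominated set of at most `hms` harmonies.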

4 Comparison Metrics

The following metrics were used to measure the performance and competitiveness of the algorithms: Hypervolume, Generational Distance, Inverted Generational Distance, Epsilon, and Spacing. These metrics are the most widely used to evaluate MOEAs [21] (Table 1).

Table 1. Metrics for evaluation of multi-objective algorithms

These metrics make it possible to compare different algorithms in terms of the accuracy, diversity, and separability of the solutions found by each algorithm. A description of these metrics is presented below [21]:

Hypervolume (HV). This metric calculates the volume (in objective space) covered by the members of a given set \( Q \) of non-dominated solutions, for problems where all objectives are to be minimized. Mathematically, for each \( i \in Q \) a hypercube \( v_{i} \) is built with a reference point \( W \) and the solution \( i \) as the ends of its diagonal. The point \( W \) can be obtained from the worst values of the objective functions. The hypervolume \( \left( {HV} \right) \) is defined as the union of all the hypercubes [22], as shown in Eq. 1.

$$ HV = volume \left( {\bigcup\nolimits_{i = 1}^{\left| Q \right|} {v_{i} } } \right) $$
(1)

Algorithms that reach higher values of \( HV \) are better (Maximize). Since \( HV \) depends on the scale of the objective function values, it is necessary to normalize the non-dominated solutions.
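For two objectives, the union of hypercubes in Eq. 1 reduces to a sweep over axis-aligned rectangles. A minimal sketch, assuming a minimization front already free of dominated points and a reference point `ref` worse than every solution:

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a 2-D non-dominated front (both objectives minimized).

    `front` is a list of (f1, f2) points; `ref` is the reference point W.
    """
    # A non-dominated 2-D front sorted by f1 ascending has f2 descending,
    # so the dominated region is a staircase of disjoint rectangles.
    pts = sorted(front)
    volume = 0.0
    prev_f2 = ref[1]
    for f1, f2 in pts:
        volume += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return volume
```

For more than two objectives, exact HV computation is considerably more involved; frameworks such as MOEA provide dedicated implementations.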

Epsilon (EP) is a measure of the smallest distance required to translate each solution in a set \( A \) so that it dominates the optimal Pareto front of the evaluated problem. More formally, given \( \vec{z}^{1} = \left( {z_{1}^{1} , \ldots ,z_{n}^{1} } \right) \) and \( \vec{z}^{2} = \left( {z_{1}^{2} , \ldots ,z_{n}^{2} } \right) \), where \( n \) is the number of objectives [23] (Minimize):

$$ I_{\epsilon + }^{1} \left(A\right) = \inf_{\epsilon \in R} \left(\forall \vec{z}^{2} \in PF^{*} \exists \vec{z}^{1} \in A: \vec{z}^{1}\, \preccurlyeq_{\epsilon} \,\vec{z}^{2} \right) $$

Where \( \vec{z}^{1} \) \( \preccurlyeq_{\epsilon} \) \( \vec{z}^{2} \) if and only if \( \forall 1 \le i \le n: Z_{i}^{1} < \epsilon\, +\, Z_{i}^{2}. \)
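The additive epsilon indicator can be computed directly from this definition. A minimal sketch (minimization; the approximation set `A` and the reference front `PF` are given as tuples of objective values):

```python
def additive_epsilon(A, PF):
    """I_eps+(A): smallest eps such that every reference point z2 in PF
    is eps-dominated by some z1 in A (all objectives minimized)."""
    return max(                                   # worst reference point
        min(                                      # best matching solution
            max(z1_i - z2_i for z1_i, z2_i in zip(z1, z2))
            for z1 in A)
        for z2 in PF
    )
```

A value of 0 (or below) means every reference point is already weakly dominated by the approximation set; larger values indicate a worse approximation.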

Generational distance (GD) and inverted generational distance (IGD) were proposed by Van Veldhuizen in 1999 [21]. GD calculates the average distance of a set of candidate solutions \( Z \) with respect to a reference set \( P^{*} \) representing the Pareto front (PF). Formally, GD is defined by Eq. 2 (Minimize):

$$ GD = \frac{{\sqrt {\varSigma_{x \in Z} d\left( x \right)^{2} } }}{\left| Z \right|} $$
(2)

Where \( d\left( x \right) \) is the Euclidean distance between the solution \( x \) and the closest point of \( P^{*} \), expressed by Eq. 3 (for \( M \) objective functions \( f_{1} , \ldots ,f_{M} \)).

$$ d\left( x \right) = \mathop {\min }\limits_{{p \in P^{*} }} \sqrt {\varSigma_{k = 1}^{M} \left( {f_{k} \left( x \right) - f_{k} \left( p \right)} \right)^{2} } $$
(3)

Although GD is a metric for evaluating convergence, if we reverse the roles of \( Z \) and \( P^{*} \) in Eqs. 2 and 3, the inverted generational distance (IGD) is obtained. IGD also takes into account the diversity of the whole set \( Z \); thus, a low value of IGD indicates both good convergence and good distribution of the solutions (Minimize). Formally, the IGD metric is defined by Eq. 4.

$$ IGD = \frac{{\sqrt {\varSigma_{{x^{*} \in p^{*} }} d\left( {X^{*} } \right)^{2} } }}{{\left| { P^{*} } \right|}} $$
(4)

Where \( d\left( {x^{*} } \right) \) is the Euclidean distance between the reference point \( x^{*} \) and the closest solution in \( Z \), expressed by Eq. 5.

$$ d\left( {x^{*} } \right) = \mathop {\min }\limits_{x \in Z} \sqrt {\varSigma_{k = 1}^{M} \left( {f_{k} \left( {x^{*} } \right) - f_{k} \left( x \right)} \right)^{2} } $$
(5)
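Both distances follow directly from these definitions. A minimal sketch (objective vectors as tuples; `math.dist` computes the Euclidean distance):

```python
import math

def _min_dist(point, points):
    # Euclidean distance from `point` to its nearest neighbor in `points`.
    return min(math.dist(point, q) for q in points)

def gd(Z, P_star):
    # GD: average closeness of the candidate set Z to the reference set P*.
    return math.sqrt(sum(_min_dist(x, P_star) ** 2 for x in Z)) / len(Z)

def igd(Z, P_star):
    # IGD: roles of Z and P* reversed; a low value indicates both
    # convergence and a good spread over the reference front.
    return math.sqrt(sum(_min_dist(p, Z) ** 2 for p in P_star)) / len(P_star)
```

Note that IGD averages over the reference set, so a candidate set that misses whole regions of the Pareto front is penalized even if its points are individually close to it.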

Spacing (SP) is a separation metric suggested by Schott [21]. SP is calculated by measuring the relative distance between consecutive solutions in the obtained non-dominated set, as in Eq. 6.

$$ S = \sqrt {\frac{1}{\left| Q \right|}\sum\nolimits_{i = 1}^{\left| Q \right|} ( } d_{i} - \bar{d})^{2} $$
(6)

Where \( d_{i} = \mathop {\min }\limits_{{k \in Q \wedge k \ne i}} \sum\nolimits_{m = 1}^{M} {\left| {f_{m}^{i} - f_{m}^{k} } \right|} \) and \( \bar{d} \) is the mean value of these distances, \( \bar{d} = \sum\nolimits_{i = 1}^{\left| Q \right|} {d_{i} /\left| Q \right|} \). The distance measure is the minimum value of the sum of absolute differences in objective function values between the i-th solution and any other solution of the non-dominated set (note that this differs from the minimum Euclidean distance between two solutions). Therefore, a search algorithm producing a set of non-dominated solutions with a smaller spacing (SP) is better (Minimize).
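The metric can be sketched directly from Eq. 6 and the definition of \( d_{i} \):

```python
def spacing(Q):
    """Schott's spacing metric over a non-dominated set Q of objective vectors.

    d_i is the minimum absolute-sum distance from solution i to any other
    solution; SP is the standard deviation of the d_i values.
    """
    d = [
        min(sum(abs(a - b) for a, b in zip(Q[i], Q[k]))
            for k in range(len(Q)) if k != i)
        for i in range(len(Q))
    ]
    d_bar = sum(d) / len(d)
    return (sum((di - d_bar) ** 2 for di in d) / len(d)) ** 0.5
```

A perfectly evenly spaced front yields SP = 0, since every \( d_{i} \) equals \( \bar{d} \).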

5 Results and Analysis

Here, the analysis of the metrics mentioned above for different numbers of EOFs is presented, together with the analysis of the nonparametric Friedman and Wilcoxon tests.
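The statistical procedure used throughout this section can be reproduced with standard tools. A minimal sketch using SciPy, with illustrative per-problem scores only (not the paper's data):

```python
from scipy.stats import friedmanchisquare, wilcoxon

# Hypothetical hypervolume scores of three algorithms on six problems
# (one value per problem; illustrative numbers only).
spea2  = [0.71, 0.65, 0.80, 0.74, 0.69, 0.77]
nsga2  = [0.66, 0.60, 0.79, 0.70, 0.64, 0.71]
mogbhs = [0.69, 0.64, 0.78, 0.73, 0.66, 0.75]

# Friedman test: do the algorithms rank differently across problems?
stat, p = friedmanchisquare(spea2, nsga2, mogbhs)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Pairwise Wilcoxon signed-rank test between two algorithms.
stat_w, p_w = wilcoxon(spea2, nsga2)
print(f"Wilcoxon statistic = {stat_w:.2f}, p = {p_w:.4f}")
```

A significant Friedman p-value justifies the subsequent pairwise Wilcoxon comparisons used below to establish dominance relations between algorithms.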

5.1 Analysis with 2000 EOFs

Regarding Hypervolume, the Friedman test was conducted according to a chi-square distribution with 4 degrees of freedom: 7.12 and p = 0.12968. Results show that SPEA2 achieved the best performance, followed by MOGBHS, then NSGA-II and MOEA/D sharing third place, and finally MSOPS. The Wilcoxon test showed that SPEA2 dominates the NSGA-II, MOEA/D, MSOPS, and MOGBHS algorithms with a significance level (SL) of 95%.

Regarding Epsilon, the Friedman test was conducted according to a chi-square distribution with 4 degrees of freedom: 18.8 and p = 0.00119. Results indicate that SPEA2 achieved the best performance, followed by MOGBHS, NSGA-II in third place, then MOEA/D, and MSOPS in the last place. The Wilcoxon test shows that SPEA2 dominates NSGA-II, MOEA/D, and MSOPS with SL = 95%. Finally, MOGBHS dominates NSGA-II and MSOPS, also with an SL of 95%.

With regard to Generational Distance, the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 30.4 and p = 4.05733E−6. Results show that SPEA2 achieved the best performance, followed by MOEA/D, MOGBHS, NSGA-II, and finally MSOPS. The Wilcoxon test showed that SPEA2, MOEA/D, and MOGBHS dominate MSOPS with an SL of 95%; besides, MOEA/D, MOGBHS, and SPEA2 dominate NSGA-II with the same SL.

With regard to Inverted Generational Distance, the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 24.28 and p = 7.01876E−5. Results show that MSOPS achieved the best performance, followed by MOEA/D, NSGA-II, MOGBHS, and finally SPEA2. The Wilcoxon test showed that MSOPS dominates NSGA-II, MOEA/D, MOGBHS, and SPEA2 with an SL of 95%, and that NSGA-II and MOEA/D dominate MOGBHS and SPEA2 with the same SL.

Regarding Spacing, the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 47.08 and p = 1.50153E−9. Results show that MOEA/D achieved the best performance, followed by SPEA2, MOGBHS, NSGA-II, and finally MSOPS. The Wilcoxon test showed that MOEA/D dominates NSGA-II, MSOPS, and MOGBHS with an SL of 95%; besides, SPEA2 dominates NSGA-II and MSOPS with the same SL. Finally, MOGBHS dominates MSOPS and NSGA-II with an SL of 90%.

In the present analysis, a ranking from 1 to 5 was created to determine the place occupied by each algorithm (see Table 2); the rank is shown to the right of the value obtained by each algorithm. Results indicate that MOGBHS is highly competitive with 2000 EOFs, as it occupied the second place in the general ranking. Also, MOGBHS achieved the best solutions to problems such as CF4, CF5, CF6, CF7, UF1, UF6, UF8, and UF11, occupying the first place (according to the metrics HV, GD, IGD, EP, and SP) for each problem, as observed in Table 2.

Table 2. MOGBHS with 2000 EOFs. Best results in bold.

5.2 Analysis with 5000 EOFs

Regarding Hypervolume: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 6.84 and p = 0.14459. Results showed a ranking where SPEA2 achieved the best performance, followed by MOGBHS, MSOPS, MOEA/D, and finally NSGA-II. The Wilcoxon test showed that SPEA2 dominates NSGA-II and MOEA/D with an SL of 95%.

With regard to Epsilon: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 10.84 and p = 0.02842. Results showed a ranking where SPEA2 achieved the best performance, followed by MSOPS and NSGA-II, then MOGBHS, and finally MOEA/D. The Wilcoxon test showed that SPEA2 dominates MOEA/D, MOGBHS, and NSGA-II with an SL of 95%.

With regard to Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 29.2 and p = 7.11914E−6. Results showed a ranking where SPEA2 achieved the best performance, followed by MOEA/D, MOGBHS, NSGA-II, and finally MSOPS. The Wilcoxon test showed that SPEA2, MOEA/D, and MOGBHS dominate NSGA-II and MSOPS with an SL of 95%; besides, SPEA2 dominates MOGBHS with the same SL.

With regard to Inverted Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 5.96 and p = 0.20216. Results showed a ranking where NSGA-II achieved the best performance, followed by MOGBHS, MOEA/D, MSOPS, and finally SPEA2. The Wilcoxon test showed that NSGA-II and MOGBHS dominate SPEA2 with an SL of 95%.

With respect to Spacing: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 38.6 and p = 8.42835E−8. Results showed a ranking where SPEA2 achieved the best performance, followed by MOEA/D and NSGA-II, and finally MOGBHS and MSOPS. The Wilcoxon test showed that SPEA2 dominates NSGA-II, MOEA/D, MOGBHS, and MSOPS with an SL of 95%; besides, MOGBHS dominates MOEA/D with the same SL. Finally, MOGBHS, MOEA/D, and NSGA-II dominate MSOPS with an SL of 90%.

Results showed that MOGBHS is competitive with 5000 EOFs, as it occupied the second place in the general ranking. Besides, MOGBHS found the best solutions to constrained problems such as CF4, CF5, and CF7, occupying the first place according to the metrics HV, SP, GD, and EP, as can be seen in Table 3.

Table 3. MOGBHS with 5000 EOFs. Best results in bold.

5.3 Analysis of the Algorithms with 10000 EOFs

Regarding Hypervolume: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 11.47 and p = 0.02176. Results showed a ranking where MSOPS achieved the best performance, followed by SPEA2, MOEA/D, NSGA-II, and finally MOGBHS. The Wilcoxon test showed that MSOPS and SPEA2 dominate NSGA-II with an SL of 95%; besides, MSOPS, SPEA2, and MOEA/D dominate MOGBHS with the same SL.

With respect to Epsilon: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 23.36 and p = 1.07290E−4. Results showed a ranking where MSOPS achieved the best performance, followed by SPEA2, then NSGA-II and MOEA/D, and finally MOGBHS. The Wilcoxon test showed that MSOPS and SPEA2 dominate NSGA-II and MOGBHS with an SL of 95%.

With regard to Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 24.8 and p = 5.51891E−5. Results showed a ranking where SPEA2 achieved the best performance, followed by MOEA/D, NSGA-II, MOGBHS, and finally MSOPS. The Wilcoxon test showed that SPEA2 and MOEA/D dominate NSGA-II, MSOPS, and MOGBHS with an SL of 95%.

With regard to Inverted Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 20.32 and p = 4.31750E−4. Results showed a ranking where MOGBHS achieved the best performance, followed by NSGA-II and MOEA/D, and finally SPEA2 and MSOPS sharing the fourth place. The Wilcoxon test showed that MOGBHS and NSGA-II dominate SPEA2 and MSOPS with an SL of 95%.

Regarding Spacing: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 37.16 and p = 1.66982E−7. Results showed a ranking where SPEA2 achieved the best performance, followed by NSGA-II, MOEA/D, MOGBHS, and finally MSOPS. The Wilcoxon test showed that SPEA2 dominates NSGA-II, MOEA/D, MOGBHS, and MSOPS with an SL of 95%; besides, MOGBHS dominates MSOPS with the same SL.

Results evidence that MOGBHS remains competitive with 10000 EOFs on some problems, achieving the best performance on constrained problems such as CF5 and CF7, where it occupied the first place according to the metrics GD, IGD, and SP, as can be seen in Table 4.

Table 4. MOGBHS with 10000 EOFs. Best results in bold.

5.4 Analysis of Algorithms with 20000 EOFs

Regarding Hypervolume: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 16.39 and p = 0.00254. Results showed a ranking where MSOPS achieved the best performance, followed by SPEA2, MOEA/D, NSGA-II, and finally MOGBHS. The Wilcoxon test showed that MSOPS, MOEA/D, and SPEA2 dominate NSGA-II and MOGBHS with an SL of 95%.

With regard to Epsilon: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 26.08 and p = 3.04907E−5. Results showed a ranking where MSOPS achieved the best performance, followed by SPEA2, NSGA-II, MOEA/D, and finally MOGBHS. The Wilcoxon test showed that MSOPS and SPEA2 dominate MOEA/D, NSGA-II, and MOGBHS with an SL of 95%.

With regard to Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 24.96 and p = 5.12501E−5. Results showed a ranking where SPEA2 achieved the best performance, followed by MOEA/D, NSGA-II, MSOPS, and finally MOGBHS. The Wilcoxon test showed that SPEA2 and MOEA/D dominate NSGA-II, MSOPS, and MOGBHS with an SL of 95%.

With regard to Inverted Generational Distance: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 17.48 and p = 0.00156. Results showed a ranking where MOGBHS achieved the best performance, followed by NSGA-II and MOEA/D with the same average value, then SPEA2, and finally MSOPS. The Wilcoxon test showed that MOGBHS and NSGA-II dominate SPEA2 and MSOPS with an SL of 95%.

Concerning Spacing: the Friedman test was run according to a chi-square distribution with 4 degrees of freedom: 38.2 and p = 1.01950E−7. Results showed a ranking where SPEA2 achieved the best performance, followed by NSGA-II, MOEA/D, MOGBHS, and finally MSOPS. The Wilcoxon test showed that SPEA2 dominates MOEA/D, NSGA-II, MOGBHS, and MSOPS with an SL of 95%; besides, MOEA/D dominates MOGBHS and MSOPS with the same SL.

Results evidence that MOGBHS achieved the best solutions on constrained problems such as CF7, occupying the first place according to three metrics (HV, EP, and SP), as can be seen in Table 5.

Table 5. MOGBHS with 20000 EOFs. Best results in bold.

5.5 Analysis of the Results Per Metric

The present analysis is based on the average of the best solutions found during the evaluations in the MOEA Framework. The performance was measured according to the metrics described above. General results are shown in Table 6.

Table 6. Overall ranking of Friedman test by metrics. Best results in bold.

The overall ranking for Hypervolume showed that SPEA2 is the most competitive algorithm, with an average ranking of 2.4, followed by MSOPS in the second place. The third position is occupied by MOEA/D, followed by MOGBHS, and NSGA-II in the last place. SPEA2 is better with 2000 and 5000 EOFs, and MSOPS is better with 10000 and 20000 EOFs. These results show that SPEA2 achieved the maximum value of hypervolume (accuracy and diversity of solutions) in the final set of solutions when the number of EOFs is at most 20000.

The overall ranking for Epsilon showed that SPEA2 is the most competitive algorithm, with an average ranking of 2.3, followed by MSOPS with 2.4; NSGA-II is in the third place, followed by MOGBHS and MOEA/D. SPEA2 is better with 2000 and 5000 EOFs, and MSOPS is better with 10000 and 20000 EOFs. These results evidence that SPEA2 and MSOPS generated solutions closer to the optimal Pareto front (high levels of accuracy, diversity, and separability); in other words, their solutions have a high level of similarity to the Pareto front solutions when the number of EOFs is at most 20000.

The overall ranking for Generational Distance showed that SPEA2 is the most competitive algorithm, with an average ranking of 1.9, followed by MOEA/D in the second place, MOGBHS in the third position, then NSGA-II, and MSOPS in the last place. SPEA2 is better with all tested EOF values: it achieved better performance (accuracy) since its solutions have a short average distance to the Pareto front when the number of EOFs is at most 20000.

The overall ranking for Inverted Generational Distance showed that MOGBHS is the most competitive algorithm, with an average ranking of 2.6, tied with NSGA-II; MOEA/D is in the third place, followed by MSOPS and SPEA2 in the last place. MSOPS is better with 2000 EOFs, NSGA-II with 5000 EOFs, and MOGBHS with 10000 and 20000 EOFs. These results evidence that the solutions obtained with MOGBHS have a high level of convergence (accuracy and diversity) with respect to the distribution of solutions on the Pareto front. It is important to note the difference between the results of this metric and those of the Hypervolume metric: although both metrics evaluate accuracy and diversity, the results are opposite.

The overall ranking for Spacing showed that SPEA2 is the most competitive algorithm, with an average ranking of 1.7, followed by MOEA/D in the second place, NSGA-II in the third, MOGBHS in the fourth, and MSOPS in the last place. MOEA/D is better with 2000 EOFs, and SPEA2 is better with 5000, 10000, and 20000 EOFs.

Analyzing all the metrics together, SPEA2 is the most competitive algorithm across the different EOF counts. MOGBHS gets close to the best solutions with few EOFs (2000 and 5000) but does not obtain good results as the number of EOFs grows. However, when considering the Inverted Generational Distance metric, MOGBHS gets the best results with 10000 and 20000 EOFs. A detailed review of the MOEA Framework and of the Hypervolume and Inverted Generational Distance metrics is necessary to explain these opposite results.

5.6 Overall Analysis

After observing the overall rankings, it can be stated that SPEA2 is the most competitive algorithm according to all the metrics (HV, EP, GD, IGD, and SP) with 2000, 5000, 10000, and 20000 EOFs. The second place is occupied by MOEA/D, followed by NSGA-II in the third place, MOGBHS in the fourth, and MSOPS in the last place (see Table 7).

Table 7. Average results using all metrics

In Table 7, SPEA2 shows consistent performance across the tests carried out. Equally, it can be observed that MOGBHS is competitive with 2000 and 5000 EOFs, occupying the first place according to the metrics IGD and EP, which are commonly used for testing MOEAs [17].

6 Conclusions and Future Work

In this paper, a comparative analysis of the MOGBHS algorithm against four other state-of-the-art algorithms was presented. Results showed that the most competitive algorithm for the UF1–UF12 and CF1–CF9 problem families is SPEA2. The analysis used the metrics Hypervolume, Epsilon, Generational Distance, Inverted Generational Distance, and Spacing. SPEA2 remained constant in the best ranking positions generated during the tests (with 2000, 5000, 10000, and 20000 EOFs).

It was observed that the MOGBHS algorithm obtained the best solutions according to the Inverted Generational Distance metric. Equally, the results of the overall analysis showed that as the number of EOFs increased, the performance of MOGBHS decreased; the opposite happens with the MSOPS algorithm, which obtained better performance as the number of EOFs increased. Therefore, MOGBHS should be used to solve problems when there are time restrictions or the number of EOFs is fixed in advance.

It was also observed that MOGBHS is very competitive on constrained problems, likely because this algorithm was designed to solve transportation problems with several constraints. Also, MOGBHS was designed for problems in which the fitness calculation is time-consuming (the bus simulation was done in a discrete-event tool) and where restrictions on the number of EOFs were defined. For all the EOF counts, MOGBHS obtained the best solutions for the CF7 problem.

As future work, it is expected to incorporate covering arrays and tournament objectives into MOGBHS in order to tackle many-objective optimization problems. Likewise, it is planned to experiment with problems with many more than two objectives, such as the DTLZ and WFG problem families.