
1 Introduction

In this paper we describe the harmony search algorithm, a metaheuristic algorithm inspired by music. In particular, we refer to the jazz-improvisation version of HS and its comparison with the genetic algorithm. Both algorithms were applied to benchmark mathematical functions, and comparative tables were produced showing the optimization results of the Genetic Algorithm and the Harmony Search algorithm.

The comparative study of the two algorithms is performed in order to show the effectiveness of the harmony search algorithm on optimization problems and, in doing so, to show that it is more effective than the genetic algorithm.

The paper is organized as follows: in Sect. 2 a description of the Harmony Search algorithm is presented, in Sect. 3 a description of the Genetic Algorithm is given, in Sect. 4 the mathematical functions are described, in Sect. 5 the optimization problems are outlined, in Sect. 6 the simulation results are described and a comparison between the Harmony Search algorithm and the genetic algorithm is made, and in Sect. 7 the conclusions obtained after testing the two algorithms on the mathematical functions are presented.

In the literature there are several works where Harmony Search has been used. In [4] a new metaheuristic algorithm for continuous engineering optimization, theory and practice, is presented. The authors propose a harmony search (HS) metaheuristic approach for engineering optimization problems with continuous design variables; it uses a stochastic random search instead of a gradient search, so derivative information is unnecessary. Various engineering optimization problems, including mathematical function minimization and structural engineering optimization problems, are presented to demonstrate the effectiveness and robustness of the HS algorithm. The results indicate that the proposed approach is a powerful search and optimization technique that may yield better solutions to engineering problems than those obtained using current algorithms. In [5] the parameter-setting-free harmony search algorithm is presented; the authors propose a novel technique to eliminate the tedious and experience-demanding effort of assigning parameters. The parameter-setting-free (PSF) technique suggested in that study adds one additional matrix that stores an operation type (random selection, memory consideration, or pitch adjustment) for every variable in the harmony memory. In [15] a benchmarking of heuristic optimization methods, including Harmony Search, is presented. The authors note that many heuristic optimization methods have appeared in a short period, for example Particle Swarm Optimization (PSO), Repulsive Particle Swarm Optimization (RPSO), the Gravitational Search Algorithm (GSA), Central Force Optimization (CFO), and the Harmony Search algorithm (HS). These methods work differently, but all of them can optimize the same problems, which raises a general question: does a standard benchmark exist that can be used to compare the individual methods? This is difficult to answer because, although some optimization problems are widely used across papers, there is no summary that can serve as a standard evaluation of the optimization process. In [9] the Global-best Harmony Search (GHS) is presented; the authors propose a new variant of HS in which concepts from swarm intelligence are borrowed to enhance its performance. The performance of the GHS is investigated and compared with HS and a recently developed variation of HS. The experiments performed show that the GHS generally outperformed the other approaches when applied to ten benchmark problems. The effect of noise on the performance of the three HS variants is investigated, a scalability study is conducted, and the effect of the GHS parameters is analyzed. Finally, the three HS variants are compared on several Integer Programming test problems; the results show that the three approaches are an efficient alternative for solving Integer Programming problems. In [16] a self-adaptive harmony search algorithm for optimization is presented. The harmony search (HS) metaheuristic with continuous design variables is conceptualized using the musical improvisation process of searching for a perfect state of harmony; although several variants and an increasing number of applications have appeared, one of its main difficulties is how to select suitable parameter values.
In [10] an improved harmony search algorithm for solving optimization problems is presented. The authors develop an improved harmony search (IHS) algorithm that employs a novel method for generating new solution vectors, enhancing the accuracy and convergence rate of the harmony search (HS) algorithm. In [11] a survey on applications of the harmony search algorithm is presented, thoroughly reviewing and analyzing the main characteristics and application portfolio of the so-called Harmony Search algorithm, a metaheuristic approach that has been shown to achieve excellent results in a wide range of optimization problems. In [8] a Tabu Harmony Search based approach to fuzzy linear regression is presented; the authors propose an unconstrained global continuous optimization method based on tabu search and harmony search to support the design of fuzzy linear regression (FLR) models, with the tabu and harmony search strategies used for diversification and intensification of FLR, respectively. The approach offers the flexibility to use any kind of objective function based on the client's requirements and the nature of the data set, and then attains its minimum error. In [14] a new gravitational search algorithm using fuzzy logic for parameter adaptation is presented; the authors propose a method that uses fuzzy logic to change the alpha parameter and give a different gravitation and acceleration to each agent in order to improve performance, apply it to mathematical functions, and compare it with the original approach. In [3] fuzzy control of parameters to dynamically adapt the PSO and GA algorithms is presented; the authors propose a hybrid approach for mathematical function optimization combining Particle Swarm Optimization (PSO) and Genetic Algorithms (GAs), using fuzzy logic for parameter adaptation and integrating the results. In [7] the music-inspired harmony search algorithm, theory and practice, is presented, showing the performance of the algorithm and the areas in which it can be applied. In [6] harmony search algorithms for structural design optimization are presented, showing the types of problems the harmony search algorithm can solve and some methods that have been proposed to improve it in certain areas of application. In [12] differential evolution with dynamic adaptation of parameters for the optimization of fuzzy controllers is presented; the authors propose a new algorithm using fuzzy logic with dynamic adaptation of parameters.

2 Harmony Search Algorithm

Harmony search is a relatively new heuristic optimization algorithm inspired by music, first developed by Z. W. Geem et al. in 2001 [7].

This algorithm can be explained in more detail through the improvisation process followed by a musician, which consists of three options:

  1. Play any song stored in your memory

  2. Play a composition similar to an existing one

  3. Play a new or random song

If we formalize these three options for optimization, we have three corresponding components: use of harmony memory, pitch adjustment and randomization [17].

2.1 Memory in Harmony Search Algorithm

The use of harmony memory is important because it is similar to selecting the best individuals in genetic algorithms, and it ensures that the best harmonies are carried over to the new harmony memory. In order to use this memory more effectively, we can assign a parameter \( r_{accept} \in [0,1] \), called the memory acceptance rate. If this rate is too low, only a few of the best harmonies are selected and the algorithm may converge very slowly [17].

$$ r_{accept} \in \left[ {0,1} \right] $$
(1)
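As a rough illustration of this rule (not the implementation used in this work), the memory-consideration decision can be sketched in Python; the names harmony_memory, lower and upper are assumed for the stored harmonies and the variable bounds:

import random

def pick_value(harmony_memory, j, lower, upper, r_accept=0.95):
    """Choose the j-th decision variable from the harmony memory with
    probability r_accept, otherwise draw it at random from the search range."""
    if random.random() < r_accept:
        # Memory consideration: reuse a value stored in one of the harmonies.
        return random.choice(harmony_memory)[j]
    # Randomization: fall back to a completely new value.
    return random.uniform(lower, upper)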

2.2 Pitch Adjustment

To adjust the pitch slightly in the second component, we have to use a method that can adjust the frequency efficiently. In theory, the tone can be adjusted linearly or nonlinearly, but in practice linear adjustment is used. If the current solution (pitch) is \( x_{old} \), then the new solution (tone) \( x_{new} \) is generated as follows:

$$ x_{new} = x_{old} + b_{p} (2rand - 1) $$
(2)

where “rand” is a random number drawn from a uniform distribution on [0, 1] and \( b_{p} \) is the bandwidth, which controls the local range of tone adjustment. In fact, we can see that the pitch adjustment (2) is a random step.

Pitch adjustment is similar to the mutation operator in genetic algorithms. We can assign a pitch adjustment rate \( r_{pa} \) to control the degree of adjustment. If it is too low, there is usually little change; if it is too high, the algorithm may not converge at all [17].
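As a minimal sketch (assuming x_old and bp are scalars), Eq. (2) translates directly into Python:

import random

def pitch_adjust(x_old, bp):
    """Eq. (2): shift the selected pitch by a random step within the bandwidth bp."""
    return x_old + bp * (2 * random.random() - 1)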

2.3 Randomization

The third component is randomization (3), which is used to increase the diversity of the solutions. Although pitch adjustment has a similar role, it is limited to local tone adjustment and therefore corresponds to a local search. The use of randomization can push the system further, to explore various regions with high solution diversity in order to find the global optimum [17]. So we have:

$$ P_{a} = P_{lower\,limit} + P_{range} * rand $$
(3)

where rand is a random number in the range [0, 1] and \( P_{range} = P_{upper \, limit} - P_{lower \, limit} \) is the range of the search space.
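A small sketch of Eq. (3), with p_lower and p_upper standing for the lower and upper limits of the search space:

import random

def random_pitch(p_lower, p_upper):
    """Eq. (3): draw a completely new value uniformly from the search range."""
    p_range = p_upper - p_lower  # width of the search space
    return p_lower + p_range * random.random()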

The three components in harmony search can be summarized in the pseudo code shown in Sect. 2.4, from which it can be seen that the probability of true randomization (4) is

$$ P_{random} = { 1 } - r_{accept} $$
(4)

And the actual probability of tone adjustment (5) is

$$ P_{pitch} = r_{accept} * r_{pa} $$
(5)

2.4 Pseudo Code for Harmony Search Algorithm

The pseudo code for HS is presented below:
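The listing below is a minimal Python sketch of the procedure, assuming minimization of a function f over a box-constrained continuous domain; the names hms (harmony memory size), r_accept, r_pa and bp follow the parameters defined above and are illustrative rather than the exact implementation used in the experiments:

import random

def harmony_search(f, dim, lower, upper, hms=10, r_accept=0.95,
                   r_pa=0.3, bp=0.1, iterations=10000):
    """Minimal Harmony Search sketch for minimizing f over [lower, upper]^dim."""
    # Step 1: initialize the harmony memory with random harmonies.
    hm = [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(hms)]
    fitness = [f(h) for h in hm]

    for _ in range(iterations):
        # Step 2: improvise a new harmony, one variable at a time.
        new = []
        for j in range(dim):
            if random.random() < r_accept:
                # Memory consideration: take the value from a stored harmony.
                value = random.choice(hm)[j]
                if random.random() < r_pa:
                    # Pitch adjustment, Eq. (2), kept inside the bounds.
                    value += bp * (2 * random.random() - 1)
                    value = min(max(value, lower), upper)
            else:
                # Randomization, Eq. (3).
                value = random.uniform(lower, upper)
            new.append(value)

        # Step 3: replace the worst harmony if the new one is better.
        worst = max(range(hms), key=lambda i: fitness[i])
        new_fit = f(new)
        if new_fit < fitness[worst]:
            hm[worst], fitness[worst] = new, new_fit

    best = min(range(hms), key=lambda i: fitness[i])
    return hm[best], fitness[best]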

2.5 Variants

There are three variants of the algorithm that have been applied to achieve better results; each of them is briefly explained below:

The improved harmony search algorithm (IHS)

To address the shortcomings of the HS, Mahdavi et al. [10] proposed a new variant of the HS, called the improved harmony search (IHS). The IHS dynamically updates \( r_{pa} \) according to the following equation,

$$ r_{pa} (t) = r_{pa,\min } + \frac{{r_{pa,\max } - r_{pa,\min } }}{NI}\,t $$
(6)

where \( r_{pa}(t) \) is the pitch adjusting rate for generation t, \( r_{pa,\min} \) is the minimum adjusting rate, \( r_{pa,\max} \) is the maximum adjusting rate, t is the generation number and NI is the total number of improvisations.

In addition, bp is dynamically updated as follows:

$$ bp(t) = bp_{\max } \exp \left( {\frac{{\ln (bp_{\min } /bp_{\max } )}}{NI}\,t} \right) $$
(7)

where \( bp(t) \) is the bandwidth for generation t, \( bp_{\min} \) is the minimum bandwidth and \( bp_{\max} \) is the maximum bandwidth.
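A small sketch of the two schedules, Eqs. (6) and (7); the default parameter values are only placeholders, since suitable ranges are problem dependent:

import math

def ihs_parameters(t, NI, rpa_min=0.01, rpa_max=0.99, bp_min=1e-4, bp_max=1.0):
    """IHS schedules: Eq. (6) increases the pitch-adjusting rate linearly with
    generation t, while Eq. (7) decreases the bandwidth exponentially
    (NI is the total number of improvisations)."""
    rpa = rpa_min + (rpa_max - rpa_min) / NI * t                  # Eq. (6)
    bp = bp_max * math.exp(math.log(bp_min / bp_max) / NI * t)    # Eq. (7)
    return rpa, bp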

A major drawback of the IHS is that the user needs to specify the values of \( bp_{\min} \) and \( bp_{\max} \), which are difficult to guess and problem dependent.

Global-best harmony search (GHS)

Inspired by the concept of swarm intelligence as used in Particle Swarm Optimization (PSO) [2], a new variation of HS was proposed in [9]. In a global-best PSO system, a swarm of individuals (called particles) flies through the search space, and each particle represents a candidate solution to the optimization problem.

The position of a particle is influenced by the best position visited by itself (i.e. its own experience) and the position of the best particle in the swarm (i.e. the experience of swarm).

The new approach, called global-best harmony search (GHS), modifies the pitch-adjustment step of the HS so that the new harmony can mimic the best harmony in the HM, thus replacing the bp parameter altogether and adding a social dimension to the HS. Intuitively, this modification allows the GHS to work efficiently on both continuous and discrete problems.

The GHS has exactly the same steps as the IHS with the exception that Step 3 is modified as follows:
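The modified step replaces the bandwidth-based pitch adjustment with a copy from the best harmony; the following Python sketch is a hedged reading of that description (the harmony memory hm, its fitness values and the bounds are assumed names), not the authors' exact listing:

import random

def ghs_improvise(hm, fitness, dim, lower, upper, r_accept, r_pa):
    """GHS improvisation sketch: pitch adjustment copies a component of the
    best harmony in the memory instead of adding a bp-sized random step."""
    best = hm[min(range(len(hm)), key=lambda i: fitness[i])]
    new = []
    for j in range(dim):
        if random.random() < r_accept:
            value = random.choice(hm)[j]              # memory consideration
            if random.random() < r_pa:
                # Mimic the best harmony at a randomly selected dimension,
                # so no bandwidth parameter bp is needed.
                value = best[random.randrange(dim)]
        else:
            value = random.uniform(lower, upper)      # randomization
        new.append(value)
    return new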

New global harmony search (NGHS)

The NGHS algorithm is an improved version of harmony search algorithm (HS), and it includes two important operations: position updating and genetic mutation with a low probability. The former can enhance the convergence of the NGHS, and the latter can effectively prevent the NGHS from trapping into the local optimum. Based on a large number of experiments, the NGHS has demonstrated high efficiency on solving chemical equation balancing. The results show that the NGHS can be an efficient alternative for solving chemical equation balancing [1].

2.6 Application Areas

The harmony search algorithm has been applied so far to various optimization problems, and its basic structure has been customized case by case. To cope with this, the harmony search (HS) algorithm uses a new stochastic derivative, based on the experience of musicians in jazz improvisation, which is also applicable to discrete variables. Instead of the slope (gradient) information of an objective function, the stochastic derivative of HS gives, for each value of a decision variable, the probability of being selected.

The HS algorithm has been applied to various problems in science and engineering optimization including:

Function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, vehicle routing, and others. The possibility of combining harmony search with other algorithms such as particle swarm optimization and genetic algorithms has also been investigated [17].

3 Genetic Algorithms

Genetic algorithms (GA) emulate genetic evolution. The characteristics of individuals are therefore expressed using genotypes. The original form of the GA, as introduced by John Holland in 1975, had three distinct features: (1) a bit string representation, (2) proportional selection, and (3) crossover as the primary method to produce new individuals. Since then, several variations of the original Holland GA have been developed, using different representation schemes, selection, crossover, mutation and elitism operators [2].

3.1 Representation

The classical representation scheme for GAs uses binary vectors of fixed length. In the case of an \( n_{x} \)-dimensional search space, each individual consists of \( n_{x} \) variables, each encoded as a bit string. If the variables have binary values, the length of each chromosome is \( n_{x} \) bits. In the case of nominal-valued variables, each nominal value can be encoded as an \( n_{d} \)-dimensional bit vector, where \( 2^{n_{d}} \) is the total number of discrete nominal values for that variable; each \( n_{d} \)-bit string represents a different nominal value. In the case of continuous-valued variables, each variable should be mapped to an \( n_{d} \)-dimensional bit vector,

$$ \varphi :\mathbb{R} \to \{ 0,1\}^{{n_{d} }} $$
(8)

The continuous space needs to be restricted to a finite range, \( [x_{\min}, x_{\max}] \). Using standard binary decoding, each continuous variable \( x_{ij} \) of chromosome \( x_{i} \) is encoded using a fixed-length bit string.
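A short sketch of this standard binary decoding (the 8-bit gene and the bounds in the example are arbitrary):

def decode(bits, x_min, x_max):
    """Map an n_d-bit string (list of 0/1) to a real value in [x_min, x_max]
    using the standard binary decoding."""
    n_d = len(bits)
    integer = int("".join(str(b) for b in bits), 2)   # bit string -> integer
    return x_min + (x_max - x_min) * integer / (2 ** n_d - 1)

# Example: an 8-bit gene decoded into the range [-100, 100].
print(decode([1, 0, 1, 1, 0, 0, 1, 0], -100, 100))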

GAs have also been developed that use integer or real valued representations and order based representations where the order of variables in a chromosome plays an important role. Also, it is not necessary that chromosomes be of fixed length [2].

3.2 Crossover Operations

Several crossover operators have been developed for GAs depending on the format in which individuals are represented. For binary representations, uniform crossover, one-point crossover and two-point crossover are the most popular:

  • Uniform Crossover, where corresponding bit positions are randomly exchanged between the two parents to produce two offspring.

  • One-Point Crossover, where a random bit position is selected, and the bit substrings after the selected bit are swapped between the two parents to produce two offspring.

  • Two-Point Crossover, where two bit positions are randomly selected and the bit substrings between the selected bit positions are swapped between the two parents.

For continuous valued genes, arithmetic crossover can be used:

$$ x_{ij} = r_{j} x_{1j} + (1.0 - r_{j} )x_{2j} $$
(9)

where \( r_{j} \sim U(0,1) \) and \( x_{i} \) is the offspring produced from parents \( x_{1} \) and \( x_{2} \) [2].
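As an illustration (not the exact operators used in the experiments), one-point crossover for bit strings and the arithmetic crossover of Eq. (9) can be sketched as:

import random

def one_point_crossover(parent1, parent2):
    """Swap the bit substrings after a randomly selected position."""
    point = random.randrange(1, len(parent1))
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def arithmetic_crossover(x1, x2):
    """Eq. (9): blend two continuous parents gene by gene with r_j ~ U(0, 1)."""
    offspring = []
    for a, b in zip(x1, x2):
        r = random.random()
        offspring.append(r * a + (1.0 - r) * b)
    return offspring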

3.3 Mutation

The mutation scheme used in a GA depends on the representation scheme. In the case of bit string representations, we have:

  • Random Mutation, which randomly negates bits.

  • In-Order Mutation, which performs random mutation between two randomly selected bit positions.

For discrete genes with more than two possible values, random mutation selects a random value from the finite domain of the gene. In the case of continuous-valued genes, a random value sampled from a Gaussian distribution with zero mean and small deviation is usually added to the current gene value. As an alternative, random noise can be sampled from a Cauchy distribution [2].
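The two mutation variants described above can be sketched as follows; the mutation probabilities and the deviation sigma are illustrative values:

import random

def random_mutation(bits, p_mut=0.01):
    """Random (bit-flip) mutation: negate each bit with a small probability."""
    return [1 - b if random.random() < p_mut else b for b in bits]

def gaussian_mutation(genes, sigma=0.1, p_mut=0.1):
    """Continuous-valued mutation: add zero-mean Gaussian noise with a small
    standard deviation to randomly selected genes."""
    return [g + random.gauss(0.0, sigma) if random.random() < p_mut else g
            for g in genes]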

4 Benchmark Mathematical Functions

This section lists a number of the classical benchmark functions used to validate optimization algorithms.

In the area of optimization, mathematical functions have been considered in the works mentioned below: a new gravitational search algorithm using fuzzy logic for parameter adaptation [14], differential evolution with dynamic adaptation of parameters for the optimization of fuzzy controllers [12], and a bat algorithm comparison with the genetic algorithm using benchmark functions [13].

To validate our method we used a set of 6 benchmark mathematical functions, namely Spherical, Rosenbrock, Rastrigin, Ackley, Zakharov and Sum Square; all functions were evaluated with 4, 5, 10, 20, 30 and 40 harmonies.

Figure 1 shows the plot corresponding to the Spherical function, and Eq. 10 gives its definition. Figure 2 shows the plot corresponding to the Rosenbrock function, and Eq. 11 gives its definition.

Fig. 1 Spherical function

Fig. 2 Rosenbrock function

The mathematical functions are shown below:

$$ f(x) = \sum\limits_{j = 1}^{{n_{x} }} {x_{j}^{2} } $$
(10)

With \( x_{j} \in [ - 100,100] \) and \( f^{*} (x) = 0.0 \)

$$ f(x) = \sum\limits_{j = 1}^{{n_{x} /2}} {[100(x_{2j} - x_{2j - 1}^{2} )^{2} + (1 - x_{2j - 1} )^{2} ]} $$
(11)

With \( x_{j} \in [ - 2.048,2.048] \) and \( f^{*} (x) = 0.0 \)

Figure 3 shows the plot corresponding to the Rastrigin function, and Eq. 12 gives its definition.

$$ f(x) = \sum\limits_{j = 1}^{{n_{x} }} {(x_{j}^{2} - 10\cos (2\pi x_{j} ) + 10)} $$
(12)

With \( x_{j} \in [ - 5.12,5.12] \) and \( f^{*} (x) = 0.0 \)

Fig. 3 Rastrigin function

Figure 4 shows the plot corresponding to the Ackley function, and Eq. 13 gives its definition.

$$ f(x) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{{n_{x} }}\sum\limits_{j = 1}^{{n_{x} }} {x_{j}^{2} } } } \right) - \exp \left( {\frac{1}{{n_{x} }}\sum\limits_{j = 1}^{{n_{x} }} {\cos (2\pi x_{j} )} } \right) + 20 + e $$
(13)

With \( x_{j} \in [ - 30,30] \) and \( f^{*} (x) = 0.0 \)

Fig. 4 Ackley function

Figure 5 shows the plot corresponding to the Zakharov function, and Eq. 14 gives its definition.

$$ f(x) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } + \left( {\sum\limits_{i = 1}^{n} {0.5ix_{i} } } \right)^{2} + \left( {\sum\limits_{i = 1}^{n} {0.5ix_{i} } } \right)^{4} $$
(14)

With \( x_{i} \in [ - 5,10] \) and \( f^{*} (x) = 0.0 \)

Fig. 5 Zakharov function

Figure 6 shows the plot corresponding to the Sum Square function, and Eq. 15 gives its definition.

$$ f(x) = \sum\limits_{i = 1}^{n} {ix_{i}^{2} } $$
(15)

With \( x_{i} \in [ - 2,2] \) and \( f^{*} (x) = 0.0 \)

Fig. 6 Sum square function
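For reference, Eqs. (10)-(15) can be transcribed directly into Python (a plain transcription for checking values, not the code used to produce the tables in Sect. 6):

import math

def spherical(x):        # Eq. (10), x_j in [-100, 100]
    return sum(xj ** 2 for xj in x)

def rosenbrock(x):       # Eq. (11), x_j in [-2.048, 2.048]; assumes an even number of variables
    return sum(100 * (x[2 * j + 1] - x[2 * j] ** 2) ** 2 + (1 - x[2 * j]) ** 2
               for j in range(len(x) // 2))

def rastrigin(x):        # Eq. (12), x_j in [-5.12, 5.12]
    return sum(xj ** 2 - 10 * math.cos(2 * math.pi * xj) + 10 for xj in x)

def ackley(x):           # Eq. (13), x_j in [-30, 30]
    n = len(x)
    return (-20 * math.exp(-0.2 * math.sqrt(sum(xj ** 2 for xj in x) / n))
            - math.exp(sum(math.cos(2 * math.pi * xj) for xj in x) / n)
            + 20 + math.e)

def zakharov(x):         # Eq. (14), x_i in [-5, 10]
    s = sum(0.5 * (i + 1) * xi for i, xi in enumerate(x))
    return sum(xi ** 2 for xi in x) + s ** 2 + s ** 4

def sum_square(x):       # Eq. (15), x_i in [-2, 2]
    return sum((i + 1) * xi ** 2 for i, xi in enumerate(x))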

5 Optimization Problems

The optimization problem can be defined as:

$$ {\text{Min }}y = \, f\left( x \right) $$
(16)

Basically the problems can be divided into unimodal and multimodal.

Another aspect is constraints. Some problems have no constraints. The best optimization method has to find the optimal value in all cases.

There exist many algorithms for problem solving. Some of them are specialized, some are more general. Many problems cannot be solved by deterministic algorithms, so heuristic algorithms are used. In the set of metaheuristic algorithms we can find PSO [3], GSA [14], DE [12] and others.

Unimodal functions

Unimodal functions have only one local optimum. These functions are relatively easy to analyze for optima, and they are used for checking the speed of optimization and convergence. Two functions are commonly used: the first is Schwefel's and the second is Rastrigin's.

Multimodal functions

Multimodal functions have multiple local optima. Some methods will get stuck in a local optimum. The main goal of these problems is to test how well solvers are able to avoid local optima. Some problems have no single global optimum, and some have one global optimum and many local optima that are very close in terms of the fitness function. The first is Sphere, the second is Sum and Product, and the third is Griewank's.

6 Simulation Results

In this section the Harmony Search algorithm is compared against the genetic algorithm [13]. For each algorithm the 6 benchmark mathematical functions were considered separately; a dimension of 10 variables was used, with 30 runs for each function, varying the parameters of the algorithms.

The parameters used in the HS were:

  • Harmony memory size: 4–40 harmonies.

  • Harmony memory accepting rate: 0.75–0.95.

  • Pitch adjustment: 0.1–0.5.

  • Pitch range: 200–400.

The parameters for the genetic algorithm are shown below [13]:

  • Number of Individuals: 4–40.

  • Selection: Stochastic, Remainder, Uniform, Roulette.

  • Crossover: Scattered, Single Point, Two Point, Heuristic and Arithmetic.

  • Mutation: Gaussian, Uniform.

6.1 Simulation Results with Harmony Search Algorithm

In this section we show the experimental results obtained by the Harmony Search algorithm in separate tables for the mathematical functions. Table 1 shows the simulation results for the Spherical function.

Table 1 Simulation results for the spherical function

From Table 1 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Spherical function. Table 2 shows the simulation results for the Rosenbrock function.

Table 2 Simulation results for the Rosenbrock function

From Table 2 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Rosenbrock function. Table 3 shows the simulation results for the Rastrigin function.

Table 3 Simulation results for the Rastrigin function

From Table 3 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Rastrigin function. Table 4 shows the simulation results for the Ackley function.

Table 4 Simulation results for the Ackley function

From Table 4 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Ackley function. Table 5 shows the simulation results for the Zakharov function.

Table 5 Simulation results for the Zakharov function

From Table 5 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Zakharov function. Table 6 shows the simulation results for the Sum Square function.

Table 6 Simulation results for the sum square function

From Table 6 it can be appreciated that, after executing the HS algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Sum Square function.

6.2 Simulation Results with the Genetic Algorithm

In this section we show the experimental results obtained by the genetic algorithm in separate tables for the mathematical functions [13]. Table 7 shows the simulation results for the Sphere function using the genetic algorithm.

Table 7 Simulation results for the sphere function

From Table 7 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Sphere function. Table 8 shows the simulation results for the Rosenbrock function using the genetic algorithm.

Table 8 Simulation results for the Rosenbrock function

From Table 8 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Rosenbrock function. Table 9 shows the simulation results for the Rastrigin function using the genetic algorithm.

Table 9 Simulation results for the Rastrigin function

From Table 9 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Rastrigin function. Table 10 shows the simulation results for the Ackley function using the genetic algorithm.

Table 10 Simulation results for the Ackley function

From Table 10 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Ackley function. Table 11 shows the simulation results for the Zakharov function using the genetic algorithm.

Table 11 Simulation results for the Zakharov function

From Table 11 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Zakharov function. Table 12 shows the simulation results for the Sum Square function using the genetic algorithm.

Table 12 Simulation results for the sum square function

From Table 12 it can be appreciated that, after executing the Genetic Algorithm for 30 runs with different parameters, we can find the best, average and worst results for the Sum Square function.

7 Conclusions

The HS algorithm is a new method that can solve various types of problems easily and effectively because it does not require many complex calculations. The HS can handle both discrete and continuous variables and can be applied to linear and nonlinear functions.

From the analysis of the results obtained with the genetic algorithm and harmony search, we conclude that the HS is better than the GA. This is demonstrated by the tables presented in the previous section, where HS obtains a lower error on all the benchmark functions to which it was applied; the same number of dimensions was used to perform the comparison.

The analysis of the simulation results of the HS and GA methods considered in this work leads us to the conclusion that, for the optimization of benchmark functions, the HS method is a good alternative because it is easier to tune and achieves better results than the GA.

As can be seen in each of the tables, the best values were obtained in the experiments with HS.

Based on this, we conclude that the HS algorithm performs better than the GA on these problems.