Abstract
Self-organizing migrating algorithm (SOMA) is a novel approach capable of solving almost all types of functions. SOMA is a highly effective evolutionary optimization technique and has proved its efficiency in many real-life applications. This paper presents a new optimization technique, M-NM-SOMA, for solving global optimization problems. In the proposed algorithm, SOMA is hybridized with the Nelder-Mead method as a crossover operator and with a non-uniform mutation operator in order to avoid premature convergence and preserve the diversity of the population. The main feature of this algorithm is that it works for very low population sizes. To validate the efficiency of the proposed algorithm, it is tested on 17 benchmark problems taken from the literature, and the obtained results are compared with those of other existing algorithms. Numerical and graphical results show that M-NM-SOMA has better global search ability and is very efficient, reliable, and accurate in comparison with the other algorithms.
Keywords
- Self-organizing migrating algorithm
- Nelder-Mead crossover operator
- Non-uniform mutation
- Particle swarm optimization
- Global optimization
1 Introduction
A broad class of population-based algorithms for solving global optimization problems has been developed to date. Among them are genetic algorithms (GA) [1], differential evolution (DE) [2], particle swarm optimization (PSO) [3], ant colony optimization (ACO) [4], and the self-organizing migrating algorithm (SOMA) [5]. Of these, SOMA is a comparative newcomer to the class of population-based stochastic search techniques and is capable of handling all types of functions. SOMA can be classified as an evolutionary algorithm, despite the fact that no evolution takes place in the classical sense, i.e., no new generations of individuals are created during the search; only the positions of the individuals in the search space change during a generation, called a ‘migration loop’. The main features of this algorithm are:
- (i) It works efficiently for very low population sizes.
- (ii) It converges quickly to global optimal solutions.
Despite these attractive features, SOMA may sometimes converge prematurely and the solution may get trapped in local optima, a situation that worsens as dimensionality increases and results in a loss of diversity in the population. To maintain diversity, SOMA can be hybridized with local search techniques or with other population-based techniques. Hybridization is a combination of two or more algorithms, in which one seeks a promising region of the large solution space expected to contain the global minimum, and the other exploits that region to find the best solution rapidly and more precisely. Several attempts have been made to hybridize population-based techniques with other existing approaches [6–12]. The first variant of SOMA, a hybridization of GA and SOMA, was developed by Deep and Dipti [13].
Recently, Dipti and Seema developed a number of variants of SOMA, named SOMAQI, SOMA-M, and M-SOMAQI [14–16]. In this paper, a novel variant of SOMA (M-NM-SOMA), based on the Nelder-Mead crossover operator and a non-uniform mutation operator, is proposed. The performance of M-NM-SOMA has been evaluated on a set of 17 benchmark problems and compared with standard PSO and SOMA.
The paper is structured as follows: Sect. 2 presents the preliminaries, Sect. 3 presents M-NM-SOMA, and Sect. 4 reports the experimental results. Finally, Sect. 5 concludes the paper with the outcome of the current study.
2 Preliminaries
2.1 Self-Organizing Migrating Algorithm
Zelinka and Lampinen [17] first introduced SOMA, which is inspired by the collective behavior of intelligent creatures. The algorithm proceeds in migration loops, and in each migration loop the active individual (the individual with the worst fitness value) travels a finite distance toward the leader (the individual with the best fitness value) in N moves of a defined length (the move size), where N is determined by the path length and the move size. This path is perturbed randomly by the perturbation parameter (PRT), defined in the range [0, 1]. The perturbation vector (PRTVector) controls the perturbation and is created before an individual proceeds toward the leader in the following manner:

$$PRTVector_{j} = \begin{cases} 1, & \text{if}\; rnd_{j} < PRT \\ 0, & \text{otherwise} \end{cases}\qquad j = 1, 2, \ldots, n$$

where \(rnd_{j}\) is a uniformly distributed random number in (0, 1) and n is the number of decision variables.
More information regarding SOMA can be obtained from [18].
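To make the migration mechanics concrete, the following sketch shows one SOMA migration of an active individual toward the leader, with the PRT vector drawn once before the individual sets off, as described above. It is a minimal illustration, not the authors' implementation; the function name, the cost-function interface, and the default parameter values are our own assumptions.

```python
import numpy as np

def soma_migration_step(active, leader, cost, prt=0.3, step=0.31, path_length=3.0):
    """One SOMA migration: the active individual moves toward the leader
    in sub-steps of size `step` up to `path_length`; the best visited
    position is kept."""
    n = len(active)
    best_pos, best_val = active.copy(), cost(active)
    # PRTVector_j = 1 if rnd_j < PRT, else 0 (drawn per dimension,
    # once before the individual proceeds toward the leader)
    prt_vector = (np.random.rand(n) < prt).astype(float)
    t = step
    while t <= path_length:
        candidate = active + (leader - active) * t * prt_vector
        val = cost(candidate)
        if val < best_val:
            best_pos, best_val = candidate, val
        t += step
    return best_pos, best_val
```

Because the returned position is the best among all visited positions (including the start), accepting it can never worsen the active individual's fitness.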
2.2 Nelder-Mead Crossover Operator
The Nelder-Mead simplex search method is a direct search method, originally proposed by Spendley et al. and later modified by Nelder and Mead [19]. First, a population is initialized and a simplex is created using (n + 1) points (n being the number of variables of the function) chosen arbitrarily from the population. In each iteration, the worst point in the simplex is selected first. Then, a new simplex is formed from the old one through a sequence of elementary geometric transformations (reflection, expansion, contraction), after each of which the current worst point is replaced by a better one. In the proposed algorithm, the Nelder-Mead simplex search method is used as a linear Nelder-Mead crossover operator that creates a new point using two out of three randomly chosen points from the population.
The computational steps of the NM crossover operator are as follows:

- Step 1: choose parameters γ > 1 and β > 0;
- Step 2: create an initial simplex with three randomly chosen vertices; find \(x_{h}\) (the worst point), \(x_{l}\) (the best point), and \(x_{g}\) (the next-to-worst point), and calculate their function values \(f_{h}\), \(f_{l}\), and \(f_{g}\). The worst point \(x_{h}\) is reflected with respect to the centroid \(x_{c}\) of the other two points:

$$\begin{aligned} & x_{r} = 2x_{c} - x_{h} \quad (\text{reflection}) \\ & \text{if}\; f_{r} < f_{l}{:}\quad x_{\text{new}} = (1 + \gamma)x_{c} - \gamma x_{h} \quad (\text{expansion}) \\ & \text{else if}\; f_{r} \ge f_{h}{:}\quad x_{\text{new}} = (1 - \beta)x_{c} + \beta x_{h} \quad (\text{contraction}) \\ & \text{else if}\; f_{g} < f_{r} < f_{h}{:}\quad x_{\text{new}} = (1 + \beta)x_{c} - \beta x_{h} \quad (\text{contraction}) \end{aligned}$$
(1)

Calculate \(f_{\text{new}}\) and replace \(x_{h}\) by \(x_{\text{new}}\).
- Step 3: the process continues until the termination criterion is satisfied.
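The steps above can be sketched as follows. This is an illustrative reading of Eq. (1), not the authors' code: the default values γ = 2 and β = 0.5, and the fallback of accepting the reflected point when none of the three listed conditions applies, are our assumptions.

```python
import numpy as np

def nm_crossover(p1, p2, p3, cost, gamma=2.0, beta=0.5):
    """Linear Nelder-Mead crossover (Eq. 1): reflect the worst of three
    points through the centroid of the other two, then expand or
    contract depending on the reflected function value."""
    x_l, x_g, x_h = sorted([p1, p2, p3], key=cost)  # best, next-to-worst, worst
    f_l, f_g, f_h = cost(x_l), cost(x_g), cost(x_h)
    x_c = (x_l + x_g) / 2.0          # centroid of the two better points
    x_r = 2.0 * x_c - x_h            # reflection
    f_r = cost(x_r)
    if f_r < f_l:
        return (1.0 + gamma) * x_c - gamma * x_h   # expansion
    elif f_r >= f_h:
        return (1.0 - beta) * x_c + beta * x_h     # contraction toward x_h
    elif f_r > f_g:
        return (1.0 + beta) * x_c - beta * x_h     # contraction past the centroid
    return x_r                       # otherwise accept the reflected point
```

For example, on the sphere function with points (1, 1), (2, 2), (3, 3), the reflection of (3, 3) through the centroid (1.5, 1.5) is the origin, which beats the best point, so the operator expands past it.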
2.3 Non-uniform Mutation Operator
The non-uniform mutation operator was proposed by Michalewicz [20] to overcome the weakness of random mutation in real-coded GAs. Non-uniform mutation randomly selects one solution \(x_{k}\) from the population and creates its new value according to the following rule:

$$x_{k}^{\prime} = \begin{cases} x_{k} + T\,(ub_{k} - x_{k}), & \text{if}\; \gamma < 0.5 \\ x_{k} - T\,(x_{k} - lb_{k}), & \text{otherwise} \end{cases}$$
(2)

where \(T = \left( {\mu \left( {1 - \frac{t}{{t_{\hbox{max} } }}} \right)} \right)^{b}\), γ and μ are two uniformly distributed random numbers in the interval [0, 1], \(lb_{k}\) and \(ub_{k}\) are the lower and upper bounds of \(x_{k}\), b > 0 is a parameter determining the degree of non-uniformity, t is the migration number, and \(t_{\max}\) is the maximum number of migrations allowed. Non-uniform mutation has fine-tuning capabilities that help achieve high precision.
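A minimal sketch of this mutation rule follows. Only the form of T and the role of the bounds come from the text; the choice of mutating a single indicated component k, the use of γ < 0.5 to pick the direction, and the default b = 5 are our assumptions.

```python
import numpy as np

def non_uniform_mutation(x, k, lb, ub, t, t_max, b=5.0):
    """Non-uniform mutation of component k of solution x (Eq. 2).
    T shrinks as migration t approaches t_max, so early migrations
    explore widely while later ones fine-tune."""
    mu, gamma = np.random.rand(), np.random.rand()
    T = (mu * (1.0 - t / t_max)) ** b
    y = np.array(x, dtype=float)
    if gamma < 0.5:
        y[k] = y[k] + T * (ub[k] - y[k])   # push toward the upper bound
    else:
        y[k] = y[k] - T * (y[k] - lb[k])   # push toward the lower bound
    return y
```

Since 0 ≤ T ≤ 1, the mutated component always stays inside [lb_k, ub_k], and at t = t_max the perturbation vanishes entirely.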
3 Proposed Hybrid M-NM-SOMA Algorithm
In this section, a variant of SOMA, M-NM-SOMA, is proposed; it hybridizes SOMA with the Nelder-Mead crossover operator and the non-uniform mutation operator. The convergence of standard SOMA is so fast that all other individuals move close to the best individual very quickly. This causes the population diversity to decrease and leads to premature convergence. To overcome these problems, SOMA is hybridized with the Nelder-Mead crossover operator and the non-uniform mutation operator so as to maintain diversity among the solutions in the search space.
3.1 Methodology of Hybridization
First, the population is initialized randomly over the search domain. In each migration, the individual with the best fitness value is selected as the leader and the one with the worst fitness value as the active individual. The active individual then travels a finite distance toward the leader in N moves of defined length. Among the positions created, the best one is selected and replaces the active individual if it is better. Next, the leader and active individuals are selected again from the population, and a new point is created with the Nelder-Mead crossover operator using Eq. (1); this new point replaces the active individual only if it is better. Then the leader and active individuals are selected once more, and a new point is created by non-uniform mutation using Eq. (2); it likewise replaces the active individual only if it is better. The process continues until a termination criterion is satisfied.
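Putting the three ingredients together, one possible reading of the whole loop is sketched below. This is a hedged reconstruction, not the authors' implementation: the clipping to bounds, the parameter defaults, and the exact way the three points for the NM crossover are drawn are our assumptions.

```python
import numpy as np

def m_nm_soma(cost, lb, ub, pop_size=10, migrations=200, prt=0.3,
              step=0.31, path_length=3.0, gamma=2.0, beta=0.5, b=5.0):
    """One M-NM-SOMA migration loop: SOMA step, then NM crossover
    (Eq. 1), then non-uniform mutation (Eq. 2), each accepted only if
    it improves on the current active (worst) individual."""
    n = len(lb)
    pop = lb + np.random.rand(pop_size, n) * (ub - lb)
    fit = np.array([cost(x) for x in pop])
    clip = lambda x: np.clip(x, lb, ub)
    for t in range(1, migrations + 1):
        # 1) SOMA step: active (worst) travels toward leader (best)
        a, l = fit.argmax(), fit.argmin()
        best_pos, best_val = pop[a].copy(), fit[a]
        prt_vec = (np.random.rand(n) < prt).astype(float)
        s = step
        while s <= path_length:
            cand = clip(pop[a] + (pop[l] - pop[a]) * s * prt_vec)
            cv = cost(cand)
            if cv < best_val:
                best_pos, best_val = cand, cv
            s += step
        pop[a], fit[a] = best_pos, best_val
        # 2) Nelder-Mead crossover (Eq. 1) on three randomly chosen points
        a = fit.argmax()
        i, j, k = np.random.choice(pop_size, 3, replace=False)
        x_l, x_g, x_h = sorted([pop[i], pop[j], pop[k]], key=cost)
        f_l, f_g, f_h = cost(x_l), cost(x_g), cost(x_h)
        x_c = (x_l + x_g) / 2.0
        f_r = cost(clip(2.0 * x_c - x_h))                     # reflection
        if f_r < f_l:
            x_new = clip((1.0 + gamma) * x_c - gamma * x_h)   # expansion
        elif f_r >= f_h:
            x_new = clip((1.0 - beta) * x_c + beta * x_h)     # contraction
        else:
            x_new = clip((1.0 + beta) * x_c - beta * x_h)     # contraction
        if cost(x_new) < fit[a]:
            pop[a], fit[a] = x_new, cost(x_new)
        # 3) Non-uniform mutation (Eq. 2) of the active individual
        a = fit.argmax()
        d = np.random.randint(n)
        T = (np.random.rand() * (1.0 - t / migrations)) ** b
        y = pop[a].copy()
        if np.random.rand() < 0.5:
            y[d] += T * (ub[d] - y[d])
        else:
            y[d] -= T * (y[d] - lb[d])
        if cost(y) < fit[a]:
            pop[a], fit[a] = y, cost(y)
    best = fit.argmin()
    return pop[best], fit[best]
```

Because every candidate is accepted only when it improves on the active individual, the best fitness in the population is non-increasing across migrations.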
4 Experimental Results
The presented algorithm, M-NM-SOMA, is programmed in C++ and executed on a Pentium III PC. M-NM-SOMA is used to solve 17 benchmark problems taken from the literature. All are minimization problems with minimum value 0. The seventeen problems, together with their initialization ranges, are given in Table 1. M-NM-SOMA is a probabilistic technique and relies heavily on the generation of random numbers; therefore, 30 trials of each problem are carried out. A run is considered a success if the solution obtained is within 1 % of the desired accuracy. The algorithm terminates either when a run is a success or when a preset number of migrations (10,000) has been performed.
In order to compare M-NM-SOMA with SOMA and standard PSO, several performance measures are considered: the mean objective function value (to assess efficiency and reliability), the average number of function evaluations (to assess convergence speed), and the success rate.
The main parameters of M-NM-SOMA are the population size, PRT, move size, and path length. The population size is set to ten for all problems. The PRT parameter varies from 0.1 to 0.9 depending on the problem. The other parameters, move size and path length, are set to 0.31 and 3, respectively. Trials for the 17 problems are performed for dimensions (dim) n = 30, 50, and 100.
Table 2 shows the number of successful runs out of 30, for M-NM-SOMA, PSO, and SOMA. The results show that M-NM-SOMA is best on all 17 problems for dim 30 and 100, and best on 16 problems for dim 50.
Table 3 shows the average number of function evaluations for M-NM-SOMA, PSO, and SOMA. The results show that M-NM-SOMA is best on 16 problems for all three dimensions, 30, 50, and 100. Hence, on the basis of these results, we can say that M-NM-SOMA shows better convergence speed.
Table 4 shows the mean objective function value for M-NM-SOMA, PSO, and SOMA. The results show that M-NM-SOMA is best on all 17 problems for dim 30 and best on 16 problems for dim 50 and 100. Hence, M-NM-SOMA is the most reliable and efficient. Problems that could not be solved by a particular algorithm are marked with the symbol (*) at the corresponding entries. The best results are highlighted in bold.
Figures 1, 2 and 3 show the mean best objective function value curves for selected benchmark problems; the figures make clear that M-NM-SOMA converges very fast. Hence, the presented algorithm demonstrates its superiority over PSO and SOMA.
5 Conclusions
In this paper, a new variant of SOMA, M-NM-SOMA, has been proposed. The proposed algorithm is evaluated on 17 unconstrained benchmark problems, and the obtained results are compared with those of standard PSO and SOMA. A population size of only 10 has been used to evaluate the performance of M-NM-SOMA. On the basis of the results obtained, it can be concluded that the proposed algorithm outperforms PSO and SOMA in terms of population size, efficiency, reliability, and accuracy.
References
Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, Reading (1989)
Storn, R., Price, K.: Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11, 341–359 (1997)
Kennedy, J., Eberhart, R.C.: Particle swarm optimization. In: Proceedings of IEEE International Conference on Neural Networks, IEEE Service Center, Piscataway, pp. 1942–1948 (1995)
Dorigo, M., Stützle, T.: Ant Colony Optimization. MIT Press, Cambridge (2004)
Zelinka, I., Lampinen, J.: SOMA—self organizing migrating algorithm. In: Proceedings of the 6th International Mendel Conference on Soft Computing, Brno, Czech Republic, pp. 177–187 (2000)
Khosravi, A., Lari, A., Addeh, J.: A new hybrid of evolutionary and conventional optimization algorithm. Appl. Math. Sci. 6, 815–825 (2012)
Pant, M., Thangaraj, R., Abraham, A.: New mutation schemes for differential evolution algorithm and their application to the optimization of directional over-current relay settings. Appl. Math. Comput. 216, 532–544 (2010) (Elsevier)
Deep, K., Thakur, M.: A new mutation operator for real coded genetic algorithms. Appl. Math. Comput. 193, 229–247 (2007)
Deep, K., Bansal, J.C.: Hybridization of particle swarm optimization with quadratic approximation. Opsearch 46, 3–24 (2009) (Springer)
Xing, L.N., Chen, Y.W., Yang, K.W.: A novel mutation operator based on immunity operation. Eur. J. Oper. Res. 197, 830–833 (2009)
Deep, K., Das, K.N.: Performance improvement of real coded genetic algorithm with quadratic approximation based hybridization. Int. J. Intell. Defence Support Syst. 2, 319–334 (2010) (Inderscience Publishers)
Esmin, A.A.A., Matwin, S.: A hybrid particle swarm optimization algorithm with genetic mutation. Int. J. Innovative Comput. Inf. Control 9, 1919–1934 (2013)
Deep, K., Dipti: A new hybrid self organizing migrating genetic algorithm for function optimization. In: IEEE Congress on Evolutionary Computation, pp. 2796–2803 (2007)
Singh, D., Agrawal, S., Singh, N.: A novel variant of self organizing migrating algorithm for function optimization. In: Proceedings of the 3rd International Conference on Soft Computing for Problem Solving. Advances in Intelligent Systems and Computing, vol. 258, pp. 225–234. Springer (2013)
Singh, D., Agrawal, S.: Hybridization of self organizing migrating algorithm with mutation for global optimization. In: Proceedings of the International Conference on Mathematical Sciences (ICMS), Elsevier, pp. 605–609 (2014)
Singh, D., Agrawal, S.: Hybridization of self organizing migrating algorithm with quadratic approximation and non uniform mutation for function optimization. In: Proceedings of the 4th International Conference on Soft Computing for Problem Solving. Advances in Intelligent Systems and Computing, vol. 335, pp. 373–387. Springer (2014)
Zelinka, I., Lampinen, J.: SOMA—self organizing migrating algorithm. In: Proceedings of the 6th International Mendel Conference on Soft Computing, Brno, Czech Republic, pp. 177–187 (2000)
Zelinka, I.: SOMA—self organizing migrating algorithm. In: Onwubolu, G.C., Babu, B.V. (eds.) New Optimization Techniques in Engineering. Springer, Berlin (2004)
Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7, 308–313 (1965)
Michalewicz, Z.: Genetic Algorithms + Data Structures = Evolution Programs, 3rd edn. Springer, Berlin (1996)
© 2016 Springer Science+Business Media Singapore
Singh, D., Agrawal, S.: Nelder-Mead and non-uniform based self-organizing migrating algorithm. In: Pant, M., Deep, K., Bansal, J., Nagar, A., Das, K. (eds.) Proceedings of Fifth International Conference on Soft Computing for Problem Solving. Advances in Intelligent Systems and Computing, vol. 436. Springer, Singapore (2016). https://doi.org/10.1007/978-981-10-0448-3_66
Print ISBN: 978-981-10-0447-6
Online ISBN: 978-981-10-0448-3