Abstract
This study proposes a novel chaotic cuckoo search (CCS) optimization method that incorporates chaos theory into the cuckoo search (CS) algorithm. In CCS, chaotic characteristics are combined with CS with the intention of further enhancing its performance, and an elitism scheme is incorporated to preserve the best cuckoos. Twelve chaotic maps are applied to tune the step size of the cuckoos used in the original CS method. Twenty-seven benchmark functions and an engineering case are utilized to investigate the efficiency of CCS. The results clearly demonstrate that CCS with a suitable chaotic map performs comparably to, and often better than, the basic CS and other metaheuristic algorithms.
1 Introduction
Inspired by the laws of nature, modern metaheuristic algorithms (Gandomi et al. 2013a; Yang et al. 2013) are widely applied to myriad complex optimization problems (Yang 2010b), including feature selection (Li and Yin 2013a), image segmentation (Zhang et al. 2011), flow shop scheduling (Li and Yin 2013b), reliability problems (Zou et al. 2010, 2011) and the knapsack problem (Zou et al. 2011). These metaheuristic algorithms are well capable of finding good solutions by extracting useful information from a group of individuals. Since genetic algorithms (GAs) (Goldberg 1998) were put forward during the 1950s and 1960s, many metaheuristic algorithms have been proposed, such as the artificial plant optimization algorithm (APOA) (Cai et al. 2012), ant colony optimization (ACO) (Zhang and Feng 2012; Zhang et al. 2014; Dorigo et al. 1996), differential evolution (DE) (Storn and Price 1997; Li and Yin 2012), the bat algorithm (BA) (Gandomi et al. 2013b; Yang and Gandomi 2012; Yang 2010a; Mirjalili et al. 2013), artificial physics optimization (Xie et al. 2012), biogeography-based optimization (BBO) (Simon 2008; Wang et al. 2013a; Mirjalili et al. 2014a), krill herd (KH) (Gandomi and Alavi 2012; Wang et al. 2014a), harmony search (HS) (Geem et al. 2001; Wang et al. 2014b), monarch butterfly optimization (MBO) (Wang et al. 2015), the flower pollination algorithm (FPA) (Yang et al. 2014), animal migration optimization (AMO) (Li et al. 2014), particle swarm optimization (PSO) (Kennedy and Eberhart 1995; Mirjalili and Lewis 2013; Talatahari et al. 2013; Zhao et al. 2014a, b), the grey wolf optimizer (GWO) (Mirjalili et al. 2014b) and the firefly algorithm (FA) (Gandomi et al. 2011; Yang et al. 2012; Wang et al. 2014c).
Recently, Yang and Deb (2010) proposed a new metaheuristic optimization algorithm called the CS method. CS is inspired by the brood parasitism of cuckoos, which lay their eggs in the nests of other birds. In the single-objective case, the numbers of eggs, cuckoos and nests in CS are equal, and each represents a candidate solution. CS is well capable of finding the best solutions by continuously replacing not-so-good cuckoos in the population with new and potentially better solutions. Conceptually extremely simple, CS is very easy to implement.
In most cases, CS performs local search well, but sometimes it cannot escape from local optima, which restricts its ability to search the whole space globally. To overcome this and enhance the searching ability of the CS method, a strategy that adjusts the parameters self-adaptively has been proposed (Li and Yin 2015).
On the other hand, scholars have paid increasing attention to nonlinear dynamics, especially chaos, and chaos theory has recently found applications in the area of metaheuristics. To date, chaotic sequences have been combined with several metaheuristic algorithms, such as the imperialist competitive algorithm (Talatahari et al. 2012), FA (Gandomi et al. 2013c), charged system search (Nouhi et al. 2013), chaotic swarming of particles (CSP) (Kaveh et al. 2014), BA (Gandomi and Yang 2014), the KH algorithm (Wang et al. 2014d), a memetic DE algorithm (Jia et al. 2011) and PSO (Gandomi et al. 2013d).
The present manuscript introduces a chaotic CS-based method intended to accelerate convergence. In this family of chaotic CS-based algorithms, various one-dimensional chaotic maps are used in place of the fixed step size of CS; that is, chaotic maps serve as efficient alternatives to pseudorandom sequences. The chaotic CS-based methods are tested on 27 benchmark functions and an engineering case. Performance comparisons with other approaches demonstrate the superiority of the proposed strategy: the simulations show that CCS performs more efficiently and accurately than the basic CS and other metaheuristic methods. The improvement of the new methods stems from substituting deterministic chaotic signals for the original fixed step size.
The paper is organized as follows: Sect. 2 describes the basic CS algorithm and 12 chaotic maps. The proposed CCS approach is described in Sect. 3. Subsequently, the tuning of the step size \(\alpha \) and the selection of the best chaotic CS are discussed in Sect. 4, where the CCS algorithm is also evaluated on 27 functions and an engineering case against other methods. Finally, Sect. 5 provides a summary of our work.
2 Preliminary
First and foremost, we provide brief preliminaries on the CS algorithm and the 12 chaotic maps.
2.1 The CS algorithm
CS is a swarm intelligence optimization method put forward by idealizing and simplifying the brood behavior of some cuckoo species and combining it with Levy flights.
In order to describe the CS method more easily, Yang and Deb proposed the following three idealized rules:

1. each cuckoo lays one egg at a time and places it in a randomly chosen nest;
2. the optimal nests cannot be corrupted;
3. the number of host nests is fixed, and an egg can be discovered by the host bird with a fixed probability \(p_\mathrm{a} \in [0,1]\).
Based on these three rules, the pseudo-code of CS can be represented as shown in Fig. 1.
For cuckoo \(i\), a new cuckoo \(x_i^{(t+1)}\) is generated by performing a Levy flight:

\(x_i^{(t+1)} =x_i^{(t)} +\alpha \oplus \hbox {Levy}(\lambda )\)   (1)

where \(\alpha >0\) is the step size; in most cases, \(\alpha \) is simply set to 1. The product \(\oplus \) denotes entrywise multiplication.
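As an illustration, the Levy-flight update of Eq. (1) can be sketched as follows. This is not the authors' code: the step is drawn with Mantegna's algorithm, a common way to generate Levy-distributed increments with index \(\beta =1.5\).

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """Draw a dim-dimensional Levy-distributed step via Mantegna's algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_update(x, alpha=1.0, rng=None):
    """Candidate cuckoo x^(t+1) = x^(t) + alpha (entrywise) Levy(lambda)."""
    return x + alpha * levy_step(x.size, rng=rng)
```

The heavy-tailed steps produce occasional long jumps, which is what gives CS its global exploration ability.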
2.2 Chaotic maps
In the optimization field, the chaotic optimization algorithm (COA) is a class of stochastic methods that use chaotic variables. In COA, because chaos is non-repetitive and ergodic, a full search can be carried out at higher speeds than stochastic searches that rely on probabilities. To this end, 12 one-dimensional non-invertible maps are applied to generate chaotic sets (see Table 1). Note that M1, M2, \(\ldots \), M12 in Tables 3 and 4 are short for the 12 corresponding chaotic maps. More details about the COA method and the 12 chaotic maps can be found in Wang et al. (2013b).
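As an illustration, three commonly used one-dimensional maps can be iterated as below. The parameter values (\(a=4\) for the Logistic and Sine maps, \(a=2.3\) for the Sinusoidal map) are typical choices assumed here; Table 1 lists the exact forms used in this work.

```python
import numpy as np

def logistic(x):
    return 4.0 * x * (1.0 - x)              # Logistic map with a = 4

def sine(x):
    return np.sin(np.pi * x)                # Sine map (a/4)*sin(pi*x) with a = 4

def sinusoidal(x):
    return 2.3 * x * x * np.sin(np.pi * x)  # Sinusoidal map with a = 2.3

def chaotic_sequence(f, x0, n):
    """Iterate map f from seed x0 and collect n successive values."""
    seq = np.empty(n)
    x = x0
    for k in range(n):
        x = f(x)
        seq[k] = x
    return seq
```

Each map is deterministic given its seed, yet the resulting sequence visits the interval ergodically, which is the property COA exploits.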
3 Chaotic CS
As presented in Eq. (1), the main parameters of CS are the step size \(\alpha \) and the discovery rate \(p_{\mathrm{a}}\); their values have a great influence on the convergence speed and overall performance of CS.
The basic CS is well capable of finding good solutions, but the solutions still oscillate slightly around the optima. Following the previous literature (Yang 2010b), we used \({\alpha }= O(L/10)\) and \(p_{\mathrm{a}}= 0.25\) for the standard CS, where \(L\) is the characteristic scale of the problem of interest.
The step size used in the basic CS remains fixed throughout the search. An improved CS with a chaotically varying step size \(\alpha \) may outperform the basic one and accelerate its convergence. All chaotic maps are normalized so that their outputs always lie in [0, 2]. After normalization, the chaotic maps are used to tune the step size \(\alpha \), and this improved method is referred to as chaotic CS (CCS).
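A minimal sketch of such a chaotic step-size schedule, assuming the Sinusoidal map with parameter \(a=2.3\) and a simple min-max rescaling of the generated sequence into [0, 2]:

```python
import numpy as np

def sinusoidal_alpha_schedule(n_gen, x0=0.7, a=2.3):
    """One step-size value per generation: iterate the Sinusoidal map,
    then min-max rescale the whole sequence into [0, 2]."""
    seq = np.empty(n_gen)
    x = x0
    for t in range(n_gen):
        x = a * x * x * np.sin(np.pi * x)
        seq[t] = x
    return 2.0 * (seq - seq.min()) / (seq.max() - seq.min())
```

At generation \(t\), the CS update simply uses `alpha = schedule[t]` instead of the constant \(\alpha =1\).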
The pseudo-code of the CCS algorithm is shown in Fig. 2.
In addition, an elitism strategy is introduced into CCS to protect the best cuckoos from being corrupted by the cuckoo updating operator: the best cuckoos are saved before updating, so even if the updating operation destroys a corresponding cuckoo, the best cuckoos can be restored afterwards.
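One possible realization of this elitism strategy is sketched below; the choice of overwriting the worst post-update individuals is illustrative, not necessarily the authors' exact procedure.

```python
import numpy as np

def apply_elitism(pop, fit, new_pop, new_fit, n_elite=2):
    """Restore the n_elite best pre-update cuckoos by overwriting the
    worst post-update cuckoos (minimization: lower fitness is better)."""
    elite = np.argsort(fit)[:n_elite]            # best cuckoos before updating
    worst = np.argsort(new_fit)[::-1][:n_elite]  # worst cuckoos after updating
    for e, w in zip(elite, worst):
        if fit[e] < new_fit[w]:                  # revert only if it helps
            new_pop[w] = pop[e]
            new_fit[w] = fit[e]
    return new_pop, new_fit
```

This guarantees the best-so-far solution is monotonically non-worsening across generations.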
4 Simulation experiments
In order to illustrate the benefits of the CCS method, it is tested through an array of experiments on benchmark functions and an engineering case. To obtain fair results, all implementations were run under the same conditions as in Wang et al. (2014e) and Guo et al. (2014). The benchmark functions shown in Table 2 are standard test functions, and their detailed properties can be found in Wang et al. (2014f, g). In the following experiments, the best value for each function is shown in bold. The dimension of each function is 20 in the present work.
4.1 CCS with different chaotic maps
Different chaotic CS variants were benchmarked on 14 well-known high-dimensional numerical examples (see Table 2). In this subsection, the step size \(\alpha \) is tuned by replacing it with each of the 12 chaotic maps (see Sect. 2.2).
For the CCS algorithm, we set the population size, elitism parameter and maximum generation to 50, 2 and 50, respectively. 1000 independent runs were carried out to obtain representative performance. The results are presented in Tables 3 and 4.
From Tables 3 and 4, it can be seen that CCS performs more effectively with M9 (Sine map) and M11 (Sinusoidal map) than with the other maps. Of these two, M11 significantly outperforms M9 on most benchmarks on average. Further, Table 4 shows that, over multiple runs, M11 yields results that differ only slightly from the optimal values. We therefore select M11 as the final map for chaotic CS (CCS). In addition, the numerical simulations indicate that choosing an appropriate chaotic map is crucial to exploiting the full advantage of chaotic CS: M11 provides more useful information to guide the search of CS than the other chaotic maps do.
4.2 General performance of CCS
In order to investigate the benefits of CCS, it was compared with nine other metaheuristic algorithms: ACO (Dorigo and Stutzle 2004), CS (Yang and Deb 2010), DE (Storn and Price 1997), ES (Beyer 2001), GA (Goldberg 1998), HS (Geem et al. 2001), KH (Gandomi and Alavi 2012), PBIL (Shumeet 1994) and PSO (Kennedy and Eberhart 1995).
In the following experiments, CS and CCS use the same discovery rate \(p_{\mathrm{a}}=0.25\); the population size, elitism parameter and maximum generation are the same as in Sect. 4.1. The parameters of the other methods are set as follows (Wang et al. 2014g). For ACO: initial pheromone value \(\tau _{0}=1\mathrm{E}{-}6\), pheromone update constant \(Q=20\), exploration constant \(q_{0}=1\), global pheromone decay rate \(\rho _{\mathrm{g}}=0.9\), local pheromone decay rate \(\rho _{\mathrm{l}}=0.5\), pheromone sensitivity \(\alpha =1\) and visibility sensitivity \(\beta =5\). For DE: weighting factor \(F=0.5\) and crossover constant CR \(=0.5\). For ES: \(\lambda =10\) offspring produced in each generation and standard deviation \(\sigma =1\) for changing solutions. For GA: roulette wheel selection, single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For HS: HM accepting rate \(=0.75\) and pitch adjusting rate \(=0.7\). For KH: foraging speed \(V_{\mathrm{f}}=0.02\), maximum diffusion speed \(D^{\mathrm{max}}=0.005\) and maximum induced speed \(N^{\mathrm{max}}=0.01\). For PBIL: a learning rate of 0.05, 1 good and 0 bad population members used to update the probability vector each generation, an elitism parameter of 1 and a probability-vector mutation rate of 0. For PSO: inertial constant \(=0.3\), cognitive constant \(=1\) and social constant for swarm interaction \(=1\).
It is well known that metaheuristic methods are generally based on random distributions. In order to remove the influence of randomness, 1000 independent runs were carried out for each method. Tables 5, 6 and 7 present the simulation results.
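The run-level statistics can be collected along the following lines; `summarize_runs` and `run_optimizer` are hypothetical names introduced here for illustration, with each run given its own independent random stream.

```python
import numpy as np

def summarize_runs(run_optimizer, n_runs=1000, seed=0):
    """Mean, best, worst and Std of the final objective value over
    n_runs independent runs, each seeded from a master generator."""
    master = np.random.default_rng(seed)
    finals = np.array([run_optimizer(np.random.default_rng(master.integers(2**32)))
                       for _ in range(n_runs)])
    return finals.mean(), finals.min(), finals.max(), finals.std()
```

The mean, best and worst values are the quantities tabulated in Tables 5, 6 and 7, respectively.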
From Table 5, we see that, on average, CCS finds the best values on all benchmarks except F03, F04, F07, F10, F19 and F27; ACO performs best on functions F07, F19, F24 and F27. Table 6 shows that CCS performs best except on F03, F04, F06, F07, F09, F10, F12, F19 and F27; GA finds the best values on F04, F06, F09 and F19, and ACO on F07, F12 and F27. Moreover, for the worst values shown in Table 7, CCS finds the best values on 23 of the 27 benchmarks (F01, F03–F06, F08–F18 and F20–F26), while ACO performs best on functions F01, F02, F07, F10, F19 and F27. Evidently, replacing the step size \(\alpha \) with chaotic maps can definitely improve the performance of CS.
Moreover, a careful look at Tables 5, 6 and 7 shows that incorporating chaotic maps into the CS model leads to a significant increase in performance: when a chaotic map is used, CS outperforms the other methods. This verifies the effectiveness of the proposed CCS algorithm in solving global numerical optimization problems.
Further, to demonstrate the superiority of CCS more clearly, several representative convergence curves for CCS and CS are shown in Figs. 3, 4, 5, 6 and 7; the values plotted are the averaged objective function optima.
From Fig. 3, we can conclude that CCS is significantly superior to the CS method, especially on F01.
Figure 4 illustrates the function values for functions F07–F10. In this case, the figure clearly shows that CCS performs significantly better than the CS method.
From Fig. 5, CCS is significantly superior to the CS algorithm, although both have similar performance on F11 at the end of the optimization.
Figure 6 shows the results for functions F18–F20 and F22; apparently, CCS outperforms the CS method in these examples.
Figure 7 shows the performance achieved for functions F23–F25, F05 and F27. It can be seen that CCS converges faster than the CS method.
From the above analyses of Figs. 3, 4, 5, 6 and 7, we conclude that substituting an appropriate chaotic map for the fixed step size yields superior performance on global optimization problems.
4.3 Sensor selection problem
Besides the standard functions discussed above, an engineering optimization problem, the sensor selection problem, is also used to validate the CCS method.
The Modular Aero Propulsion System Simulation (MAPSS) (Simon 2008) is used as the engine simulation in the sensor selection problem. The model of the turbofan engine can be represented by the discretized time-invariant equations

\(x(k+1)=f(x(k),u(k),p(k))+w_x (k)\)

\(p(k+1)=p(k)+w_p (k)\)   (2)

\(y(k)=g(x(k),u(k),p(k))+e(k)\)

where \(k\), \(x\), \(u\), \(p\) and \(y\) are the time index, three-element state vector, three-element control vector, ten-element health parameter vector and measurement vector, respectively. Between measurement times, the deviations of the health parameters can be approximated by the zero-mean noise \(w_{p}(k)\); \(w_{x}(k)\) and \(e(k)\) represent inaccuracies in the system model and measurement noise, respectively.
The state vector \(x\) and the health parameter vector \(p\) in Eq. (2) can be estimated by a Kalman filter, and the estimation uncertainty is given by the error covariance \(\Sigma \). Here, \(\Sigma \) is a \(13\times 13\) matrix, because the problem includes three states and ten health parameters. In the sensor selection problem, only the health parameter estimation errors need to be taken into account, so attention can be restricted to the diagonal elements \(\Sigma (i,i)\) \((i=4,5,\ldots ,13)\).
Based on the above analyses, the objective function of the health estimation problem can be expressed as

\(J=\sum \limits _{i=4}^{13} \frac{\Sigma (i,i)}{\Sigma _0 (i,i)}+\alpha \frac{C}{C_0}\)   (3)

where \(\Sigma _{0}\) and \(C_{0}\) are used for normalization: \(\Sigma _{0}\) is the covariance, and \(C_{0}\) the cost, obtained when the aircraft engine is equipped with all 11 sensors. \(C\) is the cost of the selected sensor set, and \(\alpha \) is a scale factor that balances financial cost against estimation accuracy.
It is clear that selecting the sensor set that minimizes \(J\) is essentially an optimization problem.
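A sketch of how such an objective might be evaluated for a binary selection vector, assuming the normalized form \(J=\sum \Sigma (i,i)/\Sigma _0(i,i)+\alpha \,C/C_0\); `health_cov_diag` is a hypothetical black box that returns the Kalman-filter error-covariance diagonals \(\Sigma (i,i)\), \(i=4,\ldots ,13\), for a given sensor subset.

```python
import numpy as np

def sensor_objective(selection, health_cov_diag, sensor_costs, cov0_diag, alpha=1.0):
    """J = normalized health-estimation uncertainty + alpha * normalized cost.
    `selection` is a binary vector over the 11 candidate sensors;
    `health_cov_diag` (hypothetical) returns Sigma(i,i), i = 4..13."""
    sigma = health_cov_diag(selection)            # estimation-error diagonals
    err_term = np.sum(sigma / cov0_diag)          # normalized by Sigma_0(i,i)
    c0 = sensor_costs.sum()                       # C_0: cost of all 11 sensors
    cost_term = alpha * sensor_costs[selection.astype(bool)].sum() / c0
    return err_term + cost_term
```

A metaheuristic such as CCS then searches over the binary selection vectors to minimize this J.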
In fact, optimization methods can be used to solve the sensor selection problem. Here, nine optimization methods are used to search for a sub-optimal sensor set. The results are recorded in Table 8. It can be seen that CCS yields better average, best and worst performance than the other eight methods. Furthermore, the Std of CCS is much smaller than that of the other methods; that is, CCS has a relatively high probability of finding a satisfactory sensor set.
5 Conclusion
In the proposed study, chaos has been combined with the standard CS, yielding a new improved version of the CS algorithm, namely the CCS algorithm. Twelve chaotic maps are used to tune the step size \(\alpha \) of CS. A series of simulations of the various chaotic CS variants shows that the variant using the Sinusoidal map in place of \(\alpha \) is the best chaotic CS. The experimental results demonstrate that the tuned CS significantly enhances both the global search ability and the quality of the results. The results of the nine approaches on the benchmarks and the sensor selection problem show that CCS significantly enhances the search ability of CS on most benchmark problems and on the engineering problem.
Various issues are worthy of further study, and more efficient methods may be put forward for specific engineering problems. In the future, we will focus on such challenging issues. On the one hand, the CCS method will be applied to other practical engineering problems; we strongly believe that CCS can become an efficient and promising method for solving real-world engineering problems. On the other hand, new hybrid metaheuristic approaches will be put forward to address optimization problems, building on the current studies.
References
Beyer H (2001) The theory of evolution strategies. Springer, New York
Cai X, Fan S, Tan Y (2012) Light responsive curve selection for photosynthesis operator of APOA. Int J Bio-Inspir Comput 4(6):373–379
Dorigo M, Stutzle T (2004) Ant colony optimization. MIT Press, Cambridge
Dorigo M, Maniezzo V, Colorni A (1996) Ant system: optimization by a colony of cooperating agents. IEEE Trans Syst Man Cybern B Cybern 26(1):29–41. doi:10.1109/3477.484436
Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun Nonlinear Sci Numer Simul 17(12):4831–4845. doi:10.1016/j.cnsns.2012.05.010
Gandomi AH, Yang X-S (2014) Chaotic bat algorithm. J Comput Sci 5(2):224–232. doi:10.1016/j.jocs.2013.10.002
Gandomi AH, Yang X-S, Alavi AH (2011) Mixed variable structural optimization using firefly algorithm. Comput Struct 89(23–24):2325–2336. doi:10.1016/j.compstruc.2011.08.002
Gandomi AH, Yang XS, Talatahari S, Alavi AH (2013a) Metaheuristic applications in structures and infrastructures. Elsevier, Waltham
Gandomi AH, Yang X-S, Alavi AH, Talatahari S (2013b) Bat algorithm for constrained optimization tasks. Neural Comput Appl 22(6):1239–1255. doi:10.1007/s00521-012-1028-9
Gandomi AH, Yang XS, Talatahari S, Alavi AH (2013c) Firefly algorithm with chaos. Commun Nonlinear Sci Numer Simulat 18(1):89–98. doi:10.1016/j.cnsns.2012.06.009
Gandomi AH, Yun GJ, Yang X-S, Talatahari S (2013d) Chaos-enhanced accelerated particle swarm optimization. Commun Nonlinear Sci Numer Simulat 18(2):327–340. doi:10.1016/j.cnsns.2012.07.017
Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. Simulation 76(2):60–68. doi:10.1177/003754970107600201
Goldberg DE (1998) Genetic algorithms in search, optimization and machine learning. Addison-Wesley, New York
Guo L, Wang G-G, Gandomi AH, Alavi AH, Duan H (2014) A new improved krill herd algorithm for global numerical optimization. Neurocomputing 138:392–402. doi:10.1016/j.neucom.2014.01.023
Jia D, Zheng G, Khurram Khan M (2011) An effective memetic differential evolution algorithm based on chaotic local search. Inf Sci 181(15):3175–3187. doi:10.1016/j.ins.2011.03.018
Kaveh A, Sheikholeslami R, Talatahari S, Keshvari-Ilkhichi M (2014) Chaotic swarming of particles: a new method for size optimization of truss structures. Adv Eng Softw 67:136–147. doi:10.1016/j.advengsoft.2013.09.006
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Paper presented at the proceeding of the IEEE international conference on neural networks, Perth, 27 November 1995–1 December 1995
Li X, Yin M (2012) Application of differential evolution algorithm on self-potential data. PLoS One 7(12):e51199. doi:10.1371/journal.pone.0051199
Li X, Yin M (2013a) Multiobjective binary biogeography based optimization for feature selection using gene expression data. IEEE Trans Nanobiosci 12(4):343–353. doi:10.1109/TNB.2013.2294716
Li X, Yin M (2013b) An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure. Adv Eng Softw 55:10–31. doi:10.1016/j.advengsoft.2012.09.003
Li X, Yin M (2015) Modified cuckoo search algorithm with self adaptive parameter method. Inf Sci 298:80–97. doi:10.1016/j.ins.2014.11.042
Li X, Zhang J, Yin M (2014) Animal migration optimization: an optimization algorithm inspired by animal migration behavior. Neural Comput Appl 24(7–8):1867–1877. doi:10.1007/s00521-013-1433-8
Mirjalili S, Lewis A (2013) S-shaped versus V-shaped transfer functions for binary particle swarm optimization. Swarm Evol Comput 9:1–14. doi:10.1016/j.swevo.2012.09.002
Mirjalili S, Mirjalili SM, Yang X-S (2013) Binary bat algorithm. Neural Comput Appl 25(3–4):663–681. doi:10.1007/s00521-013-1525-5
Mirjalili S, Mirjalili SM, Lewis A (2014a) Let a biogeography-based optimizer train your multi-layer perceptron. Inf Sci 269:188–209. doi:10.1016/j.ins.2014.01.038
Mirjalili S, Mirjalili SM, Lewis A (2014b) Grey wolf optimizer. Adv Eng Softw 69:46–61. doi:10.1016/j.advengsoft.2013.12.007
Nouhi B, Talatahari S, Kheiri H, Cattani C (2013) Chaotic charged system search with a feasible-based method for constraint optimization problems. Math Probl Eng 2013:1–8. doi:10.1155/2013/391765
Shumeet B (1994) Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning. Carnegie Mellon University, Pittsburgh, PA
Simon D (2008) Biogeography-based optimization. IEEE Trans Evolut Comput 12(6):702–713. doi:10.1109/TEVC.2008.919004
Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Global Optim 11(4):341–359. doi:10.1023/A:1008202821328
Talatahari S, Farahmand Azar B, Sheikholeslami R, Gandomi AH (2012) Imperialist competitive algorithm combined with chaos for global optimization. Commun Nonlinear Sci Numer Simulat 17(3):1312–1319. doi:10.1016/j.cnsns.2011.08.021
Talatahari S, Kheirollahi M, Farahmandpour C, Gandomi AH (2013) A multi-stage particle swarm for optimum design of truss structures. Neural Comput Appl 23(5):1297–1309. doi:10.1007/s00521-012-1072-5
Wang G, Guo L, Duan H, Wang H, Liu L, Shao M (2013a) Hybridizing harmony search with biogeography based optimization for global numerical optimization. J Comput Theor Nanos 10(10):2318–2328. doi:10.1166/jctn.2013.3207
Wang G-G, Gandomi AH, Alavi AH (2013b) A chaotic particle-swarm krill herd algorithm for global numerical optimization. Kybernetes 42(6):962–978. doi:10.1108/K-11-2012-0108
Wang G, Guo L, Wang H, Duan H, Liu L, Li J (2014a) Incorporating mutation scheme into krill herd algorithm for global numerical optimization. Neural Comput Appl 24(3–4):853–871. doi:10.1007/s00521-012-1304-8
Wang G-G, Gandomi AH, Zhao X, Chu HE (2014b) Hybridizing harmony search algorithm with cuckoo search for global numerical optimization. Soft Comput. doi:10.1007/s00500-014-1502-7
Wang G-G, Guo L, Duan H, Wang H (2014c) A new improved firefly algorithm for global numerical optimization. J Comput Theor Nanos 11(2):477–485. doi:10.1166/jctn.2014.3383
Wang G-G, Guo L, Gandomi AH, Hao G-S, Wang H (2014d) Chaotic krill herd algorithm. Inf Sci 274:17–34. doi:10.1016/j.ins.2014.02.123
Wang G-G, Gandomi AH, Alavi AH (2014e) Stud krill herd algorithm. Neurocomputing 128:363–370. doi:10.1016/j.neucom.2013.08.031
Wang G-G, Gandomi AH, Alavi AH, Hao G-S (2014f) Hybrid krill herd algorithm with differential evolution for global numerical optimization. Neural Comput Appl 25(2):297–308. doi:10.1007/s00521-013-1485-9
Wang G-G, Gandomi AH, Alavi AH (2014g) An effective krill herd algorithm with migration operator in biogeography-based optimization. Appl Math Model 38(9–10):2454–2462. doi:10.1016/j.apm.2013.10.052
Wang G-G, Deb S, Cui Z (2015) Monarch butterfly optimization. Neural Comput Appl. doi:10.1007/s00521-015-1923-y
Xie L, Zeng J, Formato RA (2012) Selection strategies for gravitational constant \(G\) in artificial physics optimisation based on analysis of convergence properties. Int J Bio-Inspir Comput 4(6):380–391
Yang XS (2010a) A new metaheuristic bat-inspired algorithm. In: González JR, Pelta DA, Cruz C, Terrazas G, Krasnogor N (eds) Nature inspired cooperative strategies for optimization (NICSO 2010), vol 284. Studies in computational intelligence. Springer, Heidelberg, pp 65–74. doi:10.1007/978-3-642-12538-6_6
Yang XS (2010b) Nature-inspired metaheuristic algorithms, 2nd edn. Luniver Press, Frome
Yang XS, Deb S (2010) Engineering optimisation by cuckoo search. Int J Math Model Numer Optim 1(4):330–343. doi:10.1504/IJMMNO.2010.03543
Yang XS, Gandomi AH (2012) Bat algorithm: a novel approach for global engineering optimization. Eng Comput 29(5):464–483. doi:10.1108/02644401211235834
Yang X-S, Hosseini SSS, Gandomi AH (2012) Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect. Appl Soft Compt 12(3):1180–1186. doi:10.1016/j.asoc.2011.09.017
Yang XS, Gandomi AH, Talatahari S, Alavi AH (2013) Metaheuristics in water, geotechnical and transport engineering. Elsevier, Waltham
Yang X-S, Karamanoglu M, He X (2014) Flower pollination algorithm: a novel approach for multiobjective optimization. Eng Optim 46(9):1222–1237. doi:10.1080/0305215X.2013.832237
Zhang Z, Feng Z (2012) Two-stage updating pheromone for invariant ant colony optimization algorithm. Expert Syst Appl 39(1):706–712. doi:10.1016/j.eswa.2011.07.062
Zhang Y, Huang D, Ji M, Xie F (2011) Image segmentation using PSO and PCM with Mahalanobis distance. Expert Syst Appl 38(7):9036–9040. doi:10.1016/j.eswa.2011.01.041
Zhang Z, Zhang N, Feng Z (2014) Multi-satellite control resource scheduling based on ant colony optimization. Expert Syst Appl 41(6):2816–2823. doi:10.1016/j.eswa.2013.10.014
Zhao X, Lin W, Zhang Q (2014a) Enhanced particle swarm optimization based on principal component analysis and line search. Appl Math Comput 229:440–456. doi:10.1016/j.amc.2013.12.068
Zhao X, Liu Z, Yang X (2014b) A multi-swarm cooperative multistage perturbation guiding particle swarm optimizer. Appl Soft Compt 22:77–93. doi:10.1016/j.asoc.2014.04.042
Zou D, Gao L, Wu J, Li S, Li Y (2010) A novel global harmony search algorithm for reliability problems. Comput Ind Eng 58(2):307–316. doi:10.1016/j.cie.2009.11.003
Zou D, Gao L, Li S, Wu J (2011) An effective global harmony search algorithm for reliability problems. Expert Syst Appl 38(4):4642–4648. doi:10.1016/j.eswa.2010.09.120
Zou D, Gao L, Li S, Wu J (2011) Solving 0–1 knapsack problem by a novel global harmony search algorithm. Appl Soft Compt 11(2):1556–1564. doi:10.1016/j.asoc.2010.07.019
Acknowledgments
This work was supported by Research Fund for the Doctoral Program of Jiangsu Normal University (No. 9213614102) and National Natural Science Foundation of China (No. 61305149).
Communicated by S. Deb, T. Hanne and S. Fong.
S. Deb is pioneer of cuckoo search algorithm.
Wang, GG., Deb, S., Gandomi, A.H. et al. Chaotic cuckoo search. Soft Comput 20, 3349–3362 (2016). https://doi.org/10.1007/s00500-015-1726-1