
1 Introduction

Optimization algorithms have been extensively used in constrained design optimization problems. Meta-heuristic methods such as evolutionary computation and swarm intelligence algorithms are attractive because they are rather simple to implement and have better global search abilities than gradient-based optimizers [1]. The general formulation of a constrained optimization problem is:

$$ \begin{array}{*{20}c} {{\text{Optimise:}}\;f({\mathbf{X}})} \\ {{\text{Subject}}\;{\text{to:}}} \\ {g_{i} ({\mathbf{X}}) \ge 0,\;i = 1,2, \ldots ,N,} \\ {h_{j} ({\mathbf{X}}) = 0,\;j = N + 1, \ldots ,M,} \\ {l_{i} \le x_{i} \le u_{i} } \\ \end{array} $$
(1)

where \( {\mathbf{X}} = \left[ {x_{1} ,x_{2} , \ldots ,x_{N} } \right]^{T} \) is the design vector containing the N optimization variables (continuous, integer or discrete), each of which may range between the lower and upper bounds \( l_{i} \) and \( u_{i} \); \( f({\mathbf{X}}) \) is the objective function; \( g_{i} ({\mathbf{X}}) \) and \( h_{j} ({\mathbf{X}}) \), respectively, are the inequality and equality constraints (the latter can be replaced by the inequality constraint \( \left| {h_{j} ({\mathbf{X}})} \right| - \varepsilon \le 0 \), where \( \varepsilon \) is a tolerance limit set by the user).
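Meta-heuristics typically handle the constraints in Eq. (1) through a penalty scheme. The Python sketch below is an illustrative static-penalty formulation (the function names and the penalty weight `rho` are our own choices; the paper's implementation is in MATLAB and may use a different scheme):

```python
import numpy as np

def penalized_objective(f, gs, hs, x, eps=1e-4, rho=1e6):
    """Static-penalty fitness for: min f(x) s.t. g_i(x) >= 0, h_j(x) = 0.

    Each equality h_j(x) = 0 is relaxed to |h_j(x)| - eps <= 0; the penalty
    weight rho is an illustrative value, not one taken from the paper.
    """
    violation = sum(max(0.0, -g(x)) for g in gs)             # g_i(x) >= 0
    violation += sum(max(0.0, abs(h(x)) - eps) for h in hs)  # |h_j(x)| <= eps
    return f(x) + rho * violation

# Toy example: minimize x1^2 + x2^2 subject to x1 + x2 >= 1
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: x[0] + x[1] - 1.0
print(penalized_objective(f, [g], [], np.array([0.5, 0.5])))  # feasible point: 0.5
```

A feasible point incurs no penalty, so the penalized fitness equals the raw objective there; any violation is scaled by `rho` so the search is pushed back toward the feasible region.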

Most design optimization problems include many design variables and complicated constraints. Non-linearity often results in a multimodal response. For this reason, modern meta-heuristic algorithms were developed to carry out global search, aiming to increase computational efficiency, solve larger problems, and enable robust optimization codes [2]. The search efficiency of meta-heuristic algorithms can be attributed to the fact that they mimic the best features available in nature, drawn from the most varied sources ranging from biology [3] to physics [4]. However, no meta-heuristic algorithm is intrinsically able to find the global optimum for every kind of real-world large-scale problem. Nature-inspired algorithms, based on the survival of the fittest, adaptation to changes, and genetic mechanisms in biological systems that have evolved by natural selection over millions of years, are therefore continuously proposed by researchers seeking ever better meta-heuristic algorithms for specific design problems.

Evolutionary computation and swarm intelligence algorithms are the main categories of bio-inspired meta-heuristic optimization algorithms. Various algorithms have been successfully utilized in solving optimization problems: for example, Genetic Algorithms (GA) [5, 6], Particle Swarm Optimization (PSO) [7, 8], Ant Colony Optimization (ACO) [9], Interior Search Algorithm (ISA) [10], Differential Evolution (DE) [11, 12], Firefly Algorithm (FA) [13, 14], Bat Algorithm (BA) [15, 16], etc. Another biology-inspired meta-heuristic algorithm successfully applied in many engineering problems [17, 18] is Cuckoo Search (CS). CS, developed in 2009 by Yang and Deb [19], reproduces the behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies. Cuckoo search has recently attracted increasing attention because of its simple formulation, easy implementation and the small number of internal parameters involved. Hybridization of CS with other meta-heuristic algorithms has proven to be a viable approach [20, 21]. In this paper, cuckoo search is hybridized with standard GA or PSO (the resulting algorithms are denoted as CS-GA and CS-PSO, respectively). The performance of the new hybrid algorithms developed in this research is first evaluated on Himmelblau's test function and then further validated by solving four design benchmark problems. Numerical results confirm the validity of the proposed approaches. The remainder of the paper is organized as follows: Sect. 2 briefly describes cuckoo search and details the proposed hybrid CS-GA and CS-PSO algorithms. Test problems and optimization results are presented in Sect. 3. Section 4 summarizes the main findings of the present study and outlines directions for further research.

2 Cuckoo Search Algorithm

Cuckoo search (CS) algorithm is based on the obligate brood-parasitic behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies. In standard CS, the following three idealized rules are used:

  1. Each cuckoo lays one egg at a time, and dumps it in a randomly chosen nest.

  2. The nests with high-quality eggs (solutions) will survive to the next generation.

  3. The number of available host nests is fixed, and a host can discover an alien egg with a probability \( p_{a} \,{ \in }\,[0,1] \). In this case, the host bird can either throw the egg away or abandon the nest and build a completely new nest in a new location.

The above three rules can be transformed into the following search methodology:

  1. An egg represents a solution and is stored in a nest.

  2. The cuckoo bird searches for the most suitable nest in which to lay an egg (solution) in order to maximize the egg's survival rate. High-quality eggs (good solutions close to the optimal value), which are more similar to the host bird's eggs, have the opportunity to develop into the next generation and become mature cuckoos.

  3. The host bird discovers an alien egg (a poor solution far from the optimal value) with a probability \( p_{a} \,{ \in }\,[0,1] \); such eggs are thrown away, or the nest is abandoned and a completely new nest is built in a new location. Otherwise, the egg grows up and survives to the next generation. New eggs (solutions) laid by a cuckoo are placed in nests chosen by Lévy flights around the current best solutions.
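The three rules above translate into a compact generation loop. The Python sketch below is our own minimal rendering of standard CS (the names, the parameter values, and the simplified rebuild of abandoned nests are illustrative; the Lévy-step generator is passed in, see Sect. 2.1):

```python
import numpy as np

def cs_generation(nests, fitness, func, levy_step, pa=0.25, alpha0=0.01):
    """One generation of standard CS following rules 1-3 (a minimal sketch).

    levy_step(dim) supplies the random step, e.g. Mantegna's algorithm
    (Sect. 2.1); here abandoned nests are simply rebuilt near their old
    location, a simplification of rule 3.
    """
    n, dim = nests.shape
    best = nests[np.argmin(fitness)].copy()
    for i in range(n):                            # rule 1: one egg per cuckoo
        trial = nests[i] + alpha0 * levy_step(dim) * (nests[i] - best)
        j = np.random.randint(n)                  # dumped in a random nest
        f_trial = func(trial)
        if f_trial < fitness[j]:                  # rule 2: better eggs survive
            nests[j], fitness[j] = trial, f_trial
    k = max(1, int(pa * n))                       # rule 3: abandon worst nests
    worst = np.argsort(fitness)[-k:]
    nests[worst] += np.random.standard_normal((k, dim))
    fitness[worst] = [func(x) for x in nests[worst]]
    return nests, fitness

# Usage on the sphere function, with a plain Gaussian step for brevity
sphere = lambda x: float(np.sum(x ** 2))
nests = np.random.default_rng(0).standard_normal((15, 3))
fitness = np.array([sphere(x) for x in nests])
m0 = fitness.min()
for _ in range(50):
    nests, fitness = cs_generation(nests, fitness, sphere,
                                   np.random.standard_normal)
print(fitness.min() <= m0)  # True: the best nest is never lost
```

Note that a nest is only ever overwritten by a better egg (rule 2) and only the worst nests are abandoned (rule 3), so the best solution found so far always survives.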

2.1 Lévy Flight

In nature, animals search for food in a random or quasi-random way. In general, the foraging path of an animal is effectively a random walk, because the next move depends on the current location/state and on the transition probability to the next location. The direction chosen depends implicitly on a probability that can be modeled mathematically. Various studies have shown that the flight behavior of many animals and insects demonstrates the typical characteristics of Lévy flights [22]. A study by Reynolds and Frye [23] showed that fruit flies (Drosophila melanogaster) explore their landscape using a series of straight flight paths punctuated by sudden 90° turns, leading to a Lévy-flight-style intermittent scale-free search pattern. Even light can be related to Lévy flights [24]. Such behavior has subsequently been applied to optimization and optimal search, and preliminary results show its promising capability [25]. In CS, a Lévy flight is performed to generate new solutions stochastically:

$$ x_{i} \left( {t + 1} \right) = x_{i} \left( t \right) + \alpha \oplus {\text{L}\acute{e} \text{vy}}\left( \beta \right) $$
(2)

where \( \alpha > 0 \) is the step size, which should be related to the scale of the problem of interest. In order to accommodate the difference between solution qualities, it can be set as [26]

$$ \alpha = \alpha_{0} \left[ {x_{j} \left( t \right) - x_{i} \left( t \right)} \right] $$
(3)

where \( \alpha_{0} \) is a constant, while the term in brackets corresponds to the difference between two selected solutions. This mimics the fact that similar eggs are less likely to be discovered, and makes the newly generated solutions proportional to their differences. The product \( \oplus \) denotes entry-wise multiplication. A Lévy flight is essentially a Markov chain in which the random steps are drawn from a Lévy distribution [15]. Samples from the Lévy distribution can be generated by Mantegna's algorithm, which produces random noise according to a symmetric Lévy stable distribution [27]. A symmetric Lévy stable distribution is ideal for Lévy flights, as the direction of the flight should be random [15].

In Mantegna’s algorithm, the step length can be calculated as

$$ {\text{L}\acute{e} \text{vy}}\left( \beta \right)\sim \frac{u}{{\left| v \right|^{{\frac{1}{\beta}}}}}, $$
(4)

where u and v are drawn from normal distributions, that is,

$$ u\sim N\left( {0,\sigma_{u}^{2} } \right),\quad v\sim N\left( {0,\sigma_{v}^{2} } \right), $$
(5)
$$ \sigma_{u} = \left\{ {\frac{{\Gamma \left( {1 + \beta } \right)\sin \left( {\frac{\pi \beta }{2}} \right)}}{{\Gamma \left[ {\frac{{\left( {1 + \beta } \right)}}{2}} \right]\beta 2^{{\frac{\beta - 1}{2}}} }}} \right\}^{{\frac{1}{\beta }}} , \quad \sigma_{v} = 1 $$
(6)

where the distribution parameter \( \beta \in [0.3, 1.99] \).
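Equations (4)–(6) map directly to code. The helper below (our own naming) draws Mantegna Lévy steps:

```python
import numpy as np
from math import gamma, sin, pi

def mantegna_levy(dim, beta=1.5):
    """Draw a dim-dimensional Lévy step via Mantegna's algorithm, Eqs. (4)-(6)."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)   # u ~ N(0, sigma_u^2), Eq. (5)
    v = np.random.normal(0.0, 1.0, dim)       # v ~ N(0, 1), i.e. sigma_v = 1
    return u / np.abs(v) ** (1 / beta)        # step length, Eq. (4)

steps = mantegna_levy(1000)
print(steps.shape, np.isfinite(steps).all())
```

Because the denominator \( \left| v \right|^{1/\beta} \) can be very small, occasional steps are very large; this heavy-tailed behavior is exactly what makes Lévy flights effective for global exploration.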

For the best cuckoo laying eggs, the step size, \( \alpha \) used is

$$ \alpha = \alpha_{c} \left[ {bestnest\left( t \right) - x_{i} \left( t \right)} \right] $$
(7)

where \( \alpha_{c} \) is a constant.

According to the life style of cuckoo birds, each cuckoo lays more than one egg at a time and always searches for a place to lay its eggs where they will not be discovered by the host birds, in order to increase the chance of egg survival. The evolutionary operators of GA are used in CS to let each cuckoo lay more eggs, while the swarm intelligence of PSO is used in CS to update the cuckoo positions according to the PSO equations. The proposed hybrid CS-GA and CS-PSO algorithms are described in the following sections.

2.2 Hybrid CS-GA Algorithm

Genetic Algorithms (GA) employ random choice to carry out a highly exploitative search, striking a balance between exploration of the feasible domain and exploitation of "good" solutions [28]. The rationale for developing the CS-GA algorithm is to combine the advantages of cuckoo search and GA in order to mimic the real life style of cuckoo birds. Each cuckoo lays more than one egg at a time and dumps them in randomly chosen nests, always searching for the best place to lay eggs without being discovered by the host birds; this increases the probability of egg survival. The GA crossover operator is used to reproduce more eggs for each cuckoo: two parents are selected and generate two offspring. Cuckoo birds mimic the color and pattern of the host bird's eggs in order to protect their own eggs from being recognized; accordingly, cuckoo birds undergo a mutation in their genes to meet this need, and the GA mutation operator is therefore applied to all cuckoo birds. The GA selection strategy is used to select the surviving cuckoos among all the produced cuckoos and eggs. Once the new generation of eggs has grown into cuckoo birds, the mature birds migrate to a better environment via Lévy flight, and so forth; in addition, the destination has a limited capacity for them to build their nests. The flow chart of the proposed hybrid CS-GA is presented in Fig. 1.
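The GA operators grafted onto CS can be sketched as follows. This is an illustrative Python rendering with our own parameter choices (the exact operator settings of the paper are not specified here):

```python
import numpy as np

def crossover(p1, p2):
    """One-point crossover: two parent nests yield two offspring eggs."""
    c = np.random.randint(1, len(p1))
    return np.r_[p1[:c], p2[c:]], np.r_[p2[:c], p1[c:]]

def mutate(x, rate=0.1, scale=0.05):
    """Gaussian mutation: the cuckoo slightly adapts its egg to the host's."""
    mask = np.random.rand(len(x)) < rate
    return x + mask * np.random.normal(0.0, scale, len(x))

def select(population, fitness, n_keep):
    """Keep the n_keep fittest cuckoos/eggs (nesting capacity is limited)."""
    order = np.argsort(fitness)[:n_keep]
    return population[order], fitness[order]

# Usage: two parents produce two children (which would then be mutated)
parents = np.array([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]])
c1, c2 = crossover(parents[0], parents[1])
print(sorted(set(np.r_[c1, c2])))  # the children's genes come from the parents
```

The selection step enforces the limited nest capacity mentioned above: after crossover and mutation enlarge the population, only the fittest individuals are carried into the next generation.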

Fig. 1.

Flow chart of hybrid CS-GA

2.3 Hybrid CS-PSO Algorithm

Particle swarm optimization (PSO), developed by Kennedy and Eberhart in 1995 [29], is a population-based algorithm inspired by the social interaction and communication in a flock of birds. In PSO, a member of the swarm (population) is called a particle, and the objective of each particle is to discover the optimal point in a continuous search space as it moves from one point to another during the search process. Each particle moves around the search space with a velocity. During the movement, each particle adjusts its position according to its own experience (pbest) and the experience of the most successful particle (gbest).

The velocity and position are updated using the following equations.

$$ V_{t + 1} = wV_{t} + C_{1} U_{1} \left( {p_{best} - x_{t} } \right) + C_{2} U_{2} \left( {g_{best} - x_{t} } \right) $$
(8)
$$ x_{t + 1} = x_{t} + V_{t + 1} $$
(9)

where w is the inertia coefficient, used to balance exploration versus exploitation of the solution space. Values of w close to 1 usually encourage exploration of new areas, while for small values such as w < 0.4 the search shifts to exploitation mode. Parameters C1 and C2 are called the cognition and social coefficients, respectively, and they are very important for facilitating convergence in PSO. Based on an empirical study [29] as well as theoretical results [30], it is recommended to set C1 and C2 such that C1 + C2 < 4. U1 and U2 are random numbers uniformly distributed between 0 and 1, and xt is the current position of the particle. The process is repeated until a satisfactory solution is found. The flowchart of the proposed hybrid CS-PSO is presented in Fig. 2.
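Equations (8) and (9) translate into a one-line update per particle. The sketch below uses common default coefficients satisfying C1 + C2 < 4, not necessarily those used in the paper:

```python
import numpy as np

def pso_update(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update per Eqs. (8)-(9); note c1 + c2 < 4."""
    u1 = np.random.rand(*x.shape)
    u2 = np.random.rand(*x.shape)
    v_new = w * v + c1 * u1 * (p_best - x) + c2 * u2 * (g_best - x)  # Eq. (8)
    return x + v_new, v_new                                          # Eq. (9)

# A particle already at both its personal and global best does not move
x, v = np.zeros(3), np.zeros(3)
x_new, v_new = pso_update(x, v, p_best=x, g_best=x)
print(x_new)  # [0. 0. 0.]
```

Each particle is pulled stochastically toward its own best position and toward the swarm's best position, with the inertia term w carrying over part of the previous velocity.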

Fig. 2.

Flow chart of hybrid CS-PSO

Swarm intelligence from PSO was introduced into CS to modify the third search rule. If alien eggs are discovered by the host bird, they are thrown away or the host bird abandons its nest and builds a new nest elsewhere. The cuckoo bird always looks for a better place to lay its eggs in order to hide them better. The present CS-PSO algorithm introduces communication between cuckoo birds: the main goal of this communication is to inform each other of their positions and to help each other migrate to a better place to build the nest. Each bird records its personal experience as pbest during its own life; the best pbest among all birds is called gbest. Cuckoo birds migrate to the new position according to Eq. (9).

3 Test Problems and Results

The proposed hybrid CS-GA and CS-PSO algorithms were initially tested on the well-known Himmelblau test function, and then applied to four design optimization problems. Optimization runs were carried out on a 2.4 GHz Intel® Core(TM) 2 CPU 6600 computer with 2 GB of RAM. The optimization algorithms were implemented in MATLAB. For each test case, 30 independent runs were performed.

3.1 Himmelblau’s Problem

This problem was originally proposed by Himmelblau [31] and has been widely used as a benchmark for nonlinear constrained optimization. The problem has five design variables [x1, x2, x3, x4, x5], six nonlinear inequality constraints, and ten boundary conditions. It can be stated as follows:

$$ \begin{aligned} & Min{:}\;f({\mathbf{X}}) = 5.3578547x_{3}^{2} + 0.8356891x_{1} x_{5} + 37.293239x_{1} - 40792.141 \\ & \quad \;\;\;\;Subject\,to{:}\;0 \le g_{1} \le 92,\;90 \le g_{2} \le 110,\;20 \le g_{3} \le 25 \\ & g_{1} = 85.334407 + 0.0056858x_{2} x_{5} + 0.0006262x_{1} x_{4} - 0.0022053x_{3} x_{5} \\ & g_{2} = 80.51249 + 0.0071317x_{2} x_{5} + 0.0029955x_{1} x_{2} + 0.0021813x_{3}^{2} \\ & g_{3} = 9.300961 + 0.0047026x_{3} x_{5} + 0.0012547x_{1} x_{3} + 0.0019085x_{3} x_{4} \\ & \qquad 78 \le x_{1} \le 102,\;33 \le x_{2} \le 45,\;27 \le x_{3} ,x_{4} ,x_{5} \le 45 \\ \end{aligned} $$
(10)
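For reference, Eq. (10) can be coded and checked at a point close to the best-known solution (our rounded values; the coefficient 0.0006262 in g1 follows the standard statement of this benchmark). Up to rounding, the first and third constraints are active at this point:

```python
def himmelblau(x):
    """Objective and constraint functions of Eq. (10)."""
    x1, x2, x3, x4, x5 = x
    f = 5.3578547 * x3**2 + 0.8356891 * x1 * x5 + 37.293239 * x1 - 40792.141
    g1 = 85.334407 + 0.0056858 * x2 * x5 + 0.0006262 * x1 * x4 - 0.0022053 * x3 * x5
    g2 = 80.51249 + 0.0071317 * x2 * x5 + 0.0029955 * x1 * x2 + 0.0021813 * x3**2
    g3 = 9.300961 + 0.0047026 * x3 * x5 + 0.0012547 * x1 * x3 + 0.0019085 * x3 * x4
    return f, (g1, g2, g3)

# A point close to the best-known solution (rounded by us)
f, (g1, g2, g3) = himmelblau([78.0, 33.0, 29.995, 45.0, 36.776])
print(round(f, 2))  # -30665.61, close to the best-known value of about -30665.5
```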

The problem was solved by Himmelblau [31] using a generalized gradient method, and subsequently by several other methods such as the Genetic Algorithm (GA) [32], particle swarm optimization (PSO) [33], the Bat algorithm (BA) [34], evolutionary algorithms (EA) [35] and Cuckoo search (CS) [17]. Table 1 summarizes the optimal results obtained by the CS-GA and CS-PSO algorithms. Table 2 compares the results obtained by CS-GA and CS-PSO with those of other methods reported in the literature. The result obtained using CS-GA improves on the best feasible solution previously reported, while CS-PSO matches the previously reported results.

Table 1. Optimal result for Himmelblau’s problem
Table 2. Statistical comparison results for Himmelblau’s problem

3.2 Cantilever Beam Design Problem

This problem is a good benchmark to verify the capability of any algorithm for solving continuous, discrete, and/or mixed variable structural design problems. It was originally presented by Thanedar and Vanderplaats [36] with ten variables, and it has been solved with continuous, discrete and mixed variables in different cases in the literature [37]. The problem can be expressed as follows:

$$ \begin{aligned} & Min\;f\left( {\mathbf{X}} \right) = 0.0624(x_{1} + x_{2} + x_{3} + x_{4} + x_{5} ) \\ & \text{Subject}\,{\text{to:}} \;g\left( {\mathbf{X}} \right) = \frac{61}{{x_{1}^{3} }} + \frac{37}{{x_{2}^{3} }} + \frac{19}{{x_{3}^{3} }} + \frac{7}{{x_{4}^{3} }} + \frac{1}{{x_{5}^{3} }} - 1 \le 0 \\ \end{aligned} $$
(11)
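Equation (11) is compact enough to verify numerically. The design below is close to the best solutions reported in the literature (values rounded by us), and the single constraint is essentially active:

```python
def cantilever(x):
    """Objective and single constraint of Eq. (11)."""
    f = 0.0624 * sum(x)
    g = 61 / x[0]**3 + 37 / x[1]**3 + 19 / x[2]**3 + 7 / x[3]**3 + 1 / x[4]**3 - 1
    return f, g

# A design close to the best reported solutions (rounded by us)
f, g = cantilever([6.016, 5.309, 4.494, 3.502, 2.153])
print(round(f, 3))  # 1.34; g is approximately zero, i.e. the constraint is active
```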

The best solutions obtained using CS-GA, CS-PSO, CS [17] and the generalized convex approximation method (GCA) [38] are listed in Table 3. As can be seen, the solutions found by CS-GA and CS-PSO are slightly better than those of the basic CS and GCA methods.

Table 3. Best solutions of the cantilever beam design problem

3.3 Tubular Column Design Problem

A uniform column of tubular section is to be designed to carry a compressive load P = 2500 kgf at minimum cost [39]. The column is made of a material that has a yield stress σy of 500 kgf cm−2, a modulus of elasticity E of 0.85 × 106 kgf cm−2, and a density ρ of 0.0025 kgf cm−3. The length L of the column is 250 cm. The stress induced in the column should be less than the yield stress (constraint g1) and the buckling stress (constraint g2). The mean diameter of the column is restricted between 2 and 14 cm (constraints g3 and g4), and columns with a thickness outside the range 0.2–0.8 cm are not commercially available (constraints g5 and g6). The cost of the column, including material and construction, can be taken as 9.82dt + 2d, where d is the mean diameter of the column in centimeters and t is the tube thickness in centimeters. The optimization model of this problem can be expressed as follows:

$$ \begin{aligned} & \;Minimize\;f\left( {d,t} \right) = 9.82dt + 2d \\ & {\text{Subject}}\;{\text{to}}\;g_{1} = \frac{P}{{\pi dt\sigma_{y} }} - 1 \le 0;\;\;\;g_{2} = \frac{{8PL^{2} }}{{\pi^{3} Edt\left( {d^{2} + t^{2} } \right)}} - 1 \le 0\;\;g_{3} = \frac{2.0}{d} - 1 \le 0; \\ & g_{4} = \frac{d}{14} - 1 \le 0;\quad \quad \quad \quad \quad \;\;\;g_{5} = \frac{0.2}{t} - 1 \le 0;\quad \quad \quad \quad \quad \;\;\;g_{6} = \frac{t}{0.8} - 1 \le 0 \\ \end{aligned} $$
(12)
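The cost and constraints of Eq. (12) can be evaluated directly. The candidate (d, t) below is a feasible point of our own choosing, not the reported optimum:

```python
from math import pi

P, L, E, sigma_y = 2500.0, 250.0, 0.85e6, 500.0  # kgf, cm, kgf/cm^2 (from the text)

def tubular_column(d, t):
    """Cost and the six constraints of Eq. (12)."""
    f = 9.82 * d * t + 2 * d
    g = [P / (pi * d * t * sigma_y) - 1,                          # yield stress
         8 * P * L**2 / (pi**3 * E * d * t * (d**2 + t**2)) - 1,  # buckling stress
         2.0 / d - 1, d / 14 - 1,                                 # diameter bounds
         0.2 / t - 1, t / 0.8 - 1]                                # thickness bounds
    return f, g

f, g = tubular_column(5.5, 0.3)  # a feasible candidate of our own choosing
print(round(f, 3), all(gi <= 0 for gi in g))  # 27.203 True
```

All six constraints are satisfied at this point, giving an upper bound on the optimal cost; the optimizers in Table 4 find lower-cost feasible designs.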

Table 4 illustrates the statistical results for the tubular column problem. The optimal solutions found by CS-GA and CS-PSO are slightly better than the solution provided by CS [17].

Table 4. Statistical results for the tubular column problem

3.4 Three Bar Truss Design Problem

This three-bar truss design problem was first presented by Nowacki [40]. The volume of a statically loaded three-bar truss is to be minimized subject to stress (\( \sigma \)) constraints on each of the truss members. The objective is to find the optimal cross-sectional areas (A1, A2). The mathematical formulation is given below:

$$ \begin{aligned} & Min{:}\;f(A_{1} ,A_{2} ) = \left( {2\sqrt 2 A_{1} + A_{2} } \right)l \\ & {\text{Subject}}\;{\text{to:}}\;g_{1} = \frac{{\sqrt 2 A_{1} + A_{2} }}{{\sqrt 2 A_{1}^{2} + 2A_{1} A_{2} }}P - \sigma \le 0;\;g_{2} = \frac{{A_{2} }}{{\sqrt 2 A_{1}^{2} + 2A_{1} A_{2} }}P - \sigma \le 0;\;g_{3} = \frac{1}{{A_{1} + \sqrt 2 A_{2} }}P - \sigma \le 0 \\ \end{aligned} $$
(13)

where 0 < A1 < 1 and 0 < A2 < 1; P = 2 kN/cm2, σ = 2 kN/cm2 and l = 100 cm.
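Equation (13) with the data above can be checked numerically. The areas below are our rounding of a near-optimal design (the best-known volume is 263.8958), and all three stress constraints are satisfied with a small margin:

```python
from math import sqrt

P, sigma, l = 2.0, 2.0, 100.0  # values given in the text

def three_bar_truss(a1, a2):
    """Volume and the three stress constraints of Eq. (13)."""
    f = (2 * sqrt(2) * a1 + a2) * l
    den = sqrt(2) * a1**2 + 2 * a1 * a2
    g1 = (sqrt(2) * a1 + a2) / den * P - sigma
    g2 = a2 / den * P - sigma
    g3 = 1 / (a1 + sqrt(2) * a2) * P - sigma
    return f, (g1, g2, g3)

f, g = three_bar_truss(0.789, 0.408)  # our rounding of a near-optimal design
print(round(f, 2), all(gi <= 0 for gi in g))  # 263.96 True
```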

Algorithms previously applied to this problem include the swarm strategy (SS) [41], the social behavior inspired optimization technique (SBO) [42], DE with dynamic stochastic selection (DEDS) [43], the hybrid evolutionary algorithm with adaptive constraint handling (HEAA) [44], DE with level comparison (DELC) [45], CS [17] and GA with flexible allowance technique (GAFAT) [18].

The results of these algorithms are listed in Table 5. It can be seen that the best function value of 263.8958 obtained by CS-GA and CS-PSO matches the best result in the literature, obtained by GAFAT.

Table 5. Statistical results of three bar truss design problem

3.5 Corrugated Bulkhead Design Problem

This problem is as an example of minimum-weight design of the corrugated bulkheads for a tanker. The variables of the problem are width (b), depth (h), length (l), and plate thickness (t). The minimum-weight requires the solution of the following optimization problem:

$$ Min{:}\;f(b,h,l,t) = \frac{{5.665t\left( {b + l} \right)}}{{b + \sqrt {l^{2} - h^{2} } }} $$
(14)

Subject to

$$ \begin{aligned} & g_{1} = th\left( {0.4b + \frac{l}{6}} \right) - 8.94\left( {b + \sqrt {l^{2} - h^{2} } } \right) \ge 0 \\ & g_{2} = th^{2} \left( {0.2b + \frac{l}{12}} \right) - 2.2\left[ {8.94\left( {b + \sqrt {l^{2} - h^{2} } } \right)} \right]^{{\frac{4}{3}}} \ge 0 \\ & g_{3} = t - 0.0156b - 0.15 \ge 0;\quad g_{4} = t - 0.0156l - 0.15 \ge 0 \\ & g_{5} = t - 1.05 \ge 0;\quad g_{6} = l - h \ge 0 \\ \end{aligned} $$

where \( 0 \le b,h,l \le 100 \) and \( 0 \le t \le 5 \).

The minimum weight and the statistical values for the corrugated bulkhead are given in Table 6. For this problem, Ravindran et al. [46] reported a minimum value of 6.84241 using a random search method. The best objective value was reported by Gandomi et al. [17], who, however, did not consider the fifth constraint because it duplicates the bound on t. A comparison of the results shows that CS-GA performs better than CS-PSO in terms of solution quality.

Table 6. Statistical results of corrugated bulkhead design problem

4 Conclusion

Hybridized versions of cuckoo search with the well-known GA and PSO algorithms were proposed and tested in this paper. In standard CS, each cuckoo lays one egg at a time; in the CS-GA algorithm, the genetic crossover strategy is used to lay more eggs. The mutation operator is used to reduce the chance of the eggs being discovered by the host bird. Since the number of nests is limited, the selection operator is used to choose the surviving cuckoos. This enlarges the search space and reduces the average fitness of the population as the algorithm progresses. GA ensures that the best solutions pass on to the next iteration/generation, so there is no risk of the best solutions being cast out of the population. The exploitation moves are local, within the neighborhood of the best solutions. CS-GA thus uses a balanced combination of local and global search.

The swarm intelligence behavior of PSO is built into standard CS in order to guide the cuckoo in selecting the best place to lay its eggs, increasing the survival rate of the best eggs. The simulation results and comparisons with a large number of competing algorithms indicate that the proposed hybrid CS-GA and CS-PSO can provide outstanding results for solving optimization problems. Although the number of variables and types of problems solved in this work is limited, and hence cannot substantiate the full range of application of the algorithms, the authors believe that the algorithms' working principles and features are appealing, and that their search efficiency can be further improved by adopting the constraint-handling techniques of [47, 48] in future development work and experimentation. Additionally, future research will investigate the performance of hybrid CS with other optimization algorithms in solving unconstrained and constrained multi-objective optimization problems as well as real-world applications.