Abstract
To date, several novel metaheuristic algorithms have been proposed for global search, but only a few that are efficient in solving global optimization problems as well as real-world application problems have become popular and attracted researchers. The Social Group Optimization (SGO) algorithm is a recent metaheuristic inspired by human social behavior that has attracted researchers due to its simplicity and problem-solving capability. In this study, to deal with the problems of low accuracy and local convergence in SGO, chaos theory is introduced into the evolutionary process of SGO. Since chaotic maps possess determinism, ergodicity, and stochastic-like behavior, replacing the constant self-introspection parameter with chaotic maps allows the proposed chaotic social group optimization algorithm to increase its convergence rate and resulting precision. The proposed chaotic SGO is validated on 13 benchmark functions, after which 9 structural engineering design problems are solved. The simulated results are competitive with those of state-of-the-art algorithms regarding convergence quality and accuracy, which confirms that the improved SGO with chaos is valid and feasible.
1 Introduction
Owing to their simplicity and gradient-free mechanism, metaheuristic optimization algorithms are becoming popular among researchers globally. According to the no-free-lunch (NFL) theorem [1], no single metaheuristic optimization algorithm can solve all optimization problems: it may solve some problems with high performance and others with low performance. Hence, researchers have invented many optimization algorithms, new ones are proposed every year, and existing algorithms are continually improved.
To date, several novel metaheuristic algorithms have been proposed for global search. These algorithms reveal improved performance in comparison to traditional optimization techniques, especially when applied to non-convex optimization problems [2]. Satapathy et al. developed a promising metaheuristic algorithm called social group optimization (SGO) in 2016, inspired by human social behavior in solving complex problems [3]. Preliminary studies suggest that SGO produces superior results compared with other metaheuristic algorithms [4, 5].
Metaheuristic algorithms consist of two essential steps: exploration and exploitation. Exploration refers to searching the whole search space and reflects the capability of a method in global search. Exploitation is the capability to find local optima around different feasible solutions. It has been observed that if an optimization algorithm has good exploration capability, it tends to lack good exploitation capability, and vice versa [6]. Previously, researchers used random walks and gradient descent methods to improve exploration and exploitation, respectively, but these increase the overall computational cost of the algorithm; hence, researchers now use chaotic maps to improve diversification and local exploitation of the search space in finding optimal solutions [7, 8]. An interesting property of chaotic systems is that a minor change in the system affects the whole system [9].
In the past, various metaheuristic optimization algorithms have been used together with chaotic sequences such as Artificial Bee Colony (ABC) optimization [10], Harmony Search (HS) [11], Particle Swarm Optimization (PSO) [12], genetic algorithm (GA) [13], differential evolution (DE) [14], simulated annealing (SA) [15], firefly algorithm (FA) [16], krill herd (KH) [17], imperialist competitive algorithm (ICA)[18], biogeography-based optimization (BBO) [19], bat algorithm (BA) [20], gravitational search algorithm (GSA) [8], Bird swarm algorithms (BSA) [21], league championship algorithms (LCA) [22], and farmland fertility (FF) [23].
Based on the SGO algorithm, in this paper a family of chaotic algorithms is proposed, called Chaotic SGO (CSGO). The insertion of chaotic maps into the structure of the CSGO algorithm is motivated by the following arguments: (1) SGO performs well on functions with various dimensions and characteristics (unimodal, multimodal, composite) [5]; thus, it is expected that this efficiency will be maintained on structural engineering problems. (2) On consulting several databases, we found that the SGO algorithm equipped with various chaotic maps has not been used in solving structural engineering problems. (3) In addition, CSGO algorithms are easy to implement and are able to maintain a good balance between exploration and exploitation, thus being able to generate promising solutions during the iterative process. To evaluate the proposed family of CSGO algorithms, 13 benchmark functions are utilized and their performances are compared with 10 metaheuristic optimization algorithms.
Normally, metaheuristic algorithms show good results on benchmark functions but may perform poorly on real-world problems; a practical problem is the true test of the problem-solving capability of an optimization algorithm. Therefore, to further evaluate the validity of the proposed family of CSGO algorithms in real-world applications, they are used to solve nine structural engineering design problems. The results reveal an improvement in the performance of the proposed algorithms due to the application of deterministic chaotic signals.
The rest of the paper is organized as follows. Section 2 presents the description of SGO. Section 3 outlines the chaotic maps that generate chaotic sequences in the SGO. Section 4 presents the proposed family of CSGO algorithms. Simulations and result analysis are presented in Sect. 5. Finally, the conclusions and directions for further research are drawn in Sect. 6.
2 Social Group Optimization (SGO) Algorithm
The SGO algorithm is based on human social behavior in solving complex problems. A person is a candidate solution, and the person's knowledge is the fitness value of the problem. Human traits are designated as the design variables of the problem, corresponding to its dimension. The SGO algorithm goes through two phases, the improving phase and the acquiring phase. In the improving phase, each individual's knowledge level is improved based on the influence of the best individual in the group; the best candidate solution is the one having the highest knowledge level and the ability to solve the problem under concern. In the acquiring phase, each person's knowledge is improved through mutual interaction with other individuals in the group and, at the same time, interaction with the best person. For a detailed description of the SGO algorithm, please refer to [3, 24]. In short, the SGO algorithm is as follows:
Let \({P}_{i}\), i = 1,2,3,…., N, be the N persons of the social group and each person \({P}_{i}\) is defined by \({P}_{i}=({p}_{i1},{p}_{i2},{p}_{i3},\dots \dots ,{p}_{iD})\) where D is the number of traits assigned to a person and \({f}_{i}\)’s are their corresponding fitness value, respectively. For every iteration, each person has to undergo the “improving” and “acquiring” phase in the hope of finding a better solution.
2.1 Improving Phase
\(P{\mathrm{new}}_{ij}=c\times {P}_{ij}+\mathrm{rand}\times \left(g{\mathrm{best}}_{j}-{P}_{ij}\right),\quad j=1,2,\dots ,D\)
Accept \(P\mathrm{new}\) if it gives better fitness than \(P\),
where \(\mathrm{rand}\) is a random number, \(\mathrm{rand}\sim U(\mathrm{0,1})\), \(g\mathrm{best}\) is the person with the best fitness in the group, and \(c\) is known as the self-introspection parameter, taking a value in (0, 1).
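As a concrete illustration, the improving phase can be sketched as follows. Python is used purely for illustration (the study's experiments were run in MATLAB), and the update rule follows the original SGO paper [3]; all names here are ours.

```python
import numpy as np

def improving_phase(P, f, fitness, c=0.2):
    """One improving-phase pass: each person moves toward the group's best
    (gbest) and keeps the new position only if fitness improves.
    Minimization is assumed; `fitness` maps a D-vector to a scalar."""
    N, D = P.shape
    gbest = P[np.argmin(f)]                    # best person's traits
    for i in range(N):
        rand = np.random.rand(D)
        P_new = c * P[i] + rand * (gbest - P[i])
        f_new = fitness(P_new)
        if f_new < f[i]:                       # greedy acceptance
            P[i], f[i] = P_new, f_new
    return P, f
```

With the greedy acceptance rule, no person's fitness can get worse in this phase.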
2.2 Acquiring Phase
Randomly select one person \({P}_{r}\), where \(i\ne r\)
If f (\({P}_{i}\)) < f (\({P}_{r}\))
\(P{\mathrm{new}}_{i}={P}_{i}+{\mathrm{rand}}_{1}\times \left({P}_{i}-{P}_{r}\right)+{\mathrm{rand}}_{2}\times \left(g\mathrm{best}-{P}_{i}\right)\)
Else
\(P{\mathrm{new}}_{i}={P}_{i}+{\mathrm{rand}}_{1}\times \left({P}_{r}-{P}_{i}\right)+{\mathrm{rand}}_{2}\times \left(g\mathrm{best}-{P}_{i}\right)\)
End If
Accept \(P\mathrm{new}\) if it gives better fitness than \(P\), where \({\mathrm{rand}}_{1}\) and \({\mathrm{rand}}_{2}\) are two independent random numbers, \({\mathrm{rand}}_{1} \sim U\left(\mathrm{0,1}\right)\) and \({\mathrm{rand}}_{2} \sim U(\mathrm{0,1})\), which provide the stochastic nature of the algorithm.
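The acquiring phase can be sketched similarly (again an illustrative Python sketch following the update rules of [3]; `gbest` denotes the best person, and names are ours):

```python
import numpy as np

def acquiring_phase(P, f, fitness):
    """One acquiring-phase pass: each person interacts with a randomly
    chosen partner and with the best person (gbest); minimization assumed."""
    N, D = P.shape
    gbest = P[np.argmin(f)]
    for i in range(N):
        r = np.random.choice([j for j in range(N) if j != i])
        r1, r2 = np.random.rand(D), np.random.rand(D)
        if f[i] < f[r]:   # person i knows more: reinforce own knowledge + gbest
            P_new = P[i] + r1 * (P[i] - P[r]) + r2 * (gbest - P[i])
        else:             # partner knows more: learn from partner + gbest
            P_new = P[i] + r1 * (P[r] - P[i]) + r2 * (gbest - P[i])
        f_new = fitness(P_new)
        if f_new < f[i]:  # greedy acceptance, as in the improving phase
            P[i], f[i] = P_new, f_new
    return P, f
```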
3 Chaotic Map
A variety of chaotic maps is available in the optimization field [25]. In this study, the 10 most widely used one-dimensional chaotic maps have been employed [26]. Their mathematical forms are as follows:
Chebyshev map
Circle map
where P = 0.5 and b = 0.2.
Gauss map
Iterative map
where P \(\in\) (0, 1) is a suitable parameter.
Logistic map
P = 4 is used for the experiments.
Piecewise map
where \(0\le P\le 0.5.\)
Sine map
Singer map
where P \(\in\) (0.9, 1.08).
Sinusoidal map
where P = 2.3.
Tent map
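For illustration, a few of the maps listed above can be implemented as simple one-dimensional iterations. This is an illustrative Python sketch; the formulas follow the commonly cited forms in the chaotic-optimization literature [26], with the parameter values stated above, and the function names are ours.

```python
import numpy as np

def logistic(x, P=4.0):            # x' = P x (1 - x), chaotic at P = 4
    return P * x * (1.0 - x)

def tent(x):                       # piecewise-linear tent map
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def sine(x, P=4.0):                # x' = (P / 4) sin(pi x)
    return (P / 4.0) * np.sin(np.pi * x)

def sinusoidal(x, P=2.3):          # x' = P x^2 sin(pi x)
    return P * x * x * np.sin(np.pi * x)

def circle(x, P=0.5, b=0.2):       # x' = mod(x + b - (P / 2pi) sin(2pi x), 1)
    return np.mod(x + b - (P / (2.0 * np.pi)) * np.sin(2.0 * np.pi * x), 1.0)

def chaotic_sequence(mapping, x0=0.7, n=100):
    """Iterate a map from the initial point x0 and return n values."""
    seq, x = [], x0
    for _ in range(n):
        x = mapping(x)
        seq.append(float(x))
    return seq
```

Starting from the initial point 0.7 used later in the paper, each map produces a deterministic yet irregular sequence in the unit interval.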
4 The Proposed Chaotic Social Group Optimization Algorithms
On consulting several databases (Scopus, Springer, Elsevier), it can be seen that much recent research has been carried out using the SGO algorithm. It has been successfully applied in many research areas such as the medical field [27, 28], civil engineering [29], optimization engineering [30], communication engineering [31], operation management [32, 33], and many more.
In metaheuristic algorithms, randomness is achieved through some probability distribution. Such randomness can be replaced with a chaotic map, which has similar randomness properties together with better statistical and dynamic properties. This dynamical mixing helps the algorithm diversify enough to reach every mode in a multimodal objective landscape. Due to the ergodicity and mixing properties of chaos, algorithms can perform the iterative search at higher speeds than standard stochastic searches based on standard probability distributions.
While going through the literature on the SGO algorithm, we found only a single paper in which SGO is combined with chaotic concepts [34]. In that paper, the self-introspection parameter 'C' is replaced by two chaotic strategies, chaotic decreasing inertia weight and chaotic random inertia weight, in which logistic maps are aggregated with two popular techniques, i.e., linear decreasing inertia weight and random inertia weight. The authors showed that these chaotic maps do not significantly affect the convergence of SGO. They then replaced the 'C' value with an adaptive chaotic inertia weight that adjusts the weight with logistic maps, introducing a chaotic sequence into the iterations, and showed that SGO with adaptive chaotic inertia weight performs better on some benchmark functions.
In the SGO algorithm, the self-introspection parameter C = 0.2 is constant for all persons in all generations, and this parameter is responsible for a person improving his/her knowledge level from the current position towards the optimum position. To increase the search capability of the algorithm, the parameter C should be changed or redefined so that a person improves at a higher speed than the standard one. This can be achieved by selecting different C values according to a chaotic function, as the insertion of chaotic maps into the structure of metaheuristic algorithms can increase the efficiency of the resulting algorithm [20, 35]. Equipping the SGO algorithm with chaotic maps aims to improve the capacity of the CSGO algorithm to avoid local minima, increase stability, and strengthen the global search. Hence, after the replacement, the potential benefits of C are retained by the chaotic numbers.
When the 'C' value in SGO is replaced by chaotic maps, CSGO becomes free of algorithm-specific parameters. Thus, when comparing CSGO with other algorithms that have different parameter settings, no parameter tuning is needed for CSGO. The selected chaotic maps, which produce chaotic numbers in (0, 1), are listed in Sect. 3. The family of CSGO algorithms may be simply classified and described as follows:
In the Chaotic SGO1 (CSGO1), Chaotic SGO2 (CSGO2), Chaotic SGO3 (CSGO3), Chaotic SGO4 (CSGO4), Chaotic SGO5 (CSGO5), Chaotic SGO6 (CSGO6), Chaotic SGO7 (CSGO7), Chaotic SGO8 (CSGO8), Chaotic SGO9 (CSGO9), and Chaotic SGO10 (CSGO10) algorithms, the self-introspection parameter C is replaced by a chaotic number generated by the Chebyshev map, circle map, Gauss map, iterative map, logistic map, piecewise map, sine map, singer map, sinusoidal map, and tent map, respectively.
Both the original SGO and the chaotic SGO algorithms thus have the same structure; the only difference between them is the self-introspection parameter, which is replaced by chaotic maps in the chaotic SGOs, with all other conditions remaining the same. Note that four random numbers are used in SGO (in the initialization phase, in the improving phase, and two in the acquiring phase), and these are not replaced by chaotic maps in the CSGOs. In the literature, chaotic maps often replace the random numbers of chaos-based stochastic algorithms, even in population initialization; in [36], the author experimentally showed that logistic-map-based initialization generates more uniformly distributed particles in the allowable search space, enhancing the stability of the algorithm. Not replacing the random numbers of the CSGO algorithms with chaotic maps makes our approach distinctive. Although there is no mathematical proof of enhanced stability of the SGO algorithm, our experiments show that the proposed CSGO algorithms improve the convergence rate and resulting precision over the SGO algorithm.
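Putting the pieces together, the only structural change from SGO to a CSGO variant is that c is regenerated from a chaotic map each generation instead of staying fixed at 0.2. The following Python sketch illustrates this; function and parameter names are ours, the logistic map is used as an example, and the update rules follow the original SGO paper [3].

```python
import numpy as np

def csgo(fitness, lb, ub, N=30, D=10, max_fes=10000,
         chaos_map=lambda x: 4.0 * x * (1.0 - x), x0=0.7):
    """Skeleton of a CSGO variant: identical to SGO except that the
    self-introspection parameter c is drawn from a chaotic map each
    generation.  Illustrative sketch only; minimization assumed."""
    P = np.random.uniform(lb, ub, (N, D))          # random initialization
    f = np.array([fitness(p) for p in P])
    fes, c = N, x0
    while fes < max_fes:
        c = chaos_map(c)                           # chaotic c replaces 0.2
        gbest = P[np.argmin(f)]
        for i in range(N):                         # improving phase
            cand = c * P[i] + np.random.rand(D) * (gbest - P[i])
            fc = fitness(cand); fes += 1
            if fc < f[i]:
                P[i], f[i] = cand, fc
        gbest = P[np.argmin(f)]
        for i in range(N):                         # acquiring phase
            r = np.random.choice([j for j in range(N) if j != i])
            r1, r2 = np.random.rand(D), np.random.rand(D)
            d = (P[i] - P[r]) if f[i] < f[r] else (P[r] - P[i])
            cand = P[i] + r1 * d + r2 * (gbest - P[i])
            fc = fitness(cand); fes += 1
            if fc < f[i]:
                P[i], f[i] = cand, fc
    return P[np.argmin(f)], float(f.min())
```

Swapping `chaos_map` for any of the ten maps in Sect. 3 yields the corresponding CSGO1 to CSGO10 variant.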
5 Simulation, Experimental Results, and Discussion
Every novel optimization algorithm must be subjected to well-defined benchmark functions to measure and test its performance. Many benchmark functions are available; however, there is no agreed standardized set for validating new algorithms. To validate and benchmark the performance of the proposed CSGO family of algorithms, simulations on 13 benchmark functions are conducted, chosen mainly because they are utilized in many papers [37,38,39,40,41]. Of the 13, 7 are unimodal and 6 are multimodal benchmark functions; detailed descriptions are given in [37,38,39,40,41]. After that, nine structural engineering design problems are considered, with detailed descriptions given in their respective cited papers. All algorithms are implemented in MATLAB 2016a under the Microsoft Windows 10 operating system, and simulations are carried out on an Intel Core i5 laptop with 8 GB memory.
5.1 Algorithm Validation
For validating the performance of the CSGO family of algorithms, the 13 benchmarks described above are employed, and the results are compared with 10 different metaheuristic algorithms: GSA [42], Whale Optimization Algorithm (WOA) [37], Henry Gas Solubility Optimization (HGSO) [43], Seagull Optimization Algorithm (SOA) [44], Marine Predators Algorithm (MPA) [45], Tunicate Swarm Algorithm (TSA) [46], Slime Mould Algorithm (SMA) [47], Sooty Tern Optimization Algorithm (STOA) [48], Harris Hawks Optimization (HHO) [38], and Ground-Tour Algorithm (GTA) [49]. In experiment 1, the CSGO family of algorithms are compared with each other, and Table 2 illustrates the comparative results. Similarly, in experiment 2, the performance of the CSGO family is compared with the other ten algorithms, and Table 3 illustrates the comparative results. In the experiments, the parameter max_FEs has been kept fixed at 10,000; hence, the number of iterations and the population size may vary for different algorithms. The algorithmic parameter settings are based on the parameters widely used by various researchers and are listed in Table 1.
5.1.1 Experiment 1: The Performance Comparison of the CSGO Family of Algorithms
In this experiment, the proposed CSGO family of algorithms, CSGO1 through CSGO10, are compared with each other. Statistical results of 30 repetitions in terms of the best (BEST), worst (WORST), average (MEAN), and standard deviation (SD) of fitness solutions are determined and reported in Table 2 to ensure stability and statistical significance, with the best results highlighted in bold. In the tables, the symbol '\(\parallel\)' indicates that the value is equal to the value in the column above.
It is seen from Table 2 that the CSGO3 algorithm reaches the global optimum for all functions except F5–F7, F10, F12, and F13, and finds better solutions than the others in 10 out of 13 cases. CSGO4 finds the best solutions in 6 cases, CSGO7 in 5 cases, and the other algorithms except CSGO9 in 4 cases out of 13, whereas CSGO9 does so in 3 cases. For the F10 function, all algorithms find an equivalent but not optimal solution. Hence, it can be said that the CSGO3 algorithm outperforms all other CSGO family algorithms.
5.1.2 Experiment 2: The Performance Comparison with Other Metaheuristics Algorithms
From experiment 1, it can be seen that CSGO3 shows superior performance compared with all algorithms of the CSGO family in terms of fitness function evaluation. Therefore, in this experiment, CSGO3 is compared with the other ten algorithms for performance validation. Statistical results of 30 repetitions in terms of the best (BEST), worst (WORST), average (MEAN), and standard deviation (SD) of fitness solutions are determined and reported in Table 3 to ensure stability and statistical significance, with the best results highlighted in bold. Table 4 reports p values of the Wilcoxon rank-sum (WRS) test [50] obtained at a 5% significance level for CSGO3 vs. the other approaches. p values less than 0.05 indicate that the null hypothesis is rejected, and 'NaN' p values mean both inputs are identical. In Table 4, "−", "+", and "≈" denote that the performance of another approach is worse than, better than, and similar to CSGO3, respectively.
Table 3 illustrates that the CSGO3 algorithm obtains the best results in most cases compared with the other algorithms on the analyzed benchmarks. As can be seen from Table 4, out of 130 cases, CSGO3 finds equivalent results in only 7 cases, the same solution in 9 cases, a worse solution in 4 cases, and better results than the others in 110 cases.
5.2 Structural Engineering Design Optimization Problems and Result Analysis
In structural engineering, design optimization problems are constrained optimization problems (COPs) that are highly nonlinear, with design variables involved in complex constraints. Such nonlinearity often results in a multimodal response landscape. Consequently, metaheuristic global optimization algorithms are used to obtain optimal solutions.
5.2.1 Constrained Optimization
A COP comprises an objective function together with some equality and inequality constraints. Lower and upper bounds of the design variables are often specified. Considering that there are \(n\) design variables, a COP can be written in the following form:
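The display equation giving this standard form appears to have been lost during extraction; the following standard statement is consistent with the symbols defined in the surrounding text:

```latex
\begin{aligned}
\text{minimize}\quad & f(X), \qquad X=(x_{1},x_{2},\dots ,x_{n})\\
\text{subject to}\quad & g_{i}(X)\le 0, \qquad i=1,2,\dots ,m\\
& h_{k}(X)=0, \qquad k=1,2,\dots ,p\\
& x_{j}^{\mathrm{lb}}\le x_{j}\le x_{j}^{\mathrm{ub}}, \qquad j=1,2,\dots ,n
\end{aligned}
```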
where \(f\left(X\right)\) is the objective function to be minimized. The functions \({g}_{i}\left(X\right)\) and \({h}_{k}\left(X\right)\) are the inequality and equality constraint functions, respectively; there are m inequality constraints and p equality constraints. The problem is a nonlinear optimization problem if at least one of \(f\left(X\right)\), \({g}_{i}\left(X\right)\), or \({h}_{k}\left(X\right)\) is nonlinear.
Most metaheuristic algorithms are designed to work on unconstrained search spaces, so solving COPs with them requires additional mechanisms to incorporate the effects of the constraints into the objective function. While solving COPs, it becomes necessary to deal with both feasible and infeasible solutions, with the latter raising more concerns. One could ignore all infeasible solutions, but since metaheuristic algorithms are stochastic search methods, completely discarding infeasible solutions may result in a loss of information about promising regions of the function landscape. A traditional approach to this problem imposes a penalty [51] on infeasible solutions: a constraint violation term is included in the penalized candidate solutions, which are then handled as an unconstrained objective function that can be optimized using unconstrained techniques.
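As a minimal sketch of the penalty approach (the penalty factor shown is an illustrative assumption, not a value from the paper):

```python
def penalized_fitness(f_val, violation, penalty=1e6):
    """Static-penalty transform: the total constraint violation V(X) is
    scaled by a large factor and added to the objective, so an
    unconstrained optimizer naturally avoids infeasible regions."""
    return f_val + penalty * violation
```

A feasible solution (zero violation) keeps its original fitness, while infeasible ones are pushed far above it.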
5.2.2 Constraint Violation
The constraint violation \(V(X)\) is the measure that indicates by how much a candidate solution \(X\) violates the given constraints:
Generally, evaluation of constraint violation in the COP is done using the following two equations:
In our study, we have used the approach (17) with m = 2.
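Assuming the common form \(V(X)=\sum_{i}{\mathrm{max}(0,{g}_{i}(X))}^{m}+\sum_{k}{\left|{h}_{k}(X)\right|}^{m}\) for the violation measure referenced as Eq. (17), the computation with m = 2 can be sketched as:

```python
def constraint_violation(g_vals, h_vals, m=2):
    """Total violation: sum of max(0, g_i)^m over the inequality
    constraints plus |h_k|^m over the equality constraints
    (m = 2 in this study).  Returns 0 for a feasible solution."""
    v = sum(max(0.0, g) ** m for g in g_vals)
    v += sum(abs(h) ** m for h in h_vals)
    return v
```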
5.2.3 Constraint Handling
In a COP, a constraint-handling technique is necessary to reach the optimal solution within the feasible region (if one exists). It mainly serves to exploit infeasible candidate solutions and extract effective information for the stochastic search process. Based on the constraint violation and the objective function value, Deb's rules [52] have been chosen for handling constraints.
While solving a COP, it is very difficult to handle the situation when active constraints are present. All equality constraints are active, and inequality constraints satisfying \({g}_{i}\left(X\right)=0\) at the global optimum are called active constraints. Therefore, problems with equality constraints should be handled carefully to obtain a high-quality solution. Equality constraints can be converted into inequality form and then easily combined with the inequality constraints; many techniques have been used for this operation. Here, we use a tolerance parameter (\({t}_{p}\)) to convert the equality constraints into inequality form. Therefore, the constraints of Eq. (14) can be written as
where \({G}_{\mathrm{ineq}}\left(X\right)\) is the inequality constraints, and \({t}_{p}\) is a tolerance parameter for the equality constraints.
Thus, the objective is to minimize the fitness function \(f\left(X\right)\) such that the optimal solution obtained satisfies all the inequality constraints \({G}_{\mathrm{ineq}}\left(X\right)\).
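Deb's rules and the equality-to-inequality conversion described above can be sketched together (illustrative Python; the default tolerance value is an assumption, and the three rules follow [52]):

```python
def to_inequality(h_vals, t_p=1e-4):
    """Convert equality constraints h_k(X) = 0 into inequalities
    |h_k(X)| - t_p <= 0 using the tolerance parameter t_p."""
    return [abs(h) - t_p for h in h_vals]

def deb_better(f_a, v_a, f_b, v_b):
    """Deb's feasibility rules: (1) a feasible solution beats an
    infeasible one; (2) between two feasible solutions, the lower
    objective wins; (3) between two infeasible solutions, the lower
    violation wins.  Returns True if candidate a is preferred over b."""
    if v_a == 0 and v_b == 0:
        return f_a < f_b          # rule 2
    if v_a == 0 or v_b == 0:
        return v_a == 0           # rule 1
    return v_a < v_b              # rule 3
```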
5.2.4 Structural Engineering Design Problems
The performance of the family of CSGO algorithms is demonstrated in this paper by solving nine structural engineering design problems, and the performance is compared with many state-of-the-art as well as recent metaheuristic algorithms from the literature.
5.2.4.1 Parameter Settings and Evaluation Criterion
- Stopping criterion: maximum number of function evaluations 20,000
- Runs: 30 independent runs
- Statistical results: best (BEST), mean (MEAN), worst (WORST), and standard deviation (SD)
- Constraint handling: Deb's rules [52]
- Initial point for all chaotic maps set to 0.7
The parameter settings for all the algorithms considered for statistical results comparisons are kept the same as mentioned in their respective papers.
Here, we apply a rule that infeasible solutions with only a slight constraint violation (from 0.01 in the first iteration down to 0.001 in the last iteration) are considered feasible. For most structural optimization problems, the global minimum is located on or close to the boundary of the feasible design space; by applying this rule, candidate solutions approach the boundaries and can reach the global minimum with higher probability [53].
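The iteration-dependent tolerance rule described above can be sketched as follows (the endpoint values 0.01 and 0.001 are from the text; the linear schedule shape is our assumption):

```python
def violation_tolerance(it, max_it, eps0=0.01, eps_end=0.001):
    """Tighten the feasibility tolerance from eps0 at the first
    iteration (it = 0) to eps_end at the last (it = max_it - 1), so
    slightly infeasible, near-boundary solutions are accepted early on."""
    frac = it / max(max_it - 1, 1)
    return eps0 + frac * (eps_end - eps0)
```

A candidate whose total violation falls below the current tolerance is then treated as feasible when Deb's rules are applied.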
5.2.4.2 Tension/Compression Spring Design Problem
The objective is to minimize the fabrication cost of a spring with three design variables, wire diameter (\({x}_{1}\)), spring coil diameter (\({x}_{2}\)), and number of active coils (\({x}_{3}\)), subject to four constraints: deflection (\({g}_{1}\left(X\right)\)), shear stress (\({g}_{2}\left(X\right)\)), surge frequency (\({g}_{3}\left(X\right)\)), and outer diameter limit (\({g}_{4}\left(X\right)\)). The spring design is shown in Fig. 1, and the formulated optimization problem is taken from [54].
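The exact formulation lives in [54]; for a self-contained illustration, the commonly cited formulation from the literature can be sketched as follows (treat this as a sketch, not necessarily identical to the formulation used in the paper):

```python
def spring_objective(x):
    """Spring weight/cost: (x3 + 2) * x2 * x1^2 (standard formulation)."""
    x1, x2, x3 = x
    return (x3 + 2.0) * x2 * x1 ** 2

def spring_constraints(x):
    """The four constraints in g_i(x) <= 0 form as commonly stated in
    the literature: deflection, shear stress, surge frequency, and the
    outer-diameter limit."""
    x1, x2, x3 = x
    g1 = 1.0 - (x2 ** 3 * x3) / (71785.0 * x1 ** 4)
    g2 = ((4.0 * x2 ** 2 - x1 * x2) / (12566.0 * (x2 * x1 ** 3 - x1 ** 4))
          + 1.0 / (5108.0 * x1 ** 2) - 1.0)
    g3 = 1.0 - (140.45 * x1) / (x2 ** 2 * x3)
    g4 = (x1 + x2) / 1.5 - 1.0
    return [g1, g2, g3, g4]
```

Commonly used bounds are \(0.05\le {x}_{1}\le 2\), \(0.25\le {x}_{2}\le 1.3\), and \(2\le {x}_{3}\le 15\).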
The optimization results and the statistical results are given in Tables 5 and 6, respectively. The results of the PO, EO, PRO, CSA, MFO, and Emperor Penguin Optimizer (EPO) algorithms are imported from [55,56,57,58,59] and [46], respectively. The results for the HGSO, MPA, TSA, and STOA algorithms are imported from [43, 45, 46], and [48], respectively. For SHO, SCA, GWO, PSO, MVO, GSA, GA, and DE, the results are imported from [44]. For CS, WOA, EHO (elephant herding optimization), and SA, the results are imported from [43]. The results for the CSGO family are obtained by us. In the tables, "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 5 and 6, it can be inferred that the CSGO family found better optimal results than the others; within the CSGO family, CSGO5 finds the best optimal result.
5.2.4.3 The Welded Beam Design Problem
The objective is to minimize the manufacturing cost of the welded beam with four design variables, thickness of the weld (\({x}_{1}\)), length of the clamped bar (\({x}_{2}\)), height of the bar (\({x}_{3}\)), and thickness of the bar (\({x}_{4}\)), subject to seven constraints: shear stress (τ), bending stress in the beam (θ), buckling load (Pc), end deflection of the beam (δ), normal stress (σ), and boundary conditions. The design of the welded beam is shown in Fig. 2, and the formulated optimization problem is taken from [54].
The optimization results and statistical results are given in Tables 7 and 8, respectively. The results of the PO, EO, PRO, CSA, MFO, and Emperor Penguin Optimizer (EPO) algorithms are imported from [55,56,57,58,59] and [56], respectively. The results for the HGSO, SOA, MPA, TSA, and STOA algorithms are imported from [43,44,45,46], and [48], respectively. For SMA and SSA, the results are imported from [57]. For SHO, GWO, PSO, MVO, SCA, GSA, GA, and DE, the results are imported from [44]. For CS, WOA, EHO (elephant herding optimization), and SA, the results are imported from [43]. The results for the CSGO family are obtained by us. In the tables, "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 7 and 8, it can be stated that the CSGO family of algorithms outperforms all other algorithms; here, all algorithms of the CSGO family perform equally well in finding the optimal result.
5.2.4.4 Cantilever Beam Design Problem
The cantilever beam is made up of five hollow square cross-sections, as shown in Fig. 3 [54]. A detailed description of this problem and the formulated optimization problem are given in [54].
The optimization results and the statistical results of the algorithms are given in Tables 9 and 10, respectively. The results of SMA, MFO, SOS (Symbiotic Organisms Search), CS, MMA (Method of Moving Asymptotes), and GCA (Generalised Convex Approximation) are imported from [47]. The results for the CSGO family are obtained by us. "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 9 and 10, it can be inferred that the CSGO family of algorithms outperforms all other algorithms. Among the CSGO family, CSGO1 finds the best optimal solution, whereas CSGO8 performs best in terms of the BEST, WORST, MEAN, and SD measures among the others.
5.2.4.5 Three-Bar Truss Design Problem
The objective is to design a truss with minimum weight that does not violate the constraints. A detailed description of this problem is given in [54], and Fig. 4 [54] shows its structural parameters. The formulated design problem is taken from [42].
The optimization results with constraint function values are given in Table 11, and the statistical results in Table 12. The results of PRO (Poor and Rich Optimization), GOA, MFO, PSO-DE, ALO, and MVO are imported from [57], and the result of CSA from [58]. The results for the CSGO family are obtained by us. The constraint values in Table 11 are calculated by us using the optimization results reported by the authors in their respective papers. "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 11 and 12, it can be stated that the CSGO family of algorithms outperforms all other algorithms. Among the CSGO family, CSGO4 finds the best optimal solution, whereas CSGO1 performs best in terms of the worst, mean, and standard deviation measures among the others.
5.2.4.6 I-Beam Design Problem
The objective is to minimize the vertical deflection of an I-beam with four design variables and two constraints, cross-section area (\({g}_{1}\left(X\right)\)) and bending stress (\({g}_{2}\left(X\right)\)), as shown in Fig. 5 [59]. This case is modified from the original problem reported in [60]; the formulated optimization problem is taken from [59].
The optimization results with constraint function values are given in Table 13, and the statistical results in Table 14. The results of ARSM, improved ARSM, and CS are imported from [59]. The results for the CSGO family are obtained by us. "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 13 and 14, it can be deduced that the CSGO family of algorithms outperforms all other algorithms. Here, all algorithms of the CSGO family find equivalent results; hence, every CSGO algorithm finds the best optimal solution as well as the best mean and standard deviation among the others.
5.2.4.7 Tubular Column Design
The objective of the tubular column design problem is to minimize the cost of designing a uniform column of tubular section, including material and construction costs, as shown in Fig. 6 [59]. The column, of length (L), is made of a material with yield stress (S), modulus of elasticity (E), and density (D), and carries a compressive load (P). This optimization problem has two design variables, the mean diameter of the column (\({x}_{1}\)) and the tube thickness (\({x}_{2}\)), and six constraints: the stress induced in the column must be less than the buckling stress (\({g}_{1}\left(X\right)\)) and the yield stress (\({g}_{2}\left(X\right)\)), the mean diameter of the column is restricted to between 2 and 14 cm (\({g}_{3}\left(X\right)\) and \({g}_{4}\left(X\right)\)), and columns with thickness outside the range 0.2–0.8 cm are not commercially available (\({g}_{5}\left(X\right)\) and \({g}_{6}\left(X\right)\)). The formulated optimization problem is taken from [59].
The optimization results with constraint values are given in Table 15, and the statistical results in Table 16. The results of Rao, the fuzzy proportional-derivative controller optimization engine (fuzzy PDCOE), and CS are imported from [60, 61] and [59], respectively. The results for the CSGO family are obtained by us. "\(-\)" represents "not given" for that particular algorithm, and the best result is highlighted in bold. From Tables 15 and 16, it can be stated that the CSGO family of algorithms outperforms all other algorithms. Among the CSGO family, CSGO7 finds the best optimal solution, whereas CSGO6 performs best in terms of mean and standard deviation among the others.
5.2.4.8 Piston Lever Design Problem
The objective is to minimize the oil volume when the lever of the piston is lifted from \({0}^{\circ}\) to \({45}^{\circ}\), as shown in Fig. 7 [59]. This problem has four design variables and four constraints: force equilibrium, maximum bending moment of the lever, minimum piston stroke, and geometrical conditions. The formulated optimization problem is taken from [59].
The optimization results of the algorithms are given in Table 17, and the statistical results in Table 18. The results of PSO, DE, GA, HPSO, HPSO with Q-learning, and CS are taken from [59]. The results for the CSGO family of algorithms were obtained by us. A “\(-\)” indicates a value not reported for that algorithm. The best result is highlighted in bold. From Tables 17 and 18, it can be stated that the CSGO1 algorithm finds the best solution in terms of both the best and the worst values, while the CS algorithm achieves the best mean solution.
5.2.4.9 Multi-plate Disc Clutch Brake Design Problem
The objective is to minimize the total weight of the multi-plate disc clutch brake, as shown in Fig. 8 [61]. This problem has five design variables: the inner radius \(r_i\) (\({x}_{1}\)), the outer radius \(r_o\) (\({x}_{2}\)), the disc thickness \(t\) (\({x}_{3}\)), the driving force \(F\) (\({x}_{4}\)), and the number of friction surfaces \(Z\) (\({x}_{5}\)). Since the problem contains eight different constraints, the feasible region accounts for only about 70% of the solution space, which makes the problem more difficult to solve. The formulation of the optimization problem follows [61].
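Constrained problems such as this one, where a sizeable share of the search space is infeasible, require a constraint-handling mechanism inside the metaheuristic. A widely used choice in this literature is Deb's feasibility rules [52]; a minimal sketch of such a pairwise comparison (not necessarily the exact mechanism used in this study) is:

```python
def total_violation(gs):
    """Sum of constraint violations; gs are values g_i(X), feasible when g_i <= 0."""
    return sum(max(0.0, g) for g in gs)

def better(f_a, gs_a, f_b, gs_b):
    """Deb's feasibility rules: return True if candidate A beats candidate B.
    1. A feasible solution beats an infeasible one.
    2. Between two feasible solutions, the lower objective value wins.
    3. Between two infeasible solutions, the smaller total violation wins."""
    va, vb = total_violation(gs_a), total_violation(gs_b)
    if va == 0.0 and vb == 0.0:
        return f_a < f_b        # rule 2
    if va == 0.0 or vb == 0.0:
        return va == 0.0        # rule 1
    return va < vb              # rule 3
```

A greedy selection step in any population-based algorithm can then call `better(...)` in place of a plain objective comparison, with no penalty weights to tune.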
The optimization results of the algorithms are given in Table 19, and the statistical results in Table 20. The results of HHO, PVS, WCA, TLBO, and WSOA are taken from [38, 62,63,64,65], respectively. Similarly, the results of hHHO–SCA, NSGA-II, and AMDE are taken from [66], and the result of the teaching–learning-based pathfinder algorithm (TLPFA) is taken from [67]. The results for the CSGO family were obtained by us. A “\(-\)” indicates a value not reported for that algorithm. The best result is highlighted in bold. From Tables 19 and 20, it can be stated that the TLPFA, CSGO1–CSGO6, CSGO9, and CSGO10 algorithms find a better optimal solution than all other algorithms. Among these, CSGO1 and CSGO4 outperform all others in terms of best (BEST), mean (MEAN), worst (WORST), and standard deviation (SD).
5.2.4.10 Corrugated Bulkhead Design Problem
The objective is to minimize the weight of the corrugated bulkhead of a tanker [68]. This problem has four design variables, the width (\({x}_{1}\)), depth (\({x}_{2}\)), length (\({x}_{3}\)), and plate thickness (\({x}_{4}\)), and six constraints. The formulation of the optimization problem follows [67].
The optimization results of the CSGO family of algorithms are given in Table 21, and the statistical results in Table 22. For this problem, Ravindran et al. [69] reported a minimum value of 6.84241 using the random search method. A comparison of the results clearly shows that the CSGO family notably improves upon the results obtained by the random search method.
In the present study, the performance of the CSGO family of algorithms is compared against state-of-the-art as well as recent metaheuristic algorithms in solving structural optimization problems. The extensive comparative study reveals that the CSGO family performs better than the existing algorithms considered. This is partly because its only algorithm-specific parameter, the self-introspection parameter, is replaced by chaotic maps, whereas other algorithms such as GA, DE, PSO, MBA, and CS require the tuning of at least one algorithm-specific parameter. Being simpler and more robust than competing algorithms, the CSGO family is able to solve a wide variety of problems, and it avoids the risk of compromised performance due to improper parameter tuning.
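To make the chaotic replacement concrete, the improving phase of SGO with a chaotic self-introspection parameter can be sketched as below. The update \(X_{new} = c\,X_{old} + r\,(gbest - X_{old})\) follows the standard SGO improving phase [3], and the logistic map is one of the maps considered in this study; the variable names and the sphere test function are illustrative only, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(42)

def logistic_map(x):
    """Logistic map x_{k+1} = 4 x_k (1 - x_k); chaotic on (0, 1)."""
    return 4.0 * x * (1.0 - x)

def improving_phase(pop, fitness, objective, c):
    """One SGO improving phase: each person moves toward the best person.
    Plain SGO fixes c = 0.2; CSGO feeds in a chaotic-map value instead."""
    best = pop[np.argmin(fitness)].copy()
    for i in range(len(pop)):
        r = rng.random(pop.shape[1])
        candidate = c * pop[i] + r * (best - pop[i])
        f_new = objective(candidate)
        if f_new < fitness[i]:            # greedy selection, as in SGO
            pop[i], fitness[i] = candidate, f_new
    return pop, fitness

# Illustrative usage: minimize the sphere function with a chaotic c.
sphere = lambda x: float(np.sum(x ** 2))
pop = rng.uniform(-5.0, 5.0, size=(20, 5))
fitness = np.array([sphere(p) for p in pop])
init_best = fitness.min()
c = 0.7        # chaotic seed; avoid 0, 0.25, 0.5, 0.75, 1, which collapse to fixed points
for _ in range(50):
    c = logistic_map(c)
    pop, fitness = improving_phase(pop, fitness, sphere, c)
```

Because the greedy selection never accepts a worse person, the best fitness is non-increasing across generations, while the chaotic sequence varies the exploitation strength `c` from one generation to the next.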
6 Conclusion
In this paper, the family of CSGO algorithms is proposed as an improved version of the SGO algorithm for solving optimization problems. The family of ten CSGO algorithms is designed by using ten chaotic maps in place of the self-introspection parameter, which improves the performance of the SGO algorithm. The performance of the CSGO family is validated on 13 benchmark functions and, to evaluate its effectiveness, extensive experiments are conducted on 9 structural engineering problems, with the results compared against many popular optimization algorithms. The extensive experiments and the promising results indicate the superiority of the proposed CSGO family in providing acceptable results over a wide range of problems. Although not yet mathematically proven, these experimental studies show that using chaotic maps is generally more useful than the constant value of 0.2 for the self-introspection parameter in SGO. Since CSGO replaces only the self-introspection parameter, replacing the random numbers in SGO with chaotic maps remains a topic for further research. Furthermore, chaotic mapping strategies could be used to generate a high-quality initial population for obtaining rapid and better solutions. Hybridizing chaotic maps with SGO in other ways, and applying these chaotic concepts to clustering and classification problems, large-scale optimization, and multiobjective optimization, can also be investigated in future work.
Data Availability Statement
Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.
References
Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1), 67–82.
Talbi, E. G. (2009). Metaheuristics: From design to implementation. Wiley.
Satapathy, S., & Naik, A. (2016). Social group optimization (SGO): A new population evolutionary optimization technique. Complex & Intelligent Systems, 2(3), 173–203.
Naik, A., Satapathy, S. C., Ashour, A. S., & Dey, N. (2018). Social group optimization for global optimization of multimodal functions and data clustering problems. Neural Computing and Applications, 30(1), 271–287.
Naik, A., & Satapathy, S. C. (2021). A comparative study of social group optimization with a few recent optimization algorithms. Complex & Intelligent Systems, 7(1), 249–295.
Eiben, A. E., & Schippers, C. A. (1998). On evolutionary diversification and intensification. Fundamenta Informaticae, 35(1/4), 35–50.
Gandomi, A. H., Yun, G. J., Yang, X. S., & Talatahari, S. (2013). Chaos-enhanced accelerated particle swarm optimization. Communications in Nonlinear Science and Numerical Simulation, 18(2), 327–340.
Mirjalili, S., & Gandomi, A. H. (2017). Chaotic gravitational constants for the gravitational search algorithm. Applied Soft Computing, 53, 407–419.
Rather, S. A., & Bala, P. S. (2020). Swarm-based chaotic gravitational search algorithm for solving mechanical engineering design problems. World Journal of Engineering., 17, 97–114.
Alatas, B. (2010). Chaotic bee colony algorithms for global numerical optimization. Expert Systems with Applications, 37(8), 5682–5687.
Alatas, B. (2010). Chaotic harmony search algorithms. Applied Mathematics and Computation, 216(9), 2687–2699.
Alatas, B., Akin, E., & Ozer, A. B. (2009). Chaos embedded particle swarm optimization algorithms. Chaos, Solitons & Fractals, 40(4), 1715–1734.
Coello, C. A. C., & Montes, E. M. (2002). Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Advanced Engineering Informatics, 16(3), 193–203.
Guo, H., Li, Y., Liu, X., Li, Y., & Sun, H. (2016). An enhanced self-adaptive differential evolution based on simulated annealing for rule extraction and its application in recognizing oil reservoir. Applied Intelligence, 44(2), 414–436.
Mingjun, J., & Huanwen, T. (2004). Application of chaos in simulated annealing. Chaos, Solitons & Fractals, 21(4), 933–941.
Gandomi, A. H., Yang, X. S., Talatahari, S., & Alavi, A. H. (2013). Firefly algorithm with chaos. Communications in Nonlinear Science and Numerical Simulation, 18(1), 89–98.
Wang, G. G., Guo, L., Gandomi, A. H., Hao, G. S., & Wang, H. (2014). Chaotic krill herd algorithm. Information Sciences, 274, 17–34.
Talatahari, S., Azar, B. F., Sheikholeslami, R., & Gandomi, A. H. (2012). Imperialist competitive algorithm combined with chaos for global optimization. Communications in Nonlinear Science and Numerical Simulation, 17(3), 1312–1319.
Saremi, S., Mirjalili, S., & Lewis, A. (2014). Biogeography-based optimisation with chaos. Neural Computing and Applications, 25(5), 1077–1097.
Gandomi, A. H., & Yang, X. S. (2014). Chaotic bat algorithm. Journal of Computational Science, 5(2), 224–232.
Varol Altay, E., & Alatas, B. (2020). Bird swarm algorithms with chaotic mapping. Artificial Intelligence Review, 53(2), 1373–1414.
Bingol, H., & Alatas, B. (2016). Chaotic league championship algorithms. Arabian Journal for Science and Engineering, 41(12), 5123–5147.
Gharehchopogh, F. S., Nadimi-Shahraki, M. H., Barshandeh, S., Abdollahzadeh, B., & Zamani, H. (2022). CQFFA: A chaotic quasi-oppositional farmland fertility algorithm for solving engineering optimization problems. Journal of Bionic Engineering, 20, 1–26.
Naik, A., Satapathy, S. C., & Abraham, A. (2020). Modified Social Group Optimization—A meta-heuristic algorithm to solve short-term hydrothermal scheduling. Applied Soft Computing, 95, 106524.
He, D., He, C., Jiang, L. G., Zhu, H. W., & Hu, G. R. (2001). Chaotic characteristics of a one-dimensional iterative map with infinite collapses. IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 48(7), 900–906.
Tavazoei, M. S., & Haeri, M. (2007). Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms. Applied Mathematics and Computation, 187(2), 1076–1085.
Singh, A. K., Kumar, A., Mahmud, M., Kaiser, M. S., & Kishore, A. (2021). COVID-19 infection detection from chest X-ray images using hybrid social group optimization and support vector classifier. Cognitive Computation. https://doi.org/10.1007/s12559-021-09848-3
Dey, N., Rajinikanth, V., Fong, S. J., Kaiser, M. S., & Mahmud, M. (2020). Social group optimization–assisted Kapur’s entropy and morphological segmentation for automated detection of COVID-19 infection from computed tomography images. Cognitive Computation, 12(5), 1011–1023.
Das, S., Saha, P., Satapathy, S. C., & Jena, J. J. (2021). Social group optimization algorithm for civil engineering structural health monitoring. Engineering Optimization, 53(10), 1651–1670.
Kalananda, V. K. R. A., & Komanapalli, V. L. N. (2021). A combinatorial social group whale optimization algorithm for numerical and engineering optimization problems. Applied Soft Computing, 99, 106903.
Swathi, A. V. S., Chakravarthy, V. V. S. S. S., & Krishna, M. V. (2021). Circular antenna array optimization using modified social group optimization algorithm. Soft Computing, 25(15), 10467–10475.
Akhtar, M., Manna, A. K., & Bhunia, A. K. (2023). Optimization of a non-instantaneous deteriorating inventory problem with time and price dependent demand over finite time horizon via hybrid DESGO algorithm. Expert Systems with Applications, 211, 118676.
Naik, A., & Chokkalingam, P. (2022). Binary social group optimization algorithm for solving 0–1 knapsack problem. Decision Science Letters, 11(1), 55–72.
Jena, J. J., & Satapathy, S. C. (2021). A new adaptive tuned Social Group Optimization (SGO) algorithm with sigmoid-adaptive inertia weight for solving engineering design problems. Multimedia Tools and Applications. https://doi.org/10.1007/s11042-021-11266-4
Yu, J., Kim, C. H., Wadood, A., Khurshiad, T., & Rhee, S. B. (2018). A novel multi-population based chaotic JAYA algorithm with application in solving economic load dispatch problems. Energies, 11(8), 1946.
Tian, D., & Shi, Z. (2018). MPSO: Modified particle swarm optimization and its applications. Swarm and Evolutionary Computation, 41, 49–68.
Mirjalili, S., & Lewis, A. (2016). The whale optimization algorithm. Advances in Engineering Software, 95, 51–67.
Heidari, A. A., Mirjalili, S., Faris, H., Aljarah, I., Mafarja, M., & Chen, H. (2019). Harris hawks optimization: Algorithm and applications. Future Generation Computer Systems, 97, 849–872.
Nematollahi, A. F., Rahiminejad, A., & Vahidi, B. (2017). A novel physical based meta-heuristic optimization method known as lightning attachment procedure optimization. Applied Soft Computing, 59, 596–621.
Nematollahi, A. F., Rahiminejad, A., & Vahidi, B. (2020). A novel meta-heuristic optimization method based on golden ratio in nature. Soft Computing, 24(2), 1117–1151.
Moghdani, R., & Salimifard, K. (2018). Volleyball premier league algorithm. Applied Soft Computing, 64, 161–185.
Rashedi, E., Nezamabadi-Pour, H., & Saryazdi, S. (2009). GSA: A gravitational search algorithm. Information Sciences, 179(13), 2232–2248.
Hashim, F. A., Houssein, E. H., Mabrouk, M. S., Al-Atabany, W., & Mirjalili, S. (2019). Henry gas solubility optimization: A novel physics-based algorithm. Future Generation Computer Systems, 101, 646–667.
Dhiman, G., & Kumar, V. (2019). Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowledge-Based Systems, 165, 169–196.
Faramarzi, A., Heidarinejad, M., Mirjalili, S., & Gandomi, A. H. (2020). Marine predators algorithm: A nature-inspired metaheuristic. Expert Systems with Applications, 152, 113377.
Kaur, S., Awasthi, L. K., Sangal, A. L., & Dhiman, G. (2020). Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Engineering Applications of Artificial Intelligence, 90, 103541.
Li, S., Chen, H., Wang, M., Heidari, A. A., & Mirjalili, S. (2020). Slime mould algorithm: A new method for stochastic optimization. Future Generation Computer Systems, 111, 300–323.
Dhiman, G., & Kaur, A. (2019). STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Engineering Applications of Artificial Intelligence, 82, 148–174.
Meirelles, G., Brentan, B., Izquierdo, J., & Luvizotto, E., Jr. (2020). Grand tour algorithm: Novel swarm-based optimization for high-dimensional problems. Processes, 8(8), 980.
Derrac, J., García, S., Molina, D., & Herrera, F. (2011). A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation, 1(1), 3–18.
Koziel, S., & Michalewicz, Z. (1999). Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evolutionary Computation, 7(1), 19–44.
Deb, K. (2000). An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering, 186(2–4), 311–338.
Kaveh, A., & Talatahari, S. (2009). A particle swarm ant colony optimization for truss structures with discrete variables. Journal of Constructional Steel Research, 65(8–9), 1558–1568.
Belegundu, A. D. (1983). A study of mathematical programming methods for structural optimization. Dissertation Abstracts International, Part B: Science and Engineering, 43.
Askari, Q., Younas, I., & Saeed, M. (2020). Political optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowledge-Based Systems, 195, 105709.
Faramarzi, A., Heidarinejad, M., Stephens, B., & Mirjalili, S. (2020). Equilibrium optimizer: A novel optimization algorithm. Knowledge-Based Systems, 191, 105190.
Moosavi, S. H. S., & Bardsiri, V. K. (2019). Poor and rich optimization algorithm: A new human-based and multi populations algorithm. Engineering Applications of Artificial Intelligence, 86, 165–181.
Askarzadeh, A. (2016). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Computers & Structures, 169, 1–12.
Gandomi, A. H., Yang, X. S., & Alavi, A. H. (2013). Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Engineering with Computers, 29(1), 17–35.
Rao, S. S. (2019). Engineering optimization: Theory and practice. Wiley.
Hsu, Y. L., & Liu, T. C. (2007). Developing a fuzzy proportional–derivative controller optimization engine for engineering design optimization problems. Engineering Optimization, 39(6), 679–700.
Che, Y., & He, D. (2021). A Hybrid whale optimization with seagull algorithm for global optimization problems. Mathematical Problems in Engineering, 2021, 1–31.
Savsani, P., & Savsani, V. (2016). Passing vehicle search (PVS): A novel metaheuristic algorithm. Applied Mathematical Modelling, 40(5–6), 3951–3978.
Zheng, Y. J. (2015). Water wave optimization: A new nature-inspired metaheuristic. Computers & Operations Research, 55, 1–11.
Rao, R. V., Savsani, V. J., & Vakharia, D. P. (2011). Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Computer-aided Design, 43(3), 303–315.
Kamboj, V. K., Nandi, A., Bhadoria, A., & Sehgal, S. (2020). An intensify Harris Hawks optimizer for numerical and engineering optimization problems. Applied Soft Computing, 89, 106018.
Tang, C., Zhou, Y., Tang, Z., & Luo, Q. (2021). Teaching-learning-based pathfinder algorithm for function and engineering optimization problems. Applied Intelligence, 51(7), 5040–5066.
Kvalie, D. (1967). Optimization of plane elastic grillages. Doctoral dissertation, PhD Thesis, Norges Teknisk Naturvitenskapelige Universitet, Norway.
Ravindran, A., Reklaitis, G. V., & Ragsdell, K. M. (2006). Engineering optimization: methods and applications. Wiley.
Ethics declarations
Conflict of Interest
The author declares that she has no conflict of interest in the publication of this paper.
Naik, A. Chaotic Social Group Optimization for Structural Engineering Design Problems. J Bionic Eng 20, 1852–1877 (2023). https://doi.org/10.1007/s42235-023-00340-2