Abstract
In this paper, the combination of particle swarm optimization (PSO) and the gravitational search algorithm (GSA) is enhanced by the first-order gradient method, and a new optimization algorithm named GPSG is introduced. In metaheuristic methods, search directions are selected randomly and the resulting information gradually progresses toward the optimal solution. Since moving along the negative gradient direction usually yields the largest decrease in the function under consideration, this direction is added to the GSA and PSO process to allow faster and more accurate convergence. By integrating the metaheuristic methods with gradient directions, a powerful method for optimizing functions is obtained: a novel approach that merges two metaheuristic methods with randomized steepest descent directions. Numerous unconstrained mathematical functions of CEC2005 and CEC2017, together with constrained stress and displacement structural design problems, have been chosen to demonstrate the reliability and capability of the presented method. Comparison of the numerical results with those of other methods indicates that the average rank of the proposed technique is better.
1 Introduction
Optimization methods are divided into two main groups. The first group consists of traditional mathematical programming techniques (Vanderplaats 1999; Rao 2009; Haftka et al. 1990). Most of these techniques choose search directions using the first- and higher-order derivatives of the functions under consideration; the search directions and step lengths are then modified to reach the desired optimal solution. For multimodal functions, however, the final result may not be the global optimum. These methods are referred to as gradient-based approaches (GBA).
The second group comprises modern optimization methods that rely on statistical search directions derived from different behaviors in nature (Yang 2010; Parmee 2001; Gandomi et al. 2013; Du and Swamy 2016; Yang et al. 2016; Siddique and Adeli 2017; Zhou et al. 2013; Akhtar et al. 2020). In these methods, the relevant variables are randomly selected in the design space and, through intelligent computational procedures, the results gradually move toward the optimal design. Because the whole design space is explored, it is possible to reach a near-global optimal design (Osiński et al. 2013). Several intelligent methods are based on the social life of creatures, such as particle swarm optimization (PSO) (Kennedy and Eberhart 1995), artificial immune systems (AIS) (Dasgupta 2006; Farmer et al. 1986), the ant colony search algorithm (Dorigo and Stützle 2004) and the harmony search algorithm (Geem 2010; Geem et al. 2001). Another category is inspired by principles of disciplines such as physics and chemistry, including simulated annealing (Kirkpatrick et al. 1983; Aarts and Korst 1989), quantum computing (Benioff 1980), central force optimization (Formato 2007, 2008), the gravitational search algorithm (Rashedi et al. 2009, 2018) and chemical reaction optimization (Guggenheim 1967; Lam and Li 2010).
Among the heuristic optimization methods, PSO is attractive and has been studied by many researchers (Kashani et al. 2020). Different variants of PSO have been successfully presented in the literature (Kumar et al. 2020, 2021; Das et al. 2021). Another interesting approach is the gravitational search algorithm (GSA), which was presented by researchers at the Shahid Bahonar University of Kerman and has since received various modifications (Ebrahimi Mood et al. 2015).
The weakness of standard random metaheuristic methods is their slow convergence: they require a large number of particles in the design space and many iterations. If these methods are merged with deterministic methods such as gradient-based approaches (GBA), the optimal results can be achieved more efficiently, with rapid convergence.
In this paper, the PSO and GSA methods are combined with the GBA, and a new method called GPSG (GSA + PSO + GBA) is introduced. To evaluate the performance of the approach, 25 mathematical optimization functions of CEC2005 (Suganthan et al. 2005) and 29 complicated multimodal functions of CEC2017 (Awad et al. 2017) are examined. In addition, three truss design problems from the literature are chosen to verify the convergence of the approach. The numerical results of PSO, GSA and GPSG are compared, and the performance of GPSG is observed to be much better than that of the other approaches. In the following sections, a brief description of PSO and GSA is first outlined, and then the details of GPSG are presented. The numerical results of the three approaches are also compared with some of the methods in the literature.
2 Optimization algorithms
This section presents the basic ideas of PSO, GSA, GBA and the proposed GPSG. The pseudo-code is outlined to clarify the main steps of the approach.
2.1 Particle swarm optimization (PSO)
Kennedy and Eberhart (1995) developed a metaheuristic optimization method based on a random search. In this algorithm, a random initial population with a certain number of particles is generated in the design space as indicated in (1).
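In standard PSO notation, (1) presumably takes the form of a uniform random initialization within the variable bounds (the bounds notation \(x_j^{\min}\), \(x_j^{\max}\) is an assumption of this sketch):

```latex
\mathbf{X}_i = \left[x_{i1}, x_{i2}, \dots, x_{id}\right], \qquad
x_{ij} = x_j^{\min} + r_{ij}\left(x_j^{\max} - x_j^{\min}\right), \qquad
i = 1, 2, \dots, NP,
```

where \(r_{ij}\) is a uniform random number in \([0,1]\).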
where \({{\varvec{X}}}_{i}\) is the vector of the design variables of the particle \(i\), \(n\) is the number of design variables, \(d\) denotes the dimension of the problem, and \(NP\) is the number of all particles.
The particles share their information with one another, so each particle can adjust its position according to its previous experience, making the best use of its own knowledge and that of its neighbors. As a result, as the optimization progresses, the movement of each particle toward the best possible response is governed by both its self-awareness and the collective intelligence of the swarm.
In the PSO method, the particle velocity can be calculated according to (2).
where \({{\varvec{V}}}_{i}^{t}\) and \({{\varvec{X}}}_{i}^{t}\) are the velocity and position of particle \(i\) at time \(t\). The vector \({{\varvec{P}}}_{i}^{t}\) (Pbest) is the best position visited by particle \(i\), and \({{\varvec{P}}}_{\text{g}}^{t}\) (Gbest) is the best position found by the whole swarm; the index g stands for global. The parameter \(\omega \) is the inertia weight, which expresses the importance of the velocity of the preceding iterations and decreases linearly from 0.9 to 0.4 over time. The scalars \({c}_{1}\) and \({c}_{2}\) are coefficients that determine the significance of Pbest and Gbest, respectively; in this study, an experimental value of 0.5 is used for both. The coefficients \({r}_{1}\) and \({r}_{2}\) are random scalars with a uniform distribution in the interval [0, 1] that maintain the randomness of the algorithm. Note that all particle velocities are initially set to zero. The first part of (2) is a portion of the velocity of the previous iteration; the second and third parts are, respectively, the contributions of the particle in question and of the whole swarm, accumulated from the beginning of the search up to time \(t\). PSO thus keeps a memory of the best results of all previous iterations.
The particles are updated according to (3).
By repeating the process, the optimal solution to the problem can be obtained.
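As a concrete illustration, the following is a minimal sketch of the updates in (2) and (3), assuming NumPy arrays of shape (NP, d) and the parameter values stated above (linear inertia decay from 0.9 to 0.4, and \(c_1 = c_2 = 0.5\)):

```python
import numpy as np

def pso_step(X, V, pbest, gbest, t, T, c1=0.5, c2=0.5):
    """One PSO iteration, Eqs. (2)-(3): X, V, pbest are (NP, d) arrays,
    gbest is a (d,) array. c1 = c2 = 0.5 follows the paper."""
    NP, d = X.shape
    omega = 0.9 - 0.5 * t / T                  # inertia decays 0.9 -> 0.4
    r1 = np.random.rand(NP, d)                 # uniform [0, 1) randomness
    r2 = np.random.rand(NP, d)
    V = omega * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)  # Eq. (2)
    return X + V, V                            # Eq. (3)
```

With Pbest and Gbest coinciding with the current positions and zero velocity, the step leaves the swarm unchanged, as expected from (2).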
2.2 Gravitational search algorithm (GSA)
To minimize the objective function, a set of particles is randomly distributed in the design space and updated over time; the position of each particle is defined as in (1). The gravitational search algorithm is inspired by the law of gravity in nature, using Newton’s laws. In summary, the acceleration vector of the \(i\)th particle in iteration \(t\) can be specified as (Rashedi et al. 2009, 2018);
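The standard GSA acceleration (Rashedi et al. 2009), consistent with the symbols defined below, presumably reads:

```latex
\mathbf{a}_i^{t} = \sum_{j=1,\, j \neq i}^{NP} rand_j \, G^{t}\,
\frac{M_j^{t}}{R_{ij}^{t} + \varepsilon}\left(\mathbf{X}_j^{t} - \mathbf{X}_i^{t}\right)
```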
where \(G\) is the gravitational constant, \({M}_{j}\) is the mass of the \(j\)th particle, which is evaluated from the objective function values of the particles (Rashedi et al. 2009), \({R}_{ij}\) is the distance between particles \(i\) and \(j\), \({rand}_{j}\) is a random scalar with uniform distribution in the interval [0, 1], and \(\upvarepsilon \) is a small number that prevents numerical errors.
The gravitational constant \(G\) decreases with time and is defined as (Rashedi et al. 2009);
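In the standard GSA (Rashedi et al. 2009), this is the exponentially decaying law:

```latex
G^{t} = G_0 \, e^{-\alpha t / T}
```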
where \({G}_{0}\) and \(\alpha \) are two constant coefficients, \(t\) is the current iteration and \(T\) is the maximum number of iterations.
The updated mass of the particles is evaluated by (6) and normalized according to the relation (7).
in which
where \({F}_{i}(t)\) is the fitness value (objective function) of the \(i\)th particle at time \(t\).
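Explicitly, the standard GSA mass update and normalization (Rashedi et al. 2009), which (6)-(9) presumably follow, are (for minimization):

```latex
m_i(t) = \frac{F_i(t) - \mathrm{worst}(t)}{\mathrm{best}(t) - \mathrm{worst}(t)}, \qquad
M_i(t) = \frac{m_i(t)}{\sum_{j=1}^{NP} m_j(t)},
```

with \(\mathrm{best}(t) = \min_j F_j(t)\) and \(\mathrm{worst}(t) = \max_j F_j(t)\), so that the best particle receives the largest mass.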
The updated gravitational velocity vector is calculated from (10), assuming a one-second interval between the iterations.
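With a unit time step, the standard GSA velocity update corresponding to (10) is:

```latex
\mathbf{v}_i^{t+1} = rand_i \, \mathbf{v}_i^{t} + \mathbf{a}_i^{t},
```

where \(rand_i\) is a uniform random number in \([0,1]\).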
Then, the position of the particles is updated according to (3).
2.3 The first order gradient-based algorithm (GBA)
Using the Taylor series and retaining the first-order terms, the position of the updated particles is obtained as (11). The resulting search direction defines the gradient-based approach (GBA) (Vanderplaats 1999; Rao 2009; Haftka et al. 1990).
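From the first-order Taylor expansion described above, (11) is the steepest descent update:

```latex
\mathbf{X}_i^{t+1} = \mathbf{X}_i^{t} - \Upsilon \, \nabla f\!\left(\mathbf{X}_i^{t}\right)
```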
in which \(\Upsilon\) is the step length and the operator \(\nabla \) denotes the gradient. With (11), the search moves along the negative gradient of the function under consideration, referred to as the steepest descent direction, which yields the largest local reduction in the function. Special care should be taken in choosing the step length \(\Upsilon\). For compatibility with the metaheuristic methods, the gradient directions are randomized as explained in the next section.
2.4 Hybrid method (GPSG)
Metaheuristic methods such as PSO and GSA have a good capability of exploring the whole design space, and the combination of PSO and GSA improves the quality of exploration (Tsai et al. 2013). However, all heuristic approaches, and their combinations, have low exploitation (local search) power. On the other hand, the GBA is powerful in exploitation but weak in exploration (global search) and may become trapped in local optima. Thus, merging the three approaches is a promising way to obtain a fast and powerful method. Since the heuristic methods are random while the gradient method is deterministic, randomization must be introduced into the combination; in addition, only a fraction of each method’s velocity should enter the combined velocity.
The velocity of the particles for PSO, GSA and GBA is given in (12), (13) and (14), respectively, with some modifications.
The parameter \(sv\) is given by
The velocity of the GBA is normalized to obtain a unit vector along the negative gradient direction and is then multiplied by \(sv\) so that the length of the GBA search direction matches the resultant of the PSO and GSA velocities. In other words, \(sv\) is a step length for the unit vector of the negative gradient.
Comparing (12) and the original PSO velocity indicates that the first term of (2) is omitted because its effects are considered in the velocity of the GSA (13).
Finally, the velocity of the combined approach (GPSG) is obtained by (16).
In this formulation, \({r}_{3}\) is a random number between zero and one. \({C}_{g}=2\) is a statistically appropriate coefficient because the average of \({C}_{g}{r}_{3}\) tends to one; therefore, the magnitude of the GBA term (\(sv\)) does not change much on average.
The value of \({C}_{t}\) is taken as 0.5 to prevent the expected magnitude of the resultant of the three velocity vectors from growing statistically.
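The exact expressions of (12)-(16) are not reproduced here; a plausible reading consistent with the description above (the omitted inertia term, the normalized gradient scaled by \(sv\), and the coefficients \(C_t\) and \(C_g\)) is:

```latex
\mathbf{V}_i^{PSO} = c_1 r_1 \left(\mathbf{P}_i^{t} - \mathbf{X}_i^{t}\right)
                   + c_2 r_2 \left(\mathbf{P}_g^{t} - \mathbf{X}_i^{t}\right), \qquad
\mathbf{V}_i^{GSA} = rand_i \, \mathbf{v}_i^{t} + \mathbf{a}_i^{t},
```
```latex
\mathbf{V}_i^{GBA} = -\,sv \, \frac{\nabla f(\mathbf{X}_i^{t})}{\left\lVert \nabla f(\mathbf{X}_i^{t}) \right\rVert}, \qquad
sv = \left\lVert \mathbf{V}_i^{PSO} + \mathbf{V}_i^{GSA} \right\rVert,
```
```latex
\mathbf{V}_i^{GPSG} = C_t \left(\mathbf{V}_i^{PSO} + \mathbf{V}_i^{GSA}
                     + C_g r_3 \, \mathbf{V}_i^{GBA}\right).
```

In particular, the definition of \(sv\) as the norm of the PSO-GSA resultant, and the grouping of \(C_t\) over all three terms, are inferences from the text rather than the published equations.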
Finally, the position of each particle is updated by the GPSG velocity presented in (17).
The main idea of the proposed method is that, for minimization problems, the reduction of the objective function along its negative gradient direction (GBA) is larger than along any other direction. The drawback of the gradient direction, however, is that the search may fall into a local solution. Therefore, the gradient vector is randomized and its length is normalized for compatibility with the resultant directions generated by PSO and GSA. In each iteration, the resultant of the PSO, GSA and GBA velocities (GPSG) drives the optimization process. The fast convergence of the numerical results with a small initial population indicates the high exploration and exploitation capability of the proposed GPSG method.
The steps of the GPSG hybrid method are presented in the following pseudo-code (comments are marked with %);
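The steps above can be assembled into a runnable sketch. Parameter values stated in the paper (\(c_1=c_2=0.5\), \(C_t=0.5\), \(C_g=2\), \(\alpha=4\)) are kept, while \(G_0\), the form of \(sv\), and the clipping of positions to the bounds are assumptions of this sketch:

```python
import numpy as np

def gpsg(f, grad_f, lb, ub, NP=5, T=200, c1=0.5, c2=0.5, Ct=0.5, Cg=2.0,
         G0=100.0, alpha=4.0, eps=1e-12, seed=0):
    """Sketch of the GPSG loop. c1, c2, Ct, Cg, alpha follow the paper;
    G0, the form of sv and the bound handling are assumptions."""
    rng = np.random.default_rng(seed)
    d = lb.size
    X = lb + rng.random((NP, d)) * (ub - lb)           # Eq. (1): random population
    V = np.zeros((NP, d))
    pbest = X.copy()
    pbest_f = np.array([f(x) for x in X])
    g = np.argmin(pbest_f)
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for t in range(T):
        fit = np.array([f(x) for x in X])
        best, worst = fit.min(), fit.max()
        # GSA masses (minimization: the best particle gets the largest mass)
        m = (fit - worst) / (best - worst) if best != worst else np.ones(NP)
        M = m / m.sum()
        G = G0 * np.exp(-alpha * t / T)                # Eq. (5)
        A = np.zeros((NP, d))                          # Eq. (4): acceleration
        for i in range(NP):
            for j in range(NP):
                if j != i:
                    R = np.linalg.norm(X[i] - X[j])
                    A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / (R + eps)
        V_gsa = rng.random((NP, 1)) * V + A            # Eqs. (10)/(13)
        r1, r2 = rng.random((NP, d)), rng.random((NP, d))
        V_pso = c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)      # Eq. (12)
        sv = np.linalg.norm(V_pso + V_gsa, axis=1, keepdims=True)  # Eq. (15), assumed
        Gr = np.array([grad_f(x) for x in X])
        V_gba = -sv * Gr / (np.linalg.norm(Gr, axis=1, keepdims=True) + eps)
        r3 = rng.random((NP, 1))
        V = Ct * (V_pso + V_gsa + Cg * r3 * V_gba)     # Eq. (16), assumed grouping
        X = np.clip(X + V, lb, ub)                     # Eq. (17) + bound handling
        fit = np.array([f(x) for x in X])
        better = fit < pbest_f
        pbest[better], pbest_f[better] = X[better], fit[better]
        g = np.argmin(pbest_f)
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f
```

For example, `gpsg(lambda v: float(np.sum(v * v)), lambda v: 2.0 * v, np.full(2, -5.0), np.full(2, 5.0))` minimizes a 2-variable sphere function with the small population (NP = 5) that the paper reports as sufficient for GPSG.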
2.5 Optimization problem formulation
The general form of a constrained optimization problem can be specified as follows:
where \(f\left({\varvec{X}}\right)\) and \({g}_{k}\left({\varvec{X}}\right)\) represent the objective function and constraints, respectively. The value of K represents the number of constraints.
To convert the constrained structural problems into the unconstrained functions, the penalty function should be used as (Salajegheh and Salajegheh 2019);
where \(c\) is a scalar penalty coefficient that increases the objective function if the constraints are violated, and \(F\left({\varvec{X}}\right)\) is the penalized objective function. The parameters \(\mu \) and \(\lambda \) are the coefficients needed to calculate \(c\); in this study, \(\mu \) is taken as 0.5.
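As an illustration, one common exterior-penalty form consistent with this description follows; the multiplicative form, and leaving the schedule of \(c\) (via \(\mu\) and \(\lambda\)) to the caller, are assumptions of this sketch, with the exact expressions given in Salajegheh and Salajegheh (2019):

```python
def penalized(f, constraints, X, c):
    """Exterior penalty consistent with the description of the penalized
    objective. The multiplicative form f(X) * (1 + c * violation) is an
    assumption and presumes a positive objective such as structural weight.
    Constraints are written in the normalized form g_k(X) <= 0."""
    violation = sum(max(0.0, gk(X)) for gk in constraints)  # only violated g_k
    return f(X) * (1.0 + c * violation)
```

A feasible design leaves the objective unchanged, while a violated constraint inflates it in proportion to \(c\) and the amount of violation.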
3 Numerical investigation
In this research, the mathematical functions of CEC2005 (Suganthan et al. 2005) with 10 variables (n = 10) and CEC2017 (Awad et al. 2017) with 30 variables are used to explore the capabilities of the proposed approach. In addition, three standard benchmark structural problems are considered for the validation of the GPSG algorithm. These are discussed as follows:
3.1 Results of CEC2005 mathematical functions
The results of the methods of PSO, GSA and GPSG for a population of 5 (NP) are presented graphically for some of the functions in Fig. 1. The required data in the process of optimization are presented in Table 1.
The vertical axis represents the difference between the obtained results and the exact solution (X*) at each iteration. Each graph is the outcome of the average of 30 independent runs.
The PSO and GSA methods show very poor convergence and become trapped in a local optimum for some of the cases, whereas in the combined GPSG method the convergence of the optimization process is smooth, with much better results.
In addition, the effect of the initial population size is examined for the three methods, with populations of 5, 10, 20 and 30. The results are presented in Figs. 2, 3 and 4. It can be observed that PSO and GSA are very sensitive to the population size, whereas in GPSG the population size hardly affects the results: even with a small population, the enhanced method behaves well.
The summary of the results is given in Table 2. For all the problems, mean, rank, standard deviation, median and best response for the final iteration are given. The rank is evaluated based on the results of the average of the final iteration of the 30 runs, for the same population.
From the statistical point of view, the standard deviation (SD) indicates that the diversity of the final results of the independent runs is lower for the proposed approach, which shows the superiority of the method.
The mean values of the average of the 25 functions are found and then their ranks are evaluated for each population. The final results are given in Table 3. It is seen that the GPSG possesses the first rank in all the cases.
3.2 Comparison of the proposed method with several metaheuristic algorithms
The outlined approach with 20 particles is compared with different variants of GSA on the CEC2005 functions (Ebrahimi Mood et al. 2015). The required initial parameters are chosen similarly. The numerical results are presented in Table 4. The proposed method attains the first rank on 15 functions, and its average rank over all 25 functions is better than that of the GSA variants.
For CEC2017, the average, best and standard deviation (SD) results for each function are listed in Table 5. The average results of the proposed approach are compared with those of 8 other methods, according to the available information, and the same number of function evaluations is used for all methods. The superiority of GPSG among the others is observed in terms of the average rank. The standard deviations are reported for GPSG; for the other methods, SD results are not available for comparison.
3.3 Structural problems
The optimal design of structures is the main topic among structural engineers (Mashayekhi et al. 2012, 2016; Gholizadeh 2013; Khatibinia and Yazdani 2018; Bhullar et al. 2020). In this section, three design problems are chosen for truss structures. The weight of the structures is taken as the objective function and the constraints are bounds on member stresses and joint displacements. The cross-sectional areas are continuous design variables. In all the problems, the value of \(\alpha \) in (5) and the maximum iterations (\(T\)) are chosen as 4 and 200, respectively.
3.3.1 10-bar plane truss
The 10-bar truss is optimized as shown in Fig. 5. The required information for the truss is given in Table 6.
The results are given in Table 7. The numerical results indicate that the best designs are obtained with the combined GPSG method, which achieves approximately similar results for all population sizes.
The convergence trend for different populations is illustrated in Fig. 6. It is concluded that the GPSG method yields better results, which demonstrates the efficiency of this method.
The ranking of the three algorithms is given in Table 8; GPSG ranks first.
3.3.2 72-bar truss
The 72-member truss shown in Fig. 7 is grouped into 16 member types owing to its geometry. The aim is to minimize the weight of the structure. All specifications of the structure are given in Table 9.
The results are presented in Table 10. It can be concluded that the GPSG hybrid method gives the most favorable results, and the initial population size hardly affects them: the average of the hybrid method with only five particles is better than that of the PSO and GSA methods with 30 particles.
The convergence history of the methods is shown in Fig. 8. The GPSG method yields better results that demonstrate its effectiveness.
The ranking of the three algorithms is shown in Table 11 and the GPSG method is ranked first.
3.3.3 120-member dome under asymmetric vertical load
The 120-member dome shown in Fig. 9 is composed of 7 member types due to the existing geometric symmetry; therefore, the number of design variables is reduced from 120 to 7. Stress constraints follow the AISC-ASD regulations. The permissible tensile and compressive stresses are given in (25) and (26), respectively.
where \(E\) is the modulus of elasticity, \({F}_{y}\) is the yield stress of steel, and \({C}_{c}\) is the boundary value between the elastic and inelastic buckling regimes. The effective length coefficient is \(k=1\), and \({r}_{i}\) is the radius of gyration of each member, evaluated as (26). The asymmetric vertical load is applied to the free nodes in the z-direction according to Table 12.
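For reference, the standard AISC-ASD allowable stresses, with which (25) and (26) are presumably consistent, are, with slenderness \(\lambda_i = kL_i/r_i\):

```latex
\sigma_i^{+} = 0.6\,F_y, \qquad
C_c = \sqrt{\frac{2\pi^2 E}{F_y}},
```
```latex
\sigma_i^{-} =
\begin{cases}
\dfrac{\left(1 - \dfrac{\lambda_i^2}{2C_c^2}\right) F_y}
      {\dfrac{5}{3} + \dfrac{3\lambda_i}{8C_c} - \dfrac{\lambda_i^3}{8C_c^3}}, & \lambda_i < C_c,\\[2.5ex]
\dfrac{12\pi^2 E}{23\lambda_i^2}, & \lambda_i \ge C_c.
\end{cases}
```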
The required specifications for the 120-member dome are given in Table 13.
The results are presented in Table 14. The best results correspond to the combined GPSG method, in which the deviation between different performances is low.
The convergence trend of the results is presented in Fig. 10. The results show that for different populations, the GPSG is the most appropriate method.
The rankings of the three algorithms, GSA, PSO and GPSG, are presented in Table 15. As in the previous examples, the first rank belongs to the proposed GPSG method.
4 Conclusions
In advanced optimization, the main goal is to find the optimal solution of multimodal functions efficiently. There are two main categories of methods in this field. Traditional gradient-based methods (GBM) start from a pre-assigned solution and move along a search direction built from the gradients of the function under consideration. GBM methods are reliable, efficient and fast for unimodal functions but may become trapped in a local optimum for multimodal functions, and the initial point is important for convergence. The second category comprises multipoint metaheuristic approaches, which operate on an initial population and construct search directions from statistical ideas. These methods can find the global optimum if a sufficiently large initial population is chosen and the search directions are organized logically; however, their exploitation is weak and unsatisfactory.
Based on these difficulties, in the present research, the two categories are combined to achieve a new successful approach. A compromise is made between the two categories in terms of their capabilities and shortcomings.
Among the vast number of metaheuristic techniques, the two rather successful methods of PSO and GSA are selected and their combination is merged with GBM. The integration of the three methods is called GPSG.
The resultant of the search directions of the three methods is constructed so as to control the overall speed of GPSG with proper move limits, employing the multipoint initial population and the philosophy of stochastic search.
To verify the proposed method, 25 complicated multimodal functions of CEC2005 and 29 functions of CEC2017 from the literature as benchmark examples are tested. Besides, three structural design problems of two and three-dimensional truss structures with stress and displacement constraints are optimized for optimal weight.
The numerical results indicate the superiority of the proposed approach over both PSO and GSA. In addition, the CEC2005 results are compared with four variants of GSA and the CEC2017 results with eight other available methods; the first mean rank belongs to the proposed approach. The power of GPSG is investigated in terms of the exploration and exploitation demands. The new approach reaches the appropriate optimal solution with a smaller initial population and fewer independent runs, its convergence history is smooth, and its results are more efficient, reliable and stable.
It was found that the combination of GBA with either of the PSO and GSA works well; however, the efficiency of the integration of the three approaches is greatly enriched.
As metaheuristic approaches are not suitable for all optimization problems, research into more suitable approaches is ongoing. It is intended to incorporate higher-order gradient directions and other variants in the future.
Data availability
Enquiries about data availability should be directed to the corresponding author.
References
Aarts E, Korst J (1989) Simulated annealing and boltzmann machines. Wiley, Chichester
Akhtar M, Manna A, Duary A, Bhunia A (2020) A hybrid tournament differential evolution algorithm for solving optimization problems and applications. Int J Oper Res. https://doi.org/10.1504/IJOR.2021.10034505
Askarzadeh A (2016) A novel metaheuristic method for solving constrained engineering optimization problems: crow search algorithm. Comput Struct 169:1–12. https://doi.org/10.1016/j.compstruc.2016.03.001
Awad NH, Ali MZ, Suganthan PN, Liang JJ, Qu BY (2017) Problem definitions and evaluation criteria for the CEC 2017 special session and competition on single objective real-parameter numerical optimization. Technical Report, Nanyang Technological University, Singapore and Jordan University of Science and Technology, Jordan and Zhengzhou University, Zhengzhou China.
Benioff P (1980) The computer as a physical system: a microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines. J Stat Phys 22(5):563–591. https://doi.org/10.1007/BF01011339
Bhullar AK, Kaur R, Sondhi S (2020) Enhanced crow search algorithm for AVR optimization. Soft Comput 24:11957–11987. https://doi.org/10.1007/s00500-019-04640-w
Civicioglu P (2013) Backtracking search optimization algorithm for numerical optimization problems. Appl Math Comput 219(15):8121–8144. https://doi.org/10.1016/j.amc.2013.02.017
Das SC, Manna AK, Rahman MS, Shaikh AA, Bhunia AK (2021) An inventory model for non-instantaneous deteriorating items with preservation technology and multiple credit periods-based trade credit financing via particle swarm optimization. Soft Comput 25(7):5365–5384. https://doi.org/10.1007/s00500-020-05535-x
Dasgupta D (2006) Advances in artificial immune systems. IEEE Comput Intell Mag 1(4):40–49. https://doi.org/10.1109/MCI.2006.329705
Dorigo M, Stützle T (2004) Ant colony optimization. MIT Press, Cambridge
Du KL, Swamy MNS (2016) Search and optimization by metaheuristics: techniques and algorithms inspired by nature. Birkhauser, Switzerland
Ebrahimi Mood S, Rashedi E, Javidi MM (2015) New functions for mass calculation in gravitational search algorithm. J Comput Secur 2(3):233–246
Farmer JD, Packard NH, Perelson AS (1986) The immune system, adaptation, and machine learning. Physica D 22(1–3):187–204. https://doi.org/10.1016/0167-2789(86)90240-X
Formato RA (2007) Central force optimization: a new metaheuristic with applications in applied electromagnetics. Progress Electromagnet Res 77:425–491. https://doi.org/10.2528/PIER07082403
Formato RA (2008) Central force optimization: a new nature inspired computational framework for multidimensional search and optimization. Stud Comput Intell 129:221–238. https://doi.org/10.1007/978-3-540-78987-1_21
Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun Nonlinear Sci Numer Simul 17(12):4831–4845. https://doi.org/10.1016/j.cnsns.2012.05.010
Gandomi AH, Yang XS, Talatahari S, Alavi AH (eds) (2013) Metaheuristic applications in structures and infrastructures. Elsevier, London
Geem ZW (2010) Recent advances in harmony search algorithm. Springer Science & Business Media, Berlin
Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. SIMULATION 76(2):60–68. https://doi.org/10.1177/003754970107600201
Gholizadeh S (2013) Layout optimization of truss structures by hybridizing cellular automata and particle swarm optimization. Comput Struct 125:86–99. https://doi.org/10.1016/j.compstruc.2013.04.024
Guggenheim EA (1967) Modern thermodynamics: An advanced treatment for chemists and physicists. Wiley, North Holland
Haftka RT, Gürdal Z, Kamat MP (1990) Elements of structural optimization, 2nd edn. Springer Science & Business Media, Dordrecht
Karaboga D, Basturk B (2007) A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. J Global Optim 39(3):459–471. https://doi.org/10.1007/s10898-007-9149-x
Kashani AR, Chiong R, Mirjalili S, Gandomi AH (2020) Particle swarm optimization variants for solving geotechnical problems: review and comparative analysis. Arch Comput Methods Eng 27:1–57. https://doi.org/10.1007/s11831-020-09442-0
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN'95-International Conference on Neural Networks; Perth, Australia: IEEE: pp 1942–1948. https://doi.org/10.1109/ICNN.1995.488968
Khatibinia M, Yazdani H (2018) Accelerated multi-gravitational search algorithm for size optimization of truss structures. Swarm Evol Comput 38:109–119. https://doi.org/10.1016/j.swevo.2017.07.001
Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680. https://doi.org/10.1126/science.220.4598.671
Kumar N, Mahato SK, Bhunia AK (2020) A new QPSO based hybrid algorithm for constrained optimization problems via tournamenting process. Soft Comput 24(15):11365–11379. https://doi.org/10.1007/s00500-019-04601-3
Kumar N, Manna AK, Shaikh AA, Bhunia AK (2021) Application of hybrid binary tournament-based quantum-behaved particle swarm optimization on an imperfect production inventory problem. Soft Comput 25(16):11245–11267. https://doi.org/10.1007/s00500-021-05894-z
Lam AYS, Li VOK (2010) Chemical-reaction-inspired metaheuristic for optimization. IEEE Trans Evol Comput 14(3):381–399. https://doi.org/10.1109/TEVC.2009.2033580
Mashayekhi M, Salajegheh E, Salajegheh J, Fadaee MJ (2012) Reliability-based topology optimization of double layer grids using a two-stage optimization method. Struct Multidiscip Optim 45(6):815–833. https://doi.org/10.1007/s00158-011-0744-6
Mashayekhi M, Salajegheh E, Dehghani M (2016) Topology optimization of double and triple layer grid structures using a modified gravitational harmony search algorithm with efficient member grouping strategy. Comput Struct 172:40–58. https://doi.org/10.1016/j.compstruc.2016.05.008
Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl-Based Syst 96:120–133. https://doi.org/10.1016/j.knosys.2015.12.022
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
Osiński P, Deptuła A, Partyka MA (2013) Discrete optimization of a gear pump after tooth root undercutting by means of multi-valued logic trees. Arch Civ Mech Eng 13:422–431. https://doi.org/10.1016/j.acme.2013.05.001
Parmee IC (2001) Evolutionary and adaptive computing in engineering design. Springer Science & Business Media, London
Rao SS (2009) Engineering optimization: theory and practice, 4th edn. John Wiley & Sons Inc., New Jersey
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248. https://doi.org/10.1016/j.ins.2009.03.004
Rashedi E, Rashedi E, Nezamabadi-pour H (2018) A comprehensive survey on gravitational search algorithm. Swarm Evol Comput 41:141–158. https://doi.org/10.1016/j.swevo.2018.02.018
Salajegheh F, Salajegheh E (2019) PSOG: enhanced particle swarm optimization by a unit vector of first and second order gradient directions. Swarm Evol Comput 46:28–51. https://doi.org/10.1016/j.swevo.2019.01.010
Siddique NH, Adeli H (2017) Nature-inspired computing: physics and chemistry-based algorithms. CRC Press, New York
Storn R, Price K (1997) Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J Global Optim 11(4):341–359. https://doi.org/10.1023/A:1008202821328
Suganthan PN, Hansen N, Liang JJ, Deb K, Chen YP, Auger A, Tiwari S (2005) Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. KanGAL Report 2005:2005005
Tsai HC, Tyan YY, Wu YW, Lin YH (2013) Gravitational particle swarm. Appl Math Comput 219(17):9106–9117. https://doi.org/10.1016/j.amc.2013.03.098
Vanderplaats GN (1999) Numerical optimization techniques for engineering design: with applications, 3rd edn. Vanderplaats Research & Developments Inc.
Yang XS (2010) Engineering optimization: an introduction with metaheuristic applications. John Wiley & Sons Inc, New Jersey
Yang XS, Bekdaş G, Nigdeli SM (eds) (2016) Metaheuristics and optimization in civil engineering. Springer, Switzerland
Zhou J, Wang B, Lin J, Fu L (2013) Optimization of an aluminum alloy anti-collision side beam hot stamping process using a multi-objective genetic algorithm. Arch Civ Mech Eng 13:401–411. https://doi.org/10.1016/j.acme.2013.01.008
Funding
No funding was received to assist with the preparation of this manuscript.
Contributions
FS was involved in conceptualization, methodology, software, writing; ES helped in supervision, editing, reviewing; SS contributed to supervision, validation, reviewing.
Ethics declarations
Conflict of interest
Authors declare that they have no conflict of interest.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Cite this article
Salajegheh, F., Salajegheh, E. & Shojaee, S. An enhanced approach for optimizing mathematical and structural problems by combining PSO, GSA and gradient directions. Soft Comput 26, 11891–11913 (2022). https://doi.org/10.1007/s00500-022-07007-w