
1.1 Metaheuristic Algorithms for Optimization

In today’s extremely competitive world, human beings attempt to extract the maximum output or profit from a limited amount of available resources. In engineering design, for example, one is concerned with choosing design variables that fulfill all design requirements at the lowest possible cost; i.e., the main objective is not only to comply with basic standards but also to achieve good economic results. Optimization offers a technique for solving this type of problem.

The term “optimization” refers to the study of problems in which one seeks to minimize or maximize a function by systematically choosing the values of variables from within a permissible set. On the one hand, a vast amount of research has been conducted in this area of knowledge, hoping to develop effective and efficient optimization algorithms. On the other hand, the application of the existing algorithms to real projects has also been the focus of many studies.

In the past, the most commonly used optimization techniques were gradient-based algorithms, which utilize gradient information to search the solution space near an initial starting point [1, 2]. In general, gradient-based methods converge faster and can obtain solutions with higher accuracy compared to stochastic approaches. However, the acquisition of gradient information can be costly or even impossible. Moreover, such algorithms are only guaranteed to converge to local minima. Furthermore, a good starting point is vital for the successful execution of these methods. In many optimization problems, prohibited zones, side limits, and non-smooth or non-convex functions must be taken into consideration, and the resulting non-convex optimization problems cannot easily be solved by these methods.

On the other hand, other types of optimization methods, known as metaheuristic algorithms, are not restricted in the aforementioned manner. These methods are suitable for global search due to their capability of exploring and finding promising regions of the search space in an affordable computational time. Metaheuristic algorithms tend to perform well for most optimization problems [3, 4], because they refrain from simplifying the original problem or making assumptions about it. Evidence of this is their successful application to a vast variety of fields, such as engineering, physics, chemistry, art, economics, marketing, genetics, operations research, robotics, social sciences, and politics.

The word heuristic has its origin in the ancient Greek word heuriskein, which means the art of discovering new strategies (rules) to solve problems. The prefix meta, also a Greek word, means “upper-level methodology.” The term metaheuristic was introduced by Glover [5].

A heuristic method can be considered as a procedure that is likely to discover a very good feasible solution, but not necessarily an optimal one, for a specific problem under consideration. No guarantee can be provided about the quality of the solution obtained, but a well-designed heuristic method usually provides a solution that is at least nearly optimal. The procedure should also be efficient enough to deal with very large problems. Heuristic methods are often iterative algorithms, where each iteration involves a search for a new solution that might be better than the best solution found previously. When the algorithm is terminated after a reasonable time, the solution it provides is the best one found during any iteration. A metaheuristic is formally defined as an iterative generation process which guides a subordinate heuristic by intelligently combining different concepts for exploring (global search) and exploiting (local search) the search space; learning strategies are used to structure information in order to find near-optimal solutions efficiently [57].

Metaheuristic algorithms have found many applications in different areas of applied mathematics, engineering, medicine, economics, and other sciences. These methods are extensively utilized in the design of different systems in civil, mechanical, electrical, and industrial engineering. At the same time, one of the most important trends in optimization is the constantly increasing emphasis on the interdisciplinary nature of the field.

1.2 Optimal Design of Structures and Goals of the Present Book

In the area of structural engineering, which is the main concern of this book, one tries to achieve certain objectives, optimizing weight, construction cost, geometry, layout, topology, and time while satisfying certain constraints. Since resources, funds, and time are always limited, one has to find solutions that make optimal use of these resources.

The main goal of this book is to introduce some well-established and some of the most recently developed metaheuristics for optimal design of structures. A schematic of the chapters of the present book is shown at a glance in Fig. 1.1.

Fig. 1.1 Schematic of the chapters of the present book in one glance

Most of these methods are either nature-based or physics-based algorithms (Fig. 1.2). Though many design examples are included, their results may contain small constraint violations, and the examples do not constitute the main objective of the book.

Fig. 1.2 Classification of the metaheuristics presented in this book

1.3 Organization of the Present Book

The remaining chapters of this book are organized in the following manner:

Chapter 2 introduces the well-known particle swarm optimization (PSO) algorithms. These are nature-inspired population-based metaheuristic algorithms originally attributed to Eberhart and Kennedy. The algorithms mimic the social behavior of bird flocking and fish schooling. Starting from a randomly distributed set of particles (potential solutions), the algorithms try to improve the solutions according to a quality measure (fitness function). The improvement is performed by moving the particles around the search space by means of a set of simple mathematical expressions which model some interparticle communications. These mathematical expressions, in their simplest and most basic form, suggest the movement of each particle toward its own best experienced position and the swarm’s best position so far, along with some random perturbations.
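In its simplest form, the movement rule just described can be sketched as follows (an illustrative minimal PSO in Python, not the specific variant of the chapter; the inertia weight w and acceleration coefficients c1, c2 are typical textbook values, and the sphere function serves only as a demonstration objective):

```python
import random

def pso(fitness, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO sketch: each particle moves toward its own best
    position and the swarm's best position, with random perturbations."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]              # personal best positions
    pbest_f = [fitness(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            f = fitness(xs[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = xs[i][:], f
    return gbest, gbest_f

sphere = lambda x: sum(v * v for v in x)
best, best_f = pso(sphere, dim=5, bounds=(-5.0, 5.0))
```

Each velocity blends three pulls: the previous velocity (inertia), the particle’s own memory, and the swarm’s memory; the random factors r1 and r2 provide the perturbations mentioned above.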

Chapter 3 presents the well-established charged system search (CSS) algorithm, developed by Kaveh and Talatahari. This chapter consists of two parts. In the first part, an optimization algorithm based on principles from physics and mechanics is introduced. In this algorithm, the Coulomb law from electrostatics and the laws of motion from Newtonian mechanics are utilized. CSS is a multi-agent approach in which each agent is a charged particle (CP). CPs can affect each other based on their fitness values and their separation distances. The magnitude of the resultant force is determined using the laws of electrostatics, and the movement is governed by the Newtonian laws of motion. CSS can be utilized in all fields of optimization; it is especially suitable for non-smooth or non-convex domains, since it needs neither gradient information nor continuity of the search space. In the second part, CSS is applied to the optimal design of skeletal structures, and the high performance of CSS is illustrated.
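The two physical ingredients, fitness-dependent charges and a Coulomb-type force, can be illustrated with a simplified Python sketch (the normalization and the inside/outside-radius force law follow the general idea of a charged sphere; the function names and the radius parameter a are illustrative, not the chapter’s exact formulation):

```python
def charges(costs):
    # Better (lower-cost) solutions carry larger normalized charge.
    best, worst = min(costs), max(costs)
    if best == worst:
        return [1.0] * len(costs)
    return [(worst - c) / (worst - best) for c in costs]

def coulomb_magnitude(qi, qj, r, a=1.0):
    # Force between two CPs: linear in r inside radius a (as inside a
    # uniformly charged sphere), inverse-square outside it.
    return qi * qj * (r / a**3 if r < a else 1.0 / r**2)
```

The resulting force magnitudes would then drive Newtonian position updates (acceleration and velocity), which CSS uses in place of gradient information.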

Chapter 4 extends the algorithm of the previous chapter and presents the magnetic charged system search, developed by Kaveh, Motie Share, and Moslehi. This chapter consists of two parts. In the first part, the standard magnetic charged system search (MCSS) is presented and applied to different numerical examples to examine the efficiency of this algorithm. The results are compared to those of the original charged system search method. In the second part, an improved form of the MCSS algorithm, denoted by IMCSS, is presented, and also its discrete version is described. The IMCSS algorithm is applied to optimization of truss structures with continuous and discrete variables to demonstrate the performance of this algorithm in the field of structural optimization.

Chapter 5 presents a generalized CSS algorithm known as the field of force optimization. Although different metaheuristic algorithms differ in how they approach the optimum solution, their general behavior is approximately the same: they start the optimization with random solutions, and the subsequent solutions are based on randomization and some other rules. As the optimization progresses, the influence of the rules increases and that of randomization decreases. These rules can be modeled by a familiar concept from physics known as the field of force (FOF). FOF is a concept utilized in physics to explain the workings of the universe. The virtual FOF model is approximately simulated using the concepts of real-world fields such as gravitational, magnetic, or electric fields.

Chapter 6 presents the recently developed algorithm known as dolphin echolocation optimization, proposed by Kaveh and Farhoudi. Nature has provided inspiration for most man-made technologies. Scientists believe that dolphins are second only to humans in intelligence. Echolocation is the biological sonar used by dolphins and several other kinds of animals for navigation and hunting in various environments. This ability of dolphins is mimicked to develop a new optimization method. Among the many metaheuristic optimization methods, parameter tuning in most algorithms takes a considerable amount of the user’s time, persuading scientists to develop ideas for improving these methods. Studies have shown that metaheuristic algorithms have certain governing rules, and knowing these rules helps to obtain better results. Dolphin echolocation takes advantage of these rules and outperforms many existing optimization methods, while having few parameters to set. The new approach leads to excellent results with low computational effort.

Chapter 7 contains a recently developed algorithm called colliding bodies optimization (CBO), proposed by Kaveh and Mahdavi. This novel algorithm is based on one-dimensional collisions between bodies, with each agent solution considered as a massed object or body. After a collision of two moving bodies with specified masses and velocities, the bodies separate with new velocities. This collision causes the agents to move toward better positions in the search space. CBO utilizes a simple formulation to find the minimum or maximum of functions and is independent of internal parameters.
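The post-collision velocities follow directly from momentum conservation together with a coefficient of restitution; this is standard one-dimensional collision mechanics, sketched here in Python for illustration rather than as the chapter’s full optimizer (the parameter eps is the coefficient of restitution):

```python
def collide(m1, v1, m2, v2, eps=1.0):
    # New velocities of two bodies after a one-dimensional collision;
    # eps = 1 is perfectly elastic, eps = 0 perfectly plastic.
    v1_new = ((m1 - eps * m2) * v1 + (1 + eps) * m2 * v2) / (m1 + m2)
    v2_new = ((m2 - eps * m1) * v2 + (1 + eps) * m1 * v1) / (m1 + m2)
    return v1_new, v2_new

# Equal masses, elastic collision: the moving body stops and the
# stationary one takes over its velocity.
print(collide(1.0, 2.0, 1.0, 0.0))  # (0.0, 2.0)
```

Roughly speaking, in CBO the better half of the agents play the role of stationary bodies, and each collision moves both agents to new positions in the search space.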

Chapter 8 presents the ray optimization (RO) algorithm originally developed by Kaveh and Khayatazad. Similar to other multi-agent methods, ray optimization has a number of particles consisting of the variables of the problem. These agents are considered as rays of light. Based on Snell’s law of refraction, when light travels from a lighter medium to a darker one, it refracts and its direction changes. This behavior helps the agents to explore the search space in the early stages of the optimization process and makes them converge in the final stages. This law is the main tool of the ray optimization algorithm. This chapter consists of three parts. In the first part, the standard ray optimization is presented and applied to different mathematical functions and engineering problems. In the second part, RO is employed for size and shape optimization of truss structures. Finally, in the third part, an improved ray optimization (IRO) algorithm is introduced and applied to some benchmark mathematical optimization problems and truss structure examples.
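Snell’s law itself is simple to state in code; the sketch below is plain physics, not the chapter’s agent-update machinery, and computes the refraction angle while signaling total internal reflection:

```python
import math

def refract_angle(theta1_deg, n1, n2):
    # Snell's law: n1*sin(t1) = n2*sin(t2). Returns the refraction
    # angle in degrees, or None when total internal reflection occurs.
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))
```

Entering a denser (darker) medium bends the ray toward the normal, e.g., 30° in air becomes about 19.5° in glass with n = 1.5; RO exploits this bending to steer agents toward promising regions.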

Chapter 9 presents a modified Big Bang–Big Crunch (BB–BC) algorithm. The standard BB–BC method was developed by Erol and Eksin and consists of two phases: a Big Bang phase and a Big Crunch phase. In the Big Bang phase, candidate solutions are randomly distributed over the search space. Similar to other evolutionary algorithms, initial solutions are spread over the entire search space in a uniform manner in the first Big Bang. Erol and Eksin associated the random nature of the Big Bang with energy dissipation, or the transformation from an ordered state (a convergent solution) to a disordered, chaotic state (a new set of solution candidates).
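The alternation of the two phases can be sketched as follows (an illustrative Python toy, not Erol and Eksin’s exact formulation: the inverse-fitness weighting of the center of mass and the 1/k shrinking of the scatter radius are common choices, and the sphere function is only a stand-in objective):

```python
import random

def bbbc(fitness, dim, bounds, n=40, iters=100, seed=1):
    """Big Bang-Big Crunch sketch: crunch the population to a weighted
    center of mass, then bang new candidates around it with a radius
    that shrinks as the iteration count k grows."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best, best_f = None, float("inf")
    for k in range(1, iters + 1):
        fits = [fitness(x) for x in pop]
        # Big Crunch: center of mass, weighted by inverse fitness.
        w = [1.0 / (f + 1e-12) for f in fits]
        W = sum(w)
        center = [sum(w[i] * pop[i][d] for i in range(n)) / W
                  for d in range(dim)]
        i = min(range(n), key=lambda j: fits[j])
        if fits[i] < best_f:
            best, best_f = pop[i][:], fits[i]
        # Big Bang: scatter new candidates around the center,
        # with spread shrinking like 1/k.
        pop = [[min(hi, max(lo, center[d]
                            + rng.gauss(0.0, 1.0) * (hi - lo) / (2 * k)))
                for d in range(dim)] for _ in range(n)]
    return best, best_f

sphere = lambda x: sum(v * v for v in x)
best, best_f = bbbc(sphere, dim=3, bounds=(-5.0, 5.0))
```

The shrinking radius models the transition from disorder (wide random scatter) back to order (a convergent cluster of solutions).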

Chapter 10 presents the cuckoo search (CS) optimization developed by Yang and colleagues. In this chapter, CS is utilized to determine the optimum design of structures for both discrete and continuous variables. This algorithm, recently developed by Yang, is based on the obligate brood parasitic behavior of some cuckoo species together with the Lévy flight behavior of some birds and fruit flies. CS is a population-based optimization algorithm; similar to many other metaheuristic algorithms, it starts with a random initial population, which is taken as host nests or eggs. The CS algorithm essentially works with three components: selection of the best, by keeping the best nests or solutions; replacement of host eggs according to the quality of the new solutions (cuckoo eggs) produced by randomization via Lévy flights globally (exploration); and discovery of some cuckoo eggs by the host birds, with replacement according to the quality of local random walks (exploitation).
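The global exploration component relies on Lévy flights, whose heavy-tailed step lengths are commonly generated with Mantegna’s algorithm; a small sketch follows (the exponent beta = 1.5 is the value typically used with CS, and the function names are illustrative):

```python
import math
import random

def levy_sigma(beta):
    # Scale factor of Mantegna's algorithm for Levy-stable steps.
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    return (num / den) ** (1 / beta)

def levy_step(beta=1.5, rng=random):
    # Heavy-tailed step: mostly small moves, with occasional long jumps.
    u = rng.gauss(0.0, levy_sigma(beta))
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)
```

A new candidate nest is then obtained by adding a scaled Lévy step to each coordinate; the occasional long jumps provide global exploration, while the frequent short steps refine solutions locally.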

Chapter 11 presents the imperialist competitive algorithm (ICA) proposed by Atashpaz et al. ICA is a multi-agent algorithm with each agent being a country, which is either a colony or an imperialist. These countries form some empires in the search space. Movement of the colonies toward their related imperialist, and imperialistic competition among the empires, forms the basis of the ICA. During these movements, the powerful imperialists are reinforced and the weak ones are weakened and gradually collapse, directing the algorithm toward optimum points.

Chapter 12 is an introduction to chaos-embedded metaheuristic algorithms. In nature, complex biological phenomena such as the collective behavior of birds, the foraging activity of bees, or the cooperative behavior of ants may result from relatively simple rules which nevertheless exhibit nonlinear behavior that is sensitive to initial conditions. Such systems are generally known as “deterministic nonlinear systems,” and the corresponding theory as “chaos theory.” Thus, real-world systems that may seem stochastic or random may in fact exhibit nonlinear deterministic and chaotic behavior. Although chaos and random signals share the property of long-term unpredictable irregular behavior, and many random generators in programming software, like the chaotic maps, are deterministic, chaos can help order arise from disorder. Similarly, many metaheuristic optimization algorithms are inspired by biological systems in which order arises from disorder. In these cases, disorder indicates both non-organized patterns and irregular behavior, whereas order is the result of self-organization and evolution and often arises from a disordered condition or from the presence of dissymmetries. Self-organization and evolution are two key factors of many metaheuristic optimization techniques. Owing to these common properties, the simultaneous use of chaos and optimization algorithms can improve the performance of the latter. The benefits of such a combination appear to be generic to stochastic optimization, and experimental studies confirm this, although it has not yet been proven mathematically.
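A concrete example of such a deterministic-yet-unpredictable generator is the logistic map, one of the chaotic maps most often embedded in metaheuristics; a minimal Python sketch (the starting value x0 and the chaotic parameter r = 4 are illustrative choices):

```python
def logistic_map(x0=0.7, r=4.0, n=5):
    # x_{k+1} = r * x_k * (1 - x_k); for r = 4 the orbit is chaotic,
    # stays in (0, 1), and can replace uniform random numbers in an
    # optimization algorithm.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs
```

The sequence is fully reproducible from x0 (deterministic), yet nearby starting values diverge rapidly (sensitivity to initial conditions), which is exactly the mixture of order and disorder discussed above.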

Chapter 13 extends the CBO to the enhanced colliding bodies optimization (ECBO), developed by Kaveh and Ilchi Ghazaan, which uses a memory to save some of the best solutions. In addition, a mechanism is utilized to escape from local optima. The performance of the proposed algorithm is compared to those of the CBO and some other optimization techniques on benchmark mathematical functions and two standard discrete and continuous structural problems.

Chapter 14 presents the global sensitivity analysis (GSA), widely used to investigate the sensitivity of a model output with respect to its input parameters. In this chapter, a new single-solution metaheuristic optimization algorithm based on GSA is presented and applied to some benchmark constrained optimization problems and to the optimal design of truss structures. This algorithm was originally developed by Kaveh and Mahdavi. In this method, the single solution moves toward a specified region using the sensitivity indicators of the variables. Unlike common metaheuristic algorithms, where all the variables are changed simultaneously in the optimization process, in this approach first the highly sensitive variables of the solution and then the less sensitive ones are iteratively changed in the search space.

Chapter 15 presents a population-based metaheuristic algorithm inspired by the game of tug of war and originally developed by Kaveh and Zolghadr. Utilizing a sport metaphor, the algorithm, denoted as tug of war optimization (TWO), considers each candidate solution as a team participating in a series of rope pulling competitions. The teams exert pulling forces on each other based on the quality of the solutions they represent. The competing teams move to their new positions according to the Newtonian laws of motion. Unlike many other metaheuristic methods, the algorithm is formulated in such a way that it considers the qualities of both interacting solutions.

Chapter 16 presents the water evaporation optimization (WEO), a physics-based metaheuristic algorithm that mimics the well-known rules governing the evaporation of water molecules from solid surfaces with different wettability. This algorithm was originally developed by Kaveh and Bakhshpoori. In the WEO algorithm, molecules are updated globally and locally in two independent sequential phases: the monolayer and droplet evaporation phases. In this study, the computational cost of the WEO is improved by utilizing both phases simultaneously.

Chapter 17 presents a metaheuristic algorithm based on the free vibration of single-degree-of-freedom systems with viscous damping, called the vibrating particles system (VPS). This algorithm was originally developed by Kaveh and Ilchi Ghazaan. The solution candidates are considered as particles that gradually approach their equilibrium positions. Equilibrium positions are obtained from the current population and the historically best position in order to maintain a proper balance between diversification and intensification. To evaluate the performance of the proposed method, it is applied to the sizing optimization of four skeletal structures, including trusses and frames.

Chapter 18 presents a nature-inspired population-based metaheuristic algorithm. The algorithm, called the cyclical parthenogenesis algorithm (CPA), is inspired by the reproduction and social behavior of some zoological species, such as aphids, which alternate between sexual and asexual reproduction. This algorithm was originally developed by Kaveh and Zolghadr. The algorithm considers each candidate solution as a living organism and iteratively improves the quality of the solutions utilizing reproduction and displacement mechanisms. Mathematical and engineering design problems are employed in order to investigate the viability of the proposed algorithm. The results indicate that the performance of the newly proposed algorithm is comparable to that of other state-of-the-art metaheuristic algorithms.

Chapter 19 employs the idea of the cascade optimization method, which allows a single optimization problem to be tackled in a number of successive autonomous optimization stages. In each stage of the cascade procedure, a design variable configuration is defined for the problem in such a manner that at early stages the optimizer deals with a small number of design variables, and at subsequent stages it gradually faces the main problem with its large number of design variables. This algorithm was originally developed by Kaveh and Bolandgerami. In order to investigate the efficiency of this method, the enhanced colliding bodies optimization, a powerful metaheuristic, is utilized in all stages of the cascade procedure. Three large-scale space steel frames with 1860, 3590, and 3328 members are investigated to test the algorithm.

Chapter 20 consists of a multi-objective optimization method for solving large-scale truss structure problems in continuous search space, developed by Kaveh and Massoudi. This method is based on the charged system search, which was used for single-objective optimization in Chap. 3. In this study the aim is to develop a multi-objective optimization algorithm with a higher convergence rate than other well-known methods, enabling it to deal with multimodal optimization problems having many design variables. In this method, the CSS algorithm is utilized as a search engine in combination with clustering and particle regeneration procedures. The proposed method is examined on four mathematical functions and two structural problems, and the results are compared to those of some other state-of-the-art approaches.

Finally, it should be mentioned that most metaheuristic algorithms are attractive because each has its own striking features. However, the most attractive algorithm for engineering use is one that is simple, less parameter dependent, and easy to implement; has a good balance between exploration and exploitation, a high capability to avoid entrapment in local optima, and high accuracy; is applicable to a wide range of problem types; and can deal with a large number of variables.

In order to attain the above features, partially or collectively, it is sometimes necessary to design hybrid algorithms. There are many such algorithms; a successful example is that of Kaveh and Talatahari [8].