1 Scope of the article: global optimization algorithms

Global optimization methods aim to avoid getting trapped at local, suboptimal points. They may use gradient information, as in restarted local methods, or may not, as in standard evolutionary algorithms. They apply to continuous, discrete, or mixed design variables. Local optima often appear in discrete optimization problems, which are therefore a predominant field of application of global optimization. Theoretical convergence proofs of global optimization algorithms are typically established in the limit as the number of evaluations of the optimization criteria tends to infinity. Moreover, any algorithm can be made asymptotically global (in probability) by adding a small noise to the iterates (Auger and Hansen 2011). For these reasons, theoretical global convergence is likely to be of limited interest to most SMO Journal readers. In practice, the important feature of these algorithms is their ability to locate, at an acceptable computing cost, better solutions to multimodal problems than local searches would. For problems with a large number of local optima, or with deceptive optima (overall trends leading away from the optimum), convergence to the global optimum requires a very large number of evaluations of the optimization criteria.
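To make the remark about noise concrete, here is a minimal, purely illustrative sketch (ours, not Auger and Hansen's construction): a gradient descent whose iterates are perturbed by Gaussian noise can in principle visit any region of the design space, which is why asymptotic global convergence in probability is so cheap to obtain, and so uninformative in practice.

```python
import numpy as np

def perturbed_descent(f, grad, x0, n_iter=10_000, step=1e-2, sigma=0.5, seed=0):
    """Gradient descent with Gaussian noise added to the iterates.

    The noise gives every region of the design space a nonzero probability
    of being visited, so the best point found converges (in probability)
    to the global optimum as n_iter grows -- a weak guarantee in practice.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(n_iter):
        x = x - step * grad(x) + sigma * rng.standard_normal(x.shape)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```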

When the many thousands of evaluations needed cannot be afforded, a common solution is to settle for local searches. Consequently, transformations that greatly reduce the number of local optima were invented, allowing us to fall back on gradient-based methods (which can be restarted from different points, as sketched below).
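A minimal sketch of such a restarted gradient-based search, using scipy.optimize.minimize; the uniform sampling of starting points and the function names are illustrative choices, not a prescription.

```python
import numpy as np
from scipy.optimize import minimize

def multistart(f, bounds, n_starts=20, seed=0):
    """Run a local gradient-based search from random starting points
    and keep the best local optimum found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T  # bounds given as [(lo, hi), ...]
    best = None
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)
        res = minimize(f, x0, method="L-BFGS-B", bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res
    return best
```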

It is interesting to note that SIMP (see footnote 1) is a transformation that converts the binary topology optimization problem (where a black-and-white solution is sought) into a continuous, differentiable one, apparently without creating a large number of local optima. As is apparent from Sigmund (2011), once such a transformation is found, its practitioners find it difficult to understand why one would want to attack the discrete problem directly using expensive discrete optimization methods.
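For reference, the power-law interpolation commonly used in SIMP (a standard form, not tied to any particular paper cited here) reads

$$E(\rho_e) \;=\; E_{\min} + \rho_e^{\,p}\,\bigl(E_0 - E_{\min}\bigr), \qquad \rho_e \in [0,1], \quad p \approx 3,$$

where $\rho_e$ is the continuous density of element $e$, $E_0$ the Young's modulus of the solid material, and $E_{\min}$ a small positive stiffness that avoids a singular stiffness matrix. Penalization with $p > 1$ makes intermediate densities structurally inefficient, driving the optimizer toward black-and-white designs.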

Another popular example of such a transformation is the use of lamination parameters for finding the optimal stacking sequence of a composite laminate (Fukunaga and Vanderplaats 1991). This approach reduces the number of design variables and local optima (a common definition is recalled below), and so it is particularly useful when the ply orientations vary from point to point (e.g., Ijsselmuiden et al. 2010). Papers on these methods clearly belong in SMO, and so they will not be discussed further.
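In one frequent form (sign and normalization conventions vary across the literature), the in-plane lamination parameters are defined as

$$(\xi_1,\,\xi_2,\,\xi_3,\,\xi_4) \;=\; \frac{1}{h}\int_{-h/2}^{h/2} \bigl(\cos 2\theta(z),\; \sin 2\theta(z),\; \cos 4\theta(z),\; \sin 4\theta(z)\bigr)\, dz,$$

where $h$ is the laminate thickness and $\theta(z)$ the ply orientation at through-thickness coordinate $z$. The in-plane stiffness matrix then depends linearly on the $\xi_i$, which removes many of the local optima present in the direct ply-angle formulation.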

2 An analysis of the global optimization literature

An important goal of a journal is to publish papers that would be useful to its readers, and one common measure of usefulness is the number of citations. Of course, it is an imperfect measure in that a paper may garner many citations for other reasons. However, it is the best measure that we presently have. Appendix A provides a list of global optimization articles, with structural or fluid mechanics applications, which have received over 100 citations on Google Scholar. Two observations can be made from this list:

Firstly, the vast majority of these articles deal with metaphorical stochastic methods (evolutionary/genetic algorithms, particle swarm optimization, and harmony search). In other words, deterministic and more mathematically oriented contributions are cited less.

Secondly, the majority of these articles propose a specialization of a general method to a mechanical problem, rather than a new general method.

Good articles about new global optimization methods (in contrast to tweaks of existing methods) have typically been published not in SMO but in more mathematically oriented journals. SMO may want to focus on applications in mechanical engineering, and particularly encourage specializations to applications requiring solid mechanics and fluid mechanics simulations.

Since many SMO readers are interested in metaphor-based global optimization contributions, SMO may want to continue publishing such contributions. However, a large number of submitted articles in this category are of poor scientific quality and are not very useful, as evidenced by their very few citations. Therefore, editorial guidelines are needed to clarify what is demanded of authors. We propose such guidelines below. In particular, since most metaphorical global optimization methods are tested empirically (convergence results are almost never attainable), standards for the testing procedure might be proposed. The metaphor used to present the method should also be mathematically formalized in order to allow comparison with other existing methods. The journal's rapid screening procedure might also be applied systematically to this category of articles.

3 Proposed editorial guidelines

The following guidelines might be included in the “Instructions for Authors” of SMO Journal.

1.   Metaphor-based optimization methods

Metaphorical optimization methods such as evolutionary/genetic algorithms, particle swarm optimization, harmony search, ant colony optimization, tabu search, etc., should be properly formalized as algorithms, for example through pseudo-code or a mathematical formulation devoid of the metaphor. When it is claimed that the algorithm or portions of it are new, the difference from existing metaphorical optimization algorithms or other related optimization methods should be clearly explained. For example, the generation of a harmony matrix in harmony search algorithms is similar to the application of a uniform crossover in genetic algorithms (see the sketch after this paragraph). The chaotic imperialist competitive algorithm resembles island models in genetic algorithms and particle swarm optimization. The hybrid cellular automaton in topology optimization is close to other filtering schemes such as the slope control formulation (Petersson and Sigmund 1998). Choosing at least one reviewer who has not published articles about the particular metaphor being proposed should be encouraged.
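To illustrate what such a metaphor-free formalization might look like, here is a hedged sketch (the function names, the omission of pitch adjustment, and the uniform resampling are our simplifications, not a faithful reproduction of any published variant) showing that the per-variable recombination in harmony search and a uniform crossover share the same structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def harmony_candidate(memory, hmcr=0.9):
    """Harmony search without the metaphor: each variable of the new
    candidate is copied from a random row of the memory matrix with
    probability hmcr, otherwise resampled uniformly in [0, 1]
    (pitch adjustment omitted for brevity)."""
    n_mem, n_var = memory.shape
    donors = rng.integers(0, n_mem, size=n_var)   # one donor row per variable
    new = memory[donors, np.arange(n_var)].copy()
    resample = rng.random(n_var) > hmcr
    new[resample] = rng.uniform(0.0, 1.0, size=resample.sum())
    return new

def uniform_crossover(parent_a, parent_b):
    """Genetic algorithm: each variable comes from either parent with
    equal probability -- the same per-variable mixing as above, with the
    'memory' restricted to two rows."""
    mask = rng.random(parent_a.shape) < 0.5
    return np.where(mask, parent_a, parent_b)
```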

Metaphor-based optimization methods usually come with parameters whose tuning needs to be properly analyzed. Typically, the relationship between these parameters and the number of design variables should be discussed.
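One published example of such a relationship is the default population size of CMA-ES, which grows logarithmically with the number of design variables n, namely Hansen's recommended default λ = 4 + ⌊3 ln n⌋. The snippet below merely tabulates this rule; it is illustrative, not a tuning recipe.

```python
import math

def cmaes_default_lambda(n_var):
    """Default CMA-ES population size as a function of dimension (Hansen)."""
    return 4 + math.floor(3 * math.log(n_var))

for n in (2, 10, 100, 1000):
    print(n, cmaes_default_lambda(n))   # -> 6, 10, 17, 24
```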

2.   Non-deterministic optimization algorithms

Non-deterministic optimization algorithms such as probabilistically restarted local searches, Bayesian optimization, simulated annealing, stochastic gradient methods, and evolutionary algorithms do not yield the same result when restarted with the same parameters. Therefore, submissions containing empirical results based on non-deterministic optimization algorithms should provide proper statistical tests, i.e., repeated runs with measures of performance and spread (e.g., means, variances, percentiles, confidence levels).
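A minimal sketch of the kind of reporting meant here; run_optimizer is a hypothetical placeholder for any stochastic algorithm that returns the best objective value of one run.

```python
import numpy as np

def report(run_optimizer, n_runs=30, seed0=0):
    """Repeat a stochastic optimizer and summarize performance and spread."""
    bests = np.array([run_optimizer(seed=seed0 + i) for i in range(n_runs)])
    mean, std = bests.mean(), bests.std(ddof=1)
    q10, q50, q90 = np.percentile(bests, [10, 50, 90])
    half = 1.96 * std / np.sqrt(n_runs)  # 95% CI half-width, normal approx.
    print(f"mean = {mean:.4g} +/- {half:.2g} (95% CI), std = {std:.4g}")
    print(f"median = {q50:.4g}, 10th-90th percentiles = [{q10:.4g}, {q90:.4g}]")
```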

3.   Test problems

Global optimization methods are not general purpose in the way local searches are. The 'no free lunch' theorem implies that if a global optimization algorithm works well on some problems, it will work poorly on others. Therefore, authors should explain what in their algorithm is particularly suited to which class of structural or fluid mechanics problems and provide examples of applications to such problems. Examples should be chosen with a strong preference for structural or fluid problems already tested with other algorithms. It is important that the test problems include ones that cannot be solved more efficiently by restarted gradient methods. The aspect of the problem that makes it difficult for restarted gradient methods (e.g., a very large number of local optima or deceptive optima) should be clearly established, as in the illustration below.
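As one standard illustration of the first difficulty, the Rastrigin function (a classical benchmark, chosen by us for illustration) has a very large number of regularly spaced local optima, so a restarted gradient method typically ends in whichever basin each start falls into.

```python
import numpy as np

def rastrigin(x):
    """Classical multimodal benchmark: global minimum f(0) = 0, surrounded
    by a regular grid of local optima created by the cosine term."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))
```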

Finally, we notice that many scientifically recognized teams that work in global optimization with SMO applications do not reach the 100-citation mark. Examples of such contributions, which could be within SMO's scope and would raise the journal's quality, are listed in Appendix B. It also seems necessary for SMO to remain attractive to such contributions by keeping a balance between metaphorical and other global optimization articles.