Towards Better Integration of Surrogate Models and Optimizers

  • Chapter
High-Performance Simulation-Based Optimization

Part of the book series: Studies in Computational Intelligence (SCI, volume 833)

Abstract

Surrogate-Assisted Evolutionary Algorithms (SAEAs) have proven very effective in solving (synthetic and real-world) computationally expensive optimization problems with a limited number of function evaluations. The two main components of SAEAs are the surrogate model and the evolutionary optimizer, both of which use parameters to control their respective behavior. These parameters are likely to interact closely, and hence the exploitation of any such relationships may lead to the design of an enhanced SAEA. In this chapter, as a first step, we focus on Kriging and the Efficient Global Optimization (EGO) framework. We discuss potentially profitable ways to integrate the model and the optimizer more closely. Furthermore, we investigate in depth how different parameters of the model and the optimizer impact optimization results. In particular, we determine whether there are any interactions between these parameters, and how the problem characteristics impact optimization results. In the experimental study, we use the popular Black-Box Optimization Benchmarking (BBOB) testbed. Interestingly, the analysis finds no evidence of significant interactions between model and optimizer parameters, but the performance of each has a significant interaction with the objective function. Based on our results, we make recommendations on how best to configure EGO.
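
To make the setting concrete, the following is a minimal sketch of an EGO-style loop combining a Kriging (Gaussian process) surrogate, here built with GPy [1], with an expected improvement (EI) infill criterion. The toy objective, budget, and the use of differential evolution as the inner optimizer are illustrative assumptions on our part, not the exact configuration studied in the chapter.

    import numpy as np
    import GPy
    from scipy.stats import norm
    from scipy.optimize import differential_evolution

    def expensive_f(x):
        # Stand-in for the expensive black-box objective (illustrative).
        return float(np.sum((np.asarray(x) - 0.5) ** 2))

    bounds = [(0.0, 1.0)] * 2          # box constraints of the search space
    rng = np.random.default_rng(0)

    # Initial design: small random sample (a Latin hypercube [30] is common).
    X = rng.uniform(0.0, 1.0, size=(10, 2))
    y = np.array([[expensive_f(x)] for x in X])

    for _ in range(20):                # limited budget of expensive evaluations
        # Surrogate component: fit the Kriging model; hyperparameters are
        # estimated by maximum likelihood with restarts (cf. notes 1 and 3).
        model = GPy.models.GPRegression(X, y, GPy.kern.RBF(input_dim=2, ARD=True))
        model.optimize_restarts(num_restarts=5, verbose=False)
        y_best = y.min()

        def neg_ei(x):
            # Negative expected improvement at x (for minimization).
            mu, var = model.predict(np.atleast_2d(x))
            m, s = mu.item(), np.sqrt(max(var.item(), 1e-12))
            z = (y_best - m) / s
            return -((y_best - m) * norm.cdf(z) + s * norm.pdf(z))

        # Optimizer component: maximize EI over the box-bounded space.
        res = differential_evolution(neg_ei, bounds, seed=0)

        # Evaluate the true function at the infill point and augment the data.
        X = np.vstack([X, res.x])
        y = np.vstack([y, [[expensive_f(res.x)]]])

    print("best observed value:", y.min())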

All authors contributed equally to this work.

Notes

  1. It should be noted that it is common to use maximum likelihood estimation of the hyperparameters, rather than integrating over all possible hyperparameters under a prior probability distribution. Some research suggests that such integration aids the optimization process, but it may increase the overall computation time [42].

  2. Rank models are also important in the domain of multi-objective optimization. Here, ranks can easily be produced (via non-dominated sorting), whereas numeric indicators (such as crowding distance or hypervolume) are still the subject of ongoing research. Rank-based models would make it possible to represent multiple objectives with a single surrogate model (a minimal sketch of non-dominated sorting follows these notes).

  3. We used limited-memory BFGS with five restarts to estimate the hyperparameters [1] (a short sketch of this fitting step also follows these notes).
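
The following minimal sketch illustrates the non-dominated sorting mentioned in note 2: a simple O(n²·m) variant that assigns each point a front rank (0 for the non-dominated front, 1 for the next front, and so on). The function names and example data are our own illustrations, not the chapter's code.

    import numpy as np

    def dominates(a, b):
        # a dominates b (minimization): no worse in every objective,
        # strictly better in at least one.
        return np.all(a <= b) and np.any(a < b)

    def non_dominated_ranks(F):
        # F: (n points) x (m objectives) matrix of objective values.
        F = np.asarray(F, dtype=float)
        remaining = set(range(len(F)))
        ranks = np.full(len(F), -1)
        rank = 0
        while remaining:
            # Points not dominated by any other remaining point form a front.
            front = {i for i in remaining
                     if not any(dominates(F[j], F[i])
                                for j in remaining if j != i)}
            ranks[list(front)] = rank
            remaining -= front
            rank += 1
        return ranks

    # Two objectives; the single rank vector could serve as the training
    # target of one rank-based surrogate covering both objectives.
    print(non_dominated_ranks([[1.0, 4.0], [2.0, 3.0],
                               [3.0, 3.5], [4.0, 1.0]]))  # -> [0 0 1 0]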
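The hyperparameter estimation step of note 3 can be reproduced with GPy [1] roughly as follows; the toy data, kernel choice (ARD RBF), and sample sizes are illustrative assumptions on our part.

    import numpy as np
    import GPy

    rng = np.random.default_rng(1)
    X = rng.uniform(0.0, 1.0, size=(20, 2))                 # toy inputs
    y = np.sin(3.0 * X[:, :1]) + 0.05 * rng.standard_normal((20, 1))

    kernel = GPy.kern.RBF(input_dim=2, ARD=True)
    model = GPy.models.GPRegression(X, y, kernel)

    # Maximum likelihood estimation of the hyperparameters (note 1),
    # using limited-memory BFGS with five random restarts (note 3).
    model.optimize_restarts(num_restarts=5, optimizer='lbfgsb',
                            verbose=False)

    print("log marginal likelihood:", model.log_likelihood())
    print("fitted lengthscales:", kernel.lengthscale.values)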

References

  1. GPy: a Gaussian process framework in python (since 2012). http://github.com/SheffieldML/GPy

  2. Acar, E., Rais-Rohani, M.: Ensemble of metamodels with optimized weight factors. Struct. Multidiscip. Optim. 37(3), 279–294 (2009)

  3. Aggarwal, C.C., Hinneburg, A., Keim, D.A.: On the surprising behavior of distance metrics in high dimensional space. In: Database Theory — ICDT 2001, pp. 420–434. Springer Science + Business Media, Berlin (2001)

  4. Andrianakis, I., Vernon, I.R., McCreesh, N., McKinley, T.J., Oakley, J.E., Nsubuga, R.N., Goldstein, M., White, R.G.: Bayesian history matching of complex infectious disease models using emulation: a tutorial and a case study on HIV in Uganda. PLoS Comput. Biol. 11(1), 1–18 (2015)

  5. Barber, D.: Bayesian Reasoning and Machine Learning. Cambridge University Press, Cambridge (2012)

  6. Bartz-Beielstein, T.: A survey of model-based methods for global optimization. In: Papa, G., Mernik, M. (eds.) Bioinspired Optimization Methods and Their Applications, pp. 1–18 (2016)

  7. Bull, A.D.: Convergence rates of efficient global optimization algorithms. J. Mach. Learn. Res. 12, 2879–2904 (2011)

  8. Chugh, T., Sindhya, K., Hakanen, J., Miettinen, K.: A survey on handling computationally expensive multiobjective optimization problems with evolutionary algorithms. Soft Comput. 23(9), 3137–3166 (2019). https://doi.org/10.1007/s00500-017-2965-0

  9. Chugh, T., Jin, Y., Miettinen, K., Hakanen, J., Sindhya, K.: A surrogate-assisted reference vector guided evolutionary algorithm for computationally expensive many-objective optimization. IEEE Trans. Evol. Comput. 22(1), 129–142 (2018). https://doi.org/10.1109/TEVC.2016.262230

  10. Coello, C., Lamont, G., Veldhuizen, D.: Evolutionary Algorithms for Solving Multi-objective Problems, 2nd edn. Springer, New York (2007)

  11. Couckuyt, I., Deschrijver, D., Dhaene, T.: Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization. J. Glob. Optim. 60, 575–594 (2014)

  12. Deb, K.: Multi-objective Optimization Using Evolutionary Algorithms. Wiley, Chichester (2001)

  13. Eiben, A., Smit, S.: Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm Evol. Comput. 1(1), 19–31 (2011)

  14. Forrester, A., Sobester, A., Keane, A.: Engineering Design via Surrogate Modelling. Wiley, New York (2008)

  15. Forrester, A.I., Sóbester, A., Keane, A.J.: Multi-fidelity optimization via surrogate modelling. Proc. R. Soc. A: Math. Phys. Eng. Sci. 463(2088), 3251–3269 (2007)

  16. Fortin, F.A., De Rainville, F.M., Gardner, M.A., Parizeau, M., Gagné, C.: DEAP: evolutionary algorithms made easy. J. Mach. Learn. Res. 13, 2171–2175 (2012)

  17. Hansen, N.: Compilation of results on the 2005 CEC benchmark functions (2005)

  18. Hansen, N., Finck, S., Ros, R., Auger, A.: Real-parameter black-box optimization benchmarking 2009: noiseless functions definitions. Research report RR-6829, INRIA (2009)

  19. Hansen, N., Auger, A., Mersmann, O., Tušar, T., Brockhoff, D.: COCO: a platform for comparing continuous optimizers in a black-box setting (2016). arXiv:1603.08785

  20. Hauschild, M., Pelikan, M.: An introduction and survey of estimation of distribution algorithms. Swarm Evol. Comput. 1(3), 111–128 (2011)

  21. Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration (extended version). Technical report TR-2010-10, University of British Columbia, Department of Computer Science (2010)

  22. Jin, Y.: Surrogate-assisted evolutionary computation: recent advances and future challenges. Swarm Evol. Comput. 1(2), 61–70 (2011)

  23. Jones, D., Schonlau, M., Welch, W.: Efficient global optimization of expensive black-box functions. J. Glob. Optim. 13, 455–492 (1998)

  24. Knowles, J.: ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Trans. Evol. Comput. 10, 50–66 (2006)

  25. Lane, F., Azad, R.M.A., Ryan, C.: Principled evolutionary algorithm search operator design and the kernel trick. In: 2016 IEEE Symposium Series on Computational Intelligence (SSCI), pp. 1–9 (2016). https://doi.org/10.1109/SSCI.2016.7850204

  26. Le, M.N., Ong, Y.S., Menzel, S., Jin, Y., Sendhoff, B.: Evolution by adapting surrogates. Evol. Comput. 21(2), 313–340 (2013)

  27. Lim, D., Jin, Y.: Generalizing surrogate-assisted evolutionary computation. IEEE Trans. Evol. Comput. 14, 329–354 (2010)

  28. Lockwood, B.A., Anitescu, M.: Gradient-enhanced universal Kriging for uncertainty propagation. Nucl. Sci. Eng. 170(2), 168–195 (2012)

  29. Loshchilov, I., Schoenauer, M., Sebag, M.: Dominance-based Pareto-surrogate for multi-objective optimization. In: Deb, K., et al. (eds.) Simulated Evolution and Learning (SEAL 2010). LNCS, vol. 6457, pp. 230–239. Springer, Berlin (2010)

  30. McKay, M., Beckman, R., Conover, W.: A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 42, 55–61 (2000)

  31. Mogilicharla, A., Chugh, T., Majumder, S., Mitra, K.: Multi-objective optimization of bulk vinyl acetate polymerization with branching. Mater. Manuf. Process. 29, 210–217 (2014)

  32. Montgomery, D.C.: Design and Analysis of Experiments, 5th edn. Wiley, New York (1997)

  33. Moscato, P., Cotta, C.: A Gentle Introduction to Memetic Algorithms, pp. 105–144. Springer, Boston (2003)

  34. Oyama, A., Okabe, Y., Shimoyama, K., Fujii, K.: Aerodynamic multiobjective design exploration of a flapping airfoil using a Navier-Stokes solver. J. Aerosp. Comput. Inf. Commun. 6, 256–270 (2009)

  35. Ponweiser, W., Wagner, T., Biermann, D., Vincze, M.: Multiobjective optimization on a limited budget of evaluations using model-assisted S-metric selection. In: Proceedings of the Parallel Problem Solving from Nature-PPSN X, pp. 784–794. Springer, Berlin (2008)

  36. Rahat, A.A.M., Everson, R.M., Fieldsend, J.E.: Alternative infill strategies for expensive multi-objective optimisation. In: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 873–880. ACM (2017)

  37. Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. The MIT Press, Cambridge (2006)

  38. Runarsson, T.P.: Ordinal regression in evolutionary computation. In: Runarsson, T.P., Beyer, H.G., Burke, E., Merelo-Guervós, J.J., Whitley, L.D., Yao, X. (eds.) Parallel Problem Solving from Nature - PPSN IX: 9th International Conference, Reykjavik, Iceland, 9–13 September 2006, Proceedings, pp. 1048–1057. Springer, Berlin (2006)

  39. Shahriari, B., Swersky, K., Wang, Z., Adams, R.P., de Freitas, N.: Taking the human out of the loop: a review of Bayesian optimization. Proc. IEEE 104(1), 148–175 (2016)

  40. Singh, H., Ray, T., Smith, W.: Surrogate assisted simulated annealing (SASA) for constrained multi-objective optimization. In: Proceedings of the IEEE Congress on Evolutionary Computation, pp. 1–8. IEEE (2010)

  41. Snelson, E., Ghahramani, Z.: Sparse Gaussian processes using pseudo-inputs. Adv. Neural Inf. Process. Syst. 18, 1257 (2006)

  42. Snoek, J., Larochelle, H., Adams, R.P.: Practical Bayesian optimization of machine learning algorithms. In: Advances in Neural Information Processing Systems, pp. 2951–2959 (2012)

  43. Tripathy, R., Bilionis, I., Gonzalez, M.: Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation. J. Comput. Phys. 321, 191–223 (2016)

  44. Ursem, R.K.: From Expected Improvement to Investment Portfolio Improvement: Spreading the Risk in Kriging-Based Optimization, pp. 362–372. Springer International Publishing, Cham (2014)

  45. Volz, V., Rudolph, G., Naujoks, B.: Surrogate-assisted partial order-based evolutionary optimisation. In: Conference on Evolutionary Multi-Criterion Optimization (EMO), pp. 639–653. Springer, Cham (2017)

  46. Wang, H., van Stein, B., Emmerich, M., Bäck, T.: Time complexity reduction in efficient global optimization using cluster Kriging. In: Proceedings of the Genetic and Evolutionary Computation Conference, GECCO’17, pp. 889–896. ACM, New York (2017)

  47. Weihs, C.: MOI-MBO: multiobjective infill for parallel model-based optimization. In: Learning and Intelligent Optimization: 8th International Conference, LION 8, Gainesville, FL, USA, 16–21 February 2014. Revised Selected Papers, vol. 8426, p. 173. Springer (2014)

  48. Wessing, S., Preuss, M.: The true destination of EGO is multi-local optimization (2017). arXiv:1704.05724

  49. Yao, X.: An empirical study of genetic operators in genetic algorithms. Microprocess. Microprogram. 38(1), 707–714 (1993)

  50. Zaefferer, M., Stork, J., Friese, M., Fischbach, A., Naujoks, B., Bartz-Beielstein, T.: Efficient global optimization for combinatorial problems. In: Proceedings of the 2014 Conference on Genetic and Evolutionary Computation, GECCO’14, pp. 871–878. ACM, New York (2014)

Acknowledgements

Parts of this work are the result of discussions at the Surrogate-Assisted Multi-Criteria Optimization (SAMCO) Lorentz Center Workshop in Leiden, NL (February 29 to March 4, 2016). The research of Tinkle Chugh was supported by the FiDiPro project DeCoMo, funded by Tekes, the Finnish Funding Agency for Innovation, and by the Natural Environment Research Council [grant number NE/P017436/1]. Alma Rahat was supported by the Engineering and Physical Sciences Research Council, UK [grant number EP/M017915/1]. The research of Martin Zaefferer is part of a project that has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 692286.

Author information

Corresponding author

Correspondence to Martin Zaefferer.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Chugh, T., Rahat, A., Volz, V., Zaefferer, M. (2020). Towards Better Integration of Surrogate Models and Optimizers. In: Bartz-Beielstein, T., Filipič, B., Korošec, P., Talbi, EG. (eds) High-Performance Simulation-Based Optimization. Studies in Computational Intelligence, vol 833. Springer, Cham. https://doi.org/10.1007/978-3-030-18764-4_7
