Abstract
Hybridizing metaheuristics has become a common way to improve the efficiency of optimization methods. Most hybridizations combine several optimization methods with one another. In this paper we are interested in another type of hybridization, in which data-mining approaches are embedded within an optimization process. We therefore study the interest of combining metaheuristics and data mining through a short survey that enumerates the different opportunities for such combinations, illustrated with examples from the literature.
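As a concrete illustration of the kind of combination surveyed here (this is a didactic sketch, not a method from the paper), a recurring pattern is to mine knowledge from elite solutions during the search and reuse it to construct new candidates. The sketch below assumes a toy OneMax objective; the function names `mine_pattern` and `guided_solution`, and all parameter values, are hypothetical choices made for the example:

```python
import random

random.seed(0)

N = 30  # bit-string length of the toy problem


def fitness(sol):
    # Toy objective (OneMax): maximize the number of 1-bits.
    return sum(sol)


def mine_pattern(elite):
    # "Data-mining" step: extract, per position, the majority bit
    # among elite solutions; this frequent pattern guides construction.
    return [1 if 2 * sum(s[i] for s in elite) >= len(elite) else 0
            for i in range(N)]


def guided_solution(pattern, p_follow=0.8):
    # Build a new solution that follows the mined pattern with
    # probability p_follow, otherwise picks a random bit.
    return [b if random.random() < p_follow else random.randint(0, 1)
            for b in pattern]


def hybrid_search(pop_size=40, generations=50, elite_frac=0.25):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: max(2, int(elite_frac * pop_size))]
        pattern = mine_pattern(elite)           # knowledge extraction
        offspring = [guided_solution(pattern)   # knowledge exploitation
                     for _ in range(pop_size - len(elite))]
        pop = elite + offspring
    return max(pop, key=fitness)


best = hybrid_search()
print(fitness(best))
```

This mirrors, in miniature, the spirit of approaches such as LEM or DM-GRASP discussed in the survey: the mining component (here a trivial majority vote) replaces or complements blind variation operators with knowledge learned from good solutions found so far.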
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Jourdan, L., Dhaenens, C., Talbi, EG. (2006). Using Datamining Techniques to Help Metaheuristics: A Short Survey. In: Almeida, F., et al. Hybrid Metaheuristics. HM 2006. Lecture Notes in Computer Science, vol 4030. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11890584_5
Print ISBN: 978-3-540-46384-9
Online ISBN: 978-3-540-46385-6