Abstract
The selection of an appropriate inducer is crucial for effective classification. In previous work we presented NOEMON, a system that relied on a mapping between dataset characteristics and inducer performance to propose inducers for specific datasets. Instance-based learning was applied to a set of meta-learning problems, each associated with a specific pair of inducers, and the resulting models were used to rank inducers on new datasets.
Instance-based learning assumes that all attributes are equally important. We found, however, that the best set of discriminating attributes differs for every pair of inducers. We therefore applied a feature selection method to each meta-learning problem to obtain the best attribute subset for each pair. This significantly improves the performance of the system.
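The pairwise scheme described above can be illustrated with a minimal sketch. This is not the paper's implementation: the meta-features, the `SelectKBest` filter (standing in for the wrapper-based feature selection the paper uses), and all names are illustrative assumptions. One k-NN meta-model is trained per inducer pair, each on its own attribute subset, and pairwise win predictions are aggregated into a ranking.

```python
# Hypothetical sketch of NOEMON-style pairwise meta-learning.
# Assumed: scikit-learn, synthetic meta-data in place of real
# dataset characteristics and inducer performance records.
from itertools import combinations
from collections import defaultdict
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
inducers = ["c5.0", "ripper", "ltree"]  # illustrative inducer names

# Synthetic meta-dataset: rows = datasets, columns = meta-features
# (e.g. #classes, #attributes, class entropy). For each inducer pair,
# the label says which member of the pair performed better.
X_meta = rng.normal(size=(60, 8))
pair_labels = {
    pair: rng.integers(0, 2, size=60)  # 0 -> first wins, 1 -> second
    for pair in combinations(inducers, 2)
}

# One meta-model per pair, each with its own selected feature subset.
models = {}
for pair, y in pair_labels.items():
    model = make_pipeline(
        SelectKBest(f_classif, k=3),        # per-pair attribute subset
        KNeighborsClassifier(n_neighbors=5) # instance-based meta-learner
    )
    models[pair] = model.fit(X_meta, y)

def rank_inducers(meta_features):
    """Aggregate pairwise predictions into a ranking (most wins first)."""
    wins = defaultdict(int)
    for (a, b), model in models.items():
        winner = b if model.predict(meta_features.reshape(1, -1))[0] else a
        wins[winner] += 1
    return sorted(inducers, key=lambda ind: -wins[ind])

# Rank the inducers for a new (synthetic) dataset's meta-features.
ranking = rank_inducers(rng.normal(size=8))
```

The key design point is that feature selection happens independently inside each pairwise pipeline, so each meta-learning problem gets its own set of discriminating attributes rather than a single global one.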
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Kalousis, A., Hilario, M. (2001). Feature Selection for Meta-learning. In: Cheung, D., Williams, G.J., Li, Q. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2001. Lecture Notes in Computer Science, vol 2035. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45357-1_26
Print ISBN: 978-3-540-41910-5
Online ISBN: 978-3-540-45357-4