Abstract
Many sophisticated classification algorithms have been proposed, but there is no clear methodology for comparing their results. In our experiments on popular benchmark datasets, k-NN with properly tuned parameters performs best on average. Tuning the parameters includes choosing the proper k, the proper distance measure, and the proper weighting function. k-NN has zero training time, and its test time can be significantly reduced by prior reference-vector selection, which needs to be done only once, or by applying advanced nearest-neighbor search strategies (such as the KD-tree algorithm). We therefore propose that, instead of comparing a new algorithm with an author's choice of old ones (which may be selected in favour of the new method), the new method should first be compared with properly tuned k-NN as a gold standard. Based on that comparison, the author of the new method would have to answer the question: "Do we really need this method, since we already have k-NN?"
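The three tuning knobs the abstract names (the value of k, the distance measure, and the weighting function) can be illustrated with a minimal distance-weighted k-NN sketch. This is not the authors' implementation; all names, data, and parameter choices below are illustrative.

```python
# Minimal k-NN sketch showing the tuning knobs from the abstract:
# k, the distance measure, and the neighbor-weighting function.
import math
from collections import defaultdict

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_predict(train, query, k=3, dist=euclidean, weight=lambda d: 1.0):
    """Classify `query` by a weighted vote among its k nearest training points.

    `train` is a list of (feature_tuple, label) pairs. With the default
    weight function this is plain majority voting; passing e.g. an
    inverse-distance weight makes closer neighbors count more.
    """
    neighbors = sorted(train, key=lambda pt: dist(pt[0], query))[:k]
    votes = defaultdict(float)
    for features, label in neighbors:
        votes[label] += weight(dist(features, query))
    return max(votes, key=votes.get)

if __name__ == "__main__":
    train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
             ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]
    # Inverse-distance weighting; the epsilon guards against division by zero
    # when the query coincides with a training point.
    inv = lambda d: 1.0 / (d + 1e-9)
    print(knn_predict(train, (0.2, 0.1), k=3, dist=euclidean, weight=inv))  # prints "A"
```

This brute-force search is O(n) per query; the KD-tree strategy the abstract mentions replaces the `sorted(...)` scan with a tree lookup to cut the test time, without changing the voting logic.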
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Kordos, M., Blachnik, M., Strzempa, D. (2010). Do We Need Whatever More Than k-NN?. In: Rutkowski, L., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2010. Lecture Notes in Computer Science(), vol 6113. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13208-7_52
DOI: https://doi.org/10.1007/978-3-642-13208-7_52
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-13207-0
Online ISBN: 978-3-642-13208-7
eBook Packages: Computer Science (R0)