Abstract
The k-Nearest Neighbors (k-NN) algorithm is a highly effective method in many application areas, and its conceptual simplicity makes it easy to understand. However, when an algorithm's performance is measured against three criteria (simplicity, processing time, and prediction power), k-NN falls short on computation speed and on maintaining high accuracy across different values of k: its results remain sensitive to the choice of k, and prediction accuracy degrades as k grows large. To overcome these issues, this paper introduces a kd-tree based dual-kNN approach that targets both weaknesses, maintaining classification accuracy across different k values while improving processing time. Experiments on real data sets comparing this algorithm with two others (dual-kNN and normal k-NN) confirm that kd-tree based dual-kNN is a more effective and robust classification approach than pure dual-kNN and normal k-NN.
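Although the abstract does not detail the dual-kNN decision rule itself, the kd-tree acceleration it builds on can be illustrated with a minimal sketch. The snippet below shows a standard kd-tree backed k-NN classifier using scipy.spatial.cKDTree with a simple majority vote; the library choice, the vote rule, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of kd-tree accelerated k-NN classification.
# Assumptions: scipy's cKDTree as the kd-tree, plain majority vote as the
# decision rule. The paper's dual-kNN rule is not reproduced here.
import numpy as np
from scipy.spatial import cKDTree

def kdtree_knn_classify(train_X, train_y, query_X, k=5):
    """Label each query point by majority vote among its k nearest
    training points, located via a kd-tree instead of a linear scan."""
    tree = cKDTree(train_X)            # build the tree once: O(n log n)
    _, idx = tree.query(query_X, k=k)  # indices of k nearest neighbors per query
    neighbor_labels = train_y[idx]     # shape: (n_queries, k)
    # Majority vote over each row of neighbor labels
    return np.array([np.bincount(row).argmax() for row in neighbor_labels])

if __name__ == "__main__":
    # Toy 2-D example: two linearly separated classes
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    queries = rng.normal(size=(5, 2))
    print(kdtree_knn_classify(X, y, queries, k=7))
```

The kd-tree reduces the per-query neighbor search from O(n) to roughly O(log n) on average in low dimensions, which is the processing-time gain the abstract refers to.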
Cite this paper
Aung, S.S., Nagayama, I., Tamaki, S. (2019). A High Performance Classifier by Dimensional Tree Based Dual-kNN. In: Arai, K., Kapoor, S., Bhatia, R. (eds) Intelligent Systems and Applications. IntelliSys 2018. Advances in Intelligent Systems and Computing, vol 868. Springer, Cham. https://doi.org/10.1007/978-3-030-01054-6_46