Abstract
A new classification algorithm called VFI (Voting Feature Intervals) is proposed. A concept is represented by a set of feature intervals on each feature dimension separately. Each feature participates in classification by distributing real-valued votes among the classes, and the class receiving the highest total vote is declared the predicted class. VFI is compared with the Naive Bayesian Classifier (NBC), which also treats each feature separately. Experiments on real-world datasets show that VFI achieves comparable, and sometimes better, classification accuracy than NBC. Moreover, VFI is faster than NBC on all datasets.
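The voting scheme described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: here each (feature, class) interval is simply the [min, max] range of that class's training values on that feature, and each feature splits one unit of vote equally among the classes whose interval contains the query value; the paper's actual interval construction and vote normalization differ.

```python
from collections import defaultdict

def train(X, y):
    """For each (feature, class) pair, record the [min, max] range seen in
    training -- a simplified stand-in for VFI's interval construction."""
    ranges = {}  # (feature_index, class_label) -> (lo, hi)
    for xi, label in zip(X, y):
        for f, v in enumerate(xi):
            lo, hi = ranges.get((f, label), (v, v))
            ranges[(f, label)] = (min(lo, v), max(hi, v))
    return ranges

def predict(ranges, x):
    """Each feature distributes one real-valued vote equally among the
    classes whose interval on that feature contains the value; the class
    with the highest total vote wins."""
    votes = defaultdict(float)
    for f, v in enumerate(x):
        matching = [c for (fi, c), (lo, hi) in ranges.items()
                    if fi == f and lo <= v <= hi]
        for c in matching:
            votes[c] += 1.0 / len(matching)
    return max(votes, key=votes.get) if votes else None
```

For example, training on two well-separated classes and querying a point near one of them returns that class, since both features vote for it.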
This project is supported by TUBITAK (Scientific and Technical Research Council of Turkey) under Grant EEEAG-153.
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
Cite this paper
Demiröz, G., Güvenir, H.A. (1997). Classification by Voting Feature Intervals. In: van Someren, M., Widmer, G. (eds) Machine Learning: ECML-97. ECML 1997. Lecture Notes in Computer Science, vol 1224. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62858-4_74
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-62858-3
Online ISBN: 978-3-540-68708-5
eBook Packages: Springer Book Archive