Abstract
Bayesian classifiers such as Naive Bayes and Tree Augmented Naive Bayes (TAN) show excellent performance given their simplicity and their strong underlying independence assumptions. In this paper we prove that, under suitable conditions, the maximum a posteriori TAN model can be computed efficiently. Furthermore, we prove that a weighted set containing the k maximum a posteriori TAN models can also be calculated efficiently, which enables efficient TAN ensemble learning and accounts for model uncertainty. These results yield two classifiers, both of which allow prior knowledge about structure or parameters to be introduced into the learning process. Empirical results show that both classifiers improve the error rate and the accuracy of the predicted class probabilities over established TAN-based classifiers of equivalent complexity.
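As background to the MAP approach above, the classical TAN construction learns a tree over the attributes by finding a maximum spanning tree weighted by class-conditional mutual information. The sketch below illustrates that baseline construction only; the paper's contribution replaces these empirical weights with posterior-derived quantities, which this sketch does not implement. All function names here are illustrative, not from the paper.

```python
import math
from collections import Counter
from itertools import combinations

def cond_mutual_info(data, i, j, c_idx):
    """Empirical conditional mutual information I(X_i; X_j | C),
    estimated from a list of discrete-valued tuples."""
    n = len(data)
    n_xyc = Counter((row[i], row[j], row[c_idx]) for row in data)
    n_xc = Counter((row[i], row[c_idx]) for row in data)
    n_yc = Counter((row[j], row[c_idx]) for row in data)
    n_c = Counter(row[c_idx] for row in data)
    mi = 0.0
    for (x, y, c), cnt in n_xyc.items():
        # p(x,y|c) * p(c) * log[ p(x,y|c) / (p(x|c) p(y|c)) ],
        # rewritten in raw counts.
        mi += (cnt / n) * math.log(cnt * n_c[c] / (n_xc[(x, c)] * n_yc[(y, c)]))
    return mi

def tan_tree(data, attrs, c_idx):
    """TAN attribute tree: maximum spanning tree over the attributes,
    with edges weighted by I(X_i; X_j | C) (Kruskal with union-find)."""
    edges = sorted(((cond_mutual_info(data, i, j, c_idx), i, j)
                    for i, j in combinations(attrs, 2)), reverse=True)
    parent = {a: a for a in attrs}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # keep edge only if it joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

Directing the resulting tree from an arbitrary root and adding the class as a parent of every attribute gives the TAN structure; the MAP variant in the paper keeps this spanning-tree machinery but scores edges under a decomposable prior rather than by raw empirical mutual information.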
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Cerquides, J., Lòpez de Màntaras, R. (2004). Maximum a Posteriori Tree Augmented Naive Bayes Classifiers. In: Suzuki, E., Arikawa, S. (eds) Discovery Science. DS 2004. Lecture Notes in Computer Science(), vol 3245. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30214-8_6
DOI: https://doi.org/10.1007/978-3-540-30214-8_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23357-2
Online ISBN: 978-3-540-30214-8