Abstract
A neural network that uses the basic Hebbian learning rule and the Bayesian combination function is defined. Analogously to Hopfield's neural network, convergence is proved for the Bayesian neural network that asynchronously updates its neurons' states. The performance of the Bayesian neural network in four medical domains is compared with that of various classification methods. The Bayesian neural network uses a more sophisticated combination function than Hopfield's neural network and uses the available information more economically. The "naive" Bayesian classifier typically outperforms the basic Bayesian neural network, since iterations in the network introduce too many errors. By restricting the number of iterations and increasing the number of fixed points, the network performs better than the naive Bayesian classifier. The Bayesian neural network is designed to learn very quickly and incrementally.
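The abstract builds on Hopfield-style associative memory: weights are set by the basic Hebbian rule and neurons update asynchronously until a fixed point is reached. As a minimal illustration of that baseline (not the paper's Bayesian combination function, and with illustrative patterns of my own choosing), the following sketch stores two bipolar patterns Hebbianly and recovers one of them from a corrupted probe:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two bipolar (+1/-1) patterns, stored with the basic Hebbian rule:
# w_ij = sum over patterns p of x_i^p * x_j^p, with zero diagonal.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
n = patterns.shape[1]
W = patterns.T @ patterns
np.fill_diagonal(W, 0)

def recall(state, max_iters=100):
    """Asynchronously update one neuron at a time until a fixed point."""
    s = state.copy()
    for _ in range(max_iters):
        changed = False
        for i in rng.permutation(n):
            h = W[i] @ s                      # local field at neuron i
            new = s[i] if h == 0 else (1 if h > 0 else -1)
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                       # fixed point reached
            break
    return s

# Corrupt the first stored pattern in one position and recover it.
probe = patterns[0].copy()
probe[2] = -probe[2]
print(recall(probe))                          # recovers patterns[0]
```

Asynchronous updates make the usual Hopfield energy argument apply: each flip can only lower the energy, so the dynamics must settle into a fixed point — the property the abstract says is proved analogously for the Bayesian network.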
References
Bratko I, Kononenko I (1987) Learning diagnostic rules from incomplete and noisy data. In: Phelps B (ed) Interactions in artificial intelligence and statistical methods. Technical Press, Hampshire
Cestnik B, Kononenko I, Bratko I (1987) Assistant 86: a knowledge elicitation tool for sophisticated users. In: Bratko I, Lavrac N (eds) Progress in machine learning. Sigma Press, Wilmslow
Guez A, Protopopescu V, Barhen J (1988) On the stability, storage capacity and design of nonlinear continuous neural networks. IEEE Trans SMC-18:80–87
Hopfield JJ, Tank DW (1985) "Neural" computation of decisions in optimization problems. Biol Cybern 52:141–152
Kohonen T (1984) Self-organization and associative memory. Springer, Berlin Heidelberg New York
Kononenko I (1989) Interpretation of neural networks decisions. Proceedings of IASTED International Conference on Expert Systems, Zurich, Switzerland, June 26–28
Kononenko I, Bratko I (1989) Informativity based evaluation criterion for classifier's performance. Mach Learn J (to appear)
Kosko B (1988) Bidirectional associative memories. IEEE Trans SMC-18:49–50
McEliece RJ, Posner EC, Rodemich ER, Venkatesh SS (1987) The capacity of the Hopfield associative memory. IEEE Trans IT-33:461–482
Michie D (1989) Personal models of rationality. J Statist Planning and Inference (in press)
Minsky M, Papert S (1969) Perceptrons. MIT Press, Cambridge
Rumelhart DE, Zipser D (1986) Feature discovery by competitive learning. In: Rumelhart DE, McClelland JL (eds) Parallel distributed processing, vol 1: Foundations. MIT Press, Cambridge
Rumelhart DE, Hinton GE, Williams RJ (1986a) Learning internal representations by error propagation. In: Rumelhart DE, McClelland JL (eds) Parallel distributed processing, vol 1: Foundations. MIT Press, Cambridge
Rumelhart DE, Hinton GE, McClelland JL (1986b) A general framework for parallel distributed processing. In: Rumelhart DE, McClelland JL (eds) Parallel distributed processing, vol 1: Foundations. MIT Press, Cambridge
Williams RJ (1986) The logic of activation functions. In: Rumelhart DE, McClelland JL (eds) Parallel distributed processing, vol 1: Foundations. MIT Press, Cambridge
Wong AJW (1988) Recognition of general patterns using neural networks. Biol Cybern 58:361–372
Kononenko, I. Bayesian neural networks. Biol. Cybern. 61, 361–370 (1989). https://doi.org/10.1007/BF00200801