Abstract
Biologically inspired learning in artificial neural networks has attracted growing interest in recent years. However, this line of work has so far not considered whether the learned weights are themselves biologically plausible, even though modification of synaptic weights underlies learning and memory in the brain. In this paper, we propose a learning rule for an artificial feedforward neural network that learns under biological constraints: the weights obey Dale's law and are required to follow a monotonically decaying Gaussian distribution, with some zero or near-zero weights and few large weights. The rule is inspired by feedforward learning in the cerebellum, where a similar distribution of synaptic weights is observed. We evaluate the proposed rule on the MNIST handwritten digit recognition dataset and obtain a best-case test accuracy of 98.11%, comparable to the state-of-the-art accuracy of 98.4% for a two-layer feedforward neural network without any transformation of the input data or additional optimization techniques.
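To make the sign constraint concrete, below is a minimal sketch of one way to impose Dale's law on a two-layer feedforward network trained with plain SGD: each presynaptic unit is fixed as excitatory or inhibitory, and after every update any outgoing weight that disagrees with its unit's sign is clipped to zero. This is an illustrative reconstruction under stated assumptions, not the paper's CerebLearn rule; the random sign assignments, layer sizes, learning rate, and the `project_dale` helper are all hypothetical.

```python
# Minimal sketch: Dale's-law sign constraint in a two-layer network
# trained with plain SGD. Illustrative only; the paper's actual
# CerebLearn update and weight-distribution constraint are defined
# in the full text. All names and hyperparameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 784, 500, 10

# Each presynaptic unit is fixed as excitatory (+1) or inhibitory (-1):
# all of its outgoing weights must share that sign (Dale's law).
sign_in = rng.choice([-1.0, 1.0], size=(n_in, 1))
sign_hid = rng.choice([-1.0, 1.0], size=(n_hidden, 1))

# Initialize weights consistent with the sign constraint.
W1 = np.abs(rng.normal(0.0, 0.05, (n_in, n_hidden))) * sign_in
W2 = np.abs(rng.normal(0.0, 0.05, (n_hidden, n_out))) * sign_hid
b1, b2 = np.zeros(n_hidden), np.zeros(n_out)

def project_dale(W, sign):
    """Zero out any weight whose sign disagrees with its presynaptic unit.

    Clipping to zero (rather than flipping the sign) also concentrates
    mass at zero, loosely matching the many-zero / few-large weight
    distribution the paper targets.
    """
    return np.where(W * sign >= 0.0, W, 0.0)

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)                 # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)       # softmax output

def sgd_step(X, y_onehot, lr=0.1):
    """One sign-constrained SGD step on a mini-batch (cross-entropy loss)."""
    global W1, W2, b1, b2
    h, p = forward(X)
    d_logits = (p - y_onehot) / len(X)
    dW2 = h.T @ d_logits
    dh = (d_logits @ W2.T) * (h > 0)                 # ReLU gradient
    dW1 = X.T @ dh
    W2 -= lr * dW2; b2 -= lr * d_logits.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * dh.sum(axis=0)
    # Project back onto the Dale's-law-feasible set after each step.
    W1 = project_dale(W1, sign_in)
    W2 = project_dale(W2, sign_hid)

# Example call on a random mini-batch (stand-in for MNIST data):
X = rng.random((32, n_in))
Y = np.eye(n_out)[rng.integers(0, n_out, 32)]
sgd_step(X, Y)
```

Projecting after each unconstrained gradient step is only one plausible way to satisfy the constraint; the paper's rule additionally shapes the weight distribution itself, which this sketch does not attempt.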
Acknowledgements
This research was partially supported by the ICT Division Innovation Fund, Grant No. 56.00.0000.028.33.080.17-214, Government of the People's Republic of Bangladesh.
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Nimi, S.T., Adnan Arefeen, M., Adnan, M.A. (2020). CerebLearn: Biologically Motivated Learning Rule for Artificial Feedforward Neural Networks. In: Uddin, M.S., Bansal, J.C. (eds) Proceedings of International Joint Conference on Computational Intelligence. Algorithms for Intelligent Systems. Springer, Singapore. https://doi.org/10.1007/978-981-15-3607-6_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-3606-9
Online ISBN: 978-981-15-3607-6
eBook Packages: Intelligent Technologies and Robotics (R0)