Abstract
This paper introduces seven brain-inspired rules, deeply rooted in our understanding of the brain, to improve multi-layer spiking neural networks (SNNs). The dynamics of neurons and synapses and the plasticity models are major characteristics of information processing in brain neural networks; incorporating these models and rules into traditional SNNs is therefore expected to improve their efficiency. The proposed SNN model can be divided into three parts: the spike generation layer, the hidden layers, and the output layer. In the spike generation layer, non-temporal signals such as static images are converted into spikes by both local and global feature-converting methods. In the hidden layers, rules governing dynamic neurons, synapses, the proportion of different kinds of neurons, and various spike timing dependent plasticity (STDP) models are incorporated. In the output layer, classification by excitatory neurons and winner-take-all (WTA) competition among inhibitory neurons are realized. The MNIST dataset is used to validate the classification accuracy of the proposed neural network model. Experimental results show that higher accuracy is achieved as more brain-inspired rules (with careful selection) are integrated into the learning procedure.
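As a minimal illustrative sketch of the spike generation stage (not the authors' exact local/global feature-converting method), a static image can be rate-coded into spike trains by treating each normalized pixel intensity as a per-timestep firing probability, a Bernoulli approximation of a Poisson process. The function name, parameters, and the 8-bit grayscale assumption are ours, for illustration only:

```python
import numpy as np

def poisson_encode(image, duration=100, max_rate=0.1, seed=0):
    """Rate-code a static 8-bit grayscale image into binary spike trains.

    Each pixel's normalized intensity sets the per-timestep firing
    probability of one input neuron, so brighter pixels fire more often.
    Returns an array of shape (duration, n_pixels) with 0/1 entries.
    """
    rng = np.random.default_rng(seed)
    rates = image.astype(float).ravel() / 255.0 * max_rate
    # spikes[t, i] == 1 if input neuron i fires at timestep t
    spikes = rng.random((duration, rates.size)) < rates
    return spikes.astype(np.uint8)

# Example: encode a toy 2x2 "image"
img = np.array([[0, 255], [128, 64]])
trains = poisson_encode(img, duration=200)
print(trains.shape)  # (200, 4)
```

A zero-intensity pixel never spikes, while the brightest pixel fires at the chosen maximum rate; the hidden layers would then receive these trains as input events.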
Innovation Points
This paper summarizes seven brain-inspired learning rules and applies them to improve spiking neural networks. These rules all derive from experimental studies of the biological brain, and each reflects a different aspect of learning in biological networks, such as the dynamic allocation of neurons, the adaptive growth and pruning of synapses, different synaptic plasticity mechanisms (e.g., different types of spike timing dependent plasticity), the modulation of learning by background network noise, and the regulation of learning by the ratio of excitatory to inhibitory neurons. By combining these brain-inspired rules, the experiments verify that as more carefully selected brain-inspired rules are introduced, multi-layer spiking neural networks achieve increasingly better classification performance.
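Among the plasticity mechanisms listed above, the classic pair-based form of spike timing dependent plasticity can be sketched as follows. This is a generic textbook formulation with assumed parameter values (`a_plus`, `a_minus`, time constants), not the paper's specific STDP variants: a presynaptic spike that precedes a postsynaptic spike potentiates the synapse, the reverse order depresses it, and the magnitude decays exponentially with the spike-time difference:

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change.

    delta_t = t_post - t_pre (ms). Positive delta_t (pre fires before
    post) potentiates the synapse (LTP); negative delta_t depresses it
    (LTD). Magnitudes decay exponentially with |delta_t|.
    """
    delta_t = np.asarray(delta_t, dtype=float)
    dw = np.where(
        delta_t >= 0,
        a_plus * np.exp(-delta_t / tau_plus),    # LTP branch
        -a_minus * np.exp(delta_t / tau_minus),  # LTD branch
    )
    return dw

print(stdp_dw(10.0))   # positive change (LTP)
print(stdp_dw(-10.0))  # negative change (LTD)
```

Other STDP variants mentioned in the literature (e.g., triphasic or voltage-based rules) change the shape of this kernel while keeping the same timing-dependent structure.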
Acknowledgments
This work was supported by Strategic Priority Research Program of Chinese Academy of Sciences (Grant No. XDB02060007), and Beijing Municipal Commission of Science and Technology (Grant Nos. Z151100000915070, Z161100000216124).
Cite this article
Zeng, Y., Zhang, T. & Xu, B. Improving multi-layer spiking neural networks by incorporating brain-inspired rules. Sci. China Inf. Sci. 60, 052201 (2017). https://doi.org/10.1007/s11432-016-0439-4