Abstract
In this paper, we propose a novel generalized single-hidden-layer feedforward network (GSLFN) in which the output weights connecting randomly generated hidden units to the corresponding output nodes are polynomial functions of the inputs. The main contributions are as follows. For N arbitrary distinct observations with n-dimensional inputs, the augmented hidden output matrix of a GSLFN with L hidden nodes, using any infinitely differentiable activation function, consists of L sub-matrix blocks, each comprising n + 1 column vectors. The rank of the augmented hidden output matrix is proved to be no less than that of the SLFN, which contributes to higher approximation performance. Furthermore, under minor constraints on the input observations, we rigorously prove that a GSLFN with L hidden nodes can exactly learn L(n + 1) arbitrary distinct observations, i.e., n + 1 times as many as an SLFN. If approximation error is allowed, then by optimizing the output weight coefficients, the GSLFN may require fewer than N/(n + 1) random hidden nodes to estimate targets with high accuracy. These theoretical results show that the GSLFN is significantly superior to the SLFN.
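To make the construction concrete, the following is a minimal sketch of the GSLFN idea as described in the abstract: each random hidden node's output is paired with the augmented input vector [1, x_1, ..., x_n], so the hidden output matrix has L(n + 1) columns rather than L, and the polynomial output-weight coefficients are fitted by least squares. The sigmoid activation, the least-squares solver, and all names (gslfn_fit, H_aug, etc.) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gslfn_fit(X, T, L, rng=np.random.default_rng(0)):
    """Fit a GSLFN with L random hidden nodes (illustrative sketch).

    X : (N, n) inputs, T : (N, m) targets.
    Output weights are first-order polynomials of the input, so each
    hidden node contributes n + 1 columns to the augmented matrix.
    """
    N, n = X.shape
    W = rng.standard_normal((n, L))           # random input weights
    b = rng.standard_normal(L)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))    # (N, L) hidden outputs

    # Augment: pair each hidden output h_i(x) with [1, x_1, ..., x_n],
    # yielding an (N, L*(n+1)) matrix instead of the SLFN's (N, L).
    X1 = np.hstack([np.ones((N, 1)), X])      # (N, n+1)
    H_aug = (H[:, :, None] * X1[:, None, :]).reshape(N, L * (n + 1))

    # Least-squares estimate of the polynomial output-weight coefficients.
    beta, *_ = np.linalg.lstsq(H_aug, T, rcond=None)
    return W, b, beta

def gslfn_predict(X, W, b, beta):
    N = X.shape[0]
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    X1 = np.hstack([np.ones((N, 1)), X])
    H_aug = (H[:, :, None] * X1[:, None, :]).reshape(N, -1)
    return H_aug @ beta
```

Under this reading, setting n = 0 (no input-dependent terms) recovers the ordinary ELM-style SLFN, which is why the GSLFN can interpolate up to n + 1 times as many observations with the same number of random hidden nodes.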
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Wang, N., Han, M., Yu, G., Er, M.J., Meng, F., Sun, S. (2013). Generalized Single-Hidden Layer Feedforward Networks. In: Guo, C., Hou, ZG., Zeng, Z. (eds) Advances in Neural Networks – ISNN 2013. ISNN 2013. Lecture Notes in Computer Science, vol 7951. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39065-4_12
DOI: https://doi.org/10.1007/978-3-642-39065-4_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-39064-7
Online ISBN: 978-3-642-39065-4