Abstract
Radial basis function (RBF) networks have been proven to be universal approximators when enough hidden nodes are given and proper parameters are selected. Conventional algorithms for training RBF networks, including two-stage methods and gradient-based algorithms, are computationally expensive and have difficulty determining the network size. In this paper, a new greedy incremental (GI) algorithm is proposed that constructs the RBF network by adding hidden nodes one by one; each added hidden node is trained once and then fixed. The parameters of each added hidden node are trained in a greedy way to approximate the local area around the pattern with the largest error magnitude: the center and weight are determined by local regression, and the width is tuned iteratively with a simple rule. The proposed greedy incremental algorithm is tested on several practical experiments and compared with other popular algorithms. The experimental results show that the GI algorithm approximates functions universally with high efficiency and robustness.
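The abstract's construction loop can be sketched in code. This is a simplified illustration under stated assumptions, not the paper's exact method: the center is placed at the pattern with the largest residual magnitude, the weight is taken as the residual at that pattern (a zeroth-order stand-in for the paper's local regression), and the width is chosen from a small candidate grid `width_grid` (a stand-in for the paper's iterative width-tuning rule). The function names and parameters are hypothetical.

```python
import numpy as np

def rbf(X, center, width):
    """Gaussian basis function evaluated at every row of X."""
    d2 = np.sum((X - center) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def greedy_incremental_rbf(X, y, n_nodes=20,
                           width_grid=(0.05, 0.1, 0.2, 0.4, 0.8)):
    """Sketch of greedy incremental RBF construction.

    Each iteration adds one hidden node, trained once and then fixed:
    the center is the input pattern with the largest residual magnitude,
    the weight is the residual at that pattern, and the width is the
    grid candidate that most reduces the residual norm.
    """
    residual = y.astype(float).copy()
    nodes = []
    for _ in range(n_nodes):
        # Pattern with the biggest error magnitude becomes the new center.
        i = int(np.argmax(np.abs(residual)))
        center, weight = X[i], residual[i]
        # Pick the width that best shrinks the overall residual.
        best_w, best_err = width_grid[0], np.inf
        for w in width_grid:
            err = np.linalg.norm(residual - weight * rbf(X, center, w))
            if err < best_err:
                best_w, best_err = w, err
        # The node is fixed; subtract its contribution from the residual.
        residual -= weight * rbf(X, center, best_w)
        nodes.append((center, best_w, weight))

    def predict(Xq):
        out = np.zeros(len(Xq))
        for c, w, a in nodes:
            out += a * rbf(Xq, c, w)
        return out
    return predict
```

For example, fitting `y = sin(3x)` on 200 points in [-1, 1] with 30 nodes drives the training error well below the baseline, since each added node zeroes the residual at its center and shrinks it in the surrounding local area.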
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Wu, X., Wilamowski, B.M. (2014). A Greedy Incremental Algorithm for Universal Approximation with RBF Networks. In: Fodor, J., Fullér, R. (eds) Advances in Soft Computing, Intelligent Robotics and Control. Topics in Intelligent Engineering and Informatics, vol 8. Springer, Cham. https://doi.org/10.1007/978-3-319-05945-7_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-05944-0
Online ISBN: 978-3-319-05945-7