Almost optimal estimates for approximation and learning by radial basis function networks

Authors: Shaobo Lin, Xia Liu, Yuanhua Rong, Zongben Xu

Abstract

This paper quantifies the approximation capability of radial basis function networks (RBFNs) and its implications for machine learning theory. Our goal is to establish almost optimal rates of approximation and learning by RBFNs. For approximation, we show that for large classes of functions, the convergence rate of approximation by RBFNs is not slower than that of multivariate algebraic polynomials. For learning, we prove that the RBFN estimator obtained via classical empirical risk minimization can theoretically realize the almost optimal learning rate. These results underlie the successful application of RBFNs in various machine learning problems.

Keywords: Learning theory, Approximation theory, Radial basis function networks, Rate of convergence


DOI: https://doi.org/10.1007/s10994-013-5406-z
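As a small illustration of the learning scheme the abstract refers to, the sketch below fits an RBFN to data by empirical risk minimization with squared loss. All specifics here are assumptions for illustration only (Gaussian radial basis, equally spaced centers, a fixed width, and a least-squares solve); they are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(x, centers, width):
    """Gaussian radial basis features phi_j(x) = exp(-(x - c_j)^2 / width^2)."""
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / width ** 2)

# Target function to approximate on [0, 1] (an arbitrary smooth example)
f = lambda x: np.sin(2 * np.pi * x)

# Training sample
x_train = rng.uniform(0, 1, 200)
y_train = f(x_train)

# RBFN estimator: fixed centers, outer weights chosen to minimize the
# empirical squared risk (a linear least-squares problem)
centers = np.linspace(0, 1, 20)
width = 0.1
Phi = rbf_features(x_train, centers, width)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Measure the resulting approximation error on a fine test grid
x_test = np.linspace(0, 1, 500)
err = np.max(np.abs(rbf_features(x_test, centers, width) @ w - f(x_test)))
print(f"sup-norm error: {err:.4f}")
```

With enough well-placed centers, the fitted network approximates a smooth target closely on the interval; the paper's contribution is to quantify how fast such errors can decay, and to show the rates are almost optimal.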