An Efficient Hardware Implementation of Feed-Forward Neural Networks

Authors: Tamás Szabó, Gábor Horváth

Abstract

This paper proposes a new approach to the digital hardware implementation of nonlinear activation functions in feed-forward neural networks. The basic idea of this realization is that the nonlinear functions can be implemented using a matrix-vector multiplication. Recently a new approach was proposed for the efficient realization of matrix-vector multipliers, and this approach can be applied to implementing nonlinear functions if these functions are approximated by simple basis functions. The paper proposes the use of B-spline basis functions to approximate nonlinear sigmoidal functions, shows that this approximation fulfils the general requirements on activation functions, presents the details of the proposed hardware implementation, and summarizes an extensive study of the effects of B-spline nonlinear function realization on the size and trainability of feed-forward neural networks.
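To illustrate the core idea in software terms, the sketch below approximates a sigmoid with a weighted sum of B-spline basis functions, so that evaluating the activation reduces to a dot product between basis-function values and a stored coefficient vector, the kind of operation that maps onto a matrix-vector multiplier. This is only a minimal illustration, not the paper's hardware scheme; the cubic degree, the uniform knot grid, the input range [-8, 8], and the least-squares fit are assumptions made for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions of the given degree at points x
    via the Cox-de Boor recursion. Returns shape (len(x), len(knots)-degree-1)."""
    x = np.asarray(x, dtype=float)
    # Degree-0 basis: indicator functions of the knot spans.
    B = np.zeros((len(x), len(knots) - 1))
    for i in range(len(knots) - 1):
        B[:, i] = (x >= knots[i]) & (x < knots[i + 1])
    # Make the upper boundary right-closed: assign x == last knot to the
    # last knot span of positive width.
    last = np.max(np.nonzero(np.diff(knots) > 0))
    B[x == knots[-1], last] = 1.0
    # Raise the degree one step at a time (0/0 terms are treated as 0).
    for d in range(1, degree + 1):
        B_new = np.zeros((len(x), len(knots) - d - 1))
        for i in range(len(knots) - d - 1):
            den1 = knots[i + d] - knots[i]
            den2 = knots[i + d + 1] - knots[i + 1]
            t1 = (x - knots[i]) / den1 * B[:, i] if den1 > 0 else 0.0
            t2 = (knots[i + d + 1] - x) / den2 * B[:, i + 1] if den2 > 0 else 0.0
            B_new[:, i] = t1 + t2
        B = B_new
    return B

# Fit the sigmoid on a bounded input range (assumed [-8, 8]) with a
# clamped uniform cubic B-spline.
degree = 3
interior = np.linspace(-8.0, 8.0, 17)
knots = np.concatenate([[interior[0]] * degree, interior, [interior[-1]] * degree])

x_fit = np.linspace(-8.0, 8.0, 400)
Phi = bspline_basis(x_fit, knots, degree)                    # basis-function matrix
coeffs, *_ = np.linalg.lstsq(Phi, sigmoid(x_fit), rcond=None)

def activation(x):
    """Approximate sigmoid: a dot product between basis-function values
    and the stored coefficient vector."""
    return bspline_basis(np.atleast_1d(x), knots, degree) @ coeffs

print(activation(0.0), sigmoid(0.0))                         # ~0.5 vs 0.5
print(np.max(np.abs(activation(x_fit) - sigmoid(x_fit))))    # worst-case fit error
```

In hardware, the coefficient vector would be stored once and the basis-function values (which are sparse, since only a few B-splines are nonzero at any input) would drive the shared matrix-vector multiplier; here the least-squares fit simply stands in for whatever coefficient-selection procedure is used.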

Keywords: feed-forward neural networks, B-spline approximation, activation function, hardware implementation


DOI: https://doi.org/10.1023/B:APIN.0000033634.62074.46