Parallel Implementation of the Nonlinear Semi-NMF Based Alternating Optimization Method for Deep Neural Networks

Authors: Akira Imakura, Yuto Inoue, Tetsuya Sakurai, Yasunori Futamura

Abstract

For computing the weights of deep neural networks (DNNs), the backpropagation (BP) method has been widely used as the de facto standard algorithm. Because the BP method is based on stochastic gradient descent using derivatives of the objective function, it can have difficulty finding appropriate hyperparameters such as the learning rate. As an alternative approach to computing weight matrices, we recently proposed an alternating optimization method based on linear and nonlinear semi-nonnegative matrix factorizations (semi-NMFs). In this paper, we propose a parallel implementation of the nonlinear semi-NMF based method. The experimental results show that our nonlinear semi-NMF based method and its parallel implementation are competitive with conventional DNNs trained by the BP method.
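To illustrate the core building block the abstract refers to, the sketch below shows a nonlinear semi-NMF in its simplest form: factoring a matrix A as A ≈ relu(W H) with H constrained to be nonnegative, updated by alternating least squares with a projection step. This is a minimal, simplified scheme for illustration only; the function name, the use of the pseudoinverse, and the projection-based H update are assumptions, not the authors' exact algorithm, which handles the nonlinearity more carefully and runs in parallel.

```python
import numpy as np

def nonlinear_semi_nmf(A, k, iters=50, seed=0):
    """Illustrative sketch: factor A ~= relu(W @ H) with H >= 0.

    W is unconstrained (semi-NMF), H is kept nonnegative. The updates
    below linearize the problem (ignore the relu inside the solves)
    and project H onto the nonnegative orthant; this is a simplified
    stand-in for the alternating optimization described in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.standard_normal((m, k))
    H = np.abs(rng.standard_normal((k, n)))  # nonnegative init
    for _ in range(iters):
        # W update: unconstrained least squares given H
        W = A @ np.linalg.pinv(H)
        # H update: least squares given W, then project to H >= 0
        H = np.maximum(np.linalg.pinv(W) @ A, 0.0)
    return W, H
```

In the paper's setting, each DNN layer's output-weight pair plays the role of (H, W), and the linear and nonlinear factorizations are applied layer by layer in an alternating fashion.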

Keywords: Deep neural networks, Nonlinear semi-nonnegative matrix factorizations, Parallel implementation

Paper link: https://doi.org/10.1007/s11063-017-9642-2