Provable convergence of Nesterov’s accelerated gradient method for over-parameterized neural networks

Authors:

Highlights:

Abstract

Momentum methods, such as the heavy ball method (HB) and Nesterov’s accelerated gradient method (NAG), have been widely used in training neural networks; they incorporate the history of gradients into the current update. In practice, they often provide improved performance over (stochastic) gradient descent (GD) with faster convergence. Despite this empirical success, a theoretical understanding of their accelerated convergence rates is still insufficient. Recently, some attempts have been made by analyzing the trajectories of gradient-based methods in an over-parameterized regime, where the number of parameters is significantly larger than the number of training instances. However, the majority of existing theoretical work concerns GD, and the established convergence result for NAG is inferior to those for HB and GD, which fails to explain the practical success of NAG. In this paper, we take a step towards closing this gap by analyzing NAG in training a randomly initialized, over-parameterized two-layer fully connected neural network with ReLU activation. Although the objective function is non-convex and non-smooth, we show that NAG converges to a global minimum at a non-asymptotic linear rate of (1 − Θ(1/√κ))^t, where κ > 1 is the condition number of a Gram matrix and t is the number of iterations. Compared with the convergence rate (1 − Θ(1/κ))^t of GD, our result provides theoretical guarantees for the acceleration of NAG in neural network training. Furthermore, our findings suggest that NAG and HB have similar convergence rates. Finally, extensive experiments on six benchmark datasets have been conducted to validate our theoretical results.
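
To make the setting concrete, below is a minimal NumPy sketch of NAG training the first-layer weights of a randomly initialized two-layer ReLU network on a squared loss, as described in the abstract. The width m, step size eta, momentum beta, and the fixed ±1 output layer are illustrative assumptions for this sketch, not the exact scheme or hyperparameters analyzed in the paper.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def forward(W, a, X, m):
    # X: (n, d) inputs; W: (m, d) hidden weights; a: (m,) fixed output weights
    return relu(X @ W.T) @ a / np.sqrt(m)

def loss_and_grad(W, a, X, y, m):
    pred = forward(W, a, X, m)
    residual = pred - y                        # (n,)
    act = (X @ W.T > 0).astype(float)          # ReLU derivative, (n, m)
    # Gradient of 0.5 * ||pred - y||^2 w.r.t. W (output layer a kept fixed)
    grad_W = ((residual[:, None] * act) * a[None, :]).T @ X / np.sqrt(m)
    return 0.5 * np.sum(residual ** 2), grad_W

def train_nag(X, y, m=1024, eta=1e-2, beta=0.9, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(size=(m, d))                # random Gaussian initialization
    a = rng.choice([-1.0, 1.0], size=m)        # fixed +/-1 output layer
    W_prev = W.copy()
    for t in range(iters):
        # Look-ahead point: extrapolate along the previous update direction
        V = W + beta * (W - W_prev)
        loss, grad_V = loss_and_grad(V, a, X, y, m)
        if t % 100 == 0:
            print(f"iter {t}: loss {loss:.4f}")
        # NAG update: gradient step taken at the look-ahead point
        W_prev, W = W, V - eta * grad_V
    return W, a

# Usage on toy data with unit-norm inputs (a common assumption in NTK-style analyses)
X = np.random.default_rng(1).normal(size=(32, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = np.random.default_rng(2).normal(size=32)
W, a = train_nag(X, y)
```

The look-ahead extrapolation before the gradient step is what distinguishes NAG from HB, which evaluates the gradient at the current iterate and adds the momentum term afterwards.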

Keywords: Neural networks, Over-parameterization, Neural tangent kernel, Nesterov’s accelerated gradient method, Non-asymptotic global convergence

Article history: Received 14 December 2021, Revised 12 June 2022, Accepted 13 June 2022, Available online 18 June 2022, Version of Record 28 June 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109277