An adaptive Drop method for deep neural networks regularization: Estimation of DropConnect hyperparameter using generalization gap

Authors:

Highlights:

• Derivation of a generalization-gap upper bound using the Rademacher complexity of DropConnect networks (an illustrative form of such a bound is sketched after this list).

• Minimization of the obtained upper bound, which is a constrained function of the DropConnect hyperparameters.

• Estimation of the DropConnect hyperparameter by optimizing the generalization gap.
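
The highlights describe a pipeline: bound the generalization gap through Rademacher complexity, then treat that bound as an objective in the DropConnect hyperparameter. As an illustration of the first step only (not the paper's specific bound), the classical Rademacher-complexity result states that, for a loss bounded in [0, 1] and an i.i.d. sample of size n, with probability at least 1 - \delta,

    \sup_{f \in \mathcal{F}} \Big( \mathbb{E}[\ell(f(x),y)] - \frac{1}{n}\sum_{i=1}^{n} \ell(f(x_i),y_i) \Big) \le 2\,\mathfrak{R}_n(\ell \circ \mathcal{F}) + \sqrt{\frac{\log(1/\delta)}{2n}},

where \mathfrak{R}_n is the empirical Rademacher complexity of the loss class. In the paper's setting this complexity term depends on the DropConnect hyperparameter, which is what makes the bound usable as an objective for estimating it.

For the DropConnect operation itself, the following is a minimal sketch of a fully connected layer whose individual weights are dropped independently during training; p_keep stands in for the hyperparameter the paper estimates, and all names are illustrative rather than taken from the paper's code.

    import numpy as np

    def dropconnect_linear(x, W, b, p_keep, rng, training=True):
        # DropConnect: each weight is kept independently with probability p_keep.
        if training:
            mask = rng.random(W.shape) < p_keep   # Bernoulli(p_keep) mask over weights
            return x @ (W * mask).T + b
        # At test time, use the expected weights (standard mean-field approximation).
        return x @ (p_keep * W).T + b

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))    # batch of 4 examples with 8 features
    W = rng.normal(size=(3, 8))    # weight matrix for 3 output units
    b = np.zeros(3)
    out = dropconnect_linear(x, W, b, p_keep=0.8, rng=rng)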

Keywords: Deep neural networks, DropConnect, Regularization, Rademacher complexity, Generalization gap

Article history: Received 4 January 2022, Revised 20 March 2022, Accepted 27 July 2022, Available online 1 August 2022, Version of Record 8 August 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109567