Mutual Improvement Between Temporal Ensembling and Virtual Adversarial Training

Authors: Wei Zhou, Cheng Lian, Zhigang Zeng, Yixin Su

Abstract

Research on semi-supervised learning (SSL) is of great significance because collecting large quantities of labeled data is very expensive in some fields. Two recent deep learning-based SSL algorithms, temporal ensembling and virtual adversarial training (VAT), have achieved state-of-the-art accuracy on some classical SSL tasks, but both have shortcomings. Because temporal ensembling simply adds random noise to the training data, its potential is not fully exploited. VAT, in turn, incurs considerable time costs because it performs two inference passes per epoch for each unlabeled sample. In this paper, we propose using virtual adversarial perturbations (VAP) in temporal ensembling instead of random noise to improve performance. Moreover, we also find that reusing VAP can accelerate the training process of VAT without an obvious loss of accuracy. The two methods are validated on MNIST, FashionMNIST and SVHN.
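The virtual adversarial perturbation the abstract refers to comes from VAT (Miyato et al.): the direction in input space that most increases the KL divergence between the model's predictions at x and at a slightly perturbed x, approximated by one power iteration. The sketch below is an illustrative NumPy version, not the authors' code; it estimates the required gradient by finite differences (so no autodiff framework is needed), and the function names and hyperparameter values (`xi`, `eps`, `h`) are assumptions for the example.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def kl(p, q):
    """KL divergence between two discrete distributions."""
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

def virtual_adversarial_perturbation(x, predict, xi=1.0, eps=0.5, h=1e-4, rng=None):
    """One-power-iteration approximation of the VAP direction (a sketch).

    `predict` maps an input vector to class probabilities. The gradient of
    KL(predict(x) || predict(x + xi*d)) with respect to d is estimated by
    forward finite differences instead of backpropagation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    p = predict(x)                           # first inference pass
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d) + 1e-12           # random unit direction
    base = kl(p, predict(x + xi * d))
    grad = np.zeros_like(d)
    for i in range(d.size):                  # finite-difference gradient
        d2 = d.copy()
        d2.flat[i] += h
        grad.flat[i] = (kl(p, predict(x + xi * d2)) - base) / h
    d = grad / (np.linalg.norm(grad) + 1e-12)
    return eps * d                           # perturbation of norm eps
```

With a toy linear-softmax classifier, `r = virtual_adversarial_perturbation(x, lambda v: softmax(W @ v))` yields the perturbation; the second inference pass the abstract mentions is then `predict(x + r)` inside the VAT consistency loss, which is why reusing a previously computed `r` saves one full pass over the unlabeled data.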

Keywords: Temporal ensembling, Virtual adversarial perturbations, Accelerate training process, Semi-supervised learning


Paper link: https://doi.org/10.1007/s11063-019-10132-7