A Neural Autoregressive Approach to Attention-based Recognition

Authors: Yin Zheng, Richard S. Zemel, Yu-Jin Zhang, Hugo Larochelle

Abstract

Tasks that require the synchronization of perception and action are incredibly hard and pose a fundamental challenge to the fields of machine learning and computer vision. One important example of such a task is the problem of performing visual recognition through a sequence of controllable fixations; this requires jointly deciding what inference to perform from fixations and where to perform these fixations. While these two problems are challenging when addressed separately, they become even more formidable if solved jointly. Recently, a restricted Boltzmann machine (RBM) model was proposed that could learn meaningful fixation policies and achieve good recognition performance. In this paper, we propose an alternative approach based on a feed-forward, auto-regressive architecture, which permits exact calculation of training gradients (given the fixation sequence), unlike for the RBM model. On a problem of facial expression recognition, we demonstrate the improvement gained by this alternative approach. Additionally, we investigate several variations of the model in order to shed some light on successful strategies for fixation-based recognition.
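To make the core idea concrete, below is a minimal sketch (not the authors' model) of a feed-forward, autoregressive fixation-based classifier: when the fixation sequence is held fixed, each glimpse is combined with the hidden state accumulated from previous glimpses, so the classification loss is an ordinary feed-forward function of the parameters and standard backpropagation yields exact training gradients. All names and sizes (glimpse_size, hidden_dim, the 7-class output, etc.) are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class AutoregressiveGlimpseClassifier(nn.Module):
    def __init__(self, glimpse_size=8, hidden_dim=64, num_classes=7):
        super().__init__()
        self.glimpse_size = glimpse_size
        # Encodes one glimpse patch together with its (row, col) fixation location.
        self.encode = nn.Linear(glimpse_size * glimpse_size + 2, hidden_dim)
        # Autoregressive update: the new hidden state depends on the previous one.
        self.update = nn.Linear(2 * hidden_dim, hidden_dim)
        self.classify = nn.Linear(hidden_dim, num_classes)

    def forward(self, image, fixations):
        # image: (H, W) tensor; fixations: list of (row, col) glimpse corners.
        h = torch.zeros(self.encode.out_features)
        g = self.glimpse_size
        for (r, c) in fixations:
            patch = image[r:r + g, c:c + g].reshape(-1)
            loc = torch.tensor([float(r), float(c)])
            e = torch.relu(self.encode(torch.cat([patch, loc])))
            h = torch.relu(self.update(torch.cat([h, e])))
        return self.classify(h)

# With the fixation sequence given, backward() computes exact gradients
# of the loss with respect to all parameters.
model = AutoregressiveGlimpseClassifier()
image = torch.rand(48, 48)
fixations = [(0, 0), (20, 20), (40, 40)]
logits = model(image, fixations)
loss = nn.functional.cross_entropy(logits.unsqueeze(0), torch.tensor([3]))
loss.backward()
```

Learning where to fixate (the control policy) is a separate problem on top of this inference step; the sketch only illustrates why exact gradients are available once the fixation sequence is conditioned on.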

Keywords: Deep learning, Attention-based recognition, Neural networks, Neural autoregressive distribution estimator

Review process:

Paper page: https://doi.org/10.1007/s11263-014-0765-x