Symmetry-adapted representation learning

Authors:

Highlights:

Abstract

In this paper, we propose the use of data symmetries, in the sense of equivalences under signal transformations, as priors for learning symmetry-adapted data representations, i.e., representations that are equivariant to these transformations. We rely on a group-theoretic definition of equivariance and provide conditions for enforcing a learned representation, for example the weights in a neural network layer or the atoms in a dictionary, to have the structure of a group, and specifically the group structure present in the distribution of the input. By reducing the analysis of generic group symmetries to permutation symmetries, we devise a regularization scheme for representation learning algorithms that uses only an unlabeled training set. The proposed regularization is intended as a conceptual, theoretical, and computational proof of concept for symmetry-adapted representation learning, in which the learned data representations are equivariant or invariant to transformations without explicit knowledge of the underlying symmetries in the data.
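The abstract describes the approach only at a high level; as a rough intuition for what a permutation-symmetry regularizer might look like, the sketch below penalizes a dictionary whose atom set is not (approximately) closed under a known cyclic-shift action. This is a minimal illustrative example, not the authors' algorithm: the symmetry (cyclic shifts), the matching-based penalty, and all names (`cyclic_shift`, `closure_penalty`, `lam`) are assumptions made for the sake of the example.

```python
# Illustrative sketch only (not the paper's method): regularize a dictionary
# so that its set of atoms is approximately closed under cyclic shifts,
# a simple stand-in for permutation-equivariant structure.
import numpy as np

def cyclic_shift(D, s):
    """Apply a cyclic shift of s samples to every atom (column) of D."""
    return np.roll(D, shift=s, axis=0)

def closure_penalty(D, shifts):
    """Sum, over the given shifts, of the squared distance from each shifted
    atom to its nearest atom in D; small values mean the atom set is nearly
    closed under the group of shifts."""
    penalty = 0.0
    for s in shifts:
        Ds = cyclic_shift(D, s)                        # transformed atoms
        # squared distances between every shifted atom and every original atom
        dists = ((Ds[:, :, None] - D[:, None, :]) ** 2).sum(axis=0)
        penalty += dists.min(axis=1).sum()             # nearest-atom matching
    return penalty

# Toy usage: random "dictionary" of 8 atoms of length 16 on unlabeled signals.
rng = np.random.default_rng(0)
D = rng.standard_normal((16, 8))
X = rng.standard_normal((16, 100))                     # unlabeled training set
codes, *_ = np.linalg.lstsq(D, X, rcond=None)          # naive coding step
lam = 0.1                                              # regularization weight (assumed)
loss = np.linalg.norm(X - D @ codes) ** 2 + lam * closure_penalty(D, shifts=[1, 2])
print(f"regularized objective: {loss:.2f}")
```

In a full dictionary-learning loop, the closure term would be minimized jointly with the reconstruction error over both the atoms and the codes; here only a single evaluation of the regularized objective is shown.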

Keywords: Representation learning, Equivariant representations, Invariant representations, Dictionary learning, Convolutional neural networks, Regularization, Data transformations

Article history: Received 28 February 2018, Revised 25 May 2018, Accepted 22 July 2018, Available online 9 August 2018, Version of Record 23 September 2018.

DOI: https://doi.org/10.1016/j.patcog.2018.07.025