On the Choice of Inter-Class Distance Maximization Term in Siamese Neural Networks

Author: Abdulrahman O. Ibraheem

Abstract

Recent systems from premier research labs, such as Facebook's and Google's, employ variants of the basic siamese neural network (SNN), a testimony to how important SNNs are becoming in practical applications. The objective function of an SNN comprises two terms. Whereas the choice of the first term raises no issues, the choice of the second term appears to raise some, along two lines: 1. a priori boundedness from below; and 2. vanishing gradients. In this work, I therefore study four possible candidates for the second term, in order to investigate the roles that a priori boundedness from below and vanishing gradients play in classification accuracy, and, more importantly from a practical standpoint, to elucidate the effects on classification accuracy of using different types of second terms in SNNs. My results suggest that neither a priori boundedness nor vanishing gradients is a crisp, decisive factor governing the performance of the candidate functions. However, the results show that, of the four candidates evaluated, one candidate delivers generally superior performance. I therefore recommend this candidate to the community, a recommendation that gains particular importance against the backdrop of another facet of this work's results, which indicates that choosing the wrong objective function can cause classification accuracy to drop by as much as \(17\%\).
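The abstract does not spell out the four candidate second terms it compares. For orientation only, the sketch below shows the classic two-term contrastive objective in the style of Hadsell, Chopra and LeCun (2006), using a margin hinge as one common, illustrative choice of inter-class (second) term; it is an assumption for illustration, not necessarily the candidate this paper recommends.

```python
import numpy as np

def contrastive_loss(d, y, margin=1.0):
    """Two-term SNN pair loss (illustrative sketch, not the paper's exact candidates).

    d: Euclidean distance between the two embeddings of a pair (array or scalar).
    y: 1 if the pair is from the same class, 0 otherwise.
    margin: separation target for different-class pairs (assumed hyperparameter).
    """
    # First term: pull same-class pairs together (intra-class distance minimization).
    intra = y * 0.5 * d ** 2
    # Second term: push different-class pairs apart, here via a margin hinge,
    # which is bounded below by zero and has zero gradient once d >= margin.
    inter = (1 - y) * 0.5 * np.maximum(margin - d, 0.0) ** 2
    return intra + inter

# Minimal usage example with made-up pair distances and labels.
d = np.array([0.2, 1.5, 0.9])
y = np.array([1, 0, 0])
print(contrastive_loss(d, y))
```

The comments flag the two properties the paper examines: whether the second term is bounded below a priori and whether its gradient can vanish; alternative second terms differ precisely in how they handle these.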

Keywords: Siamese, Neural networks, Backpropagation


Paper URL: https://doi.org/10.1007/s11063-018-9882-9