Enhancing low-resource neural machine translation with syntax-graph guided self-attention

Authors:

Highlights:

• We propose a syntax-aware self-attention that integrates syntactic knowledge.

• The syntactic dependency is exploited as guidance, without any extra cost.

• The syntactic dependency is converted into a graph that is combined with the NMT model (see the sketch after this list).

• The syntax-aware approach also explicitly exploits sub-word units.

• We introduce multiple attention representations for stronger robustness.

• Experiments demonstrate that the approach achieves state-of-the-art results.
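The highlights describe biasing Transformer self-attention with a source-side dependency graph. Below is a minimal, hypothetical sketch of that idea in PyTorch: attention scores receive an additive bias on positions linked by a dependency arc. The class name `SyntaxGuidedSelfAttention`, the single-head formulation, and the learnable scalar bias are illustrative assumptions, not the authors' exact method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SyntaxGuidedSelfAttention(nn.Module):
    """Single-head self-attention biased by a dependency graph.

    Hypothetical sketch of syntax-graph guided attention; not the paper's code.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5
        # Learnable weight controlling how strongly the syntax graph guides attention.
        self.graph_bias = nn.Parameter(torch.tensor(1.0))

    def forward(self, x: torch.Tensor, dep_graph: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) token representations
        # dep_graph: (batch, seq_len, seq_len) adjacency matrix of the
        #            source dependency tree (1.0 where an arc exists)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        # Push attention mass toward syntactic neighbours via an additive bias.
        scores = scores + self.graph_bias * dep_graph
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)


if __name__ == "__main__":
    batch, seq_len, d_model = 2, 5, 16
    layer = SyntaxGuidedSelfAttention(d_model)
    x = torch.randn(batch, seq_len, d_model)
    # Toy dependency graph: each token attached to its left neighbour.
    dep_graph = torch.zeros(batch, seq_len, seq_len)
    for i in range(1, seq_len):
        dep_graph[:, i, i - 1] = 1.0
    out = layer(x, dep_graph)
    print(out.shape)  # torch.Size([2, 5, 16])
```

When sub-word segmentation is used, the dependency arcs of a word would need to be projected onto its sub-word units before building `dep_graph`; the exact projection scheme here is left open, as the paper's details are not reproduced in this summary.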


Keywords: Neural machine translation, Low-resources, Prior knowledge incorporating

Article history: Received 27 July 2021, Revised 20 February 2022, Accepted 16 March 2022, Available online 28 March 2022, Version of Record 13 April 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.108615