Multiple premises entailment recognition based on attention and gate mechanism

Authors:

Highlights:

• A parallel model for Multiple Premises Entailment recognition is proposed (see the sketch after this list).

• A gate mechanism is proposed to obtain local matching features.

• Self-attention is used to match all local features and capture overall semantic dependence.

• A fusion gate is used to integrate local features into a final feature.

• A fine-tuning method is proposed to alleviate the over-fitting problem.
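
The highlights name three components: a gate for local premise-hypothesis matching, self-attention across the local features, and a fusion gate that produces the final feature. The following is a minimal, hypothetical PyTorch sketch of how such a pipeline could be wired together; the module names, dimensions, and mean-pooling step are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): gated local matching,
# self-attention over local features, and a fusion gate.
import torch
import torch.nn as nn


class GatedMatchFusion(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Gate controlling how much of the hypothesis representation is
        # mixed into each premise during local matching (assumed form).
        self.match_gate = nn.Linear(2 * dim, dim)
        # Single-head self-attention over the per-premise local features.
        self.self_attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        # Fusion gate blending local and attended features into one vector.
        self.fuse_gate = nn.Linear(2 * dim, dim)

    def forward(self, premises, hypothesis):
        # premises:   (batch, n_premises, dim) -- one encoded vector per premise
        # hypothesis: (batch, dim)             -- encoded hypothesis vector
        h = hypothesis.unsqueeze(1).expand_as(premises)
        # Local matching: gate each premise against the hypothesis.
        g = torch.sigmoid(self.match_gate(torch.cat([premises, h], dim=-1)))
        local = g * premises + (1.0 - g) * h
        # Self-attention lets every local feature attend to all the others,
        # capturing semantic dependence across premises.
        attended, _ = self.self_attn(local, local, local)
        # Fusion gate integrates local and attended features, then pools
        # over premises to obtain a single final feature.
        f = torch.sigmoid(self.fuse_gate(torch.cat([local, attended], dim=-1)))
        fused = f * local + (1.0 - f) * attended
        return fused.mean(dim=1)  # (batch, dim)


# Usage with random tensors: 2 examples, 4 premises, 300-d encodings.
model = GatedMatchFusion(dim=300)
out = model(torch.randn(2, 4, 300), torch.randn(2, 300))
print(out.shape)  # torch.Size([2, 300])
```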

Abstract:


Keywords: Natural language inference, Multiple premise entailment, Attention mechanism, Gate mechanism, Fine-tune

Article history: Received 20 March 2019, Revised 3 August 2019, Accepted 16 January 2020, Available online 16 January 2020, Version of Record 24 January 2020.

DOI: https://doi.org/10.1016/j.eswa.2020.113214