SILT: Efficient transformer training for inter-lingual inference

Authors:

Highlights:

• Efficient alignment of multilingual embeddings for Natural Language Inference.

• Siamese pretrained multilingual transformers with frozen weights and mutual attention.

• A curated Spanish version of the SICK dataset, called SICK-es, is provided.

• Drastic reduction of trainable parameters and ability to perform inter-lingual tasks.


Keywords: Natural language inference, Embeddings, Sentence alignment, Transformers, Deep learning

Article history: Received 18 May 2021; Revised 9 January 2022; Accepted 14 March 2022; Available online 28 March 2022; Version of Record 4 April 2022.

DOI: https://doi.org/10.1016/j.eswa.2022.116923