Combining Non-sampling and Self-attention for Sequential Recommendation

Authors:

Highlights:

• We propose a sequential recommendation model fusing non-sampling and self-attention.

• We introduce a non-sampling training loss to improve accuracy and training speed.

• Our model significantly improves recommendation performance and speeds up training.

Keywords: Non-sampling mechanism, Self-attention, Sequential recommendation, User preference modeling

Article history: Received 25 May 2021, Revised 2 October 2021, Accepted 4 November 2021, Available online 14 January 2022, Version of Record 14 January 2022.

DOI: https://doi.org/10.1016/j.ipm.2021.102814