A modified attention mechanism powered by Bayesian Network for user activity analysis and prediction

Authors:

Highlights:

Abstract

Analyzing and predicting user activity is important in the current digital era, with many use cases and applications. In this paper, we present an approach that modifies the attention mechanism of a Transformer model. This work makes it possible to improve the predictive capacity of a forecasting model that is progressively fed with small amounts of somewhat erratic data generated during the early stages of online activity. The key element of the work is the use of a Bayesian Network (BN) as a feature-engineering tool that helps modify the attention mechanism of the Transformer model in that scenario. The model predicts the next activity in a sequence of online activities that the user will engage in while interacting with a Learning Management System (LMS). Click-stream data refers to a detailed log of how participants navigate through an online platform during a working session. The main application of our work is to improve the Predictor module of a smart hybrid classifier for an LMS. Several configurations and architectures for the RNN-powered predictor are tested and assessed. The improved predictive capacity achieved by this work can be useful to users in an online learning environment where early assistance in quasi-real time is required. This research answers the question of how click-stream data can assist in refining the tasks of the attention mechanism to improve the quality of the prediction. Performance is measured by accuracy, right-content, and first-state accuracy scores for the incoming sequence and compared across alternative models. The method also provides a systematic customization of the attention mechanism in Transformers that can be applied to a range of problems involving click-stream data.
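The abstract does not specify how the BN output enters the attention computation, so the following is only a minimal sketch of the general idea: a scaled dot-product attention layer whose scores are biased by a BN-derived prior over the observed click-stream activities. The class name `BNBiasedAttention`, the `bn_prior` tensor, and the additive log-prior bias are illustrative assumptions, not the authors' published formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BNBiasedAttention(nn.Module):
    """Illustrative sketch: scaled dot-product attention whose scores are
    biased by a Bayesian-Network-derived prior over click-stream activities.
    The paper's exact mechanism may differ."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, bn_prior: torch.Tensor) -> torch.Tensor:
        # x:        (batch, seq_len, d_model) embedded click-stream activities
        # bn_prior: (batch, seq_len) probability the BN assigns to each observed
        #           activity; used here as an additive bias on the attention scores
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        # Add the log-prior so activities the BN deems more informative
        # receive proportionally more attention from every query position.
        scores = scores + torch.log(bn_prior.clamp_min(1e-6)).unsqueeze(1)
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)
```

Under this assumption, the BN acts purely as a feature-engineering step: it is fit on the early, sparse click-stream data and its per-activity probabilities are passed alongside the token embeddings, leaving the rest of the Transformer unchanged.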

Keywords: Deep learning, Bayesian Networks, Click-stream data, Hybrid methods

Article history: Received 19 September 2021, Revised 21 February 2022, Accepted 8 May 2022, Available online 18 May 2022, Version of Record 10 June 2022.

DOI: https://doi.org/10.1016/j.datak.2022.102034