Multi-task prediction method of business process based on BERT and Transfer Learning

Authors:

Highlights:

Abstract:

Predictive Business Process Monitoring (PBPM) is one of the essential tasks in Business Process Management (BPM). It aims to predict the future behavior of an ongoing case, such as its next activity or its outcome, from the completed cases of the process stored in an event log. Although various deep learning methods have been proposed for PBPM, none of them considers applying a single model to multiple prediction tasks simultaneously. This paper proposes a multi-task prediction method based on BERT and transfer learning. First, the method performs the Masked Activity Model (MAM), a self-supervised pre-training task, on a large number of unlabeled traces using BERT (Bidirectional Encoder Representations from Transformers). MAM captures the bidirectional semantic information of the input traces through BERT's bidirectional Transformer structure and learns long-term dependencies between activities through the Transformer's attention mechanism, yielding a universal representation model of the traces. Finally, two separate models are defined for the two prediction tasks, next activity and case outcome, and the pre-trained model is transferred to both prediction models for training using a fine-tuning strategy. Experimental evaluation on eleven real-world event logs shows that the performance of the prediction tasks is affected by the masking tactics and masking probabilities used in the pre-training task MAM. The method performs well on both the next-activity and the case-outcome prediction tasks, and it can be applied to several different prediction tasks faster and with better performance than direct training.
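To make the MAM pre-training task concrete, the following is a minimal sketch of its masking step, assuming BERT-style token masking applied to activity sequences; the function name, the `[MASK]` token, and the default masking probability are illustrative assumptions, not the paper's exact implementation.

```python
import random

MASK = "[MASK]"  # placeholder token standing in for a masked activity

def mask_activities(trace, mask_prob=0.15, rng=None):
    """Randomly replace activities in a trace with [MASK].

    Returns the masked trace plus the prediction targets
    (a mapping of position -> original activity) that the
    pre-training objective would learn to recover.
    """
    rng = rng or random.Random(0)
    masked, targets = [], {}
    for i, act in enumerate(trace):
        if rng.random() < mask_prob:
            targets[i] = act      # remember the ground-truth activity
            masked.append(MASK)   # hide it from the model
        else:
            masked.append(act)
    # Ensure at least one position is masked so the task is non-trivial.
    if not targets:
        i = rng.randrange(len(trace))
        targets[i] = trace[i]
        masked[i] = MASK
    return masked, targets

# Example trace from a hypothetical order-handling process.
trace = ["register", "check", "approve", "notify", "archive"]
masked, targets = mask_activities(trace, mask_prob=0.3,
                                  rng=random.Random(42))
```

In the full method, a bidirectional Transformer would read the masked trace and predict the hidden activities from both left and right context; the resulting encoder is then fine-tuned for the next-activity and outcome prediction tasks.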

Keywords: Predictive business process monitoring, Transfer Learning, Transformer, BERT, Masked Activity Model

Article history: Received 19 April 2022, Revised 20 July 2022, Accepted 3 August 2022, Available online 6 August 2022, Version of Record 22 August 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109603