Evaluation of interpretability methods for multivariate time series forecasting

Authors: Ozan Ozyegen, Igor Ilic, Mucahit Cevik

Abstract

Being able to interpret a model’s predictions is a crucial task in many machine learning applications. Specifically, local interpretability is important for determining why a model makes particular predictions. Despite the recent focus on interpretable Artificial Intelligence (AI), there have been few studies on local interpretability methods for time series forecasting; existing approaches mainly focus on time series classification tasks. In this study, we propose two novel evaluation metrics for time series forecasting: Area Over the Perturbation Curve for Regression (AOPCR) and Ablation Percentage Threshold (APT). These two metrics measure the local fidelity of local explanation methods. We extend the theoretical foundation of these metrics and collect experimental results on four popular datasets. Both metrics enable a comprehensive comparison of numerous local explanation methods and an intuitive approach to interpreting model predictions. Lastly, we provide heuristic reasoning for this analysis through an extensive numerical study.
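Both proposed metrics are perturbation-based: they ablate the inputs an explanation marks as most important and check how strongly the forecast reacts. The abstract does not give the exact formulas, so the following is only an illustrative sketch of an AOPCR-style fidelity score under assumed choices (ablating one feature at a time in decreasing saliency order, using a zero baseline, and averaging the absolute output change); the function name `aopcr_sketch` and the toy linear model are hypothetical, not the authors' implementation.

```python
import numpy as np

def aopcr_sketch(model, x, saliency, max_k=5, baseline=None):
    """Illustrative AOPCR-style fidelity score (not the paper's exact definition).

    Ablates the top-k most salient entries of the input window one at a
    time and averages the absolute change in the model's output. A larger
    score means the explanation identified inputs the model truly relies on.
    """
    if baseline is None:
        baseline = np.zeros_like(x)           # assumed baseline: zeros
    order = np.argsort(saliency.ravel())[::-1]  # most salient entries first
    y0 = model(x)
    deltas = []
    x_pert = x.copy()
    for k in range(max_k):
        idx = np.unravel_index(order[k], x.shape)
        x_pert[idx] = baseline[idx]           # ablate the k-th most salient entry
        deltas.append(abs(y0 - model(x_pert)))
    return float(np.mean(deltas))

# Toy forecaster: weighted sum over a (timesteps x features) input window.
w = np.array([[0.5, 0.1],
              [0.2, 0.0],
              [0.0, 0.9]])
model = lambda x: float((w * x).sum())
x = np.ones((3, 2))
# Here the weights themselves are a perfect saliency map for this model.
score = aopcr_sketch(model, x, saliency=w, max_k=3)
```

An APT-style score could be sketched similarly by ablating features in saliency order until the prediction changes by a chosen percentage threshold and reporting the fraction of features removed.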

Keywords: Interpretable AI, Time series forecasting, Multivariate, Regression, Local explanation

DOI: https://doi.org/10.1007/s10489-021-02662-2