Debiased learning and forecasting of first derivative

Abstract

In the era of big data, many data sets are recorded at equally spaced time points. To model the rate of change of such data, one often constructs a nonparametric regression model and estimates the first derivative of its mean function. In this direction, we propose a symmetric two-sided local constant regression for interior points, an asymmetric two-sided local polynomial regression for boundary points, and a one-sided local linear forecasting model for points outside the observation range. Under the framework of locally weighted least squares regression, we derive the asymptotic bias and variance of the proposed estimators and establish their asymptotic normality. Moreover, to reduce the estimation bias for highly oscillatory functions, we propose debiased estimators based on higher-order polynomials and derive their corresponding kernel functions. We also propose a data-driven two-step procedure for simultaneously selecting the model and the tuning parameters. Finally, the usefulness of the proposed estimators is demonstrated through simulation studies and two real-data examples.
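To make the estimation framework concrete, the following minimal Python sketch (not the authors' code) illustrates the generic locally weighted least squares idea behind the interior estimator: fit a low-degree polynomial in a kernel-weighted neighborhood of a target point x0 and read the first-derivative estimate off the slope coefficient. The Gaussian kernel, the bandwidth value, and the function name are illustrative assumptions; the paper derives specific kernel functions and a data-driven two-step selection of the polynomial degree and tuning parameters.

```python
import numpy as np

def local_poly_first_derivative(x, y, x0, bandwidth, degree=2):
    """Estimate m'(x0) by locally weighted least squares: fit a
    polynomial of the given degree in (x - x0) with kernel weights
    and return the coefficient of the linear term."""
    u = x - x0
    w = np.exp(-0.5 * (u / bandwidth) ** 2)          # Gaussian kernel weights (illustrative choice)
    X = np.vander(u, N=degree + 1, increasing=True)  # design columns: 1, u, u^2, ...
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[1]  # slope coefficient estimates the first derivative

# Usage on an equally spaced design, matching the paper's setting
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=x.size)
print(local_poly_first_derivative(x, y, x0=0.5, bandwidth=0.05))
print(2 * np.pi * np.cos(np.pi))  # true derivative at x0 = 0.5 is -2*pi
```

The same weighted least squares machinery covers the paper's boundary and forecasting cases by restricting the neighborhood to an asymmetric or one-sided window around x0; only the effective kernel weights change.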

Keywords: Boundary problem, Differenced estimation, Equally spaced design, Kernel learning, Nonparametric derivative estimation

Article history: Received 11 October 2020, Revised 2 November 2021, Accepted 9 November 2021, Available online 30 November 2021, Version of Record 13 December 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107781