A smoothed monotonic regression via L2 regularization

Authors: Oleg Sysoev, Oleg Burdakov

Abstract

Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves monotonicity and smoothness of the regression by introducing an L2 regularization term. To keep the computational complexity low while providing high predictive power, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is \(O(n^2)\). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications), or when there is a change point in the response, the proposed method has a higher predictive power than many existing methods.
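To make the penalized objective concrete, below is a minimal illustrative sketch (not the authors' algorithm, and without their probabilistic parameter selection, boundary correction, or \(O(n^2)\) solver): it minimizes the least-squares fit plus an L2 penalty on successive differences, with monotonicity enforced by parameterizing the fit through nonnegative increments. The function name `smoothed_monotonic_fit` and the fixed regularization weight `lam` are assumptions for illustration.

```python
# Illustrative sketch only: monotone fit with an L2 smoothness penalty,
# minimizing ||y - f||^2 + lam * sum_i (f_{i+1} - f_i)^2
# subject to f_1 <= f_2 <= ... <= f_n.
# This generic box-constrained solver does NOT reproduce the paper's
# O(n^2) method or its probabilistic choice of regularization parameters.
import numpy as np
from scipy.optimize import minimize

def smoothed_monotonic_fit(y, lam=1.0):
    n = len(y)

    # Parameterize f by its first value c and nonnegative increments d:
    # f_i = c + d_1 + ... + d_{i-1}, which enforces monotonicity exactly.
    def unpack(theta):
        c, d = theta[0], theta[1:]
        return c + np.concatenate(([0.0], np.cumsum(d)))

    def objective(theta):
        f = unpack(theta)
        return np.sum((y - f) ** 2) + lam * np.sum(np.diff(f) ** 2)

    theta0 = np.zeros(n)
    theta0[0] = y[0]
    # The intercept c is free; every increment d_i is constrained to be >= 0.
    bounds = [(None, None)] + [(0.0, None)] * (n - 1)
    res = minimize(objective, theta0, bounds=bounds, method="L-BFGS-B")
    return unpack(res.x)

# Example: noisy, locally non-monotone data from an increasing trend.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2.0 * x) + 0.2 * rng.standard_normal(50)
f_hat = smoothed_monotonic_fit(y, lam=5.0)  # f_hat is nondecreasing
```

With `lam = 0` this reduces to plain monotonic regression and recovers the piecewise-constant behavior criticized in the abstract; increasing `lam` trades fidelity for smoothness, which is the tension the paper's probabilistic parameter selection is designed to resolve.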

Keywords: Monotonic regression, Kernel smoothing, Penalized regression, Probabilistic learning

Paper URL: https://doi.org/10.1007/s10115-018-1201-2