Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions

Authors: Katy S. Azoury, M. K. Warmuth

Abstract

We consider on-line density estimation with a parameterized density from the exponential family. The on-line algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example the algorithm incurs a loss, which is the negative log-likelihood of the example with respect to the current parameter of the algorithm. An off-line algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the on-line algorithm over the total loss of the best off-line parameter. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use Bregman divergences to derive and analyze each algorithm; these divergences are relative entropies between two distributions from the exponential family. We also use our methods to prove relative loss bounds for linear regression.
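The following is a minimal sketch of the setting described in the abstract, not the paper's exact algorithm or bounds: it instantiates the exponential family as a Gaussian with known variance, lets the on-line learner predict with a running mean of the past examples (initialized at a hypothetical prior mean treated as one pseudo-example), charges the negative log-likelihood of each new example against the current parameter, and compares the total to the best off-line parameter chosen in hindsight. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def online_vs_offline_nll(examples, sigma2=1.0, prior_mean=0.0):
    """Compare on-line NLL loss (running-mean parameter) with the
    off-line NLL loss of the best single mean in hindsight."""
    def nll(x, mu):
        # Negative log-likelihood of x under N(mu, sigma2)
        return 0.5 * np.log(2 * np.pi * sigma2) + (x - mu) ** 2 / (2 * sigma2)

    online_loss = 0.0
    running_sum, count = prior_mean, 1  # prior mean acts as one pseudo-example
    for x in examples:
        mu_t = running_sum / count       # current parameter before seeing x
        online_loss += nll(x, mu_t)      # loss incurred on the new example
        running_sum += x                 # update the running average
        count += 1

    mu_star = np.mean(examples)          # best off-line parameter in hindsight
    offline_loss = sum(nll(x, mu_star) for x in examples)
    return online_loss, offline_loss, online_loss - offline_loss

# Example usage: the third returned value is the (additional) relative loss.
xs = np.random.default_rng(0).normal(loc=2.0, scale=1.0, size=100)
print(online_vs_offline_nll(xs))
```

The difference returned last is the quantity the paper bounds: the extra total loss of the on-line learner over the best fixed parameter, for an arbitrary sequence of examples.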

Keywords: relative loss bounds, worst-case loss bounds, on-line algorithms, exponential family of distributions, linear regression, Bregman divergences

Paper URL: https://doi.org/10.1023/A:1010896012157