New Error Bounds for Solomonoff Prediction

Authors:

Highlights:

Abstract

Solomonoff sequence prediction is a scheme for predicting the digits of binary strings without knowing the underlying probability distribution. We call a prediction scheme informed when it knows the true probability distribution of the sequence. Several new relations between universal Solomonoff sequence prediction, informed prediction, and general probabilistic prediction schemes are proved. Among other things, they show that the number of errors in Solomonoff prediction is finite for computable distributions whenever it is finite in the informed case. Deterministic variants are also studied. The most interesting result is that the deterministic variant of Solomonoff prediction is optimal compared to any other probabilistic or deterministic prediction scheme, up to additive square-root corrections only. This makes it well suited even for difficult prediction problems, where it is not enough for the number of errors to be minimal merely to within some factor greater than one. Solomonoff's original bound and the ones presented here complement each other in a useful way.

Keywords: induction, Solomonoff, Bayesian, deterministic prediction, algorithmic probability, Kolmogorov complexity

Article history: Received 28 January 2000, Revised 16 November 2000, Available online 25 May 2002.

Article link: https://doi.org/10.1006/jcss.2000.1743