Discrete sequence prediction and its applications

Authors: Philip Laird, Ronald Saul

Abstract

Learning from experience to predict sequences of discrete symbols is a fundamental problem in machine learning with many applications. We present a simple and practical algorithm (TDAG) for discrete sequence prediction. Based on a text-compression method, the TDAG algorithm limits the growth of storage by retaining the most likely prediction contexts and discarding (forgetting) less likely ones. The storage/speed tradeoffs are parameterized so that the algorithm can be used in a variety of applications. Our experiments verify its performance on data compression tasks and show how it applies to two problems: dynamically optimizing Prolog programs for good average-case behavior and maintaining a cache for a database on mass storage.
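To illustrate the style of method the abstract describes, here is a minimal, hypothetical sketch of a context-based sequence predictor. It is not the authors' TDAG algorithm; it only captures the general idea of keeping statistics for recent prediction contexts (up to a bounded depth) and ignoring contexts with too little evidence. The class name, parameters (`max_depth`, `min_count`), and pruning rule are all illustrative assumptions.

```python
from collections import defaultdict

class ContextTreePredictor:
    """Illustrative sketch (not the actual TDAG algorithm): count the
    symbols that follow each recent context up to max_depth, and predict
    using the longest context with enough supporting observations."""

    def __init__(self, max_depth=3, min_count=2):
        self.max_depth = max_depth   # longest context length retained (assumed bound)
        self.min_count = min_count   # contexts with fewer observations are ignored
        self.counts = defaultdict(lambda: defaultdict(int))
        self.history = []            # the most recent max_depth symbols

    def update(self, symbol):
        # Record the symbol under every suffix of the recent history,
        # from the empty context up to length max_depth.
        for k in range(self.max_depth + 1):
            if k <= len(self.history):
                ctx = tuple(self.history[len(self.history) - k:])
                self.counts[ctx][symbol] += 1
        self.history.append(symbol)
        if len(self.history) > self.max_depth:
            self.history.pop(0)

    def predict(self):
        # Back off from the longest context to shorter ones until one
        # has at least min_count observations.
        for k in range(min(self.max_depth, len(self.history)), -1, -1):
            ctx = tuple(self.history[len(self.history) - k:])
            dist = self.counts.get(ctx)
            if dist and sum(dist.values()) >= self.min_count:
                return max(dist, key=dist.get)
        return None

predictor = ContextTreePredictor(max_depth=2)
for ch in "abababab":
    predictor.update(ch)
print(predictor.predict())  # after the context '...ab', predicts 'a'
```

The bounded context depth and the evidence threshold stand in, very loosely, for the storage/speed tradeoff the abstract parameterizes: deeper contexts give sharper predictions but cost more storage, and "forgetting" unlikely contexts keeps growth in check.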

Keywords: sequence extrapolation, statistical learning, text compression, speedup learning, memory management

Paper link: https://doi.org/10.1007/BF01000408