An overhead reduction technique for mega-state compression schemes

Abstract

Many of the most effective compression methods involve complicated models. Unfortunately, as model complexity increases, so does the cost of storing the model itself. This paper examines a method to reduce the amount of storage needed to represent a Markov model with an extended alphabet, by applying a clustering scheme that brings together similar states. Experiments run on a variety of large natural language texts show that much of the overhead of storing the model can be saved at the cost of a very small loss of compression efficiency.
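The state-clustering idea described in the abstract can be illustrated with a minimal sketch. The Python code below is not the paper's algorithm: the distance measure (total variation), the merge threshold, the greedy single-pass clustering, and the input file path are all illustrative assumptions. It builds a character-level Markov model, then merges contexts whose next-symbol distributions are close, pooling their counts so that the merged model stores fewer states.

```python
# Illustrative sketch only (not the paper's method): shrink a character-level
# Markov model by merging states (contexts) whose next-symbol distributions
# are similar. Distance measure, threshold, and merge order are assumptions.
from collections import Counter, defaultdict

def build_model(text, order=2):
    """Count next-symbol frequencies for each length-`order` context."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def distance(c1, c2):
    """Total-variation distance between two next-symbol distributions."""
    n1, n2 = sum(c1.values()), sum(c2.values())
    symbols = set(c1) | set(c2)
    return 0.5 * sum(abs(c1[s] / n1 - c2[s] / n2) for s in symbols)

def cluster_states(model, threshold=0.3):
    """Greedily assign each context to the first cluster whose pooled
    distribution is within `threshold`, else start a new cluster."""
    clusters = []  # list of (pooled Counter, [member contexts])
    for ctx, counts in model.items():
        for rep, members in clusters:
            if distance(rep, counts) <= threshold:
                rep.update(counts)  # pool the statistics of merged states
                members.append(ctx)
                break
        else:
            clusters.append((Counter(counts), [ctx]))
    return clusters

if __name__ == "__main__":
    text = open("sample.txt", encoding="utf-8").read()  # placeholder corpus
    model = build_model(text)
    clusters = cluster_states(model)
    print(f"{len(model)} states reduced to {len(clusters)} clusters")
```

Pooling the counts of merged states is what trades a small loss of predictive accuracy, and hence of compression efficiency, for a large reduction in the number of states the model must store.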

Article history: Received 17 January 1997, Accepted 6 May 1997, Available online 11 June 1998.

Article link: https://doi.org/10.1016/S0306-4573(97)00034-4