Entropic measures, Markov information sources and complexity

Authors:

Abstract

The concept of entropy plays a central role in communication theory. Shannon entropy measures the uncertainty associated with an a priori probability distribution, while in algorithmic information theory the information content of a message is measured by the size, in bits, of the smallest program that computes that message. This paper discusses the classical entropy and entropy rate of discrete and continuous Markov sources, with finite or continuous alphabets, and their relations to program-size complexity and algorithmic probability. The emphasis is on ideas, constructions and results; no proofs are given.
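As a concrete illustration of the classical quantities surveyed here, the following Python sketch computes the Shannon entropy of a finite distribution and the entropy rate of a stationary finite-alphabet Markov source. It is not taken from the paper; the transition matrix P, the power-iteration routine and the two-state example are illustrative assumptions (the chain is assumed ergodic, so power iteration converges to the stationary distribution).

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i * log2(p_i), with 0 log 0 = 0 by convention
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def stationary_distribution(P, iters=1000):
    # Power iteration on a row-stochastic matrix; assumes an ergodic chain
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

def markov_entropy_rate(P):
    # Entropy rate H = sum_i pi_i * H(P[i, :]), in bits per symbol
    P = np.asarray(P, dtype=float)
    pi = stationary_distribution(P)
    return float(sum(pi[i] * shannon_entropy(P[i]) for i in range(len(pi))))

# Two-state example: stationary distribution (2/3, 1/3), rate ~ 0.55 bits/symbol
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(markov_entropy_rate(P))

For an i.i.d. (memoryless) source the entropy rate reduces to the Shannon entropy of the single-symbol distribution, which is the sense in which the Markov entropy rate generalizes the classical measure.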

Keywords: Shannon's entropy, Entropy rate, Program-size complexity, Algorithmic probability

Article history: Available online 5 August 2002.

Paper URL: https://doi.org/10.1016/S0096-3003(01)00199-0