Channel: Limiting Entropy of deterministic sequences - 1 - MathOverflow

Answer by Bjørn Kjos-Hanssen for Limiting Entropy of deterministic sequences - 1


Regarding (2), the answer seems to be no, by the reasoning at Entropy difference dominance of sequences. That is, the limiting entropy can be that of the geometric distribution, which is finite.

The boundary between finite and infinite entropy may be close to a distribution $p_1,\dots,p_m$ where $p_k$ is on the order of
$$\frac1{k(\log k)^a},\quad a\in\{2,3\}.$$
The entropy is then
$$\sum_k \frac{-1}{k(\log k)^a}\cdot\log\left(\frac{1}{k(\log k)^a}\right)=\sum_k \frac{\log k+a\log\log k}{k(\log k)^a}\approx \sum_k \frac{1}{k(\log k)^{a-1}},$$
which goes to infinity as $m\rightarrow\infty$ if $a=2$ but stays bounded if $a=3$: by the integral test, $\sum_k \frac{1}{k\log k}$ diverges while $\sum_k \frac{1}{k(\log k)^2}$ converges.
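The divergence/convergence contrast can be checked numerically. A minimal sketch, using unnormalized weights $p_k = 1/(k(\log k)^a)$ for $k \ge 3$ (the function name `entropy_partial_sum` is mine, not from the post; normalization does not affect whether the series diverges):

```python
import math

def entropy_partial_sum(a: float, m: int) -> float:
    """Partial sum of -p_k log p_k with p_k = 1/(k (log k)^a), 3 <= k <= m.

    The p_k are left unnormalized: we only probe whether the
    entropy-like series diverges (a = 2) or converges (a = 3).
    """
    total = 0.0
    for k in range(3, m + 1):
        p = 1.0 / (k * math.log(k) ** a)
        total += -p * math.log(p)
    return total

for a in (2, 3):
    sums = [entropy_partial_sum(a, m) for m in (10**2, 10**4, 10**6)]
    print(f"a={a}:", [round(s, 3) for s in sums])
```

For $a=2$ the partial sums keep growing (like $\log\log m$), while for $a=3$ they visibly level off, matching the integral-test argument above.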

