By Stéphane Laurent


2013-04-27 17:15:54 8 Comments

Let $X_n$ be a random variable distributed on $A_n:=\{1, \ldots, n\}$ and $g_n\colon A_n \to A_n$ such that $\Pr\big(X_n \neq g_n(X_n)\big) \to 0$. Putting $Y_n=g_n(X_n)$, Fano's inequality gives $$\frac{H(X_n\mid Y_n)}{\log n} \to 0,$$ which can be written $$\frac{H(X_n\mid Y_n)}{H(X_n)} \to 0 \qquad (\ast)$$ in the particular case when $X_n$ is uniformly distributed on $A_n$.
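(For reference, the Fano step can be spelled out: writing $p_n := \Pr\big(X_n \neq g_n(X_n)\big)$ and $h$ for the binary entropy function, Fano's inequality reads $$H(X_n \mid Y_n) \le h(p_n) + p_n \log(n-1) \le \log 2 + p_n \log n,$$ so dividing by $\log n$ and letting $p_n \to 0$ gives the displayed limit.)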

I'm looking for a case for which $H(X_n)\to \infty$ and $(\ast)$ fails.


@Anthony Quas 2013-04-27 23:46:35

Let $n=k+k^k$ and let $X$ take each element $1\le j\le k$ with probability $1/k-1/k^2$. Let the remaining $k^k$ elements have probability $1/k^{k+1}$. Let $g(i)=\min(k+1,i)$.

We now have $\mathbb P(X\ne Y)\le\mathbb P(X>k)=k^k/k^{k+1}=1/k$ (the point $k+1$ is a fixed point of $g$), which converges to 0 as required.

We have $$ \begin{split} H(X)&=-k(1/k-1/k^2)\log (1/k-1/k^2)-k^k/k^{k+1}\log (1/k^{k+1}) \cr &= (1-1/k)(\log k-\log(1-1/k))+(1/k)(k+1)\log k\cr &=2\log k+O(\log k/k), \end{split} $$ which converges to $\infty$ as required.

Finally, given $Y$, $X$ is known if $Y$ is in the range $\{1,\ldots,k\}$. But if $Y$ takes the value $k+1$ (which occurs with probability $1/k$), then $X$ takes one of $k^k$ values with equal probability. Hence $H(X|Y)=(1/k)\log k^k=\log k$.

In particular, we have $$\frac{H(X|Y)}{H(X)}\to\frac 12. $$
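As a sanity check, here is a short numerical computation of the two entropies in this construction (a sketch; the function name `entropies` is mine, and the closed-form expressions are taken from the derivation above to avoid forming $k^k$ explicitly):

```python
import math

def entropies(k):
    """Entropies H(X) and H(X|Y) for the construction with n = k + k^k."""
    # k "heavy" atoms, each with probability 1/k - 1/k^2 (total mass 1 - 1/k);
    # k^k "light" atoms, each with probability 1/k^(k+1) (total mass 1/k).
    p_heavy = 1.0 / k - 1.0 / k**2
    # H(X): heavy part plus light part; the light part is used in closed form,
    # -(1/k) * log(1/k^(k+1)) = ((k+1)/k) * log k.
    h_x = -k * p_heavy * math.log(p_heavy) + (k + 1) / k * math.log(k)
    # H(X|Y): X is determined by Y on {1,...,k}; with probability 1/k,
    # Y = k+1 and X is uniform over k^k values, contributing (1/k)*log(k^k).
    h_x_given_y = math.log(k)
    return h_x, h_x_given_y

for k in (10, 100, 1000):
    h_x, h_xy = entropies(k)
    # The ratio H(X|Y)/H(X) approaches 1/2 as k grows.
    print(f"k={k}: H(X|Y)/H(X) = {h_xy / h_x:.4f}")
```

Already at $k=1000$ the ratio agrees with $1/2$ to four decimal places.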

@Stéphane Laurent 2013-04-28 08:06:30

Nice - thank you very much. You should just switch $H(Y|X)$ to $H(X|Y)$ :)

@Anthony Quas 2013-04-28 19:47:59

I made the substitution...
