Problem

I'm reading a paper that refers to the limit as $n$ goes to infinity of the Rényi entropy. It defines it as $H_n(X) = \frac{1}{1-n} \log_2 \left( \sum_{i=1}^{N} p_i^n \right)$. It then says that the limit as $n \to \infty$ is $-\log_2(p_1)$. I saw another article that uses the maximum of the $p_i$ instead of $p_1$. I think that this works out fairly easily if all of the $p_i$ are equal (a uniform distribution). I have no idea how to prove this for anything other than a uniform distribution. Can anyone show me how it's done?
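For what it's worth, a quick numerical check (my own sketch in Python/NumPy, not from the paper) does seem to support the claim for a non-uniform distribution:

```python
import numpy as np

# A non-uniform example distribution; the largest probability is 0.5.
p = np.array([0.5, 0.25, 0.15, 0.1])

def renyi_entropy(p, n):
    """Rényi entropy of order n with base-2 logs, per the definition above."""
    return np.log2(np.sum(p ** n)) / (1 - n)

for n in [2, 5, 10, 50, 100]:
    print(f"H_{n}(X) = {renyi_entropy(p, n):.4f}")

print(f"-log2(max p_i) = {-np.log2(p.max()):.4f}")
```

The values do approach $-\log_2(0.5) = 1$ as $n$ grows, but I don't see how to prove it.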

Solution

Suppose $p_1 = \max_i p_i$. We have $$ p_1^n \leq \sum_{i=1}^N p_i^n \leq N p_1^n. $$ Taking $\log_2$ and dividing by $1-n$, which is negative for $n > 1$ (so the inequalities reverse), gives $$ \frac{n \log_2 p_1 + \log_2 N}{1-n} \leq H_n(X) \leq \frac{n \log_2 p_1}{1-n}. $$ As $n \rightarrow \infty$, $\log_2 N/(1-n) \rightarrow 0$ while $n/(1-n) \rightarrow -1$, so both bounds converge to $-\log_2 p_1$; by the squeeze theorem, $H_n(X) \rightarrow -\log_2 p_1 = -\log_2 \max_i p_i$.
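If you want to watch the squeeze happen numerically, here is a small sketch (Python/NumPy; the distribution is an arbitrary example, not anything from the question) that evaluates both bounds alongside $H_n(X)$:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.15, 0.1])  # arbitrary example distribution
p1, N = p.max(), len(p)               # p_1 = max_i p_i, N = support size

for n in [2, 10, 100, 1000]:
    H = np.log2(np.sum(p ** n)) / (1 - n)
    lower = (n * np.log2(p1) + np.log2(N)) / (1 - n)
    upper = n * np.log2(p1) / (1 - n)
    print(f"n={n:5d}  {lower:.4f} <= {H:.4f} <= {upper:.4f}")

print(f"limit -log2(p1) = {-np.log2(p1):.4f}")
```

Both bounds (and $H_n(X)$ between them) close in on $-\log_2(0.5) = 1$ as $n$ grows.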
