For discrete probability distributions $R$ over a finite set $I$, I would like to use the $\delta$-$\epsilon$ method to show that the function mapping $R$ to its entropy $H(R)$ is uniformly continuous, where $H(R) = -\sum_{x\in I}R(x)\log(R(x))$. Hence, I would like to show the following:
For every $\epsilon>0$ there is a $\delta=\delta(\epsilon)>0$ such that for all discrete probability distributions $R,R'$ over $I$: if $|R-R'|<\delta$, then $|H(R)-H(R')|<\epsilon$.
I found the following upper bound in a paper: $|H(R)-H(R')|\leq \log(n)\,|R-R'|$, where $n\geq |I|$. So if I can prove that, I am done! But I cannot figure out how to prove this upper bound; there was no further explanation, so I gather it must be some simple step I am not seeing.
I would appreciate any help on this!
(Not an answer, but too long for a comment)
If $I$ is finite and discrete, then $I = \left\{x_1,\ldots,x_n\right\}$, so a generic $R(x)$ can be written as
$$ R(x) = R(x ; \alpha_1,\ldots,\alpha_n) = \sum_{k=1}^{n} \alpha_k\delta(x-x_k) $$
where the $\alpha_k$ are nonnegative parameters that sum to $1$. So to prove uniform continuity, you need to prove it with respect to the $\alpha_k$. I would also observe that the entropy $H(R) = H(\alpha_1,\ldots,\alpha_n)$ is maximal when the distribution is uniform, so for the bound I would try bounding by $H\left(\frac{1}{n},\ldots,\frac{1}{n}\right) = \log n$.
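That last observation can be sanity-checked numerically (a sketch only; the maximality of the uniform distribution itself follows, e.g., from Jensen's inequality applied to $-t\log t$): entropy over random points of the simplex never exceeds $H(1/n,\ldots,1/n) = \log n$.

```python
import math
import random

def entropy(alphas):
    """H(alpha_1, ..., alpha_n) = -sum_k alpha_k log alpha_k, with 0*log 0 = 0."""
    return -sum(a * math.log(a) for a in alphas if a > 0)

n = 5
h_uniform = entropy([1.0 / n] * n)
print(h_uniform, math.log(n))  # both equal log n

random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    s = sum(w)
    alphas = [x / s for x in w]  # a random point on the probability simplex
    assert entropy(alphas) <= h_uniform + 1e-12  # never exceeds H(1/n, ..., 1/n)
```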