Consider a random variable $$X = \begin{cases}x_0:&\text{with probability } \delta \\ x_1:& \text{with probability } 1-\delta \end{cases}$$ The entropy of $X$ is $$H(X) = \delta \log \frac{1}{\delta} + (1-\delta) \log \frac{1}{1-\delta}$$
How can we show that for all $\epsilon>0$, we can always find a $\delta > 0$ such that $H(X) < \epsilon$?
I'm not able to find a way to do this cleanly. For example, I've tried setting $\delta = c \epsilon$ for various values of $c$ (such as $\delta = \epsilon$, $\delta = \frac{\epsilon}{1000}$, etc.), but I haven't been able to pin down a $\delta$ or $c$ that actually works.
Other approaches I tried included the log-sum inequality ($\sum_i a_i \log \frac{a_i}{b_i} \geq (\sum_i a_i) \log \frac{\sum_i a_i}{\sum_i b_i}$) and standard bounds on the logarithm such as $1-\frac{1}{x} \leq \ln x \leq x-1$, but the best I've been able to show is $H(X) \leq 1$, which isn't useful here.
Some thoughts:

If you want a good bound on the solution $\delta \in (0, 1/2)$ of the equation $\delta \ln \frac{1}{\delta} + (1-\delta) \ln \frac{1}{1-\delta} = \epsilon$ (working in natural logarithms; the argument is the same in any base), here is one approach.
Let $\epsilon \in (0, \ln 2)$.
It is easy to prove that, for all $\delta \in (0, 1)$, $$(1 - \delta)\ln\frac{1}{1 - \delta} \le \delta.$$ (Note: simply apply $\ln u \le u - 1$ with $u = \frac{1}{1-\delta}$, then multiply both sides by $1-\delta$.)
Thus, we have, for all $\delta \in (0, 1)$, $$\delta \ln \frac{1}{\delta} + (1-\delta) \ln \frac{1}{1-\delta} \le \delta \ln \frac{1}{\delta} + \delta.$$
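As a quick numerical sanity check of the lemma above (a sketch, assuming NumPy is available), one can verify $(1-\delta)\ln\frac{1}{1-\delta} \le \delta$ on a fine grid over $(0,1)$:

```python
import numpy as np

# Check (1 - d) * ln(1/(1 - d)) <= d on a fine grid over (0, 1).
# This is the lemma obtained from ln(u) <= u - 1 at u = 1/(1 - d).
d = np.linspace(1e-6, 1 - 1e-6, 10_000)
lhs = (1 - d) * np.log(1.0 / (1.0 - d))
print(np.all(lhs <= d))  # the bound holds at every grid point
```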
The equation $\delta \ln \frac{1}{\delta} + \delta = \epsilon$ has exactly one real solution on $(0, 1)$, say $$\delta_0 = \frac{\epsilon}{-W_{-1}(-\epsilon \mathrm{e}^{-1})}$$ where $W_{-1}(\cdot)$ is the Lambert W function.
We have, for all $\delta \in (0, \delta_0)$, $$\delta \ln \frac{1}{\delta} + (1-\delta) \ln \frac{1}{1-\delta} < \epsilon.$$
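For concreteness, the closed form for $\delta_0$ can be checked numerically. This sketch assumes SciPy's `lambertw` (branch $k=-1$, which is real on $(-1/\mathrm{e}, 0)$); it verifies that $\delta_0$ solves $\delta \ln \frac{1}{\delta} + \delta = \epsilon$ and that the entropy stays below $\epsilon$ for $\delta < \delta_0$:

```python
import numpy as np
from scipy.special import lambertw

def entropy_nats(d):
    """Binary entropy in nats: d ln(1/d) + (1 - d) ln(1/(1 - d))."""
    return -d * np.log(d) - (1 - d) * np.log(1 - d)

def delta_threshold(eps):
    """delta_0 = eps / (-W_{-1}(-eps / e)), the root of d ln(1/d) + d = eps on (0, 1)."""
    w = lambertw(-eps * np.exp(-1.0), k=-1).real  # W_{-1} branch is real for this argument
    return eps / (-w)

eps = 0.01
d0 = delta_threshold(eps)
print(d0)                                    # threshold below which H(delta) < eps
print(d0 * np.log(1.0 / d0) + d0 - eps)      # residual of the defining equation (~0)
print(entropy_nats(d0 / 2) < eps)            # entropy is below eps for delta < delta_0
```

Any choice of $\delta$ below the computed threshold then answers the original question for that $\epsilon$.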