I have two inequalities for the Shannon entropy $H(y)=-\sum y_{i}\log y_{i}$, where the $y_{i}$ are the $n$ coordinates of a point in the $(n-1)$-dimensional simplex with $\sum y_{i}=1$ (think of $y$ as a probability distribution on a finite outcome space).
$$ \mbox{(i) }H(y)<\frac{1}{n}\sum\left(\log{}y_{i}\right)-\log\frac{1}{n^{2}} $$
$$ \mbox{(ii) }H(y)>\log\frac{4}{n}-\sum\left[\left(\frac{3}{2}y_{i}+\frac{1}{2n}\right)\log\left(y_{i}+\frac{1}{n}\right)\right] $$
In the attached diagrams, the $y$ that satisfy these inequalities are shown in red and the remaining points of the simplex in green. The two sets are not the same (the second one is slightly larger), but they have the same shape and are very similar. Why? (I came across these while looking at some interesting properties of the Kullback-Leibler divergence, for example its violation of the triangle inequality.)
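For readers without the diagrams, here is a small numerical sketch (Python, with natural logarithms; the function and variable names are my own) that evaluates both inequalities on points sampled from the simplex. Note that at the uniform distribution $y_i=\frac{1}{n}$ both inequalities become equalities, so both boundaries pass through the center of the simplex.

```python
import numpy as np

def H(y):
    # Shannon entropy with natural log, using the convention 0*log(0) = 0
    y = np.asarray(y, dtype=float)
    nz = y > 0
    return -np.sum(y[nz] * np.log(y[nz]))

def rhs_i(y):
    # right-hand side of (i): (1/n) * sum(log y_i) - log(1/n^2)
    n = len(y)
    return np.mean(np.log(y)) - np.log(1 / n**2)

def rhs_ii(y):
    # right-hand side of (ii): log(4/n) - sum[(3/2 y_i + 1/(2n)) log(y_i + 1/n)]
    y = np.asarray(y, dtype=float)
    n = len(y)
    return np.log(4 / n) - np.sum((1.5 * y + 0.5 / n) * np.log(y + 1 / n))

rng = np.random.default_rng(0)
n = 3
pts = rng.dirichlet(np.ones(n), size=20000)  # uniform samples on the simplex
in_i = np.array([H(p) < rhs_i(p) for p in pts])    # points satisfying (i)
in_ii = np.array([H(p) > rhs_ii(p) for p in pts])  # points satisfying (ii)
print("fraction satisfying (i): ", in_i.mean())
print("fraction satisfying (ii):", in_ii.mean())
print("fraction where (i) and (ii) disagree:", (in_i != in_ii).mean())
```

The disagreement fraction makes the "slightly larger but very similar" observation quantitative.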


The differing directions of the inequality signs are just unintuitive, because you almost have $\frac{3}{2}H(y)$ on the right-hand side of (ii).
In fact, your (i) is equivalent to $$\log \frac{1}{n^2}< \sum \left(y_i+\frac{1}{n}\right) \log y_i,$$ while your (ii) follows from $$ 2 \log \frac{4}{n} < \sum \left(y_i+\frac{1}{n}\right) \log \left(y_i+\frac{1}{n}\right)$$ (double (ii) and use $\log(y_i+\frac{1}{n}) > \log y_i$ to bound $2H(y)=-2\sum y_i \log y_i$ from below by $-2\sum y_i \log(y_i+\frac{1}{n})$). So your inequalities are very similar. Unless you want to conclude something more interesting, I don't think there is more to it.
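Both rewritings are easy to check numerically. A minimal sketch (Python, natural logs) verifying on random interior points of the simplex that (i) is equivalent to its rearranged form, and that the displayed sufficient condition indeed implies (ii):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
for _ in range(5000):
    y = rng.dirichlet(np.ones(n))  # interior point of the simplex (a.s. all y_i > 0)
    H = -np.sum(y * np.log(y))

    # (i) and its rearrangement are the same inequality
    ineq_i = H < np.mean(np.log(y)) - np.log(1 / n**2)
    rearranged_i = np.log(1 / n**2) < np.sum((y + 1 / n) * np.log(y))
    assert ineq_i == rearranged_i

    # the sufficient condition implies (ii):
    # doubling (ii) and using log(y_i + 1/n) > log(y_i), i.e.
    # 2H > -2*sum(y*log(y + 1/n)), reduces (ii) to the condition below
    ineq_ii = H > np.log(4 / n) - np.sum((1.5 * y + 0.5 / n) * np.log(y + 1 / n))
    sufficient = 2 * np.log(4 / n) < np.sum((y + 1 / n) * np.log(y + 1 / n))
    if sufficient:
        assert ineq_ii
```

The loop raising no assertion error is consistent with the algebra above; it is of course only a spot check, not a proof.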