Let $P$ be a discrete distribution in which the $i$-th value has probability $p_i$. Define $H$ as the random variable that takes the value $-\log(p_i)$ with probability $p_i$.
By definition, $\mathrm{E}(H) = \mathrm{H}(P)$. Is any non-trivial upper bound on $\Pr(H \geq a\,\mathrm{E}(H)+b)$ known which is independent of the distribution $P$?
The following counterexample shows that no non-trivial lower bound on $\Pr(H \geq \mathrm{E}[H])$ is possible. If $X \sim \mathrm{Ber}(p)$ with $p > 0.5$, then $H_1 < H_0$, where $H_i$ denotes the value of $H$ when $X = i$. Since $\mathrm{E}[H]$ is a strict convex combination of $H_1$ and $H_0$, we have $H_1 < \mathrm{E}[H] < H_0$, and therefore $\Pr(H \geq \mathrm{E}[H]) = \Pr(X = 0) = 1 - p$. Since $p$ can be made arbitrarily close to $1$, this probability can be made arbitrarily close to $0$, so the only distribution-free lower bound is the trivial one.
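A quick numerical check of the Bernoulli example (a Python sketch; the helper name `entropy_tail` is mine):

```python
import math

def entropy_tail(p):
    """For X ~ Ber(p) with p > 1/2, return Pr(H >= E[H]), where H = -log p_X."""
    h1, h0 = -math.log(p), -math.log(1 - p)  # surprisals of X = 1 and X = 0
    eh = p * h1 + (1 - p) * h0               # E[H] = binary entropy of p
    assert h1 < eh < h0                      # strict convex combination
    return 1 - p                             # {H >= E[H]} is the event {X = 0}

for p in (0.9, 0.99, 0.999):
    print(p, entropy_tail(p))                # tail probability shrinks toward 0
```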
A second example shows that the upper bound you ask about, with $a = 1$ and $b = 0$, cannot be non-trivial either, so restrictions on $a$ and $b$ must be made. Fix $n$ and consider $X \in \{1,\ldots,n\}$ with $p_n > \frac1n$ and $p_i = \frac{1-p_n}{n-1}$ for $i \neq n$; the condition $p_n > \frac1n$ is exactly what guarantees $p_i < p_n$. Then $H_i > H_n$ for all $i \neq n$ by the monotonicity of $-\log$, and since $\mathrm{E}[H]$ lies strictly between $H_n$ and $H_i$, we get $\Pr(H \geq \mathrm{E}[H]) = \Pr(X \neq n) = 1 - p_n$. For each $n$ we can make $1 - p_n$ arbitrarily close to $\frac{n-1}{n}$, so sending $n$ to $\infty$ makes this probability arbitrarily close to $1$.
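The $n$-point construction can also be checked numerically (again a sketch; the name `tail_prob` and the offset `eps` are my own choices):

```python
import math

def tail_prob(n, eps=1e-6):
    """Pr(H >= E[H]) for p_n = 1/n + eps and p_i = (1 - p_n)/(n - 1), i != n."""
    pn = 1 / n + eps
    pi = (1 - pn) / (n - 1)
    hn, hi = -math.log(pn), -math.log(pi)  # pi < pn, hence hi > hn
    eh = pn * hn + (1 - pn) * hi           # E[H] lies strictly between hn and hi
    assert hn < eh < hi                    # so {H >= E[H]} = {X != n}
    return 1 - pn

for n in (2, 10, 1000):
    print(n, tail_prob(n))                 # approaches 1 as n grows
```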
Edit: For $a > 1$, however, we can get a non-trivial bound on $\Pr(H \geq a\,\mathrm{E}[H])$ by simply applying Markov's inequality, which is valid here since $H \geq 0$: $$\Pr(H \geq a\,\mathrm{E}[H]) \leq \frac{\mathrm{E}[H]}{a\,\mathrm{E}[H]} = \frac1a.$$