For i.i.d. $X_i$ taking values in $\{0, 1\}$ with mean $E[X_i]$, show that when $nE[X_i] > 1$: $$P\left(\frac{1}{n}\sum_{i=1}^n X_i > E[X_i]\right) \leq \frac{1}{4}$$
This inequality is from the proof of Lemma 4.1 in Vapnik's Statistical Learning Theory.
My first thought is to bound this with a normal approximation, but I need the bound for every $n$ such that $nE[X_i] > 1$, not just asymptotically. And Markov's inequality is of course no help here.
Your probability is $P\left(n^{-1/2} \sum_{i=1}^n (X_i - E[X_i])>0\right)$. If $0 < E[X_i] < 1$, then by the CLT this converges to $1/2$ as $n\to\infty$, and your condition $n E[X_i]>1$ is satisfied for all $n$ large enough.
So the claimed inequality cannot be true as stated: for $n$ large enough, the probability is close to $1/2 > 1/4$.
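A quick Monte Carlo check illustrates this (a sketch using NumPy; the choices $p = 1/2$, $n = 1000$, and the number of trials are arbitrary, and $nE[X_i] = 500 > 1$ so the stated condition holds):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 1000, 0.5, 100_000

# For each trial, draw n Bernoulli(p) samples and record the sample mean.
# rng.binomial(n, p) gives the sum of n Bernoulli(p) draws directly.
means = rng.binomial(n, p, size=trials) / n

# Empirical frequency of the event {sample mean > E[X_i]}.
freq = np.mean(means > p)
print(freq)  # close to 1/2, well above the claimed bound of 1/4
```

The empirical frequency sits near $1/2$ (slightly below it, since ties $\bar X_n = p$ are excluded on the lattice), contradicting the bound $1/4$.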