I'm working on a problem that uses an alternative approach to proving Bennett's inequality. In the common approach, one uses a Taylor expansion of the MGF to derive an expression that is then used to prove Bennett's inequality. However, that does not seem to be the route taken here.
Show that for any $s > 0$, and any random variable $X$ with $\mathbb{E}(X) = 0$, $\mathbb{E}(X^2) = \sigma^2, X \leq c$, $$ \mathbb{E}(e^{sX}) \leq e^{f(\sigma^2/c^2)}, $$ where $$ f(u) = \log \left( \frac{1}{1+u}e^{-csu} + \frac{u}{1+u}e^{cs} \right). $$
Can anyone see how to prove this first step? It looks like the result uses some kind of convexity argument, but I cannot figure out how this exact expression is derived.
I'm not sure if there is a typo in the question, but I'll explain my thoughts about it.
Some simple algebra makes the claim more transparent: substituting $u = \sigma^2/c^2$ into $f$ gives
$$ e^{f(\sigma^2/c^2)} = \frac{1}{1+\sigma^2/c^2} e^{-cs\,\sigma^2/c^2} + \frac{\sigma^2/c^2}{1+\sigma^2/c^2} e^{cs} = \frac{c^2}{\sigma^2 + c^2 } e^{-s \sigma^2/c} + \frac{\sigma^2}{\sigma^2 + c^2 } e^{sc}, $$
so the desired inequality can be restated as
$$ E(e^{sX}) \le \frac{c^2}{\sigma^2 + c^2 } e^{-s \sigma^2/c} + \frac{\sigma^2}{\sigma^2 + c^2 } e^{sc}. $$ We write
$$ E(e^{sX}) = E(e^{sX}1_{\{X < -\sigma^2/c \} })+E(e^{sX}1_{\{X \ge -\sigma^2/c \} }). $$
For the first term, since $s > 0$ we have $e^{sX} \le e^{-s\sigma^2/c}$ on the event $\{X < -\sigma^2/c\}$, so $$ E(e^{sX}1_{\{X < -\sigma^2/c \} }) \le e^{-s \sigma^2/c}P(X < -\sigma^2/c). $$ Using Cantelli's inequality (https://en.wikipedia.org/wiki/Cantelli%27s_inequality),
$$ P(X < -\sigma^2/c) = P(-X > \sigma^2/c) \le \frac{\sigma^2 }{\sigma^2 + \sigma^4/c^2} = \frac{c^2}{\sigma^2 + c^2 }. $$
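As a quick numeric sanity check of that simplification (the parameter values below are arbitrary, chosen just for illustration):

```python
# Check that sigma^2 / (sigma^2 + sigma^4/c^2) equals c^2 / (sigma^2 + c^2)
# at a few arbitrary parameter values (sigma2 denotes sigma^2).
for sigma2, c in [(1.0, 2.0), (0.25, 1.0), (3.0, 0.5)]:
    lhs = sigma2 / (sigma2 + sigma2**2 / c**2)
    rhs = c**2 / (sigma2 + c**2)
    assert abs(lhs - rhs) < 1e-12
print("simplification checks out")
```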
Hence the first term satisfies the inequality
$$ E(e^{sX}1_{\{X < -\sigma^2/c \} }) \le \frac{c^2}{\sigma^2 + c^2 } e^{-s \sigma^2/c}. $$ This is half of what we want. Using that $X \le c$ a.s., we get for the second term that
$$ E(e^{sX}1_{\{X \ge -\sigma^2/c \} }) \le e^{sc} P(X \ge -\sigma^2/c). $$ Hence the result would follow if $$P(X \ge -\sigma^2/c) \le \frac{\sigma^2}{\sigma^2 + c^2 }.$$
But Cantelli's inequality actually gives the reverse bound: $$P(X \ge -\sigma^2/c) = 1 - P(X < -\sigma^2/c) \ge 1 - \frac{c^2}{\sigma^2 + c^2 } = \frac{\sigma^2}{\sigma^2 + c^2 },$$ so this splitting argument cannot be completed as it stands.
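To see the failure concretely, consider the two-point distribution putting mass $c^2/(\sigma^2+c^2)$ at $-\sigma^2/c$ and mass $\sigma^2/(\sigma^2+c^2)$ at $c$: it has mean $0$, variance $\sigma^2$, and $X \le c$, and it attains equality in the target bound, yet the second-term estimate $e^{sc}P(X \ge -\sigma^2/c)$ equals $e^{sc}$ for it and already overshoots the whole bound. A numeric check (with arbitrary illustrative values $\sigma = 1$, $c = 2$, $s = 1/2$):

```python
import math

# Arbitrary illustrative parameter values.
sigma, c, s = 1.0, 2.0, 0.5

# Two-point distribution: mass p at -sigma^2/c, mass q at c.
p = c**2 / (sigma**2 + c**2)
q = sigma**2 / (sigma**2 + c**2)
x_lo = -sigma**2 / c

# It satisfies the hypotheses: mean 0, variance sigma^2, support <= c.
assert abs(p * x_lo + q * c) < 1e-12
assert abs(p * x_lo**2 + q * c**2 - sigma**2) < 1e-12

mgf = p * math.exp(s * x_lo) + q * math.exp(s * c)              # E(e^{sX})
target = p * math.exp(-s * sigma**2 / c) + q * math.exp(s * c)  # claimed bound
second_term_est = math.exp(s * c)  # e^{sc} P(X >= -sigma^2/c); here P(...) = 1

assert abs(mgf - target) < 1e-12   # the bound is attained with equality
assert second_term_est > target    # but the split estimate alone overshoots it
print(target, second_term_est)
```

So the bound itself is tight (this two-point law is the extremal case), which is exactly why any estimate that bounds the second term by $e^{sc}$ times a probability close to $1$ must lose.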