The full question is:
Show that if $$E(X) = 0$$ then, for $a > 0$, $$P(X \geq a)\leq \frac{Var[X]}{Var[X] + a^2}$$
Also show that there is an X for which the equality holds.
I was able to note that $$Var[X] = E[X^2]$$ (since $E(X) = 0$), but that's about as far as I've gotten. I know this is supposed to be a Markov or Chebyshev inequality question.
Let $V$ be the variance of $X$. For $t, u > 0$, using Markov's inequality, we have the following: $$ \Pr{\{X \geq t\}} = \Pr{\{X+u \geq t+u\}} \leq \Pr{\{(X+u)^2 \geq (t+u)^2\}} \leq \frac{E[(X+u)^2]}{(t+u)^2} = \frac{V + u^2}{(t+u)^2} $$
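For completeness, the final equality in the chain uses $E(X) = 0$, which makes the cross term vanish:

$$ E[(X+u)^2] = E[X^2] + 2u\,E[X] + u^2 = V + 2u \cdot 0 + u^2 = V + u^2. $$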
(Markov's inequality is used in the third comparison, going from the probability to the fraction; the second comparison holds because $t + u > 0$, so the event $\{X+u \geq t+u\}$ implies $\{(X+u)^2 \geq (t+u)^2\}$.)
Now, since the above holds for every $u > 0$, we would like to minimize the function $\frac{V + u^2}{(t+u)^2}$ with respect to $u$, treating $V$ and $t$ as constants. This is a simple exercise in differential calculus; the minimizer is $u = \frac{V}{t}$. Substituting that value gives: $$ \Pr{\{X \geq t\}} \leq \frac{V}{V + t^2} $$
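The calculus step can be written out explicitly, and the equality part of the question can be answered with a two-point distribution (a standard choice, though not the only one). Differentiating the bound with respect to $u$:

$$ \frac{d}{du}\,\frac{V+u^2}{(t+u)^2} = \frac{2u(t+u)^2 - 2(t+u)(V+u^2)}{(t+u)^4} = \frac{2(ut - V)}{(t+u)^3}, $$

which is negative for $u < V/t$ and positive for $u > V/t$, so $u = V/t$ is indeed the minimizer. For the equality claim, consider

$$ P\left(X = t\right) = \frac{V}{V+t^2}, \qquad P\left(X = -\frac{V}{t}\right) = \frac{t^2}{V+t^2}. $$

One checks directly that $E[X] = 0$, $Var[X] = E[X^2] = \frac{t^2 V + V^2}{V+t^2} = V$, and $\Pr\{X \geq t\} = P(X = t) = \frac{V}{V+t^2}$, so the bound is attained with equality.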