Markov inequality: More precise bound?


Given a random variable $X$, suppose we know that $P[X\geq A] = 1$. By Markov's inequality, we obtain $E[X]\geq A$; in other words, $E[X] = A + \lambda$ for some $\lambda\geq 0$. Is there any way to characterize $\lambda$ more precisely, e.g., if I know the variance of $X$? Or by applying some other bound, less conservative than Markov's?

Knowing the variance will not tell you anything useful here. For example, consider $X$ such that, for some $B > A$ and $0 < p < 1$, $X = A$ with probability $p$ and $X = B$ with probability $q = 1-p$. Then
$$\eqalign{E[X] &= pA + qB = A + q(B-A)\cr E[X^2] &= pA^2 + qB^2 = A^2 + q(B^2 - A^2)\cr \text{Var}(X) &= E[X^2] - E[X]^2 = (B-A)^2(q-q^2) = \frac{1-q}{q}\,(E[X]-A)^2.\cr}$$
For any desired values $v > 0$ of $\text{Var}(X)$ and $e > A$ of $E[X]$, you can solve for $B$ and $q$ to achieve those values:
$$\eqalign{B &= e + \frac{v}{e-A}\cr q &= \frac{1}{1 + v/(e-A)^2}.\cr}$$
Since every pair $(e, v)$ with $e > A$ and $v > 0$ is attainable this way, the variance places no constraint at all on $\lambda = E[X] - A$.
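A quick numerical sanity check of this construction (a sketch; the values of $A$, $e$, and $v$ are arbitrary choices, and the variable names follow the answer's notation):

```python
# Two-point distribution: X = A with probability p, X = B with probability q = 1 - p.
# Given a target mean e > A and target variance v > 0, the answer's formulas
# for B and q should reproduce exactly those moments.
A = 1.0
e = 3.0   # desired E[X] (must exceed A)
v = 0.5   # desired Var(X)

B = e + v / (e - A)
q = 1.0 / (1.0 + v / (e - A) ** 2)
p = 1.0 - q

mean = p * A + q * B                 # E[X]
second_moment = p * A**2 + q * B**2  # E[X^2]
var = second_moment - mean**2        # Var(X)

print(mean, var)  # recovers e = 3.0 and v = 0.5
```

Varying `e` and `v` independently confirms that any mean above $A$ is compatible with any positive variance.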