Lower Bounds on Markov's Inequality


Problem 20.3 If $R$ is a nonnegative random variable, then Markov’s Theorem gives an upper bound on $\Pr[R \geq x]$ for any real number $x > \text{Ex}[R]$. If $b$ is a lower bound on $R$, then Markov’s Theorem can also be applied to $R - b$ to obtain a possibly different bound on $\Pr[R \geq x]$.

a) Show that if $b > 0$, applying Markov’s Theorem to $R - b$ gives a smaller upper bound on $\Pr[R \geq x]$ than applying Markov’s Theorem directly to $R$.

b) What value of $b > 0$ in part (a) gives the best bound?

This question is from the textbook Mathematics for Computer Science (MIT's discrete mathematics course text). I would like some help with this problem and, if possible, pointers to resources for checking whether my solutions are right. I am working from the textbook, but I don't have a way to verify my solutions.


Answer 1

For part a:

$$\Pr(R \geq x) = \Pr(R - b \geq x - b) \leq \frac{\mathbb E[R] - b}{x - b} \leq \frac{\mathbb E[R]}{x}$$

where the last inequality follows by cross-multiplying: $x(\mathbb E[R] - b) \leq \mathbb E[R](x - b)$ reduces to $b\,\mathbb E[R] \leq b\,x$, which holds since $b > 0$ and $\mathbb E[R] < x$.
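As a sanity check, here is a small simulation under hypothetical assumptions (the distribution $R = 2 + \text{Exponential}(1)$, so $b = 2$ is a valid lower bound, and the threshold $x = 6$): the shifted bound is smaller than the plain Markov bound, and both dominate the empirical tail probability.

```python
import random

# Sanity check of part (a), assuming R ~ 2 + Exponential(1)
# (a hypothetical choice, so R >= b holds for any 0 < b <= 2).
random.seed(0)
samples = [2 + random.expovariate(1.0) for _ in range(100_000)]
ER = sum(samples) / len(samples)     # estimate of Ex[R] (about 3)
x = 6.0                              # threshold with x > Ex[R]
b = 2.0                              # lower bound on R

p = sum(s >= x for s in samples) / len(samples)  # empirical tail probability
markov_plain = ER / x                # Markov applied to R
markov_shifted = (ER - b) / (x - b)  # Markov applied to R - b

# The shifted bound is smaller, and both dominate the true probability.
assert p <= markov_shifted < markov_plain
```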

For part b:

The bound $\frac{\mathbb E[R] - b}{x - b}$ is decreasing in $b$, so it is tightest at the largest $b$ satisfying the constraint $R - b \geq 0$ with probability $1$. The best value of $b$ is therefore the greatest lower bound of $R$, also known as the essential infimum of $R$.

Answer 2

Since $R-b\ge 0$ by assumption, a simple application of Markov's inequality gives $$\Pr(R\ge x)=\Pr(R-b\ge x-b)\le \frac{\text{Ex}[R-b]}{x-b}=\frac{\text{Ex}[R]-b}{x-b}.$$ Now, observe that $$f(b):=\frac{\text{Ex}[R]-b}{x-b}\le \frac{\text{Ex}[R]}{x}$$ holds for $0\le b < \text{Ex}[R]$ and that $f(b)$ is monotonically decreasing on $0\le b < \text{Ex}[R]$ (you can sketch a graph to verify both claims).

Moreover, notice that the upper bound $\frac{\text{Ex}[R]-b}{x-b}$ on $\Pr(R\ge x)$ approaches $0$ as $b\uparrow \text{Ex}[R]$. So you can make this upper bound tighter and tighter by selecting $b$ closer to $\text{Ex}[R]$ from the left. Thus, the best (tightest) bound is obtained when $b$ is the largest lower bound on $R$. However, I'm not sure I would call it the best bound (although it is certainly the tightest), since it requires the further assumption that $R\ge b$ (so that $R-b$ remains nonnegative), and this assumption becomes stronger as we increase $b$.
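These claims about $f(b)$ are easy to check numerically; the sketch below uses the illustrative constants $\text{Ex}[R] = 3$ and $x = 6$ (any values with $0 < \text{Ex}[R] < x$ would do).

```python
# Numerical check of the claims about f(b) = (Ex[R] - b)/(x - b),
# using hypothetical constants ER = 3 and x = 6 with ER < x.
ER, x = 3.0, 6.0

def f(b):
    return (ER - b) / (x - b)

grid = [0.01 * k for k in range(1, 300)]               # points in (0, ER)
values = [f(b) for b in grid]

assert all(v < ER / x for v in values)                 # f(b) < Ex[R]/x on (0, ER)
assert all(u > v for u, v in zip(values, values[1:]))  # strictly decreasing
assert f(2.999) < 1e-3                                 # f(b) -> 0 as b -> ER
```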


Edit:

"I still don't get how did you reach the conclusion that $f(b)\le \text{Ex}[R] / x.$": Simple analysis using derivatives. Also, note that I wrote $f(b)\le \text{Ex}[R] / x$ holds on $[0,\text{Ex}[R])$. In fact, this inequality doesn't hold if $b\in(-\infty,0)\cup(x,\infty).$ I suppose you are familiar with calculus and how functions behave.

Note that $\text{Ex}[R]$ and $x$ are nonnegative constants with $\text{Ex}[R]<x$. So you can rewrite $f(b)$ as $$ f(b)=\frac{\text{Ex}[R]-b}{x-b}=\frac{x-b}{x-b}-\frac{x-\text{Ex}[R]}{x-b}=1-\frac{c}{x-b},$$ where $c=x-\text{Ex}[R]$ is a positive constant. Hence, $$\frac{df}{db}=\frac{-c}{(x-b)^2},$$ which is always negative since $c>0$. We have $b<\text{Ex}[R]<x$, so we don't have to worry about the function being undefined (i.e., $b=x$). As you mentioned, $(\text{Ex}[R]-b)/(x-b)$ and $\text{Ex}[R]/x$ are equal when $b=0$; since $\text{Ex}[R]/x$ is just a constant horizontal line and $df/db$ is negative, we clearly have $$\frac{\text{Ex}[R]-b}{x-b}<\frac{\text{Ex}[R]}{x}$$ on the interval $(0,\text{Ex}[R])$. The strict inequality also holds on $(0,x)$, but that is not needed here. So yes, if you don't want to include equality, just exclude $0$.
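One way to double-check the derivative computation is to compare a central finite difference against the closed form, again with the hypothetical constants $\text{Ex}[R] = 3$ and $x = 6$.

```python
# Compare a central finite difference of f(b) = (ER - b)/(x - b)
# against the closed form df/db = -c/(x - b)^2, where c = x - ER.
# ER = 3 and x = 6 are illustrative constants with ER < x.
ER, x = 3.0, 6.0
c = x - ER

def f(b):
    return (ER - b) / (x - b)

h = 1e-6
for b in [0.0, 1.0, 2.0, 2.9]:
    numeric = (f(b + h) - f(b - h)) / (2 * h)  # central difference
    analytic = -c / (x - b) ** 2
    assert analytic < 0                        # derivative is always negative
    assert abs(numeric - analytic) < 1e-6      # matches the closed form
```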