Chebyshev's inequality and quadratic function


I am trying to use Chebyshev's inequality to find sample sizes $n$ such that the event $X_n > k$ occurs with probability at least $p$. That is, I want to find $n \in \mathbb{N}$ such that $\mathbb{P}(X_n > k) \geq p$, where $p \in (0,1)$ and $k > 0$.

Subtracting $\mathbb{E}(X_n)$ on both sides of the inequality inside the probability, I get $\mathbb{P}(X_n > k) = \mathbb{P}(X_n - \mathbb{E}(X_n) > k - \mathbb{E}(X_n))$.

Also, $\mathbb{P}(X_n > k) = 1 - \mathbb{P}(X_n \leq k)$.

So, combining these and using Chebyshev's inequality, I end up with: $\mathbb{P}(X_n > k) = 1 - \mathbb{P}(X_n \leq k) \geq 1 - \mathbb{P}(|X_n - \mathbb{E}(X_n)| \geq \mathbb{E}(X_n) - k) \geq 1 - \displaystyle\frac{V(X_n)}{(\mathbb{E}(X_n) - k)^2}$.

Now, after carrying this out in some exercises, I end up with an inequality of the form $q(n) \geq 0$, where $q$ is a quadratic function of $n$ with positive leading coefficient and two positive roots.
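For concreteness, here is one way such a quadratic can arise; the distributional setup is my own assumption, not part of any particular exercise. If $X_n = Y_1 + \dots + Y_n$ with $Y_i$ i.i.d., $\mathbb{E}(Y_i) = \mu$ and $V(Y_i) = \sigma^2$, then $\mathbb{E}(X_n) = n\mu$ and $V(X_n) = n\sigma^2$, and requiring the Chebyshev lower bound $1 - V(X_n)/(\mathbb{E}(X_n) - k)^2$ to be at least $p$ gives

```latex
% Assumed setup: X_n = Y_1 + \dots + Y_n, E(Y_i) = \mu, V(Y_i) = \sigma^2.
1 - \frac{n\sigma^2}{(n\mu - k)^2} \ge p
\quad\Longleftrightarrow\quad
\underbrace{(1-p)\mu^2}_{a > 0}\, n^2
- \bigl(2(1-p)\mu k + \sigma^2\bigr)\, n
+ (1-p)k^2 \ge 0,
```

which is exactly a quadratic in $n$ with positive leading coefficient.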

Taken literally, that inequality says the original probability condition is already satisfied for small values of $n$ (even $n = 0$!), which does not make much sense.
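To make the puzzle concrete, here is a small Python sketch under assumed numbers of my own choosing (not from any exercise): $X_n$ a sum of $n$ i.i.d. variables with mean $1$ and variance $1$, $p = 0.9$, $k = 10$, so the condition $1 - n/(n-k)^2 \geq p$ rearranges to $q(n) = (1-p)(n-k)^2 - n \geq 0$. The roots of $q$ are both positive, and $q(n) \geq 0$ holds *outside* the interval between them, including tiny $n$:

```python
# Hypothetical example: X_n is a sum of n i.i.d. variables with
# mean mu = 1 and variance sigma2 = 1, so E(X_n) = n, V(X_n) = n.
# Chebyshev condition 1 - n/(n - k)^2 >= p rearranges to
#   q(n) = (1 - p)(n - k)^2 - n >= 0.
import math

p, k = 0.9, 10.0            # assumed target probability and threshold
a = 1 - p                   # leading coefficient (positive)
b = -(2 * (1 - p) * k + 1)  # linear coefficient
c = (1 - p) * k * k         # constant term

disc = b * b - 4 * a * c
r1 = (-b - math.sqrt(disc)) / (2 * a)  # smaller root
r2 = (-b + math.sqrt(disc)) / (2 * a)  # larger root
print(f"roots: {r1:.3f}, {r2:.3f}")    # both roots are positive

def q(n):
    return a * (n - k) ** 2 - n

# q(n) >= 0 outside [r1, r2] -- including n = 0, where the bound
# cannot possibly be meaningful.
print(q(0) >= 0, q((r1 + r2) / 2) >= 0, q(r2 + 1) >= 0)
# prints: True False True
```

Only the region $n \geq r_2$ corresponds to the sample sizes one would expect; the point of my question is what rules out the region below $r_1$.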

Therefore there must be some other equation or hypothesis to take into account before one can find the correct bound for $n$. What is the extra piece of information I am missing?

Thanks a lot.