Suppose that you take a random sample of size $n$ from a discrete distribution with mean $\mu$ and variance $\sigma^2$. Using Chebyshev's inequality, determine how large $n$ needs to be to ensure that the difference between the sample mean and $\mu$ is less than two standard deviations with probability exceeding $0.99$.
I assume I will need to use the weak law of large numbers and then Chebyshev's inequality, but I don't know how the two standard deviations fit in or exactly how to set it up. Any help or hints would be appreciated.
You may want to start off by formulating your problem. You want:
$P(|\bar{X} - \mu| < \epsilon) > 0.99$, where $\bar{X}$ is the sample mean. Applying Chebyshev's inequality (i.e., Markov's inequality to $(\bar{X} - \mu)^2$):

$P(|\bar{X} - \mu| < \epsilon) = 1 - P(|\bar{X} - \mu| \geq \epsilon) \geq 1 - \frac{E[(\bar{X} - \mu)^2]}{\epsilon^2} = 1 - \frac{\sigma^2/n}{\epsilon^2}$

The "two standard deviations" in the problem tells you what $\epsilon$ is: set $\epsilon = 2\sigma$, and require $1 - \frac{\sigma^2/n}{(2\sigma)^2} > 0.99$.
Can you plug in the values and solve now?
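Once you've solved it by hand, here's a quick way to sanity-check the algebra (just a sketch; the helper name `chebyshev_bound` is mine). Note that with $\epsilon = 2\sigma$ the $\sigma$ cancels, so the bound is $1 - \frac{1}{4n}$, independent of the distribution's variance. Using exact fractions avoids any floating-point fuzziness near $0.99$:

```python
from fractions import Fraction

def chebyshev_bound(n):
    """Chebyshev lower bound on P(|X_bar - mu| < 2*sigma) for sample size n.

    With epsilon = 2*sigma, the bound 1 - (sigma^2/n)/epsilon^2
    simplifies to 1 - 1/(4n); sigma cancels entirely.
    """
    return 1 - Fraction(1, 4 * n)

print(chebyshev_bound(25))  # compare this value against 0.99 yourself
print(chebyshev_bound(26))
```

Whether the smallest acceptable $n$ is the one where the bound equals $0.99$ or the first one where it strictly exceeds it depends on how you read "exceeding" in the problem statement.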