Chebyshev Inequality - How is the following inferred ??


In chapter 3, Norm and Distance of Introduction to Applied Linear Algebra by Boyd, an example explaining the Chebyshev inequality for standard deviation is given as:

Consider a time series of return on investment, with a mean return of 8%, and a risk (standard deviation) 3%.

The author states -

By the Chebyshev inequality, the fraction of periods with a loss (i.e., $x_i \le 0$) is no more than $(\frac{3}{8})^2 = 14.1\%$. (In fact, the fraction of periods when the return is either a loss, $x_i \le 0$, or very good, $x_i \ge 16\%$, is together no more than 14.1%.)

Currently, I know that the Chebyshev inequality states: if $k$ is the number of entries of $x$ that satisfy $|x_i - \operatorname{avg}(x)| \ge a$, then $\frac{k}{n} \le \left(\frac{\operatorname{std}(x)}{a}\right)^2$.

But I am not able to correlate this to the example. Can someone please explain, how $(\frac{3}{8})^2$ is inferred?

1 Answer
The way to see this is to remember that he wants the fraction of periods that incurred a loss, i.e. $x_i \le 0$.

Since the mean return is 8%, a return $x_i \le 0$ means the deviation from the mean is at least 8, i.e. $|x_i - 8| \ge 8$, so take $a = 8$ in the inequality. Chebyshev then bounds the fraction of such periods by $\left(\frac{\operatorname{std}(x)}{a}\right)^2 = \left(\frac{3}{8}\right)^2 \approx 14.1\%$. Note that the condition $|x_i - 8| \ge 8$ also captures the other tail, $x_i \ge 16$, which is why the same bound covers periods with either a loss, $x_i \le 0$, or a very good return, $x_i \ge 16$.
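The bound can be checked numerically. The sketch below (a toy example, not from the book) uses the mean 8% and standard deviation 3% from the question, with a hypothetical Gaussian return series just to illustrate that the empirical two-sided tail fraction stays under $(3/8)^2$:

```python
import random

# Numbers from Boyd's example: mean return 8%, risk (std) 3%.
mean, std = 8.0, 3.0
a = mean                      # a loss (x_i <= 0) means |x_i - mean| >= 8
bound = (std / a) ** 2        # (3/8)^2 = 0.140625, i.e. about 14.1%

# Hypothetical return series (assumption: Gaussian, for illustration only).
random.seed(0)
x = [random.gauss(mean, std) for _ in range(100_000)]

# Fraction of periods with a loss OR a very good return (both tails).
frac = sum(abs(xi - mean) >= a for xi in x) / len(x)
assert frac <= bound
print(f"bound = {bound:.4f}, empirical tail fraction = {frac:.4f}")
```

The bound is loose here: for a Gaussian series the actual two-sided tail fraction is far below 14.1%, which is expected since Chebyshev makes no distributional assumptions.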