Chebyshev's Inequality and Euler's constant question clarification



Here's a question whose answer I don't understand:

[image: the question and its posted solution]

In particular:

1) I don't understand why they choose $\epsilon$ to be between 0 and 1. Is there a specific reason I'm missing?

2) Why does the second inequality hold? I understand why the first one holds, but the second doesn't seem obviously true.

3) How did they deduce the whole final expression? How did they plug in $C_n$?

Any help would be appreciated!

There is 1 best solution below

On BEST ANSWER
  1. Well, the thing that they're showing goes to zero is decreasing in $\epsilon,$ so there's no reason not to cap $\epsilon$ at one if expressions will cease to be as simple for $\epsilon > 1.$ (For instance, would the second inequality even make sense for $\epsilon >1$?)
  2. Because $H_n:=\sum_{k=1}^n (1/k)$ exceeds $\log(n)$ by only a bounded amount — in fact $H_n-\log(n)\to \gamma$ as $n\to \infty$ — and changing $\epsilon$ to $\epsilon/2$ also works in your favor. (To my mind, this is somewhat easier than the first inequality, which needs to contend with the fact that $\log(n)$ is asymptotically smaller than $H_n.$)
  3. The inequalities combine to give $$|E(C_n)-\log(n)|\le \epsilon\log(n)-(\epsilon/2)H_n $$ so if we have $|C_n-\log(n)|\ge \epsilon \log(n)$ then $$ \epsilon\log(n)\le |C_n-\log(n)| \le |C_n-E(C_n)|+|E(C_n)-\log(n)|\le |C_n-E(C_n)|+\epsilon\log(n)-(\epsilon/2)H_n,$$ so we have $ (\epsilon/2)H_n \le |C_n-E(C_n)|.$
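For a concrete feel of the facts used in point 2, here is a small numerical sketch (the choices of $\epsilon$ and $n$ are illustrative, not from the original problem): the gap $H_n - \log(n)$ stays bounded and tends to Euler's constant $\gamma \approx 0.5772$, while the right-hand side $\epsilon\log(n) - (\epsilon/2)H_n$ grows roughly like $(\epsilon/2)\log(n)$, so the bound eventually holds comfortably.

```python
import math

def harmonic(n):
    """H_n = sum_{k=1}^n 1/k."""
    return sum(1.0 / k for k in range(1, n + 1))

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

# The gap H_n - log(n) is bounded and converges to gamma.
for n in [10, 100, 10_000]:
    print(n, harmonic(n) - math.log(n))

# For a fixed eps in (0, 1), eps*log(n) - (eps/2)*H_n grows like
# (eps/2)*log(n), so it eventually dominates the bounded gap.
eps = 0.5  # arbitrary illustrative choice
for n in [100, 10_000]:
    gap = harmonic(n) - math.log(n)                     # -> gamma
    rhs = eps * math.log(n) - (eps / 2) * harmonic(n)   # grows without bound
    print(n, gap <= rhs)
```

Running this shows the gap hovering near $0.577$ while the comparison is `True` already at $n = 100$ for $\epsilon = 0.5$.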