I'm trying to solve the following exercise but I'm not sure if what I'm doing is right.
"Let $X$ be an r.v. distributed as $\chi_{40}^{2}$. Use Tchebichev’s inequality in order to find a lower bound for the probability $P(|(X/40) − 1| ≤ 0.5)$, and compare this bound with the exact value found from the $\chi^{2}$ Distribution Table."
Considering that $\mu=40$ and $\sigma=\sqrt{2\times40}$, my approach was to turn the inequality into:
$P(-20\leq X-40\leq 20)\geq 1-\frac{1}{k^{2}}$
In order to obtain:
$P(|X-40| ≤ 20)\geq 1-\frac{1}{k^{2}}$
$P(|X-40| \leq 20)\geq 1-\frac{1}{2.236^{2}}=0.8$, where $k = 20/\sigma = 20/\sqrt{80} = \sqrt5 \approx 2.236$.

But this result doesn't match the exact value from the Distribution Table.
First recall Markov's inequality: if $X$ is a non-negative random variable and $a>0$, then $$ \mathbb P(X\geqslant a)\leqslant \frac{\mathbb E[X]}{a}. $$
This follows from observing that $$ \mathbb E[X] = \int_0^\infty x\ \mathsf d F(x) \geqslant \int_a^\infty x\ \mathsf dF(x) \geqslant a\int_a^\infty\ \mathsf dF(x) = a\cdot\mathbb P(X\geqslant a) $$
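As a quick sanity check of Markov's inequality, here is a minimal Monte Carlo sketch in Python (the $\mathrm{Exponential}(1)$ variable is an arbitrary non-negative example, not part of the exercise; `numpy` is assumed to be available):

```python
import numpy as np

# Monte Carlo sanity check of Markov's inequality: P(X >= a) <= E[X]/a
# for a non-negative random variable. X ~ Exponential(1) is an arbitrary
# illustrative choice, so E[X] = 1 and P(X >= a) = exp(-a) exactly.
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=1_000_000)

for a in [1.0, 2.0, 5.0]:
    empirical = np.mean(samples >= a)   # estimate of P(X >= a)
    markov = samples.mean() / a         # Markov upper bound E[X]/a
    print(f"a = {a}: P(X >= a) ~ {empirical:.4f} <= {markov:.4f}")
```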
Chebyshev's inequality states that if $X$ has finite mean and variance $\sigma^2$, then for any $a>0$, $$ \mathbb P\left(|X-\mathbb E[X]|\geqslant a\sigma\right)\leqslant \frac1{a^2}. $$
This follows immediately from Markov's inequality:
\begin{align} \mathbb P\left(|X-\mathbb E[X]|\geqslant a\sigma \right) &= \mathbb P\left(|X-\mathbb E[X]|^2\geqslant a^2\sigma^2 \right)\\ &\leqslant \frac{\mathbb E[|X-\mathbb E[X]|^2]}{a^2\sigma^2}\\ &= \frac{\sigma^2}{a^2\sigma^2} = \frac1{a^2}. \end{align}
Since $X$ is an absolutely continuous random variable, we have $$ 1 = \mathbb P(|X-E[X]|\geqslant a\sigma) + \mathbb P(|X-E[X]|\leqslant a\sigma), $$ so that $$ \mathbb P(|X-E[X]|\leqslant a\sigma)\geqslant 1-\frac1{a^2}. $$
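Chebyshev's inequality is just as easy to check by simulation; below is a minimal Python sketch (the standard normal is again an arbitrary illustrative choice, not part of the exercise):

```python
import numpy as np

# Monte Carlo check of P(|X - E[X]| <= a*sigma) >= 1 - 1/a^2,
# here with X standard normal (an arbitrary illustrative choice).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

for a in [1.5, 2.0, 3.0]:
    empirical = np.mean(np.abs(x - x.mean()) <= a * x.std())
    print(f"a = {a}: {empirical:.4f} >= {1 - 1 / a**2:.4f}")
```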
In this example, it is clear that $$ \{|(X/40)-1|\leqslant 1/2 \}= \{|X-40|\leqslant 20\}. $$ Since $X\sim\chi_{40}^2$, we have $\mathbb E[X] = 40$ and $\sigma^2 = 2\mathbb E[X] = 80$, so that $\sigma = \sqrt{80} = 4\sqrt 5$. Now, $20 = \sqrt 5\cdot\sigma$, so taking $a=\sqrt 5$ we conclude that $$ \mathbb P(|X/40-1|\leqslant 1/2) \geqslant 1 - \frac 15 = \frac45. $$
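As a quick arithmetic check of the numbers above (a minimal Python sketch):

```python
import math

# Chebyshev bound for X ~ chi-squared with 40 degrees of freedom.
mu = 40
sigma = math.sqrt(2 * mu)   # Var(chi^2_k) = 2k, so sigma = sqrt(80)

a = 20 / sigma              # 20 = a * sigma gives a = sqrt(5)
bound = 1 - 1 / a**2        # Chebyshev lower bound

print(a**2)   # 5.000...
print(bound)  # 0.8
```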
Indeed this bound is valid: computing the actual probability from the density of $X$, we find that \begin{align} \mathbb P(|X-40|\leqslant 20) &= \int_{20}^{60} f_X(x)\ \mathsf dx\\ &= \int_{20}^{60} \frac{x^{19} e^{-x/2}}{2^{20}\, 19!}\ \mathsf dx\\ &= \frac{325946782122931}{14849255421}e^{-10}-\frac{529037857402226791}{2263261}e^{-30}\\ &\approx 0.974672. \end{align}
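If `scipy` is available, the same number can be obtained without evaluating the integral by hand:

```python
from scipy.stats import chi2

# Exact probability P(20 <= X <= 60) for X ~ chi^2_40, to compare
# with the Chebyshev lower bound of 0.8.
exact = chi2.cdf(60, df=40) - chi2.cdf(20, df=40)
print(exact)  # ~0.974672
```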
It is important to note that such simply derived tail bounds, while easy to apply, need not be "tight." For example, if $X_1,\ldots,X_n$ are independent Bernoulli random variables with success probabilities $p_1,\ldots,p_n$ and $X = \sum_{i=1}^n X_i$, with $\mu:=\sum_{i=1}^n p_i$, then Markov's inequality yields $$ \mathbb P(X>n/2) \leqslant \frac{\mu}{n/2} = \frac 2n\mu. $$ Clearly this bound is only useful for $0<\mu<\frac n2$.

Now, if $\varphi:[0,\infty)\to\mathbb R$ is a monotone increasing function with $\varphi(a)>0$ and $X$ is only assumed to be integrable (not necessarily non-negative), then applying Markov's inequality to $\varphi(|X|)$ and $\varphi(a)$, we find that $$ \mathbb P(|X|\geqslant a) = \mathbb P(\varphi(|X|)\geqslant \varphi(a))\leqslant\frac{\mathbb E[\varphi(|X|)]}{\varphi(a)}.\tag1 $$ Take $\varphi$ to be the map $x\mapsto e^{\theta x}$ with $\theta>0$; for our non-negative $X$, $\mathbb E[\varphi(X)] = \mathbb E[e^{\theta X}]$ is the moment-generating function of $X$. Then $(1)$ implies that $$ \mathbb P(X\geqslant a)\leqslant e^{-\theta a} \mathbb E\left[\prod_{i=1}^n e^{\theta X_i}\right], $$ and hence $$ \mathbb P(X\geqslant a)\leqslant\min_{\theta >0} e^{-\theta a}\mathbb E\left[\prod_{i=1}^n e^{\theta X_i}\right]. $$

For each $i$ we have $$ \mathbb E[e^{\theta X_i}] = 1-p_i+p_ie^{\theta} = 1+p_i(e^\theta-1)\leqslant e^{p_i(e^\theta-1)}, $$ using $1+x\leqslant e^x$; this is finite for all real $\theta$. Since the moment-generating function of a (finite) sum of independent random variables is simply the product of the individual moment-generating functions (a good exercise to prove), we have \begin{align} \mathbb E[e^{\theta X}] &= \prod_{i=1}^n \mathbb E[e^{\theta X_i}]\\ &= \prod_{i=1}^n (1-p_i+p_ie^{\theta})\\ &\leqslant \prod_{i=1}^n e^{p_i(e^\theta-1)}\\ &= e^{(e^\theta-1)\mu}. \end{align} As this holds for all $\theta>0$, it follows that $$ \mathbb P(X\geqslant a) \leqslant \min_{\theta>0} e^{-\theta a} e^{(e^\theta-1)\mu}. $$

Fix $\delta>0$ and set $a=(1+\delta)\mu$; then $$ \mathbb P(X\geqslant (1+\delta)\mu) \leqslant\min_{\theta>0} e^{-\theta(1+\delta)\mu} e^{(e^\theta-1)\mu}.\tag2 $$ The motivation for minimizing over $\theta$ is to make the bound as tight as possible (this particular method is known as the Chernoff bound). Indeed, differentiating the right-hand side of $(2)$ gives $$ \frac{\mathsf d}{\mathsf d\theta} \left[e^{-\theta(1+\delta)\mu} e^{(e^\theta-1)\mu}\right] = e^{\left(e^{\theta }-1\right) \mu -(1+\delta) \theta \mu } \left(e^{\theta } \mu -(1+\delta) \mu \right), $$ which is equal to zero precisely when $\theta = \log(1+\delta)$. Plugging in this value of $\theta$, we have $$ \mathbb P(X\geqslant (1+\delta)\mu) \leqslant\left(\frac{e^\delta}{(1+\delta)^{1+\delta}} \right)^\mu.\tag3 $$

Taking the logarithm of the right-hand side of $(3)$ yields $$ \mu(\delta-(1+\delta)\log(1+\delta)).\tag4 $$ Now, comparing the series expansions $$ \log(1+x) = \sum_{n=1}^\infty \frac{(-1)^{n+1}x^n}n $$ and $$ \frac x{1+x/2} = \sum_{n=1}^\infty (-1)^{n+1}2^{-(n-1)}x^n, $$ we see that for $x>0$, $$ \log(1+x)\geqslant \frac x{1+x/2}.\tag 5 $$ Applying the inequality from $(5)$ to $(4)$, we obtain $$ \mu(\delta-(1+\delta)\log(1+\delta))\leqslant -\frac{\delta^2}{2+\delta}\mu. $$ Putting this all together, we find the following bound for the upper tail: $$ \mathbb P(X\geqslant (1+\delta)\mu) \leqslant e^{-\frac{\delta^2}{2+\delta}\mu}. $$ An analogous argument leads to the following bound for the lower tail: $$ \mathbb P(X\leqslant (1-\delta)\mu) \leqslant e^{-\mu\delta^2/2}. $$
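These bounds are straightforward to check numerically. Here is a minimal Python sketch for the special case of equal $p_i$ (so $X$ is Binomial); the parameters are arbitrary illustrative choices, and `scipy` is assumed to be available:

```python
from math import exp
from scipy.stats import binom

# Chernoff bounds vs. exact Binomial tails. n, p, delta are arbitrary
# illustrative values (all p_i equal, so mu = n*p).
n, p, delta = 100, 0.3, 0.5
mu = n * p

# Upper tail: P(X >= (1+delta)mu) = P(X > (1+delta)mu - 1) for integer X.
upper_exact = binom.sf((1 + delta) * mu - 1, n, p)
upper_bound = exp(-delta**2 * mu / (2 + delta))

# Lower tail: P(X <= (1-delta)mu).
lower_exact = binom.cdf((1 - delta) * mu, n, p)
lower_bound = exp(-mu * delta**2 / 2)

print(f"upper tail: exact {upper_exact:.5f} <= bound {upper_bound:.5f}")
print(f"lower tail: exact {lower_exact:.5f} <= bound {lower_bound:.5f}")
```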
It is considerably more difficult to derive such bounds for the $\chi^2$ distribution, but the result is that for $0<\varepsilon<1$, \begin{align} \mathbb P(X\geqslant (1+\varepsilon)\mu )&\leqslant e^{-\mu\varepsilon^2(1-\varepsilon)/4},\\ \mathbb P(X\leqslant (1-\varepsilon)\mu )&\leqslant e^{-\mu\varepsilon^2(1-\varepsilon)/4}. \end{align} Here $\mu = 40$ and $\varepsilon=\frac12$, so each tail is at most $e^{-\frac54}$ and $$ \mathbb P(|X-40|\leqslant 20) \geqslant 1 - 2e^{-\frac54} \approx 0.426990. $$ For these particular values this is actually looser than the Chebyshev bound of $\frac45$; the exponential bound only overtakes Chebyshev's once the number of degrees of freedom is large enough.
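A short Python sketch comparing the three values for this exercise (assuming `scipy` is available):

```python
from math import exp
from scipy.stats import chi2

# X ~ chi^2_40, event |X - 40| <= 20, i.e. eps = 1/2.
mu, eps = 40, 0.5

exact = chi2.cdf(60, df=mu) - chi2.cdf(20, df=mu)
chebyshev = 1 - 2 * mu / (eps * mu) ** 2                 # 1 - sigma^2/(eps*mu)^2
exponential = 1 - 2 * exp(-mu * eps**2 * (1 - eps) / 4)  # two-sided tail bound

print(f"exact       {exact:.6f}")        # ~0.974672
print(f"chebyshev   {chebyshev:.6f}")    # 0.8
print(f"exponential {exponential:.6f}")  # ~0.426990
```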