Chebyshev's inequality: Let $X$ be a random variable with finite expected value $\mu$ and finite non-zero variance $\sigma^{2}$. Then for any real number $\delta > 0$,
$$ \Pr[|X - \mu| \geq \delta\sigma] \leq \frac {1}{\delta^{2}}$$
There is a tight example on Wikipedia: let $X_c$ be the random variable defined by $$ \left\{ \begin{aligned} &\Pr[X_c = -1] = \frac{1}{2c^{2}} \\ &\Pr[X_c = 0] = 1 - \frac{1}{c^{2}} \\ &\Pr[X_c = 1] = \frac{1}{2c^{2}} \\ \end{aligned} \right. $$ so that $\mu = 0$ and $\sigma = 1/c$. If $\delta = c$, then we have $$ \Pr[|X_c - \mu| \geq \delta\sigma] = \Pr[|X_c| \geq 1] = \frac {1}{\delta^{2}},$$ but for $\delta > c$ the bound is not tight. Is there an example that remains tight for arbitrarily large $\delta$?
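The equality case above can be verified with exact rational arithmetic; a small sketch (the distribution $X_c$ is from the question, and the concrete value $c = 3$ is an arbitrary choice for illustration):

```python
from fractions import Fraction

def tail_vs_bound(c):
    # Distribution of X_c from the Wikipedia example:
    # P(X = -1) = P(X = 1) = 1/(2c^2),  P(X = 0) = 1 - 1/c^2
    p = Fraction(1, 2 * c * c)
    pmf = {-1: p, 0: 1 - 2 * p, 1: p}

    mu = sum(x * q for x, q in pmf.items())               # mean, equals 0
    var = sum((x - mu) ** 2 * q for x, q in pmf.items())  # variance, equals 1/c^2

    # With delta = c, we have delta * sigma = c * (1/c) = 1,
    # so the tail event is exactly {|X - mu| >= 1}.
    tail = sum(q for x, q in pmf.items() if abs(x - mu) >= 1)
    bound = Fraction(1, c * c)                            # 1/delta^2 with delta = c
    return var, tail, bound

var, tail, bound = tail_vs_bound(3)
print(var, tail, bound)  # all three equal 1/9: the bound is attained
```

Using `Fraction` keeps the comparison exact, so the equality $\Pr[|X_c| \geq 1] = 1/\delta^2$ is checked without floating-point error.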
In addition, suppose $X_{1}, X_{2}, \ldots, X_{n}$ are i.i.d. random variables with finite expected value $\mu$ and finite non-zero variance $\sigma^{2}$. Since $\operatorname{Var}\left(\sum_{i=1}^{n} X_{i}\right) = n\sigma^{2}$, Chebyshev's inequality gives $$\Pr\left[\left|\sum_{i=1}^{n}X_{i} - n\mu\right| \geq \delta n\sigma\right] \leq \frac{1}{n\delta^{2}}$$ Is there also an (asymptotically) tight example for $\{ X_{i} \}_{i}$?
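For small $n$ the left-hand side can be computed exactly by convolving the pmf of $X_c$ with itself, which lets one compare the tail probability of the sum against the $1/(n\delta^{2})$ bound. A sketch, reusing the three-point distribution $X_c$ from the question (the parameter choices at the bottom are arbitrary illustrations):

```python
import numpy as np

def sum_tail_prob(c, n, delta):
    """Exact P(|X_1 + ... + X_n| >= delta * n * sigma) for n i.i.d. copies of X_c.

    X_c takes values -1, 0, 1 with P(+-1) = 1/(2c^2), so mu = 0 and sigma = 1/c.
    The pmf of the sum is obtained by repeated convolution.
    """
    base = np.zeros(3)
    base[0] = base[2] = 1 / (2 * c * c)   # values -1 and +1
    base[1] = 1 - 1 / (c * c)             # value 0
    pmf = np.array([1.0])                 # empty sum is 0 with probability 1
    for _ in range(n):
        pmf = np.convolve(pmf, base)
    support = np.arange(-n, n + 1)        # possible values of the sum
    threshold = delta * n * (1 / c)       # delta * n * sigma
    return pmf[np.abs(support) >= threshold].sum()

# Compare tail probability against 1/(n * delta^2), e.g. n = 4, c = 3, delta = 1:
n, c, delta = 4, 3, 1.0
print(sum_tail_prob(c, n, delta), 1 / (n * delta * delta))
```

For $n = 1$ and $\delta = c$ this reduces to the single-variable tight case above; for larger $n$ it shows how far below the bound the actual tail sits for this particular family.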
For your first question, consider a r.v. $X$ with finite first and second moments, variance $\sigma^2>0$, and w.l.o.g. $\mu=E(X)=0$. Assume that the Chebyshev inequality $$P(|X|\geq\varepsilon)\leq\frac{\sigma^2}{\varepsilon^2}$$ is tight for all $\varepsilon\geq\varepsilon_0$, and suppose for simplicity that $X$ has a PDF $f_X$. Then $$P(|X|\geq\varepsilon)=\int_\varepsilon^\infty \big(f_X(x)+f_X(-x)\big)\,dx=\sigma^2\varepsilon^{-2},$$ and differentiating both sides with respect to $\varepsilon$ implies $f_X(x)+f_X(-x)=2\sigma^2x^{-3}$ for all $x>\varepsilon_0$. But now we have run into a contradiction, since $$E(X^2)\geq\int_{\varepsilon_0}^\infty x^2\big(f_X(x)+f_X(-x)\big)\,dx=\int_{\varepsilon_0}^\infty 2\sigma^2x^{-1}\,dx=\infty,$$ contrary to our assumption that the second moment is finite, which is what allowed us to use the inequality in the first place.
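The two integrals in this argument can be checked symbolically; a sketch with sympy, where the concrete choices $\sigma = 1$ and $\varepsilon_0 = 1$ are arbitrary normalizations for illustration:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
eps = sp.symbols('epsilon', positive=True)
sigma, eps0 = 1, 1  # arbitrary normalizations for the check

# The forced tail density f(x) + f(-x) = 2*sigma^2 / x^3 does reproduce
# the tight tail probability sigma^2 / epsilon^2:
tail_prob = sp.integrate(2 * sigma**2 / x**3, (x, eps, sp.oo))

# ...but its contribution to the second moment beyond eps0 diverges:
second_moment_tail = sp.integrate(x**2 * 2 * sigma**2 / x**3, (x, eps0, sp.oo))
print(tail_prob, second_moment_tail)
```

The first integral comes out as $\sigma^2/\varepsilon^2$, confirming the tightness assumption is self-consistent for the tail probability alone, while the second evaluates to `oo`, which is exactly the contradiction with $E(X^2) < \infty$.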