Recall the Polynomial Markov Inequality:
Fix $k \in \mathbb{N}$. Suppose that $X$ is a random variable such that $\mathbb{E}[|X|^k] < \infty$. Then, for all $t > 0$, we have that (denoting $\mathbb{E}[X] = \mu$) \begin{equation*} P(|X-\mu|\geq t) \leq \frac{\mathbb{E}[|X-\mu|^k]}{t^k}. \end{equation*}
Now, the inequality can be shown to be tight when $k = 1$ and $k = 2$; consider the following random variables:
- $X$ such that $P(X = t) = \lambda$ and $P(X = 0) = 1- \lambda$ for the given $t$;
- $X$ such that $P(X = -t) = P(X = t) = \frac{1}{2}$ for the given $t$.
See the question "Markov's inequality tight in general?" for the arguments for why these work (or simply compute directly).
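A quick numerical sanity check of the two examples (a sketch; the particular values of $t$ and $\lambda$ are arbitrary choices, and note the first example is tight for the uncentered $k=1$ inequality, while the second is tight for the centered $k=2$ one):

```python
# Sanity check: both examples achieve equality in the corresponding bound.
t, lam = 2.0, 0.3  # arbitrary choices of t > 0 and 0 < lambda < 1

# Example 1, k = 1 (uncentered Markov): X = t w.p. lambda, X = 0 w.p. 1 - lambda.
# Markov: P(X >= t) <= E[X] / t.
E_X = lam * t          # E[X] = lambda * t
p1 = lam               # P(X >= t) = lambda
assert abs(p1 - E_X / t) < 1e-12  # equality holds, so the bound is tight

# Example 2, k = 2 (centered, i.e. Chebyshev): X = +-t w.p. 1/2 each, so mu = 0.
# Chebyshev: P(|X - mu| >= t) <= E[(X - mu)^2] / t^2.
E_X2 = 0.5 * t**2 + 0.5 * (-t)**2  # E[X^2] = t^2
p2 = 1.0                           # |X| = t always, so P(|X| >= t) = 1
assert abs(p2 - E_X2 / t**2) < 1e-12  # equality again
```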
I might be missing the pattern in these constructions, but how does one construct a random variable for which the Polynomial Markov inequality is tight for a given $k$?