If we look at the CDF of various well-known random variables near 0, they tend to be well approximated by a power law. Why?
Below are 26 examples. The notable exceptions are the log-normal, Fréchet, inverse gamma, and inverse normal; the rest appear to be well fit by a power law.
$$\left( \begin{array}{cc} \text{ChiSquared} & \sqrt{\frac{2}{\pi }} \sqrt{x} \\ \text{Marchenko-Pastur} & \frac{2 \sqrt{x}}{\pi } \\ \text{Beta(1/2,1)} & \sqrt{x} \\ \text{ArcSin} & \frac{2 \sqrt{x}}{\pi } \\ \text{Weibull(1/2,2)} & \frac{\sqrt{x}}{\sqrt{2}} \\ \text{Exponential} & x \\ \text{Chi} & \sqrt{\frac{2}{\pi }} x \\ \text{Student-T} & \frac{2 x}{\pi } \\ \text{Normal} & \sqrt{\frac{2}{\pi }} x \\ \text{Cauchy} & \frac{2 x}{\pi } \\ \text{Semicircle} & \frac{2 x}{\pi } \\ \text{Gumbel} & x \\ \text{F-ratio(2,2)} & x \\ \text{Gamma(1,2)} & \frac{x}{2} \\ \text{Extreme Value} & \frac{x}{e-1} \\ \text{Logistic} & \frac{x}{2} \\ \text{Uniform} & x \\ \text{Inverse Normal} & -\frac{e x^2}{16}-\frac{x^2}{16}+\sqrt{\frac{2}{\pi }} e^{-\frac{(x-2)^2}{8 x}} \sqrt{x} \\ \text{Erlang(2,2)} & 2 x^2 \\ \text{Triangle} & 2 x^2 \\ \text{Kumaraswamy(2,3)} & 3 x^2 \\ \text{Bates(3)} & \frac{9 x^3}{2} \\ \text{LogNormal} & -\frac{e^{-\frac{1}{2} \log ^2(x)}}{\sqrt{2 \pi } \log (x)} \\ \text{InverseGamma} & e^{-2/x} \\ \text{Frechet(2,1,0)} & e^{-\frac{1}{x^2}} \\ \text{Pareto(1,2)} & 0 \\ \end{array} \right)$$
This summarizes my comments in more detail:
Suppose $X$ has a continuous CDF and a PDF $f_X$ that is continuous over some interval $(0,\delta)$ (for some $\delta>0$). Then for any $c>0$ we have, by L'Hôpital's rule together with the fundamental theorem of calculus: $$ \lim_{x\rightarrow 0^+} \frac{P[X \in [0, x]]}{x^c} = \lim_{x\rightarrow 0^+}\frac{\int_0^x f_X(t)dt}{x^c} = \lim_{x\rightarrow 0^+} \frac{f_X(x)}{cx^{c-1}}$$ assuming the final limit exists.
For example:
Exponential: $f_X(x)=\lambda e^{-\lambda x} 1_{\{x\geq 0\}}$ $$ \lim_{x\rightarrow 0^+}\frac{P[X \in [0,x]]}{x} = \lim_{x\rightarrow 0^+} \frac{f_X(x)}{1} = \lambda$$
Erlang: $f_X(x) = x\lambda^2 e^{-\lambda x} 1_{\{x\geq 0\}}$ $$ \lim_{x\rightarrow 0^+}\frac{P[X \in [0,x]]}{x^2} = \lim_{x\rightarrow 0^+} \frac{f_X(x)}{2x} = \frac{\lambda^2}{2}$$
Normal: $f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}e^{-x^2/(2\sigma^2)}$ $$ \lim_{x\rightarrow 0^+}\frac{P[X \in [0,x]]}{x} = \lim_{x\rightarrow 0^+} \frac{f_X(x)}{1} = \frac{1}{\sqrt{2\pi\sigma^2}}$$
Chi squared (1 degree of freedom): $f_X(x) = \frac{1}{\sqrt{2\pi x}}e^{-x/2}1_{\{x\geq 0\}}$ $$ \lim_{x\rightarrow 0^+}\frac{P[X \in [0,x]]}{x^{1/2}} = \lim_{x\rightarrow 0^+} \frac{f_X(x)}{(1/2)x^{-1/2}} = \sqrt{2/\pi}$$
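The four limits above can be checked numerically using the closed-form CDFs (a quick sketch; the parameter choices $\lambda=1$ for the exponential, $\lambda=2$ for the Erlang, $\sigma=1$ for the normal are mine, picked to match the worked examples and the table):

```python
import math

# Evaluate CDF(x)/x^c at a small x to illustrate the L'Hopital limits above.
x = 1e-6

# Exponential(1): P[X in [0,x]] / x  ->  lambda = 1
expon_ratio = (1 - math.exp(-x)) / x

# Erlang(shape 2, rate 2): P[X in [0,x]] / x^2  ->  lambda^2/2 = 2
lam = 2.0
erlang_ratio = (1 - math.exp(-lam * x) * (1 + lam * x)) / x**2

# Normal(0,1): P[X in [0,x]] / x  ->  1/sqrt(2 pi)
normal_ratio = 0.5 * math.erf(x / math.sqrt(2)) / x

# Chi-squared(1): P[X in [0,x]] / sqrt(x)  ->  sqrt(2/pi)
chi2_ratio = math.erf(math.sqrt(x / 2)) / math.sqrt(x)

print(expon_ratio)   # ~ 1
print(erlang_ratio)  # ~ 2
print(normal_ratio)  # ~ 0.3989, i.e. 1/sqrt(2 pi)
print(chi2_ratio)    # ~ 0.7979, i.e. sqrt(2/pi)
```

Note the Erlang(2,2) ratio tends to $2$, matching the $2x^2$ entry in the table.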
The unusual cases are when $f_X(x)\rightarrow 0$ very rapidly as $x\rightarrow 0^+$, faster than a power law $x^c$ for any $c>0$ (such as the $e^{-\beta/x}$ behavior of an inverse gamma with parameter $\beta>0$).
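To see this numerically, work in log space (a sketch with $\beta=2$, matching the table's $e^{-2/x}$, and an arbitrary power $c=10$ of my choosing): $\log\left[e^{-\beta/x}/x^c\right] = -\beta/x - c\log x$ diverges to $-\infty$ as $x\rightarrow 0^+$, so the ratio itself tends to 0 for every $c>0$.

```python
import math

# Inverse-gamma CDF behaves like exp(-beta/x) near 0 (beta = 2 here).
# The log of the ratio exp(-beta/x) / x^c is -beta/x - c*log(x); even a
# large power c loses to the essential singularity as x -> 0+.
beta, c = 2.0, 10.0
for x in (0.1, 0.01, 0.001):
    log_ratio = -beta / x - c * math.log(x)
    print(x, log_ratio)  # heads to -infinity as x shrinks
```

The first value can even be positive (at $x=0.1$ the power still dominates), but the $-\beta/x$ term wins for every smaller $x$, which is why no power law fits these CDFs at the origin.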
Note that your chart seems to be giving functions $g(x)$ such that $$ \lim_{x\rightarrow 0^+} \frac{P[|X|\leq x]}{g(x)}=1$$
When $X$ is a nonnegative random variable then $$P[|X|\leq x]=P[X \in [0,x]] \quad \forall x>0$$
When $X$ has a PDF that is symmetric about 0 then $$P[|X|\leq x]=2P[X\in [0,x]] \quad \forall x>0$$
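For instance, for the standard normal this factor of 2 is why the table reads $\sqrt{2/\pi}\,x$ while the one-sided limit found above is $1/\sqrt{2\pi}$ (a quick check):

```python
import math

# Standard normal is symmetric about 0, so P[|X| <= x] = 2 P[X in [0,x]]
# = erf(x/sqrt(2)).  Near 0 this is ~ sqrt(2/pi) * x, twice the one-sided
# slope 1/sqrt(2 pi).
x = 1e-4
two_sided = math.erf(x / math.sqrt(2))        # P[|X| <= x]
one_sided = 0.5 * math.erf(x / math.sqrt(2))  # P[X in [0, x]]
print(two_sided / x)          # ~ 0.7979, i.e. sqrt(2/pi)
print(two_sided / one_sided)  # exactly 2
```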