Let $x$ be a random variable such that $E(x)=0$, $E(x^2)=1$, and $P(x^2\geq s^2)\geq\displaystyle\frac{C}{s^t}$ for all $s\geq 1$, where $C>0$ and $t>0$. Let $m<n$ be very large natural numbers, and let $L\geq 1$. Consider $1-(1-\frac{n}{2}P(x^2\geq Ln))^m$.
Assume (*) $\frac{n}{2}P(x^2\geq Ln)\leq \frac{2c_0}{m}$, where $0 <c_0 <0.6$.
Using the inequality $(1-y)^m\leq 1-\displaystyle\frac{my}{2}$, valid for $0\leq my\leq 2c_0$ (which (*) guarantees for $y=\frac{n}{2}P(x^2\geq Ln)$), we get
$$1-(1-\frac{n}{2}P(x^2\geq Ln))^m\geq\displaystyle\frac{nm}{4}P(x^2\geq Ln),$$
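The elementary inequality behind this step can be sanity-checked numerically. The sketch below (not part of the argument; the values of $m$ and $c_0$ are arbitrary test choices) verifies $(1-y)^m\leq 1-\frac{my}{2}$ under the constraint $my\leq 2c_0$ that (*) provides:

```python
# Sanity check (sketch): (1 - y)^m <= 1 - m*y/2 whenever 0 <= m*y <= 2*c0
# with c0 < 0.6, which is exactly the bound (*) gives for
# y = (n/2) * P(x^2 >= L*n).  The m values below are arbitrary.
c0 = 0.59
checks = []
for m in [10, 100, 10_000]:
    for frac in [0.01, 0.5, 1.0]:
        y = 2 * c0 * frac / m          # ensures m*y = 2*c0*frac <= 2*c0
        checks.append((1 - y) ** m <= 1 - m * y / 2)
ineq_ok = all(checks)
```

This works because $(1-y)^m\leq e^{-my}$ and $e^{-u}\leq 1-u/2$ for $0\leq u\leq 2c_0<1.2$.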
Using the assumption $P(x^2\geq s^2)\geq\displaystyle\frac{C}{s^t}$ with $s^2=Ln$, we get $$1-(1-\frac{n}{2}P(x^2\geq Ln))^m\geq \displaystyle\frac{nm}{4}P(x^2\geq Ln)\geq \frac{mnC'}{(Ln)^{\frac{t}{2}}},$$ where $C'=C/4$.
How can one show that if $t\geq 4$, then (*) holds?
Let us first recall Markov's inequality: let $Z$ denote a nonnegative integrable random variable. Then, for every positive $z$, $$ \mathrm P(Z\geqslant z)\leqslant z^{-1}\mathrm E(Z). $$ Application: consider $Z=X^2$ for a square integrable random variable $X$ such that $\mathrm E(X^2)=1$, and $z=n$ for any positive integer $n$. Then $\mathrm P(X^2\geqslant n)\leqslant n^{-1}$.
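As a quick illustration (not needed for the proof), one can check the Markov bound $\mathrm P(X^2\geqslant n)\leqslant 1/n$ by Monte Carlo, here for a standard normal $X$, which indeed satisfies $\mathrm E(X^2)=1$:

```python
import random

# Monte Carlo sanity check (illustration only): for standard normal X,
# the empirical tail P(X^2 >= n) should sit below Markov's bound 1/n.
random.seed(0)
N = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
tails = {n: sum(x * x >= n for x in xs) / N for n in [2, 5, 10, 20]}
markov_ok = all(tails[n] <= 1.0 / n for n in tails)
```

For the normal distribution the true tails are far below $1/n$, so the empirical check passes with a wide margin despite sampling noise.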
Note that this proof uses neither the hypothesis that $\mathrm E(X)=0$ nor any lower bound on the tail of the distribution of $X^2$.
Now, what you seem to be asking for is a proof that $\mathrm P(X^2\geqslant n)=O(n^{-2})$ under the additional hypotheses that $\mathrm E(X)=0$ and that $\mathrm P(X^2\geqslant s)\geqslant Cs^{-t/2}$ for every $s\geqslant1$, for some $t\geqslant4$.
But there is no hope that such a result could hold, is there? Consider a symmetric random variable such that $\mathrm P(X^2\geqslant s)=\Theta(s^{-u})$ when $s\to\infty$, for some positive $u$. Then your hypothesis may hold as soon as $u\leqslant t/2$ (for the tail estimate) and $u\gt1$ (for the square integrability), while your conclusion asks that $u\geqslant2$. These simply do not fit: any $u$ in $(1,2)$, say $u=3/2$, satisfies the hypotheses and violates the conclusion.
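A concrete instance of this mismatch can be checked directly. Take (an assumed construction, purely for illustration) a symmetric $X$ with exact tail $\mathrm P(X^2\geqslant s)=s^{-u}$ for $s\geqslant1$ and $u=3/2$:

```python
# Counterexample sketch (assumed construction): symmetric X with
# P(X^2 >= s) = s**(-u) for s >= 1, u = 1.5 > 1 (square integrable).
# The tail hypothesis P(X^2 >= s) >= C * s**(-t/2) holds with C = 1, t = 4,
# yet n**2 * P(X^2 >= n) = n**(2 - u) grows without bound, so the
# desired conclusion P(X^2 >= n) = O(n**-2) fails.
u, t, C = 1.5, 4.0, 1.0
hypothesis_ok = all(s ** (-u) >= C * s ** (-t / 2) for s in [1, 10, 1e4, 1e8])
ratios = [n ** 2 * n ** (-u) for n in [10, 100, 1000, 10_000]]
conclusion_fails = ratios == sorted(ratios) and ratios[-1] > 10 * ratios[0]
```

The growing sequence `ratios` is exactly the obstruction: if $\mathrm P(X^2\geqslant n)$ were $O(n^{-2})$, the products $n^2\,\mathrm P(X^2\geqslant n)$ would stay bounded.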