Consider $g\in \mathbb{R}^n$ whose entries are i.i.d. normal random variables. How can I get an estimate of the upper bound $$ P(X_s \geq Y_{n-s}) \leq \, ? $$ where $X_s = \sum_{i=1}^s |g_i|$ and $Y_{n-s} = \sum_{i=s+1}^n |g_i|$? Note that $X_s$ and $Y_s$ are both sums of the absolute values of $s$ i.i.d. normal R.V.s, so they are independent copies of each other; in particular, $X_s$ and $Y_{n-s}$ are independent.
In other words, we want the probability that the sum of the absolute values of $s$ i.i.d. normal R.V.s exceeds that of $n-s$ independent ones. It would be interesting to see how the upper bound depends on $s$ and $n$.
One idea is to condition on $Y_{n-s}$: since $X_s$ and $Y_{n-s}$ are independent, $$ P(X_s \geq Y_{n-s}) = \int_0^\infty P(X_s > y) f_Y(y)\,\mathrm dy, $$ where $f_Y(y)$ is the pdf of $Y_{n-s}$.
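As a quick sanity check on the quantity being bounded, here is a Monte Carlo sketch (my own, assuming standard normal entries; the function name and parameters are not from the question):

```python
import random

def estimate_prob(s, n, trials=100_000, seed=0):
    """Monte Carlo estimate of P(X_s >= Y_{n-s}) for i.i.d. standard normal entries."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        g = [abs(rng.gauss(0.0, 1.0)) for _ in range(n)]
        x = sum(g[:s])   # X_s: first s absolute values
        y = sum(g[s:])   # Y_{n-s}: remaining n-s absolute values
        hits += (x >= y)
    return hits / trials

# By symmetry the probability is exactly 1/2 when n = 2s,
# and it drops off sharply as s/n decreases.
print(estimate_prob(5, 10))   # close to 0.5
print(estimate_prob(2, 20))   # very small
```

This confirms the qualitative picture: for $n = 2s$ the probability is $1/2$ by symmetry, and the interesting regime is $s \ll n$.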
Some helpful facts:
$P(X_s > t) \leq 2^s \exp(-t^2/(2s))$; see link.
$P(Y_s \leq st)$ can be calculated explicitly; see link. But it is hard for me to use it here, since we would need an upper bound on the density function of $Y_{n-s}$.
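Fact 1 can be checked numerically. The sketch below (my own; `tail_prob` and `tail_bound` are hypothetical names, and standard normal entries are assumed) compares an empirical tail probability against the claimed bound $2^s e^{-t^2/(2s)}$:

```python
import math
import random

def tail_prob(s, t, trials=100_000, seed=1):
    """Empirical estimate of P(X_s > t) for X_s a sum of s half-normals."""
    rng = random.Random(seed)
    hits = sum(
        sum(abs(rng.gauss(0.0, 1.0)) for _ in range(s)) > t
        for _ in range(trials)
    )
    return hits / trials

def tail_bound(s, t):
    """The bound 2^s * exp(-t^2 / (2s)) from fact 1."""
    return (2 ** s) * math.exp(-t * t / (2 * s))

s, t = 4, 6.0
print(tail_prob(s, t), "<=", tail_bound(s, t))
```

The bound is far from tight at moderate $t$ (the $2^s$ prefactor is wasteful there), but it captures the Gaussian-type decay in $t$.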
This seems to be a quite classical probability question so I guess someone must have studied it before. Any idea and reference will be appreciated.
Not an answer, but too long to fit in a comment.
For the case when $s=\epsilon n$ where $\epsilon$ is small, maybe we can try the following.
$$\begin{align} &\mathbb{P}\left(X_{\epsilon n} > Y_{(1-\epsilon)n}\right) \\ &= \mathbb{P}\left(\{X_{\epsilon n} > Y_{(1-\epsilon)n}\} \cap {\{ Y_{(1-\epsilon)n} \leq {\epsilon}^{0.1}n \}}\right) + \mathbb{P}\left(\{X_{\epsilon n} > Y_{(1-\epsilon)n}\}\cap{\{ Y_{(1-\epsilon)n} >{\epsilon}^{0.1}n \}}\right)\\ &\leq \mathbb{P}\left( Y_{(1-\epsilon)n} \leq {\epsilon}^{0.1}n \right) + \mathbb{P}\left(X_{\epsilon n} >{\epsilon}^{0.1}n\right) \end{align}$$
The second term decays like $\mathbb{P}(X_{\epsilon n} >{\epsilon}^{0.1}n) \leq e^{-Cn/{\epsilon}^{0.8}}$, by the formula from the question: the bound gives $2^{\epsilon n}\exp(-\epsilon^{-0.8}n/2)$, and the $2^{\epsilon n}$ prefactor is absorbed into the constant since $\epsilon n \ll n/\epsilon^{0.8}$.
For the first term, normalize by the expectation: write $\mathbb{E}[Y_{(1-\epsilon)n}] = L(1-\epsilon)n$ (for standard normals, $L = \mathbb{E}|g_1| = \sqrt{2/\pi}$). Then, provided $\epsilon^{0.1} < L(1-\epsilon)$, $$\begin{align} &\mathbb{P}( Y_{(1-\epsilon)n} \leq {\epsilon}^{0.1}n ) \\ &= \mathbb{P}\big( Y_{(1-\epsilon)n} - L(1-\epsilon)n \leq {\epsilon}^{0.1}n - L(1-\epsilon)n\big)\\ &\leq \mathbb{P}\big(|Y_{(1-\epsilon)n} - L(1-\epsilon)n| \geq (L(1-\epsilon) - {\epsilon}^{0.1})n\big) \end{align}$$ From here, we can probably get a general bound of the type $e^{-Cn}$ using the usual derivation for Chernoff bounds. But I think the decay rate should improve as $\epsilon \to 0$, so maybe we have to use the Gaussian distribution and the derivation from fact 2 in the question (which I don't know how to rewrite in a more explicit form).
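The union-bound split above can be checked empirically at finite $n$. The sketch below (my own; assumes standard normal entries, and the bound is asymptotic in $n$, so it is quite loose at small $n$) estimates the left-hand side and the two terms of the split at threshold $\epsilon^{0.1}n$:

```python
import random

def simulate_terms(n, eps, trials=50_000, seed=2):
    """Estimate P(X > Y) and the two union-bound terms for the split at eps^0.1 * n."""
    rng = random.Random(seed)
    s = int(eps * n)           # s = eps * n
    thr = (eps ** 0.1) * n     # the splitting threshold eps^0.1 * n
    lhs = t1 = t2 = 0
    for _ in range(trials):
        g = [abs(rng.gauss(0.0, 1.0)) for _ in range(n)]
        x, y = sum(g[:s]), sum(g[s:])
        lhs += (x > y)         # event {X_{eps n} > Y_{(1-eps)n}}
        t1 += (y <= thr)       # first term:  {Y <= eps^0.1 * n}
        t2 += (x > thr)        # second term: {X > eps^0.1 * n}
    return lhs / trials, t1 / trials, t2 / trials

lhs, t1, t2 = simulate_terms(n=50, eps=0.1)
print(lhs, "<=", t1 + t2)
```

Note the inequality holds pathwise: if $x > y$ and $y > \epsilon^{0.1}n$, then $x > \epsilon^{0.1}n$, so every sample counted on the left is counted in one of the two terms. At $n = 50$ the first term dominates and the bound is loose, consistent with the expectation that it only becomes sharp as $n \to \infty$.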