Problem:
Let $(\epsilon_i), (g_i)$ be sequences of independent Bernoulli$(\{1,-1\},0.5)$ and standard Gaussian$(0,1)$ random variables respectively (the two sequences are also independent of each other).
Show that $$\| \sum_{i=1}^n g_ia_i\|_{L_p} \ge \frac{\sqrt{2}}{\sqrt{\pi}} \| \sum_{i=1}^n \epsilon_i a_i\|_{L_p}$$ for any $p \ge 1$, any fixed real sequence $(a_i)$, and any fixed $n \in \mathbb N.$
Attempt: The hint is to start by showing that $\epsilon_i |g_i|$ is also a standard Gaussian random variable, which is reasonably doable. Furthermore, noticing that $\sqrt{2/\pi}= \|g_N\|_{L_1}$ for a standard Gaussian $g_N$ independent of everything else, we have $\| g_N \|_{L_1} \| \sum a_i\epsilon_i\|_{L_p} =\left[\left(\int |g_N|\right)^p\left(\int\left| \sum a_i \epsilon_i\right|^p\right) \right]^{1/p}$ and $$ \left[\left(\int |g_N|\right)^p\left(\int\left| \sum a_i \epsilon_i\right|^p\right) \right]^{1/p} \overbrace{\le}^{\text{Jensen}} \left[\left(\int |g_N|^p\right)\left(\int\left| \sum a_i \epsilon_i\right|^p\right) \right]^{1/p} \overbrace{=}^{\text{independence}} \left[\int \left(|g_N|\cdot\left| \sum a_i \epsilon_i\right|\right)^p \right]^{1/p},$$ but carrying on from here seems difficult, because we lose the independence at the last step.
Context:
The goal is to show that the Khinchine inequalities (top of page 14 in the source) hold for Bernoulli random variables (i.e. the $\psi_2$ character of the finite sum $\sum_1^n \epsilon_i a_i$ for an arbitrary real sequence $(a_i)$ and i.i.d. random variables $\epsilon_i$ following the Bernoulli law on $\{1,-1\}$ with parameter $p=1/2$). One known way to do this is via the Hoeffding inequality.
However, here we want to do it via the $\psi_2$ character of $\sum_1^n a_i g_i$ for $g_i$ i.i.d. standard Gaussian variables.
Let $(\epsilon_i), (g_i)$ be sequences of independent Bernoulli$(\{1,-1\},0.5)$ and Gaussian$(0,1)$ random variables (the two sequences are also independent of each other).
Q1. Show that $\epsilon_i |g_i|$ is also a standard Gaussian random variable.
This question is relatively simple: it follows from independence and from the fact that it is enough to check that $P(\epsilon_i |g_i|>t)= P(g_i>t)$ for all $t$.
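For completeness, the tail computation can be written out for $t\ge 0$ (since both $\epsilon_i|g_i|$ and $g_i$ are symmetric random variables, matching the tails on $t\ge 0$ determines the law):
$$P(\epsilon_i|g_i|>t) = \tfrac12\, P(|g_i|>t) + \tfrac12\, P(-|g_i|>t) = \tfrac12\, P(|g_i|>t) = P(g_i>t),$$
using the independence of $\epsilon_i$ and $g_i$, the fact that $P(-|g_i|>t)=0$ for $t\ge 0$, and $P(|g_i|>t)=2P(g_i>t)$ by symmetry of the Gaussian.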
Q2. Show that $\| \sum_{i=1}^n g_ia_i\|_{L_p} \ge \frac{\sqrt{2}}{\sqrt{\pi}} \| \sum_{i=1}^n \epsilon_i a_i\|_{L_p}$ for any $p \ge 1$ and any real sequence $(a)$. Deduce the Khinchine inequalities.
For the first part, I tried to use the remark that, by independence, $\| \sum_{i=1}^n \epsilon_ia_i\|_{L_p} \, \| \sum_{i=1}^n f(g_i)\|_{L_p} =\| (\sum_{i=1}^n \epsilon_ia_i)( \sum_{i=1}^n f(g_i))\|_{L_p}$ for $f(\cdot) = |\cdot|$, but I don't see how to proceed from here.
Solution for the deduction: It is enough to establish that for all $p\ge 1$ we have control over the $p$-norms, $\| \sum_{i=1}^n \epsilon_i a_i\|_{L_p} \le c\, p^{1/2} \left(\sum_{i=1}^n a_i^2\right)^{1/2}$ for some $c>0$, which directly follows from the $\psi_2$ character of $\sum_{i=1}^n g_i a_i$ and the initial inequality.
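For concreteness, the chain of constants can be sketched as follows (using that $\sum_{i=1}^n g_i a_i$ has the same law as $(\sum a_i^2)^{1/2} g_1$, and the standard Gaussian moment bound $\|g_1\|_{L_p} \le C\sqrt{p}$ for an absolute constant $C$):
$$\Big\|\sum_{i=1}^n \epsilon_i a_i\Big\|_{L_p} \le \sqrt{\pi/2}\,\Big\|\sum_{i=1}^n g_i a_i\Big\|_{L_p} = \sqrt{\pi/2}\,\Big(\sum_{i=1}^n a_i^2\Big)^{1/2}\|g_1\|_{L_p} \le C\sqrt{\pi/2}\;\sqrt{p}\,\Big(\sum_{i=1}^n a_i^2\Big)^{1/2},$$
and a bound $\|X\|_{L_p}\le c\sqrt{p}$ for all $p\ge 1$ is equivalent to the $\psi_2$ (sub-Gaussian) character of $X$.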
A hint on how to carry on would be more than welcome.
Source: https://webusers.imj-prg.fr/~dario.cordero/Docs/M2/2020_2021/chap3_new.pdf page 15
I found a solution using convexity.
As you said, since $(\varepsilon_i |g_i|)_i$ has the same joint distribution as $(g_i)_i$ by Q1, we have $$ \lVert\sum g_i a_i\rVert_{L_p} = \lVert\sum a_i \varepsilon_i |g_i| \rVert_{L_p}. $$
By the Law of Iterated Expectations (conditioning on $\varepsilon$),
\begin{align} \left\|\sum g_i a_i\right\|_{L_p} &= \left\|\sum a_i \varepsilon_i |g_i| \right\|_{L_p} \\ \text{(tower property)}&=\left(\mathbb E_{\varepsilon} \left[\mathbb E_g \left|\sum a_i \varepsilon_i |g_i| \right|^p\right]\right)^{1/p}\\ (\text{Jensen on }x \mapsto |x|^{p})&\ge \left(\mathbb E_{\varepsilon} \left|\mathbb E_g \left[\sum a_i \varepsilon_i |g_i| \right]\right|^p\right)^{1/p} \\ (\text{since } \mathbb E_g|g_i| = \sqrt{2/\pi})&=\sqrt{2/\pi} \cdot \left(\mathbb E_\varepsilon\left|\sum a_i \varepsilon_i\right|^p \right)^{1/p} \end{align}
and the result follows.
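As a quick sanity check, here is a small Monte Carlo simulation of the inequality for a few values of $p$; the coefficient vector `a` is an arbitrary choice of mine, not from the source:

```python
import numpy as np

# Monte Carlo check of  || sum g_i a_i ||_{L_p} >= sqrt(2/pi) * || sum eps_i a_i ||_{L_p}
rng = np.random.default_rng(0)
a = np.array([1.0, -2.0, 0.5, 3.0])   # arbitrary fixed real coefficients
N = 200_000                           # number of Monte Carlo samples

g = rng.standard_normal((N, a.size))              # i.i.d. N(0,1)
eps = rng.choice([-1.0, 1.0], size=(N, a.size))   # i.i.d. Rademacher signs

Sg = g @ a    # samples of sum g_i a_i
Se = eps @ a  # samples of sum eps_i a_i

for p in [1, 2, 4]:
    lhs = np.mean(np.abs(Sg) ** p) ** (1 / p)                       # empirical L_p norm, Gaussian sum
    rhs = np.sqrt(2 / np.pi) * np.mean(np.abs(Se) ** p) ** (1 / p)  # sqrt(2/pi) * L_p norm, Bernoulli sum
    print(f"p={p}: {lhs:.4f} >= {rhs:.4f}")
    assert lhs >= rhs
```

With this sample size the gap between the two sides is well above Monte Carlo noise for small $p$.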