The following problem appears as an exercise in the Russian version of Shiryaev's *Probability*, 2003 edition (it seems that no English version containing this problem is available yet).
Let $\xi_1,\xi_2,\ldots$ be independent, symmetrically distributed random variables. Then $$\mathsf E\left(\left(\sum_n\xi_n\right)^2\land1 \right)\le\sum_n\mathsf E(\xi_n^2\land1)$$
To avoid discussing convergence issues, I would like to assume that the sum is finite. It is not quite clear from context what "symmetrically distributed" means, but it is reasonable to guess that $\xi_n$ and $-\xi_n$ have the same distribution. With this in mind, writing $S_N=\sum_{n=1}^N\xi_n$, I tried $$\mathsf E\left(\left(\sum_{n=1}^N\xi_n\right)^2\land1 \right)=\int(x^2\land1)\,dF_{S_N}(x)=2\int_0^\infty(x^2\land1)\,dF_{S_N}(x)$$ and $$\mathsf E(\xi_n^2\land1)=\int(x^2\land1)\,dF_{\xi_n}(x)=2\int_0^\infty(x^2\land1)\,dF_{\xi_n}(x)$$ and, after integrating by parts, to reduce the problem to proving $$\int_0^1x\left(\mathsf P(S_N\ge x)-\sum_{n=1}^N\mathsf P(\xi_n\ge x)\right)dx\le0.$$ The problem would be settled if $$\mathsf P\left(\sum_{n=1}^N\xi_n\ge x\right)\le\sum_{n=1}^N\mathsf P(\xi_n\ge x),\quad 0\le x\le1,$$ which is, unfortunately, not true in general.
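To see that the last termwise tail bound can indeed fail, here is a small enumeration (a numerical sketch of my own; the two-point distribution is made up for illustration). With two independent variables equal to $\pm0.6$ with probability $1/2$ each and $x=1$, the left side is $1/4$ while the right side is $0$:

```python
from itertools import product

# Two independent symmetric variables taking values +-0.6 with probability 1/2.
values = [0.6, -0.6]
x = 1.0

# P(xi_1 + xi_2 >= x): enumerate the 4 equally likely outcomes.
p_sum = sum(1 for a, b in product(values, values) if a + b >= x) / 4

# P(xi_n >= x) for each n: zero, since 0.6 < 1.
p_each = sum(1 for v in values if v >= x) / 2
p_terms = 2 * p_each

print(p_sum, p_terms)  # the termwise bound fails: 0.25 > 0.0
```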
My question: Is my approach above complete nonsense? Is it possible to turn it into a proof? If not, how can one prove this inequality?
Edit. By induction it suffices to prove the case $N=2$ (note that $\xi_1+\cdots+\xi_{N-1}$ is again symmetric and independent of $\xi_N$), i.e. $\mathsf E((\xi_1+\xi_2)^2\land1)\le\mathsf E(\xi_1^2\land1)+\mathsf E(\xi_2^2\land1)$. This should be easier, but it is still not obvious to me.
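As a sanity check on this $N=2$ inequality (again a sketch of my own, with a made-up two-point law), both sides can be computed exactly for discrete symmetric variables:

```python
from itertools import product

def e_min_sq_one(dist):
    # E(X^2 /\ 1) for a discrete distribution given as {value: probability}.
    return sum(p * min(v * v, 1.0) for v, p in dist.items())

# Two independent symmetric two-point distributions at +-0.6.
xi = {0.6: 0.5, -0.6: 0.5}

# Distribution of xi_1 + xi_2 by convolution.
s = {}
for (a, pa), (b, pb) in product(xi.items(), xi.items()):
    s[a + b] = s.get(a + b, 0.0) + pa * pb

lhs = e_min_sq_one(s)        # E((xi_1 + xi_2)^2 /\ 1)
rhs = 2 * e_min_sq_one(xi)   # E(xi_1^2 /\ 1) + E(xi_2^2 /\ 1)
print(lhs, rhs)              # lhs <= rhs, as the inequality claims
```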
I don't know whether your proof idea can be turned into a proper proof, but one can prove the result as follows:
First note that if $\omega_1, \dots, \omega_n$ are independent and uniformly distributed in $\{1,-1\}$, then for any $a_1,\dots,a_n \in \Bbb{R}$, we have $\Bbb{E} \sum_i \omega_i a_i = 0$, and hence $$ \Bbb{E} \left(\sum_i \omega_i a_i\right)^2 = \mathrm{Var} \sum_i \omega_i a_i = \sum_i a_i^2 . $$ Also note that $\Bbb{E} f(\omega_1, \dots, \omega_n) = 2^{-n} \sum_{\omega_1, \dots , \omega_n \in \{1,-1\}} f(\omega_1, \dots, \omega_n) $ is just a finite sum.
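Both identities are finite averages over the $2^n$ sign vectors, so they can be verified by direct enumeration (an illustrative check with arbitrary made-up coefficients):

```python
from itertools import product

a = [0.3, -1.1, 2.0, 0.7]  # arbitrary fixed coefficients a_1, ..., a_n
n = len(a)

signs = list(product([1, -1], repeat=n))  # all 2^n equally likely sign vectors

def signed_sum(ws):
    return sum(w * x for w, x in zip(ws, a))

mean = sum(signed_sum(ws) for ws in signs) / 2 ** n
second_moment = sum(signed_sum(ws) ** 2 for ws in signs) / 2 ** n

print(mean)           # E sum_i omega_i a_i = 0 (up to float error)
print(second_moment)  # equals sum_i a_i^2: the cross terms average out
```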
Next, note that for arbitrary fixed signs $\omega_1, \dots, \omega_n \in \{1,-1\}$, the vector $(\omega_1 \xi_1, \dots, \omega_n \xi_n )$ has the same distribution as $(\xi_1, \dots, \xi_n)$, by symmetry and independence.
Finally, for any random variable $X$ we have $\Bbb{E} [X \wedge 1] \leq 1 \wedge \Bbb{E} X$, since $X \wedge 1 \leq X$ and $X \wedge 1 \leq 1$ pointwise.
All in all, this shows \begin{align} \Bbb{E}_\xi \left[\left( \sum_i \xi_i \right)^2 \wedge 1 \right] &= \Bbb{E}_\omega \Bbb{E}_\xi \left[\left( \sum_i \omega_i \xi_i \right)^2 \wedge 1 \right] \\ & =\Bbb{E}_\xi \Bbb{E}_\omega \left[\left( \sum_i \omega_i \xi_i \right)^2 \wedge 1 \right] \\ &\leq \Bbb{E}_\xi \left[1 \wedge \Bbb{E}_\omega \left(\sum_i \omega_i \xi_i \right)^2 \right] \\ &= \Bbb{E}_\xi \left[1\wedge \sum_i \xi_i^2\right] \leq \Bbb{E}_\xi \sum_i (1\wedge \xi_i^2) , \end{align} which easily implies the claim. (The last inequality uses $1\wedge\sum_i a_i\le\sum_i(1\wedge a_i)$ for $a_i\ge0$: if some $a_i\ge1$, the right-hand side is at least $1$; otherwise it equals $\sum_i a_i$.)
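As a final sanity check of the full inequality (my own Monte Carlo sketch; the exponential magnitude is an arbitrary choice of symmetric law, not part of the proof):

```python
import random

random.seed(0)

def sample_symmetric():
    # A symmetric variable: exponential magnitude times an independent fair sign.
    return random.choice([-1.0, 1.0]) * random.expovariate(1.0)

n, trials = 5, 100_000
lhs = rhs = 0.0
for _ in range(trials):
    xs = [sample_symmetric() for _ in range(n)]
    lhs += min(sum(xs) ** 2, 1.0)            # (sum_i xi_i)^2 /\ 1
    rhs += sum(min(x * x, 1.0) for x in xs)  # sum_i (xi_i^2 /\ 1)
lhs /= trials
rhs /= trials
print(lhs, rhs)  # empirically lhs <= rhs
```

Note that the inequality does not hold pointwise (take $\xi_1=\xi_2=0.5$: the left side is $1$, the right side $0.5$); only the averages compare, which is exactly what the symmetrization argument exploits.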