Edit: A lot of my computations in the question below are utterly wrong; I have left them as they were, since the comments/answers refer to the original form.
Edit: I have updated the question to hopefully better explain the draws I am referring to. In the actual problem, $x_{(i)}, i\in\{1,\ldots,n\}$ are the $n$ lowest out of $m$ i.i.d. draws from $U[0,1]$. I hope the simplification below is a valid one.
Let $x_{(1)}\leq\ldots\leq x_{(n)}$ be the order statistics of $n$ i.i.d. draws of a random variable uniformly distributed on $[0,1]$. I know that (if I am not mistaken)
\begin{alignat*}{2} E[x_{(i)}]&=\frac{i}{n+1}\\ E[\sum_{i=1}^n x_{(i)}] &=\frac{n}{2}, \\ E[\sum_{i=1}^n (x_{(i)} ^2)] &= \sum_{i=1}^n\left(\frac{i}{n+1}\right)^2=\frac{n(2n+1)}{6(n+1)}. \end{alignat*}
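The first identity, $E[x_{(i)}]=\frac{i}{n+1}$, can be spot-checked with a quick Monte Carlo simulation (a sketch in plain Python; the helper name, the sample size, and the choice $n=4$, $i=2$ are arbitrary):

```python
import random

random.seed(0)

def order_stat_mean(n, i, trials=200_000):
    """Estimate E[x_(i)], the mean of the i-th smallest of n U[0,1] draws."""
    total = 0.0
    for _ in range(trials):
        draws = sorted(random.random() for _ in range(n))
        total += draws[i - 1]  # i is 1-based, matching the formulas above
    return total / trials

# The formula predicts E[x_(2)] = 2/(4+1) = 0.4 for n = 4.
print(order_stat_mean(4, 2))  # ≈ 0.4
```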
I furthermore know that \begin{align} \underbrace{E[x_{(i)}^2]}_{\frac{2n+1}{6(n+1)}}\neq\underbrace{(E[x_{(i)}])^2}_{\frac{1}{4}} \end{align} and hence was expecting that \begin{align} E[\left(\sum_{i=1}^n x_{(i)}\right)^2]\neq\left(E[\sum_{i=1}^n x_{(i)}]\right)^2. \end{align} However, my calculation is \begin{alignat*}{3} E[\left(\sum_{i=1}^n x_{(i)}\right)^2]% &= E[\sum_{i=1}^n (x_{(i)}^2)+2\sum_{i=1}^n x_{(i)} \sum_{j=1}^{i-1}x_{(j)}] \\ &= \frac{n(2n+1)}{6(n+1)} + 2\sum_{i=1}^n \frac{i}{n+1} \frac{(i-1)i}{2(n+1)}\\ &= \frac{n(2n+1)}{6(n+1)} + \frac{1}{(n+1)^2}\sum_{i=1}^n (i^3-i^2)\\ &= \frac{n(2n+1)}{6(n+1)} + \frac{1}{(n+1)^2}\left(\left(\frac{n(n+1)}{2}\right)^2-\frac{n(n+1)(2n+1)}{6}\right)\\ &= (\ldots) = \left(\frac{n}{2}\right)^2=\left(E[\sum_{i=1}^n x_{(i)}]\right)^2 \end{alignat*}
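Both sides of the last equation can also be estimated numerically (a Monte Carlo sketch in Python; the choice $n=10$ and the sample size are arbitrary):

```python
import random

random.seed(1)
n, trials = 10, 200_000

sum_acc = 0.0  # accumulates the sum of the order statistics
sq_acc = 0.0   # accumulates the square of that sum
for _ in range(trials):
    s = sum(sorted(random.random() for _ in range(n)))  # sorting leaves the sum unchanged
    sum_acc += s
    sq_acc += s * s

print(sq_acc / trials)          # estimate of E[(sum x_(i))^2]
print((sum_acc / trials) ** 2)  # estimate of (E[sum x_(i)])^2
```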
Are the two (i.e. expected value of squared sum, and square of expected sum) really equal or have I made a mistake in my calculations? If they are equal, is there an intuition why squares and expectations are interchangeable here but not in the case above (i.e. for the sum but not for the individual $x_{(i)}$)?
(Apologies if the calculations are somewhat cumbersome, I have tried to simplify the original problem with more parameters to this form.)
Edit: I will take the total number of draws to be $n$. There were many mistakes in my last answer, so I am doing it a little differently this time.
First, note that $ \displaystyle\sum_{i=1}^n x_{(i)} = \sum_{i=1}^n x_i $, because the order doesn't matter for the sum. Therefore, as every $x_i$ is uniform on $(0,1)$, the sum $Y = \displaystyle\sum_{i=1}^n x_{i}$ has an Irwin-Hall distribution with parameter $n$.
Therefore $ \mathbf{E} \left[ \displaystyle\left(\sum_{i=1}^n x_i\right)^2\right] = \mathbf{E}[Y^2] = Var(Y)+\left(\mathbf{E}[Y]\right)^2 = \frac{n}{12}+\left(\frac{n}{2}\right)^2 $.
You can definitely conclude from here. I hope there aren't any dumb mistakes this time.