Let $X_1$ and $X_2$ be a random sample from a normal distribution with mean zero and variance $\sigma^2$. Prove that $E[X_{(1)}] = -\frac{\sigma}{\sqrt{\pi}}$.
Do I have to standardize the sample? This looks like it has a lot of tricky parts, and I don't know how to proceed.
We start with a little trick. Let $Y$ be the minimum of $X_1$ and $X_2$. Then $$Y=\frac{1}{2}\left(X_1+X_2-|X_1-X_2|\right),$$ since $X_1+X_2$ is the sum of the maximum and the minimum while $|X_1-X_2|$ is the maximum minus the minimum. Since the $X_i$ have mean $0$, we have $$E(Y)=-\frac{1}{2}E(|X_1-X_2|).$$ The random variable $X_1-X_2$ is normally distributed with mean $0$ and variance $2\sigma^2$ (the $X_i$ are independent, so their variances add).
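As a quick numerical sanity check of this reduction (not part of the proof), here is a minimal Monte Carlo sketch, assuming NumPy is available and picking $\sigma = 2$ arbitrarily; it compares the simulated mean of the minimum with $-\frac{1}{2}E(|X_1-X_2|)$.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                      # arbitrary illustrative value
n = 1_000_000
x1 = rng.normal(0.0, sigma, n)
x2 = rng.normal(0.0, sigma, n)

y = np.minimum(x1, x2)           # the order statistic X_(1)

print(y.mean())                          # simulated E[min(X1, X2)]
print(-0.5 * np.abs(x1 - x2).mean())     # -E|X1 - X2| / 2, should match
```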
If $W$ is normal with this mean and variance, then by symmetry about $0$ the mean of its absolute value is $2\int_0^\infty wg(w)\,dw$, where $g(w)$ is the density function of $W$. Thus $$E(Y)=-\int_0^\infty wg(w)\,dw.$$ But $$g(w)=\frac{1}{2\sigma\sqrt{\pi}}\exp\!\left(-\frac{w^2}{4\sigma^2}\right).$$ To evaluate the integral, make the substitution $u=w^2/(4\sigma^2)$. Then $du=\frac{w}{2\sigma^2}\,dw$, so $w\,dw = 2\sigma^2\,du$, and we end up with $$-\frac{\sigma}{\sqrt{\pi}}\int_0^\infty e^{-u}\,du = -\frac{\sigma}{\sqrt{\pi}}.$$
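If you want to double-check the integral itself, a short symbolic computation, assuming SymPy is available, evaluates $-\int_0^\infty w\,g(w)\,dw$ with the density above and returns the closed form.

```python
import sympy as sp

w, sigma = sp.symbols('w sigma', positive=True)

# density of W = X1 - X2, i.e. N(0, 2*sigma^2)
g = sp.exp(-w**2 / (4 * sigma**2)) / (2 * sigma * sp.sqrt(sp.pi))

E_Y = sp.integrate(-w * g, (w, 0, sp.oo))
print(E_Y)   # -sigma/sqrt(pi)
```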