$$ I = \int_{0}^{\infty} x \phi(x)\, dx $$
where $\phi(x)$ is the PDF of a normal distribution with mean $\mu$ and standard deviation $\sigma$.
Here I read that:
If $X = \mu + \sigma U$ with $U$ a standard normal,
$$ I = E[\mu + \sigma U;\ \mu + \sigma U > 0] = \mu\, P[U > -\mu/\sigma] + \sigma\, E[U;\ U > -\mu/\sigma] $$
thus
$$ I = \mu \Psi(\frac{\mu}{\sigma}) + \sigma \Phi(\frac{\mu}{\sigma})$$
where $\Phi$ and $\Psi$ are the PDF and CDF of a standard normal, respectively (note this reverses the usual convention).
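Before digging into the derivation, a quick numerical sanity check of the quoted identity (with arbitrary example values for $\mu$ and $\sigma$; here `norm.pdf`/`norm.cdf` play the roles of $\Phi$/$\Psi$):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma = 0.7, 1.3  # arbitrary example values

# Left side: I = integral of x * pdf_{N(mu, sigma^2)}(x) over (0, inf)
lhs, _ = quad(lambda x: x * norm.pdf(x, loc=mu, scale=sigma), 0, np.inf)

# Right side: the quoted closed form mu * Psi(mu/sigma) + sigma * Phi(mu/sigma)
rhs = mu * norm.cdf(mu / sigma) + sigma * norm.pdf(mu / sigma)

print(lhs, rhs)  # the two values agree to quadrature precision
```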
However, I don't understand why $E[U;\ U > -\mu/\sigma]$ leads to the cumulative distribution. If we define $c = -\mu/\sigma$ and the conditional density $$ f_{X \mid X > c}(x)\,dx = \frac{P(X \in dx)}{P(X > c)}, $$ we get:
$$E[U \mid U > c] = \int_c^{\infty} x\, \frac{\Phi(x)}{1-\Psi(c)}\, dx $$
$$ = \frac{1}{1-\Psi(c)} \left( \Big[ x\,\Psi(x) \Big]_c^\infty - \int_c^\infty \Psi(x)\, dx \right) $$
which I can't simplify further, beyond noting that
$$ \int x\, \Phi(x)\, dx = -\Phi(x) + C. $$
Actually, using $\Phi' = -x \Phi$ (thanks @copper.hat)
$$ E[U|U>c] = \frac{1}{1 - \Psi(c)} \Phi(c) $$
which still differs from the original answer.
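For what it's worth, the two quantities can be compared numerically (with an arbitrary cutoff $c$): the restricted ("semicolon") expectation $E[U;\ U>c] = E[U\,\mathbf 1\{U>c\}]$ equals the standard normal PDF at $c$, while the conditional expectation carries the extra normalizer $1/(1-\Psi(c))$:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

c = -0.5  # arbitrary example cutoff

# Restricted expectation E[U; U > c] = integral of u * pdf(u) over (c, inf)
restricted, _ = quad(lambda u: u * norm.pdf(u), c, np.inf)

# Conditional expectation E[U | U > c] divides by the tail probability
conditional = restricted / (1 - norm.cdf(c))

print(restricted, norm.pdf(c))                        # both equal pdf(c)
print(conditional, norm.pdf(c) / (1 - norm.cdf(c)))   # both carry the normalizer
```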
Hint: for the standard normal density $\phi$, define $Q(x) = x^2/2$. Then
$$ x\,\phi(x) = \frac{1}{\sqrt{2\pi}}\, Q'(x)\, e^{-Q(x)}. $$
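The hint's pointwise identity is easy to confirm numerically, which makes the antiderivative $-\frac{1}{\sqrt{2\pi}} e^{-Q(x)} = -\phi(x)$ immediate:

```python
import numpy as np
from scipy.stats import norm

Q = lambda x: x**2 / 2   # as in the hint
Qp = lambda x: x          # Q'(x)

# Check x * phi(x) == Q'(x) * exp(-Q(x)) / sqrt(2*pi) at several points
xs = np.linspace(-3, 3, 7)
lhs = xs * norm.pdf(xs)
rhs = Qp(xs) * np.exp(-Q(xs)) / np.sqrt(2 * np.pi)
print(np.allclose(lhs, rhs))
```

Integrating both sides over $(c, \infty)$ then gives $\int_c^\infty x\,\phi(x)\,dx = \phi(c)$ by the fundamental theorem of calculus.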