The integral is:
$\int^\infty_0 x \exp{\Big(\frac{-(\log{x}-\mu)^2}{2\sigma^2}\Big)}dx \ \ \ $ (up to a normalizing constant, it is the second moment of the log-normal distribution).
I've tried several substitutions, such as
$u=\log{x}$,
$u=\log{x} - \mu$,
$u=(\log{x} - \mu)^2$.
However, all of them lead to more complicated results. I can calculate this integral when there is $x$ instead of $\log{x}$, but with the logarithm it gets complicated. Is there a way to calculate it using some trick? (undergraduate level)
Thanks in advance.
Denote by
$$p(x) = \frac{1}{x} \frac{1}{\sqrt{2\pi \sigma^2}} \exp \left(- \frac{(\log x- \mu)^2}{2\sigma^2} \right)$$
the probability density function of the log-normal distribution. For $y := \log x- \mu$ we have
$$\frac{dy}{dx} = \frac{1}{x} = \exp(-y-\mu),$$
i.e.
$$dx = \exp(y+\mu) \, dy,$$
and therefore a change of variables gives
$$\int_0^{\infty}x p(x) \, dx = \frac{1}{\sqrt{2\pi \sigma^2}} \int_{\mathbb{R}} \exp(y+\mu) \exp \left(- \frac{y^2}{2\sigma^2} \right) \, dy.$$
The right-hand side is the exponential moment of a Gaussian random variable; more precisely, if $Y \sim N(0,\sigma^2)$ then the right-hand side equals
$$\exp(\mu) \mathbb{E}\exp(Y).$$
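For completeness, this exponential moment follows from completing the square in the exponent:
$$\mathbb{E}\exp(Y) = \frac{1}{\sqrt{2\pi \sigma^2}} \int_{\mathbb{R}} \exp \left( y - \frac{y^2}{2\sigma^2} \right) \, dy = \exp \left( \frac{\sigma^2}{2} \right) \frac{1}{\sqrt{2\pi \sigma^2}} \int_{\mathbb{R}} \exp \left( - \frac{(y-\sigma^2)^2}{2\sigma^2} \right) \, dy = \exp \left( \frac{\sigma^2}{2} \right),$$
since the last integrand is the density of a normal distribution with mean $\sigma^2$ and variance $\sigma^2$.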
Since exponential moments of Gaussian random variables can be calculated explicitly, we get
$$\int_0^{\infty}x p(x) \, dx = \exp \left( \mu + \frac{1}{2} \sigma^2 \right).$$
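As a quick numerical sanity check (a stand-alone Python sketch with illustrative parameter values $\mu = 0.3$, $\sigma = 0.5$, not taken from the question), the substituted integral can be evaluated with the trapezoidal rule and compared against the closed form:

```python
import math

# Sanity check of  int_0^inf x p(x) dx = exp(mu + sigma^2 / 2).
# After y = log(x) - mu, the integral becomes
#   (1 / sqrt(2 pi sigma^2)) * int_R exp(y + mu - y^2 / (2 sigma^2)) dy,
# which we approximate with the trapezoidal rule on a wide interval.
mu, sigma = 0.3, 0.5           # illustrative parameters, not from the question
n, half_width = 200_000, 12.0  # Gaussian tails are negligible beyond 12 sigma
a = -half_width * sigma
h = 2.0 * half_width * sigma / n
total = 0.0
for i in range(n + 1):
    y = a + i * h
    w = 0.5 if i in (0, n) else 1.0  # trapezoidal endpoint weights
    total += w * math.exp(y + mu - y * y / (2.0 * sigma**2))
numeric = total * h / math.sqrt(2.0 * math.pi * sigma**2)
closed_form = math.exp(mu + 0.5 * sigma**2)
print(numeric, closed_form)  # the two values agree to several decimal places
```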
Equivalently,
$$\int_{(0,\infty)} \exp \left(- \frac{(\log x-\mu)^2}{2\sigma^2} \right) \, dx = \sqrt{2\pi \sigma^2}\exp \left( \mu + \frac{1}{2} \sigma^2 \right).$$
Remark: The same reasoning also works for higher moments, i.e.
$$\int_{(0,\infty)} x^k p(x) \, dx$$
for $k \geq 1$. Following the argument above, we get
$$\int_{(0,\infty)} x^k p(x) \, dx = \exp(k \mu) \mathbb{E}\exp(kY) = \exp \left( k \mu + \frac{1}{2} \sigma^2 k^2 \right),$$
i.e.
$$\int_{(0,\infty)} x^{k-1} \exp \left( - \frac{(\log x-\mu)^2}{2\sigma^2} \right) \, dx = \sqrt{2\pi \sigma^2} \exp \left( k \mu + \frac{1}{2} \sigma^2 k^2 \right).$$
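The case $k = 2$ of this last identity is exactly the integral from the question. A small stand-alone Python sketch (illustrative parameters $\mu = 0.3$, $\sigma = 0.5$, not from the question) checks the general formula numerically for the first few $k$:

```python
import math

def moment_integral(k, mu, sigma, n=200_000, half_width=16.0):
    """Approximate I_k = int_0^inf x^(k-1) exp(-(log x - mu)^2 / (2 sigma^2)) dx.
    With y = log(x) - mu this becomes int_R exp(k*(y + mu) - y^2 / (2 sigma^2)) dy,
    evaluated here with the trapezoidal rule on [-16 sigma, 16 sigma]."""
    a = -half_width * sigma
    h = 2.0 * half_width * sigma / n
    total = 0.0
    for i in range(n + 1):
        y = a + i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoidal endpoint weights
        total += w * math.exp(k * (y + mu) - y * y / (2.0 * sigma**2))
    return total * h

mu, sigma = 0.3, 0.5  # illustrative parameters, not from the question
for k in (1, 2, 3):   # k = 2 is exactly the integral asked about
    numeric = moment_integral(k, mu, sigma)
    closed = math.sqrt(2.0 * math.pi * sigma**2) * math.exp(k * mu + 0.5 * sigma**2 * k**2)
    print(k, numeric, closed)
```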