I want to calculate numerically the expectation of a lognormal random variable $Y=e^X$, where $X$ is normally distributed with mean $m$ and variance $V$.
The expectation is known to be $e^{m+\frac{1}{2}V}$. For the simulation, we can generate $N$ standard normal random numbers $\{Z_{k}\}_{k=1}^{N}$ and compute $\frac{1}{N}\sum_{k=1}^{N}{e^{m+\sqrt{V}Z_k}}$.
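A minimal sketch of this estimator, in Python/NumPy for concreteness (the function name is mine, not from the post):

```python
import numpy as np

def lognormal_mc_mean(m, V, N, rng=None):
    """Monte Carlo estimate of E[e^X] for X ~ Normal(m, V)."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(N)                # standard normal draws
    return np.mean(np.exp(m + np.sqrt(V) * z))

# For moderate parameters the estimate matches the closed form e^{m + V/2}.
m, V, N = 0.5, 1.0, 100_000
est = lognormal_mc_mean(m, V, N, rng=np.random.default_rng(0))
exact = np.exp(m + 0.5 * V)
```

With these parameters the relative standard error is roughly $\sqrt{(e^V-1)/N}\approx 0.4\%$, so the estimate and the closed form agree closely.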
When $m$ and $V$ are relatively small, the simulation reproduces the expected value well. When $m$ and $V$ are very large, we get ridiculously high numbers.
What is the best method to reduce this kind of numerical error?
Thanks.
Of course, it depends on what you mean by "very high" $\ldots$
Nonetheless, I could not reproduce your problem. I ran two simulations ($N = 10\,000$ each) in MATLAB, one with $m$ varying from $-70$ to $70$ (fixed $V$), and another with $V$ varying from $0.5$ to $30$ (fixed $m$). The two figures show the expected values, computed from the simulated random variables, versus $\mathrm{e}^{m + \frac{1}{2}V}$. The axes are log-scaled to make deviations from the diagonal easier to see.
Left figure: with $V$ fixed and $m$ varying from $-70$ to $70$, the estimated expected values lie on the diagonal.
Right figure: with $m$ fixed and $V$ varying from $0.5$ to $30$, the estimated expected values scatter around the diagonal, and increasingly so the larger $V$ is.