Here is the question I was trying to work out:
Consider a theoretical stock whose annual return has log-normal distribution with parameters $\mu$ and $\sigma$ with $\mu = \ln(1.1)$ and $\sigma = \ln(1.2)$. Assume that the return of each year is independent of other years. For this theoretical stock, the fraction of wealth lost with $0.1 \%$ chance when invested over $T$ years is $40.5 \%$, $58.43 \%$, $61.48 \%$, and $73.92 \%$, for $T = 1, 5, 10, 20$, respectively.
I tried to evaluate the first step for $1$ year return as below:
$$
X\sim \operatorname{LOGN}(\ln(1.1),\ln(1.2))
$$
If we let $x$ be the $0.1\%$-quantile of the return:
$$
\Pr(X\lt x)=0.1\%=0.001,\\
x=\operatorname{ppf}(0.001)
$$
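Since $\ln X \sim N(\mu, \sigma^2)$, this quantile also has a closed form, which I used to sanity-check the ppf:

$$
\Pr(X\lt x)=\Phi\!\left(\frac{\ln x-\mu}{\sigma}\right)=0.001
\quad\Longrightarrow\quad
x=e^{\mu+\sigma\,\Phi^{-1}(0.001)}=1.1\times 1.2^{-3.0902}\approx 0.6262
$$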
I used the following Python code to get the ppf:
from scipy import stats
import math
sigma=math.log(1.2)
mu=math.log(1.1)
logNdist=stats.lognorm(s=sigma,scale=math.exp(mu))
print('Return on portfolio with 0.1% prob is {:.4f}'.format(logNdist.ppf(0.001)))
The output I get from the above is

Return on portfolio with 0.1% prob is 0.6262

Therefore the loss at the $0.1\%$ quantile is $1 - 0.6262 = 37.38\%$.
Where am I going wrong? I should be getting a loss of $40.5\%$!
You can find the running code here.
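For completeness, here is how I extended the same approach to $T$ years (a sketch under my assumption that annual log-returns are i.i.d. and add, so $\ln X_T \sim N(T\mu,\, T\sigma^2)$). Note that it reproduces my $37.38\%$ at $T = 1$ rather than the quoted $40.5\%$, so the discrepancy persists at every horizon:

```python
from scipy import stats
import math

mu = math.log(1.1)     # mean of the annual log-return
sigma = math.log(1.2)  # std dev of the annual log-return

def loss_at(T, p=0.001):
    """Fraction of wealth lost at the p-quantile over T independent years.

    The sum of T i.i.d. N(mu, sigma^2) log-returns is N(T*mu, T*sigma^2),
    so the T-year gross return is lognormal with
    s = sigma*sqrt(T) and scale = exp(T*mu).
    """
    dist = stats.lognorm(s=sigma * math.sqrt(T), scale=math.exp(T * mu))
    return 1.0 - dist.ppf(p)

for T in (1, 5, 10, 20):
    print('T={:2d}: loss with 0.1% prob = {:.2%}'.format(T, loss_at(T)))
```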