Let $X$ be an observation from a lognormal distribution with known mean $\mu$ and standard deviation $\sigma$.
What is the probability that $X$ lies within $\mu\pm c\sigma$, for a given $c>0$?
(I believe this is called a tolerance interval, but I'm not a statistician, so please correct me if I'm wrong.)
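For concreteness, here is a minimal sketch of the computation using SciPy, assuming $\mu$ and $\sigma$ are the mean and standard deviation of $X$ itself (not of $\log X$); the function name is my own. It recovers the parameters of the underlying normal for $\log X$ and evaluates the CDF at the interval endpoints, clipping the lower endpoint at 0 since the lognormal is supported on $(0,\infty)$:

```python
import numpy as np
from scipy.stats import lognorm

def coverage_probability(mu, sigma, c):
    """P(mu - c*sigma <= X <= mu + c*sigma) for lognormal X
    with mean mu and standard deviation sigma (of X, not log X)."""
    # Parameters of the underlying normal log(X):
    # Var[log X] = log(1 + sigma^2/mu^2), E[log X] = log(mu) - Var[log X]/2.
    s2 = np.log1p((sigma / mu) ** 2)
    s = np.sqrt(s2)
    scale = mu * np.exp(-s2 / 2)   # SciPy's scale = exp(E[log X])
    dist = lognorm(s=s, scale=scale)
    lo = max(mu - c * sigma, 0.0)  # lognormal support is (0, inf)
    hi = mu + c * sigma
    return dist.cdf(hi) - dist.cdf(lo)

print(coverage_probability(1.0, 0.5, 1.0))  # roughly 0.75
```

Because the lognormal is skewed, this probability differs from the familiar normal values (about 0.683 for $c=1$), and the interval $\mu \pm c\sigma$ is not centered on the bulk of the mass.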