Wikipedia states that
A log-normal distribution is not uniquely determined by its moments $\text{E}[X^k]$ for $k\ge 1$, that is, there exists some other distribution with the same moments for all $k$. In fact, there is a whole family of distributions with the same moments as the log-normal distribution.
However, just prior to that, it states that
Equivalently, parameters $\mu$ and $\sigma$ can be obtained if the expected value and variance are known[.]
This confuses me. What am I missing? Is it perhaps a freedom in the base itself?
NB: I would also like to be sure that the 'ln-normal' (log-normal in base $e$?) is uniquely determined by expectation and variance.
They are referring to the fact that the moment generating function is not convergent for the lognormal distribution. For a random variable $X$ that ranges over $[0,\infty)$ with a probability density function $f$, the moment generating function is defined as
$$M(t) = E(e^{tX}) = \int_{0}^{\infty} e^{tx} f(x) dx $$
If it exists, then expanding the exponential in a Taylor series leads to a series expansion for $M$ of the form
$$M(t) = 1+ \sum_{k=1}^{\infty}\frac{t^k}{k!}E(X^k) $$
This is useful as an alternative to direct integration to find the moments $E(X^k)$ if the generating function is known.
Unfortunately, this integral does not converge for the lognormal distribution for any $t>0$: the lognormal density decays like $e^{-(\log x)^2/(2\sigma^2)}$, slower than any exponential, so $e^{tx} f(x) \to \infty$ as $x \to \infty$. When the moment generating function does not exist, the moments need not uniquely determine the distribution.
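To see the divergence concretely: for the standard lognormal ($\mu=0$, $\sigma=1$) the moments are $E(X^k)=e^{k^2/2}$, which grow faster than $k!$ kills them, so the terms of the series above eventually blow up for every $t>0$. A quick numerical sketch (the function name is my own):

```python
import math

def series_term(k, t):
    """k-th term t^k * E(X^k) / k! of the would-be MGF series for the
    standard lognormal, using E(X^k) = exp(k^2 / 2)."""
    return t**k * math.exp(k * k / 2.0) / math.factorial(k)

# Even for a tiny t the terms eventually grow without bound,
# so the series cannot converge for any t > 0.
for k in (1, 5, 10, 15, 20):
    print(k, series_term(k, t=0.01))
```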
That said, it is true that the lognormal pdf has exactly two parameters, and you can solve for them from the mean and variance. However, other distributions outside the lognormal family can share the same moments of every order.
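For the parameter recovery just mentioned: with $m = E(X) = e^{\mu+\sigma^2/2}$ and $v = \operatorname{Var}(X) = (e^{\sigma^2}-1)\,e^{2\mu+\sigma^2}$, inverting gives $\sigma^2 = \log(1 + v/m^2)$ and $\mu = \log m - \sigma^2/2$. A minimal sketch (the function name is my own):

```python
import math

def lognormal_params(mean, var):
    """Recover (mu, sigma) of a lognormal from its mean and variance,
    inverting mean = exp(mu + sigma^2/2) and
    var = (exp(sigma^2) - 1) * exp(2*mu + sigma^2)."""
    sigma2 = math.log(1.0 + var / mean**2)
    return math.log(mean) - sigma2 / 2.0, math.sqrt(sigma2)

mu, sigma = lognormal_params(2.0, 3.0)
# round trip: plug (mu, sigma) back into the moment formulas
print(math.exp(mu + sigma**2 / 2))                              # should be ~2.0
print((math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2))   # should be ~3.0
```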
A classical example (due to Heyde, 1963) is to multiply the standard lognormal pdf by $1+ a\sin(2\pi n \log x)$ for $|a|\le 1$ and any integer $n$; the result is a valid density with exactly the same moments of every order.
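The moment identity behind that example can be checked numerically: after substituting $y = \log x$, the $k$-th moment of the perturbed density becomes $\int_{-\infty}^{\infty} e^{ky}\varphi(y)\,\bigl(1 + a\sin(2\pi n y)\bigr)\,dy$, and the sine term integrates to zero for integer $k$ and $n$, leaving the lognormal moment $e^{k^2/2}$. A rough midpoint-rule check (the function name and integration grid are my own choices):

```python
import math

def perturbed_moment(k, a, n=1, lo=-12.0, hi=14.0, steps=200000):
    """k-th moment of the density f(x) * (1 + a*sin(2*pi*n*log x)),
    f the standard lognormal pdf, computed after substituting
    y = log x and integrating by the midpoint rule."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        y = lo + (i + 0.5) * h
        phi = math.exp(-y * y / 2.0) / math.sqrt(2.0 * math.pi)
        total += math.exp(k * y) * phi * (1.0 + a * math.sin(2.0 * math.pi * n * y))
    return total * h

# Despite the perturbation, every integer moment matches exp(k^2/2),
# the corresponding standard lognormal moment.
for k in range(5):
    print(k, perturbed_moment(k, a=0.5), math.exp(k * k / 2.0))
```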