I have a set of measurements $a$ (units m) which are log-normally distributed (with parameters $\mu$ and $\sigma$). The expected value (or mean) of $a$ is just the first raw moment, i.e.
$$ E(a)=\int_0^\infty a f_A(a)\ da, $$
where $f_A(a)$ is the probability density function with parameters $\mu$ and $\sigma$. For the log-normal distribution, the solution to this integral is apparently
$$ E(a) = \exp\left(\mu + \frac{1}{2}\sigma^2\right). $$
This implies that $E(a)$ (or the mean of $a$) has no units, since the argument of an exponential cannot have units. However, $E(a)$ obviously has units (m), which we can also clearly see from the integral. This seems like a contradiction, so I'm clearly missing something. Does anybody know what the problem is?
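To make the apparent puzzle concrete, here is a quick numerical sanity check (a sketch using NumPy; the parameter values $\mu = 1$, $\sigma = 0.5$ are arbitrary choices, not from my data): the sample mean of log-normal draws does match $\exp(\mu + \sigma^2/2)$, yet that expression looks dimensionless.

```python
import numpy as np

# Arbitrary log-normal parameters (mu, sigma are the parameters of the
# underlying normal distribution of log(a), not the mean/sd of a itself).
mu, sigma = 1.0, 0.5

rng = np.random.default_rng(0)
samples = rng.lognormal(mean=mu, sigma=sigma, size=1_000_000)

empirical = samples.mean()                  # sample estimate of E(a)
theoretical = np.exp(mu + sigma**2 / 2)     # closed-form log-normal mean

print(empirical, theoretical)
```

The two numbers agree to within sampling error, so the formula itself is fine; the question is purely about where the units went.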