While modelling something for which I need some sort of average time, I arrived at a Pareto distribution (shape $\alpha=\frac{1}{2}$, scale $x_m=\frac{2}{\pi}$), whose PDF is
$$ R(t)=\begin{cases}\sqrt{\frac{1}{2\pi t^{3}}}, & \text{for } t > \frac{2}{\pi}\\ 0 & \text{for } t \leq \frac{2}{\pi} \end{cases} $$
This is a heavy-tailed distribution with an undefined (or $\infty$) mean. The median is also of little use for approximating "average" behaviour: the median time, call it $\tau_M$, only gives the time within which my event happens with 50% probability, and if it does not happen within the first $\tau_M$ seconds, the chance of it happening within the next $\tau_M$ seconds is less than 50%.
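Concretely, the CDF here is $F(t)=1-\sqrt{\frac{2}{\pi t}}$ for $t>\frac{2}{\pi}$, so $\tau_M=\frac{8}{\pi}\approx 2.546$. A quick Monte Carlo check (a sketch using inverse-CDF sampling; numpy assumed, the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
x_m = 2 / np.pi                      # scale (left endpoint of the support)

# Inverse-CDF sampling: F(t) = 1 - sqrt(x_m / t)  =>  t = x_m / u^2, u ~ U(0,1)
u = rng.random(1_000_000)
samples = x_m / u**2

print(np.median(samples))            # should land close to 8/pi ~ 2.546
print(samples.mean())                # unstable: dominated by the heavy tail
```

The sample mean printed last will jump around wildly between runs, which is exactly the infinite-mean behaviour described above.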
Then I thought: I can model two such events, one following the other. The PDF of the total time is the convolution
$$ R^{2}(t)=\int_{2/\pi}^{t-2/\pi} R(\tau)\,R(t-\tau)\,\mathrm{d}\tau, \qquad t > \frac{4}{\pi}, $$
since $R$ vanishes below $\frac{2}{\pi}$, only $\tau\in[\frac{2}{\pi},\,t-\frac{2}{\pi}]$ contributes, and the result is supported on $t>\frac{4}{\pi}$.
Wolfram Alpha was gracious enough to give me $R^2$:
$$ R^2(t)=\frac{\sqrt{2}}{\pi}\,\frac{\pi t-4}{t^2\sqrt{\pi t-2}}\,u\!\left(t-\frac{4}{\pi}\right), $$ where $u(\cdot)$ is the Heaviside step function.
I hoped this might have a finite mean, in which case I would be done, but it does not. Still, the median of this distribution serves me a little better: it gives the time within which both events have happened with 50% probability.
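The closed form can be sanity-checked numerically (a sketch, assuming numpy; the grid cutoff at $t=2000$ is an arbitrary choice that ignores some far-tail mass): sample two independent event times by inverse-CDF sampling, and compare the empirical median of their sum with the median obtained by integrating $R^2$ on a grid.

```python
import numpy as np

rng = np.random.default_rng(0)
x_m = 2 / np.pi

# Two independent draws via inverse-CDF sampling, then their sum
u = rng.random((2, 1_000_000))
sums = (x_m / u**2).sum(axis=0)

# Median from the closed-form density on a grid (tail beyond t=2000 ignored)
t = np.linspace(4 / np.pi + 1e-9, 2000.0, 2_000_000)
pdf = (np.sqrt(2) / np.pi) * (np.pi * t - 4) / (t**2 * np.sqrt(np.pi * t - 2))
cdf = np.cumsum(pdf) * (t[1] - t[0])
median_formula = t[np.searchsorted(cdf, 0.5)]

print(median_formula, np.median(sums))   # the two estimates should roughly agree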
Then, theoretically, I believe I might do better by modelling for $k$ events, each following the other one.
$$ R_{k}=\lim_{k\rightarrow\infty}\underbrace{\left(R(t)\ast\dots\ast R(t)\right)}_{k\text{ times}}, $$
and $\tau_{M_k}$ being the median of this distribution, I thought I might get what I want through dividing it by $k$ for a large $k$.
How reasonable is this? Can I use $\lim_{k\rightarrow\infty}\frac{1}{k}\tau_{M_k}$ as a reasonable replacement for "average"?