I have two problems which are quite straightforward in theory, but turned out to be unexpectedly hard for me computationally.
- Suppose that the probability density function of the independent, identically distributed random variables $x_i, ~i = 1, \ldots, N$ has asymptotics of the form $$p(x_i) \sim \frac{A}{x_i^\alpha}, \quad x_i\to\infty.$$ What are the asymptotics of the probability density function of $x = x_1 + \ldots + x_N$?
The pdf of a sum of independent random variables is the convolution of their pdfs. Let $p_\xi(x)$ be the pdf of the random variable $\xi$ and $p_\eta(x)$ the pdf of $\eta$; then the pdf of $\xi + \eta$ is $$p_{\xi+\eta}(x) = (p_\xi \ast p_\eta)(x) =\int_\mathbb{R} p_\xi(x-t)\, p_\eta(t)\, dt.$$ For $N > 2$ we can proceed by convolving the result with the next $p_i$.
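For what it's worth, the convolution can at least be checked numerically. Here is a minimal sketch (the concrete density $p(x) = 2/x^3$ for $x \ge 1$, i.e. $\alpha = 3$ and $A = 2$, is my own arbitrary choice, picked only because it normalizes to $1$): discretize the pdf on a grid, convolve it with itself, and compare the tail of the result against $N \cdot A/x^\alpha$, which is what the "one big term dominates the sum" heuristic would suggest.

```python
import numpy as np

# Hypothetical test density: Pareto-type pdf p(x) = 2/x^3 for x >= 1
# (alpha = 3, A = 2), sampled on a uniform grid.
dx = 0.02
x = np.arange(dx, 200.0, dx)               # grid, avoiding x = 0
p = np.where(x >= 1.0, 2.0 / x**3, 0.0)

# Discrete convolution approximates (p * p)(x); the factor dx accounts
# for the integration measure.  Keep the part of the 'full' output that
# lies on the original grid (up to a negligible dx offset).
p_sum = np.convolve(p, p)[: len(x)] * dx

# Tail check at x ~ 100: if p_sum(x) ~ N * A / x^alpha with N = 2, this
# ratio should be close to 1 (up to 1/x corrections and grid error).
i = np.searchsorted(x, 100.0)
ratio = p_sum[i] / (2 * 2.0 / x[i] ** 3)
print(ratio)
```

This only probes one example, of course; it says nothing about why the tail should behave that way, which is exactly what I would like to understand analytically.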
However, computing the convolution by hand seems to be a pretty hard exercise even for $N = 2$: Wolfram Mathematica returns an expression involving the hypergeometric function.
- The Lévy distribution has probability density function $$p(x) = \sqrt{\frac{\gamma}{2\pi}}\,\frac{1}{x^{3/2}}\exp\left(-\frac{\gamma}{2x}\right), \quad 0 < x < \infty.$$ What exactly does the probability density function of the sum of two independent random variables with Lévy distributions look like?
I ran into exactly the same trouble here: the closed-form answer should be given by the convolution, but I failed to evaluate the integral by hand, and so did Wolfram Mathematica (you can see my question on Mathematica SE here, with source code and a screenshot).
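One way to sanity-check a candidate answer numerically, without any convolution integral: the Lévy law is stable with index $1/2$, so a natural conjecture is that the sum of two independent Lévy($\gamma$) variables is again Lévy with scale $(\sqrt{\gamma}+\sqrt{\gamma})^2 = 4\gamma$. The sketch below (the parameter value is arbitrary) samples the sum using the standard fact that $\gamma/Z^2$ is Lévy($\gamma$) for $Z \sim N(0,1)$, and compares its empirical CDF with that conjecture at a few points.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
gamma = 1.0  # arbitrary scale parameter chosen for this experiment

# Standard fact: if Z ~ N(0,1), then gamma / Z**2 has the Levy density
# sqrt(gamma/(2*pi)) * x**(-3/2) * exp(-gamma/(2*x)).
def levy_sample(gamma, size):
    z = rng.standard_normal(size)
    return gamma / z**2

def levy_cdf(x, c):
    # CDF of the Levy law with scale c: P(X <= x) = erfc(sqrt(c/(2x)))
    return erfc(sqrt(c / (2.0 * x)))

n = 10**6
s = levy_sample(gamma, n) + levy_sample(gamma, n)

# Conjecture from 1/2-stability: the sum is Levy with scale 4*gamma.
max_err = max(abs(np.mean(s <= x) - levy_cdf(x, 4.0 * gamma))
              for x in (1.0, 5.0, 20.0))
print(max_err)
```

If the discrepancy is on the order of the Monte Carlo error, that supports the conjecture, but it is no substitute for actually evaluating the convolution (or arguing via stability/characteristic functions), which is what I am asking about.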
I would be very glad if somebody could explain how to handle these problems. Maybe there is another way that avoids computing the convolutions?
