Peace be upon you
In my project I needed to find the second moment of the difference of two Beta-distributed random variables. I have computed it and arrived at the following integral, which I need to solve analytically: \begin{align*} \int_{-1}^{1} z^2 \int_{\max(z-1,-1)}^{\min(z,0)} (z-y)^{a-1} (1-z+y)^{b-1} (-y)^{c-1} (1+y)^{d-1} \, dy \, dz \end{align*} or, equivalently, \begin{align*} &\int_{-1}^{0} z^2 \int_{-1}^{z} A(z,y;a,b,c,d) ~ dy \, dz + \int_{0}^{1} z^2 \int_{z-1}^{0} A(z,y;a,b,c,d) ~ dy \, dz,\\ &A(z,y;a,b,c,d) = (z-y)^{a-1} (1-z+y)^{b-1} (-y)^{c-1} (1+y)^{d-1}. \end{align*} I have attacked this problem with miscellaneous techniques: I have tried to divide and conquer; I have used integration by parts; I have tried to convert it to a PDE system and solve that... But it seems a little more complicated than it looks.
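As a sanity check for any candidate closed form, the double integral can at least be evaluated numerically. A sketch with SciPy's `dblquad` (the parameter values `a, b, c, d` below are arbitrary illustrations, not from the problem):

```python
from scipy.integrate import dblquad
from scipy.special import beta

# Arbitrary illustrative parameters (assumption, not from the problem)
a, b, c, d = 2.0, 3.0, 2.5, 1.5

# Integrand A(z, y; a, b, c, d) from the question
def A(y, z):
    return (z - y)**(a - 1) * (1 - z + y)**(b - 1) \
         * (-y)**(c - 1) * (1 + y)**(d - 1)

# Outer variable z on [-1, 1]; inner y from max(z-1, -1) to min(z, 0)
val, err = dblquad(lambda y, z: z**2 * A(y, z),
                   -1, 1,
                   lambda z: max(z - 1, -1.0),
                   lambda z: min(z, 0.0))

# Normalize by B(a,b) B(c,d) to get the second moment of Z
second_moment = val / (beta(a, b) * beta(c, d))
print(second_moment)
```

This gives a reference value against which an analytic expression can be compared for several parameter choices.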
Any idea for uprooting this old tree?
Note: how did I reach this integral?
We should find the density function of $X-X_2$. We know that they are Beta-distributed RVs, and the Beta distribution pdf is \begin{align*} f_{X}(x) = \frac{x^{a-1} (1-x)^{b-1}}{B(a,b)}~,~~~~x\in[0,1]. \end{align*} Also, we can represent $-X_2$ by another random variable $Y$ whose pdf is \begin{align*} f_Y (y)=\frac{(-y)^{c-1} (1+y)^{d-1}}{B(c,d)}~,~~~~y\in[-1,0]. \end{align*} We name $X-X_2$ by another random variable $Z$, and since these RVs are independent, the pdf of $Z$ is the convolution of the two pdfs above: \begin{align*} f_Z (z)&=(f_X * f_Y)(z)\\ &=\int_{-\infty}^{\infty} f_X (z-y) f_Y (y)\, dy\\ &=\int_{-\infty}^{\infty}\frac{(z-y)^{a-1} (1-z+y)^{b-1} (-y)^{c-1} (1+y)^{d-1}}{B(a,b)B(c,d)}\, dy. \end{align*} Taking the supports of the two pdfs into account, we can write \begin{align*} f_Z (z)=\frac{1}{B(a,b)B(c,d)}\int_{\max(z-1,-1)}^{\min(z,0)}(z-y)^{a-1} (1-z+y)^{b-1} (-y)^{c-1} (1+y)^{d-1}\, dy. \end{align*} But we need $\mu_2' (Z)$, which is \begin{align*} \frac{1}{B(a,b)B(c,d)}\int_{-1}^1 z^2 \int_{\max(z-1,-1)}^{\min(z,0)}(z-y)^{a-1} (1-z+y)^{b-1} (-y)^{c-1} (1+y)^{d-1}\, dy~dz. \end{align*}
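The derived $f_Z$ can be checked numerically: as a density it should integrate to 1 over $[-1,1]$. A sketch with SciPy, using arbitrary illustrative parameters (an assumption, not values from the problem):

```python
from scipy.integrate import quad
from scipy.special import beta

# Arbitrary illustrative parameters (assumption)
a, b, c, d = 2.0, 3.0, 2.5, 1.5

def f_Z(z):
    # Convolution density: inner integral over y, normalized by B(a,b) B(c,d)
    lo, hi = max(z - 1, -1.0), min(z, 0.0)
    inner, _ = quad(lambda y: (z - y)**(a - 1) * (1 - z + y)**(b - 1)
                              * (-y)**(c - 1) * (1 + y)**(d - 1), lo, hi)
    return inner / (beta(a, b) * beta(c, d))

# Integrate f_Z over its support; split at z = 0 where the limits switch form
total, _ = quad(f_Z, -1, 1, points=[0.0])
print(total)
```

If `total` is close to 1 for several parameter choices, that is good evidence the limits $\max(z-1,-1)$ and $\min(z,0)$ were set up correctly.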
For the variance one has $$\operatorname{Var}(X\pm Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) \pm 2\operatorname{Cov}(X,Y),$$ so if $X$ and $Y$ are uncorrelated, i.e. $\operatorname{Cov}(X,Y)=0$ (which does not mean independent, but if they are independent then they are uncorrelated), then $$\operatorname{Var}(X-Y) = \operatorname{Var}(X)+\operatorname{Var}(Y)$$ and hence $$E[(X-Y)^2] = \operatorname{Var}(X)+\operatorname{Var}(Y) + \left(E[X-Y]\right)^2.$$ If you look at Wikipedia, with $X\sim Beta(\alpha,\beta)$ and $Y\sim Beta(\alpha',\beta')$, then $$E[(X-Y)^2] = \frac{\alpha \beta}{(\alpha+\beta)^2(\alpha+\beta+1)}+\frac{\alpha' \beta'}{(\alpha'+\beta')^2(\alpha'+\beta'+1)} + \left(\frac{\alpha}{\alpha+\beta} - \frac{\alpha'}{\alpha'+\beta'}\right)^2.$$
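A quick Monte Carlo sanity check of this closed form (the parameter values are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (assumption)
alpha, beta_, alpha_p, beta_p = 2.0, 3.0, 2.5, 1.5

# Simulate X ~ Beta(alpha, beta), Y ~ Beta(alpha', beta') independently
n = 1_000_000
x = rng.beta(alpha, beta_, size=n)
y = rng.beta(alpha_p, beta_p, size=n)
mc = np.mean((x - y) ** 2)

# Closed form: Var(X) + Var(Y) + (E[X] - E[Y])^2
closed = (alpha * beta_ / ((alpha + beta_) ** 2 * (alpha + beta_ + 1))
          + alpha_p * beta_p / ((alpha_p + beta_p) ** 2 * (alpha_p + beta_p + 1))
          + (alpha / (alpha + beta_) - alpha_p / (alpha_p + beta_p)) ** 2)

print(mc, closed)
```

The two printed values should agree to Monte Carlo accuracy (about $1/\sqrt{n}$), which makes the formula easy to trust without solving the double integral at all.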