I have functions of the form \begin{align} I_i = \int_0^\infty F_0(x)^aF_1(x)^b(1-F_0(x))^c(1-F_1(x))^ddF_i(x)~~~~i = 0,1 \end{align}
$F_0(x)$ and $F_1(x)$ are the CDFs of the random variables $X_0, X_1$, with support $[0,\infty)$. The closed-form expressions for these CDFs are extremely complicated, so direct substitution is out of the question.
I am trying to obtain a more tractable form of this, in the shape of upper or lower bounds. Does anybody have any suggestions for tricks that could be tried? This seems a simple enough form, but I am missing something trivial.
I was trying to relate this to the total variation distance between the distributions, $TV(X_0,X_1)$. If we assume $$TV(X_0, X_1) < \epsilon,$$ then $$|F_0(x) - F_1(x)| < \epsilon, ~~~\forall x.$$ For example, to bound $I_0$, we use $F_1(x) < F_0(x) + \epsilon$ and $1-F_1(x) < (1-F_0(x)) + \epsilon$, to obtain
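(As a quick numerical sanity check of this pointwise step: $\sup_x |F_0(x)-F_1(x)|$ is the Kolmogorov distance, which is always dominated by the total variation distance. Below is a small sketch with two exponential distributions of my own choosing, standing in for the actual, complicated $F_i$.)

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical stand-ins: X_0 ~ Exp(1), X_1 ~ Exp(1.5) (my choice, not from the question)
lam0, lam1 = 1.0, 1.5
f0 = lambda x: lam0 * np.exp(-lam0 * x)   # density of X_0
f1 = lambda x: lam1 * np.exp(-lam1 * x)   # density of X_1
F0 = lambda x: 1 - np.exp(-lam0 * x)      # CDF of X_0
F1 = lambda x: 1 - np.exp(-lam1 * x)      # CDF of X_1

# Total variation distance: (1/2) * integral of |f0 - f1|
tv, _ = quad(lambda x: 0.5 * abs(f0(x) - f1(x)), 0, np.inf)

# Kolmogorov distance: sup_x |F0(x) - F1(x)|, approximated on a grid
xs = np.linspace(0, 50, 200001)
ks = np.max(np.abs(F0(xs) - F1(xs)))

print(ks, tv)  # ks <= tv, so |F0(x) - F1(x)| < eps holds whenever TV < eps
```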
\begin{align} I_0 \leq & \int_0^\infty F_0(x)^a(F_0(x)+\epsilon)^b(1-F_0(x))^c((1-F_0(x)) + \epsilon)^ddF_0(x) \\ = & \int_0^\infty F_0(x)^a\left(\sum_{j=0}^b {b \choose j}F_0(x)^j\epsilon^{b-j}\right)(1-F_0(x))^c\left(\sum_{k=0}^d {d \choose k}(1-F_0(x))^k\epsilon^{d-k}\right) dF_0(x) \\ = & \int_0^\infty \sum_{j=0}^b \sum_{k=0}^d {b \choose j} {d \choose k} \epsilon^{b+d-j-k}F_0(x)^{a+j}(1-F_0(x))^{c+k} dF_0(x) \\ = & \sum_{j=0}^b \sum_{k=0}^d {b \choose j} {d \choose k} \epsilon^{b+d-j-k}\int_0^\infty F_0(x)^{a+j}(1-F_0(x))^{c+k} dF_0(x) \\ = & \sum_{j=0}^b \sum_{k=0}^d {b \choose j} {d \choose k} \epsilon^{b+d-j-k}\mathcal B(a+j+1,c+k+1)\\ \end{align} Here $\mathcal B(m,n)$ is the Beta function; the last step uses the substitution $u = F_0(x)$ (valid when $F_0$ is continuous), so that $\int_0^\infty F_0(x)^{a+j}(1-F_0(x))^{c+k}\,dF_0(x) = \int_0^1 u^{a+j}(1-u)^{c+k}\,du = \mathcal B(a+j+1,c+k+1)$. (The binomial expansions assume $b$ and $d$ are nonnegative integers.) Since the arguments of the Beta function are nontrivial, I am unable to simplify this further. In its current form it is still not usable for me, as I would ultimately be bounding sums and differences of several such terms.
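For what it's worth, the bound above can be checked numerically. The sketch below uses a pair of exponential CDFs of my own choosing (again a stand-in, not the actual $F_i$), takes $\epsilon$ to be the observed sup-distance, and compares the double sum against direct quadrature of $I_0$:

```python
import numpy as np
from math import comb
from scipy.integrate import quad
from scipy.special import beta as B

# Hypothetical stand-ins: X_0 ~ Exp(1), X_1 ~ Exp(1.5) (my choice of example)
lam0, lam1 = 1.0, 1.5
F0 = lambda x: 1 - np.exp(-lam0 * x)
F1 = lambda x: 1 - np.exp(-lam1 * x)
f0 = lambda x: lam0 * np.exp(-lam0 * x)   # density, so dF_0(x) = f0(x) dx

# Arbitrary small nonnegative-integer exponents for the demonstration
a, b, c, d = 2, 1, 1, 2

# eps = sup_x |F0(x) - F1(x)|, approximated on a grid
xs = np.linspace(0, 50, 200001)
eps = np.max(np.abs(F0(xs) - F1(xs)))

# I_0 by direct numerical quadrature
I0, _ = quad(lambda x: F0(x)**a * F1(x)**b * (1 - F0(x))**c * (1 - F1(x))**d * f0(x),
             0, np.inf)

# Upper bound: double sum over Beta functions from the binomial expansion
bound = sum(comb(b, j) * comb(d, k) * eps**(b + d - j - k) * B(a + j + 1, c + k + 1)
            for j in range(b + 1) for k in range(d + 1))

print(I0, bound, I0 <= bound)
```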
My questions are:
Are there any bounds on the Beta function that could help simplify the expression above?
Is there a better approach to bounding $I_i$ that avoids the Beta function issue altogether?
Thanks!