Inequality involving elliptic integrals


Let $a_1,a_2,a_3 \in (0,1]$ with $a_1 \geq a_2 > a_3$, and let $b_1,b_2,b_3$ be defined by the elliptic integrals:

\begin{equation} b_j = a_1 a_2 a_3 \int^\infty_0 \frac{dx}{(a_j^2+x)\sqrt{a_1^2+x}\sqrt{a_2^2+x}\sqrt{a_3^2+x}}. \end{equation}

A useful identity is $b_1+b_2+b_3 = 2$. I want to prove the following (or at least determine whether it holds in general; I have checked several cases numerically and it seems to):

\begin{equation} \sqrt{\frac{b_1 a_1^2+b_2 a_2^2+b_3 a_3^2}{b_1^2 a_1^2 + b_2^2 a_2^2 + b_3^2 a_3^2}} \geq 1, \end{equation}

for all $a_1,a_2,a_3 \in (0,1]$. Since both sums under the square root are positive, the inequality is equivalent to $b_1(1-b_1)a_1^2 + b_2(1-b_2)a_2^2 + b_3(1-b_3)a_3^2 \geq 0$.
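For reference, here is a minimal numerical sanity check of the kind mentioned above (not a proof), using `scipy.integrate.quad` to evaluate the $b_j$ for random admissible triples $(a_1,a_2,a_3)$; the sampling range and tolerances are my own choices:

```python
import numpy as np
from scipy.integrate import quad

def b_coeffs(a):
    """b_j = a1*a2*a3 * int_0^inf dx / ((a_j^2+x) * prod_i sqrt(a_i^2+x))."""
    a = np.asarray(a, dtype=float)
    pref = np.prod(a)
    b = []
    for aj in a:
        integrand = lambda x: 1.0 / ((aj**2 + x) * np.sqrt(np.prod(a**2 + x)))
        val, _ = quad(integrand, 0, np.inf)
        b.append(pref * val)
    return np.array(b)

rng = np.random.default_rng(0)
for _ in range(200):
    # sample a1 >= a2 >= a3 in (0, 1] (bounded away from 0 for the quadrature)
    a = np.sort(rng.uniform(0.05, 1.0, 3))[::-1]
    b = b_coeffs(a)
    assert abs(b.sum() - 2.0) < 1e-6          # identity b1 + b2 + b3 = 2
    lhs = np.dot(b, a**2)                     # b1*a1^2 + b2*a2^2 + b3*a3^2
    rhs = np.dot(b**2, a**2)                  # b1^2*a1^2 + b2^2*a2^2 + b3^2*a3^2
    assert lhs >= rhs - 1e-10                 # the conjectured inequality
```

For the sphere $a_1=a_2=a_3$ the integral is elementary and gives $b_j = 2/3$ for each $j$, so equality-adjacent cases can be checked against a closed form.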

Does this hold or not, and why? For my research in physics it would be useful to have a mathematical argument rather than just a check of some cases.