So, I've got a real doozy of a question. I'm trying to provide a proof for the relationship between the major and minor radii ($a$ and $b$, respectively) of an ellipse of constant circumference as the eccentricity $k=\sqrt{1-\frac{b^2}{a^2}}$ changes. The relationship I determined is as follows:
$$ \pi^b=\frac{(2\pi-2)a+2-\pi^2}{2-\pi} $$
I determined this relationship computationally by minimizing the following residual function (holding the circumference fixed at $2\pi$, the circumference of the unit circle): $$ R=2\pi-2\pi a\left(1-\sum_{n=1}^\infty \left(\frac{(2n-1)!!}{(2n)!!}\right)^2\frac {k^{2n}}{2n-1}\right) $$ The residual function was derived from the equation for the circumference of an ellipse:
$$ C=4aE(k), $$
where $E(k)$ is the complete elliptic integral of the second kind. The complete elliptic integral of the second kind may be represented by the series:
$$ E(k)=\frac \pi 2\left(1-\sum_{n=1}^\infty \left(\frac{(2n-1)!!}{(2n)!!}\right)^2 \frac{k^{2n}}{2n-1}\right) $$
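As a quick numerical sanity check on the truncated series, here's a pure-Python sketch (the name `E_series` and the 400-term cutoff are my own choices), compared against the exact value $E(0)=\pi/2$:

```python
import math

def E_series(k, terms=400):
    """E(k), complete elliptic integral of the second kind,
    via the double-factorial series above (sum starts at n = 1)."""
    total = 0.0
    coeff = 1.0  # running product for ((2n-1)!!/(2n)!!)**2
    for n in range(1, terms + 1):
        coeff *= ((2 * n - 1) / (2 * n)) ** 2
        total += coeff * k ** (2 * n) / (2 * n - 1)
    return (math.pi / 2) * (1 - total)

print(E_series(0.0))  # circle: exactly pi/2
print(E_series(0.5))
```

One caveat: the series converges slowly as $k\to 1$, which matters when $b\ll a$.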
So, the residual function is given by: $$ R=2\pi-4aE(k) $$
I minimized the function by assigning a value to $b$ and then performing a line search for the optimal value of $a$. I did this for several values of $b$ and then plotted $a$ versus $b$. My question is whether there is some way of proving the relationship between $a$ and $b$ above.
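A minimal sketch of that procedure (the helper names are my own, and bisection stands in for the line search, using the fact that the circumference is increasing in $a$):

```python
import math

def E_series(k, terms=400):
    """E(k) via the double-factorial series (sum starts at n = 1)."""
    total, coeff = 0.0, 1.0
    for n in range(1, terms + 1):
        coeff *= ((2 * n - 1) / (2 * n)) ** 2
        total += coeff * k ** (2 * n) / (2 * n - 1)
    return (math.pi / 2) * (1 - total)

def circumference(a, b):
    return 4 * a * E_series(math.sqrt(1 - (b / a) ** 2))

def solve_a(b, tol=1e-12):
    """Find a >= b with circumference 2*pi by bisection.
    The root is bracketed by [b, 2]: C(b, b) = 2*pi*b <= 2*pi
    and C(2, b) >= 8 > 2*pi."""
    lo, hi = b, 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if circumference(mid, b) < 2 * math.pi:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(solve_a(1.0))  # circle: a = 1
print(solve_a(0.5))
```

The two endpoints pin down the curve: $a=1$ at $b=1$ (circle) and $a=\pi/2$ at $b=0$ (degenerate ellipse with $C=4a$), which is consistent with the fitted formula above.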
One method I've pursued, at another's suggestion, is to determine $\frac {db}{da}$ and then separate and integrate. So I went through the following steps:
$$ \frac {dC}{da}=0=4E(k)\frac d{da}(a)+4a\frac d{da}(E(k)) $$ $$ \frac {dC}{da}=0=4E(k)+4a\frac {dE(k)}{dk}\frac {dk}{da} $$ Given $ k=\sqrt{1-\frac{b^2}{a^2}} $ and $ \frac{dE(k)}{dk}=\frac{E(k)-K(k)}k $, where $K(k)$ is the complete elliptic integral of the first kind, it can be shown that $$ 0=4E(k)+4a\frac {E(k)-K(k)}{\sqrt{1-\frac{b^2}{a^2}}}\frac d{da}\left(\sqrt{1-\frac{b^2}{a^2}}\right) $$ If one considers $b$ to be a function of $a$, implicit differentiation gives $$ \frac d{da}\left(\sqrt{1-\frac{b^2}{a^2}}\right)=\frac{\frac{b^2}{a^3}-\frac b{a^2}\frac{db}{da}}{\sqrt{1-\frac{b^2}{a^2}}} $$ The problem is, when I isolate $\frac{db}{da}$, I get an ODE that doesn't seem to be solvable by any ordinary means (separation of variables, integrating factors, etc.): $$ \frac{db}{da}=\frac ab\left(\frac{E(k)}{E(k)-K(k)}\right)-\frac ba\left(\frac{K(k)}{E(k)-K(k)}\right) $$
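As a sanity check on the ODE (the signs in the implicit differentiation are easy to get backwards), here's a pure-Python sketch with helpers of my own naming: it solves the constraint $4aE(k)=2\pi$ for $b$ at nearby values of $a$ and compares a centered difference for $db/da$ against $\frac ab\frac{E}{E-K}-\frac ba\frac{K}{E-K}$. For $K(k)$ it uses the standard companion series $K(k)=\frac\pi2\left(1+\sum_{n=1}^\infty\left(\frac{(2n-1)!!}{(2n)!!}\right)^2 k^{2n}\right)$:

```python
import math

def E_series(k, terms=600):
    """E(k) via the double-factorial series (sum starts at n = 1)."""
    total, coeff = 0.0, 1.0
    for n in range(1, terms + 1):
        coeff *= ((2 * n - 1) / (2 * n)) ** 2
        total += coeff * k ** (2 * n) / (2 * n - 1)
    return (math.pi / 2) * (1 - total)

def K_series(k, terms=600):
    """K(k), complete elliptic integral of the first kind."""
    total, coeff = 0.0, 1.0
    for n in range(1, terms + 1):
        coeff *= ((2 * n - 1) / (2 * n)) ** 2
        total += coeff * k ** (2 * n)
    return (math.pi / 2) * (1 + total)

def solve_b(a, tol=1e-13):
    """For fixed a, find b in [0, a] with 4*a*E(k) = 2*pi
    (the circumference is increasing in b)."""
    lo, hi = 0.0, a
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        k = math.sqrt(1.0 - (mid / a) ** 2)
        if 4.0 * a * E_series(k) < 2.0 * math.pi:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

a0, h = 1.2, 1e-5
b0 = solve_b(a0)
k0 = math.sqrt(1.0 - (b0 / a0) ** 2)
E0, K0 = E_series(k0), K_series(k0)

fd = (solve_b(a0 + h) - solve_b(a0 - h)) / (2.0 * h)  # slope along the curve
rhs = (a0 / b0) * E0 / (E0 - K0) - (b0 / a0) * K0 / (E0 - K0)
print(fd, rhs)
```

The two numbers agreeing (and both being negative, as they must be when $b$ shrinks while $a$ grows) is a useful way to confirm the sign conventions before attacking the ODE analytically.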
Does anyone have any ideas about how I should proceed? All input is greatly appreciated. Thanks!