Take:
$$a_0=x,~~~~b_0=y$$
$$a_{n+1}=\frac{a_n+\sqrt{a_nb_n}}{2},~~~~b_{n+1}=\frac{b_n+\sqrt{a_nb_n}}{2}$$
Then we obtain as a limit the logarithmic mean of $x,y$:
$$\lim_{n \to \infty} a_n=\lim_{n \to \infty} b_n=\frac{x-y}{\ln x-\ln y}$$
I don't know how to prove this, but numerically it fits really well.
In fact, the best approximation is obtained by taking the geometric mean of $a_n$ and $b_n$:
$$x=5,~~~~y=3$$
$$\begin{array}{ccc} n & \sqrt{a_nb_n} & \frac{x-y}{\ln x-\ln y} \\ 4 & \color{blue}{3.915}0640985032 & 3.9152303779424 \\ 10 & \color{blue}{3.915230}33734566 & 3.9152303779424 \\ 20 & \color{blue}{3.9152303779424} & 3.9152303779424 \end{array}$$
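The table above can be reproduced with a short numerical sketch (plain Python floats, function name `log_mean_iteration` is mine):

```python
from math import sqrt, log

def log_mean_iteration(x, y, n):
    """Run n steps of a_{k+1}=(a_k+sqrt(a_k b_k))/2,
    b_{k+1}=(b_k+sqrt(a_k b_k))/2 from a_0=x, b_0=y."""
    a, b = x, y
    for _ in range(n):
        g = sqrt(a * b)
        a, b = (a + g) / 2, (b + g) / 2
    return a, b

x, y = 5.0, 3.0
target = (x - y) / (log(x) - log(y))  # logarithmic mean of x and y
for n in (4, 10, 20):
    a, b = log_mean_iteration(x, y, n)
    print(n, sqrt(a * b), target)
```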
The difference is in fact halved exactly at each step, since the $\sqrt{a_nb_n}$ terms cancel:
$$\frac{a_{n+1}-b_{n+1}}{a_n-b_n}=\frac{1}{2}$$
This seems like a very simple way to compute logarithms, for example:
$$x=2,~~~~y=1$$
$$\ln2=\lim_{n \to \infty}\frac{1}{\sqrt{a_nb_n}}$$
How do I prove that the limit of this sequence is really the logarithmic mean?
Edit
It turns out this algorithm is mentioned in (at least) two papers by B. C. Carlson as early as 1971:
https://www.jstor.org/stable/2317088
https://www.jstor.org/stable/2317754
Still, if someone can provide their own proof, I would be grateful.
You have several exact identities, such as $$ a_{n+1}-b_{n+1}=\frac{a_n-b_n}2\implies a_n-b_n=2^{-n}(a_0-b_0), \\ \frac{a_{n+1}}{b_{n+1}}=\sqrt{\frac{a_n}{b_n}}\implies \frac{a_n}{b_n}=\left(\frac{a_0}{b_0}\right)^{2^{-n}} $$ The first tells us that if one of the sequences has a limit, the other has the same limit. Combining the two, $b_n$ can be eliminated to get, via the mean value theorem, or via $N(\sqrt[N]x-1)\to \ln x$ as $N\to \infty$, $$ a_n=\frac{2^{-n}(a_0-b_0)}{1-\left(\frac{b_0}{a_0}\right)^{2^{-n}}} =\frac{a_0-b_0}{\ln a_0-\ln b_0+O(2^{-n})} $$
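The two identities and the resulting closed form for $a_n$ can be checked numerically (an illustration, not part of the proof; the starting values $5,3$ and step count $12$ are arbitrary):

```python
from math import sqrt

a0, b0 = 5.0, 3.0
a, b = a0, b0
n = 12
for _ in range(n):
    g = sqrt(a * b)
    a, b = (a + g) / 2, (b + g) / 2

# a_n - b_n = 2^{-n} (a_0 - b_0): the difference halves each step
assert abs((a - b) - 2**-n * (a0 - b0)) < 1e-12
# a_n / b_n = (a_0 / b_0)^{2^{-n}}: the ratio takes a square root each step
assert abs(a / b - (a0 / b0) ** 2**-n) < 1e-12
# closed form obtained by eliminating b_n from the two identities
assert abs(a - 2**-n * (a0 - b0) / (1 - (b0 / a0) ** 2**-n)) < 1e-12
print("identities verified")
```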