Given a sphere with two great circles.
At their point of greatest separation, the distance between the two circles is small compared to the radius of the sphere.
How do I calculate how fast the two circles converge?
First approximation: they converge linearly. We know that 90 degrees around the sphere they intersect, so at $n$ degrees from the starting point the circles have converged by $n/90$ of the original distance.
Second approximation: the circles visually seem to converge most rapidly near the intersection, so my second supposition is that they converge with the cosine of the angle. At $n$ degrees from the starting point the distance would be $\cos(n)$ times the original distance.
But I can't prove this. I can come up with several other, less simple formulas that also go from 1 to 0 over 90 degrees.
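To see how differently the two candidate formulas behave, here is a quick numeric sketch (the function names are my own, just for illustration) comparing the linear and cosine models for a unit starting gap:

```python
import math

def linear_model(d0, n):
    """Linear convergence: the gap shrinks by n/90 of d0 at n degrees."""
    return d0 * (1 - n / 90)

def cosine_model(d0, n):
    """Cosine convergence: the gap at n degrees is d0 * cos(n)."""
    return d0 * math.cos(math.radians(n))

# Compare the two models for a starting gap of 1.
for n in (0, 30, 45, 60, 90):
    print(f"{n:2d} deg  linear {linear_model(1, n):.4f}  cosine {cosine_model(1, n):.4f}")
```

Both start at 1 and end at 0, but the cosine model stays near 1 for much longer and then drops steeply near the intersection, which matches the visual impression described above.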
This actually comes up as a practical problem.
The floor in my kitchen is dished. To first order, I've got a 24 x 15 foot rectangle on the surface of a sphere of radius 432 feet.
I want to use laminate flooring, and I'm trying to figure out the extra width of the crack in the middle of the floor. If it amounts to only a few thousandths of an inch, I won't worry about it.
One way to model this is to put your floor on the sphere with the $24$ foot dimension running east-west along the equator and the $15$ foot dimension running north-south, centered on the equator. We assume the $24$ foot dimension is measured exactly along lines of latitude and ask what the corresponding distance at the equator is. The latitude of the sides of the floor is $\frac {7.5}{432}\approx 0.01736$ radians (using the small-angle approximation $\sin x \approx x$), which is about $1^\circ$. The $24$ feet measured at $1^\circ$ latitude becomes $24.0036$ feet at the equator, an additional $0.043$ inches.
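The arithmetic above can be checked numerically. The key fact is that a circle of latitude $\varphi$ has circumference $\cos\varphi$ times that of the equator, so an arc of $24$ feet at latitude $\varphi$ spans the same longitude as $24/\cos\varphi$ feet at the equator. A sketch with the numbers from the answer (variable names are my own):

```python
import math

R = 432.0          # sphere radius, feet
half_width = 7.5   # half of the 15 ft north-south dimension, feet
east_west = 24.0   # east-west dimension, measured along the latitude line, feet

lat = half_width / R                  # latitude of the floor's edge, radians
# The 24 ft arc at latitude `lat` subtends a longitude of east_west / (R*cos(lat));
# the same longitude span along the equator has length east_west / cos(lat).
equator_len = east_west / math.cos(lat)
extra_in = (equator_len - east_west) * 12.0   # extra width, in inches

print(f"latitude     = {math.degrees(lat):.3f} degrees")
print(f"equator span = {equator_len:.4f} feet")
print(f"extra width  = {extra_in:.4f} inches")
```

This reproduces the figures in the answer: about $24.0036$ feet at the equator, roughly $0.043$ inches of extra width.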