Investigating the shortest line segment connecting two disjoint ellipses


Suppose you have the following two ellipses

$ (r_1 - C_1)^T Q_1 (r_1 - C_1) = 1 \tag{1} $

$ (r_2 - C_2)^T Q_2 (r_2 - C_2) = 1 \tag{2}$

which are disjoint (i.e. they have no points in common). I'd like to find $r_1$ and $r_2$ such that $\| r_1 - r_2 \|$ is minimized.

My approach

There are $4$ unknowns here. The two ellipse equations provide the first two relations; in addition, at the closest points the vector $r_1 - r_2$ must be along the gradient of the first ellipse as well as the gradient of the second, that is

$ r_1 - r_2 = K_1 Q_1 (r_1 - C_1) = K_2 Q_2 (r_2 - C_2 ) \tag{3} $

To eliminate $K_1$ and $K_2$, we can take the planar cross product of $r_1 - r_2$ with each of the two gradient directions (which must vanish, since the vectors are parallel), resulting in

$ (r_1 - r_2)^T E Q_1 (r_1 - C_1) = (r_1 - r_2)^T E Q_2 (r_2 - C_2) = 0 \tag{4}$

where $E = e_1 e_2^T - e_2 e_1^T $, and $e_1 = [1, 0]^T, e_2 = [0, 1]^T $.

These are four quadratic equations in the four coordinates of $r_1, r_2$.
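As a sanity check, equations $(1)$, $(2)$, and $(4)$ can be handed to a generic root finder. The sketch below uses a hypothetical pair of axis-aligned ellipses (centers on the $x$-axis, chosen so the closest points $(2,0)$ and $(4.5,0)$ at distance $2.5$ can be verified by inspection), with `scipy.optimize.fsolve` standing in for a hand-rolled Newton iteration:

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical example: axis-aligned ellipses chosen so the answer is easy to check.
# Ellipse 1: center (0,0), semi-axes 2 and 1; ellipse 2: center (6,0), semi-axes 1.5 and 1.
C1, Q1 = np.array([0.0, 0.0]), np.diag([0.25, 1.0])
C2, Q2 = np.array([6.0, 0.0]), np.diag([1.0 / 2.25, 1.0])
E = np.array([[0.0, 1.0], [-1.0, 0.0]])  # E = e1 e2^T - e2 e1^T

def residuals(z):
    r1, r2 = z[:2], z[2:]
    d = r1 - r2
    return [(r1 - C1) @ Q1 @ (r1 - C1) - 1.0,  # eq. (1)
            (r2 - C2) @ Q2 @ (r2 - C2) - 1.0,  # eq. (2)
            d @ E @ Q1 @ (r1 - C1),            # eq. (4), first condition
            d @ E @ Q2 @ (r2 - C2)]            # eq. (4), second condition

# Initial guess near the facing sides of the two ellipses
z = fsolve(residuals, np.array([2.0, 0.1, 4.5, 0.1]))
r1, r2 = z[:2], z[2:]
print(r1, r2, np.linalg.norm(r1 - r2))
```

Note that the system's roots include the maximum-distance and saddle configurations as well, so the initial guess determines which critical point is found.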

Alternatively, we could work with eccentric angles and have

$ r_1 = C_1 + V_1 u_1 \tag{5}$

$ r_2 = C_2 + V_2 u_2 \tag{6}$

where the columns of $V_1, V_2$ are the semi-axis vectors of the two ellipses (so that $Q_i = (V_i V_i^T)^{-1}$), and $u_1 = [\cos t_1, \sin t_1]^T$, $u_2 = [\cos t_2, \sin t_2]^T$. Our variables are now two, namely $t_1$ and $t_2$.

Substituting into $(4)$, the two tangency conditions become

$ (C_1 - C_2 + V_1 u_1 - V_2 u_2)^T E Q_1 V_1 u_1 = 0\tag{7} $

and

$ (C_1 - C_2 + V_1 u_1 - V_2 u_2)^T E Q_2 V_2 u_2 = 0 \tag{8}$

This leads to two quadratic trigonometric equations in the angles $t_1$ and $t_2$. Alternatively, we can view it as a system in four variables, $x_1 = \cos t_1, x_2 = \sin t_1, x_3 = \cos t_2, x_4 = \sin t_2$, where in addition to the two quadratic equations above we have

$x_1^2 + x_2^2 = 1\tag{9}$

and

$x_3^2 + x_4^2 = 1\tag{10}$
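Concretely, the system $(7)$–$(10)$ can be packaged as a residual map $F:\mathbb{R}^4 \to \mathbb{R}^4$. The sketch below uses a hypothetical pair of axis-aligned ellipses for which, by symmetry, the closest points occur at $t_1 = 0$, $t_2 = \pi$, and checks that the residuals vanish there:

```python
import numpy as np

# Hypothetical axis-aligned test ellipses:
C1, V1 = np.array([0.0, 0.0]), np.diag([2.0, 1.0])   # center (0,0), semi-axes 2, 1
C2, V2 = np.array([6.0, 0.0]), np.diag([1.5, 1.0])   # center (6,0), semi-axes 1.5, 1
Q1, Q2 = np.linalg.inv(V1 @ V1.T), np.linalg.inv(V2 @ V2.T)
E = np.array([[0.0, 1.0], [-1.0, 0.0]])

def F(x):
    """Residuals of equations (7)-(10) in x = (x1, x2, x3, x4)."""
    u1, u2 = x[:2], x[2:]
    d = C1 - C2 + V1 @ u1 - V2 @ u2
    return np.array([d @ E @ Q1 @ V1 @ u1,   # eq. (7)
                     d @ E @ Q2 @ V2 @ u2,   # eq. (8)
                     u1 @ u1 - 1.0,          # eq. (9)
                     u2 @ u2 - 1.0])         # eq. (10)

# Closest points at t1 = 0, t2 = pi, i.e. x = (1, 0, -1, 0):
x_star = np.array([1.0, 0.0, -1.0, 0.0])
print(F(x_star))  # all four residuals are zero at the solution
```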

The solutions of the above equations correspond to the minimum distance, the maximum distance, or saddle points. To find them all, we can scan the initial guess over $t_1, t_2 \in [0, 2\pi]$, or, using the second version of the equations, scan directly over $x_1, x_2, x_3, x_4$ as the cosines and sines of $t_1, t_2$. For example, if we sample the interval $[0, 2\pi]$ with $N = 50$ points per angle, then we have to run the search method (for example, the multivariate Newton-Raphson method) $N^2 = 2500$ times to find all possible solutions. In the absence of a closed-form solution for a $4 \times 4$ quadratic system of equations, this is the only path I can think of.
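A minimal sketch of this scan-then-refine procedure, on a hypothetical pair of axis-aligned test ellipses whose closest points $(2,0)$ and $(4.5,0)$ are known by inspection; `scipy.optimize.fsolve` stands in for a hand-rolled multivariate Newton-Raphson:

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical test ellipses (axis-aligned, centers on the x-axis):
C1, V1 = np.array([0.0, 0.0]), np.diag([2.0, 1.0])
C2, V2 = np.array([6.0, 0.0]), np.diag([1.5, 1.0])
Q1, Q2 = np.linalg.inv(V1 @ V1.T), np.linalg.inv(V2 @ V2.T)
E = np.array([[0.0, 1.0], [-1.0, 0.0]])

def F(x):
    """Residuals of equations (7)-(10)."""
    u1, u2 = x[:2], x[2:]
    d = C1 - C2 + V1 @ u1 - V2 @ u2
    return [d @ E @ Q1 @ V1 @ u1,  # eq. (7)
            d @ E @ Q2 @ V2 @ u2,  # eq. (8)
            u1 @ u1 - 1.0,         # eq. (9)
            u2 @ u2 - 1.0]         # eq. (10)

# Phase 1: scan N x N samples of (t1, t2), keep the minimum-distance pair.
N = 50
t = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
best, x0 = np.inf, None
for t1 in t:
    for t2 in t:
        u1 = np.array([np.cos(t1), np.sin(t1)])
        u2 = np.array([np.cos(t2), np.sin(t2)])
        dist = np.linalg.norm(C1 - C2 + V1 @ u1 - V2 @ u2)
        if dist < best:
            best, x0 = dist, np.concatenate([u1, u2])

# Phase 2: refine the best sample with a Newton-type root finder.
x = fsolve(F, x0)
r1, r2 = C1 + V1 @ x[:2], C2 + V2 @ x[2:]
print(r1, r2, np.linalg.norm(r1 - r2))
```

Seeding the root finder only from the minimum-distance sample finds the minimum; to enumerate all critical points one would seed from every grid sample, as described above.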

An alternative would be iterative minimization of the distance using an optimization method, the simplest possibly being steepest descent, where we define our objective function as the squared distance

$ f (t_1, t_2) = \| C_1 - C_2 + V_1 u_1(t_1) - V_2 u_2(t_2) \|^2 \tag{11}$

It follows that the gradient of this function is:

$ \dfrac{\partial f}{\partial t_1} = 2 (C_1 - C_2 + V_1 u_1(t_1) - V_2 u_2(t_2) )^T ( V_1 u'_1(t_1) ) \tag{12} $

and

$\dfrac{\partial f}{\partial t_2} = -2 (C_1 - C_2 + V_1 u_1(t_1) - V_2 u_2(t_2) )^T (V_2 u'_2(t_2) ) \tag{13}$

where

$u'_1(t_1) = [ - \sin(t_1), \cos(t_1) ]^T , u'_2(t_2) = [-\sin(t_2), \cos(t_2)]^T \tag{14}$

To help improve convergence chances, we can run a pre-optimization phase to select the best initial guess of $u_1$ and $u_2$ by sampling the interval $[0, 2 \pi]$ for $t_1, t_2$, and then selecting the initial guess that yields the minimum $f$ among the sampled points.
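A fixed-step steepest-descent sketch of this pre-scan-then-descend scheme, on a hypothetical pair of axis-aligned test ellipses (minimum distance $2.5$, attained at $t_1 = 0$, $t_2 = \pi$). The pre-scan is made deliberately coarse so the descent phase has visible work to do; note the minus sign on the $t_2$ component, since $u_2$ enters the objective with a negative sign:

```python
import numpy as np

# Hypothetical test ellipses:
C1, V1 = np.array([0.0, 0.0]), np.diag([2.0, 1.0])
C2, V2 = np.array([6.0, 0.0]), np.diag([1.5, 1.0])

def d_vec(t1, t2):
    u1 = np.array([np.cos(t1), np.sin(t1)])
    u2 = np.array([np.cos(t2), np.sin(t2)])
    return C1 - C2 + V1 @ u1 - V2 @ u2        # inner vector of eq. (11)

def grad(t1, t2):
    d = d_vec(t1, t2)
    du1 = np.array([-np.sin(t1), np.cos(t1)])  # eq. (14)
    du2 = np.array([-np.sin(t2), np.cos(t2)])
    return np.array([ 2.0 * d @ (V1 @ du1),    # eq. (12)
                     -2.0 * d @ (V2 @ du2)])   # eq. (13), note the minus sign

# Coarse pre-scan for the initial guess (N = 9 so the grid misses the optimum).
ts = np.linspace(0.0, 2.0 * np.pi, 9, endpoint=False)
t1, t2 = min(((a, b) for a in ts for b in ts),
             key=lambda p: np.linalg.norm(d_vec(*p)))

# Fixed-step steepest descent on the squared distance.
step = 0.05
for _ in range(2000):
    g = grad(t1, t2)
    t1, t2 = t1 - step * g[0], t2 - step * g[1]

print(t1, t2, np.linalg.norm(d_vec(t1, t2)))
```

The fixed step size `0.05` is an assumption tuned to this example's curvature; a line search or a quasi-Newton method would be more robust in general.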

Update:

I've implemented the first method, which uses equations $(7),(8),(9),(10)$, for two arbitrary disjoint ellipses. I used the multivariate Newton-Raphson method to solve the quadratic $4\times4$ system of equations. Before invoking the method, I obtained a good estimate of the unknowns vector $X = [x_1, x_2, x_3, x_4]$ as defined above by sampling $[t_1, t_2]$ over $[0, 2\pi]$ with $N = 30$, for a total of $900$ function evaluations. The vector $X$ giving the minimum $f$ over these samples was my initial guess for the multivariate Newton-Raphson iteration, which converged very quickly (only $3$ iterations) to the solution. The result is shown graphically below. The green lines are the tangents to the ellipses at the points achieving the minimum distance. As the figure shows, the tangents are perpendicular to the difference vector, which means the normals to the two ellipses are along this difference vector.

[Figure: the two ellipses, the minimum-distance segment, and the tangent lines (green) at the closest points.]


There is 1 best solution below


Just solve the following convex optimization problem:

$\min \, \| r_1 - r_2 \|^2$

subject to:

$ (r_1 - C_1)^T Q_1 (r_1 - C_1) \le 1$

$ (r_2 - C_2)^T Q_2 (r_2 - C_2) \le 1,$

which is a convex quadratically constrained problem, using an appropriate optimization solver. Since the ellipses are disjoint, the optimal $r_1$ and $r_2$ lie on the boundaries, so relaxing the equalities to inequalities does not change the answer.
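A minimal sketch of this formulation, assuming a hypothetical pair of axis-aligned example ellipses (closest points $(2,0)$ and $(4.5,0)$, distance $2.5$) and using `scipy.optimize.minimize` with SLSQP as a stand-in for a dedicated convex solver:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example ellipses:
C1, Q1 = np.array([0.0, 0.0]), np.diag([0.25, 1.0])
C2, Q2 = np.array([6.0, 0.0]), np.diag([1.0 / 2.25, 1.0])

def objective(z):
    d = z[:2] - z[2:]
    return d @ d  # squared distance ||r1 - r2||^2

# Inequality constraints: each point must lie inside (or on) its ellipse.
constraints = [
    {"type": "ineq", "fun": lambda z: 1.0 - (z[:2] - C1) @ Q1 @ (z[:2] - C1)},
    {"type": "ineq", "fun": lambda z: 1.0 - (z[2:] - C2) @ Q2 @ (z[2:] - C2)},
]

# Feasible starting point: one interior point per ellipse.
res = minimize(objective, x0=np.array([0.0, 0.5, 6.0, 0.5]),
               method="SLSQP", constraints=constraints)
r1, r2 = res.x[:2], res.x[2:]
print(r1, r2, np.linalg.norm(r1 - r2))
```

Because the problem is convex, any local solver started from a feasible point finds the global minimum, with none of the multi-start bookkeeping the root-finding approach requires.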