Finding the angle of the tangent of an ellipse from the intersecting point


So this should be a simple problem, but I must be doing something wrong and the tangents are ending up completely wrong.

I have a four-parameter ellipse centered at $[0, 0]$:

$$ x^2 / a^2 + y^2 / b^2 = 1 $$

A tangent line touches the ellipse at the point $[u, v]$ and has the equation:

$$ \frac{u}{a^2} x + \frac{v}{b^2} y = 1 $$
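As a quick sanity check on this equation, the point of tangency itself must satisfy it. A minimal sketch (the values $a = 5$, $b = 3$ and the parameter $t$ are made up for illustration):

```python
import math

# Example semi-axes (not from the post)
a, b = 5.0, 3.0

# A point on the ellipse, parameterized as (a*cos(t), b*sin(t))
t = 0.7
u, v = a * math.cos(t), b * math.sin(t)

# Plug (x, y) = (u, v) into the tangent equation (u/a^2)*x + (v/b^2)*y = 1;
# it reduces to u^2/a^2 + v^2/b^2 = cos^2(t) + sin^2(t) = 1
lhs = (u / a**2) * u + (v / b**2) * v
print(abs(lhs - 1.0) < 1e-12)  # True
```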

I would like to find the angle of the tangent. My approach is to find the slope $m$, which gives $\tan(\theta)$ for the angle $\theta$ between the line and the $x$ axis.

The slope-intercept form of a line is:

$$ y = m x + c $$

with $m$ the slope and $c$ the constant term.

I rearrange the tangent equation into the same form, then substitute to identify $m$.

$$ \frac{v}{b^2} y = 1 - \frac{u}{a^2} x $$

Divide by $\frac{v}{b^2}$:

$$ y = \frac{1}{\frac{v}{b^2}} - \frac{u/a^2}{v/b^2} x $$

Substitute $ m = -\frac{u/a^2}{v/b^2} $ (note the minus sign carried over from the rearrangement) and $ c = \frac{1}{v/b^2} $.

Now I should have the formula to compute the slope of the tangent for any concrete point of tangency $[u, v]$ on an ellipse with semi-major and semi-minor axes $a$ and $b$, centered at $[0, 0]$.
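The derivation above can be sketched numerically; the concrete values $a = 5$, $b = 3$, $t = 0.7$ are assumptions for illustration, and the result is cross-checked against the derivative of the standard parametrization $(a\cos t, b\sin t)$:

```python
import math

# Example values (not from the post)
a, b = 5.0, 3.0
t = 0.7
u, v = a * math.cos(t), b * math.sin(t)  # a point on the ellipse

# Slope of the tangent at (u, v); the minus sign comes from moving the
# x term across the equals sign in the rearrangement
m = -(u / a**2) / (v / b**2)
theta = math.atan(m)  # angle between the tangent and the x axis

# Cross-check: differentiating the parametrization gives
# dy/dx = (b*cos t) / (-a*sin t)
m_param = (b * math.cos(t)) / (-a * math.sin(t))
print(abs(m - m_param) < 1e-12)  # True
```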

However, the formula is wrong: it gives the right answers for a circle (when $a = b$), but nonsensical answers for an ellipse with nonzero eccentricity.

I've been staring at this for too long, and I can't see where I made the mistake. I'd really appreciate it if someone could point it out to me. Thanks!

There is 1 best solution below


Well, I figured out the problem. The ellipse was not centered at $[0, 0]$; that was where the focus was. By translating the points by $-C$ before computing the slope, it now gives the right answers.
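The fix can be sketched like this, with a hypothetical center $C = (c_x, c_y)$ and made-up axis lengths: translate the point of tangency into the ellipse's own frame, apply the slope formula there, and note that the angle needs no back-translation since slopes are translation-invariant.

```python
import math

# Hypothetical setup: the same ellipse, but centered at C = (cx, cy)
a, b = 5.0, 3.0
cx, cy = 2.0, -1.0

t = 0.7
u, v = cx + a * math.cos(t), cy + b * math.sin(t)  # tangency point, world coords

# Translate by -C into the ellipse's own frame before using the slope formula
du, dv = u - cx, v - cy
m = -(du / a**2) / (dv / b**2)
theta = math.atan(m)  # translation does not change angles, so this is final

# Cross-check against the parametric derivative (also translation-invariant)
m_param = -(b * math.cos(t)) / (a * math.sin(t))
print(abs(m - m_param) < 1e-9)  # True
```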

Linear algebra is just a series of gotchas.