I have two arbitrary points (A and B) in a plane, and I'm working in Cartesian coordinates.
Each of these points has an associated arbitrary unit direction vector: from point A I want to travel in direction U, and from point B I want to travel in direction V.
I would like to travel out from both of these points, following a circular path that is tangent to the associated direction vector at that point.
There may be one circle (in degenerate cases) or two; where there are two, they should meet at a point where their tangents match, so that the resulting curve between the two points is everywhere differentiable.
Usually I want the radii of these circles to match, but ideally I could specify a ratio between them.
This is very similar to this question: Connecting two tangents with two circles of equal radius
Except for the addition of the radius ratio.
My approach for solving this so far has been:
- The circle associated with point A will have a center that lies on a line orthogonal to U, the direction vector associated with A.
- The circle associated with point B will similarly have a center that lies on a line orthogonal to V, the vector direction associated with B.
- Solve for the radius: with unit normals n_U orthogonal to U and n_V orthogonal to V, the centers are A + r*n_U and B + r*n_V, and I require the distance between them to equal 2r so the circles touch: |(A + r*n_U) - (B + r*n_V)| = 2r.
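For reference, here is the condition written out explicitly, under my assumptions: n_U and n_V are unit normals to U and V, the circle at B has radius k*r (k being the desired ratio), and the two circles are externally tangent:

$$C_A = A + r\,n_U, \qquad C_B = B + k r\,n_V, \qquad \lVert C_A - C_B \rVert = (1 + k)\,r.$$

Letting $D = A - B$ and $w = n_U - k\,n_V$, squaring both sides gives a quadratic in $r$:

$$\bigl(\lVert w \rVert^2 - (1+k)^2\bigr)\, r^2 + 2\,(D \cdot w)\, r + \lVert D \rVert^2 = 0.$$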
However there are a couple problems with this approach:
- I don't think this will account for the ratio constraint (I'm not sure if that CAN be enforced)
- I'm not sure the choice of direction along each orthogonal vector is mathematically consistent: for instance, if I travel -r units along one normal but +r units along the other, will the quadratic still solve correctly for r?
That's not to mention that I've apparently messed up the programming and it's not giving me the answers I expect... I'll keep working on that, but I would really appreciate input on the two points above.
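For what it's worth, here is a minimal sketch of the solve I'm attempting. It assumes external tangency, unit-length U and V, and a ratio k (radius k*r at B); the names `perp` and `tangent_circle_pairs` are just mine. It tries all four sign choices for the normals, since either center can sit on either side of its tangent line, which is my attempt at handling the second point:

```python
import numpy as np

def perp(u):
    """Rotate a 2D vector 90 degrees counter-clockwise."""
    return np.array([-u[1], u[0]])

def tangent_circle_pairs(A, U, B, V, k=1.0, eps=1e-12):
    """Candidate (r, center_A, center_B) triples: a circle of radius r
    through A tangent to U, a circle of radius k*r through B tangent
    to V, the two circles externally tangent to each other."""
    A, U, B, V = map(np.asarray, (A, U, B, V))
    out = []
    for sa in (1.0, -1.0):          # side of U the first center lies on
        for sb in (1.0, -1.0):      # side of V the second center lies on
            nA, nB = sa * perp(U), sb * perp(V)
            D = A - B
            w = nA - k * nB
            # |D + r*w|^2 = ((1+k)*r)^2  =>  a*r^2 + b*r + c = 0
            a = w @ w - (1.0 + k) ** 2
            b = 2.0 * (D @ w)
            c = D @ D
            if abs(a) < eps:        # degenerate: equation is linear in r
                roots = [-c / b] if abs(b) > eps else []
            else:
                disc = b * b - 4.0 * a * c
                if disc < 0.0:
                    continue        # no real solution for this sign choice
                roots = [(-b + s * np.sqrt(disc)) / (2.0 * a)
                         for s in (1.0, -1.0)]
            for r in roots:
                if r > eps:         # keep positive radii only
                    out.append((r, A + r * nA, B + k * r * nB))
    return out
```

In a symmetric test case (A at the origin heading +x, B at (2, 2) also heading +x, k = 1) this returns the r = 1 solution I'd expect by hand, along with the other valid sign combinations.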