I want to understand the formula that, given a point, a slope, and a distance D (in either the positive or negative direction), determines the x coordinate of the point on that line which is D away from the original point in the direction I specified.
I found this link: https://www.freemathhelp.com/forum/threads/70424-Find-coordinates-given-original-point-and-slope which has an equation that supposedly does what I want, but could someone please explain the formula I need in plain English, in standard formula form (the kind most people use), and also in pseudo-code?
That way I can triple-check my comprehension.
I get anxious and second-guess myself on these matters, so I sometimes get next to nothing done...
Bonus points if you can explain a procedure for writing the program in N-dimensional coordinate space, but if you would rather not, I only desperately need 2-dimensional (x, y) coordinate space right now...
Consider the equation of your line, $y=m(x-a)+b$, and the equation of a circle of radius $D$ centered at your original point, $D^2=(x-a)^2+(y-b)^2$. Here I am assuming the coordinates of your original point are $(a,b)$. Because a point of intersection between the circle and the line has to have the same $x$ and $y$ coordinates in both equations, you can substitute the $y$ from your line into the circle to get $D^2=(x-a)^2+(m(x-a))^2$. Factoring out $(x-a)^2$ gives $D^2=(x-a)^2(1+m^2)$, so solving for $x$ yields $x = a \pm \frac{D}{\sqrt{1+m^2}}$, where the sign picks the direction along the line. If you need more guidance from there, I can give you some more hints.
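Since you asked for pseudo-code, here is a minimal Python sketch of that formula. The function name `point_at_distance` and its parameter names are my own choices, not anything standard: `(a, b)` is your original point, `m` is the slope, and a signed `d` selects the direction (positive `d` steps toward larger $x$, negative `d` toward smaller $x$).

```python
import math

def point_at_distance(a, b, m, d):
    """Point on the line y = m*(x - a) + b that is |d| away from (a, b).

    The sign of d chooses the branch of x = a +/- D / sqrt(1 + m^2):
    positive d moves toward larger x, negative d toward smaller x.
    """
    x = a + d / math.sqrt(1 + m * m)
    y = m * (x - a) + b  # plug x back into the line equation to get y
    return x, y
```

For example, `point_at_distance(1, 2, 0.75, 5)` returns `(5.0, 5.0)`, which is exactly 5 units from `(1, 2)` along a line of slope 3/4. As for the N-dimensional bonus: slope doesn't generalize cleanly past 2D, so work with a direction vector instead; normalize it to unit length and add $D$ times that unit vector to your original point.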