I'm working on a computer program that displays on a screen that is 1280x800 pixels. I have a point at the middle of the screen, which I treat as the origin. That gives me a view of 400 pixels/points in the +y and -y directions and 640 in the +x and -x directions. Then I have an imaginary point offscreen, whose coordinates I can obtain relative to my center point. My problem is that I need to find the projection of that point onto the edge of my rectangular screen.
If this doesn't make sense, I'll put it in more "mathy" terms:
A circle can be thought of as all the points a given distance from the center. I can model this with the equation x^2+y^2=r^2. How could I do the same thing for a rectangle? I was thinking of doing something with the angle to the point, but I couldn't figure out how to project it onto the rectangle to find a set of coordinates I could use.
If the math doesn't aid your imagination, the game is centered around a spaceship and I need to place an arrow at the edge of the screen pointing to where another ship is.
To project the point $(x,y)$ onto the boundary of the $2a\times 2b$ rectangle centered at the origin (in your case $a=640$, $b=400$), simply multiply by $\min\left(\frac{a}{|x|},\frac{b}{|y|}\right)$, i.e. scale the point along the ray from the origin until it first hits a horizontal or vertical edge.
Note that you may need to use $a=639\frac12$ and $b=399\frac12$ and take the point between the central four pixels as the coordinate origin if you want the projected points to be definitely on screen.
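As a sketch of how this might look in code (the function name `project_to_edge` and the choice of Python are mine, not from the question), assuming the other ship's position is already expressed relative to the screen center:

```python
def project_to_edge(x, y, a, b):
    """Project the point (x, y), given relative to the screen center,
    onto the boundary of the 2a x 2b rectangle centered at the origin,
    along the ray from the origin through (x, y)."""
    if x == 0 and y == 0:
        raise ValueError("cannot project the center point itself")
    # The scale factor is set by whichever edge the ray hits first:
    # a vertical edge (limited by |x|) or a horizontal edge (limited by |y|).
    t = min(a / abs(x) if x else float("inf"),
            b / abs(y) if y else float("inf"))
    return (t * x, t * y)

# A ship directly to the right, twice the half-width away,
# projects onto the middle of the right edge:
print(project_to_edge(1280, 0, 640, 400))   # (640.0, 0.0)
# A ship along the diagonal where both limits bind hits the corner:
print(project_to_edge(800, 500, 640, 400))  # (640.0, 400.0)
```

The arrow for your game would then be drawn at the returned point (after shifting back to screen coordinates), oriented along the direction of the original $(x,y)$ vector.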