I'm creating a system that lets users create objects on the screen (as rectangles), move them around, and resize them by dragging the corners with the mouse. My only problem is that when an object is not square, resizing it makes the corner fail to track the mouse movement correctly.
For example, if I resize a box that is wider than it is tall, the box grows much faster than the mouse moves, but if the box is taller than it is wide, it grows more slowly.
The formula I'm currently using to resize from the top-left corner of the box is:
((OldMouseX - NewMouseX) + (OldMouseY - NewMouseY)) / 2 * (1 / Screen.scale)
Where Screen.scale is the scale of the user's window, which keeps this working when the user is zoomed in or out.
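As a sketch of the calculation above (in Python, with placeholder names, since the original code isn't shown):

```python
def resize_delta(old_mx, old_my, new_mx, new_my, screen_scale):
    """Current approach: average the horizontal and vertical mouse
    deltas, then correct for the window's zoom level."""
    return ((old_mx - new_mx) + (old_my - new_my)) / 2 * (1 / screen_scale)
```

Because the two deltas are averaged into a single number, the same amount is applied to both width and height, which is where the ratio-dependent mismatch comes from.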
I eventually arrived at the following formula by trying different values of n until the box resized properly:
((OldMouseX - NewMouseX) + (OldMouseY - NewMouseY)) / 2 * (1 / Screen.scale) * n
Where r is the ratio Width/Height and n is (approximately) the value needed to make my boxes resize correctly.
(r, n) pairs: (50, 0.04), (5, 0.34), (3, 0.51), (2, 0.67), (1, 1), (0.5, 1.53), (1/3, 1.35), (0.2, 1.7), (0.02, 1.96)
What algorithm do I need to find n with the given ratio?
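For what it's worth, apart from the $(0.5, 1.53)$ and $(1/3, 1.35)$ entries (which look as though they may have been swapped), the tabulated pairs are closely matched by

$$n \approx \frac{2}{r+1},$$

e.g. $r = 50$ gives $2/51 \approx 0.04$ and $r = 3$ gives $2/4 = 0.5$.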
Note: I'm usually a StackOverflow user so I know that a question like this doesn't belong there, but I was hoping that it would in this Stack Exchange branch. That being said, if this question doesn't follow the rules of Mathematics Stack Exchange, please let me know and I'll try my best to tailor the question for this website. Thanks!
Think of the mouse coordinate as one corner of the new bounding rectangle that the object should fill without changing its aspect ratio. So if the object is $a\times b$ and the mouse moved by $(x, y)$, the scaled $a$ is $a+\min\left(\frac{y\,a}{b},\,x\right)$ and the scaled $b$ is $b+\min\left(\frac{x\,b}{a},\,y\right)$.
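In code, this rule might look like the following (a Python sketch; the names are mine, and it assumes a growing drag, i.e. positive $x$ and $y$ as in the formula above):

```python
def resize_preserving_aspect(a, b, x, y):
    """Grow an a-by-b rectangle toward the mouse delta (x, y) without
    changing its aspect ratio: whichever axis the mouse moved "less"
    along, relative to the box's shape, limits the growth on both axes."""
    new_a = a + min(y * a / b, x)
    new_b = b + min(x * b / a, y)
    return new_a, new_b
```

Note that both `min` calls pick the same limiting axis, so `new_a / new_b` always equals `a / b`.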