Normalize infinite range into finite one


First of all: I'm a programmer, not a mathematician, so please excuse my informal math vocabulary.

I have a series of slopes, computed as the tangents of random angles. These angles always lie in quadrants I and IV, i.e. none of them "point to the left".

As you probably know, the tangent goes from $-\infty$ to $+\infty$ over this range.

I need to normalize the slopes into the range $-1$ to $1$. This means that the slope of an angle approaching $\pi/2$ should get closer and closer to $1$ instead of infinity.

I would like to know how to do this in two ways:

1) Preserving a scaled-down, tangent-like curve in my new range

2) Making the mapping linear in the angle

Best answer:

I'm not sure if this is what you're looking for, but you could first normalize your point so that it is at distance $1$ from the origin, then take the resulting $y$-coördinate.

In other words, given $(x,y)$, you would be calculating $\frac{y}{\sqrt{x^2+y^2}}$, which is the sine of the angle from the $x$-axis.