I am looking at some ML notes that discuss the inner product and the fact that, in the single-perceptron case, the perceptron fires if the angle between the weight vector and the input vector is small enough. They prove it as follows: $$x^T w > T \\ |x|\,|w|\cos{\theta} > T \\ \cos{\theta} > \frac{T}{|x||w|}$$
They then invert the inequality sign:
$$\theta < \cos^{-1} \left( \frac{T}{|x||w|} \right)$$
I do not understand why we need to invert the sign there.
The function $f(x)=\arccos(x)$ is strictly decreasing on its domain $[-1,1]$ (its derivative, $-1/\sqrt{1-x^2}$, is negative there). Applying a strictly decreasing function to both sides of an inequality reverses it: for $x<y$, we have $\arccos(x)>\arccos(y)$. So taking $\arccos$ of both sides of $\cos\theta > \frac{T}{|x||w|}$ yields $\theta < \arccos\left(\frac{T}{|x||w|}\right)$, where $\arccos(\cos\theta)=\theta$ is valid because the angle between two vectors lies in $[0,\pi]$.
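A quick numerical sanity check of this equivalence (a sketch with randomly drawn vectors and a hypothetical threshold `T = 0.5`, not taken from the notes): the dot-product test $x^T w > T$ and the angle test $\theta < \arccos(T/(|x||w|))$ should agree whenever the ratio is inside $\arccos$'s domain.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 0.5  # hypothetical firing threshold, for illustration only

checked = 0
consistent = True
for _ in range(1000):
    x = rng.normal(size=3)
    w = rng.normal(size=3)
    nx, nw = np.linalg.norm(x), np.linalg.norm(w)
    ratio = T / (nx * nw)
    if abs(ratio) > 1:
        # arccos is undefined here: the threshold is unreachable
        # (ratio > 1) or always exceeded (ratio < -1), so skip.
        continue
    # Angle between x and w; clip guards against floating-point
    # values a hair outside [-1, 1].
    cos_theta = np.clip(x @ w / (nx * nw), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    fires = x @ w > T                        # dot-product form
    angle_small = theta < np.arccos(ratio)   # angle form
    checked += 1
    consistent = consistent and (fires == angle_small)
```

After the loop, `consistent` remains `True`: the two conditions agree on every sampled pair, which is exactly what the sign inversion guarantees.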