I am developing a 2D collision detection system in JavaScript, and one of the last things I need to finish it is a formula for finding this angle:
NOTE: X and Y increase their values FROM LEFT TO RIGHT AND FROM TOP TO BOTTOM
As you can see, the angle is measured relative to 0°, i.e. the north direction of the 2D space.
Knowing the coordinates of the two points, how can I compute that angle? I have some ideas involving bearings to rectangle vertices and similar tricks (I already use those in the system), but I want to know whether there is a simpler, standard formula for this.
Thank you beforehand!

Define the bearing angle $\theta$ from a point $A(a_1,a_2)$ to a point $B(b_1,b_2)$ as the angle measured in the clockwise direction from the north line with $A$ as the origin to the line segment $AB$.
Then,
$$ (b_1,b_2) = (a_1 + r\sin\theta, a_2 + r\cos\theta), $$
where $r$ is the length of the line segment $AB$. It follows that $\theta$ satisfies the equation
$$ \tan\theta = \frac{b_1 - a_1}{b_2 - a_2}. $$
As suggested by @rogerl, we can use the $\mathrm{atan2}$ function to compute $\theta$. Let
$$ \hat{\theta} = \mathrm{atan2}(b_1 - a_1, b_2 - a_2) \in (-\pi,\pi] $$
Then the bearing angle $\theta\in[0,2\pi)$ is given by
$$ \theta = \left\{ \begin{array}{ll} \hat{\theta}, & \hat{\theta} \geq 0\\ 2\pi + \hat{\theta}, & \hat{\theta} < 0 \end{array}\right. $$
Note that the equations are given in terms of Cartesian coordinates, so it is necessary to transform to screen coordinates. I believe the formula for $\hat{\theta}$ in terms of screen coordinates $(a_1,a_2)$ and $(b_1,b_2)$ is $\hat{\theta} = \mathrm{atan2}(b_1 - a_1,a_2 - b_2)$.
You could code this function in C++ as follows.