Suppose I have a polar curve written as $\theta = f(R)$ with $0<f(R)<2\pi$ for all $R$. I want to find the angle between this curve at radius $R_0$ and the tangent to the circle of radius $R_0$ centered at the origin. My reference tells me that this angle simply obeys $$ \cot\alpha = \left|R_0\left.\frac{d\theta}{dR}\right|_{R=R_0}\right| $$ I want to reproduce this result. Writing the curve as $x = R\cos\theta$, $y = R\sin\theta$, I divide $\frac{dy}{dR} = \sin\theta + R\cos\theta\,\frac{d\theta}{dR}$ by $\frac{dx}{dR} = \cos\theta - R\sin\theta\,\frac{d\theta}{dR}$ (and then divide numerator and denominator by $\cos\theta$) to get the slope of the curve at a given radius: $$ m_1(R) = \frac{\tan\theta + R\frac{d\theta}{dR}}{1-R\tan\theta\frac{d\theta}{dR}} $$
To compute the slope of the circle's tangent, I used the fact that the slope of a radius at angle $\theta$ is simply $\tan\theta$; since the tangent is perpendicular to the radius, its slope is $m_2(R) = -\frac{1}{\tan\theta(R)}$.
Now I wanted to use a formula I found in many places, which says that the angle $\alpha$ between two lines with slopes $m_1$ and $m_2$ satisfies $$ \tan\alpha = \left|\frac{m_1-m_2}{1+m_1m_2}\right| $$ but I just can't see how to reduce this to the simple expression written above. In particular, I don't see how everything collapses so that only the combination $R_0\frac{d\theta}{dR}$ survives.
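As a sanity check (not a derivation), I put together a short sympy sketch that evaluates the angle between the two lines for a concrete example curve. The choice $f(R) = \ln R$ is my own illustrative assumption: it is the equiangular spiral $R = e^\theta$, for which $R\frac{d\theta}{dR} = 1$, so the reference's formula predicts $\cot\alpha = 1$, i.e. $|\tan\alpha| = 1$.

```python
# Sanity check of the claimed identity cot(alpha) = |R * dtheta/dR|
# using sympy. Example curve (my assumption): theta = f(R) = ln(R),
# the equiangular spiral, for which R * dtheta/dR = 1.
import sympy as sp

R = sp.symbols('R', positive=True)
theta = sp.log(R)                       # illustrative choice of f(R)

# Cartesian form of the curve; slope m1 = (dy/dR) / (dx/dR)
x = R * sp.cos(theta)
y = R * sp.sin(theta)
m1 = sp.diff(y, R) / sp.diff(x, R)

# Tangent to the circle of radius R is perpendicular to the radius,
# whose slope is tan(theta), so m2 = -1/tan(theta)
m2 = -1 / sp.tan(theta)

# Standard two-line angle formula: tan(alpha) = |(m1 - m2)/(1 + m1*m2)|
tan_alpha = sp.simplify((m1 - m2) / (1 + m1 * m2))
print(tan_alpha)                        # should have absolute value 1

# Compare against the reference's claim: R * dtheta/dR
print(sp.simplify(R * sp.diff(theta, R)))
```

Numerically this does agree with $\cot\alpha = |R\,d\theta/dR|$ for this curve, so the quoted result looks right; the question is how to see it algebraically in general.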