I have two signals. One has a frequency of $10$ Hz, the other has a frequency of $5$ Hz.
Or, in other words, one has a period of $\dfrac 1{10}$ seconds, and the other has a period of $\dfrac 15$ seconds.
The difference in their frequencies is $10\,\mathrm{Hz} - 5\,\mathrm{Hz} = 5\,\mathrm{Hz}$.
The difference in their periods is $\dfrac 1{10} - \dfrac 15 = -0.1$ seconds.
But $\dfrac{1}{5\,\mathrm{Hz}} = 0.2$ seconds $\neq -0.1$ seconds.
Where am I going wrong here?
As observed by MalayTheDynamo, note that since
$$f=\frac1T \implies f_2-f_1=\frac1{T_2}-\frac1{T_1}=\frac{T_1-T_2}{T_2T_1}$$
it follows that
$$10\,\mathrm{Hz}-5\,\mathrm{Hz} = 5\,\mathrm{Hz} = \frac{0.20-0.10}{0.20\cdot 0.10}\,\mathrm{Hz}=\frac{0.10}{0.02}\,\mathrm{Hz}= 5\,\mathrm{Hz}$$
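If a numeric sanity check helps, here is a small Python sketch (not part of the original question or answer; the use of exact fractions is just an assumption for readability). It shows that the difference of frequencies matches $\frac{T_1-T_2}{T_1T_2}$, while the plain difference of the periods is not the period of a $5\,\mathrm{Hz}$ signal:

```python
from fractions import Fraction

f1, f2 = Fraction(5), Fraction(10)    # frequencies in Hz
T1, T2 = 1 / f1, 1 / f2               # periods in seconds: 1/5 and 1/10

freq_diff = f2 - f1                   # 5 Hz
from_periods = (T1 - T2) / (T1 * T2)  # (1/5 - 1/10) / (1/5 * 1/10) = 5 Hz
period_diff = T2 - T1                 # -1/10 s, NOT equal to 1/freq_diff = 1/5 s

print(freq_diff, from_periods, 1 / freq_diff, period_diff)
# prints: 5 5 1/5 -1/10
```

In other words, the reciprocal of a difference is not the difference of the reciprocals, which is exactly the gap the identity above accounts for.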