I am in Electronics Technician "A" School in the Navy, and we are learning the basics of radar. In one module we were exposed to this formula, which confuses me. If someone has an inkling as to why it works and where it comes from, I'd be grateful:
$$R_{nm} = 1.25\sqrt{h},$$ where $h$ is the height of a ship's antenna in feet. The result $R_{nm}$, the radar horizon, is in nautical miles, which sounds very strange to me. How can the square root of feet give you nautical miles? The petty officer instructor was a math major before he joined the Navy, and even he has trouble deriving this formula.
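For a sense of scale, plugging in an antenna height of $64$ feet gives $$R_{nm}=1.25\sqrt{64}=10\ \text{nautical miles}.$$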
The radius $R$ of the Earth is about $3444$ nautical miles. Let $m$ be the height of the antenna (above sea level) in nautical miles. Let $O$ be the centre of the Earth, let $T$ be the top of the antenna, and let $H$ be a point on the horizon, as viewed from the top of the antenna. Let $d$ be the distance from $T$ to $H$. This is very close to the distance to the horizon, however we measure that distance.
Then $\triangle OTH$ is right-angled at $H$, so we have by the Pythagorean Theorem that $(R+m)^2=R^2+d^2$. Expand. We get $d^2=2Rm+m^2$.
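Spelling out the expansion: $$(R+m)^2 = R^2+2Rm+m^2 = R^2+d^2 \quad\Longrightarrow\quad d^2 = 2Rm+m^2.$$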
Note that $m^2$ is very small in comparison to $2Rm$, so we have $d^2\approx 2Rm$. Note also that $m=\frac{h}{6080}$. (There are about $6080$ feet in a nautical mile.) Thus $$d\approx \sqrt{(2)(3444)/6080}\,\sqrt{h}.$$ Note that $\sqrt{(2)(3444)/6080}$ is about $1.06$, not really close to $1.25$, though not too bad.
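As a quick check on the size of the discrepancy, take an antenna at $h=100$ feet: $$d\approx 1.06\sqrt{100}\approx 10.6\ \text{nautical miles},\qquad\text{while the Navy formula gives }1.25\sqrt{100}=12.5\ \text{nautical miles}.$$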
Remark: If we work with ordinary miles, we do much better, with the constant in front of the $\sqrt{h}$ reasonably close to $1.25$.
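Concretely, taking the radius of the Earth to be about $3959$ ordinary miles and using $5280$ feet per mile, the same computation gives $$\sqrt{(2)(3959)/5280}\approx 1.22.$$ (The distance $d$ then comes out in ordinary miles rather than nautical miles.)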