Strange formula for the radar horizon — does anyone know where it comes from?


I am in Electronics Technician "A" School in the Navy and we are learning the basics of radar. In one module, we were shown this formula, which confuses me. If someone has an inkling as to why it works and where it comes from, I'd be grateful:

$R_{nm} = 1.25 \sqrt{h}$, where $h$ is the height of a ship's antenna in feet. The result $R_{nm}$, the radar horizon, is in nautical miles, which sounds very strange to me. How can the square root of feet give you nautical miles? The petty officer instructor was a math major before he joined the Navy, and even he has trouble deriving this formula.


Best answer:

The radius $R$ of the Earth is about $3444$ nautical miles. Let $m$ be the height of the antenna (above sea level) in nautical miles. Let $O$ be the centre of the Earth, let $T$ be the top of the antenna, and let $H$ be a point on the horizon as viewed from the top of the antenna. Let $d$ be the distance from $T$ to $H$. This is very close to the distance to the horizon, however we choose to measure that distance.

Then $\triangle OTH$ is right-angled at $H$, so we have by the Pythagorean Theorem that $(R+m)^2=R^2+d^2$. Expand. We get $d^2=2Rm+m^2$.

Note that $m^2$ is very small in comparison to $2Rm$, so we have $d^2\approx 2Rm$. Note also that $m=\frac{h}{6080}$. (There are about $6080$ feet in a nautical mile.) Thus $$d\approx \sqrt{(2)(3444)/6080}\,\sqrt{h}.$$ Note that $\sqrt{(2)(3444)/6080}$ is about $1.07$, not really close to $1.25$, though not too bad.

Remark: If we work with ordinary miles, we do much better, with the constant in front of the $\sqrt{h}$ reasonably close to $1.25$.
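The arithmetic in the derivation above is easy to check numerically. A short sketch (the radii and conversion factors are the approximate values used in the answer):

```python
import math

R_NM = 3444        # Earth's radius in nautical miles (approximate)
FT_PER_NM = 6080   # feet per nautical mile (approximate)
R_MI = 3959        # Earth's radius in statute miles (approximate)
FT_PER_MI = 5280   # feet per statute mile

# From d^2 ~ 2Rm with m = h / (feet per unit), the constant in
# front of sqrt(h) is sqrt(2R / feet_per_unit).
k_nm = math.sqrt(2 * R_NM / FT_PER_NM)
k_mi = math.sqrt(2 * R_MI / FT_PER_MI)

print(f"nautical miles: {k_nm:.3f}")   # ~1.06
print(f"statute miles:  {k_mi:.3f}")   # ~1.22
```

This confirms the remark: with no refraction, the purely geometric constant is about $1.06$ in nautical miles but about $1.22$ in statute miles.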

Second answer:

This quote from American Practical Navigator (Bowditch, 1981) Volume II, Article 506, may be useful: "Radar horizon. The distance to the radar horizon is the distance between the transmitter and the point at which the radar rays graze the surface of the earth. In the standard atmosphere, radar rays, like light rays, are bent or refracted slightly downwards, approximating the curvature of the earth. Where h is the height of the radar antenna in feet, the distance D, to the radar horizon in nautical miles, assuming standard atmospheric conditions, may be found from the formula $D=1.22\sqrt{h}$. Although this formula is based upon a wavelength of 3 centimeters, it may be used in computation of the distance to the radar horizon for other wavelengths normally used with navigational or surface search radar."
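Bowditch's $1.22$ (and the Navy's rounded $1.25$) can be reproduced from the geometric derivation above by adding the standard "four-thirds earth" refraction correction: in a standard atmosphere, radar rays bend downward, which is commonly modelled by treating them as straight lines over an earth of effective radius $\frac{4}{3}R$. A minimal sketch under that assumption:

```python
import math

R_NM = 3444        # Earth's radius in nautical miles (approximate)
FT_PER_NM = 6080   # feet per nautical mile (approximate)
K = 4 / 3          # standard-atmosphere effective-earth-radius factor

# The effective radius K*R stretches the geometric horizon:
# d ~ sqrt(2 * K * R / FT_PER_NM) * sqrt(h)
k_radar = math.sqrt(2 * K * R_NM / FT_PER_NM)
print(f"radar-horizon constant: {k_radar:.2f}")   # ~1.23, vs Bowditch's 1.22

def radar_horizon_nm(h_feet):
    """Distance to the radar horizon in nautical miles, antenna height in feet."""
    return k_radar * math.sqrt(h_feet)

print(f"100 ft antenna: {radar_horizon_nm(100):.1f} nm")
```

So the mysterious unit mismatch is just the feet-to-nautical-miles conversion folded into the constant, and the gap between the geometric $1.06$ and the quoted $1.22$–$1.25$ is atmospheric refraction.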