Why doesn't the Manhattan distance tend towards the Euclidean distance as the number of subdivisions becomes infinite?


Suppose that we have a unit square and are interested in the distance between two opposite corners.

The Euclidean distance is $\sqrt{2}$.

The Manhattan distance is $1 + 1 = 2$.

Suppose we subdivide the square by dividing both the width and the height in half, and route the Manhattan path along the resulting staircase.

The Euclidean distance remains $\sqrt{2}$.

The Manhattan distance after $n$ subdivisions is $2^n \cdot \frac{1}{2^n} + 2^n \cdot \frac{1}{2^n} = 1 + 1 = 2$: there are $2^n$ horizontal steps and $2^n$ vertical steps, each of length $\frac{1}{2^n}$.

Then $\lim_{n \to \infty} 2 = 2$: the staircase length is constant, so its limit is $2$.
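The calculation above can be checked numerically. A minimal sketch (the function name `staircase_length` is mine, not from any library):

```python
import math

def staircase_length(n: int) -> float:
    """Total length of the staircase path across the unit square
    after n halvings: 2**n horizontal steps of length 1/2**n,
    plus 2**n vertical steps of the same length."""
    steps = 2 ** n
    step_size = 1.0 / steps
    return steps * step_size + steps * step_size

for n in range(6):
    print(n, staircase_length(n))  # length stays 2 for every n

print(math.sqrt(2))  # the Euclidean diagonal, for comparison
```

No matter how large `n` is, the staircase length never moves off $2$, while the diagonal stays at $\sqrt{2} \approx 1.414$.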

That said, if $n$ is large I could be presented with a unit square at such high resolution that I wouldn't be able to distinguish the path of length $2$ from the one of length $\sqrt{2}$. This seems strange to me. Why doesn't the limit come out to $\sqrt{2}$, intuitively? I'm assuming it has to do with the fact that, no matter how fine the resolution, the hypotenuse of each subdivision is always shorter than the sum of its two sides, but it's still a bit strange to me.
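The "can't distinguish them" observation can also be made quantitative: the staircase converges uniformly to the diagonal (the maximum gap between them shrinks to zero), yet its length never changes, which is exactly why length is not preserved under this kind of limit. A sketch, with the helper names (`staircase_corners`, `max_deviation`) my own:

```python
import math

def staircase_corners(n: int):
    """Corner points of the staircase across the unit square after
    n halvings, moving right then up, 2**n times each."""
    s = 1.0 / 2 ** n
    pts = [(0.0, 0.0)]
    x = y = 0.0
    for _ in range(2 ** n):
        x += s
        pts.append((x, y))
        y += s
        pts.append((x, y))
    return pts

def max_deviation(n: int) -> float:
    """Largest distance from any staircase corner to the diagonal
    y = x; a point (a, b) lies |a - b| / sqrt(2) from that line."""
    return max(abs(x - y) / math.sqrt(2) for x, y in staircase_corners(n))

for n in range(1, 6):
    print(n, max_deviation(n))  # shrinks by half each time
```

The deviation is $\frac{1}{2^n \sqrt{2}}$, so the curves become indistinguishable, while the length of every curve in the sequence (and hence its limit) remains $2$.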