There is a circle with center $(0, 0)$ and radius $r$. Let $n$ be the number of grid points inside or on the circle for which at least one of the four neighboring grid points (up, down, left, right) lies outside the circle.

With my computer, I computed $n$ for several values of $r$: $$ \begin{array}{c|c} r&n\\ \hline 1&4\\ 2&8\\ 3&16\\ 4&20\\ 5&28\\ 10&56\\ 10^2&564\\ 10^3&5656\\ 10^4&56568\\ 10^5&565684\\ 10^6&5656852\\ 10^7&56568540\\ 10^8&565685424\\ 10^9&5656854248\\ 10^{10}&56568542492\\ \end{array} $$
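These values can be reproduced with a straightforward $O(r^2)$ brute-force scan (a sketch of one way to compute the table, not the poster's actual program; `n_boundary` is a name chosen here):

```python
def n_boundary(r):
    """Count grid points on or inside the circle x^2 + y^2 = r^2 that have
    at least one axis neighbor (up/down/left/right) strictly outside it."""
    r2 = r * r
    count = 0
    for x in range(-r, r + 1):
        for y in range(-r, r + 1):
            if x * x + y * y > r2:
                continue  # the point itself lies outside the circle
            # boundary point: some neighbor has squared distance > r^2
            if any((x + dx) ** 2 + (y + dy) ** 2 > r2
                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                count += 1
    return count

for r in (1, 2, 3, 4, 5, 10):
    print(r, n_boundary(r))  # matches the table above for these r
```

Comparing squared distances keeps everything in exact integer arithmetic, so there is no floating-point rounding to worry about.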
The data suggest that $$\lim_{r\to\infty}\frac nr\approx5.6568542\approx4\sqrt2.$$
My question is: how to prove the following equation? $$\lim_{r\to\infty}\frac nr=4\sqrt2$$
Let's look at the first quadrant (and then multiply by $4$). For each integer $x$ from $1$ to $r\sqrt2/2$, the column at $x$ contains exactly one such point, namely the topmost lattice point $(x,\lfloor\sqrt{r^2-x^2}\rfloor)$, whose upward neighbor is outside the circle; its $y$-coordinate is greater than $r\sqrt2/2$. Symmetrically, for each integer $y$ from $1$ to $r\sqrt2/2$, the row at $y$ contains exactly one such point, and its $x$-coordinate is greater than $r\sqrt2/2$. These two families are disjoint (the first lies above the line $y=x$, the second below), so the quadrant contributes about $2\cdot r\sqrt2/2=r\sqrt2$ points. Multiplying by $4$ gives $n\approx4\sqrt2\,r$, which proves your observation.
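The column/row description above also yields an $O(r)$ way to compute $n$ (a sketch, not part of the original answer; `n_fast` is a name chosen here): every counted point is the topmost or bottommost lattice point of its column, or the leftmost or rightmost one of its row, so collecting all of these in a set and taking its size reproduces the brute-force count.

```python
import math

def n_fast(r):
    """O(r) count of grid points on or inside the circle x^2 + y^2 = r^2
    having an axis neighbor outside: each is extremal in its column or row."""
    r2 = r * r
    pts = set()
    for t in range(-r, r + 1):
        m = math.isqrt(r2 - t * t)  # largest |coordinate| still inside
        # column extremes (t, +-m) and, by symmetry, row extremes (+-m, t)
        pts.update({(t, m), (t, -m), (m, t), (-m, t)})
    return len(pts)

for r in (10, 100, 10**4):
    print(r, n_fast(r), n_fast(r) / r)  # ratio approaches 4*sqrt(2)
```

The set removes the double-counting of points that are extremal in both their column and their row, which is exactly why the naive estimate $2\cdot r\sqrt2/2$ per quadrant is only asymptotic.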