This question concerns a step in the proof of a theorem; to keep the discussion precise, I will include some hypotheses of the theorem that might seem irrelevant at first glance.
Let $f:\mathbb{R}^n\to\mathbb{R}$ be a smooth function such that $\displaystyle\lim_{|x|\to\infty}f(x)$ is either a finite constant or $\infty$, let $M$ be the graph of $f$, $$M=\{(x,f(x))\in\mathbb{R}^{n+1}:x\in\mathbb{R}^n\},$$ and let $g$ be the metric on $M$ induced by the Euclidean metric on $\mathbb{R}^{n+1}$. Assume that $f_i f_j=O_2(|x|^{-q})$ for some $q>\frac{n-2}{2}$. Here the subscripts on $f$ denote partial differentiation, and $O_2(|x|^{-q})$ denotes a function in $C_{-q}^2$, where we say that $u\in C_{-q}^2$ if $$|u(x)|+|x|\cdot|\partial u(x)|+|x|^2\cdot|\partial^2 u(x)|<C|x|^{-q}$$ for some constant $C$; here $\partial u$ is the vector field whose components are the partial derivatives $u_i$.
Let's bring in the main course.
According to the author of the proof, applying the divergence theorem yields $$\int_M\sum_{i,j=1}^n\partial_j\left(\frac{f_{ii}f_j-f_{ij}f_i}{1+|\partial f|^2}\right)\overline{d\mu}=\lim_{r\to\infty}\int_{S_r}\sum_{i,j=1}^n\left(\frac{f_{ii}f_j-f_{ij}f_i}{1+|\partial f|^2}\right)\overline{\nu}^j\overline{d\mu}_{S_r},$$ where $S_r$ is the sphere of radius $r$ centered at the origin of $\mathbb{R}^n$ and $\overline{\nu}$ is the outward-pointing unit normal vector field along the sphere.
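My best guess at the intended argument (this is only my reading, not something stated by the author) is an exhaustion argument: since the integrand is written entirely in the coordinates $x$, and the barred measure and normal appear to be the Euclidean ones, the identity would follow from the ordinary divergence theorem applied on the closed balls $\overline{B_r}\subset\mathbb{R}^n$,

$$\int_{B_r}\sum_{i,j=1}^n\partial_j\left(\frac{f_{ii}f_j-f_{ij}f_i}{1+|\partial f|^2}\right)dx=\int_{S_r}\sum_{i,j=1}^n\left(\frac{f_{ii}f_j-f_{ij}f_i}{1+|\partial f|^2}\right)\overline{\nu}^j\,\overline{d\mu}_{S_r},$$

followed by letting $r\to\infty$, with $M$ identified with $\mathbb{R}^n$ via the graph parametrization $x\mapsto(x,f(x))$. But I am not sure this identification is what the author means, which is why I am asking.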
I don't see why this equality is valid: if one applies the divergence theorem on $M$ itself, one should obtain an integral over $\partial M$ rather than a limit of integrals over expanding spheres. Why does the equality take this form? Does anyone have an idea? Thank you.