Suppose we have a function $f$ defined on the 2D lattice points that takes values in $[0, 1]$. Furthermore, suppose that $f$ satisfies
$$f(a,b) = \frac{f(a-1,b) + f(a + 1,b) + f(a, b-1) + f(a, b+1)}4.$$
That is, the value of $f$ at $(a,b)$ is the average of the values at its four neighbors. (This is also known as the discrete harmonic property.) The problem is:
Prove that $f$ must be the constant function.
What I have tried: first assume that $f$ is defined only on a finite grid of lattice points. An extremal argument then shows that $f$ must attain its maximum value on the boundary. However, this doesn't seem to lead anywhere in the infinite case. Any ideas?
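The finite-grid claim can be checked numerically. Here is a minimal sketch (the grid size, random boundary data, and iteration count are all arbitrary choices, not part of the problem): iterate the four-neighbor averaging rule with the boundary held fixed, then confirm that the interior maximum does not exceed the boundary maximum.

```python
import numpy as np

# Hypothetical finite-grid experiment: solve the averaging equation
#   f(a,b) = (f(a-1,b) + f(a+1,b) + f(a,b-1) + f(a,b+1)) / 4
# on an N x N grid with fixed boundary values, by Jacobi iteration.
N = 20
rng = np.random.default_rng(0)
f = rng.random((N, N))          # arbitrary start, values in [0, 1]

for _ in range(20000):          # iterate the averaging rule to convergence
    interior = (f[:-2, 1:-1] + f[2:, 1:-1] + f[1:-1, :-2] + f[1:-1, 2:]) / 4
    f[1:-1, 1:-1] = interior    # only interior points update; boundary stays fixed

# Extremal principle: the maximum over the whole grid is attained on the boundary.
interior_max = f[1:-1, 1:-1].max()
boundary_max = max(f[0, :].max(), f[-1, :].max(), f[:, 0].max(), f[:, -1].max())
print(interior_max <= boundary_max + 1e-9)  # True
```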
One thing you can do is first establish a mean value property, namely $f(x,y) = \frac{1}{4r}\sum\limits_{|a_1|+|a_2|=r} f(x+a_1,y+a_2)$. Then use it to show that $f(x,y)$ equals the average of $f$ over any "ball" (that is, over $\{(x+a,y+b)\colon |a|+|b|\leq r\}$).
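As a sanity check (not a proof), one can test the sphere mean value property on sample discrete harmonic functions. The two polynomials below, $x^2-y^2$ and $x^3-3xy^2$, satisfy the four-neighbor averaging identity, and exact rational arithmetic confirms the sphere average reproduces the center value for them:

```python
from fractions import Fraction

def sphere_mean(f, x, y, r):
    """Exact average of f over the 4r lattice points with |a| + |b| = r."""
    pts = {(a, s * (r - abs(a))) for a in range(-r, r + 1) for s in (1, -1)}
    assert len(pts) == 4 * r
    return Fraction(sum(f(x + a, y + b) for a, b in pts), len(pts))

f1 = lambda x, y: x * x - y * y          # discrete harmonic
f2 = lambda x, y: x**3 - 3 * x * y * y   # discrete harmonic

# Check the property at a few (arbitrary) centers and radii.
ok = all(sphere_mean(f, x, y, r) == f(x, y)
         for f in (f1, f2)
         for (x, y) in [(0, 0), (5, -2), (-3, 7)]
         for r in range(1, 8))
print(ok)  # True
```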
Now consider two points $(x_1,y_1)$ and $(x_2,y_2)$. As we take larger and larger balls around them, the balls overlap in a greater and greater proportion of their respective areas. Because $f$ is bounded, the averages of $f$ over the two balls therefore converge to the same number as the radius tends to infinity.
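The overlap claim is easy to illustrate numerically (the two base points below are arbitrary): since $f$ takes values in $[0,1]$, the difference of the two ball averages is bounded by the fraction of points in the symmetric difference, and that fraction shrinks like $O(1/r)$.

```python
# Numeric illustration (not a proof) of the shrinking-overlap bound.
def ball(cx, cy, r):
    """All lattice points within L1-distance r of (cx, cy)."""
    return {(cx + a, cy + b)
            for a in range(-r, r + 1)
            for b in range(-r, r + 1)
            if abs(a) + abs(b) <= r}

p, q = (0, 0), (5, 3)            # two arbitrary base points
bounds = []
for r in [10, 50, 250]:
    bp, bq = ball(*p, r), ball(*q, r)
    # For f with values in [0, 1]:
    #   |avg over bp - avg over bq| <= |bp symmetric-difference bq| / |bp|
    bounds.append(len(bp ^ bq) / len(bp))
print(bounds)   # decreasing toward 0 as r grows
```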
We conclude that
$$f(x_1,y_1)=f(x_2,y_2)$$
and so the function is constant.