Let $f:[a,b]\to\mathbb{R}$ be such that for every $x\in[a,b]$ and every $h>0$ with $$(x-h,x+h)\subset[a,b]$$ we have $$f(x)=\frac{1}{2}(f(x-h)+f(x+h)).$$
How can I conclude that $f$ is harmonic in $[a,b]$?
My idea is that $f$ satisfies the Mean Value Property and so is harmonic. But how can I show that the previous formula is equivalent to the one-dimensional Mean Value Property?
Consider the difference $g(x) = f(x)-\frac{f(b)-f(a)}{b-a}(x-a)$. Check that $g(a)=g(b)$. Conclude that either $g$ is identically constant, or it attains an extremum at an interior point. Show that the latter contradicts the assumed property (which holds for $g$ since it holds for $f$ and the subtracted term is affine).
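To spell out the endpoint check, writing $g$ as $f$ minus the affine interpolant through $(a,f(a))$ and $(b,f(b))$:

```latex
g(x) = f(x) - \frac{f(b)-f(a)}{b-a}\,(x-a),
\qquad
g(a) = f(a) - 0 = f(a),
\qquad
g(b) = f(b) - \bigl(f(b)-f(a)\bigr) = f(a).
```

Note also that any affine function $\ell$ satisfies $\ell(x)=\frac12(\ell(x-h)+\ell(x+h))$ exactly, which is why $g$ inherits the midpoint property from $f$.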
In more detail: suppose $\max g>g(a)$. Consider the set $\{x:g(x)=\max g\}$ and let $x_0$ be its infimum; by continuity $g(x_0)=\max g$, and $x_0$ is an interior point since $g(a)=g(b)<\max g$. For sufficiently small $h$ you'll find that $$g(x_0)>\frac12(g(x_0+h)+g(x_0-h)),$$ contradicting the midpoint property.
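A short derivation of the strict inequality, under the assumption that the maximum is attained at $x_0=\inf\{x:g(x)=\max g\}$:

```latex
% For 0 < h small enough that (x_0-h, x_0+h) \subset (a,b):
g(x_0-h) < \max g = g(x_0)
  \quad\text{(since } x_0-h \text{ lies below the infimum of the max set)},
\qquad
g(x_0+h) \le \max g = g(x_0),
% Averaging the two bounds:
\frac12\bigl(g(x_0-h)+g(x_0+h)\bigr) < g(x_0),
% which contradicts the assumed identity
g(x_0) = \frac12\bigl(g(x_0-h)+g(x_0+h)\bigr).
```

The case $\min g<g(a)$ is symmetric, so $g$ is constant and $f(x)=f(a)+\frac{f(b)-f(a)}{b-a}(x-a)$ is affine, i.e. $f''=0$, which is exactly harmonicity in one dimension.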