Given a Markov chain with state space $\Omega$ and transition matrix $P$, a function $h:\Omega\to\Bbb R$ is said to be harmonic at state $x$ if $h(x)=\sum_{y\in\Omega}P(x,y)h(y)$, and harmonic on $D\subseteq\Omega$ if $h$ is harmonic at every state of $D$.
I learned before that, given an open domain $\Omega\subset\Bbb R^n$ and $u\in C^2(\Omega)$, $u$ is harmonic in $\Omega$ if $\Delta u(x)=0$ for every $x\in \Omega$, where $\Delta u=\sum_{i=1}^n\frac{\partial^2u}{\partial x_i^2}$.
Since they share the same name, I suspect there is a strong connection. Can anyone explain why the discrete function $h$ is called "harmonic", explain the relation between the two definitions, or provide a reference? Thank you!
Suppose you have a uniform grid of points in the domain $\Omega$. Each of these points has a certain number $N(x)$ of neighbors (for interior points, $N(x)=2n$). If you introduce a Markov chain with $P(x,y)=1/N(x)$ if $x$ and $y$ are neighbors and $0$ otherwise, then a function $h$ that is harmonic in the sense of your probability definition satisfies the standard finite difference approximation of the Laplace equation on $\Omega$. (Note that the factor involving the grid spacing cancels out of the equation $\Delta h=0$; if you had a Poisson equation $\Delta u=f$, or a non-uniform grid, you would need to keep track of it.)
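To make this concrete, here is a small sanity check (my own sketch, not part of the original answer): on the 2D integer grid with the 4-neighbor random walk, the condition $h(x)=\sum_y P(x,y)h(y)$ says that $h$ equals the average of its four neighbors, which is exactly the five-point discrete Laplace equation $h(x{+}1,y)+h(x{-}1,y)+h(x,y{+}1)+h(x,y{-}1)-4h(x,y)=0$. The function $h(x,y)=x^2-y^2$, which is harmonic for the continuum Laplacian, happens to satisfy this discrete condition exactly:

```python
# Sketch: discrete harmonicity for the simple random walk on the 2D grid.
# The 4-neighbor walk has P(x, y) = 1/4 for each neighbor, so the condition
# h(x) = sum_y P(x, y) h(y) is "h equals the average of its four neighbors",
# i.e. the five-point discrete Laplace equation.

def h(x, y):
    # x^2 - y^2 is harmonic for the continuum Laplacian and, as it happens,
    # is also exactly discrete-harmonic on the integer grid.
    return x * x - y * y

def neighbor_average(f, x, y):
    # One step of the 4-neighbor random walk: E[f(X_1) | X_0 = (x, y)].
    return (f(x + 1, y) + f(x - 1, y) + f(x, y + 1) + f(x, y - 1)) / 4

# Check the averaging (discrete harmonic) condition on a patch of the grid.
for x in range(-5, 6):
    for y in range(-5, 6):
        assert neighbor_average(h, x, y) == h(x, y)
print("h(x, y) = x^2 - y^2 is discrete harmonic on the grid")
```

In probabilistic language, the averaging condition says $h(X_0)=\mathbb E[h(X_1)\mid X_0]$, the discrete analogue of the mean value property that characterizes continuum harmonic functions.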
There are other connections between the theory of Markov processes and potential theory; the lecture notes at http://www.ntu.edu.sg/home/nprivault/papers/greifswald_potential.pdf are a great treatment.