I'm confused about the notation $\pi(x_p|x)$ and $\pi(x|x_p)$.
Let's say $X\sim \text{Bin}(10,0.3)$, so $\pi(x)=\binom{10}{x}0.3^x\,0.7^{10-x}$. What would $\pi(x_p|x)$ or $\pi(x|x_p)$ mean here? It doesn't seem to make sense; for example, what would the value of $\pi(4|3)$ be?
If I've misunderstood $\pi(x)$, could you provide an example with actual values showing why $\pi(x_p|x)$ is meaningful?

The notation is overloaded here: $\pi(x_p|x)$ is not (directly) related to $\pi(x)$, and it would arguably be clearer to use different notation, such as $q(x_p|x)$. The "constructing a Markov chain" referred to in your text means choosing a transition distribution $q(x_p|x)$ so that the stated condition holds. So you could, for example, choose something like
$q(x_p|x) = 0.5$ if $x_p \equiv x \pm 1 \pmod{11}$ and $q(x_p|x) = 0$ otherwise.
This example is a "random walk" transition distribution over the state space $\{0,1,\ldots,10\}$ (the support of your binomial, with wrapping at the boundaries). It turns out this alone will not satisfy the stated condition; the text is describing a way to choose a transition distribution $q(x_p|x)$ so that the condition $\pi(x)q(x_p|x) = \pi(x_p)q(x|x_p)$ (detailed balance) will be satisfied. This method is known as the Metropolis-Hastings algorithm.
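To make this concrete with your binomial example, here is a minimal Python sketch (not from any particular library). It takes the state space to be the binomial support $\{0,\ldots,10\}$ with wrap-around at the boundaries, checks that the raw random-walk proposal violates detailed balance, and then applies the Metropolis-Hastings acceptance correction, after which detailed balance holds:

```python
import math

# Target: pi(x) = Binomial(10, 0.3) pmf over the states {0, ..., 10}.
N, P = 10, 0.3
STATES = range(N + 1)

def pi(x):
    return math.comb(N, x) * P**x * (1 - P)**(N - x)

def q(xp, x):
    # Random-walk proposal with wrapping: move to x-1 or x+1 (mod 11).
    return 0.5 if xp in {(x - 1) % (N + 1), (x + 1) % (N + 1)} else 0.0

def mh_kernel(xp, x):
    # Metropolis-Hastings transition probability: propose xp with q(xp|x),
    # accept with probability min(1, pi(xp)q(x|xp) / (pi(x)q(xp|x))).
    if xp == x:
        # Probability of staying put: 1 minus total probability of moving.
        return 1.0 - sum(mh_kernel(y, x) for y in STATES if y != x)
    if q(xp, x) == 0.0:
        return 0.0
    accept = min(1.0, pi(xp) * q(x, xp) / (pi(x) * q(xp, x)))
    return q(xp, x) * accept

# The raw proposal violates detailed balance ...
raw_gap = max(abs(pi(x) * q(xp, x) - pi(xp) * q(x, xp))
              for x in STATES for xp in STATES)
# ... but the MH-corrected kernel satisfies it (up to floating point).
mh_gap = max(abs(pi(x) * mh_kernel(xp, x) - pi(xp) * mh_kernel(x, xp))
             for x in STATES for xp in STATES)
print(f"raw proposal detailed-balance gap: {raw_gap:.4f}")
print(f"MH kernel detailed-balance gap:    {mh_gap:.2e}")
```

So "$\pi(4|3)$" never refers to the binomial pmf itself; in the notation $q(x_p|x)$ it is just the probability that the chain proposes (or moves to) state 4 when it is currently at state 3.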
This kind of overloading of notation is not uncommon in the computational statistics literature, and it's definitely something to be aware of.