This is the first step of the proof of MCMC in my notes.

I have a question: why is it that $\pi(x)\pi(x_p\mid x)=\pi(x_p)\pi(x\mid x_p)$? Is it true for any Markov chain that is ergodic and aperiodic? The text says "we start by constructing a Markov chain, making it $\pi(x)\pi(x_p\mid x) = \pi(x_p)\pi(x\mid x_p)$"... But how can we prove that for any ergodic, aperiodic Markov chain we have $\pi(x)\pi(x_p\mid x)=\pi(x_p)\pi(x\mid x_p)$?
I'm confused again. In this example, the target distribution is Bin$(10,0.3)$ and the proposal distribution is $q_{i,j}=\frac{1}{12}$. Since the Metropolis-Hastings scheme creates a reversible Markov chain with an equilibrium distribution equal to the target distribution, shouldn't the two agree? But in the example they are not the same: the target distribution is $\binom{10}{x}0.3^x0.7^{10-x}$, while the proposal distribution is always $\frac{1}{12}$?

A Markov chain can have stationary distribution $\pi(x)$ and still not satisfy the detailed balance equation $\pi(x) \pi(x_p \mid x) = \pi(x_p) \pi(x \mid x_p)$. Since aperiodicity and ergodicity do not require that $\pi(x_p \mid x) > 0$ for all $x_p, x$, you can construct a counterexample by making some $\pi(x_p \mid x) = 0$ while the reverse transition $\pi(x \mid x_p) > 0$.
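Here is a small numerical sketch of such a counterexample (the specific transition matrix is my own choice, not from the text): a 3-state chain that is irreducible and aperiodic, whose stationary distribution is uniform, yet which violates detailed balance because one transition has positive probability while its reverse has probability zero.

```python
# A 3-state chain: irreducible (every state reachable) and aperiodic
# (self-loops), with uniform stationary distribution, but NOT reversible.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
]
pi = [1 / 3, 1 / 3, 1 / 3]

# Stationarity: (pi P)(j) = sum_i pi[i] * P[i][j] should equal pi[j].
pi_P = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(pi_P)  # all three entries equal 1/3 (up to float rounding)

# Detailed balance fails for the pair (0, 1):
# pi[0] * P[0][1] = 1/6, but pi[1] * P[1][0] = 0.
print(pi[0] * P[0][1], pi[1] * P[1][0])
```

The chain circulates probability mass "clockwise" through the states; the stationary equation $\pi P = \pi$ holds, but the flow between individual pairs of states is not balanced.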
The definition you gave of a reversible Markov chain is incorrect. A Markov chain is reversible if there exists a probability distribution $p$ over its states $i, j$ such that $p_i \pi(j | i) = p_j \pi(i | j)$. The definition that you gave doesn't include the $p_i, p_j$ terms, which are essential. https://en.wikipedia.org/wiki/Markov_chain#Reversible_Markov_chain
If $\pi(x)$ satisfies the detailed balance equation for the Markov chain $X$ with transition density $\pi(x_p \mid x)$, then $\pi(x)$ is the equilibrium distribution for $X$, and $X$ is reversible. This theorem (6.46 in the book Monte Carlo Statistical Methods) is used to show that the Metropolis-Hastings algorithm works.
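The stationarity part of that theorem follows in one line by summing the detailed balance equation over $x$:

$$\sum_x \pi(x)\,\pi(x_p \mid x) = \sum_x \pi(x_p)\,\pi(x \mid x_p) = \pi(x_p) \sum_x \pi(x \mid x_p) = \pi(x_p),$$

so one step of the chain started from $\pi$ leaves $\pi$ unchanged (replace the sums by integrals in the continuous case).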
Do you mean Bin(11, 0.3)? 11 isn't in the domain of Bin(10, 0.3). I'm going to set $$f(x) = \binom{11}{x} 0.3^x0.7^{11-x}.$$ Suppose we are at iteration $x_n$ (you can set the first observation $x_1$ to be any number). You then propose a new iteration $x^\prime$ by sampling uniformly from $\{0, \dots, 11\}$. The Metropolis-Hastings probability of setting $x_{n+1} = x^\prime$ is given by $$\alpha(x_n, x^\prime) = \min\left(\frac{f(x^\prime) \frac{1}{12}}{f(x_n) \frac{1}{12}}, 1 \right) = \min\left(\frac{f(x^\prime)}{f(x_n)}, 1 \right),$$ where the $1/12$s come from the proposal probabilities. With probability $\alpha(x_n, x^\prime)$ we set $x_{n+1} = x^\prime$, otherwise we set $x_{n+1} = x_n$. The sequence $x_1, x_2, \dots$ is a Markov chain with equilibrium distribution Bin(11, 0.3).