I am interested in examples of constructing a distribution by coin flipping.
Specifically, I want to show how to construct a random variable $X$ with distribution $(p_1,...,p_n)$ by flipping a fair coin, and to prove that the expected number of coin flips is at least the entropy $H(X)$.
So far I have no idea where to start. I would appreciate examples and hints.
I'm not sure exactly what you're trying to ask here. You'll probably be able to find this answer in a textbook somewhere. If you were asking something else it would be helpful if you gave a little more detail in your question.
Suppose $p_i< 2^{-k}$. If I toss a coin $k$ times I have $2^k$ possible outcomes, each of probability $2^{-k}$, so any event determined by $k$ flips has probability that is a multiple of $2^{-k}$. In particular no nonempty event has probability as small as $p_i$, so I cannot generate an event of probability $p_i$ with only $k$ coin flips.
Therefore if my generator returns $X=i$ it must have flipped the coin at least $-\log_2(p_i)$ times.
So if $K$ is the number of coin flips before my algorithm returns a value we must have $$\mathbb E(K|X=i) \geq -\log_2(p_i).$$
Now we just add these up $$\begin{array}{rcl}\mathbb E(K) &=& \sum_{i=1}^n \mathbb E(K|X=i)\mathbb P(X = i) \\ &\geq& \sum_{i=1}^n -\log_2(p_i)p_i \\ &=&\mathcal H(X).\end{array}$$
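For a dyadic distribution the bound is tight. Here is a small sketch checking this for $p=(1/2,1/4,1/4)$, where the obvious decision tree (head $\to 1$; tail then head $\to 2$; tail then tail $\to 3$) achieves $\mathbb E(K)=H(X)$ exactly. The function name `entropy` is my own, not from the discussion above.

```python
import math

def entropy(p):
    """Shannon entropy of a probability vector p, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Dyadic example: p = (1/2, 1/4, 1/4).
# Decision tree: H -> return 1; TH -> return 2; TT -> return 3.
p = [0.5, 0.25, 0.25]
H = entropy(p)

# Exact expected number of flips for that tree:
# K = 1 with probability 1/2 (first flip is a head),
# K = 2 with probability 1/2 (first flip is a tail).
EK = 0.5 * 1 + 0.5 * 2

print(H, EK)  # both equal 1.5, so E(K) >= H(X) holds with equality
```

So for dyadic $p_i$ the entropy lower bound is achieved; the next paragraph shows it can be far from achievable in the worst case when the $p_i$ are not dyadic.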
In the worst case it can take much longer. If I have a random variable with $p_1 = 0.50001$, $p_2 = 0.49999$, then no bounded number of coin flips can simulate it exactly (any finite decision tree only produces dyadic probabilities), even though the entropy is essentially one bit.
The easiest way to generate your random variable is to interpret an infinite string of coin flips as a random number in $[0,1]$.
If you replace every Head with a $1$ and every Tail with a $0$, then the sequence of coin flips $THTHHT\dots$ represents the binary number $0.010110\dots_2 \approx 0.34375$.
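As a quick sanity check of that conversion, here is a one-line sketch (the helper name `flips_to_fraction` is my own):

```python
def flips_to_fraction(flips):
    """Interpret a string of coin flips ('H'/'T') as the binary fraction
    0.b1b2..., with H -> 1 and T -> 0: the k-th flip contributes 2^{-k}."""
    return sum(2 ** -(k + 1) for k, c in enumerate(flips) if c == "H")

x = flips_to_fraction("THTHHT")  # 0.010110 in binary
print(x)  # 0.34375
```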
Let's make this rigorous.
Set $$\begin{array}{rcl} d_k &=& \left\{\begin{array}{rl} 2^{-k} &\text{if the $k$th toss is a head} \\0 &\text{if the $k$th toss is a tail}\end{array}\right. \\ X_i &=& \sum_{k=1}^i d_k \\ X &=& \sum_{k=1}^\infty d_k \\ q_i&=& \sum_{k=1}^i p_k \\ &=& \mathbb P(M\leq i), \end{array}$$ where $M$ is the value the algorithm will output.
Now we want to set $M=i$ whenever $q_{i-1} < X < q_{i}$ (with $q_0 = 0$). After tossing the coin $k$ times we know that $X_k \leq X \leq X_k + 2^{-k}$, so we stop tossing at the first $k$ such that $q_{i-1} < X_k$ and $X_k + 2^{-k} < q_{i}$ for some $i$.
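This stopping rule can be sketched in a few lines: refine the interval $[X_k, X_k + 2^{-k}]$ one flip at a time and return $i$ as soon as the interval fits inside one of the cells defined by the cumulative probabilities. The function name `sample` and its return convention are my own choices, not part of the argument above.

```python
import random

def sample(p, flip=lambda: random.random() < 0.5):
    """Generate i in {1, ..., n} with P(i) = p[i-1] by refining the binary
    expansion X = 0.b1b2... one fair-coin flip at a time.  We stop as soon
    as the interval [X_k, X_k + 2^{-k}] fits inside some cell [q_{i-1}, q_i],
    where the q_i are the cumulative sums of p.  Returns (i, number of flips)."""
    q = [0.0]
    for pi in p:
        q.append(q[-1] + pi)          # cumulative probabilities q_0, ..., q_n
    lo, width = 0.0, 1.0              # current interval is [lo, lo + width]
    flips = 0
    while True:
        # Does the interval already lie inside one cell [q_{i-1}, q_i]?
        for i in range(1, len(q)):
            if q[i - 1] <= lo and lo + width <= q[i]:
                return i, flips
        width /= 2                    # flip once more: halve the interval
        if flip():                    # head: the k-th digit contributes 2^{-k}
            lo += width
        flips += 1
```

Note the algorithm terminates with probability $1$ (the interval shrinks to the point $X$, which avoids every boundary $q_i$ almost surely), but for non-dyadic $q_i$ there is no deterministic bound on the number of flips, matching the $p_1 = 0.50001$ remark above.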