Prove that Sauer's lemma (Theorem 3.17) is tight, i.e., for any set $X$ of $m > d$ elements, show that there exists a hypothesis class $\mathcal{H}$ of VC-dimension $d$ such that $\Pi_\mathcal{H}(m) = \sum_{i=0}^d \binom{m}{i}$, where $\Pi_\mathcal{H}(m)$ is the growth function.
This is an exercise from "Foundations of Machine Learning". I already know the upper bound $\Pi_\mathcal{H}(m) \leq \sum_{i=0}^d \binom{m}{i}$, since it is proved in Chapter 3 of the textbook, so what remains is to exhibit a hypothesis set that attains this bound. My guess is that the proof uses the decomposition $\sum_{i=0}^d \binom{m}{i} = \sum_{i=0}^d \binom{m-1}{i} + \sum_{i=0}^{d-1} \binom{m-1}{i}$ (Pascal's rule applied term by term), as in the proof of the upper bound, but I can't come up with any other ideas. Could someone give me a hint?
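For what it's worth, the Pascal-style decomposition I have in mind can at least be checked numerically. Below is a small sketch I wrote (not from the textbook; the helper name `phi` is my own) that verifies the identity for small values of $m$ and $d$:

```python
from math import comb

def phi(m, d):
    """The Sauer bound: sum_{i=0}^{d} C(m, i)."""
    return sum(comb(m, i) for i in range(d + 1))

# Check phi(m, d) == phi(m-1, d) + phi(m-1, d-1) for small m > d >= 1,
# which follows from applying C(m,i) = C(m-1,i) + C(m-1,i-1) term by term.
for m in range(2, 12):
    for d in range(1, m):
        assert phi(m, d) == phi(m - 1, d) + phi(m - 1, d - 1)

print("decomposition verified for all tested (m, d)")
```

This suggests an induction on $m$ might work if the right hypothesis class is chosen, but I don't see how to set it up.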
Reference

Mohri, M., Rostamizadeh, A., and Talwalkar, A. (2018). *Foundations of Machine Learning*, second edition. MIT Press.