I am working through the proof of Theorem 3.1 in Chen and White (1999). The authors use the upper bound on the $L_2$ metric entropy with bracketing, $\mathcal{H}_{[\,]}(w, \Theta_n) = 2^k r_n B_n(1+d)\log\big(2^k r_n B_n(1+d)/w\big)$, to derive the convergence rate of a sieve estimator.
Unfortunately, they do not show the calculations behind this bound on $\mathcal{H}(w, \Theta_n)$. I tried to derive it from the standard volume-comparison inequality $$ N(\varepsilon, \Theta_n, \|\cdot\|_2) \leq \Big(\frac{3}{\varepsilon}\Big)^d \frac{\operatorname{vol}(\Theta_n)}{\operatorname{vol}(\text{NormBall})}, $$ where $\text{NormBall}$ is the unit norm ball and $d$ the dimension, but I get a rather different result, namely $\mathcal{H}(w, \Theta_n) \leq \text{const} \cdot 2^k r_n \log(B_n/w)$. Could anyone show me how the authors arrived at their bound on $\mathcal{H}(w, \Theta_n)$?
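For reference, here is my attempt in more detail. I am assuming (and this may be exactly where I go wrong) that $\Theta_n$ can be treated as a ball of radius $B_n$ in $\mathbb{R}^{d_n}$, with $d_n$ proportional to $2^k r_n$:

```latex
\begin{align*}
% Volume comparison: vol(Theta_n) <= B_n^{d_n} vol(NormBall)
N(\varepsilon, \Theta_n, \|\cdot\|_2)
  &\leq \Big(\frac{3}{\varepsilon}\Big)^{d_n}
        \frac{\operatorname{vol}(\Theta_n)}{\operatorname{vol}(\text{NormBall})}
   \leq \Big(\frac{3B_n}{\varepsilon}\Big)^{d_n}, \\
% Take logs to pass from covering numbers to metric entropy
\mathcal{H}(w, \Theta_n) = \log N(w, \Theta_n, \|\cdot\|_2)
  &\leq d_n \log\Big(\frac{3B_n}{w}\Big)
   \leq C \, 2^k r_n \log\Big(\frac{B_n}{w}\Big)
   \qquad \text{(for } w \leq B_n/3 \text{, say)}.
\end{align*}
```

So my bound only has $B_n/w$ inside the logarithm, whereas the authors' bound has the full factor $2^k r_n B_n (1+d)$ there. I suspect the discrepancy comes from relating the $L_2$ bracketing distance on the function class to the Euclidean distance on the parameter space (a Lipschitz constant that grows with the dimension would produce such a factor inside the log), but I cannot make this precise.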