The following are the standard norms on $\mathbb{R}^n$: $$\|x\|_1 = \sum_{i=1}^{n} |x_i|,\quad \|x\|_2 = \left(\sum_{i=1}^{n} |x_i|^2\right)^{1/2},\quad \|x\|_\infty = \max_{1\leq i\leq n} |x_i|,$$ with associated metrics $d_1$, $d_2$, $d_\infty$ on $\mathbb{R}^n$. Show that: $(a)$ For all $x\in \mathbb{R}^n$, $$\|x\|_\infty \leq \|x\|_1 \leq \sqrt{n}\,\|x\|_2 \leq n\,\|x\|_\infty.$$ $(b)$ For $E\subset \mathbb{R}^n$ bounded and $\delta > 0$, denote by $N_1(E, \delta)$, $N_2(E, \delta)$, $N_\infty(E, \delta)$ the metric entropy numbers of $E$ associated with $d_1$, $d_2$, $d_\infty$, respectively. Then $$N_\infty(E, \delta) \leq N_2(E, \delta) \leq N_1(E, \delta) \leq N_2(E, \delta / \sqrt{n}) \leq N_\infty(E, \delta / n).$$ $(c)$ Let $Q = [0,1]^n$ be the unit cube in $\mathbb{R}^n$. Find its metric dimension with respect to the metrics $d_1$, $d_2$, $d_\infty$.
Sorry for the long problem. I just wanted to make sure all of the information that I have at my disposal is available here as well!
I'm not really sure how to prove any of these. For $(a)$ do I need to use the distance metrics at all, or should I be able to prove this inequality from the definition itself?
Sorry if I seem clueless - I just had a midterm for this class and am dead tired and thus not thinking straight. However, a push in the right direction with any part of this problem is much appreciated.
edit: I have figured out part $(a)$, but am still a little stuck on $(b)$ and $(c)$.
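(Not a proof, but in case it helps to build intuition: the chain in $(a)$ can be sanity-checked numerically on random vectors. This is a quick pure-Python sketch; the helper name `norms` is just for illustration.)

```python
import random

def norms(x):
    """Return (||x||_1, ||x||_2, ||x||_inf) for a real vector x."""
    n1 = sum(abs(t) for t in x)
    n2 = sum(t * t for t in x) ** 0.5
    ninf = max(abs(t) for t in x)
    return n1, n2, ninf

random.seed(0)
n, eps = 5, 1e-9  # eps absorbs floating-point rounding
for _ in range(1000):
    x = [random.uniform(-10.0, 10.0) for _ in range(n)]
    n1, n2, ninf = norms(x)
    # part (a): ||x||_inf <= ||x||_1 <= sqrt(n)*||x||_2 <= n*||x||_inf
    assert ninf <= n1 + eps
    assert n1 <= n ** 0.5 * n2 + eps
    assert n ** 0.5 * n2 <= n * ninf + eps
```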
For $(a)$:
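A sketch of the standard argument, using Cauchy–Schwarz for the middle step:

```latex
% ||x||_inf <= ||x||_1 : the max is one of the summands
\|x\|_\infty = \max_{1\le i\le n} |x_i| \le \sum_{i=1}^{n} |x_i| = \|x\|_1.
% ||x||_1 <= sqrt(n) ||x||_2 : Cauchy--Schwarz against the all-ones vector
\|x\|_1 = \sum_{i=1}^{n} 1\cdot|x_i|
        \le \Big(\sum_{i=1}^{n} 1^2\Big)^{1/2}\Big(\sum_{i=1}^{n} |x_i|^2\Big)^{1/2}
        = \sqrt{n}\,\|x\|_2.
% sqrt(n) ||x||_2 <= n ||x||_inf : bound each |x_i| by the max
\sqrt{n}\,\|x\|_2 \le \sqrt{n}\Big(\sum_{i=1}^{n} \|x\|_\infty^2\Big)^{1/2}
                  = \sqrt{n}\cdot\sqrt{n}\,\|x\|_\infty = n\,\|x\|_\infty.
```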
I might be able to help you for $(b)$ and $(c)$ but I'm not familiar with the terms "metric entropy" and "metric dimension".
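For what it's worth, "metric entropy" here presumably means the covering number: $N(E,\delta)$ is the minimal number of $\delta$-balls, in the given metric, needed to cover $E$. With that reading, $(b)$ follows from $(a)$: whenever $\|x\| \le C\|x\|'$ pointwise, every $d'$-ball of radius $\delta/C$ is contained in a $d$-ball of radius $\delta$, so a cover in one metric converts into a cover in the other. A sketch:

```latex
% By part (a), d_infty <= d_2 <= d_1 pointwise, so every d_1-ball of radius
% delta lies in a d_2-ball of radius delta, which lies in a d_infty-ball of
% radius delta; hence
N_\infty(E,\delta) \le N_2(E,\delta) \le N_1(E,\delta).
% From ||x||_1 <= sqrt(n)||x||_2: a d_2-ball of radius delta/sqrt(n) lies in
% a d_1-ball of radius delta, so
N_1(E,\delta) \le N_2(E,\delta/\sqrt{n}).
% From ||x||_2 <= sqrt(n)||x||_infty: a d_infty-ball of radius delta/n lies
% in a d_2-ball of radius delta/sqrt(n), so
N_2(E,\delta/\sqrt{n}) \le N_\infty(E,\delta/n).
```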
Edit:
And for $(c)$, maybe you have seen some theorems that can help? It might be possible to solve it "by hand", but that seems difficult.
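If "metric dimension" means the box-counting limit $\lim_{\delta\to 0}\log N(E,\delta)/\log(1/\delta)$ (an assumption on my part), then $(b)$ does most of the work: the rescalings of $\delta$ by $\sqrt{n}$ or $n$ only shift $\log N$ by a bounded amount relative to $\log(1/\delta)$, so all three metrics give the same limit. A sketch in $d_\infty$, where counting is easiest:

```latex
% In d_infty a delta-ball is a cube of side 2*delta, so a regular grid of
% ceil(1/(2*delta))^n cubes covers Q = [0,1]^n, while a volume (or grid-point)
% argument gives a matching lower bound of order delta^{-n}; hence
c\,\delta^{-n} \le N_\infty(Q,\delta) \le C\,\delta^{-n},
\qquad\text{so}\qquad
\lim_{\delta\to 0}\frac{\log N_\infty(Q,\delta)}{\log(1/\delta)} = n.
% By (b), N_1 and N_2 are sandwiched between values of N_infty at comparable
% scales, so the metric dimension of Q is n for d_1, d_2, and d_infty alike.
```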