I am reading the book *High-Dimensional Probability: An Introduction with Applications in Data Science* by Roman Vershynin, and I am trying to do Exercise 7.3.5.
The problem is the following: let $A$ be a symmetric $n \times n$ Gaussian random matrix whose entries above the diagonal are independent $N(0,1)$ random variables and whose diagonal entries are independent $N(0,2)$ random variables. We are asked to use Sudakov-Fernique's inequality (or Gordon's inequality) to derive a bound on the operator norm of $A$.
My approach is to use Sudakov-Fernique's inequality. Since $$\|A\|=\max _{u \in S^{n-1}}\langle A u, u\rangle,$$ we can define a first Gaussian process $X_u:=\langle A u, u\rangle$, $u\in T=S^{n-1}$. I also computed the increments: $$\mathbb{E}(X_u-X_v)^2=2\sum_{i,j}(u_iu_j-v_iv_j)^2.$$
But I do not know how to construct a second Gaussian process $Y_u$ that dominates $X_u$, i.e. \begin{equation} \mathbb{E}(X_u-X_v)^2 \leq \mathbb{E}(Y_u-Y_v)^2\quad(1) \end{equation} and also satisfies $\mathbb{E}(\sup_{u\in S^{n-1}}Y_u) \leq 2\sqrt{n}$. I tried setting $Y_u:=2\langle u, g\rangle$ where $g\sim N(0,I_n)$; by Jensen's inequality we know $\mathbb{E}(\sup_{u\in S^{n-1}}Y_u)=2\mathbb{E}\|g\|_2 \leq 2\sqrt{n}$, but I do not know how to verify (1).
Can anyone help construct a $Y_u$ satisfying the two properties above? I would really appreciate it.
My solution: Actually, I have already solved the problem: take $Y_u=2\langle u,g \rangle$, so that $\mathbb{E}(Y_u-Y_v)^2=4\|u-v\|^2$. We only need to show $$\sum_{i,j}(u_iu_j-v_iv_j)^2\leq 2\|u-v\|^2.$$ Writing the left-hand side as a Frobenius norm and expanding the square, \begin{align*} \|uu^T-vv^T\|^2_F &=\|u(u-v)^T+(u-v)v^T\|^2_F\\ &= \|u(u-v)^T\|^2_F+\|(u-v)v^T\|^2_F+2\langle u(u-v)^T,(u-v)v^T\rangle_F\\ &= 2\|u-v\|^2+2\langle u,u-v\rangle\langle u-v,v\rangle\\ &\leq 2\|u-v\|^2, \end{align*} where we used $\|u\|_2=\|v\|_2=1$ and the fact that the cross term equals $2(1-\langle u,v\rangle)(\langle u,v\rangle-1)=-2(1-\langle u,v\rangle)^2\leq 0$.
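As a quick numerical sanity check of the key inequality $\|uu^T-vv^T\|_F^2\leq 2\|u-v\|_2^2$ for unit vectors, here is a small Monte Carlo sketch (the function name and tolerance are mine, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

def check_rank_one_increment(n=8, trials=1000):
    """Check ||uu^T - vv^T||_F^2 <= 2 ||u - v||_2^2 for random unit u, v."""
    for _ in range(trials):
        u = rng.standard_normal(n)
        u /= np.linalg.norm(u)
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        lhs = np.linalg.norm(np.outer(u, u) - np.outer(v, v), "fro") ** 2
        rhs = 2 * np.linalg.norm(u - v) ** 2
        if lhs > rhs + 1e-12:  # small tolerance for floating-point error
            return False
    return True
```

Of course this is no substitute for the proof above; it only guards against an algebra slip.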
Actually, your answer is not correct, since it relies on the false assertion that $\|A\| = \max_{u\in S^{n-1}}\langle Au, u \rangle$. You can easily see this is false by taking $A = -I_n$: $\|-I_n\| = 1$ while $\max_{u\in S^{n-1}}\langle -I_n u, u\rangle = -1$. This can be fixed by taking absolute values, i.e. $\|A\| = \max_{u\in S^{n-1}}|\langle Au, u \rangle|$. But, unfortunately, $|\langle Au, u \rangle|$ is no longer a mean-zero Gaussian process, so you cannot apply the Sudakov-Fernique comparison inequality to it.
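The counterexample is easy to verify numerically. For a symmetric matrix, $\max_{u\in S^{n-1}}\langle Au,u\rangle$ is the largest eigenvalue, while $\|A\|$ is the largest eigenvalue in absolute value; for $A=-I_n$ these differ (variable names here are mine):

```python
import numpy as np

n = 5
A = -np.eye(n)

# For symmetric A: max over the unit sphere of <Au, u> is lambda_max(A),
# while the operator norm is max |lambda_i(A)|.
eigs = np.linalg.eigvalsh(A)          # sorted ascending
max_quadratic_form = eigs[-1]          # lambda_max = -1 for A = -I
operator_norm = np.abs(eigs).max()     # ||A|| = 1
```

So the quadratic form tops out at $-1$ even though the operator norm is $1$.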
To solve this exercise, we can actually imitate the proof of Theorem 7.3.1 of Vershynin's HDP book.
Let $X_{uv} = \langle Au, v\rangle$. Then, $\|A\| = \max_{u, v\in S^{n-1}}X_{uv}$.
With this, we have
\begin{align*} \mathbb{E} (X_{uv}-X_{wz})^2 &= \mathbb{E} \Big\{\sum_{i=1}^nA_{ii}(u_iv_i-w_iz_i)+\sum_{i<j}A_{ij}(u_jv_i-w_jz_i + u_iv_j-w_iz_j)\Big\}^2 \\ &= 2\sum_{i=1}^n(u_iv_i-w_iz_i)^2 + \sum_{i<j}(u_jv_i-w_jz_i + u_iv_j-w_iz_j)^2 \\ &\leq 2\sum_{i=1}^n(u_iv_i-w_iz_i)^2 + 2\sum_{i<j}\{(u_jv_i-w_jz_i)^2 + (u_iv_j-w_iz_j)^2\} \\ &= 2\sum_{i,j}(u_jv_i-w_jz_i)^2 \leq 2(\|u-w\|_2^2 + \|v-z\|_2^2). \end{align*} The last step is an inequality, not an equality: since $\|u\|_2=\|v\|_2=\|w\|_2=\|z\|_2=1$, we have $\sum_{i,j}(u_jv_i-w_jz_i)^2 = \|vu^T-zw^T\|_F^2 = 2-2\langle u,w\rangle\langle v,z\rangle \leq 4-2\langle u,w\rangle-2\langle v,z\rangle = \|u-w\|_2^2+\|v-z\|_2^2$, because $2(1-\langle u,w\rangle)(1-\langle v,z\rangle)\geq 0$.
The rest is straightforward: let $Y_{uv} = \sqrt{2}\langle g,u \rangle + \sqrt{2}\langle h,v \rangle$, where $g,h \sim N(0, I_n)$ are independent random vectors; then $\mathbb{E}(Y_{uv}-Y_{wz})^2 = 2\|u-w\|_2^2 + 2\|v-z\|_2^2$, which dominates the increments of $X$ computed above.
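To double-check the comparison $\mathbb{E}(X_{uv}-X_{wz})^2 \leq \mathbb{E}(Y_{uv}-Y_{wz})^2$, one can evaluate the exact increment variance of $X$ (diagonal entries have variance 2, off-diagonal variance 1) and compare it to $2\|u-w\|_2^2+2\|v-z\|_2^2$ over random unit vectors. A rough sketch, with helper names of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

def increment_var_X(u, v, w, z):
    """Exact E(X_{uv} - X_{wz})^2 for the symmetric Gaussian matrix A
    with independent N(0,2) diagonal and N(0,1) off-diagonal entries."""
    n = len(u)
    diag = 2.0 * np.sum((u * v - w * z) ** 2)
    off = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            off += (u[j] * v[i] + u[i] * v[j] - w[j] * z[i] - w[i] * z[j]) ** 2
    return diag + off

def check_comparison(n=6, trials=500):
    """Check E(X_{uv}-X_{wz})^2 <= 2||u-w||^2 + 2||v-z||^2 on random unit vectors."""
    for _ in range(trials):
        u, v, w, z = [x / np.linalg.norm(x) for x in rng.standard_normal((4, n))]
        lhs = increment_var_X(u, v, w, z)
        rhs = 2 * (np.linalg.norm(u - w) ** 2 + np.linalg.norm(v - z) ** 2)
        if lhs > rhs + 1e-12:  # floating-point tolerance
            return False
    return True
```

The right-hand side is exactly the increment variance of $Y_{uv}=\sqrt{2}\langle g,u\rangle+\sqrt{2}\langle h,v\rangle$.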
The Sudakov-Fernique comparison inequality then yields $\mathbb{E}\|A\| \leq \sqrt{2}\,\mathbb{E}\|g\|_2 + \sqrt{2}\,\mathbb{E}\|h\|_2 \leq 2\sqrt{2n}$. Note that this is a factor of $\sqrt{2}$ larger than the bound stated in Exercise 7.3.5. I guess Vershynin made a mistake similar to yours.
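For what it's worth, a small simulation of this ensemble stays comfortably below the $2\sqrt{2n}$ bound proved here (and empirically hovers near $2\sqrt{n}$, consistent with the sharp asymptotics, though I only assert the proved bound). Sketch, with sampling choices of my own:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_norm_mean(n=50, trials=200):
    """Monte Carlo estimate of E||A|| for symmetric A with independent
    N(0,1) entries above the diagonal and N(0,2) entries on the diagonal."""
    norms = []
    for _ in range(trials):
        G = rng.standard_normal((n, n))
        A = (G + G.T) / np.sqrt(2)  # off-diagonal Var 1, diagonal Var 2
        norms.append(np.linalg.norm(A, 2))  # spectral norm
    return float(np.mean(norms))

n, trials = 50, 200
mean_norm = sample_norm_mean(n, trials)
sf_bound = 2 * np.sqrt(2 * n)  # the 2*sqrt(2n) bound derived above
```

Note $(G+G^T)/\sqrt{2}$ gives off-diagonal entries $(g_{ij}+g_{ji})/\sqrt{2}\sim N(0,1)$ and diagonal entries $\sqrt{2}\,g_{ii}\sim N(0,2)$, matching the exercise.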