Concentration of the norm of a linearly transformed normal random vector as the dimension goes to infinity


This was recently asked on MO but received no response there.

Let $X=(X_1, \dots, X_n) \in \mathbb{R}^n$ with $X_i \sim N(0,1)$ i.i.d. Let $B: \mathbb{R}^n \to \mathbb{R}^n$ be the diagonal linear map $(Bx)_k := x_k/k$ for $1 \le k \le n$, so that $\|B\|_F^2 = \sum_{k=1}^{n}\frac{1}{k^2}$. Is it true that $\lim_{n \to \infty} \big| \mathbb{E}\|BX\| - \|B\|_F \big| = 0$? What is $\mathbb{E}\|BX\|$ anyway? Note that if $B$ were $I_n$, the answer would be yes, even for non-Gaussian $X_i$; cf. this question on MO.
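As a quick numerical sanity check (not needed to answer the question), here is a short Monte Carlo sketch in Python with illustrative sample sizes; it suggests that the gap $\|B\|_F - \mathbb{E}\|BX\|$ does *not* vanish as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch: estimate E||BX|| by Monte Carlo for a few n
# and compare with ||B||_F = sqrt(sum_{k<=n} 1/k^2).
for n in [10, 100, 1000]:
    k = np.arange(1, n + 1)
    frob = np.sqrt(np.sum(1.0 / k**2))        # ||B||_F
    X = rng.standard_normal((20_000, n))      # 20000 iid samples of X
    norms = np.linalg.norm(X / k, axis=1)     # ||BX|| for each sample
    print(n, frob, norms.mean())              # the gap stabilizes, it does not shrink
```

The empirical mean of $\|BX\|$ settles noticeably below $\|B\|_F$ even for large $n$, which is consistent with the accepted answer below.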


Motivation for this question (not needed to answer the question): concentration

Note that $\mathbb{E}[\|BX\|^2] = \|B\|_F^2$, where $\|\cdot\|_F$ denotes the Frobenius norm. For those familiar with concentration of measure or the Hanson-Wright inequality for quadratic forms, one would expect $\|BX\|^2$ to concentrate around $\mathbb{E}[\|BX\|^2] = \|B\|_F^2$. My question is: is this concentration asymptotically tight as the dimension goes to infinity?

Motivated by the fact that $\lim_{n \to \infty} \big| \mathbb{E}\|X\| - \sqrt{n} \big| = 0$, I wonder: does $\lim_{n \to \infty} \big| \mathbb{E}\|BX\| - \|B\|_F \big| = 0$? Or, if not, do we at least have $\frac{\|BX\|}{\|B\|_F} \to_p 1$ in probability as the dimension $n \to \infty$?

I purposely chose $B$ above so that the ratio of the Frobenius norm to the operator norm of $B$, i.e. $\frac{\|B\|_F}{\|B\|}$, does not go to $\infty$. If this ratio does go to infinity as $n \to \infty$, then we do have $\frac{\|BX\|}{\|B\|_F} \to_p 1$, which follows from the Hanson-Wright inequality; see the first equation on p. 144 of this book.
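The contrast between the two regimes can be seen numerically. The sketch below (illustrative sample sizes) compares the fluctuations of $\frac{\|X\|}{\sqrt{n}}$ (the case $B = I_n$, where $\frac{\|B\|_F}{\|B\|} = \sqrt{n} \to \infty$) with those of $\frac{\|BX\|}{\|B\|_F}$ for the diagonal $B$ above, where the ratio stays bounded:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2_000, 5_000
X = rng.standard_normal((m, n))

# Case 1: B = I_n, so ||B||_F / ||B|| = sqrt(n) -> infinity;
# the ratio ||X|| / sqrt(n) concentrates tightly around 1.
ratio_id = np.linalg.norm(X, axis=1) / np.sqrt(n)

# Case 2: B = diag(1/k), so ||B||_F / ||B|| stays bounded;
# the ratio ||BX|| / ||B||_F keeps order-one fluctuations.
k = np.arange(1, n + 1)
frob = np.sqrt(np.sum(1.0 / k**2))
ratio_B = np.linalg.norm(X / k, axis=1) / frob

print(ratio_id.std(), ratio_B.std())  # tiny vs. order-one spread
```

The empirical standard deviation of the first ratio is tiny, while that of the second does not shrink with $n$, matching the Hanson-Wright heuristic.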

Best answer:

EDIT: After typing my answer below, I saw that it is essentially what was already written on MO. Since I give a few more details, it might still be useful to you.

Unfortunately, what you want to show is not true, which can be seen as follows: Since $\gamma := \sum_{k=1}^\infty k^{-2} < \infty$, the monotone convergence theorem shows that $$ Y := \sum_{k=1}^{\infty} k^{-2} X_k^2 \in L^1(\Bbb{P}), $$ where $\Bbb{P}$ is your underlying probability measure. Define $Y_n := \sum_{k=1}^n k^{-2} X_k^2$ for $n \in \Bbb{N}$.

Note that $Y \geq 0$. Define $Z := \sqrt{Y} \in L^2(\Bbb{P})$, and note that $Y$ (and hence $Z$) is not almost surely constant. Indeed, if $Y$ were almost surely constant, we would have $Y \equiv \Bbb{E}[Y] = \gamma$ almost surely, and hence $$ \gamma = \gamma \cdot \Bbb{E}[X_1^2] = \Bbb{E} [X_1^2 Y] = \Bbb{E} [X_1^4] + \sum_{k=2}^\infty k^{-2} \, \Bbb{E} [X_1^2] \, \Bbb{E}[X_k^2] = 3 + (\gamma - 1) = 2 + \gamma, $$ which is a contradiction. Here, I used the independence of the $X_k$ and that $\Bbb{E}[X_1^4] = 3$.
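A small Monte Carlo sketch (illustrative sample sizes, truncating the series at a hypothetical level $n = 200$) is consistent with this computation: the estimate of $\Bbb{E}[X_1^2 Y]$ lands near $2 + \gamma$, not near $\gamma$, confirming that $Y$ cannot be almost surely constant:

```python
import numpy as np

# Monte Carlo check (illustrative sample sizes) of the identity
# E[X_1^2 Y] = E[X_1^4] + sum_{k>=2} k^{-2} = 2 + gamma; if Y were
# almost surely constant we would instead get E[X_1^2 Y] = gamma.
rng = np.random.default_rng(2)
m, n = 100_000, 200                    # samples, truncation level
k = np.arange(1, n + 1)
gamma_n = np.sum(1.0 / k**2)           # truncated gamma
X = rng.standard_normal((m, n))
Y = np.sum((X / k) ** 2, axis=1)       # Y_n = sum_{k<=n} k^{-2} X_k^2
est = np.mean(X[:, 0] ** 2 * Y)        # estimate of E[X_1^2 Y_n]
print(est, 2 + gamma_n, gamma_n)       # est is near 2 + gamma_n, not gamma_n
```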

Since $Z$ is not almost surely constant, the Cauchy-Schwarz inequality is strict in the following application: $$ \Bbb{E} [\| B X \|] = \Bbb{E} \sqrt{Y_n} \leq \Bbb{E} \sqrt{Y} = \Bbb{E} [1 \cdot Z] < \sqrt{\Bbb{E}[1^2]} \cdot \sqrt{\Bbb{E}[Z^2]} = \sqrt{\gamma} . $$ Since $\Bbb{E} \sqrt{Y_n} \uparrow \Bbb{E} \sqrt{Y}$ by monotone convergence and $\| B \|_F \uparrow \sqrt{\gamma}$, it follows that $$ \lim_{n \to \infty} \big| \Bbb{E} \| B X \| - \| B \|_F \big| = \sqrt{\gamma} - \Bbb{E} [\sqrt{Y}] > 0 . $$

Finally, note that if we had $\frac{\| B X \|}{\| B \|_F} \to 1$ in probability, then a subsequence would converge almost surely, and along it $\| B X \| = \frac{\| B X \|}{\| B \|_F} \cdot \| B \|_F \to \sqrt{\gamma}$ almost surely. Since $0 \leq \| B X \| = \sqrt{Y_n} \leq \sqrt{Y} = Z \in L^2(\Bbb{P}) \subset L^1(\Bbb{P})$, the dominated convergence theorem then shows $$ \Bbb{E} \| B X \| \xrightarrow[n\to\infty]{} \sqrt{\gamma} $$ along this subsequence, contradicting the strict bound $\Bbb{E} \| B X \| \leq \Bbb{E} \sqrt{Y} < \sqrt{\gamma}$ shown above.