Bound on expectation of inner product of a Gaussian rv divided by product of norms


Let $Y$ be an $n$-dimensional Gaussian random vector with expectation $0$ and covariance matrix $\Sigma$ (which is explicitly not diagonal). Let $\gamma$ be a vector in $\mathbb{R}^n$, let $\| \cdot \|$ denote the Euclidean norm in $\mathbb{R}^n$ and let $\langle \cdot, \cdot \rangle$ denote the Euclidean scalar product. My question is whether it is possible to find a vector $\gamma$ such that $$ \mathbb{E} \left[ \frac{| \langle \gamma, Y \rangle |}{\| \gamma\| \| Y \| } \right] \leq \frac{1}{n},$$ or whether this inequality fails for every $\gamma$. Obviously, Cauchy-Schwarz gives the upper bound $1$ for any $\gamma$, but I would be content with finding just a single $\gamma$ that fulfills the stricter bound.
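The expectation in question can be estimated by Monte Carlo for any concrete choice of $\Sigma$ and $\gamma$; a minimal sketch assuming NumPy (the covariance matrix and the vector below are arbitrary illustrative choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Arbitrary non-diagonal covariance matrix Sigma = A A^T (illustrative choice)
A = rng.standard_normal((n, n))
Sigma = A @ A.T
gamma = rng.standard_normal(n)

# Draw samples of Y ~ N(0, Sigma) and average |<gamma, Y>| / (||gamma|| ||Y||)
Y = rng.multivariate_normal(np.zeros(n), Sigma, size=200_000)
est = np.mean(np.abs(Y @ gamma) / (np.linalg.norm(gamma) * np.linalg.norm(Y, axis=1)))

# Cauchy-Schwarz guarantees the estimate lies in (0, 1); one can then
# compare it against the hoped-for bound 1/n for various Sigma and gamma
print(est, 1 / n)
```

This only probes individual choices of $\Sigma$ and $\gamma$, of course; the answer below settles the question in general.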



BEST ANSWER

What you want to show is not true: the expectation is of order $\frac 1 {\sqrt n}$, not $\frac 1 n$. Either the integrand has to be squared or the right-hand side needs to be $\frac 1 {\sqrt n}$.

To see this, choose a basis in which $\Sigma$ is diagonal and order its eigenvalues: $0\le\lambda_1\le\dots\le \lambda_n$. Let $\gamma=(1,0,\dots,0)$ (in this basis), so that $\langle \gamma, Y\rangle=\sqrt{\lambda_1}\,N_1$ for standard independent normals $N_1,\dots,N_n$. Writing $E_n$ for the expectation in question, Jensen's inequality gives: $$E_n^2=\left(E\left[\frac{| \langle \gamma, Y \rangle |}{\| \gamma\| \| Y \| }\right]\right)^2\le E\left[\frac{| \langle \gamma, Y \rangle |^2}{\| \gamma\|^2 \| Y \|^2 }\right] =E\left[\frac {{\lambda_1}N_1^2} {\sum_{i=1}^{n}\lambda_iN_i^2}\right]\le E\left[\frac {N_1^2}{\sum_{i=1}^n N_i^2}\right]=\frac 1 n$$ The second inequality holds because $\sum_{i=1}^n\lambda_iN_i^2\ge\lambda_1\sum_{i=1}^nN_i^2$, with equality iff all eigenvalues are equal; the final equality follows by symmetry, since the $n$ ratios $N_i^2/\sum_j N_j^2$ are identically distributed and sum to $1$. However, the first (Jensen) inequality is not sharp if $n>1$. Clearly the bound is worst when all eigenvalues are equal, so assume that. When $n=2$ we get:
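The final equality $E\left[N_1^2/\sum_{i=1}^n N_i^2\right]=\frac 1 n$ is easy to check numerically; a quick sketch assuming NumPy (the dimension and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
N = rng.standard_normal((500_000, n))

# Each ratio N_i^2 / sum_j N_j^2 has the same distribution, and the n
# ratios sum to 1, so each has expectation exactly 1/n.
est = np.mean(N[:, 0] ** 2 / np.sum(N ** 2, axis=1))
print(est, 1 / n)  # both close to 0.1667
```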

$$\sqrt 2E_2=\sqrt 2 E\left[\frac {|N_1|}{\sqrt {N_1^2+N_2^2}}\right]=\sqrt 2E\left[\frac 1 {\sqrt {1+C^2}}\right]=\frac {2\sqrt 2} \pi\approx {0.90032}$$ where $C=\frac {N_2}{N_1}$ is standard Cauchy; already better than $1$. Asymptotically, using the symmetry $E_n=\frac 1 n E\left[\frac{\sum_{i=1}^n |N_i|}{\|N\|}\right]$ and the SLLN (the ratio is uniformly integrable, so expectations converge), we get: $$\sqrt n E_n=E\left[\frac {\frac{\sum_{i=1}^n |N_i|}n}{\sqrt{\frac{\sum_{i=1}^n N_i^2}n}}\right]\to \frac {E(|N_1|)}{\sqrt{E(N_1^2)}}=\sqrt{\frac 2 \pi}\approx 0.79788$$ which is better than the original simple bound of $1$.
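Both the exact value $\frac{2\sqrt 2}\pi$ for $n=2$ and the limit $\sqrt{2/\pi}$ can be checked by simulation; a sketch assuming NumPy (the sample sizes and the choice $n=200$ as a stand-in for "large $n$" are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def E_n(n, samples=400_000):
    """Monte Carlo estimate of E[|N_1| / ||N||] for N standard normal in R^n."""
    N = rng.standard_normal((samples, n))
    return np.mean(np.abs(N[:, 0]) / np.linalg.norm(N, axis=1))

est2 = np.sqrt(2) * E_n(2)         # should be close to 2*sqrt(2)/pi ~ 0.90032
est_big = np.sqrt(200) * E_n(200)  # should be close to sqrt(2/pi) ~ 0.79788
print(est2, 2 * np.sqrt(2) / np.pi)
print(est_big, np.sqrt(2 / np.pi))
```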