Does Covariance inequality still hold when variable is a vector?


While studying probability theory, I came across the following conclusion (the "covariance inequality"):

If $x$ and $y$ are two scalar random variables, their covariance is $$ \operatorname{Cov}(x, y) = \mathbb{E}[(x-\mathbb{E}[x])(y-\mathbb{E}[y])]=\mathbb{E}[xy]-\mathbb{E}[x]\mathbb{E}[y], $$ where $\mathbb{E}[\cdot]$ denotes the expectation. Noting that $\operatorname{Var}(x)=\mathbb{E}[(x-\mathbb{E}[x])^2]$, we have: $$ (\operatorname{Cov}(x,y))^2 \le \operatorname{Var}(x) \operatorname{Var}(y). $$
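As a quick numerical sanity check of the scalar inequality (this is only an illustration on sample statistics, not part of the proof; the data and coupling below are arbitrary choices):

```python
import numpy as np

# Check Cov(x, y)^2 <= Var(x) Var(y) on correlated samples.
# The sample covariance/variance satisfy the same Cauchy-Schwarz
# bound, since the empirical distribution is itself a distribution.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)  # y deliberately correlated with x

cov_xy = np.cov(x, y)[0, 1]
var_x, var_y = x.var(ddof=1), y.var(ddof=1)
print(cov_xy**2 <= var_x * var_y)
```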

For a proof of this inequality, refer to here. However, I am unsure whether the following still holds when $\mathbf{X}$ is an $m\times n$ random matrix and $\mathbf{y}$ is an $n\times 1$ random vector: $$ \lVert\mathbb{E}[\mathbf{X}\mathbf{y}]-\mathbb{E}[\mathbf{X}]\,\mathbb{E}[\mathbf{y}]\rVert_2^2 \le \mathbb{E}\lVert\mathbf{X}-\mathbb{E}[\mathbf{X}]\rVert_F^2 \; \mathbb{E}\lVert\mathbf{y}-\mathbb{E}[\mathbf{y}]\rVert_2^2 $$

On BEST ANSWER

Yep - this follows pretty simply from Cauchy-Schwarz. Taking $X$ and $y$ to be mean zero, and letting $X_i$ denote the rows of $X$, \begin{align*} \lVert\mathbf{E}[Xy]\rVert^2 = \sum_{i=1}^m \left(\mathbf{E}[X_i^{\mathrm{T}} y]\right)^2 \leq \sum_{i=1}^m \mathbf{E}\lVert X_i\rVert^2\, \mathbf{E}\lVert y\rVert^2 = \mathbf{E}\lVert X\rVert_F^2\, \mathbf{E}\lVert y \rVert^2, \end{align*} where the inequality applies Cauchy-Schwarz twice: first to the inner product $X_i^{\mathrm{T}} y$, then to the expectation of $\lVert X_i\rVert\,\lVert y\rVert$.

For non-centered $X$ and $y$, we apply this to the centered variables $X - \mathbf{E}X$ and $y - \mathbf{E}y$: $$ \lVert \mathbf{E} Xy - \mathbf{E} X\, \mathbf{E}y\rVert^2 = \lVert\mathbf{E}(X - \mathbf{E}X)(y - \mathbf{E}y)\rVert^2 \leq \mathbf{E}\lVert X - \mathbf{E}X \rVert_F^2 \, \mathbf{E}\lVert y - \mathbf{E}y \rVert^2. $$
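The matrix-vector bound can also be checked numerically. Below is a Monte Carlo sketch (dimensions, the coupling between $X$ and $y$, and the sample size are arbitrary choices); since the empirical distribution is itself a distribution, the inequality holds exactly when all expectations are replaced by sample means over the same draws:

```python
import numpy as np

# Check ||E[Xy] - E[X]E[y]||_2^2 <= E||X - E[X]||_F^2 * E||y - E[y]||_2^2
# for an m x n random matrix X and an n x 1 random vector y,
# with expectations estimated by sample means.
rng = np.random.default_rng(1)
m, n, trials = 3, 4, 50_000

X = rng.normal(size=(trials, m, n))
y = rng.normal(size=(trials, n, 1)) + 0.3 * X[:, 0, :, None]  # couple y to X

EXy = np.einsum('tij,tjk->ik', X, y) / trials   # E[Xy], shape (m, 1)
EX, Ey = X.mean(axis=0), y.mean(axis=0)         # E[X], E[y]

lhs = np.linalg.norm(EXy - EX @ Ey) ** 2
rhs = (np.linalg.norm(X - EX, axis=(1, 2)) ** 2).mean() \
    * (np.linalg.norm(y - Ey, axis=(1, 2)) ** 2).mean()
print(lhs <= rhs)
```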