If $X$ and $Y$ are random scalars, then Cauchy-Schwarz says that $$| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$$ If $X$, $Y \in \mathbb{R}^n$ are random vectors, is there a way to bound the covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?
(Note that $\mathrm{Cov}(X,Y), \, \mathrm{Var}(X), \,\mathrm{Var}(Y) \in \mathbb{R}^{n\times n}$ and $\mathrm{Var}(X)_{ij} = \mathrm{Cov}(X_i,X_j)$.)
In particular, is it true that $$\mathrm{Cov}(X,Y) \preceq \mathrm{Var}(X)^{1/2}\left(\mathrm{Var}(Y)^{1/2}\right)'$$ where the square roots are Cholesky decompositions, and the inequality is read as meaning that the right hand side minus the left hand side is positive semidefinite?
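One can at least probe the conjecture numerically (this proves nothing if no violation appears, but a clearly negative eigenvalue would exhibit a counterexample). Below is a NumPy sketch: the random $4\times 4$ PSD matrix $S$ plays the role of the joint covariance of the stacked vector $(X,Y)$ with $X, Y \in \mathbb{R}^2$, and I look at the smallest eigenvalue of the symmetric part of $\mathrm{Var}(X)^{1/2}\left(\mathrm{Var}(Y)^{1/2}\right)' - \mathrm{Cov}(X,Y)$ (the symmetric part is the right object since $x'Mx = x'\tfrac{M+M'}{2}x$):

```python
import numpy as np

rng = np.random.default_rng(1)

def min_eig_sym(M):
    # smallest eigenvalue of the symmetric part of M
    return np.linalg.eigvalsh((M + M.T) / 2).min()

worst = np.inf
for _ in range(2000):
    # random joint covariance S of the stacked vector (X, Y), X, Y in R^2
    G = rng.standard_normal((4, 8))
    S = G @ G.T                          # PSD (a.s. positive definite) 4x4
    A, B, C = S[:2, :2], S[2:, 2:], S[:2, 2:]   # Var(X), Var(Y), Cov(X, Y)
    La = np.linalg.cholesky(A)           # Var(X)^{1/2}, lower-triangular Cholesky factor
    Lb = np.linalg.cholesky(B)           # Var(Y)^{1/2}
    worst = min(worst, min_eig_sym(La @ Lb.T - C))

# a negative value here would witness a counterexample to the conjecture
print(worst)
```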
EDIT: Let me explain what I am trying to do; maybe it helps toward an answer.
I have $X$, $Y, \, Z \in \mathbb{R}^2$ such that $Z = X + Y$. I know $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$ and I need $\mathrm{Var}(Z)$.
As I don't know $\mathrm{Cov}(X,Y)$, I intend to over-estimate $\mathrm{Var}(Z)$ with a matrix $V$ such that $V-\mathrm{Var}(Z) \succeq 0$. That's why I want to know if Cauchy-Schwarz holds for random vectors. If it does, then: \begin{align*} \mathrm{Var}(Z) & = \mathrm{Var}(X)+\mathrm{Var}(Y)+\mathrm{Cov}(X,Y)+\mathrm{Cov}(Y,X) \\ & \preceq \mathrm{Var}(X)+\mathrm{Var}(Y) + \mathrm{Var}(X)^{1/2}\left(\mathrm{Var}(Y)^{1/2}\right)' +\mathrm{Var}(Y)^{1/2}\left(\mathrm{Var}(X)^{1/2}\right)' \\ & = \left( \mathrm{Var}(X)^{1/2} +\mathrm{Var}(Y)^{1/2}\right)\left(\mathrm{Var}(X)^{1/2} +\mathrm{Var}(Y)^{1/2} \right)' \end{align*}
Of course I know I could take $V = 2\left(\mathrm{Var}(X)+\mathrm{Var}(Y)\right)$ and it would work, since $2\left(\mathrm{Var}(X)+\mathrm{Var}(Y)\right) - \mathrm{Var}(Z) = \mathrm{Var}(X-Y) \succeq 0$. But that seems too crude, and I am looking for a tighter bound.
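For what it's worth, the crude bound $V = 2\left(\mathrm{Var}(X)+\mathrm{Var}(Y)\right)$ is easy to confirm numerically; a minimal NumPy sketch, where the random PSD matrix $S$ again stands in for an arbitrary joint covariance of $(X,Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def psd_check(M, tol=1e-9):
    # PSD up to numerical tolerance: eigenvalues of the symmetric part >= -tol
    return np.linalg.eigvalsh((M + M.T) / 2).min() >= -tol

for _ in range(1000):
    # random joint covariance of the stacked vector (X, Y), X, Y in R^2
    G = rng.standard_normal((4, 8))
    S = G @ G.T
    A, B, C = S[:2, :2], S[2:, 2:], S[:2, 2:]   # Var(X), Var(Y), Cov(X, Y)
    VarZ = A + B + C + C.T                      # Var(X + Y)
    # 2(Var(X)+Var(Y)) - Var(Z) = Var(X-Y), which is always PSD
    assert psd_check(2 * (A + B) - VarZ)
```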