Vector-Gaussian rate-distortion theory with a covariance constraint


Fix positive-definite matrices $D \prec \Sigma \in \mathbb{R}^{n\times n}$, and let $x[t]\overset{\text{iid}}{\sim}\mathcal{N}(0,\Sigma)$ for $t>0$. Is the minimum entropy rate $r$ known such that an encoding $\hat{x}$ typically satisfies $\frac{1}{T}\sum_{t=1}^T (\hat{x}[t]-x[t])(\hat{x}[t]-x[t])' \preceq D$ for $T$ large enough?

I am trying to show that $r=\min_S I(x;x+z)$, where $z\sim \mathcal{N}(0,S)$ is independent of $x$ and $S$ ranges over the covariance matrices satisfying $\operatorname{var}(x-\mathbb{E}[x\mid x+z]) \preceq D$. The details are tricky and I am not sure this is the right answer.
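As a sanity check on the conjectured test channel, the two quantities in the minimization have closed forms for jointly Gaussian $x$ and $z$: $I(x;x+z)=\tfrac12\log\det(\Sigma+S)-\tfrac12\log\det S$ and $\operatorname{var}(x-\mathbb{E}[x\mid x+z])=\Sigma-\Sigma(\Sigma+S)^{-1}\Sigma=(\Sigma^{-1}+S^{-1})^{-1}$. A small numerical sketch (the matrices `Sigma` and `S` below are arbitrary illustrative choices, not from the question):

```python
import numpy as np

def gaussian_mi(Sigma, S):
    """I(x; x+z) in nats for x ~ N(0, Sigma), z ~ N(0, S) independent."""
    _, ld_num = np.linalg.slogdet(Sigma + S)
    _, ld_den = np.linalg.slogdet(S)
    return 0.5 * (ld_num - ld_den)

def mmse_cov(Sigma, S):
    """Error covariance var(x - E[x | x+z]) = Sigma - Sigma (Sigma+S)^{-1} Sigma."""
    return Sigma - Sigma @ np.linalg.solve(Sigma + S, Sigma)

# arbitrary positive-definite example matrices
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
S = np.array([[1.0, 0.2],
              [0.2, 0.8]])

E = mmse_cov(Sigma, S)
# the two standard expressions for the error covariance agree
E_alt = np.linalg.inv(np.linalg.inv(Sigma) + np.linalg.inv(S))
assert np.allclose(E, E_alt)
```

Checking whether a candidate $S$ is feasible then amounts to testing $\lambda_{\min}(D - E) \ge 0$ for `E = mmse_cov(Sigma, S)`.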


If $\Sigma$ and $D$ are simultaneously diagonalizable, the source coding problem decouples in the common eigenbasis into $n$ independent scalar Gaussian rate-distortion problems. My difficulty is that $\Sigma$ and $D$ need not be simultaneously diagonalizable.

Can anyone help complete the argument, or find a flaw? A literature reference would be welcome too.