Variance of a conditional distribution of a multivariate gaussian distribution


I am reading about multivariate Gaussian distributions and ran into the following pair of equations about conditional distributions, which say that any conditional distribution of a multivariate Gaussian is also Gaussian:

$$ p(x,y) = \mathcal{N} \Bigg(\begin{bmatrix}\mu_x \\ \mu_y\end{bmatrix},\begin{bmatrix}\Sigma_x & \Sigma_{xy} \\ \Sigma_{xy}^T & \Sigma_y\end{bmatrix}\Bigg) $$

$$ p(x|y) = \mathcal{N}\bigg( \mu_x + \Sigma_{xy}\Sigma_y^{-1}(y-\mu_y), \Sigma_x - \Sigma_{xy}\Sigma_y^{-1}\Sigma_{xy}^T \bigg) $$

I am having a difficult time convincing myself that the variance of this conditional distribution at a particular $y$ is independent of $y$.

To visualize this, I am imagining a simple case (without much loss of generality) where $\Sigma_{xy} = 0$, so that the random variables are uncorrelated. This gives

$$ p(x|y) = \mathcal{N}(\mu_x, \Sigma_x) $$

I am thinking of this conditional distribution as a cross section of the bivariate density at that particular $y$. How can the variance be the same constant ($\Sigma_x$) at every cross section? My confusion is sharpest when I consider a $y$ far from $\mu_y$: the cross section there is nearly zero everywhere, so it is hard to see how its variance can still be $\Sigma_x$. Could someone please help me understand the fault in my reasoning?
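To convince myself, I also tried a numerical check (the parameter values below are made up purely for illustration): evaluate the joint density on a grid of $x$ values, slice it at several fixed $y$, renormalize each slice so it integrates to one in $x$, and compute the variance of each renormalized slice. Every slice appears to give the variance predicted by the formula, $\Sigma_x - \Sigma_{xy}\Sigma_y^{-1}\Sigma_{xy}^T$, independent of $y$:

```python
import numpy as np

# Made-up illustrative parameters (not from any particular problem).
mu_x, mu_y = 1.0, -2.0
var_x, var_y, cov_xy = 2.0, 1.5, 0.8

cov = np.array([[var_x, cov_xy],
                [cov_xy, var_y]])
cov_inv = np.linalg.inv(cov)
norm_const = 2 * np.pi * np.sqrt(np.linalg.det(cov))

def joint(x, y):
    """Bivariate Gaussian density p(x, y), written out explicitly."""
    d = np.array([x - mu_x, y - mu_y])
    return np.exp(-0.5 * d @ cov_inv @ d) / norm_const

# Conditional variance predicted by the formula in the question:
# Sigma_x - Sigma_xy Sigma_y^{-1} Sigma_xy^T  (scalar case here).
pred_var = var_x - cov_xy**2 / var_y

# Slice p(x, y) at several fixed y values, renormalize each slice
# into a proper density in x, and measure its variance numerically.
xs = np.linspace(mu_x - 10, mu_x + 10, 4001)
dx = xs[1] - xs[0]
variances = []
for y0 in (-4.0, -2.0, 0.0, 3.0):
    slice_vals = np.array([joint(x, y0) for x in xs])
    w = slice_vals / (slice_vals.sum() * dx)   # the renormalization step
    mean = (xs * w).sum() * dx
    variances.append(((xs - mean) ** 2 * w).sum() * dx)

print(pred_var, variances)  # each slice variance matches pred_var
```

The unnormalized cross sections do shrink in height as $y$ moves away from $\mu_y$, but once each is rescaled to integrate to one, its shape (and hence its variance) is identical.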

Thank you