Jointly Gaussian random vectors


$\newcommand{\cov}{\operatorname{cov}}$Suppose two scalar valued random variables $X$ and $Y$ are jointly Gaussian. We then have the joint density $$f_{X, Y}(x, y) = \frac{1}{2 \pi \sqrt{|K|}} \exp \left\{ -\frac{1}{2} \begin{bmatrix} x - \mu_X\\ y - \mu_Y \end{bmatrix}^T K^{-1} \begin{bmatrix} x - \mu_X\\ y - \mu_Y \end{bmatrix} \right\} $$

where $K = \begin{bmatrix} \cov(X, X) & \cov(X, Y)\\ \cov(Y, X) & \cov(Y, Y) \end{bmatrix}$.
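As a quick numerical sanity check of this formula, here is a minimal NumPy sketch with made-up means and covariance (the specific numbers are illustrative, not part of the question):

```python
import numpy as np

# Hypothetical example values, chosen only for illustration
mu = np.array([1.0, -2.0])            # [mu_X, mu_Y]
K = np.array([[2.0, 0.6],
              [0.6, 1.0]])            # covariance matrix; must be positive definite

def f_xy(x, y):
    """Evaluate the bivariate Gaussian density from the formula above."""
    d = np.array([x, y]) - mu
    quad = d @ np.linalg.solve(K, d)  # (g - mu)^T K^{-1} (g - mu) without forming K^{-1}
    return np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(np.linalg.det(K)))

# Sanity check: at the mean the quadratic form vanishes,
# so the density equals 1 / (2 pi sqrt(|K|))
print(np.isclose(f_xy(1.0, -2.0), 1 / (2 * np.pi * np.sqrt(np.linalg.det(K)))))  # True
```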

Now, how do we write the joint density when $X \in \mathbb{R}^m$ and $Y \in \mathbb{R}^n$ are random vectors?

Does $K$ become a third order tensor? And what goes in the exponential power?

I think $K \in \mathbb{R}^{m \times n}$ where $K_{ij} = \cov(X_i, Y_j)$, even though a strict analogy with the scalar case would suggest something like $K \in \mathbb{R}^{2 \times m \times n}$.

There are 4 answers below.

BEST ANSWER

You state "We then have the joint density" etc. But that neglects the case where $K$ is singular. However, that is not essential to the question.

Perhaps normality is not essential to the question either.

If $X,Y$ are random variables that take values in $\mathbb R^m$ and $\mathbb R^n$ respectively, then $$ \left[ \begin{array}{c} X \\ Y \end{array} \right] \in \mathbb R^{m+n} $$ and one can write \begin{align} K & = \operatorname E\left( \left(\left[ \begin{array}{c} X \\ Y \end{array} \right] - \left[ \begin{array}{c} \mu_X \\ \mu_Y \end{array} \right] \right)\left( \left[ \begin{array}{cc} X^\top, & Y^\top \end{array} \right] - \left[ \begin{array}{cc} \mu_X^\top, & \mu_Y^\top \end{array} \right] \right) \right) \\[12pt] & = \left[ \begin{array}{cc} \operatorname E((X-\mu_X)(X-\mu_X)^\top) & \operatorname E((X-\mu_X)(Y-\mu_Y)^\top) \\ \operatorname E((Y-\mu_Y)(X-\mu_X)^\top) & \operatorname E((Y-\mu_Y)(Y-\mu_Y)^\top) \end{array} \right] \\[10pt] & \in \mathbb R^{(m+n)\times(m+n)}. \end{align}

One also writes $$ \operatorname{cov}(X,Y) = \operatorname E((X-\mu_X)(Y-\mu_Y)^\top) \in \mathbb R^{m\times n} $$ and then one has $$ \operatorname{cov}(X,Y) = \big( \operatorname{cov}(Y,X)\big)^\top, $$ i.e., unlike in the scalar-valued case, the covariances with the arguments interchanged are not equal to each other, but are transposes of each other.
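The block structure and the transpose relation can be checked numerically. The following NumPy sketch uses made-up dimensions and random data (all specifics are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, N = 3, 2, 100_000           # dims of X and Y, and sample count (illustrative)

# Draw samples of the stacked vector [X; Y] from an arbitrary full-rank Gaussian
A = rng.standard_normal((m + n, m + n))
samples = rng.standard_normal((N, m + n)) @ A.T   # covariance of rows is approx. A A^T

K = np.cov(samples, rowvar=False)  # empirical (m+n) x (m+n) covariance
cov_XY = K[:m, m:]                 # top-right block: cov(X, Y), shape (m, n)
cov_YX = K[m:, :m]                 # bottom-left block: cov(Y, X), shape (n, m)

# The two off-diagonal blocks are transposes of each other
print(np.allclose(cov_XY, cov_YX.T))  # True
```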

ANSWER

It is not possible to write such a density without knowing the covariances between the components of $X$ and $Y$, as well as the covariances among the components of $X$ and among the components of $Y$ themselves.

If you do know that information, then simply break $X$ and $Y$ down into scalar components and write a jointly Gaussian distribution using a larger covariance matrix: a square matrix whose dimension equals the total number of scalar components. It really doesn't matter which components belong to $X$ and which to $Y$; they are just a bunch of scalar random variables.

ANSWER

It will be a multivariate normal distribution.

Please refer to https://en.wikipedia.org/wiki/Multivariate_normal_distribution

ANSWER

Let $G = \begin{bmatrix} X\\ Y \end{bmatrix} \in \mathbb{R}^{m + n}$ be a new random vector.

Let $\mu_G = \begin{bmatrix} \mu_X\\ \mu_Y \end{bmatrix} \in \mathbb{R}^{m + n}$

Let $K \in \mathbb{R}^{(m + n) \times (m+n)}$ where $K_{ij} = \operatorname{cov}(G_i, G_j)$

Then, the joint density of the random vectors $X$ and $Y$ is the usual multivariate normal density of the random vector $G$:

$\begin{align*} f_{X, Y}(x, y) &= f_{G}(g)\\ &= \frac{1}{(2 \pi)^{\frac{m + n}{2}} \sqrt{|K|}} \exp \{-\frac{1}{2} \left ( g - \mu_G \right )^T K^{-1} \left ( g - \mu_G \right ) \} \end{align*}$

where $g = \begin{bmatrix} x\\ y \end{bmatrix} \in \mathbb{R}^{m + n}$

This assumes that $K$ is non-singular or, equivalently, that it is a positive-definite covariance matrix.
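This stacked-vector construction translates directly into code. A minimal NumPy sketch, with illustrative dimensions $m = 2$, $n = 1$ and a made-up positive-definite $K$ (none of these numbers come from the question):

```python
import numpy as np

def joint_density(x, y, mu_x, mu_y, K):
    """Density f_{X,Y}(x, y) evaluated via the stacked vector g = [x; y].
    K is the (m+n) x (m+n) covariance of [X; Y], assumed positive definite."""
    g = np.concatenate([x, y])
    mu = np.concatenate([mu_x, mu_y])
    d = g - mu
    k = len(g)                                  # k = m + n
    quad = d @ np.linalg.solve(K, d)            # (g - mu)^T K^{-1} (g - mu)
    norm = (2 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(K))
    return np.exp(-0.5 * quad) / norm

# Illustrative numbers: X in R^2, Y in R^1
mu_x, mu_y = np.array([0.0, 1.0]), np.array([-1.0])
K = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.5]])

# At the mean the quadratic form vanishes, so the density is 1/((2 pi)^{3/2} sqrt(|K|))
val = joint_density(np.array([0.0, 1.0]), np.array([-1.0]), mu_x, mu_y, K)
print(np.isclose(val, 1 / ((2 * np.pi) ** 1.5 * np.sqrt(np.linalg.det(K)))))  # True
```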