If $X\in\mathbb{R}^3$ is a multivariate random variable such that $X \sim \mathcal{N}(0, \Gamma)$, show that it has a probability density function


Let $X \in \mathbb{R}^3$ be a multivariate random variable such that $X \sim \mathcal{N}(0, \Gamma)$ where $$\Gamma = \begin{bmatrix} 3 & -1 & 0\\ -1 & 3 & 0\\ 0 & 0 & 2 \end{bmatrix}.$$ Does it have a probability density function? And find a matrix $A$ such that the components of $AX$ are independent.

Ok, I know that, assuming the probability density function exists, it would be equal to $$f_X(x) = \frac{1}{\sqrt{(2 \pi)^3 \det\Gamma}}\exp\left(-\frac{1}{2}(x-\mu)^t\Gamma^{-1}(x-\mu)\right).$$ As $\mu=0$ we get $$f_X(x) = \frac{1}{\sqrt{(2 \pi)^3 \det\Gamma}}\exp\left(-\frac{1}{2}x^t\Gamma^{-1}x\right).$$ So the condition for the existence of the probability density function is that $\det(\Gamma) \neq 0$. As $\det(\Gamma) = 16$, the probability density function does exist. Am I right?
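As a quick sanity check on the determinant and positive definiteness (a numerical sketch using NumPy, not part of the original question):

```python
import numpy as np

# Covariance matrix from the problem statement
Gamma = np.array([[3.0, -1.0, 0.0],
                  [-1.0, 3.0, 0.0],
                  [0.0, 0.0, 2.0]])

# det(Gamma) = 2 * (3*3 - (-1)*(-1)) = 16, which is nonzero,
# so the density exists
det = np.linalg.det(Gamma)
print(det)

# All eigenvalues are positive, so Gamma is positive definite
eigs = np.linalg.eigvalsh(Gamma)
print(eigs)  # eigenvalues 2, 2, 4
```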

Now, for the second part of the question. We have a property that states the following:

If $X \sim \mathcal{N}(\mu, \Gamma)$, then $X = DZ + \mu$, where $D$ is the (symmetric) square root of $\Gamma$ and $Z$ is a standard multivariate normal, i.e. its components are independent.

In this case, we have $\mu = 0$, thus $X = DZ$. Now, to find $D$, I need to find the square root of $\Gamma$. Fortunately, $\Gamma$ is diagonalizable: $$P^{-1} \Gamma P = \begin{bmatrix} 2 & 0 & 0\\ 0 & 2 & 0\\ 0 & 0 & 4 \end{bmatrix}, \quad\text{where}\quad P = \begin{bmatrix} 1 & 0 & -1\\ 1 & 0 & 1\\ 0 & 1 & 0 \end{bmatrix}.$$ Let $$B= \begin{bmatrix} \sqrt{2} & 0 & 0\\ 0& \sqrt{2} & 0\\ 0& 0&2 \end{bmatrix};$$ then $\Gamma = PBP^{-1}PBP^{-1}$, so $D = PBP^{-1}$. (For $D$ to be the symmetric square root, so that $DZ$ indeed has covariance $DD^t = \Gamma$, the columns of $P$ should be normalized to unit length, making $P$ orthogonal with $P^{-1} = P^t$.) I have thus found a matrix $D$ such that $X = DZ$, so $D^{-1}X = Z$. Taking $A = D^{-1}$, we are done. Is my reasoning correct?
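The decomposition above can be verified numerically (a sketch, not part of the original post; it uses the eigenvectors normalized to unit length so that $P$ is orthogonal and $D$ is symmetric):

```python
import numpy as np

Gamma = np.array([[3.0, -1.0, 0.0],
                  [-1.0, 3.0, 0.0],
                  [0.0, 0.0, 2.0]])

s = 1 / np.sqrt(2)
# Columns: unit eigenvectors for eigenvalues 2, 2, 4
P = np.array([[s,   0.0, -s],
              [s,   0.0,  s],
              [0.0, 1.0, 0.0]])
B = np.diag([np.sqrt(2), np.sqrt(2), 2.0])

# D = P B P^t is the symmetric square root of Gamma
D = P @ B @ P.T
assert np.allclose(D @ D, Gamma)
assert np.allclose(D, D.T)

# Z = D^{-1} X then has identity covariance, i.e. independent components
A = np.linalg.inv(D)
print(A @ Gamma @ A.T)  # close to the 3x3 identity
```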

Best answer:

For the first part, your reasoning is correct: this is the definition of the normal density function, which exists as long as $|\Gamma| \neq 0$.

For the second part, your reasoning is correct, but the theorem is a bit of a detour here. In fact, note that for any $C \in M_{3\times 3}(\mathbb{R})$ we have $CX \sim N(0, C\Gamma C^t)$. Since the quadratic form associated with $\Gamma$ is $$g(x_1, x_2, x_3) = 3x_1^2 + 3x_2^2 + 2x_3^2 - 2x_1x_2 = (x_1 + x_2)^2 + 2(x_1 - x_2)^2 + 2x_3^2,$$ then for $$C = \begin{pmatrix}1&1&0\\1&-1&0\\0&0&1\end{pmatrix}$$ we have $$C\Gamma C^t = \begin{pmatrix}4&0&0\\0&8&0\\0&0&2\end{pmatrix},$$ and the components of $CX$ are independent, so $A = C$ works.
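A quick numerical check of this computation (using NumPy; not part of the original answer):

```python
import numpy as np

Gamma = np.array([[3.0, -1.0, 0.0],
                  [-1.0, 3.0, 0.0],
                  [0.0, 0.0, 2.0]])

# C read off from the completed-square decomposition of the quadratic form
C = np.array([[1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0,  0.0, 1.0]])

# Covariance of CX: diagonal, so its components are independent
cov = C @ Gamma @ C.T
print(cov)  # diag(4, 8, 2)
```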