Finding $\alpha,\beta,\gamma,\delta $ such that $V =\alpha X + \beta Y + \gamma Z + \delta\sim N(0,1) $


I'm preparing for a final exam at university by working through a bank of exercises, and I found this one that I don't know how to start:

Let random vector $(X,Y,Z)$ have a Gaussian distribution with parameters $ \mu=(1,1,1) $ and covariance matrix: $$\operatorname{Cov}(X,Y,Z) = \begin{bmatrix} 4 & 4 & -3 \\ 4 & 16 & -12 \\ -3 & -12 & 9 \end{bmatrix} $$ Find $\alpha,\beta,\gamma,\delta $ such that $V =\alpha X + \beta Y + \gamma Z + \delta $ has distribution $N(0,1)$ and $V$ is independent of $Y$ and $Z$.

I apologize for any mistakes in translating this from my language; there is no math forum in my country where I can find a solution to this problem.

Best Answer

Like other elementary multivariate normal questions, this one relies on the fact that any affine transformation of a multivariate normal vector is again multivariate normal.

So

$$ V = \begin{bmatrix} \alpha & \beta & \gamma \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + \delta $$

is univariate normal, since $V$ is an affine transformation of the multivariate normal vector. As it is given that $V \sim \mathcal{N}(0, 1)$, matching the first two moments of $V$ yields $2$ equations:

$$ E[V] = \begin{bmatrix} \alpha & \beta & \gamma \end{bmatrix}\mu + \delta = 0 \tag 1$$

$$ Var[V] = \begin{bmatrix} \alpha & \beta & \gamma \end{bmatrix}\Sigma \begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} = 1 \tag 2$$
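Equations $(1)$ and $(2)$ are straightforward to evaluate numerically. A minimal NumPy sketch (the helper `mean_and_var` is my own naming, not part of the problem):

```python
import numpy as np

# Parameters from the problem statement.
mu = np.array([1.0, 1.0, 1.0])
Sigma = np.array([[4.0, 4.0, -3.0],
                  [4.0, 16.0, -12.0],
                  [-3.0, -12.0, 9.0]])

def mean_and_var(a, b, c, d):
    """Return (E[V], Var[V]) for V = a*X + b*Y + c*Z + d."""
    w = np.array([a, b, c])
    return w @ mu + d, w @ Sigma @ w

# Sanity check: a=1, b=c=d=0 just recovers the marginal moments of X.
print(mean_and_var(1.0, 0.0, 0.0, 0.0))  # -> (1.0, 4.0)
```

Any candidate $(\alpha, \beta, \gamma, \delta)$ must make this helper return $(0, 1)$.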

On the other hand, note that $(V, Y)$ and $(V, Z)$ can be expressed as

$$ \begin{bmatrix} V \\ Y \end{bmatrix} = \begin{bmatrix} \alpha & \beta & \gamma \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}$$

$$\begin{bmatrix} V \\ Z \end{bmatrix} = \begin{bmatrix} \alpha & \beta & \gamma \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} $$

so each of them is also bivariate normal.

Their covariance matrices are given by $$ \Sigma_{VY} = \begin{bmatrix} \alpha & \beta & \gamma \\ 0 & 1 & 0 \end{bmatrix} \Sigma \begin{bmatrix} \alpha & 0 \\ \beta & 1 \\ \gamma & 0 \end{bmatrix} $$

$$ \Sigma_{VZ} = \begin{bmatrix} \alpha & \beta & \gamma \\ 0 & 0 & 1 \end{bmatrix} \Sigma \begin{bmatrix} \alpha & 0 \\ \beta & 0 \\ \gamma & 1 \end{bmatrix} $$

Independence between two random variables implies that they are uncorrelated; conversely, for jointly normal variables, zero covariance implies independence, so it suffices to make the covariances vanish. Reading off the off-diagonal entries of $\Sigma_{VY}$ and $\Sigma_{VZ}$ gives another two equations:

$$ Cov[V, Y] = \begin{bmatrix} 1 & 0 \end{bmatrix} \Sigma_{VY} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = 0 \tag 3$$

$$ Cov[V, Z] = \begin{bmatrix} 1 & 0 \end{bmatrix} \Sigma_{VZ} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = 0 \tag 4$$
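The off-diagonal reads in $(3)$ and $(4)$ can be reproduced directly as matrix products; a short sketch (the function name `cross_covariances` is mine):

```python
import numpy as np

Sigma = np.array([[4.0, 4.0, -3.0],
                  [4.0, 16.0, -12.0],
                  [-3.0, -12.0, 9.0]])

def cross_covariances(a, b, c):
    """Cov[V,Y] and Cov[V,Z], read off the 2x2 matrices Sigma_VY, Sigma_VZ."""
    A_VY = np.array([[a, b, c], [0.0, 1.0, 0.0]])
    A_VZ = np.array([[a, b, c], [0.0, 0.0, 1.0]])
    Sigma_VY = A_VY @ Sigma @ A_VY.T
    Sigma_VZ = A_VZ @ Sigma @ A_VZ.T
    return Sigma_VY[0, 1], Sigma_VZ[0, 1]

print(cross_covariances(1.0, 0.0, 0.0))  # -> (4.0, -3.0), i.e. Cov[X,Y], Cov[X,Z]
```

Building $\Sigma_{VY}$ and $\Sigma_{VZ}$ as $A \Sigma A^{\mathsf T}$ mirrors the derivation above exactly, rather than expanding the covariances term by term.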

This gives you a system of $4$ equations in $4$ unknowns. The actual computation is left to you.
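One way to check your working numerically: note that with this $\Sigma$, equation $(4)$ is $-\tfrac34$ times equation $(3)$ (the third row of $\Sigma$ is $-\tfrac34$ times the second), so one coefficient is free. The candidate below is my own choice with $\gamma = 0$, not the only solution:

```python
import numpy as np

mu = np.array([1.0, 1.0, 1.0])
Sigma = np.array([[4.0, 4.0, -3.0],
                  [4.0, 16.0, -12.0],
                  [-3.0, -12.0, 9.0]])

# One candidate solution (equations (3) and (4) coincide, so gamma is free).
alpha = 1.0 / np.sqrt(3.0)
beta = -1.0 / (4.0 * np.sqrt(3.0))
gamma = 0.0
delta = -(alpha + beta + gamma)  # forces E[V] = 0 since mu = (1, 1, 1)

w = np.array([alpha, beta, gamma])
print(w @ mu + delta)   # E[V]     -> 0
print(w @ Sigma @ w)    # Var[V]   -> 1 (up to rounding)
print(w @ Sigma[:, 1])  # Cov[V,Y] -> 0
print(w @ Sigma[:, 2])  # Cov[V,Z] -> 0
```

Verifying all four quantities at once is a quick way to catch sign errors in the hand computation.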