Normal random vector


Question: Prove that linear functions of the form $\bar{y}=\bar{b}+\mathrm{B}\bar{x}$ are normal random vectors provided that $\bar{x}$ is a normal random vector. Find $E(\bar{y})$ and $V(\bar{y})$. Prove that the normal random variables in $\bar{y}$ are independent iff $V(\bar{y})$ is a diagonal matrix.

My Doubt:

I understood that $\bar{y}$ is a column matrix of order $n \times 1$ containing $n$ random variables, and similarly $\bar{x}$ is a column matrix of order $n \times 1$ containing $n$ random variables. But what is $\bar{b}$? Is it a column matrix of random variables or of real numbers?

EDIT: Secondly, even using the concept of $\mathrm{Cov}(\bar{y})$, I still cannot understand the statement "Prove that the normal random variables in $\bar{y}$ are independent iff $V(\bar{y})$ is a diagonal matrix." The diagonal of the matrix $\mathrm{Cov}(\bar{y})$ gives the variances. It seems obvious to me that since all the $x_i$'s are independent, the $y_i$'s will be independent as well, as each is a function of the $x_i$'s. So what do we have to prove?

Best Answer

The definition of $X=(X_1,\ldots,X_n)$ being ($n$-dimensional) normally distributed is that $$ \langle t,X\rangle=t\cdot X^\intercal=\sum_{i=1}^nt_iX_i=t_1X_1+\cdots+t_nX_n $$ follows a (one-dimensional) normal distribution for all $t=(t_1,\ldots,t_n)\in\mathbb{R}^n$.

Now, let $X$ be $n$-dimensional normally distributed, let $B=\{b_{ij}\}$ be an $n\times n$ matrix and let $a=(a_1,\ldots,a_n)\in\mathbb{R}^n$ be a row vector. We want to show that $Y=(Y_1,\ldots,Y_n)$ is $n$-dimensional normally distributed, where $$ Y^\intercal=a^\intercal+BX^\intercal. $$ To this end, let $t\in\mathbb{R}^n$. Then $$ \begin{align} \langle t,Y\rangle&=t\cdot Y^\intercal=t\cdot a^\intercal+t\cdot B\cdot X^\intercal\\ &=\sum_{i=1}^n t_ia_i+\sum_{i=1}^n \sum_{j=1}^n t_ib_{ij}X_j\\ &=\sum_{i=1}^n t_ia_i+\sum_{j=1}^n c_jX_j, \end{align} $$ where $c_j=\sum_{i=1}^nt_ib_{ij}$. Now $\sum_{j=1}^n c_jX_j=\langle c,X\rangle$ is one-dimensionally normal because $X$ is $n$-dimensionally normal, and adding the constant $\sum_{i=1}^n t_ia_i$ preserves normality, so $\langle t,Y\rangle$ follows a normal distribution. Since $t\in\mathbb{R}^n$ was arbitrary, we conclude that $Y$ follows an $n$-dimensional normal distribution.
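As a numerical sanity check (not part of the proof), one can sample $X$, form $Y^\intercal=a^\intercal+BX^\intercal$, and verify that an arbitrary projection $\langle t,Y\rangle$ has the mean $\langle t, a+ (B\mu^\intercal)^\intercal\rangle$ and variance $tB\Sigma B^\intercal t^\intercal$ a one-dimensional normal would have. All concrete numbers below are illustrative choices, not from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
mu = np.array([1.0, -2.0, 0.5])           # assumed mean of X (illustrative)
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)           # symmetric positive-definite covariance
a = np.array([0.0, 1.0, 2.0])             # the shift vector a
B = rng.standard_normal((n, n))           # the matrix B

X = rng.multivariate_normal(mu, Sigma, size=200_000)  # rows are draws of X
Y = a + X @ B.T                           # Y^T = a^T + B X^T, applied per row

t = np.array([0.3, -1.2, 2.0])            # an arbitrary direction t
proj = Y @ t                              # draws of <t, Y>

mean_theory = t @ (a + B @ mu)            # <t, a> + <t, B mu>
var_theory = t @ B @ Sigma @ B.T @ t      # t B Sigma B^T t^T
print(abs(proj.mean() - mean_theory))               # small absolute error
print(abs(proj.var() - var_theory) / var_theory)    # small relative error
```

With a fixed seed the empirical mean and variance of the projection agree with the theoretical values up to Monte Carlo error.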

To find the mean vector and covariance matrix of $Y$ you can choose $t$ wisely. For example, let $t=e_i$, where $e_i$ is the vector of $0$'s except for a $1$ in the $i$th place; this recovers $E(Y_i)$ and $\mathrm{Var}(Y_i)$, and choosing $t=e_i+e_j$ then lets you recover the covariances.
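Carrying this computation out (a standard result, using linearity of expectation and the fact that the constant $a$ drops out of the covariance) gives, in the notation above,

```latex
\mu_Y^\intercal = E(Y^\intercal) = a^\intercal + B\,\mu_X^\intercal,
\qquad
\Sigma_Y = \mathrm{Cov}(Y) = B\,\Sigma_X\,B^\intercal ,
```

which in the question's notation reads $E(\bar{y})=\bar{b}+\mathrm{B}\,E(\bar{x})$ and $V(\bar{y})=\mathrm{B}\,V(\bar{x})\,\mathrm{B}^\intercal$.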

To prove the statement "the elements $Y_1,\ldots,Y_n$ of $Y$ are independent if and only if the covariance matrix $\Sigma_Y$ is diagonal" we need the following results:

If $X\sim\mathcal{N}_n(\mu,\Sigma)$ is $n$-dimensional normally distributed with mean vector $\mu$ and covariance matrix $\Sigma$, then the characteristic function of $X$ is $$ \varphi_X(t)=\exp\left(i\langle t,\mu\rangle-\tfrac12 t\Sigma t^\intercal\right),\quad t\in\mathbb{R}^n. $$

and

Any random variables $X_1,\ldots,X_n$ are independent if and only if $$ \varphi_{(X_1,\ldots,X_n)}(t)=\prod_{j=1}^n\varphi_{X_j}(t_j),\quad\text{for all }\,t=(t_1,\ldots,t_n)\in\mathbb{R}^n. $$

Now to the proof: that $Y_1,\ldots,Y_n$ being independent implies that $\Sigma_Y$ is diagonal is obvious (independent random variables are uncorrelated). So let us show the other direction. We assume that $$ \Sigma_Y:=\{\mathrm{Cov}(Y_i,Y_j)\}_{i,j=1}^n $$ is diagonal, that is, $\mathrm{Cov}(Y_i,Y_j)=0$ for $i\neq j$. This implies that for $t\in\mathbb{R}^n$ we have $$ t\Sigma_Y t^\intercal=\sum_{j=1}^n t_j^2\sigma_j^2, $$ where $\sigma_j^2=\mathrm{Var}(Y_j)$. Thus the characteristic function of $Y$ is $$ \varphi_Y(t)=\exp\left(i\langle t,\mu_Y\rangle-\tfrac12 t\Sigma_Yt^\intercal\right)=\prod_{j=1}^n\exp\left(it_j\mu_Y^{(j)}-\tfrac12 t_j^2\sigma_j^2\right)=\prod_{j=1}^n\varphi_{Y_j}(t_j), $$ where $\mu_Y^{(j)}$ denotes the $j$th entry of $\mu_Y$, from which we conclude that $Y_1,\ldots,Y_n$ are independent.
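The factorization step above can be checked numerically: for a diagonal $\Sigma_Y$, evaluate the joint characteristic function and the product of the marginal ones at the same $t$ and confirm they coincide. The concrete means, variances, and the test point $t$ below are illustrative choices:

```python
import numpy as np

mu = np.array([1.0, -0.5, 2.0])           # mean vector mu_Y (illustrative)
sigma2 = np.array([0.5, 1.0, 2.0])        # diagonal entries Var(Y_j)
Sigma = np.diag(sigma2)                   # diagonal covariance matrix

def phi_joint(t):
    # characteristic function of N(mu, Sigma): exp(i<t,mu> - t Sigma t^T / 2)
    return np.exp(1j * (t @ mu) - 0.5 * t @ Sigma @ t)

def phi_marginal(tj, muj, s2):
    # characteristic function of the one-dimensional N(muj, s2)
    return np.exp(1j * tj * muj - 0.5 * s2 * tj**2)

t = np.array([0.7, -1.3, 0.2])            # an arbitrary test point t
lhs = phi_joint(t)
rhs = np.prod([phi_marginal(t[j], mu[j], sigma2[j]) for j in range(3)])
print(np.isclose(lhs, rhs))               # factorization holds for diagonal Sigma
```

Replacing `Sigma` with a non-diagonal covariance breaks the equality, since the cross terms $t_it_j\,\mathrm{Cov}(Y_i,Y_j)$ no longer vanish from the exponent.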