Two-look Gaussian channel


I'm reading through a solution from *Elements of Information Theory* by Thomas M. Cover and Joy A. Thomas. This is the two-look Gaussian channel, where the input to the channel is $X$ and the output is the pair $(Y_1, Y_2)$:

$Y_1 = X + Z_1$

$Y_2 = X + Z_2$

where $(Z_1, Z_2) \sim \mathcal{N}(0, K)$ with $K = \begin{bmatrix} N & N \rho \\ N \rho & N \end{bmatrix}$.
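As a quick sanity check on the noise model, one can draw many $(Z_1, Z_2)$ pairs from $\mathcal{N}(0, K)$ and confirm that the empirical covariance matches $K$. The numeric values of $N$ and $\rho$ below are illustrative assumptions, not given in the problem:

```python
import numpy as np

# Illustrative values (assumed for this sketch, not from the problem statement)
N, rho = 2.0, 0.5
K = np.array([[N, N * rho],
              [N * rho, N]])

rng = np.random.default_rng(0)
# Draw many (Z1, Z2) pairs from the bivariate normal N(0, K)
Z = rng.multivariate_normal(mean=[0.0, 0.0], cov=K, size=200_000)

# The empirical covariance should be close to K
print(np.cov(Z, rowvar=False))
```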

There is a power constraint on $X$: $\operatorname{Var}(X) = P$.

Now, I understand that the channel capacity is achieved when the distribution of $X$ is Gaussian, $X \sim \mathcal{N}(0, P)$. And since $Y_1$ and $Y_2$ are sums of the jointly Gaussian variables $X$ and $(Z_1, Z_2)$, they must also be jointly Gaussian.

The solution says that $(Y_1, Y_2) \sim \mathcal{N}\left( 0, \begin{bmatrix} N + P & N \rho + P \\ N \rho + P & N + P \end{bmatrix} \right)$.

How is this distribution of $(Y_1, Y_2)$ obtained? Why does $P$ get added to all elements of the covariance matrix?

Best answer:

The Gaussian noises $Z_1, Z_2$ are zero mean and independent of $X$, so $E[XZ_1] = E[XZ_2] = 0$. Since $X$ is also zero mean, each $Y_i$ is zero mean, and the entries of the covariance matrix $L$ of $(Y_1, Y_2)$ are simply the second moments $L_{ij} = E[Y_iY_j]$: $$ L_{11} = E[Y_1Y_1] = E[(X+Z_1)^2] = E[X^2] + 2E[XZ_1] + E[Z_1^2] = P + 0 + N = P+N $$ $$ L_{12} = E[Y_1Y_2] = E[(X+Z_1)(X+Z_2)] = E[X^2] + E[XZ_1] + E[XZ_2] + E[Z_1Z_2] = P + 0 + 0 + \rho N = P+ \rho N $$ Here $E[X^2] = P$, $E[Z_1^2] = N$, and $E[Z_1Z_2] = \rho N$ are read off directly from $\operatorname{Var}(X)$ and $K$. The other two entries, $L_{21} = L_{12}$ and $L_{22} = L_{11}$, follow by symmetry. This is why $P$ appears in every entry of the matrix: $X$ is common to both looks, so it contributes its variance $P$ to every second moment.
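The computation above can be verified numerically: sample $X \sim \mathcal{N}(0, P)$ and independent noise $(Z_1, Z_2) \sim \mathcal{N}(0, K)$, form $Y_i = X + Z_i$, and compare the empirical covariance of $(Y_1, Y_2)$ against the claimed matrix. The values of $P$, $N$, and $\rho$ are illustrative assumptions:

```python
import numpy as np

# Illustrative parameters (assumed, not from the problem statement)
P, N, rho = 3.0, 2.0, 0.5
K = np.array([[N, N * rho],
              [N * rho, N]])

rng = np.random.default_rng(1)
n = 500_000
X = rng.normal(0.0, np.sqrt(P), size=n)             # X ~ N(0, P)
Z = rng.multivariate_normal([0.0, 0.0], K, size=n)  # (Z1, Z2) ~ N(0, K), independent of X
Y = X[:, None] + Z                                  # Y_i = X + Z_i, shape (n, 2)

# Claimed covariance of (Y1, Y2)
expected = np.array([[N + P, N * rho + P],
                     [N * rho + P, N + P]])
print(np.cov(Y, rowvar=False))  # empirical covariance, should be close to `expected`
```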