Non Gaussian Additive Channel Capacity


Suppose I have an additive, stationary, memoryless non-Gaussian noise channel $$ Y_i = X_i + Z_i $$ with fixed mean and variance on the noise $Z_i$, $$ \mathbb{E}(Z_i) = 0,\ \operatorname{Var}(Z_i) = 1, $$ and an average power constraint on the input $$ \frac{1}{n}\sum_{i=1}^n x_i^2 \leq P. $$ How can I prove the following bounds on the channel capacity? $$ \frac{1}{2}\log(1+P) \leq C(P)\leq \frac{1}{2}\log(1+P) + D(P_Z\|\mathcal{N}(0,1)) $$

I think I can get the lower bound as follows. Letting $X_g \sim \mathcal{N}(0,P)$, \begin{align} C(P) &= \max_{P_X : \operatorname{Var}(X) \leq P} I(X;Y)\\ &= \max_{P_X : \operatorname{Var}(X) \leq P} I(X;X+Z)\\ &\geq I(X_g; X_g + Z)\\ &\geq \min_{P_Z : \operatorname{Var}(Z) = 1} I(X_g;X_g+Z)\\ &= I(X_g;X_g+Z^*)\\ &= \frac{1}{2}\log(1+P) \end{align}

Because the noise distribution that minimizes mutual information is Gaussian $P_{Z^*} = \mathcal{N}(0,1)$.
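As a sanity check on this lower bound (not a proof), one can compute $I(X_g;X_g+Z)$ numerically for a concrete non-Gaussian unit-variance noise; below, Laplace noise on a finite grid. The choice of Laplace noise, $P$, and the grid parameters are all illustrative assumptions:

```python
import numpy as np

# Numerical check: for Gaussian input X_g ~ N(0, P) and unit-variance
# Laplace noise Z, I(X_g; X_g + Z) should be at least (1/2) log(1 + P),
# the value attained when the noise is Gaussian.
P = 2.0
dz = 0.01
z = np.arange(-20.0, 20.0, dz)

b = 1.0 / np.sqrt(2.0)                         # Laplace scale: Var(Z) = 2 b^2 = 1
p_Z = np.exp(-np.abs(z) / b) / (2.0 * b)
p_X = np.exp(-z**2 / (2.0 * P)) / np.sqrt(2.0 * np.pi * P)
p_Y = np.convolve(p_X, p_Z, mode="same") * dz  # density of Y = X_g + Z

def diff_entropy(p, dx):
    """Riemann-sum differential entropy, skipping zero-density points."""
    q = p[p > 1e-300]
    return -np.sum(q * np.log(q)) * dx

I = diff_entropy(p_Y, dz) - diff_entropy(p_Z, dz)  # I(X_g; Y) = h(Y) - h(Z)
print(I, 0.5 * np.log(1.0 + P))
```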

But I'm having trouble with the upper bound.


BEST ANSWER

I am not sure I follow the proof of the lower bound. You can obtain a lower bound by setting $X$ to be Gaussian (which is not optimal in general), and then noting that the worst unit-variance distribution for $Z$ when $X$ is Gaussian is Gaussian as well! Therefore, the capacity is always lower bounded by the capacity of the "standard" Gaussian-input, Gaussian-noise channel.
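To make the "worst noise for a Gaussian input" step fully rigorous, one standard route (sketched here as a supplement; it is not spelled out in the answer above) is the entropy power inequality $e^{2h(X_g+Z)} \geq e^{2h(X_g)} + e^{2h(Z)}$, valid for independent $X_g$ and $Z$:

```latex
\begin{align}
I(X_g; X_g + Z) &= h(X_g + Z) - h(Z) \\
&\geq \frac{1}{2}\ln\!\left(e^{2h(X_g)} + e^{2h(Z)}\right) - h(Z) \\
&= \frac{1}{2}\ln\!\left(2\pi e P\, e^{-2h(Z)} + 1\right) \\
&\geq \frac{1}{2}\ln\left(1 + P\right),
\end{align}
```

where the last step uses $h(Z) \leq \frac{1}{2}\ln(2\pi e)$ (the maximum-entropy bound under the unit-variance constraint), so that $e^{-2h(Z)} \geq (2\pi e)^{-1}$.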

To prove the upper bound, start by writing

$$ \begin{align} I(X;Y)&= h(Y)-h(Y|X)\\ \tag{1} &=h(Y)-h(Z), \end{align} $$

where the last equality holds because $X$ and $Z$ are independent, so $h(Y|X)=h(Z)$. Now, the differential entropy of any random variable with a given variance is upper bounded by the entropy of a Gaussian variable with the same variance. Since $\operatorname{Var}(Y)=\operatorname{Var}(X)+\operatorname{Var}(Z)\leq 1+P$, therefore

$$ \tag{2} h(Y)\leq \frac{1}{2}\ln\left(2 \pi e (1+P) \right). $$

In order to evaluate $h(Z)$, let $p_Z(z)$ denote the probability density function (pdf) of $Z$ and $p_G(z)\triangleq \frac{1}{\sqrt{2 \pi}} e^{-z^2/2}$. It holds that

$$ \begin{align} D(p_Z\|p_G)&\triangleq \int p_Z(z) \ln\left(\frac{p_Z(z)}{p_G(z)} \right)dz\\ &=-h(Z)-\int p_Z(z) \ln\left(p_G(z)\right)dz\\ &\stackrel{(a)}{=} -h(Z)-\int p_G(z) \ln\left(p_G(z)\right)dz\\ &=-h(Z)+\frac{1}{2}\ln\left(2 \pi e \right), \end{align} $$ where $(a)$ holds because $\ln p_G(z)$ is a quadratic function of $z$ and $p_G$, $p_Z$ have the same mean and variance (thanks to @user187815 for noting this!). Therefore,

$$ \tag{3} h(Z) = \frac{1}{2}\ln\left(2 \pi e \right)- D(p_Z\|p_G). $$
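The identity (3) is easy to verify numerically for a concrete non-Gaussian $p_Z$; below, unit-variance Laplace noise (an illustrative choice, as are the grid parameters):

```python
import numpy as np

# Numerical check of h(Z) = (1/2) ln(2*pi*e) - D(p_Z || p_G)
# for an example zero-mean, unit-variance Laplace density p_Z.
dz = 0.001
z = np.arange(-20.0, 20.0, dz)

b = 1.0 / np.sqrt(2.0)                            # Laplace scale: Var(Z) = 2 b^2 = 1
p_Z = np.exp(-np.abs(z) / b) / (2.0 * b)
p_G = np.exp(-z**2 / 2.0) / np.sqrt(2.0 * np.pi)  # standard normal pdf

D = np.sum(p_Z * np.log(p_Z / p_G)) * dz          # D(p_Z || p_G)
h_Z = -np.sum(p_Z * np.log(p_Z)) * dz             # h(Z)

print(h_Z, 0.5 * np.log(2.0 * np.pi * np.e) - D)  # the two should agree
```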

Using (2) and (3) in (1) gives $I(X;Y)\leq \frac{1}{2}\ln(1+P)+D(p_Z\|p_G)$ for every admissible input distribution; maximizing over $P_X$ yields the upper bound.
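Putting the pieces together, the whole sandwich can be checked numerically for one example: Gaussian input and unit-variance Laplace noise. All parameters below are illustrative assumptions, and since $I(X_g;Y)\leq C(P)$, the check only confirms consistency of the bounds, not the capacity itself:

```python
import numpy as np

# Check (1/2) log(1+P) <= I(X_g; X_g+Z) <= (1/2) log(1+P) + D(p_Z || p_G)
# for Gaussian input X_g ~ N(0, P) and unit-variance Laplace noise Z.
P = 2.0
dz = 0.01
z = np.arange(-20.0, 20.0, dz)

b = 1.0 / np.sqrt(2.0)
p_Z = np.exp(-np.abs(z) / b) / (2.0 * b)          # Laplace, Var(Z) = 1
p_G = np.exp(-z**2 / 2.0) / np.sqrt(2.0 * np.pi)  # standard normal pdf
p_X = np.exp(-z**2 / (2.0 * P)) / np.sqrt(2.0 * np.pi * P)
p_Y = np.convolve(p_X, p_Z, mode="same") * dz     # density of Y = X_g + Z

def diff_entropy(p, dx):
    q = p[p > 1e-300]
    return -np.sum(q * np.log(q)) * dx

D = np.sum(p_Z * np.log(p_Z / p_G)) * dz          # D(p_Z || N(0,1))
I = diff_entropy(p_Y, dz) - diff_entropy(p_Z, dz) # I(X_g; Y) = h(Y) - h(Z)

lower = 0.5 * np.log(1.0 + P)
upper = lower + D
print(lower, I, upper)
```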