Analytic expression for continuous-variable mutual information of uniform distributions


I want to quantify how mutual information depends on the variance of one of the variables. Here's a simple test I came up with:

$$X \sim U(0, 1)$$ $$Y \sim U(0, 1)$$ $$Z = (X + Y) / 2$$

where $U$ denotes the uniform distribution. I am interested in finding an analytical expression for the mutual information $I(\alpha X; Z)$ for a positive constant $\alpha$. I need this test to check the performance of a library that numerically estimates mutual information.
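
For concreteness, here is the kind of numerical check I have in mind. This is only a minimal sketch, not the library in question: it uses scikit-learn's `mutual_info_regression` (a kNN-based estimator that returns values in nats) as a stand-in, and the sample size, seed, and $\alpha$ values are arbitrary choices.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 20_000

x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, 1.0, size=n)
z = (x + y) / 2.0

# Estimate I(alpha * X; Z) for a few values of alpha;
# mutual_info_regression returns kNN-based estimates in nats.
for alpha in (0.5, 1.0, 2.0):
    mi = mutual_info_regression((alpha * x).reshape(-1, 1), z)[0]
    print(f"alpha = {alpha}: estimated I = {mi:.4f} nats")
```

An analytic expression would let me compare such estimates against ground truth.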

Edit: I don't actually care what $U$ is. If it is simpler to calculate the result for standard normal distributions, you may assume that instead.

Edit 2: Perhaps it is possible to produce a result for a general probability distribution. For example, according to the Wikipedia article on differential entropy,

$$H(\alpha X) = H(X) + \log(|\alpha|)$$

Does anybody know how to prove this? If one can prove this, and a similar result for the joint entropy $H(\alpha X, Z)$, then the mutual information would follow by simple subtraction.

Edit 3: The result for the univariate entropy can be proven by considering a pdf transformation: if $y = \alpha x$, then $\rho_y(y) = \frac{1}{|\alpha|} \rho_x(y / \alpha)$. One can then integrate the definition of differential entropy to obtain the desired result. The extension to the multivariate case appears to be somewhat more difficult; a worked version follows below.
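
Concretely (filling in the steps of the univariate argument, and sketching the multivariate one):

$$H(\alpha X) = -\int \frac{1}{|\alpha|} \rho_x\!\left(\frac{y}{\alpha}\right) \log\left[\frac{1}{|\alpha|} \rho_x\!\left(\frac{y}{\alpha}\right)\right] dy = -\int \rho_x(x) \left[\log \rho_x(x) - \log|\alpha|\right] dx = H(X) + \log|\alpha|,$$

where the second equality substitutes $x = y/\alpha$, $dy = |\alpha|\,dx$. The multivariate analogue is $H(A\mathbf{X}) = H(\mathbf{X}) + \log|\det A|$ for any invertible matrix $A$, proved the same way from $\rho_{\mathbf{y}}(\mathbf{y}) = \frac{1}{|\det A|}\, \rho_{\mathbf{x}}(A^{-1}\mathbf{y})$. Applying this to the pair $(X, Z)$ with $A = \operatorname{diag}(\alpha, 1)$ gives $H(\alpha X, Z) = H(X, Z) + \log|\alpha|$, so in the subtraction $I(\alpha X; Z) = H(\alpha X) + H(Z) - H(\alpha X, Z)$ the $\log|\alpha|$ terms cancel: $I(\alpha X; Z) = I(X; Z)$, consistent with mutual information being invariant under invertible transformations of either variable.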


There is 1 answer below.


As you weren't particular about the underlying distributions, let's assume that $$X\sim \mathcal{N}(0,\sigma^2_X), \qquad Y\sim \mathcal{N}(0,\sigma^2_Y)$$ with $X$ and $Y$ independent.

To ease the computation, consider first the unscaled additive Gaussian channel $Z = X + Y \sim \mathcal{N}(0,\sigma^2_X+\sigma^2_Y)$ (the factor $1/2$ in your definition of $Z$ is an invertible scaling and does not affect the mutual information), whose differential entropy is $$h(Z)=\frac{1}{2}\log\left[2\pi e (\sigma^2_X+\sigma^2_Y)\right]$$

By definition, $I(X;Z)=h(Z)-h(Z|X)$, and since $Z=X+Y$ with $Y$ independent of $X$, we have $h(Z|X)=h(Y)$. Hence $$I(X;Z)=\frac{1}{2}\log\left[2\pi e (\sigma^2_X+\sigma^2_Y)\right]-\frac{1}{2}\log\left[2\pi e \sigma^2_Y\right] = \frac{1}{2}\log\left(1+\frac{\sigma^2_X}{\sigma^2_Y}\right)$$

Also note that $\text{Var}(\alpha X)=\alpha^2\sigma^2_X$; hence, for the scaled channel $Z=\alpha X+Y$, $$I(\alpha X;Z)=\frac{1}{2}\log\left(1+\frac{\alpha^2\sigma^2_X}{\sigma^2_Y}\right)$$
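
As a quick sanity check, this closed form can be compared against a numerical estimate. A minimal sketch, again using scikit-learn's `mutual_info_regression` (kNN-based, in nats) as a stand-in for whatever library is being tested, with arbitrary parameter choices:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
n = 20_000
sigma_x, sigma_y, alpha = 1.0, 0.5, 2.0

x = rng.normal(0.0, sigma_x, size=n)
y = rng.normal(0.0, sigma_y, size=n)
z = alpha * x + y  # the scaled additive Gaussian channel

# Closed form in nats: 0.5 * log(1 + alpha^2 * sigma_x^2 / sigma_y^2)
exact = 0.5 * np.log1p(alpha**2 * sigma_x**2 / sigma_y**2)
est = mutual_info_regression((alpha * x).reshape(-1, 1), z)[0]
print(f"exact = {exact:.4f} nats, estimated = {est:.4f} nats")
```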

EDIT: Based on comments:

(TBA)