Result regarding mutual information of bounded random variable


I have a random variable $X$ which takes values in $[0,1].$ Thus $X$ can be written in binary expansion as $$ X=0.X_1X_2\ldots X_k\ldots $$ where $(X_1,X_2,\ldots,X_k)$ denotes the random vector of the first $k$ bits of $X.$ Let $X_1^k=(X_1,X_2,\ldots,X_k)$ and $X_{k+1}^{\infty}=(X_{k+1},X_{k+2},\ldots)$. I am trying to prove the following:

$$ I(X_1^k;X_{k+1}^{\infty})= \lim_{m \rightarrow \infty} I(X_1^k;X_{k+1}^{k+m}). $$

Is this correct? I am not sure where to start in proving the above result. One can assume that $X$ has nice properties, such as having a density, if required.

There is 1 answer below.
I think you are taking the wrong approach to the formula. That equality is not a property to be proved from the particular properties of the vectors involved; rather, it is essentially a (general) definition.

The mutual information $I(X;Y)$ is well defined for any pair of random variables $X$, $Y$. This includes multivariate random variables, which can be represented as vectors of arbitrary (finite) dimension: $I(X_{1}^n;Y_1^m)$. Now, we might want to extend this to vectors of "infinite dimension" (sequences). The natural way to define the mutual information between a finite vector $X_{1}^n$ and an infinite vector $Y_1^\infty$ is:
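To make the finite-dimensional quantity concrete, here is a minimal sketch (my addition, not part of the answer) that computes $I(X;Y)$ in bits directly from a joint probability table, using the standard formula $I(X;Y)=\sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)p(y)}$; the example distributions are hypothetical.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits, computed from a 2-D joint pmf array."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = joint > 0                        # skip zero cells (0 log 0 = 0)
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

# Two fair bits with X2 = X1: perfectly dependent, so I(X1;X2) = 1 bit.
perfectly_correlated = [[0.5, 0.0], [0.0, 0.5]]

# Two independent fair bits: I(X1;X2) = 0 bits.
independent = [[0.25, 0.25], [0.25, 0.25]]
```

For vector-valued arguments such as $I(X_1^n;Y_1^m)$ one would flatten the joint pmf over all $2^n \times 2^m$ bit patterns into the same kind of table.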

$$I(X_1^n;Y_{1}^{\infty})= \lim_{m \rightarrow \infty} I(X_1^n;Y_{1}^{m})$$
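One point worth making explicit (my addition, a standard argument) is why this limit exists at all. By the chain rule for mutual information,

$$I(X_1^n;Y_1^{m+1}) = I(X_1^n;Y_1^{m}) + I(X_1^n;Y_{m+1}\mid Y_1^{m}) \ge I(X_1^n;Y_1^{m}),$$

since conditional mutual information is non-negative. The sequence $I(X_1^n;Y_1^m)$ is therefore non-decreasing in $m$, so the limit always exists (possibly equal to $+\infty$), and the definition is well posed.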