Entropy of noisy signal

We have an input signal $X$, an output signal $Y$, and random noise $Z$, related by:

$$Y=X+Z$$

Of course, for the mutual information: $$I(X;Y)=H(X)-H(X\mid Y)=H(X)-H(X-Y\mid Y) \geq H(X)-H(X-Y),$$ where the second equality holds because, given $Y$, knowing $X$ is equivalent to knowing $X-Y$, and the inequality holds because conditioning cannot increase entropy.
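
The chain above can be checked numerically. Below is a minimal sketch with a hypothetical toy example (my own choice, not from the question): $X$ and $Z$ independent uniform bits and $Y=X+Z$ as an integer sum. Since negation is a bijection, $H(X-Y)=H(-Z)=H(Z)$, so the lower bound is $H(X)-H(Z)$.

```python
from math import log2
from itertools import product

def H(dist):
    """Shannon entropy in bits of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Toy example (assumed): X and Z independent uniform bits, Y = X + Z (integer sum).
pX = {0: 0.5, 1: 0.5}
pZ = {0: 0.5, 1: 0.5}

# Joint distribution of (X, Y).
pXY = {}
for x, z in product(pX, pZ):
    y = x + z
    pXY[(x, y)] = pXY.get((x, y), 0) + pX[x] * pZ[z]

# Marginal of Y.
pY = {}
for (x, y), p in pXY.items():
    pY[y] = pY.get(y, 0) + p

# H(X | Y) = H(X, Y) - H(Y)
H_X_given_Y = H(pXY) - H(pY)
I_XY = H(pX) - H_X_given_Y      # I(X; Y)
lower_bound = H(pX) - H(pZ)     # H(X) - H(X - Y), since H(X - Y) = H(Z)

print(I_XY, lower_bound)        # 0.5 0.0 -> I(X;Y) >= H(X) - H(Z) holds
```

Here $Y\in\{0,1,2\}$ and the extreme values $0$ and $2$ reveal $X$ exactly, so the inequality is strict: $I(X;Y)=0.5 > 0 = H(X)-H(Z)$.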

Could we say that $H(X-Y\mid Y)=H(X-Y)$ means we have perfect reconstruction?

Thank you very much!

There is 1 answer below.

Rather on the contrary: since $X-Y=-Z$, the condition says $H(X\mid Y)=H(Z\mid Y)=H(Z)$, i.e. the output tells you nothing about the noise, and the residual uncertainty about the input given the output is the entire noise entropy $H(Z)$ — the worst case permitted by the bound above, not perfect reconstruction.
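
To see the equality case concretely, here is a sketch under an assumption of mine: addition mod 2 (XOR) in place of the integer sum, which turns the channel into a one-time pad. With $X$ and $Z$ independent uniform bits and $Y=X\oplus Z$, the condition $H(Z\mid Y)=H(Z)$ holds and $Y$ reveals nothing about either $X$ or $Z$.

```python
from math import log2
from itertools import product

def H(dist):
    """Shannon entropy in bits of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Assumed equality case: X, Z independent uniform bits, Y = X XOR Z (one-time pad).
pX = {0: 0.5, 1: 0.5}
pZ = {0: 0.5, 1: 0.5}

pXY, pZY = {}, {}
for x, z in product(pX, pZ):
    y = x ^ z
    pXY[(x, y)] = pXY.get((x, y), 0) + pX[x] * pZ[z]
    pZY[(z, y)] = pZY.get((z, y), 0) + pX[x] * pZ[z]

pY = {}
for (x, y), p in pXY.items():
    pY[y] = pY.get(y, 0) + p

H_X_given_Y = H(pXY) - H(pY)   # uncertainty about the input given the output
H_Z_given_Y = H(pZY) - H(pY)   # uncertainty about the noise given the output

print(H_X_given_Y, H_Z_given_Y, H(pZ))   # 1.0 1.0 1.0
```

All three entropies coincide at $1$ bit: $H(X\mid Y)=H(Z\mid Y)=H(Z)$, so $I(X;Y)=H(X)-H(Z)=0$ and the output is useless for reconstructing the input.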