Does an information measure for a signal do a better job if it incorporates assumptions about the signal?
For example: I have a digital stream of data, 0s and 1s arriving at a clock rate $r$. What is the best way to measure the information content of my stream? Now suppose I add some 'jitter' to my clock, so the rate becomes $r+\delta r$, where $\delta r$ is a random variable. Clearly there is additional information 'hidden' in the jitter. How can I best quantify the information content now? Is there a systematic way to choose a measure of information based on what I know about the signal?
I'm sorry if this is not the right forum to post this.
Let's say your source is $X$, your observation is $Y$, and your noise source is $Z$. The (Shannon) entropy of $Y$ increases as the noise gets stronger, as you noted. However, the mutual information $I(X;Y)$ can only decrease when you add more noise (by the data processing inequality). In this case, mutual information is the right measure: it quantifies how much information $Y$ carries about the source, rather than about the noise.
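To make this concrete, here is a small sketch (my own illustrative example, not from the answer above) using a binary symmetric channel: a biased source $X \sim \mathrm{Bernoulli}(q)$ is observed through a channel that flips each bit with probability $p$. As $p$ grows toward $1/2$, the entropy $H(Y)$ of the observation rises, while the mutual information $I(X;Y) = H(Y) - H(Y\mid X)$ falls to zero:

```python
import math

def h(p):
    """Binary entropy in bits; defined as 0 at the endpoints."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_stats(q, p):
    """Source X ~ Bernoulli(q) observed through a binary symmetric
    channel that flips each bit with probability p.
    Returns (H(Y), I(X;Y)) in bits."""
    py1 = q * (1 - p) + (1 - q) * p   # P(Y = 1)
    H_Y = h(py1)
    # For a BSC, H(Y|X) = h(p) regardless of the input distribution,
    # so I(X;Y) = H(Y) - h(p).
    I_XY = H_Y - h(p)
    return H_Y, I_XY

for p in (0.0, 0.1, 0.3, 0.5):
    H_Y, I_XY = bsc_stats(0.2, p)
    print(f"p={p:.1f}  H(Y)={H_Y:.3f}  I(X;Y)={I_XY:.3f}")
```

Running this shows $H(Y)$ climbing toward 1 bit while $I(X;Y)$ drops to 0 at $p = 1/2$: the observation looks "richer" (higher entropy) but tells you progressively less about the source.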