Information contained in single random sample


There are two parties. Both parties know a fixed probability distribution $P(t)$ with mean 0. Both parties also know two fixed arbitrary numbers $a, b$ with $a < b$. The first party chooses an arbitrary number $t_0 \in [a,b]$, generates one random sample $\tilde{t}$ from $P(t-t_0)$, and communicates it to the second party. How much information (in bits) can be communicated between the parties via this protocol?

To the best of my understanding, $\tilde{t}$ is the best estimator of $t_0$, and I should somehow extract the information quantity from the distribution of $\tilde t$ via Shannon's theorem, but I do not quite understand how to proceed.


There is 1 best solution below

On BEST ANSWER

The (information) capacity of a memoryless channel, for a given input distribution, is given by

$$ \tag{1} C = h(Y)-h(Y|X), $$

in bits per channel use, where $X$ is the input (symbol) of the channel, $Y$ is the output (symbol) of the channel, and $$h(Z)\triangleq -\int p(z) \log_2 p(z)\, dz$$ is the differential entropy of a random variable $Z$ (assuming the integral exists), with $p(z)$ denoting the probability density function of $Z$.
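As a standard closed-form example (not part of the original question): if $Z$ is Gaussian with variance $\sigma^2$, the integral above evaluates to

$$ h(Z) = \frac{1}{2}\log_2\!\left(2\pi e \sigma^2\right) \text{ bits}, $$

which is useful as a sanity check for any numerical entropy computation.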

In your case, the conditional distribution of $Y$ given $X$ equals $$ p(y|x)=P(y-x), $$ whereas, taking the input $X$ uniform on $[a,b]$ (so $p(x)=\frac{1}{b-a}$; strictly speaking, capacity requires maximizing (1) over $p(x)$, and the uniform choice gives a lower bound), the (unconditional) distribution of $Y$ equals $$ \begin{align} p(y)&=\int p(x)\,p(y|x)\,dx\\ &=\frac{1}{b-a}\int_a^b P(y-x)\,dx. \end{align} $$

Given the above distributions, you are now in a position to compute the entropies on the right-hand side of (1). Of course, in the general case, the form of $P(\cdot)$ will not allow for simple, closed-form expressions of the entropies (and, therefore, capacity), and one will have to compute the entropies numerically.
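As a sketch of that numerical route, here is one way it might look in Python. The specific choices below are illustrative assumptions, not part of the question: $P$ is taken as a standard Gaussian, $[a,b]=[0,4]$, and the input is uniform as above, so the result is the mutual information for that particular input rather than the true capacity.

```python
import numpy as np

# Illustrative assumptions (not from the question): P is a standard
# Gaussian with std sigma, [a, b] = [0, 4], and X is uniform on [a, b].
a, b, sigma = 0.0, 4.0, 1.0

def P(z):
    """Noise density P(z): zero-mean Gaussian with standard deviation sigma."""
    return np.exp(-z**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Grids for the output y and the input x, with trapezoid quadrature weights.
y = np.linspace(a - 8 * sigma, b + 8 * sigma, 4001)
x = np.linspace(a, b, 1001)
wy = np.full_like(y, y[1] - y[0]); wy[0] /= 2; wy[-1] /= 2
wx = np.full_like(x, x[1] - x[0]); wx[0] /= 2; wx[-1] /= 2

# p(y) = (1/(b-a)) * integral_a^b P(y - x) dx, evaluated by quadrature.
p_y = (P(y[:, None] - x[None, :]) @ wx) / (b - a)

# h(Y) = -integral p(y) log2 p(y) dy.
h_Y = -np.sum(p_y * np.log2(p_y) * wy)

# h(Y|X) = h(P) for additive noise independent of X: every h(Y|X=x)
# equals the noise entropy, which for a Gaussian is 0.5*log2(2*pi*e*sigma^2).
h_Y_given_X = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

# Mutual information in bits per channel use, for this uniform input.
I_XY = h_Y - h_Y_given_X
print(f"h(Y) = {h_Y:.4f} bits, I(X;Y) = {I_XY:.4f} bits")
```

Maximizing over input distributions $p(x)$ instead of fixing the uniform one would give the actual capacity; the value computed here is a lower bound on it.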