Information limit for digital signal


Wikipedia gives the Shannon-Hartley theorem as:

$$ C = B \log_2 \left(1+ \frac{S}{N}\right) $$

where $S/N$ is the signal-to-noise ratio, with each quantity measured in watts.

What if the channel is purely digital, and simply has a probability $p$ to flip each bit transmitted? Is that a special case of the above equation? How can the channel capacity be calculated?


BEST ANSWER

The Shannon-Hartley theorem concerns the Gaussian channel, i.e., a channel in which Gaussian noise with variance $N$ is added to the input signal. The information you get across this channel depends on the distribution of the input. The maximum information you can get across is called the capacity, and the input distribution that achieves this capacity is the Gaussian distribution for a Gaussian channel. The capacity of the channel is precisely what is stated in the Shannon-Hartley theorem.
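The Shannon-Hartley formula is simple to evaluate numerically. As a minimal sketch (the function name and the example bandwidth/SNR values are illustrative, not from the question):

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Capacity in bits per second of a Gaussian channel:
    C = B * log2(1 + S/N), with S/N as a linear (not dB) ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values: 1 MHz bandwidth, 30 dB SNR (linear ratio 1000)
print(shannon_hartley_capacity(1e6, 1000))  # ≈ 9.97e6 bits/s
```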

The other example you give, in which you have a digital input and each bit is flipped independently with probability $p$, is a binary symmetric channel (assuming that you only allow ones and zeros at the channel input). Here too, the information you get across the channel depends on the input distribution. The maximum, i.e., the capacity, is achieved in this case by the uniform (or, Bernoulli-$1/2$) distribution: zeros and ones are transmitted with equal probability. However, the capacity of this channel differs from that of the Gaussian channel. It can be shown easily that the capacity in this case is $C=1-h_2(p)$, where $h_2(p)=-p\log_2 p-(1-p)\log_2(1-p)$ is the binary entropy function.
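The BSC capacity $C=1-h_2(p)$ can be sketched in a few lines (function names are illustrative; logarithms are base 2, so the capacity is in bits per channel use):

```python
import math

def binary_entropy(p: float) -> float:
    """h2(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless channel: one full bit per use)
print(bsc_capacity(0.5))   # 0.0  (output is independent of input)
print(bsc_capacity(0.11))  # ≈ 0.5
```

Note that the capacity is symmetric in $p$ and $1-p$: a channel that flips every bit with certainty is as useful as a noiseless one, since the receiver can simply invert the output.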


I am adding this as an answer, but I am not sure whether it is correct:

I think this would be a very simple case of the theorem. Since each bit has probability $p$ of being corrupted, on average a fraction $p$ of the bits will be erroneous, or noisy.

By this logic, the signal-to-noise ratio would be $p$, so the modified equation would be $C= B \log_2{(1+p)}$.