The well-known capacity formula for the additive white Gaussian noise (AWGN) channel is \begin{align} C=\frac{1}{2}\log_2(1+{\sf SNR}) \text{ [bits/channel use]}, \end{align} where ${\sf SNR}=P/N$ is the signal-to-noise ratio. In other words, there exists a code enabling reliable communication at any rate below $\frac{1}{2}\log_2(1+{\sf SNR})$, and no code enables reliable communication at any rate above $\frac{1}{2}\log_2(1+{\sf SNR})$.
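(As a quick numerical sanity check of the formula, with example values I picked myself:)

```python
import math

def awgn_capacity(snr):
    """AWGN channel capacity in bits per (real) channel use: C = 0.5*log2(1+SNR)."""
    return 0.5 * math.log2(1 + snr)

# Example: SNR = P/N = 15 gives C = 0.5 * log2(16) = 2.0 bits/channel use
print(awgn_capacity(15))  # -> 2.0
print(awgn_capacity(1))   # -> 0.5
```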
This can be verified directly from Shannon's channel coding theorem by computing the mutual information. For newcomers to information theory, however, the so-called "sphere-packing argument" is appealing because of its intuitiveness. As described in the Wikipedia article linked above, the converse part (an upper bound on achievable rates) posits that the $n$-dimensional balls of radius $\sqrt{nN}$ centered at the codewords must not overlap if decoding errors are to be avoided.
What I'm wondering about is the validity of this argument. To the best of my knowledge, the sphere-packing density in $n$ dimensions is upper-bounded by $2^{-0.599n}$. Applying the sphere-packing argument again, now with this density bound, \begin{align} \text{(density)}=\frac{M\times \left(\sqrt{nN}\right)^n}{\left(\sqrt{n(P+N)}\right)^n}\leq 2^{-0.599n}, \end{align} which boils down to \begin{align} R=\frac{\log_2 M}{n}\leq \frac{1}{2}\log_2(1+{\sf SNR})-0.599. \end{align}
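To make the discrepancy concrete, here is a small numerical sketch (my own illustration, with arbitrarily chosen SNR values) comparing the Shannon capacity with the rate bound obtained from the density bound above; the deficit is a constant $0.599$ bits per channel use regardless of SNR:

```python
import math

DENSITY_EXPONENT = 0.599  # sphere-packing density <= 2^{-0.599 n}

def capacity(snr):
    """Shannon capacity of the AWGN channel, bits/channel use."""
    return 0.5 * math.log2(1 + snr)

def packing_rate_bound(snr):
    """Rate bound from M*(nN)^{n/2}/(n(P+N))^{n/2} <= 2^{-0.599 n},
    i.e. R = log2(M)/n <= 0.5*log2(1+SNR) - 0.599."""
    return 0.5 * math.log2(1 + snr) - DENSITY_EXPONENT

for snr in (1, 15, 100):
    gap = capacity(snr) - packing_rate_bound(snr)
    print(snr, capacity(snr), packing_rate_bound(snr), gap)  # gap is always 0.599
```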
This clearly contradicts Shannon's theorem. My guess is that the error probability can still vanish asymptotically even when a slight overlap of the spheres is allowed. My questions are:
- Is my conclusion that the sphere-packing argument contains a logical flaw reasonable?
- If so, how can we modify the argument?
- And is my guess correct?