I encountered this question while studying the Theil index for income inequality, which was introduced by Theil, who borrowed heavily from information theory.
My question is exactly the one in the title: can we calculate $(0/0)\log(0/0)$? I know that in information theory we use the conventions $0\log(0)=0$ and $0\log(0/0)=0$. But what about $(0/0)\log(0/0)$?
Thanks!
Actually, what we have is that several formulas (esp. the entropy) contain the term $p_i \log(p_i)$. That expression is not defined when $p_i=0$, because $0 \log(0)$ is not defined, but it is readily seen that the formulas remain valid if we adopt the convention $x \log(x)|_{x=0}=0$ (which is also justified by a limit argument, since $x\log(x)\to 0$ as $x\to 0^+$). Written more concisely (but also more confusingly): $0 \log(0)=0$. But it must be understood that this "rule" only applies to that kind of expression; applied elsewhere, it leads to absurd consequences.
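To make the convention concrete, here is a minimal sketch of an entropy computation that applies $0\log(0)=0$ by simply skipping zero-probability terms (the function name `entropy` is my own, not from the original post):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, using the convention 0*log(0) = 0."""
    # Terms with p == 0 are skipped, which is exactly the convention:
    # they contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A distribution with a zero-probability outcome behaves like a fair coin:
print(entropy([0.5, 0.5, 0.0]))  # → 1.0
```

Without the `if p > 0` guard, `math.log2(0.0)` would raise a `ValueError`, which is precisely why the convention has to be stated explicitly.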
... and this might be one such case. I'm not sure where you've read this, and it's not clear what it means. If you mean, as before, $x \log(x/x)|_{x=0}=0$ ... well, that's obviously true in the limit, because we get $0 \log(1) = 0 \times 0 = 0$.
Again, the question as stated makes little sense. Obviously, $(x/x) \log(x/x) $ tends to $1 \log(1)=1 \times 0 =0$ as $x\to 0$.
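As a quick numerical sanity check of the two limits discussed above, one can evaluate both expressions at small positive $x$ (this is just an illustration of the limit argument, not part of the original answer):

```python
import math

# As x -> 0+:
#   x * log(x)         -> 0   (the entropy convention)
#   (x/x) * log(x/x)   =  1 * log(1) = 0 exactly, for every x > 0
for x in (1e-2, 1e-5, 1e-10):
    print(x * math.log(x), (x / x) * math.log(x / x))
```

The first column shrinks toward $0$, while the second is identically $0$ for all $x>0$; neither fact, of course, justifies treating $0/0$ itself as defined.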
But to take as such a general "rule" that $(0/0)\log(0/0)=0$ would be as wrong/objectionable and prone to errors as assuming $0/0=1$.