Confusion on Shannon's joint entropy


Given the definition of Shannon's joint entropy $H(X,Y) = -\sum_{x,y}p(x,y)\log(p(x,y))$, I am not sure how to compute it for a distribution such as $(p, 1-p)$. It seems very simple, but I think I have misunderstood the definition. This problem comes from an exercise in Nielsen and Chuang's book:

"Suppose $ \rho = p\lvert0\rangle\langle0\rvert + \dfrac{(1-p)}{2}(\lvert0\rangle+\lvert1\rangle)(\langle0\rvert + \langle1\rvert) $

Evaluate $S(\rho)$ and $H(p,1-p)$"
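For context, here is how I am trying to check things numerically (a numpy sketch; the value $p = 0.3$ is just a sample I picked, and I am using base-2 logarithms throughout):

```python
import numpy as np

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i), skipping zero probabilities."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

p = 0.3  # sample value; any 0 < p < 1 works

# rho = p|0><0| + (1-p)/2 (|0>+|1>)(<0|+<1|) written as a 2x2 matrix
rho = p * np.array([[1.0, 0.0], [0.0, 0.0]]) \
    + (1 - p) / 2 * np.array([[1.0, 1.0], [1.0, 1.0]])

# von Neumann entropy S(rho): Shannon entropy of rho's eigenvalues
eigvals = np.linalg.eigvalsh(rho)
S_rho = shannon_entropy(eigvals)

# binary entropy H(p, 1-p)
H_p = shannon_entropy([p, 1 - p])

print(S_rho, H_p)
```

Running this I see $S(\rho) < H(p, 1-p)$, which I believe is expected since the two states in the mixture are not orthogonal, but it is the Shannon side that confuses me.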

I am stuck on evaluating the Shannon entropy.

The factor $p(x,y)$ is the joint probability distribution, so it can be written as $p(x,y) = p(Y = y \mid X = x)\,p(X = x)$.

But for $X = p$ and $Y = 1-p$, how can I compute $H(X = p, Y = 1-p)$? And how can the result differ from the binary entropy of a variable $X$ distributed as $(p, 1-p)$?

Is $H(X = p, Y = 1 - p) = -\sum_{x,y}p(x,y)\log(p(x,y)) = -\,p(Y = 1-p \mid X = p)\,p(X = p)\log\big(p(Y = 1-p \mid X = p)\,p(X = p)\big) = -(1-p)\,p\log((1-p)p)$?
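Numerically, this single-term value $-p(1-p)\log_2(p(1-p))$ does not match the binary entropy $-p\log_2 p - (1-p)\log_2(1-p)$, which is part of my confusion (a numpy sketch; $p = 0.3$ is just a sample value):

```python
import numpy as np

p = 0.3  # sample value

# single-term guess: -p(1-p) log2(p(1-p))
guess = -(p * (1 - p)) * np.log2(p * (1 - p))

# binary entropy: -p log2(p) - (1-p) log2(1-p)
binary = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

print(guess, binary)
```

The two numbers come out clearly different, so either my expansion of the joint distribution or my reading of the exercise must be off.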

If not, what concept am I missing?