Understanding conditional entropy in the case of a uniform distribution


Would you please help me understand the conditional entropy in this example, which I am stuck on?

The example considers 4 uniformly popular binary vectors {f1, f2, f3, f4}, each with entropy F bits, so H(f1) = H(f2) = H(f3) = H(f4) = F bits. It is assumed that the pair {f1, f2} is independent of the pair {f3, f4}, while correlations exist between f1 and f2, and between f3 and f4.

Now they mention that "Specifically, H(f1|f2) = H(f2|f1) = F/4 and H(f3|f4) = H(f4|f3) = F/4." My question is: how is the answer F/4, and why is it presented as if it were obvious?
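For reference (my own addition, not part of the quoted example), the standard chain-rule identity relating these quantities is:

$$H(f_1 \mid f_2) = H(f_1, f_2) - H(f_2),$$

so the quoted figure H(f1|f2) = F/4 is equivalent to saying the mutual information within a pair is I(f1; f2) = H(f1) − H(f1|f2) = 3F/4, i.e. the correlation is assumed to be strong enough that knowing f2 removes three quarters of the uncertainty about f1.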

I understand that since these vectors are uniformly popular, the probability of picking each of them is 1/4, and since the pairs {f1, f2} and {f3, f4} are independent, the probability of picking each pair is 1/2. So I think the conditional entropy, for example H(f1|f2) = H(f2|f1), should be F/2, not F/4. Could somebody please help me understand the answer? Why is it F/4?
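To make the quantities concrete, here is a small sketch (my own illustration; the joint distribution is hypothetical, not from the example) showing that H(f1|f2) is computed from the joint distribution of the two vectors via the chain rule H(f1|f2) = H(f1,f2) − H(f2), not from the probability of picking a vector or a pair:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint pmf p(f1, f2) for two correlated bits, chosen only
# to show that correlation lowers the conditional entropy below H(f1).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

h_joint = entropy(joint.values())  # H(f1, f2)
h_f2 = entropy([joint[0, 0] + joint[1, 0],
                joint[0, 1] + joint[1, 1]])  # H(f2) from the marginal
h_f1_given_f2 = h_joint - h_f2  # chain rule: H(f1|f2) = H(f1,f2) - H(f2)

# Each bit alone is uniform (H(f1) = H(f2) = 1 bit), yet the conditional
# entropy is below 1 bit because f1 and f2 are correlated.
print(h_f1_given_f2)
```

The point of the sketch is that the uniform marginals fix H(f1) and H(f2) at 1 bit each, but H(f1|f2) can be anything between 0 and 1 bit depending on how strongly the pair is correlated, which is why the value F/4 has to come from an assumed joint distribution rather than from the picking probabilities.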