Calculating Entropy


Hi there kind people,

I'm studying for an Artificial Intelligence test in a week or so, and this question is from a past paper - and it has really stumped me. Any help would be appreciated.

Thank you very much in advance :)


Assume the data shown in the table below are split by a decision tree node.

Calculate the entropy of these data, given that log2(0.2) = -2.3 and log2(0.8) = -0.3.

╔════╦════╦═══════╗
║ X1 ║ X2 ║ Class ║
╠════╬════╬═══════╣
║  0 ║  0 ║     1 ║
║  0 ║  1 ║     1 ║
║  0 ║  0 ║     0 ║
║  0 ║  1 ║     0 ║
║  0 ║  1 ║     0 ║
║  1 ║  0 ║     0 ║
║  1 ║  0 ║     0 ║
║  1 ║  1 ║     0 ║
║  0 ║  0 ║     0 ║
║  0 ║  1 ║     0 ║
╚════╩════╩═══════╝
Best Answer

There are two output classes. Given the hint, each row must be equally likely: 2 of the 10 rows belong to class 1 and the remaining 8 to class 0, so the class probabilities are 0.2 and 0.8 respectively.

The definition of entropy is $$-\sum_{x} p_x \log_2(p_x)$$ which, using the given logarithm values, evaluates to $$-0.2 \times (-2.3) - 0.8 \times (-0.3) = 0.46 + 0.24 = 0.7$$ bits.
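To check the arithmetic, here is a small sketch of the same calculation in Python (the `entropy` helper is my own, not from any library). Note that the exact value is about 0.722 bits; the 0.7 above comes from the rounded logarithms supplied in the question.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Class counts from the table: 2 rows of class 1, 8 rows of class 0.
p = [2 / 10, 8 / 10]
print(round(entropy(p), 4))  # → 0.7219 (exact logs, vs. 0.7 with the rounded hints)
```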