I am trying to understand entropy. From what I know, we can compute the entropy of a single variable, say $X$.
What I don't understand is how to calculate the entropy of an $m \times n$ matrix. I thought that if the columns are the attributes and the rows are the objects, we could sum the entropies of the individual columns to get the final entropy (provided the attributes are independent). I have a couple of questions:
- Is my understanding right in the case of independent attributes?
- What if the attributes are dependent? What happens to the entropy? Is this where conditional entropy comes in?
Thanks
First of all, you should keep in mind that there are actually many definitions of entropy. The most common one is the so-called Shannon information entropy:
$$H(X) = -\sum_{i=1}^{n}p_i\log p_i$$ where $p_i$ is the probability of seeing the $i$th possible outcome of $X$.
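To make the formula concrete, here is a minimal sketch that estimates the $p_i$ from the empirical frequencies of the observed values (log base 2, so the result is in bits):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), where each p_i is
    estimated as the relative frequency of the i-th distinct value."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

# A fair coin has exactly 1 bit of entropy:
print(entropy(["H", "T", "H", "T"]))  # → 1.0
```

Note that this treats the data as samples from a discrete distribution; for continuous data you would first bin the values.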
Therefore, to compute it you need a probability distribution over the possible outcomes, which in practice you typically estimate from observed frequencies.
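Regarding the question about summing column entropies: a small numerical sketch (empirical frequencies, treating each row as one joint outcome) shows that the sum of the column entropies equals the joint entropy only when the columns are independent, and overcounts when they are dependent:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy from the empirical frequencies of hashable outcomes."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

# Two independent binary columns: joint entropy = sum of marginals (2 bits).
rows = [(a, b) for a in (0, 1) for b in (0, 1)]
print(entropy([r[0] for r in rows]) + entropy([r[1] for r in rows]))  # → 2.0
print(entropy(rows))                                                  # → 2.0

# Perfectly dependent columns (second equals first): the joint entropy
# is only 1 bit, so summing column entropies (2 bits) overcounts.
dep = [(0, 0), (1, 1)]
print(entropy([r[0] for r in dep]) + entropy([r[1] for r in dep]))  # → 2.0
print(entropy(dep))                                                  # → 1.0
```

The gap between the sum of the marginal entropies and the joint entropy is exactly the mutual information between the columns, which is where conditional entropy enters the picture.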
I am not sure in what context you want to find the entropy of a matrix, but in image processing images are represented by matrices. The way we measure the entropy of an image is to build a histogram of its pixel intensities, normalize it into a probability distribution, and plug those probabilities into the formula above.