Suppose I have a random vector $\bar{X}=[X_{1},X_{2},X_{3}]$, where $X_1$, $X_2$, and $X_3$ take values in the alphabet $\{0,1,2,3\}$ (this generalizes to any finite set of cardinality $N$). I have no information about the joint probability distribution of these random variables.
If $X$ is a random variable, its entropy is
$H(X) = -\displaystyle\sum_{x} p(x)\log p(x).$
Similarly, I understand that there is a definition of joint entropy for the random vector $\bar{X}=[X_{1},X_{2},X_{3}]$.
I also understand that, in the absence of constraints, the entropy-maximizing distribution is the uniform distribution. So here the joint pmf (the variables are discrete, so it is a pmf rather than a pdf) would be uniform when no constraints exist.
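As a quick sanity check of that claim (my own illustration, not taken from any reference), here is the entropy of the uniform pmf on $\{0,1,2,3\}$ compared with an arbitrary skewed pmf:

```python
import math

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # uniform pmf on {0,1,2,3}
skewed = [0.7, 0.1, 0.1, 0.1]        # an arbitrary non-uniform pmf

print(entropy(uniform))  # log2(4) = 2 bits, the maximum on 4 symbols
print(entropy(skewed))   # strictly less than 2 bits
```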
My questions:
- What if I have a constraint on $X_{3}$, such as $E[X_{3}^{2}] < \alpha$?
- What if I have a constraint on $X_{2}$ and $X_{3}$, such as $E[X_{2}^{2}] + E[X_{3}^{2}] < \alpha$?
How do we find the entropy-maximizing distributions in these cases? Can anyone give some details?
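For the first constraint, my guess (an assumption on my part, not something I have verified) is that the maximizer has the Gibbs form $p(x) \propto e^{-\lambda x^{2}}$, with $\lambda > 0$ chosen so the constraint is tight, and $\lambda = 0$ (uniform) when the uniform pmf already satisfies it. Here is a numerical sketch of that idea for a single variable on $\{0,1,2,3\}$, using bisection on $\lambda$; the function name and the bisection bracket are my own choices:

```python
import math

def max_entropy_dist(alphabet, alpha, tol=1e-10):
    """Conjectured max-entropy pmf on `alphabet` subject to E[X^2] <= alpha.

    Assumption: the optimum is uniform if the uniform pmf is feasible,
    and otherwise has Gibbs form p(x) = exp(-lam*x^2)/Z with lam > 0
    chosen so that E[X^2] = alpha (found here by bisection).
    """
    n = len(alphabet)

    def moment(lam):
        # E[X^2] under the tilted pmf p(x) = exp(-lam*x^2)/Z
        w = [math.exp(-lam * x * x) for x in alphabet]
        Z = sum(w)
        return sum(wi * x * x for wi, x in zip(w, alphabet)) / Z

    if moment(0.0) <= alpha:          # uniform pmf already feasible
        return [1.0 / n] * n

    lo, hi = 0.0, 1.0
    while moment(hi) > alpha:         # grow an upper bracket for lam
        hi *= 2.0
    while hi - lo > tol:              # bisect: moment is decreasing in lam
        mid = 0.5 * (lo + hi)
        if moment(mid) > alpha:
            lo = mid
        else:
            hi = mid

    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x * x) for x in alphabet]
    Z = sum(w)
    return [wi / Z for wi in w]
```

For example, `max_entropy_dist([0, 1, 2, 3], 2.0)` returns a pmf whose probabilities decrease in $x^{2}$ and whose second moment equals $2$, while a loose bound like $\alpha = 10$ (the uniform pmf has $E[X^{2}] = 3.5$) gives back the uniform pmf. Is this Gibbs-form guess correct, and how does it extend to the joint constraint on $X_2$ and $X_3$?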
Thanks.