I am asked to determine the minimum possible entropy of a discrete random variable.
I have a hunch that the result will be zero, given that $$H(X) = \sum_i p(x_i)\log\left(\frac{1}{p(x_i)}\right)$$
Let's say that our random variable takes on $n$ values. We can zero out $n-1$ terms of the sum by setting their probabilities to zero (using the standard convention that $0 \cdot \log(1/0) = 0$, justified by the limit $\lim_{p \to 0^+} p\log(1/p) = 0$). Since the probabilities must add up to $1$, the one remaining event has probability $1$, and it contributes $1 \cdot \log(1) = 0$ to the overall entropy.
So, the minimum entropy is $0$, and there are exactly $n$ distributions that achieve it: choose one element and assign it probability $1$ ($n$ ways to do that), with the rest set to zero.
Is this the correct solution?
Yes. If a variable can take any of the values $\{0, 1, 2, \ldots, n\}$ but always evaluates to the same number, e.g. $2$, then there is no uncertainty about the outcome, hence the entropy is $0$.
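For what it's worth, this is easy to check numerically. Here is a minimal Python sketch of the entropy formula above (the function name `entropy` is my own; it uses the convention that terms with zero probability contribute nothing):

```python
import math

def entropy(p):
    """Shannon entropy in bits; terms with p(x) = 0 contribute 0 by convention."""
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

# Degenerate distribution: one outcome has probability 1, the rest 0.
print(entropy([0.0, 1.0, 0.0, 0.0]))  # 0.0

# For comparison, the uniform distribution on 4 outcomes gives log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The degenerate distribution indeed gives $0$, and any spread of probability mass across more than one outcome gives a strictly positive value.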