How to prove the entropy bounds using the information inequality?


The uncertainty (= entropy) of a discrete random variable $X$, in bits, is defined by
$$H(X) = \mathrm{E}\{-\operatorname{ld}(p_X(X))\} = -\sum_{x \in \operatorname{supp}(p_X)} p_X(x)\,\operatorname{ld}(p_X(x)),$$
where $\operatorname{ld}$ denotes the base-2 logarithm.

**Problem.** Using the information inequality, which says that $\operatorname{ld}(x) \le (x-1)\operatorname{ld}(e)$ with equality if and only if $x = 1$, show that $0 \le H(X) \le \operatorname{ld}(L)$ for $X \in \{x_1, \dots, x_L\}$, with equality in the upper bound if and only if $X$ is uniformly distributed, i.e. $P(X = x_i) = 1/L$ for all $i$.
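What I have tried so far: a sketch (not a full solution) that seems to work is to write $H(X) - \operatorname{ld}(L)$ as a single sum and apply the given inequality with $x = \frac{1}{L\,p_X(x)}$:

```latex
\begin{aligned}
H(X) - \operatorname{ld}(L)
  &= \sum_{x \in \operatorname{supp}(p_X)} p_X(x)\,
     \operatorname{ld}\!\left(\frac{1}{L\,p_X(x)}\right) \\
  &\le \operatorname{ld}(e) \sum_{x \in \operatorname{supp}(p_X)}
     p_X(x)\left(\frac{1}{L\,p_X(x)} - 1\right)
   = \operatorname{ld}(e)\left(\frac{|\operatorname{supp}(p_X)|}{L} - 1\right)
   \le 0.
\end{aligned}
```

Equality in the first inequality requires $\frac{1}{L\,p_X(x)} = 1$ for every $x$ in the support, i.e. $p_X(x) = 1/L$, which is the uniform distribution. For the lower bound, each term $-p_X(x)\operatorname{ld}(p_X(x))$ is nonnegative because $0 < p_X(x) \le 1$. Is this reasoning correct?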

Can somebody help me with this problem?
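Not a proof, of course, but here is a quick numerical sanity check of the claimed bounds (the function name `entropy_bits` is my own, illustrative):

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits (ld = log base 2) of a probability vector p."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 are excluded, matching the sum over supp(p_X).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

L = 4
uniform = [1.0 / L] * L          # uniform distribution on L outcomes
skewed = [0.7, 0.1, 0.1, 0.1]    # a non-uniform distribution

print(entropy_bits(uniform))     # → 2.0, i.e. exactly ld(L) = ld(4)
print(entropy_bits(skewed))      # strictly between 0 and ld(4) = 2
print(entropy_bits([1.0]))       # → 0.0, a degenerate (deterministic) X
```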