Nice dimension independent proof of entropy inequality?


How does one prove that for an arbitrary finite number of random variables $X, Y, Z, T, \dots$ the Shannon entropy inequality holds:

$$H(X,Y,Z,T,\dots) \leq H(X)+H(Y)+H(Z)+H(T)+\dots$$

I know how to do it with Lagrange multipliers, but I was wondering whether there is something tidier.
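For context, one standard dimension-independent argument (offered here as a sketch of the kind of proof being asked about, not from the original post) combines the chain rule for entropy with the fact that conditioning cannot increase entropy:

```latex
% Subadditivity of Shannon entropy via the chain rule.
% Uses: H(X_i | X_1,...,X_{i-1}) <= H(X_i) (conditioning reduces entropy).
\begin{align*}
H(X_1,\dots,X_n)
  &= \sum_{i=1}^{n} H\!\left(X_i \mid X_1,\dots,X_{i-1}\right)
     && \text{(chain rule)} \\
  &\leq \sum_{i=1}^{n} H(X_i)
     && \text{(conditioning reduces entropy).}
\end{align*}
```

The second step follows because $H(X_i) - H(X_i \mid X_1,\dots,X_{i-1})$ is a mutual information, hence nonnegative, so no Lagrange multipliers are needed.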