Consider a bookstore that has an inventory of used books $U' = \{\text{`To Kill a Mockingbird', `1984'}, \ldots\}$, and an inventory of new books $N' = \{\text{`Project Hail Mary', `The Midnight Library'}, \ldots\}$. The book selected by a customer buying a used book is a random variable $U$ taking values in the set $U'$, with entropy $H(U)$. Similarly, the book chosen by a customer buying a new book is a random variable $N$ taking values in the set $N'$, with its entropy denoted by $H(N)$. The bookstore is interested in the random variable $X$, which represents the next customer's choice of book, where $X$ takes values in the set $X' = U' \cup N'$. Let $p$ be the probability that the next customer chooses a used book; then $X$ can be defined as: \begin{equation} X = \begin{cases} U & \text{with probability } p \\ N & \text{with probability } 1 - p \end{cases} \end{equation} Given this scenario, how can the bookstore's uncertainty, $H(X)$, be expressed in terms of the given $H(U)$, $H(N)$, and $p$?
My answer to this part is $H(X)= pH(U)+(1-p)H(N)+ H_b(p)$, where $H_b(p)$ is the binary entropy function. I used the fact that $X' = U'\cup N'$ and, since $U'$ and $N'$ are disjoint, split the defining sum for $H(X)$ into two summations, one over $U'$ and one over $N'$.
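As a sanity check on this decomposition, here is a minimal numerical sketch. The two distributions and the value of $p$ below are made-up examples; the identity itself holds for any choice, because the supports $U'$ and $N'$ are disjoint.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution given as a list."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# Hypothetical example distributions over disjoint supports (used vs. new titles).
p_U = [0.5, 0.25, 0.25]   # distribution of U over the used-book titles
p_N = [0.7, 0.3]          # distribution of N over the new-book titles
p = 0.4                   # probability the next customer buys a used book

# Mixture distribution of X over the disjoint union U' ∪ N'.
p_X = [p * q for q in p_U] + [(1 - p) * q for q in p_N]

# Binary entropy H_b(p).
H_b = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

lhs = entropy(p_X)                                      # H(X) computed directly
rhs = p * entropy(p_U) + (1 - p) * entropy(p_N) + H_b   # the claimed decomposition
assert abs(lhs - rhs) < 1e-12
```

The assertion passes because with disjoint supports, each term $-pq\log(pq)$ in $H(X)$ splits as $-pq\log q - pq\log p$, which is exactly the grouping used in the answer.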
Suppose we allow $p$ to vary. How can we find $\max_p\{H(X)\}$ and $\min_p\{H(X)\}$?
The expansion looks correct. Write $$ H(X)=p \left[H(U)+\log(1/p)\right]+ (1-p) \left[H(N)+ \log (1/(1-p))\right], $$ which is a convex combination of two nonnegative functions. You can treat $H(U)$ and $H(N)$ as given constants and optimize over $p$ by setting the derivative to zero (or with your favourite method). Since $H(X)$ is concave in $p$, the stationary point is the maximum, while the minimum is attained at an endpoint $p\in\{0,1\}$, giving $\min(H(U),H(N))$.
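To make the calculus concrete: setting $\frac{d}{dp}H(X) = H(U) - H(N) + \log_2\frac{1-p}{p} = 0$ yields $p^* = 2^{H(U)}/(2^{H(U)} + 2^{H(N)})$, at which point $H(X) = \log_2(2^{H(U)} + 2^{H(N)})$. The sketch below checks this closed form against a grid search; the numeric values of $H(U)$ and $H(N)$ are arbitrary examples.

```python
import math

H_U, H_N = 1.5, 0.8813  # example entropies in bits (assumed given constants)

def H_X(p):
    """H(X) = p H(U) + (1-p) H(N) + H_b(p), with H_b(0) = H_b(1) = 0."""
    hb = 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return p * H_U + (1 - p) * H_N + hb

# Stationary point of the concave function H_X, from dH(X)/dp = 0.
p_star = 2**H_U / (2**H_U + 2**H_N)
H_max = math.log2(2**H_U + 2**H_N)   # value of H(X) at p_star

# Compare against a brute-force grid search over [0, 1].
grid = [i / 10000 for i in range(10001)]
p_best = max(grid, key=H_X)
assert abs(H_X(p_star) - H_max) < 1e-9
assert abs(p_best - p_star) < 1e-3

# By concavity, the minimum sits at an endpoint: min(H(U), H(N)).
assert math.isclose(min(H_X(0.0), H_X(1.0)), min(H_U, H_N))
```

Note that $p^*$ need not be $1/2$: the mixture weights more probability on whichever source has the larger entropy.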