Suppose I have a bag of $b$ balls, numbered $1, 2, \ldots, b$, and I pick one ball uniformly at random. If I then ask "how much information would you gain if I told you which ball I selected?", your answer will be $\log_2(b)$ bits. This is just the Shannon entropy of one uniform pick.
Now, imagine that I repeat the same uniform pick from the $b$-ball bag, with replacement, $n$ times, and form a list. The list contains $n$ balls, possibly with repetition. My question is: what is the entropy of this $n$-ball list?
What I did so far: I think the total space of possibilities has size $b^n$, so the entropy is $n\log_2(b)$?
Or is the space of size $nb$, hence $\log_2(nb)$?
After all, the list contains just $n$ balls! Or is it just $\log_2(n)$?
From your description it is clear that the $n$ uniform selections from $b$ balls are independent. So the overall entropy is $n$ times the entropy of picking one ball uniformly, i.e. $n \log_2 b.$
For any two independent random variables $X, Y$, the joint entropy is given by $H(X,Y)=H(X)+H(Y).$
In your case the $i^{\text{th}}$ selection is a random variable $X_i$ with entropy $\log_2 b$; applying the above by induction gives $$ H(X_1,\ldots,X_n)=H(X_1)+\cdots+H(X_n)=n H(X_1)=n \log_2 b. $$
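As a sanity check, here is a small Python sketch (with illustrative values $b=4$, $n=3$, which are my own choice, not from the question) that computes the entropy of a single draw and the entropy of the full joint distribution over all $b^n$ possible lists, confirming they differ by exactly a factor of $n$:

```python
from itertools import product
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

b, n = 4, 3  # assumed small example values

# Single uniform draw: each ball has probability 1/b.
single = [1 / b] * b

# Joint distribution of n independent draws with replacement:
# every length-n list is equally likely, with probability (1/b)**n.
joint = [(1 / b) ** n for _ in product(range(b), repeat=n)]

print(entropy(single))      # log2(4) = 2.0 bits
print(entropy(joint))       # 3 * log2(4) = 6.0 bits
print(n * entropy(single))  # same value: 6.0 bits
```

Note that the space of outcomes has size $b^n = 64$ here, not $nb = 12$, which is why the answer is $n\log_2 b$ rather than $\log_2(nb)$.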